Odd calculated deco "ripples" (was Re: RFC: color change for calculated deco)

Dirk Hohndel dirk at hohndel.org
Sun Jan 13 21:57:12 PST 2013

"Robert C. Helling" <helling at atdotde.de> writes:

> On Jan 14, 2013, at 1:50 AM, Linus Torvalds wrote:
> Good morning!
>> Anybody who has a 32-bit build environment all set up, and is willing to just
>> do a search-and-replace of "double" with "long double" in the deco
>> code? Do the ripples go away?
> I just checked, I have lost my account at the Max Planck Institute where that might have been possible.
> In any case, I would be very surprised if floating point noise is responsible for this as it is too regular (and as you point out likely to be related to the sample rate).
> My plan is to devote my lunch break to this. Currently my money is on a problem either with the plotting or with the calling pattern (maybe add_segment() is not really called once for every second of the dive, but only for those seconds that hit a sample point, or something similar?!?).

That could be. Maybe the interval boundaries aren't handled correctly in my
code. I wonder if I am missing the last second of some intervals (or
counting it twice, or something like that)?
