Odd calculated deco "ripples" (was Re: RFC: color change for calculated deco)

Linus Torvalds torvalds at linux-foundation.org
Mon Jan 14 10:11:57 PST 2013

On Mon, Jan 14, 2013 at 9:29 AM, Dirk Hohndel <dirk at hohndel.org> wrote:
> That still changes the calculation by quite a bit if you are ascending
> at typical 'travel speed' between deco stops - you'll move by 5 feet in
> those ten seconds, right?

Here's another way of explaining my second patch:

 - we calculate the ceiling at every single second, using the
interpolated depth.

 - we then only *save* the ceiling at the points where we have a
profile event (the whole deco_allowed_depth() function doesn't change
any state, so we can just drop it entirely at points that we aren't
going to save)

So no, I do not agree with your "Right?", because it does not change
the calculations at all: the actual deco calculations are all using
the exact same data.

What it changes is what *ceiling* it shows based on those calculations.

But that's a *visualization* thing, not a calculation thing. And it
actually does it incorrectly.

Why is it incorrect? I'll try to walk through my understanding of it,
by switching things around a bit.

 - the whole "minimum tissue tolerance" thing could equally well be
rewritten to be about "maximum ceiling". And that's easier to think
about (since it's what we actually show), so let's do that. Agreed?

 - so with "min_pressure" turned into "max_ceiling", what doing the
whole comparison inside the loop means is that we are calculating the
maximum ceiling value for the duration of the last sample. And then
instead of visualizing the ceiling AT THE TIME OF MAXIMUM CEILING, we
visualize that maximal ceiling value AT THE TIME OF THE SAMPLE.

End result: we visualize the ceiling at the wrong time. We visualize
what was *a* ceiling somewhere in between that sample and the previous
one, but we then assign that value to the time of the sample itself.

So it ends up having random odd effects.

And that also explains why you only see the effect during the ascent.
During the descent, the max ceiling will be at the end of our
linearization of the sampling, which is - surprise surprise - the
position of the sample itself. So we end up seeing the right ceiling
at the right time while descending, and the visualization matches the
calculation.

But during desaturation, the maximum ceiling is not at the end of the
sample period, it's at the beginning. So the whole "max ceiling" thing
has basically turned what should be a smooth graph into something
approaching a step-wise graph at each sample. Ergo: a ripple.

And doing the "max_ceiling during the sample interval" thing may sound
like the safe thing to do, but the thing is, that really *is* a false
sense of safety. The ceiling value is *not* what we compute. The
ceiling value is just a visualization of what we computed. Playing
games with it can only make the visualization of the real data worse,
not better.

