Odd calculated deco "ripples" (was Re: RFC: color change for calculated deco)
dirk at hohndel.org
Mon Jan 14 08:13:50 PST 2013
I haven't looked at the patch (on my phone) but that seems like the
wrong fix. add_segment() assumes constant depth. So now you do an
unnecessary staircase approximation, right?
On January 14, 2013 8:09:46 AM Linus Torvalds
<torvalds at linux-foundation.org> wrote:
> On Mon, Jan 14, 2013 at 5:16 AM, Robert C. Helling <helling at lmu.de> wrote:
> > This is because, as usual, deco_allowed_depth is called every four
> > seconds. But every fourth time, it takes eight seconds before it is
> > computed again. Looking at the .xml, there is the same pattern in the
> > time differences between samples: 4,4,4,8,4,4,4,8 etc. So it seems to
> > me that the origin of the ripples is that in the plot the sample
> > number somehow serves as the index rather than the (real) time.
> Ahhah. So we call deco_allowed_depth() only once per sample, but we
> call add_segment() with our linear interpolation of depth. No problem,
> that works.
> But! The issue is that we interpolate the depth, but we do *not*
> interpolate "min_pressure", which is used for the ceiling
> calculations. Instead, we use the minimum tissue pressure over the
> sample length.
> So basically, "min_pressure" doesn't really match the deco
> calculations for the sample point - min_pressure is a step-wise
> function that is basically the minimum tissue pressure of the current
> sample point and the last one, while the deco calculations have been
> done using interpolated depth. That mixing of depth interpolation and
> the "take the minimum tissue pressure" seems to be the problem. And
> explains why the ripples only happen on the ascent, methinks.
> This patch "fixes" it for the profile. I put "fixes" in quotes,
> because it simply removes the whole inter-sample interpolation. Which
> should be ok for profiles that have samples at least 10 seconds apart,
> but the *planning* does not do that, so who knows..