Re: faster representation of pow on some areas of subsurface?
Dirk Hohndel
dirk at hohndel.org
Sun Jan 19 14:02:42 UTC 2014
On Jan 19, 2014, at 1:58 PM, Linus Torvalds <torvalds at linux-foundation.org> wrote:
> On Sun, Jan 19, 2014 at 1:41 PM, Dirk Hohndel <dirk at hohndel.org> wrote:
>>
>> OK, I compared the results for a number of test dives and they are identical.
>
> They had better be. The "Stop doing the (very expensive) pow()
> calculation pointlessly" patch should be mathematically identical to
> what we did before, just written slightly differently. So at most, it
> might cause some single-ULP difference due to different order of
> operations or similar. But it does the exact same deco calculations,
> just does them more efficiently.
I know. I read through the patch. It looked right. But just for good measure
I did some testing as well :-)
>> So I’ll take that old patch. Next I’ll look at the test patches.
>
> So the two test patches actually cause different math to be done, and
> might thus show slight differences. I don't think they'll show
> anything that is visible to the human eye, though.
Not so much. See my second email…
> For example, when we don't do the deco calculation every second
> (instead just doing it every sample), that means that we now no longer
> do the depth interpolation between samples:
>
> int depth = interpolate(entry[-1].depth, entry[0].depth, j - t0, t1 - t0);
>
> goes away, and we just do the "add_segment()" with the end entry
> depth, and the time between entries. Now, we *have* done depth
> interpolation to get the actual dive-plot entries (see
> populate_plot_entries()), but that is done at 10s intervals rather
> than 1s intervals.
>
> But quite frankly, while the math changes, I don't think it really
> makes much sense to do depth interpolation (and deco calculations) at
> 1s segments. There's really nothing that says that our interpolated
> values are any better than the sample values, so..
Well, assuming roughly linear movement between depth samples,
the 1s interval does in fact give you a better approximation of the real
situation…
> As far as I can tell, the only real reason for the 1s interpolation
> was that we originally had a deco model that *only* worked in 1s
> increments. You added the ability to do non-1s periods in commit
> 2c336032568 ("Consider previous dives when calculating deco"), because
> then you want to take the surface interval into account (and doing
> *that* one second at a time is insane), but the "within the dive" deco
> was doing that 1s interpolation because of the old limited interface.
That’s possible - I’d still say that a smaller-granularity step function is
a better approximation of the real “smooth” depth graph that you would
get from infinitely small sample intervals.
Whether this difference matters enough to us is a different question,
but at least what is reported to the user is quite significantly different.
/D