Issue with duplicate dive sites
Dirk Hohndel
dirk at hohndel.org
Tue Sep 8 07:32:06 PDT 2015
Hey Guido,
For a number of good reasons, our mailing list doesn't set Reply-To back to the list.
This means one CAN reply privately if one wants to, but one needs to use Reply-All
to keep the mailing list in the loop. I'd prefer to keep most discussions
on the mailing list unless there is a strong reason not to...
> On Sep 8, 2015, at 5:48 AM, Guido Lerch <guido.lerch at gmail.com> wrote:
>
> Could you let me know how I can delete all existing dive sites, they seem not to be stored in
> my logbook. Please see below.
Interesting question. I don't see a real use case for normal users wanting to do that,
so adding it as a UI feature seems weird. You can manually remove them from
the XML file, but that's a bit tedious as well: it's easy enough to delete them from
the start of the file, but I haven't checked what Subsurface does when it encounters
references to dive sites in the dives and then doesn't have those sites... so I think
one would have to edit each dive as well. A little perl or awk script could do
that quite easily, though.
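For what it's worth, here is a rough sketch of such a cleanup script, in Python rather than perl/awk. The tag and attribute names (`divesites`, `divesiteid`) are my assumption about the file format, so verify against an actual logbook file before running it on anything you care about:

```python
import re

def strip_dive_sites(xml_text):
    """Remove all dive sites from a Subsurface-style XML log.

    Assumes sites live in a single <divesites>...</divesites> block
    and that dives reference them via a divesiteid='...' attribute.
    """
    # Drop the whole dive site block near the start of the file.
    xml_text = re.sub(r"<divesites>.*?</divesites>\s*", "",
                      xml_text, flags=re.DOTALL)
    # Drop the references so no dive points at a now-missing site.
    xml_text = re.sub(r"\s+divesiteid=(['\"])[^'\"]*\1", "", xml_text)
    return xml_text
```

Read the file, run it through `strip_dive_sites()`, write the result back out, and let Subsurface reload it. Whether Subsurface is happy with dives whose site references are gone is exactly the part I haven't verified.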
> 2015-09-08 2:25 GMT+02:00 Dirk Hohndel <dirk at hohndel.org>:
> On Mon, Sep 07, 2015 at 11:49:02PM +0200, Guido Lerch wrote:
> > Hi Dirk
> >
> > I looked in the dive site stuff ... comments below.
> >
> > Another thought I had is that the context menu on the dive table should
> > contain an entry to actually update a dive - what do you think?
>
> Not sure what you mean by that - can you explain in more detail what you
> have in mind?
>
> It would be cool, in the future, if one could select one or multiple dives and say update,
> which would synchronize the Uemis with Subsurface if you, for example, modified an
> entry on the computer or in Subsurface.
Again, I don't see the broad appeal of a specialized UI for this. But you can
do this today: go to the download menu, pick "download all" and "prefer divecomputer",
and then only pick the dives that you really want before accepting them. Admittedly,
especially on a Uemis with its incredibly slow and painful download, this is not a good
experience.
So if you have a good, solid use case for why this is something people would really want
to do, we could consider adding a UI that makes it easier...
> > > This means that your algorithm creating dive sites in the Uemis downloader
> > > doesn't check
> > > if a dive site already exists before creating it.
> >
> > True, but I don't know (yet) how to do this. The way you coded it is
> > that if it maps a dive detail you create
> > a new site (every time) called "from Uemis". At the end of your algorithm
> > you map and mark those dive sites,
> > hence at the time you create a new one I am not sure how to check whether a dive
> > site already exists.
> > I will look further into it, but if you have a hint it would be
> > appreciated.
>
> Yeah, I mentioned this in my first review. Don't use time(NULL) - this has
> second resolution and will therefore frequently return the same value for
> consecutive calls. Instead just use the dive->when which should be
> different for every dive that you have. This will give you a different
> uuid.
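(To illustrate the collision with made-up numbers, nothing from a real log: time(NULL) has one-second granularity, so a download loop that creates several sites back to back seeds them all identically, while each dive's own start time is naturally distinct.)

```python
# Illustration only, with invented epoch timestamps.
# Seeding each new site with "now" at second resolution during a fast
# download loop: all three calls land in the same second.
now = 1441722726
seeds_from_time_null = [now, now, now]
assert len(set(seeds_from_time_null)) == 1  # identical seeds -> duplicate uuids

# Seeding with each dive's own start time (what dive->when holds) instead:
seeds_from_dive_when = [1440745200, 1440831600, 1440918000]
assert len(set(seeds_from_dive_when)) == 3  # distinct seeds -> distinct uuids
```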
>
> I have changed that but ended up with 1032 dive sites in Subsurface, which makes
> tracking down my current issues very difficult. I would like to start at zero with the
> dive sites and see if the duplicate errors I am getting go away with the new
> code that I am testing currently.
Ah, that explains it. Yeah, that's a pain. And there is no trivial solution. This
shouldn't happen to anyone but developers (I hope), but it is very annoying.
So one thing you can do to test your new code is of course to start with an
empty dive log - then you see if your algorithm now gets it right. That still
leaves you with the mess in your current file.
Let me look for an easy way to deal with this. I'm thinking that just deleting
them from the beginning of the XML file should be enough. But I want to
make sure Subsurface does the right thing there.
/D