git backend: actually update local cache from remote
Joakim Bygdell
j.bygdell at gmail.com
Wed Jun 10 21:48:20 PDT 2015
> On 11 Jun 2015, at 6:16, Dirk Hohndel <dirk at hohndel.org> wrote:
>
> I've been going back and forth about this a bit, and here's my current
> thinking (which may be silly, but it's much easier to start a discussion
> by putting something out there and then have Davide come up with something
> better :-D)
>
> - when you start Subsurface (with cloud storage picked as default) or open
> the cloud storage from the menu, if there is a local cache, the local
> cache is read and the UI is shown. At the same time a background thread
> tries to connect to the remote.
> If it gets an error it stops and displays a message at the bottom of the
> window "can't connect to cloud storage, you can continue working from
> the local cache" but doesn't otherwise disrupt operation.
> If it succeeds with the initial connection it brings up a spinner and a
> message "syncing with remote storage" - I guess it should allow
> "cancel" for that, but unless cancelled it should disallow modifications
> to the data while this is happening.
>
> - when you click save or quit we immediately save to the local cache but
> also bring up a dialog that asks "should I attempt to write this data to
> the cloud storage?".
> If the user says no, just ignore the remote.
> If the user says yes, sync in the background (if this was save) or show
> a spinner while saving (if this was quit).
>
> This is kind of a hybrid of what you are talking about. If you are on a
> boat or on an island with shitty network you can simply work from the
> local cache. And if you suddenly have some connectivity you can trigger an
> upload to the remote in a very straightforward fashion.
>
> What am I missing?
Not much.
Syncing on start/open and save/quit/close with an option for the user to cancel the operation would be enough.
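The flow described above can be sketched roughly as follows. This is only an illustrative model of the decision logic, not actual Subsurface code; all function and action names here are hypothetical:

```python
# Hypothetical sketch of the proposed open/save flow for the git backend.
# Names like on_open/on_save and the action strings are illustrative only.

def on_open(local_cache_exists, remote_reachable):
    """Open cloud storage: show the local cache right away, then try the remote."""
    actions = []
    if local_cache_exists:
        # UI comes up immediately from the cached copy
        actions.append("load_local_cache")
    if remote_reachable:
        # Spinner while syncing; edits are disallowed unless the user cancels
        actions.append("spinner: syncing with remote storage")
    else:
        # Non-disruptive notice at the bottom of the window
        actions.append("notice: can't connect to cloud storage, "
                       "you can continue working from the local cache")
    return actions

def on_save(user_wants_cloud_sync, quitting=False):
    """Save or quit: always write the local cache first, then ask about the remote."""
    actions = ["save_local_cache"]
    if user_wants_cloud_sync:
        # Background sync on save; blocking spinner on quit
        actions.append("sync_with_spinner" if quitting else "sync_in_background")
    return actions
```

The key property of the sketch is that the local cache is always written unconditionally, so losing or declining the network connection never loses data.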
/Jocke