making remote git storage work [was Re: Subsurface and Dropbox]

Dirk Hohndel dirk at
Sun May 31 23:02:24 PDT 2015

On Fri, May 29, 2015 at 04:24:04PM +0100, Long, Martin wrote:
> I quite understand that we need to keep this simple for the user, and hence
> my suggestion to use https. I thought it would be simpler to do this using
> http/https than it would be using a convoluted method of fetching and
> decrypting a key via a REST API, especially when the result is ultimately
> the same - login using a username/password. I don't think I at any point
> suggested that [non-advanced] users should be creating SSH keys, rather
> that we ought to consider user/password security over http as a better
> fit for that use case.

I spent a lot more time thinking about this.

And I came to the conclusion that Martin is right and I was wrong.

Doing this via ssh seemed easier and more straightforward, as it integrated
into an existing backend infrastructure, but the more time I spent
thinking about the details of how this would be implemented, the more I
questioned my earlier assumptions.

I spent a good chunk of today stubbing out what I think Martin suggested.
I pushed this to master so people can play with it.

This is far from functional (for one thing, someone (cough, cough) needs
to implement the missing pieces of remote git storage, most notably update
and push), but I think it can serve to show the idea.
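Those missing update and push steps boil down to plain git operations.
As a rough sketch of what "Cloud storage open" and "Cloud storage save"
will eventually have to do (Subsurface does this from C via libgit2; the
function names and calling conventions below are made up for illustration):

```python
import subprocess

def run_git(args, cwd):
    # Thin wrapper around the git CLI; check=True makes sync errors
    # surface as exceptions instead of being silently ignored.
    return subprocess.run(["git", *args], cwd=cwd, check=True,
                          capture_output=True, text=True)

def cloud_open(repo_dir, remote_url, branch):
    # "Cloud storage open": fetch the user's branch from the server
    # and fast-forward the local checkout to it.
    run_git(["fetch", remote_url, branch], cwd=repo_dir)
    run_git(["merge", "--ff-only", "FETCH_HEAD"], cwd=repo_dir)

def cloud_save(repo_dir, remote_url, branch):
    # "Cloud storage save": push the current HEAD to the per-user
    # branch on the server, creating it if it doesn't exist yet.
    run_git(["push", remote_url, f"HEAD:refs/heads/{branch}"],
            cwd=repo_dir)
```

The fast-forward-only merge is deliberate: if the local and remote
histories have diverged, a sync tool should stop and let real conflict
handling take over rather than silently merging.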

Preferences still has the two fields for email address and password
(renamed from PIN). It now talks about Cloud Storage but that's simply a
more commonly used term, I think.

There are two new entries in the file menu: Cloud storage open and Cloud
storage save.

Right now there is no API for people to create accounts, but there are two
test accounts with identical data for people to play with.

ssrftest [geheim]
subsurface at [geheim]

The second one is the pattern I want to implement... when a user
configures email and password in Subsurface, Subsurface connects to the
server and checks if the account exists; if it doesn't, it creates it
and sends a confirmation email with a link the user needs to click to
enable it.
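A minimal sketch of that server-side flow, with a plain dict standing in
for the real account database and all names invented for illustration
(a real server would of course store salted password hashes, never
plaintext):

```python
import secrets

def handle_login(accounts, email, password):
    # Unknown account -> create it, generate a confirmation token and
    # (in the real server) mail the user a link containing that token.
    if email not in accounts:
        accounts[email] = {"password": password, "verified": False,
                           "token": secrets.token_hex(16)}
        # send_confirmation_email(email, token)  # not implemented here
        return "created"
    # Known account -> check credentials and the verified flag.
    acct = accounts[email]
    if acct["password"] != password:
        return "auth-failed"
    return "ok" if acct["verified"] else "pending-confirmation"

def confirm(accounts, email, token):
    # Called when the user clicks the link in the confirmation email.
    acct = accounts.get(email)
    if acct and secrets.compare_digest(acct["token"], token):
        acct["verified"] = True
        return True
    return False
```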

Once that has happened, the user is able to store their data in the
Subsurface cloud :-)

As far as git is concerned, the repo is named

branch <encoded_email>

To avoid people messing with things I am encoding the email address so
that everything that isn't a letter, number, '.', '-', or '_' is dropped
and the '@' is changed to "_at_". I think this should still keep
addresses unique but prevent bad people from trying to insert shell
escapes, functions, processes and what not into things...
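That encoding rule fits in a couple of lines (the helper name and the
example addresses below are made up for illustration):

```python
import re

def encode_email(email):
    # '@' becomes '_at_' first, then everything that isn't a letter,
    # digit, '.', '-' or '_' is dropped, per the rule described above.
    return re.sub(r"[^A-Za-z0-9._-]", "", email.replace("@", "_at_"))

# A well-formed address survives intact:
#   encode_email("diver@example.com") -> "diver_at_example.com"
# while shell metacharacters are simply stripped:
#   encode_email("evil; rm -rf /@x") -> "evilrm-rf_at_x"
```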

So this is all done via https (and yes, I ended up buying a wildcard
certificate for Subsurface). Once I figured out (and cleaned up) the UI,
we can even allow additional branches so people easily can track the dives
for multiple divers without having to change their email/password in the

All this needs a better UI, working remote git functionality (see above),
and a ton more things, but I think this gets us a big step closer to the
thing I have wanted for more than 2 years now... truly transparent cloud
storage for our data that will make it trivial to keep the data file in
sync between different computers (and thanks to Grace, soon on Android
as well).

Right now I realize that there are no checks on what users store on the
server - I need to figure out a way to do some sanity checks so that this
doesn't become an abused free git server for people. But no one on this
list would do something like that to me, right?
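One possible shape for such a sanity check, purely illustrative and with
invented limits, would be to cap file and repository sizes on the server
side before accepting a push:

```python
def within_limits(files, max_file_bytes=1_000_000,
                  max_total_bytes=20_000_000):
    # Hypothetical server-side check: 'files' maps path -> size in
    # bytes for the incoming push. Reject anything individually or
    # collectively too large to plausibly be a dive log. A real check
    # would also have to look at the content itself, not just sizes.
    if sum(files.values()) > max_total_bytes:
        return False
    return all(size <= max_file_bytes for size in files.values())
```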

Anyway, I'm sure I'm missing two thirds of the explanation, but please
poke at it and let me know what you think, what I need to change, etc.

I simply ran out of steam trying to implement the automated repo setup
(the authentication / authorization infrastructure is already in place).

I'll kick off daily builds and get some sleep :-)


More information about the subsurface mailing list