An interesting thought exercise. What if the Internet had never become a giant vacuum for malevolent ad agencies, and desktops hadn’t become stupidly over-provisioned thin clients for web pages? Instead, what if the Internet were only used to facilitate data synchronization between endpoints? Could we get there from our current place?
Let’s ask ourselves: “what if the Internet was offline first? And what if we had local-first software paving the way into an offline SaaS model?” Actually, the authors of this paper (“Local-First Software: You Own Your Data, in spite of the Cloud”) raise these exact questions in their work, and that will be our topic today. What would an offline-first Internet look like?
Does not work.
I remember during the slow dial-up days, I invested heavily in proxies. I even had some scripts to convert the local browser caches into a usable offline internet when the connection was unavailable. However, most of the things we actually needed were on the other side of the modem.
It is even worse today. Instead of a single machine shared by family members, we have multiple machines running different operating systems. We have lots of Internet-connected devices. We also have a NAS, but setting up and maintaining a Windows domain is a chore, one I gave up on.
The issue is not the apps. It is the data. We have Office 365, we have phones that shoot photos and store them locally. We have Adobe Lightroom, and many others. But sharing those documents and photos across all those devices and accounts is a chore.
Once again not everyone can be a domain admin.
sukru,
IMHO statements to the effect of “it’s too difficult for normal users” fail to consider the innovations that we would have if companies weren’t so focused on selling data-silo services in the first place. It’s not the fact that services are running in a data center that makes them easy, it’s the fact that applications and operating systems have been refined and integrated to work with the services in an easy-to-use manner. And that’s the thing: while companies have been focusing on corporate data silos as their business model, it does not mean that local services have to be difficult!
Services like OneDrive would work extremely well over self-contained NAS boxes that communicate and share via P2P. There’s no technical reason it has to be difficult if only the companies would develop the tools and standards to make it happen. But therein lies the problem: companies are focusing on “cloud” nonsense at the expense of local features. For better or worse, local resources are actually getting harder to use with today’s mobile technology. So frankly I agree with you that sharing local resources without going through an internet service can be excruciatingly painful. But this is not because it’s inherently difficult! It’s the fault of products that make it difficult by design in order to promote cloud services. It’s the reason chrome makes it a pain in the ass to print on a printer a few feet away without sending documents through google’s services. It’s the reason both iOS and android make it so damn difficult to share documents natively between devices on your LAN without relying on a “cloud” service provider to bounce files through.
So I would agree running local technology is much too difficult as it stands today, but I would strongly push back on the notion that local resources have to be difficult or unintuitive for normal users, especially if we consider an alternate evolutionary path where companies could have been encouraged & incentivized to actually work on innovations for local systems.
Also, not for nothing, but devices that rely on “the cloud” to function can have their own difficulties. For one, it introduces end-of-life headaches. I’m already beginning to accumulate more devices that are becoming bricks because the cloud services supporting them are going offline. Think about the home automation company revolv: consumers expected their products to last a long time, in part because the company promised they would. Yet after revolv was absorbed into google’s nest division, the online services were shut down and the home automation hardware was bricked. Also, tethering local devices to centralized servers is fundamentally less robust than local P2P protocols. In some cases, if you lose internet connectivity your local devices can become inoperable. Consider chromecast, a nice piece of hardware but with absolutely stupid software limitations that require all streaming, even local streaming, to be controlled by google’s servers.
https://lifehacker.com/stream-local-media-to-a-chromecast-without-an-internet-1709666323
Obviously google’s engineers knew this was a stupid design. They could have designed it to support flawless & reliable local usage (like miracast), but they chose to keep control over owners’ devices even when running locally.
I’m not sure why these nest cloud-controlled smoke detectors failed, but I wouldn’t be surprised if it’s because communications to the centralized server got interrupted (it’s bound to happen eventually for some owners…). It’s quite antithetical to “cloud” simplifying technology for end users.
http://www.youtube.com/watch?v=BpsMkLaEiOY
http://www.youtube.com/watch?v=tj0K2fEakqc
I know you are employed by google, and I don’t mean any of my criticism to reflect on you, but sadly google is a fairly large culprit as to why modern technology is so hard to use locally without centralized services. They’ve promoted centralized products & services that are objectively poor with local functionality, and they overly & unnecessarily focus on solutions that rely on google servers by design. I wouldn’t blame you for defending your employer, but will you acknowledge that there is some truth behind what I’m saying anyway? To be fair, I’ll concede that it isn’t just google. Microsoft, apple, and amazon are other big corps pushing this transition to “cloud” everywhere so that we’re all dependent on them.
I agree that engineers *could have* pushed for a local-first solution. But at what cost?
When I was doing my thesis, I wrote a simple SilverLight application and sent it to our researchers abroad. I could have written a local Windows app, learned how to develop for Mac, made a Mac version, debugged all kinds of compatibility issues, and had them run the app locally. Or I could just write more papers.
Same with system administration. When we had a shared online database, we went with OAuth for logins. We could have built our own account system and tried to support people all across the globe, or, once again, I could spend the time writing more papers.
Having a central database, and offloading identity / sync services to SaaS providers, saves time. There needs to be a real requirement to push teams to implement custom solutions.
That is why I don’t think we will have practical local first software.
sukru
At the cost of centralized solution providers, obviously.
Perhaps “cloud services” would have gotten a new meaning: in this hypothetical alternate universe a “cloud service” could actually refer to local devices communicating via P2P. It would be more appealing to me if this were the case.
It sounds like you are judging the merits of local solutions as they exist today; in fact most of us do, because we don’t have the ability to change the world. But obviously that isn’t the same as saying that localized solutions wouldn’t be preferable in a world that had embraced and invested in local solutions over centralized ones. The tools that you and I choose are very much a byproduct of our environments; like programming languages, it’s often easiest to choose things that are popular, but that doesn’t imply they are the best.
Standards are very important whether you are looking at decentralized/P2P technology or at centralized technology. In both cases, you will get a downright lousy experience without good standards and integration. But once again, such standards could have evolved around local & P2P models instead of centralized data silos.
There’s the rub: many companies are guilty of disregarding local operation because their financial incentive is to make technology that keeps us tethered. I understand their financial motives, but it really sucks that our modern mobile platforms and tech companies are doing such a bad job with local resources. I’m not denying your point that it’s difficult, but I’d like for you to not deny my point that it’s difficult because corporations are pushing us in the direction of centralized solutions instead.
sukru and Alfman, I really appreciated both of your contributions. sukru, I think you’re making an important pragmatic point: most of us just want to get stuff done, and we don’t have time to mess around with ‘solutions’ that may be better but just get in the way. I’ve heard that described as ‘yak shaving’ — Seth Godin tells a great story about that. https://seths.blog/2005/03/dont_shave_that/ (Digression: he misunderstood what yak shaving was really about: all the seemingly useless but necessary side things you have to do in order to get a job done. But aaaaanyway…)
Alfman, I also see you pointing to what could have been, and what ought to be, and I appreciate that visionary side. The nice thing is that there are plenty of visionaries out there who are willing to stick their necks out and risk getting them chopped off… All the entrepreneurs out there… One nice thing about P2P is that your startup spends waaaay less on infrastructure, because the users host most of their own data and you just add a tiny bit of infrastructure for live-updating their binaries and maintaining a bit of base capacity.
But there is still one problem: in order to get adoption, your product has to be useful, usable, and well-marketed. This requires money. (This isn’t a problem for open-source hobby devs, who are motivated by other things than massive adoption and sales. But they also aren’t so motivated to spend time on usability either.) I don’t know what the answer to that problem is… I think startup founders who care about decentralisation and user autonomy will have to shift their sights from the 10× growth ‘unicorn’ dream to becoming a 2× growth ‘zebra’ https://medium.com/zebras-unite/zebrasfix-c467e55f9d96
Decentralised tech is also very hard — you have to care a lot more about data integrity, which gets even harder when your app has a big network of potentially malicious participants. So I can appreciate the appeal of building for the cloud — it’s just a lot simpler.
skeezix,
I concur. A small team can build the technology stack to make local resources more directly accessible, but that means zilch if they don’t have the means to enter the market. In terms of improving mobile local capabilities, there’s little they can do without getting apple/google involved. I don’t have an answer for that either.
In all likelihood today’s tech giants will continue to be the major incumbents in their respective markets over the next 10-20 years. Whatever solutions they produce are ultimately what the world will embrace. If they produce centralized solutions, the world will take up centralized solutions. If they produce federated/P2P solutions, the world will take up those solutions. They have vast power to shape the world however they see fit. The best opportunities for small newcomers to influence the world have always been in new markets where they’re not competing directly against pre-existing powerful corporations. For better or worse, I think a lot of good ideas are too late to make a difference now.
I get what you are saying. I feel that if a comparable amount of development resources had gone into decentralized solutions/frameworks, they would be much more advanced and easier to use too. Alas, centralized won the economic battle, and that’s all that matters.
Yes, OSNews is a good platform to have civil discussions. Thanks for the positive comments!
Back to the topic: I think the current practices use a lot of “hybrid” models taking away key advantages from peer-to-peer designs.
Two things come to mind when I think of p2p: distributed and decentralized.
However, the current web platforms are already highly distributed. Last time I checked Akamai had several hundred research papers on highly available distributed caches. And that was about 15 years ago!
For example, when you go to adobe.com, or download the latest Xbox update, your bits come from an “edge server” in your local ISP’s data center, and don’t travel the distance to Seattle, or wherever Microsoft’s datacenters are.
The second advantage – decentralization – comes with issues. For example, before OneDrive / SkyDrive there was “Windows Live Sync”. It was a platform for syncing your files across your Windows machines (hence local first). Over time it evolved into “cloud first”, avoiding issues where machines were down, or multiple devices had incompatible changes to files. The cloud became the main copy, while local was secondary.
(It still offers a very good local-only workflow with Office, though).
Over time the theoretical advantages of fully decentralized p2p networks were eroded by practical developments. In fact even p2p went from eD2k to curated torrent sites.
sukru,
The problems to be solved aren’t really unique to either centralized or decentralized systems, and much of the same technology that works in centralized systems can work for decentralized ones too. Whether it’s centralized or not, the solution to outages is more redundancy, so I don’t see this as being unsolvable, or even that different, in either case.
However I think we’re overlooking a much more fundamental point which gets more to the root of our disagreement: centralization of ownership & control. Everyone should be asking themselves these questions:
Are you dependent on a 3rd party entity to control your data and access?
Are the software & protocols vendor agnostic such that you have the ability to migrate your data to another provider (or even self-host) if desired?
Do you have the option & tools to extend functionality as needed or are you dependent upon the 3rd party provider to do it?
Does a 3rd party provider have the ability to coercively push changes without your permission?
I don’t think they were eroded so much as ignored in order to promote centralized models. And insofar as torrents go, that’s more of a federated solution, since it’s possible for anyone to host torrents without depending on a company like facebook/google/microsoft etc. to host it. I’m ok with federated models. The problem is that many centralized providers don’t provide federation because they want us to be tethered to them and only them.
NAT traversal is for chumps. Any forward looking application should start assuming total end to end connectivity, preferably via IPv6. If you hit a NAT problem, throw an error telling the user their ISP is broken and bail out. Cell phones all have public IPv6 addresses these days. The whole reason cloud shit became popular is that NAT makes it too difficult for people to collaborate directly, p2p. ISPs that don’t provide IPv6 subnets need to be named and shamed, and possibly fined. Imagine a world where you can just send files to each other without using some third party server or running a torrent tracker.
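To make that “IPv6 or bail out” policy concrete, here is a rough sketch of what an app following it might do at startup, using only Python’s standard library. The Google DNS address is just an arbitrary public IPv6 host used to make the OS pick a source address (no packets are sent), and the error wording is purely illustrative:

```python
# Sketch of the "assume end-to-end IPv6 or bail out" policy described above.
# It asks the OS which source address it would use to reach a public IPv6
# host and refuses to run if that address isn't a global one.
import ipaddress
import socket
import sys

def global_ipv6_address():
    """Return our global IPv6 address, or None if we don't have one."""
    try:
        with socket.socket(socket.AF_INET6, socket.SOCK_DGRAM) as s:
            # connect() on a UDP socket only selects a route/source address;
            # nothing is actually transmitted.
            s.connect(("2001:4860:4860::8888", 53))
            addr = ipaddress.IPv6Address(s.getsockname()[0])
            return addr if addr.is_global else None
    except (OSError, ValueError):
        return None

if __name__ == "__main__":
    addr = global_ipv6_address()
    if addr is None:
        sys.exit("No public IPv6 address here: complain to your ISP and bail out.")
    print(f"Reachable end to end at [{addr}]; listening for peers...")
```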
tidux,
You’re technically right, but it’s pragmatically futile for millions of us who don’t have a choice in the matter. I am personally stuck with an ISP monopoly that still doesn’t support IPv6. As a developer myself, I wouldn’t even be able to run my own software if I followed your guideline. 🙁
I hadn’t considered a legal mandate, that might actually work… But it isn’t clear that it would be politically viable. Companies lobby for, and often get, massive deregulation bills asking for no governmental oversight over their business practices. The republican party as a whole has been extremely successful at deregulating everything, including technological initiatives like net neutrality that were designed to protect user rights. Even things that are literally dangerous to public health, such as pollution and water quality, are being swept under the rug in favor of letting companies do as they please. While I hate that technology has gotten political, unfortunately it has.
Yeah, you have no idea how disappointed I am in both iOS and android for not embracing P2P, it could have been so transformative…
I’m in full agreement with you that NAT traversal sucks, and I make no apologies for those responsible for us ending up here. However, with that said, it’s not always so infeasible that P2P can only exist in the imagination. P2P can still work; it’s just a lot more complex to implement P2P using STUN / TURN / ICE / UPnP / NAT-PMP / etc. Yes, they are all terrible hacks, but as a factual matter they often work regardless (a rough sketch follows the links below).
https://www.vocal.com/networking/classic-stun-simple-traversal-of-udp-through-nat/
http://miniupnp.free.fr/libnatpmp.html
…
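For the curious, here is roughly what the very first of those hacks looks like in practice: a minimal STUN binding request (RFC 5389) over a raw UDP socket, asking a public server which address and port our NAT maps us to. This is only a sketch; the Google STUN server is just an example, and a real application would use a proper ICE library with retries, TCP fallback, TURN relays, and so on:

```python
# Minimal STUN address discovery: send a binding request and read back the
# XOR-MAPPED-ADDRESS attribute from the response.
import os
import socket
import struct

STUN_SERVER = ("stun.l.google.com", 19302)  # any public STUN server will do
MAGIC_COOKIE = 0x2112A442                   # fixed value from RFC 5389

def discover_mapped_address(timeout=3.0):
    """Return (ip, port) as seen from outside our NAT, or None on failure."""
    txn_id = os.urandom(12)
    # Binding Request: type 0x0001, length 0, magic cookie, transaction id
    request = struct.pack("!HHI12s", 0x0001, 0, MAGIC_COOKIE, txn_id)

    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(request, STUN_SERVER)
        try:
            data, _ = sock.recvfrom(2048)
        except socket.timeout:
            return None

    msg_type = struct.unpack_from("!H", data, 0)[0]
    if msg_type != 0x0101 or data[8:20] != txn_id:
        return None  # not a success response to our request

    # Walk the attributes looking for XOR-MAPPED-ADDRESS (0x0020)
    pos = 20  # skip the 20-byte STUN header
    while pos + 4 <= len(data):
        attr_type, attr_len = struct.unpack_from("!HH", data, pos)
        if attr_type == 0x0020:
            family, xport = struct.unpack_from("!xBH", data, pos + 4)
            if family == 0x01:  # IPv4
                port = xport ^ (MAGIC_COOKIE >> 16)
                raw = struct.unpack_from("!I", data, pos + 8)[0] ^ MAGIC_COOKIE
                return socket.inet_ntoa(struct.pack("!I", raw)), port
        pos += 4 + attr_len + (-attr_len % 4)  # attributes are 32-bit aligned

    return None

if __name__ == "__main__":
    print(discover_mapped_address())
```

Of course, knowing your mapped address is only step one; actually punching a hole and keeping it open is where TURN, ICE and friends come in, which is exactly why this stuff is such a pain compared to real end-to-end connectivity.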
This won’t happen at the Internet level, unfortunately. Any standard that is not forward compatible with IPv4 (BTW, is there any?) stands no chance. It’s far easier to implement ISP and country level NATs than to replace the whole Internet.
A strong, coordinated worldwide push could help, but who is going to do it? Governments and corporations treat p2p communication as a threat to national security and business models. It’s far more likely we will organise ourselves around these limitations than remove them. P2P protocols are already frowned upon, and the last time I requested a static IP I received a warning that this is a security and privacy risk.
Local first software?
I don’t know your age, guys, but I connected to BBSes with the C64 and so on, and until the mid-nineties we lived in a local-first software world, where the net was only for downloading mail, news, and (very light) apps, and browsers were intended for browsing news.
No, I am not an old nostalgic, and I have been in love with the Google Suite since 2013; I push it hard everywhere (and I am also prone to starting flame wars with O365 supporters).
The point is that I remember when you weren’t ALWAYS online, but powered on the modem only when you had to.
I understand some of you today can’t conceive of this, and even for me, if the fiber is down I am lost.
But sometimes I think about another direction of technology, a radical change where browsers don’t even exist, only SFTP transfers (for everything! even chat messages) and only local executables on machines, syncing only data (and never downloading executables or scripts), and we are all admins. The only real privilege would be installing / upgrading the OS and apps, in a very restricted mode.
Sunday fool ideas….
t0nZ,
Imagine if the google suite had been built around P2P frameworks & standards. Obviously this would not have happened, because it’s antithetical to google’s business model, but it really could have pushed simplicity, innovation, and privacy on the P2P side of the fence.
Refreshing to see this subject! Ink and Switch write some amazing stuff; their Local-First essay (cited in the above article, written a few years ago) resonates strongly with me. I find it surprising to hear people saying that it’s too complicated or that it has a bad user experience. Yes, with decentralised software you (or at least the devs who write the app you’re using) have to worry about NAT hole punching and other garbage, but I’ve had some direct experience with absolutely HORRIBLE UX as a consequence of cloud hosting.
* The time Slack went down for half a day and we couldn’t get any work done
* The time GitLab went down for half a day and we couldn’t get any work done
* The time I was on a 5h drive and had a bunch of stuff to write, but I couldn’t because I can’t save local copies of Google docs in my laptop’s browser (don’t worry, my wife was driving)
The annoying thing about the first two cases is that we were LITERALLY DOWN THE HALL FROM EACH OTHER, but our workday came to a halt because of some stupid server rack in Oregon that was having a bad day. That feels like a really fragile place for a team to be in.
In the third case, I could blame myself for refusing to get a data plan on my phone, but (a) we live in the mountains so it wouldn’t have worked anyway, and (b) it really highlights how the cloud is pushing an asymmetry of privilege. If you’ve got a fast, always-on connection you get to participate in the world; if you haven’t, tough luck. Made me feel connected in some small way to the people on the other side of the digital divide.
Contrast that with the time I started the day using just local-first software and didn’t even realise my internet was down until lunchtime! The information was out-of-date, but it was usable. Now that’s good UX.
Git, Telegram, Dropbox, Secure Scuttlebutt, Resilio Sync, and even Google Docs for Android are great examples of local-first software that enjoy wide use and Just Work. Some of them are P2P and some of them rely at least partially on the cloud, but they show that local-first UX can be better than cloud or local-only.