Europeans risk seeing social media services Facebook and Instagram shut down this summer, as Ireland’s privacy regulator doubled down on its order to stop the firm’s data flows to the United States.
The Irish Data Protection Commission on Thursday informed its counterparts in Europe that it will block Facebook-owner Meta from sending user data from Europe to the U.S. The Irish regulator’s draft decision cracks down on Meta’s last legal resort to transfer large chunks of data to the U.S., after years of fierce court battles between the U.S. tech giant and European privacy activists.
[…]Meta has repeatedly warned that such a decision would shutter many of its services in Europe, including Facebook and Instagram.
Don’t threaten us with a good time, Zuck.
Good riddance, privacy-violator.
Not to defend Facebook, however…
A social graph cannot function properly unless outgoing edges are available locally.
In other words, data needs a copy not only in its original location, but also in every other locale from which it might be accessed.
If I’m in the US and my cousin is in the EU, we can’t have an efficient network unless our data is copied to each other’s location.
Anyway, this is the technical part.
sukru,
Sounds like a facebook problem to me, haha. If facebook had permission then it’s one thing, but otherwise user data shouldn’t be changing jurisdictions without clear and explicit permission.
I know we’re not going to see eye to eye on this given previous discussions, but I’d far prefer to see a large federated network of local/regional providers or even P2P networks instead of multinational giants controlling everything.
Alfman,
The main problem is physics. Whatever the network, it is never wise to transfer large amounts of data on every request.
Humans assume a page is “slow” if it takes more than roughly 400ms to load.
Ping time from San Francisco to Frankfurt is 160ms. Two trips, one for the page and one for the images, and it is already 360ms. The application itself would take about 60ms to process at best (but*…), so any social network, any application, that is split between these continents will automatically be perceived as bad. This is basic physics at work.
but*: Of course this assumes local optimizations. If the app data itself is “federated”, it can take minutes or more to bring everything together (been there, done that).
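As a rough sketch of that budget (illustrative numbers only, and a real page needs extra round trips for DNS, TCP, and TLS on top of this):

```typescript
// Back-of-the-envelope latency budget. Numbers are the ones quoted above;
// this is a sketch, not a measurement.
const rttSfToFrankfurtMs = 160; // one full round trip, SF <-> Frankfurt
const serverProcessingMs = 60;  // optimistic application time

// One round trip for the page, one for its images:
const perceivedLoadMs = 2 * rttSfToFrankfurtMs + serverProcessingMs;

console.log(`Perceived load: ${perceivedLoadMs}ms`); // 380ms, right at the ~400ms "slow" threshold
```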
And this does not include additional costs and wasted computing power (it is bad for the environment to waste processor power: data centers already consume 1-2% of global electricity).
Anyway, this is only about the physical and technical part of it. I can understand people politically having other preferences, or disliking certain entities.
bad math… still not too far off.
sukru, yes, physics does get in the way, but there are also agreements/laws in the way that facebook and others have ignored.
**If I’m in the US and my cousin is in the EU, we can’t have an efficient network unless our data is copied to each other’s location.**
This example of yours has problems. Something to be aware of: the USA does not have uniform privacy laws. But the USA federal government did agree to the “International Covenant on Civil and Political Rights (ICCPR)” in 1992; Article 17 establishes a “right of privacy” covering “family, home, or correspondence.” Really, a lot of what facebook and other social media solutions do is in breach of Article 17.
https://en.wikipedia.org/wiki/International_Covenant_on_Civil_and_Political_Rights
Yes, almost every country bar China has signed the ICCPR.
The reality is Europe is asking facebook to obey the ICCPR agreement. So a person must have a lawful reason to access/copy/store correspondence. Yes, what applies to a mail carrier handling packages and letters applies to all correspondence, electronic included.
The reality here is that the base agreement is the same whether you are in the USA or in the EU. Your example that data has to be copied to each location is true, but there is a catch. Facebook and other social media have taken the attitude that they can just copy as much as they like. The problem: say that instead of your example it is two EU people talking to each other, and you have copied their conversation to the USA on the off chance that a US person might read it. That just copied data across legal jurisdictions without grounds allowed under the ICCPR.
**Of course this assumes local optimizations. If the app data itself is “federated”, it can take minutes or more to bring everything together (been there, done that).**
We already have globally used federated solutions. But when you get into the legalities of this, the ICCPR means you should be using caching.
Yes, it is a lot harder when you have to use caching. The ICCPR means that facebook should have moderators in every country where they have servers. The ICCPR means that facebook and others should not be sending large volumes of data between countries for which there is no user-driven demand. US servers should at most hold a cache of the EU data that US people have actually accessed; EU servers should only hold a cache of the US data that EU people have actually accessed. This repeats for all countries. If you access something that is not cached, bugger, you have lag.
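A minimal sketch of that access-driven, per-jurisdiction cache (every name here is invented for illustration; no claim that any real platform works this way):

```typescript
type Jurisdiction = "US" | "EU"; // extend with further legal jurisdictions

interface Post {
  id: string;
  homeJurisdiction: Jurisdiction;
  body: string;
}

class JurisdictionCache {
  private local = new Map<string, Post>();

  constructor(
    private readonly here: Jurisdiction,
    private readonly fetchRemote: (id: string) => Promise<Post>, // cross-border call
  ) {}

  // A post is copied into this jurisdiction only when a user here actually
  // requests it; nothing is bulk-replicated "just in case".
  async get(id: string): Promise<Post> {
    const cached = this.local.get(id);
    if (cached !== undefined) return cached; // fast path: demanded here before
    const post = await this.fetchRemote(id); // slow path: user-driven transfer, pays the lag
    console.log(`copied post ${id} into ${this.here} on user demand`);
    this.local.set(id, post);                // store only what was actually accessed
    return post;
  }
}
```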
Yes, the ICCPR problem should have caused the USA to unify its privacy laws. The horrible reality is that, following the ICCPR exactly, facebook should have servers in pretty much every US state that has a privacy law (basically most of them), because these laws are unique to each state’s jurisdiction.
The reality here is that building global social graphs to deliver content to end users was not inside the rules even before facebook started; up until now, no one has been enforcing the rules on the books. Yes, facebook and others are complaining long and loud about what is being asked of them, when nothing being asked for is new. Everything the EU is asking for now is covered by the ICCPR, which predates facebook’s existence by over a decade in the USA.
The only parties allowed to collect data to make global social graphs after the ICCPR was agreed to are intelligence groups like the CIA, MI6, KGB… Private companies, like it or not, are not meant to do this under the ICCPR agreement. Yes, private companies obeying the ICCPR should be making intelligence groups’ jobs harder, not simpler, due to fragmented data.
Sukru, social data should be compartmentalized into per-legal-jurisdiction segments. Yes, legal jurisdiction matters because ICCPR Article 17 forbids unlawful search. If you have a search warrant in one legal jurisdiction, it does not magically allow you to search in a different legal jurisdiction. The way facebook has been handling data allows a person to perform an illegal search, because data is on servers where it should not be.
sukru,
But it’s not like they have to load the whole page from overseas. And if they’re using browser caching competently, it certainly should not be every request; more like once per session, and 400ms for live background updates would be more than good enough.
I actually think it would be barely noticeable if they were smart about loading resources in the background. I realize complying may be very inconvenient for companies versus just doing whatever they want with user data, but I humbly suggest the problems are solvable if they want to solve them.
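For instance, a sketch of the standard HTTP caching knobs that make “once per session” work (the paths and lifetimes are hypothetical):

```typescript
import { createServer } from "node:http";

// Static assets pay the trans-oceanic round trip once, not on every request.
createServer((req, res) => {
  if (req.url?.startsWith("/static/")) {
    // Content-hashed assets: the browser never re-fetches them.
    res.setHeader("Cache-Control", "public, max-age=31536000, immutable");
  } else {
    // Dynamic feed data: allow a short-lived private cache.
    res.setHeader("Cache-Control", "private, max-age=60");
  }
  res.end("hello");
}).listen(8080);
```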
Well, I doubt we can make meaningful conclusions about that without talking specific implementation details. A clever purpose-built design could actually be more efficient, but I don’t think we can make generalizations in the abstract. Arguably data centers have been less efficient at mass content distribution than the P2P networks they replaced years ago. Centralized services ended up winning because of favorable business models, but that doesn’t mean they were technologically superior. Consider how Microsoft added P2P to Windows 10 updates to make them faster & more efficient.
There are challenges but I don’t think we should jump to conclusions about all technical solutions automatically being bad.
oiaohm, Alfman,
There are not many ways to build such a distributed social network. We can probably reduce it to four options, depending on how you distribute your code and data.
Assuming our simple model, the web frontend can be replicated to all locales. The rest depends on whether our API calls are local or distributed, and whether our storage is local or distributed.
The optimal solution is when both your data and code are local. This is roughly how all systems work today.
The worst design is having both code and data distributed. Such a system proposal is probably a good way to fail the distributed systems class.
The other two come with differing tradeoffs.
If you keep data local to your app instances and distribute the high-level calls, then you essentially get one “stream” for each locale (federation, if you will). Hence if there are 20 jurisdictions, you have 20 different result sets, which will then be joined together in a layer called a “combiner” or “mixer” (which is a separate layer in real-life designs anyway). So this has a constant-factor overhead in resource usage.
If you instead access data remotely, the system saves on computing resources, since it can build only one result set. However it would then need to access indices across continents, essentially trading computation resources for network latency (and cost).
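A toy sketch of the combiner/mixer in the federated option (the FeedItem shape and per-region fetchers are invented for illustration):

```typescript
interface FeedItem {
  id: string;
  score: number; // ranking signal
}

type RegionalFeed = (userId: string) => Promise<FeedItem[]>;

async function combinedFeed(
  userId: string,
  regions: RegionalFeed[], // one backend per jurisdiction
): Promise<FeedItem[]> {
  // Fan out: every jurisdiction builds its own result set in parallel.
  const perRegion = await Promise.all(regions.map((regionFetch) => regionFetch(userId)));

  // Mix: join the N streams and re-rank. This duplicated work is the
  // constant-factor overhead mentioned above.
  return perRegion
    .flat()
    .sort((a, b) => b.score - a.score)
    .slice(0, 50); // top of the combined feed
}
```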
This is a simplification of course. A real-life global-scale social network is composed of at least 7 or 8 different layers with unique responsibilities. Hence there are possible trade-offs in each one of them.
But at the end of the day, the new system will either cost an order of magnitude more (again wasting global resources), be an order of magnitude slower (higher latency), or land somewhere in between.
Alfman,
re: p2p vs centralized.
The answer is somewhere in between.
All truly p2p networks have failed to become mainstream (Freenet, Gnutella, or even Kazaa or eDonkey), but the semi-centralized (i.e. distributed) BitTorrent is still very much active (and is even being used by large software companies, as you mentioned).
oiaohm,
re: ICCPR
Technically Facebook’s reading is probably correct. Any data that is public is potentially visible to all people in all locales, and a candidate for recommendation.
Unless it is ACL’ed to a limited group of people all in that locale, of course.
Yet, this reading might not match lawmakers’ expectations.
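The two readings, as a toy predicate (the Acl shape and field names are hypothetical):

```typescript
interface Acl {
  audience: "public" | "restricted";
  memberLocales: Set<string>; // locales of the people allowed to see the post
}

function replicableTo(acl: Acl, targetLocale: string): boolean {
  // Facebook's (assumed) reading: public means visible everywhere, so it is
  // fair game for replication and recommendation in any locale.
  if (acl.audience === "public") return true;

  // The narrow reading: a restricted post travels only to locales where
  // someone on its ACL actually lives.
  return acl.memberLocales.has(targetLocale);
}
```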
sukru,
I cannot agree. P2P networks not only became mainstream but were extremely popular all around the world. Today they are a fraction of what they were, because the RIAA and Hollywood did everything they could to stop both the users and the developers behind these networks, since their content was being distributed for free. Nevertheless I honestly think it’s fair to say the content owners are the ones who selected centralized services over P2P, because of business models. This change was not driven by demand.
sukru,
I still don’t think we can generalize claims in the abstract without considering the specifics of UIs presenting the data.
Take an email client, for example: as long as the UI itself is quick and snappy, a user may not notice or care that a message comes in 400ms or even 1000ms instead of 100ms.
Youtube is another example: it downloads content (comments/results/thumbnails) a page or two past the end of the screen. This way, when the user scrolls, it displays results immediately, hiding the network latency behind the scenes.
Sites like twitter and facebook can incorporate these things too (and they may already do so). Such features might need tweaking, but if done well most users won’t even notice that data from foreign contacts may be coming across the ocean.
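A sketch of that prefetch idea (fetchPage is a hypothetical API returning one page of feed items):

```typescript
// Fetch page n+1 while the user is still reading page n, so the ocean
// round trip overlaps with reading time instead of blocking the UI.
async function* feed(
  fetchPage: (page: number) => Promise<string[]>,
): AsyncGenerator<string> {
  let next = fetchPage(0); // kick off the first request immediately
  for (let page = 0; ; page++) {
    const current = await next;       // usually already resolved by now
    next = fetchPage(page + 1);       // start the next fetch before rendering
    if (current.length === 0) return; // no more content
    yield* current;                   // the UI renders without a visible wait
  }
}

// Usage (hypothetical): for await (const item of feed(fetchPage)) render(item);
```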
Alfman,
Again, there are not really many different options, at least from a high level perspective.
You have your user, you have your data, and you have your code. The goal in a good distributed system design is having the data as close to the user as possible.
In practice, where each individual piece lives dictates the methods that you can use.
As for YouTube, or any other streaming platform where the media is the main content, you want to minimize the distance to your video data. This is usually done by heavy use of CDNs, duplication, and caching.
The Internet is and has always been an asymmetric platform. Long-distance links are not only prone to latency; there are also shared throughput limits across continents that are not there for your connection to your local ISP.
(And of course there is a whole other chapter about CAP implications.)
Alfman,
Yes but whether the user perceives any lag is highly dependent on UI choices. It’s why I gave examples of ways that latency can be completely masked by background synchronization that doesn’t feel slow. A clever web app does not necessarily have to be laggy even across oceans, but the point I’m trying to get across is that this depends on what exactly the UI is doing. We cannot just make broad generalizations that things have to be laggy and that’s that.
Ok, but I still think P2P & federation had excellent properties for this. They got replaced because internet businesses preferred centralized services they controlled, and so they pumped tons of money into centralized services instead of P2P ones.
Still, what I’m referring to is what models are possible rather than what models our internet companies have invested in.
Alfman,
I think we are onto something.
Yes, an application can be tolerant of occasional latencies. However a good system tries to avoid those latencies in the first place. For a small audience, designing without regard to how data is organized is probably acceptable. For a billion-user system, every little detail is important.
I think a good analogy would be having too little RAM. If you know your program is going to thrash virtual memory all the time (i.e. a global-scale streaming platform without local CDNs), and you have reached the limit of RAM, you need to redesign to make the best use of the memory and cache hierarchy. Same with the Internet.
(Netflix, for example, consumes more than 10% of global bandwidth. Could they do the same if only remote links were used?)
The Internet is already peer to peer. It is just that the peers are among “equals”: ISPs, root DNS servers, so-called “Tier-1” networks all peer with each other and use dynamic protocols to re-organize in real time.
This has always been the case. When I was in university, my machine had direct access to the Internet and a static IP. I could leave it on 24/7 to run any service I wanted. However the department servers would never accept it as an equal peer. Yes, mail would be forwarded, but that was about it.
We had, for example, official mirrors for open source projects like Fedora. That came with access to “master” repos for rsync. I could not run it on my local machine, only on the main department web server. (Again, the department was a peer of Fedora, not me.)
Yes, at one point in time, for content distribution “p2p” networks were popular. However today things are different. But we seem to disagree on reasons. That is okay.
(Actually, in absolute terms, p2p usage grew: from about 50% of ~3,000PB/s in 2006, i.e. ~1,500PB/s, to 2% of ~300,000PB/s today, i.e. ~6,000PB/s, roughly a fourfold increase.)
I still think it’s application specific. 400ms latency is fine for watching a movie, but terrible for playing a video game. We shouldn’t rule out storing data in an appropriate jurisdiction without looking at the specifics of the application and possible mitigations first.
I believe netflix content is always hosted from within the regions that are allowed to have it, so there wouldn’t be a need to source movies from overseas. But if we ignore that, yes, I actually think P2P can handle it efficiently.
I want to clarify a point though: video services like netflix, youtube, vimeo, etc. have explicit contracts for distributing content in various regions of the world. These services are used with the intention of distributing videos to the public, and when they host content locally they do so with the express permission of the owners. But the personal user data that facebook holds is not really the same scenario. It’s hard for me to envision a scenario where netflix would be allowed to stream content to region X but not be allowed to host the same content in region X.
A minor nitpick on your use of “is”: the Internet used to be more peer to peer than it is today. The ISPs and devices of today place more emphasis on centralized services, leaving the original peer-to-peer aspects of the internet non-viable for many people in many places, unfortunately 🙁
It’s not uncommon for projects to have their own peering policies. I assume Fedora’s policy is reasonable and non-discriminatory and anyone contributing resources can apply to become an official mirror. Some networks are private, which may be by design, but the problem of course is when the market forms monopolies/oligopolies and the dominant leaders won’t peer with others because they want to control the market. On the one hand companies should be allowed to run their networks how they want, but on the other hand what companies do may not be in the interests of consumers and can systematically harm the free market.
Alfman,
I am probably failing to convey the concerns in distributed systems, specifically those at planet-wide scale.
However, I might recommend doing your own calculations. I looked for recent statistics on the global Internet, but was unable to find any public sources. Sorry.
There is, however, an older presentation from Jeff Dean. As a starting point, slide 24 here has some (slightly obsolete) numbers:
https://www.cs.cornell.edu/projects/ladis2009/talks/dean-keynote-ladis2009.pdf
sukru
ICCPR
Article 17
1. No one shall be subjected to arbitrary or unlawful interference with his privacy, family, home or correspondence, nor to unlawful attacks on his honour and reputation.
**Technically Facebook’s reading is probably correct. Any data that is public is potentially visible to all people in all locales, and a candidate for recommendation.
Unless it is ACL’ed to a limited group of people all in that locale, of course.
Yet, this reading might not match lawmakers’ expectations.**
That does not match the ICCPR. Just because someone releases something publicly does not give Facebook the automatic right to transfer it across jurisdictions. Note the phrase “unlawful attacks on his honour and reputation”: what a person can legally write in one jurisdiction, where it is neither a breach of privacy nor classed as an unlawful attack on honour or reputation, can be exactly that in another jurisdiction.
Technically facebook is legally liable in every one of these cases, because they have transferred data between jurisdictions without moderating it.
Also note the word “arbitrary”: transferring everything globally by default would be classed as treating correspondence in an arbitrary way. In reality you should restrict a post to the current jurisdiction by default and ask the person what other jurisdictions they want it transferred to.
Facebook never set their systems up to make sure they were not treating communication in an arbitrary way.
1. There needs to be a reason why communication is transferred, beyond just making the system perform better. Article 17 does not allow arbitrary interference with communication.
2. The transferred communication had better not be causing a legal problem, because under Article 17 you are meant to check for that.
3. “Unlawful interference” with communication is also fun. It means legal jurisdictions are free to define rules on what communication can be transferred across their boundaries. If a jurisdiction rules that a particular type of communication is unlawful, then transferring it into that country is unlawful interference with that country’s communications. What exactly is facebook doing to make sure that what it transfers internationally is a legal message?
For a single sentence of text, the legal effect is huge. And no, that single sentence does not put the burden on the person posting the communication to tag it 100 percent correctly.
Yes, Article 17 also means the trolling people suffer on facebook and other social media is in lots of cases itself a breach of Article 17. And this is only one sentence out of the ICCPR; there are other sentences that apply to an operation like facebook which they are not obeying either.
The reality is facebook is only being asked to obey what the ICCPR already had in place. Locking communication to a jurisdiction can also be important to prevent interference with court cases and the like.
Yes, the idea of pushing it onto the user to ACL the content does not conform to the ICCPR. Facebook moderators need to ACL the content. Wait: facebook has so many messages being sent around globally that they don’t have enough personnel to do this.
For a person posting a message within their own legal jurisdiction, facebook would be within its rights to presume they know their own jurisdiction’s law and are obeying it. The problem comes once that message is transferred to another legal jurisdiction.
sukru,
I just think we’re talking to different points. I fully acknowledge the physical reality of latency as you’ve mentioned and as presented on slide #24. I never meant to bring that into contention. My point was more about being able to build UIs that mitigate the effects of lag in applications. It is wrong to conclude that remote data implies a bad application experience without looking at what we can do to mask UI latency in specific implementations.
Out of curiosity I’ve run some pings from NY to data centers in Europe (some of these are actually less than I expected)…
Amsterdam 88ms
Dublin 90ms
Frankfurt 92ms
Helsinki 116ms
Kyiv 122ms
Lisbon 105ms
London 79ms
Madrid 95ms
Paris 83ms
Prague 100ms
Stockholm 110ms
Vienna 112ms
Warsaw 106ms
Zürich 104ms
Asia Pacific…
Hong Kong 222ms
Singapore 226ms
Tokyo 184ms
This is the cape town science center at the tip of Africa…
Cape Town 261ms
Obviously YMMV.
My Australian associates had some feedback for me.
They barely missed FB when it was banned.
The main complainers were media types, who lost online profiles and the associated advertising streams.
The wider general public didn’t even notice it missing; many have never gone back, or now interact in a greatly reduced manner.
The reasons we can’t leave facebook are:
> You have family & friends’ connections (A workaround is just to use messenger and point your friends/family where to message)
> You have a business page and you are running ads to sell your products or someone else’s
I am no longer using Facebook btw, except messenger.
Facebook is sticky, and I do have to admit there are some good parts. But those good parts have nothing to do with any facebook proprietary technology; they are used because facebook is used. Facebook groups are fine. There are groups I’m begrudgingly part of because they are there. The groups used to be on yahoo, but moved as people left the yahoo system. So, they’ve done it before, and they can do it again. They just need a good group-based system with good security and privacy that exists for free.
Bill Shooter of Bul,
I’ve been saying the same for a long time. The tech isn’t the limiting factor so much as the userbase is. Sometimes I’m reminded of the search engine startup that was built by ex-google engineers; they had relevant skills and technology but they couldn’t capture users.
https://wallstreetpit.com/345-new-search-engine-cuil-launched-by-ex-google-engineers/
The network effect is a big catch-22.
Since people are sheep, you could get some internet celebrities on board (mrbeast, whatnot) and people will follow. The problem for a startup is there simply may not be the money to snipe celebrities off of much wealthier competing platforms.
If you’re a normal startup with no cash cow, then maybe you can just fake a userbase. This strategy may actually be more common than people realize…
“I Made a Fake Multiplayer .io Game”
https://www.youtube.com/watch?v=_X3DngmZDzs
Is it possible to kick start a social network with chat bots? Intelligent discussion would be hard of course, but for a domain like politics where it’s mostly trolling I think the answer might be yes, haha.
The difficult part of any of this is that facebook already exists, and everyone already has an account. It’s good enough. Its bad behavior isn’t enough for most people to move their usage to anything else. I don’t know how to start a migration elsewhere.
Not a bad thing. Social media is a disease.
I can’t wait for this to happen. So many companies have moved to facebook, and even public service announcements are being made on FB, that the panic will be massive once FB is blacked out. This might teach people how good their own homepages really are, instead of relying fully on FB.
I really hope it goes black… The outcry from ordinary people and corporations will call for extra popcorn.
brostenen,
I’ve seen this too and it is obnoxious. Local governments sometimes put resources on facebook that the public cannot access without an FB account. I don’t have an account and don’t want one, but I still want to access the government resources. I did try, but apparently facebook’s registration has gotten very picky with phone numbers and it wouldn’t take my legitimate number, so I was literally blocked from accessing my town’s facebook page even though I tried. I was told that happens sometimes, and that you just have to use someone else’s number when it does. What a garbage heap.
@Alfman
Yes I agree.
But of course this is a problem with local bureaucracies rather than FB; FB is just acting as the facilitator, though of course it shares the blame.
I suppose there are just as many doxing megalomaniacs running local government services as there are in FB. It’s become much worse since COVID: by forcing these pathways they can track and trace everything, and it affects both clients and employees.
One minute you are submitting a complaint about your local government services; minutes later you are getting a threatening letter from some bureaucrat you’ve never heard of or spoken to!
Personally, I don’t get it. I don’t get how government authorities, whether local or federal, get away with obfuscating their responsibility to protect the identity of the general public, but they do!
cpcf,
Meh, I know.
I think most people in positions of power already have their minds set on the policies they want to push. The events du jour don’t change their minds, but they change what they can get away with.
Oh no!
Anyway, I hope they will move out and something local & better (at least faster, because Facebook is so damn slow) will move in.
It’s not hard to understand what will happen: if Facebook shutters its services, users will move to something local. There’s nothing Facebook provides that can’t be replicated.
I am so happy to know that Europe has taken a stand against Facebook. This is a step in the right direction.