Beta users of SpaceX’s Starlink satellite-broadband service are getting download speeds ranging from 11Mbps to 60Mbps, according to tests conducted using Ookla’s speedtest.net tool. Speed tests showed upload speeds ranging from 5Mbps to 18Mbps.
The same tests, conducted over the past two weeks, showed latencies or ping rates ranging from 31ms to 94ms. This isn’t a comprehensive study of Starlink speeds and latency, so it’s not clear whether this is what Internet users should expect once Starlink satellites are fully deployed and the service reaches commercial availability. We asked SpaceX several questions about the speed-test results yesterday and will update this article if we get answers.
For what is essentially still a service in development, this is pretty impressive.
What is really amazing is the latency numbers: 30-40 ms. That’s pretty impressive.
Greg,
They are much closer than previous satellite internet services.
Starlink in low earth orbit ~ 550 km
Hughesnet in geosynchronous orbit ~ 35,786 km (65X further away!)
phys.org/news/2019-05-starlink-satellites-orbiting-altitude-space.html
http://www.hughesnet.com/about/how-it-works
en.wikipedia.org/wiki/Geosynchronous_orbit
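That altitude gap translates directly into a latency floor. A quick back-of-the-envelope sketch, assuming a bent-pipe path (user → satellite → ground station and back, four altitude-length legs per ping) with the satellite directly overhead — real paths are longer, so actual latency is higher:

```python
# Minimum round-trip propagation delay from orbital altitude alone.
# Assumes the satellite is directly overhead (best case) and a bent-pipe
# path: up, down, and back again = 4 altitude-length legs per ping.

C_KM_PER_S = 299_792  # speed of light in vacuum, km/s

def min_round_trip_ms(altitude_km: float) -> float:
    """Best-case ping floor for a bent-pipe satellite link, in ms."""
    return 4 * altitude_km / C_KM_PER_S * 1000

print(f"Starlink (550 km) floor:      {min_round_trip_ms(550):.1f} ms")
print(f"HughesNet (35,786 km) floor: {min_round_trip_ms(35_786):.1f} ms")
```

So GEO satellites start at roughly half a second of round trip before any processing happens, while LEO’s floor is single-digit milliseconds — consistent with the 31-94 ms measured in the article.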
Apparently these satellites are so close they can be seen by the naked eye when they reflect sunlight, which has been its own controversy. Supposedly they’re installing sun shades on new satellites to prevent them from ruining the sky.
https://spaceflightnow.com/2020/04/28/spacex-to-debut-satellite-dimming-sunshade-on-starlink-launch-next-month/
Satellite internet isn’t as scalable, but still, maybe it will be perfect for rural areas with relatively few people and a lack of terrestrial service.
It’s meaningless. It says nothing about stability over periods longer than around 10-20 seconds.
sj87,
I suspect the technology will probably be stable, if not now then by the time the beta is completed. IMHO the bigger question is whether demand will outstrip supply (of bandwidth).
Rubbish. A corporate WLAN can keep your connection whilst you walk 500m through the building, connecting to 5-10 different APs along your journey, with very few connection issues and very little drop in speed.
If companies like Extreme Networks, Cisco, Juniper, Aruba etc. can do it, I’m sure you can scale the principle up and do the same with satellite internet.
The123king,
I’m playing devil’s advocate here, but there are additional obstruction conditions that are more problematic than the office scenario. If you’ve ever had DirecTV you’ll most likely have experienced weather-related outages, as well as airplanes and birds causing brief yet unmistakable drops. We actually lived by an airport where this happened quite a bit, haha.
Now it’s not altogether unsolvable: you might build the service with more physical redundancy, using more satellites and/or ground stations at the same time to reduce the chance that both are simultaneously obstructed. But such redundancy would either have to kick in after a connection issue has already occurred, or it could be continuous, at the cost of doubling bandwidth requirements, which Starlink can’t afford to do, at least not for everyone.
Given that Starlink has to cope with the possibility of a satellite going offline, ground stations must be programmed to accommodate such failures by locking onto another nearby satellite, but I’m curious how long it takes to detect and correct the failure…?
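One common pattern for the detection side is a heartbeat timeout: declare the link dead after a few consecutive missed beacons, then retarget. A purely hypothetical sketch — the beacon interval and miss threshold here are illustrative, not anything SpaceX has published:

```python
import time

class LinkMonitor:
    """Declare the link dead after `missed` consecutive beacon intervals
    with no traffic, so the terminal can retarget another satellite.
    Parameter values are illustrative, not Starlink's actual settings."""

    def __init__(self, interval_s=0.1, missed=3):
        self.interval_s = interval_s
        self.missed = missed
        self.last_seen = time.monotonic()

    def on_beacon(self):
        # Any received beacon resets the failure timer.
        self.last_seen = time.monotonic()

    def link_failed(self, now=None):
        # Link is considered failed once the silence exceeds the budget.
        now = time.monotonic() if now is None else now
        return (now - self.last_seen) > self.interval_s * self.missed
```

With these numbers, worst-case detection is ~0.3 s, after which the terminal would still need time to re-acquire a neighboring satellite, so the user-visible gap depends on how much of that acquisition can be precomputed from known ephemeris data.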
This could really make a difference to de-facto monopoly regions in the United States.
Many areas are served by a single Internet provider, and they either have terrible Internet service quality, terrible customer service quality, or very high prices (or a combination of these).
Now practically the entire globe can sign on to the Internet from anywhere they want. This could potentially be a gamechanger.
sukru,
You are right about the monopoly situation. I’ve got some family members with no access to broadband at all, and for them this could be a game changer. However there are fundamental bandwidth limits that prevent the service from scaling to large numbers of users in a region. It isn’t like cables or antennas where you can just add more lines and access points to increase capacity. With satellite, the bandwidth over a given region is fixed, and the more subscribers there are the less bandwidth everyone gets, meaning service in big cities will be worse than in rural areas.
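A back-of-the-envelope sketch of that fixed-capacity sharing — the 20 Gbps per-beam figure is my assumption for illustration, not a published Starlink spec:

```python
# Per-subscriber throughput when a fixed satellite beam capacity over a
# region is shared. The 20 Gbps capacity is an illustrative assumption.

def per_user_mbps(cell_capacity_gbps, subscribers, oversubscription=1.0):
    """Average Mbps per active user sharing one beam equally.
    `oversubscription` models users who aren't all online at once."""
    active = subscribers / oversubscription
    return cell_capacity_gbps * 1000 / active

print(per_user_mbps(20, 1_000))    # rural cell: 20.0 Mbps each
print(per_user_mbps(20, 100_000))  # dense city: 0.2 Mbps each
```

The same beam that comfortably serves a thousand rural subscribers collapses under city-scale numbers, which is why the economics point at low-density areas.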
Here’s what Starlink themselves say…
https://arstechnica.com/information-technology/2020/03/musk-says-starlink-isnt-for-big-cities-wont-be-huge-threat-to-telcos/
So, it seems like this won’t be the technology to displace ISP monopolies. For that we should start by allowing municipalities to offer internet. Everywhere that happens the competition suddenly kicks in and everybody gets very high speeds for very low costs. The main problem is cutting out the corruption…
Internet from space is really pointless: 11Mbps to 60Mbps is nowhere near enough to justify this. It is much better to spend the money on covering the land with good fiber and mobile connections.
You could mandate that every time the ground is dug up to put down power cables, you should also put down a fiber, even though it is not strictly used yet. Then when a business or a home owner orders an internet connection, the fiber is there, ready to connect. Not some flimsy satellite connection. This is actually how it is done in parts of Denmark, the power company owns the fiber network and you can get fiber connections rather cheap even in rural areas.
Denmark and the Netherlands are quite unique: there are no mountains, the soil is very soft, and there is a lot less stone in general. Perhaps the Swedish model is better suited for the US, where the government gives tax rebates to companies that put out fiber. Sweden has the best fiber broadband coverage in the world, and only South Korea has higher average speeds.
Not in the US, where the population density is low and internet providers are local monopolies with implicit non-compete agreements and no incentive or legal mandate to invest.
Fiber is mostly useful in dense areas. Unless it is heavily subsidised (like in Europe), no company wants to invest in fiber in medium- or low-density areas, because it is not profitable at all. In all other areas, it makes no sense, unless it is only to replace broken copper cables. And that is where satellite internet providers shine.
The largest Swedish region, Upper Norrland, has a lower population density (3.3/km2) than all US states except Montana (2.7/km2), Wyoming (2.3/km2) and Alaska (0.6/km2). Yet it has almost universal fibre coverage except in the highest mountain areas. Granted, it is also the second richest area per capita in all of Sweden, after only the capital region (guessing because of all of the mines and industry).
It has become very common, recently, in the US to use the excuse of “low population density” (or at times the dog whistle re “high social diversity”) as the reasons why their country is so hopelessly behind the rest of the industrialized world in key areas.
Some Americans think that everyone in Europe lives packed like sardines in just one city where everybody is white.
Interesting. Here’s hoping Starlink will be available outside of the US. I’ve started looking into providers independent from my country’s government after the recent internet blockage in Belarus…
At some point I read that SpaceX was thinking about satellite-to-satellite communication. Among other things, this would allow having ‘nodes’ in space with some sort of cache, to only have 2 ground-space trips instead of 4 if satellites are only used to relay the data back to Earth.
That actually makes total sense if one looks at the next step: server farms in orbit:
– no cooling cost
– no real estate
– free electricity
– international domain regarding data location
– mega scalable, when obsolete just send a new one and de-orbit the old one
– weight of electronics and data storage going down
Issue: if this arises, SpaceX would have such a head start (they would be the only space-ground service provider) that it would become “the internet company”
Ok, now maybe I am just extrapolating too much but I am a fan of Sci-fi 🙂
Have you skipped physics class? The vacuum of space is a thermal insulator, so one of the problems for all spaceships and satellites is heat rejection. Especially since the sun provides a lot of additional heat.
I don’t deny there are technical problems (without challenge, engineering wouldn’t be fun, would it?), but I do not believe cooling is a major one.
Indeed there is no convection or conduction to cool through the vacuum of space. All the heat is radiated. Therefore if you shield from heat sources (sun, Earth) and expose a heat sink toward interstellar space (say 4 K), you can cool down a satellite to extremely low temperatures. Following this principle the James Webb telescope will operate at 7 K (-447°F, -266°C). OK, it has a fantastic heat shield and will not orbit Earth but the Sun, etc… again, I understand it is not that simple.
Another problem: memory corruption due to high-energy particles, among many others. I am working on it from time to time 🙂 (for aircraft at high altitudes)
bipbip,
It certainly would be possible, but a telescope is a far cry from a data center in terms of power consumption & dissipation.
We talked about datacenters in space here:
http://www.osnews.com/story/130845/european-cloud-project-draws-backlash-from-us-tech-giants/
It’s all technically possible, but at scale you would need an extreme megastructure to provide the necessary power and cooling. On the one hand, there’s no shortage of space, so maybe it wouldn’t matter. But network connectivity is always going to be a problem. A geostationary orbit automatically implies bad latency. If it’s in low earth orbit, latency will vary depending on its position throughout the day. The enormous megastructure would be easily visible when it blocks the stars, reflects sunlight, or casts a shadow. Bandwidth could be a problem using conventional RF links, but I think it might be overcome with laser communications. You’d need some new technology to scale up the bandwidth, but I think R&D could make it happen, maybe using a laser array.
There’s no doubt it’s possible, but given the latency issues and the enormous costs to launch anything into space, I don’t think it will ever be a competitive option. Maybe someday there will be people living in space, and they’ll want their own data centers.
You can’t dissipate heat in space using convection, but you can using radiation. All you need is a nice heatsink on the “dark” side of the satellite; in fact you can get much better dissipation via radiation in space than on Earth.
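The Stefan-Boltzmann law puts rough numbers on what such a radiator would need. A sketch under stated assumptions — the 1 MW load, 300 K radiator temperature and 0.9 emissivity are my illustrative picks, and it ignores solar and Earth heat absorbed by the panel:

```python
# Rough radiator sizing via the Stefan-Boltzmann law: P = eps * sigma * A * T^4.
# The load, temperature and emissivity are illustrative assumptions; solar
# and Earth-shine heat input on the radiator itself is ignored.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(power_w, temp_k, emissivity=0.9):
    """Radiator area needed to reject `power_w` purely by radiation."""
    return power_w / (emissivity * SIGMA * temp_k ** 4)

# A 1 MW orbital datacenter radiating at 300 K needs roughly 2,400 m^2.
print(f"{radiator_area_m2(1e6, 300):.0f} m^2")
```

So radiative cooling works, but for datacenter-scale power the radiator alone is on the order of half a football field per megawatt, which is part of why the “megastructure” framing above seems right.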
Partially; I don’t see the point of this service if you are not in deepest, darkest Africa or South America or Antarctica, and then who pays in that circumstance for what is a very, very high-cost solution that needs to be amortised over huge user numbers and still only delivers year-2000 performance?
In the USA, Europe or other places with a reasonable penetration of technology, comparatively dirt-cheap mesh or FTTN can get a user similar or better bandwidth with far higher reliability at a fraction of the cost, and do so without polluting the skies for astronomers and observers!
cpcf,
I also prefer ground-based solutions, but unfortunately we’re not really there yet. Companies generally focus on profitable areas with higher customer density rather than rural areas that are in need. There are still areas with no residential broadband, including my parents’. The property they rent had DSL before AT&T decided to terminate huge swaths of the DSL network. I assume whatever government incentives they had to build it in the first place are long gone, and as I understand it a significant number of US DSL-connected homes are in the same boat. For them, they have even fewer options than two decades ago.
http://www.dslreports.com/shownews/ATT-Is-Letting-Millions-of-Unwanted-DSL-Users-Rot-on-the-Vine-134124
To me, these are the people who benefit from Starlink, not because it’s better, but because Starlink will actually have them as customers.
Are they regional city based?
Lots of regional centres in my part of the globe are moving towards community based cooperative mesh networks, local councils are taking the lead as they relate the local economic and social impacts of poor connectivity.
Mind you, they didn’t get started voluntarily, it’s been forced on them after years of small regional towns being neglected by big business. But it’s working out so well lots of regional centres are starting to take notice even if they are serviced by big business, it’s making broadband prices plummet.
The only downside I see, many are resorting to low cost Chinese hardware as the technical solution, and I fear that may have some impact down the line.
cpcf,
In the Grass Valley region of California. At my recommendation they had site surveys done by WISP providers; there were at least 3 WISP providers, but none reached them.
If I lived there I might try to start my own WISP, although it’d probably cost a fortune just to get myself an internet trunk.
Yeah, I’d be tempted to use my own Linux distro to build a mesh network. But off-the-shelf access points are probably a lot cheaper.
I think the reason local/regional governments are getting onto this is that many already have the backbone to the office that is either underutilised or easily upgraded. I suppose it’s just another thing they can turn into a profit centre, but one that has tangible benefits.
I’d even be looking into it if I was a government school, but of course they may be on a specially moderated network in some regions.
A while ago I read of one small town that broke even with only 150 or so homes taking up the service. Not many homes are needed in ADSL or VDSL range; just a 3-mile radius and everybody gets reasonable broadband.
cpcf,
I’m not sure if you are familiar with the way ISP monopolies in the US have successfully lobbied to interfere with municipal internet providers in many states. Many local governments that had plans in the works were promptly blocked, which is what I was alluding to in my response to sukru above (…we should start by allowing municipalities to offer internet).
Anyways I just looked it up and apparently only 22 states have municipal internet roadblocks in 2020, which is progress!
https://broadbandnow.com/report/municipal-broadband-roadblocks/
The article supports your suggestion that local government community internet is useful.
I think the most useful activity for Starlink, OneWeb or others will be breaking the geo-political blockades of China, Russia or various other totalitarian regimes.
How does this system compare to the current satellite Internet offerings?
codewrangler,
+1, if anyone gets a chance to review this, please do!
Anyways, the biggest difference will be latency. Traditionally satellite internet is just awful due to latencies of 600ms+. Starlink latency appears to be quite reasonable, based on the speed tests linked in the article.
HughesNet plans range from $60 for 10GB to $150 for 50GB, which is quite bad for broadband, but that’s what you get for the privilege of using satellite. Starlink has a similar price point, but I’m not sure they’ve mentioned anything about data caps yet? If it’s ~100GB, then it seems like a good deal to me (is this too optimistic?). The service is mostly going to be for rural customers though, because there won’t be enough capacity for cities.
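The effective cost per GB makes the comparison concrete. The Starlink line below is entirely hypothetical ($99 price, 100 GB cap) since neither a cap nor final pricing has been announced:

```python
# Effective cost per GB of satellite plans. The HughesNet figures are from
# the comment above; the Starlink price and cap are hypothetical.

def cost_per_gb(price_usd, cap_gb):
    """Monthly plan price divided by the monthly data allowance."""
    return price_usd / cap_gb

print(cost_per_gb(60, 10))   # HughesNet low tier:  6.0  $/GB
print(cost_per_gb(150, 50))  # HughesNet high tier: 3.0  $/GB
print(cost_per_gb(99, 100))  # hypothetical Starlink: 0.99 $/GB
```

Even a modest 100 GB cap at a similar monthly price would undercut GEO satellite plans several times over per GB.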
Yep, I agree about the pricing. At $60 for 10GB or $150 for 50GB that is priced for a New York finance trader or a Silicon Valley manager having a weekend in the country, you won’t get much regional take up at that price.
For me in our area, most of the targeted users are or would have been on plans that might be ADSL2+ equivalent of 12Mbps to 25Mbps with unlimited data, at typical pricing of $50 to $80 per month. Just barely streaming-capable, but functional, and data allowance is almost irrelevant at those speeds anyway unless you’re locked in a dungeon downloading 24×7.