Is there an emulator to run VxWorks on Windows to develop for it? More specifically for the MIPS processor?
I can’t even imagine how stressful it must be to know that your code in the OS powers a machine that, if it breaks, means millions of dollars lost. The interviewee sounds like he can handle the pressure.
Nice article.
Does anyone have a link that explains the architecture of the OS they’re using in detail?
It seems that PowerPC is used by NASA in all of its spacecraft and satellites. According to the article, a PowerPC processor running at 150 MHz was enough to accomplish its task. What about the Pentium and AMD processors? Is PowerPC the better processor?
I guess the myth of Pentium processors running at 4 GHz was a lie. Am I right?
Some tech advice, please!
thanks.
-2501
Read the article, esp. the last question & answer. He explains why sticking to a well-known piece of hardware/software makes sense.
flo
I think that the PPC is better overall; however, the Pentium and AMD processors have so much more code developed for them that moving off will take a huge push.
And I believe there is a reason they used such a ‘slow’ processor. Most chips have their parts crammed in as close together as possible, to make the transistor count as high as possible. However, those chips were only ever meant for use on Earth. In space you have to deal with cosmic rays, which are a big problem for sensitive electronics. So space-grade electronics must be built with their internals more spread out, and are thus slower.
Then again, this was told to me about 6 years ago, so I could be way off base.
1. VxWorks is known for being buggy, poorly written and difficult to use.
It’s popular because it’s comparatively cheap to license, and there are a lot of people who know how to use it. The development tools are also popular in the industry, and VxWorks supports a large number of processors, peripherals and standards.
2. Embedded PowerPCs consume much less power than x86 parts, and they’re much simpler to deal with from a hardware point of view.
x86 has so much legacy cruft that designing with it is a real pain. RISC processors are usually easier to design with.
3. Radiation hardening is an issue and not many people make rad hardened parts.
thanks for the answers!
I was talking to a friend of mine about it, and he told me that at JPL you can still find Amiga computers supporting satellite missions and the like, running just a 50 MHz processor. (I love Amiga.)
It is amazing that they can achieve their goals with this kind of hardware these days. I guess I was wrong about the PowerPC processors.
-2501
The choice of processor is explained in the article. Basically, to put a CPU into space, it has to be resistant to power surges caused by ambient radiation (there is little to no atmosphere to filter it out). Making radiation-hardened processors takes a long time, so the state of the art in radiation-hardened processors is pretty old technology. The Mars Explorer mission actually used a 25 MHz Rad6000, which is a CPU derived from the IBM POWER1.
I don’t really see what you’re trying to say about the “myth” of a 4 GHz P4. P4s certainly do run at 4 GHz (well, 3.8 GHz or whatever the top-rated part is these days). The reason they can get away with such a slow CPU (the Rad6000 does 25-30 MIPS) is that it really doesn’t need to do all that much work. It’s a data collection machine; processing is done back at NASA. So it basically just needs to move data around, buffer it while waiting for a transmission window, run control functions, etc.
From what I read earlier, it was really IBM’s processor that saved the day. They lucked out that it reloaded the OS by design; otherwise this would not have worked.
From what I remember, NASA also uses hardened i486s for the Space Shuttles. And the main CPUs in the Voyager probes are 4-bit. Woo.
“I can’t even imagine how stressful it must be to know that your code in the OS powers a machine that, if it breaks, means millions of dollars lost. The interviewee sounds like he can handle the pressure.”
Eh, it’s really not that bad. Probably one of the better things to ever have to deal with. It would be far more stressful to work on something where people’s lives are at risk, which is pretty much anything around you on this planet that has a computer in it.
Or, if you want to think of it just in terms of cost: if you wrote code in a product that sold millions of units and made a mistake that caused a recall or similar, that would cost way more than a mistake in this mission. If the Furby had a software flaw that caused them all to be recalled, that would be a much more expensive mistake.
The biggest difference is the profile of this mission. But then, if you screwed up code in, say, a jetliner and it crashed, that too would become high profile.
– NASA uses PowerPCs because they require a low-power, stable processor that has been proven; older and more efficient sometimes works best in their machines.
– POWER, IMHO, is the best CPU out there, while I think SPARC comes in second. I also like MIPS, since it was mainly designed to run in a multi-CPU environment (I think). If only the Alpha weren’t dead. Damn you, HP. The goal was to keep it simple and efficient, and that suited their needs. x86 CPUs clearly have too many problems.
– I think NASA would not develop its own OS (although they have before, and some Linux distros), since if there is an OS problem they can blame the company that makes it. Also, it helps support private industry, which is government policy.
What is this “myth of pentium4 processors running at 4GHz” that everyone keeps talking about?
I think the myth was just brought up by 2501. No one else seems to have mentioned it.
(From Memory – may be an old Ars article)
CPUs used by NASA need to be physically validated against the hostility of space and certified using formal methods (i.e., proof that they are mathematically correct). This process takes at least 5 years, and it is not done by IBM. NASA bought the CPU implementation (VHDL?) and had it validated by a subcontractor at their own expense. I guess that they also create models of the exact CPU power consumption per instruction.
NASA also tries to reuse proven designs. This is why you can still find magnetic memories from the ’70s in NASA shuttles.
If NASA can do incredible things with such “old” hardware, imagine what it could accomplish with new technology.
But what about ESA? Does anyone know what kind of hardware they use? Like on Mars Express or SMART-1, for instance?
“If NASA can do incredible things with such “old” hardware, imagine what it could accomplish with new technology.”
Not a lot more. You are being deceived by the advertising from the computer trade that tells you to buy this week’s newest and latest product.
Top priorities in the real world are reliability and a system that is very well understood because it has been around for a long time. Cutting edge hardware and millions of lines of new code are more suitable for big boys’ toys.
What if some alien creatures encounter this flying piece of 4-bit junk? What are they going to think about humans? NASA should’ve bought QNX. Now that’d be cool.
BTW, P4s run at over 6GHz if you put really ridiculous coolers onto them . There was an article on HardOCP yesterday. Neat stuff.
I assume they used liquid nitrogen to cool it, but do you have the link to the HardOCP article? Was it benchmark-stable at 6 GHz? I checked their archive and didn’t see anything posted about it yesterday.
thanks again for all the answers!
When I talked about the myth of the Pentium processors, what I wanted to say is that for years they led consumers to believe that performance = clock speed, and that was wrong. Apple tried to fight this assumption until they succeeded. I have also heard that the new Pentium processors won’t use clock speed in their names.
nice article.
– 2501
“what I wanted to say is that for years they led consumers to believe that performance = clock speed, and that was wrong.”
“I have also heard that the new Pentium processors won’t use clock speed in their names.”
AMD did this too with the Athlon XP and later processors (Sempron, Athlon 64, etc.), using a numbering scheme to denote performance instead of clock speed, e.g. Athlon XP 2500+ instead of Athlon 1.83 GHz.
ESA uses a processor developed for space applications known as the ERC32. It is a SPARC V7-based processor; the SMART-1 probe uses it. ESA uses other CPUs too…