Only after learning what is under the hood can your programming skills blossom, so take a look at how computers first worked and how they have evolved through the years. Read the article at InformIT.
This is a great article. Funny thing is… I seem to remember being taught all this stuff in the engineering core and then systematically forgetting all of it. It is good to be reminded. I am glad it's here.
Oops. I meant “EE school.”
Funny how something which refers to “the early years” makes a lot of assumptions which you couldn’t make in the early years. Eight bits make a byte? In the early days you used words, and if you used bytes, their definition was anything but firmly set. And ASCII? ASCII is a new kid on the block. Note how eight-bit ASCII not only isn’t an American Standard Code (only seven bits are defined in vanilla ASCII, leaving the last bit in a standardisation flux), but ASCII wasn’t defined until the late sixties. Before that, lots of different character sets were in use, some as short as five bits, since there is a lot of room even in seven bits.
The first two sections are pretty boring; I know how a keyboard and two’s complement work, thanks. Then you finally get to the good part and… section 3 is missing. 404.
So it’d probably be a much better article if they had kept the “Intro to Computing 101” and “History of Programming Languages” parts separate. That, and if they fixed the links. I’ll check again later, I guess.
http://www.levenez.com/lang/
This shows a quite complete family tree of programming languages.
This seems like 8th grade material to me. Maybe I am old and people don’t know this stuff anymore, which is sad.
…what about BeOS? We’re not being represented!
…6 comments (now 7 or so) about this, a zillion about “SCO and IBM,” “Microsoft and Linux,” and blah blah blah.
This is a reactionary world. No need to think (except when courts are involved).
BTW, there was a time when people got paid big money to run around in a hot room changing tubes so the payroll checks and reports would print. Almost all code was “home-brew.” In those days, even “binary” had a mystique about it.
How can they claim to document the early years of programming languages without even mentioning Lisp? The languages they did mention are now mostly dead, yet Lisp is still alive (though not quite thriving). New features in Java, C#, Python, Ruby, and friends have been in Lisp since the 1960s.
Those people who do not know Lisp history are destined to repeat it… poorly.
…as we’ve all commented, and thanks to whoever pointed out that ASCII is 7 bits, not 8. A word was whatever the computer handled in processing; its size was not statically set.
We never heard of functional languages at all there. I teach binary arithmetic to 6th graders at times. I do believe that a base knowledge of these things is critical to those who want to be “great programmers.”
I never had the opportunity, due to my age, to be swapping tubes, but I did compete for FORTRAN processing (I hate cards and fixed-width coding) on a timeshare system at Uni. My first experience with Unix was in 1984, and for certain many things are missed even for a 101 class.
HOWEVER, the article is just fine; we just shouldn’t be claiming titles like these for the limited content of the articles themselves.
—
3lixyqueue
“…what about BeOS? We’re not being represented! ”
Ummmm…..when did BeOS switch from an OS to a programming language?
btw, I remember reading about Russian computers having 9-bit words… I really wonder how they worked =)
Word sizes that are multiples of 8 bits are not a necessity for computers.
There have been 18-, 24-, 36-, 48-, and 60-bit word machines through history.
Many minis were actually 18-bit machines, so half a word would be 9 bits, which is what they used for the atomic data unit.
BeOS is everything and anything to everyone and everybody, f00l!
Unisys still makes Clearpath IX servers which use the old Sperry/UNIVAC 1100 architecture (36-bit words, with ASCII characters stored in quarter-words or 9-bit bytes).
When I worked at NWA (1993 through 2001), we used that hardware, and we still used a 6-bit character set called FIELDATA. 🙂
If 7*6 really means 7+7+7+7+7+7, then 7*10000000 means 7+7+7+… (10000000 times), and it should take much longer than 7*6. But I didn’t notice any difference! In the Intel specification, the time of the IMUL operation does not depend on the operands’ values.
Strange…
And programming language history without Algol?
Strange…
Andreas