“As computer games became more and more complex in the late 1980s, the days of the individual developer seemed to be waning. For a young teenager sitting alone in his room, the dream of creating the next great game by himself was getting out of reach. Yet out of this dilemma these same kids invented a unique method of self-expression, something that would end up enduring longer than Commodore itself. In fact, it still exists today. This was the demo scene.”
A lot of demos are pretty cool in their own right, but when you consider how limited the hardware they run on is, they become simply amazing.
What’s more, a demo isn’t like a PowerPoint that throws some pre-rendered material together with a few effects. The most amazing part of demos, for me, is that they produce something so artful from very, very technical code. If it were just a “slideshow”, artistic style would be a given, since the time would be spent on the art and not the code; but a demo is almost all code, so it takes a herculean effort to bash metal that close and still produce something artistic.
Yup, read this: the French demo scene on the Atari ST, which had fewer hardware features than the Amiga and thus needed more software work:
http://www.codercorner.com/blog/?cat=8&paged=2
http://www.codercorner.com/blog/?cat=8
Kochise
But also consider that those who programmed those “limited” systems (think of the C64, the Amiga 500 and its successors, the various Atari ST and pre-ST computers) could use simple tools. Tools that we cannot use anymore.
Simple task: put a pixel on the screen
Simple solution: write some data to a specific memory location, often in assembler (faster is better); see the sketch after this list
Also simple, but often less efficient: use a predefined command of the supplied programming language, like putpixel(xpos, ypos, color), in a higher-level language that developers today would already consider the lowest level (as if assembler did not exist)
Today’s common solution: I cannot even describe it, because depending on the system you will have to deal with a heap of abstraction layers, libraries and other conglomerates of code that add complexity and remove efficiency (or even the ability to do something more efficiently). It’s not easy anymore.
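To make the contrast concrete, here is a minimal C sketch of the first two options, assuming a flat framebuffer such as the real-mode DOS VGA mode 13h screen (320x200, one byte per pixel, memory-mapped at 0xA0000); an Amiga’s bitplanes are laid out differently, but the principle (the screen is just a known block of memory) is the same, and the putpixel() wrapper here is only illustrative:

#include <stdint.h>

#define SCREEN_WIDTH 320
/* Real-mode DOS address; on a protected-mode OS this memory would first
   have to be mapped in, which is exactly the point being made above. */
#define FRAMEBUFFER  ((volatile uint8_t *)0xA0000)

/* The "predefined command" flavour: a one-line putpixel() that is nothing
   more than the raw memory write underneath. */
static void putpixel(int x, int y, uint8_t color)
{
    FRAMEBUFFER[(long)y * SCREEN_WIDTH + x] = color;  /* one store, no driver */
}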
Sometimes, when I need to program something simple that involves simple hardware, I find that it becomes complex and complicated. Having done the same thing 20 years ago, I don’t feel very efficient jumping through hoops today. For example, switching some relays via the parallel port used to mean just writing specific data to that port; inport() and outport() were even supplied by the system’s library, relying on BIOS calls. Today? There isn’t even a parallel port! USB, converters, microcontrollers, firmware, connectors, adaptors… it’s not fun anymore, and definitely not simple.
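For reference, a rough sketch of what that used to look like, assuming a DOS machine with a Borland-style compiler (outportb() from <dos.h>) and the usual LPT1 data register at 0x378; the port address and the set_relays() name are only illustrative:

#include <dos.h>            /* outportb() on Turbo C / Borland C for DOS */

#define LPT1_DATA 0x378     /* common base address of LPT1's data register */

/* Each bit of 'mask' drives one of the eight data lines, i.e. one relay. */
void set_relays(unsigned char mask)
{
    outportb(LPT1_DATA, mask);
}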
That’s why I’m still fascinated by today’s demo scene, primarily because it still exists. Having been a member of that scene myself in the past, it’s nice to see that something the “experts” consider dead is still alive and kicking. The mentality of “we can do it better” (not in the sense of “better than you”, but rather “more efficiently”, “faster”, “in fewer bytes” or “with more performant code”) is also worth keeping alive, especially compared to today’s obsession with layers of abstraction, libraries and frameworks stacked on top of each other, with no real understanding of what’s happening underneath. Of course, working on bare metal requires much more knowledge and experience than clicking around in some pre-chewed environment whose frameworks “take care of everything” and, in the end, produce unmaintainable code, slow applications, errors, crashes, wrong results and bloat.
I still have my Amiga collection, as well as some Atari computers. Some day, I hope, I will use them for something interesting. Of course they still work perfectly; they don’t rot as quickly as today’s “modern” computers.
USB, serial, relays, fun:
http://www.velleman.eu/products/view/?country=fr&lang=fr&id=351346
http://www.velleman.eu/products/view/?country=fr&lang=fr&id=351282
Otherwise, Arduino, Raspberry Pi, …
Kochise
The difference is that back then you could ask for, and get, the full blueprints and opcodes for the chips. Moreover, it wasn’t that hard to build a mental map of what a chip was doing, because the designs were MUCH simpler. Heck, I remember reading that the guys at Commodore used to build a working mockup of their chips on a big breadboard with a LOT of point-to-point wiring.
Now compare that with the chip I’m typing on, which isn’t even state of the art yet has 6 cores and 3 levels of cache; if you look at a diagram of a modern chip layout, it’s simply too complex for the simple solutions anymore. Heck, even ARM, which used to be all about simplicity, is up to 6 cores and 64 bits, so while it’s possible to use ASM today, the odds that you’ll cook up something better than the compiler are pretty slim unless you’re a superbrain.
Sometimes you didn’t have to ask for them: This specific kind of documentation was standard in the handbook delivered with the computer. For example, my Amiga 500 manual contains all this “lowest level” stuff: circuit diagrams, pins, codes. Of course manuals also included basics of programming, making them much more “high level” educational material than what’s distributed with today’s PCs.
Heck, back in the day I had a customer who wrote a simulation of a satellite going across the sky for his dish-selling company. He said that when he ran into trouble with some of the code in the first draft on the Commodore 128 he was using, he faxed the company… not only did one of the actual chip designers answer his questions, the guy sent him his personal number, and they spent a good chunk of a weekend on the phone, with the actual designer helping him write his simulation!
Back then the companies were smaller and made up of geeks, so getting all the nitty-gritty on a chip’s logic and I/O was just easier to do than it is now, and today you sure as heck aren’t gonna get one of the actual designers on the horn unless your last name is Gates or Dell.
But while I appreciate some wanting to go back and play with the old gear again because of its simplicity, give me a modern system any day of the week. Folks need to appreciate what we have now. That X6 CPU I paid just $105 for is so insanely powerful that if you had told me about it back when I was learning on my VIC-20, I would have laughed and said nobody but a billionaire would ever end up with anything like that. We went from paying several thousand per MB of RAM to RAM being so cheap that even my little cheapo netbook has 8GB installed, and my first HDD was just 40MB yet cost nearly twice what my current 3TB drive did.
We are truly in a golden age of computing, so I hope everybody takes a minute to look at how incredible we have it now and how far we have come, because it’s pretty amazing.
Though “golden age” for most people means “when I was young”
Yeah, I have been converting the backend of a toy compiler I wrote in 1999 to emit Assembly directly, instead of bytecodes that map to NASM macros.
The idea was to remove some code that I cannot publish, while dropping a few dependencies in the process. The basic runtime was also rewritten in Assembly instead of C.
It might not make much sense, but it has been fun so far!
So many hours spent with friends listening to and making ProTracker music and dabbling in 68000 Assembly books.
This is why I have always seen the UNIX environment as primitive where multimedia is concerned.
Sadly I had to make do with a 386SX PC and visit friends who owned Amiga systems.
I grew up in the middle of all that. I remember doing gfx for a few demos in Deluxe Paint, pixel by pixel. Not to forget LightWave, ProTracker, and having access to everything in assembly. I do miss all that power.
I almost cried when I realized that the future of computing revolved around the PC and Windows/Office. Way to suck the soul out of computing.
It is a pity that AROS never took off, but then again, it wouldn’t have been much fun on the PC platform. Perhaps all these fancy ARM SoCs will give us a resurgence of alternative systems some sunny day.
A man can dream.
I’m surprised this article doesn’t mention that there was also an Atari ST demo scene (of course it’s about the Amiga, but it also mentions the PC). I was at boarding school in the early 90s and recall there was a group of demo coders there, as one of the older boys had an ST. They had originally been the Power Posse and then chose another name, which I can’t remember, but there was a famous ST demo called “The Inner Circle’s Decade Demo”, and they did a spoof called “The Mini Psychos’ Dickhead Demo”. They used STOS (there was a similar product for the Amiga called AMOS); I’m not sure whether they used assembly as well (if that was even possible with STOS). Their demos were definitely sold through the ST magazines of the time.
If memory serves me well, they were mostly active in Germany.
Mostly, because Atari ST computers, believe it or not, were also used in many companies for their power and efficiency. German people are pragmatic and gave the Atari ST line (even the TT) the fame it deserved.
Also the Swedish and Polish demo scenes were stunning.
Kochise
Sure do.
Atari and Amiga systems were great; who knows how home computing would have developed if their parent companies hadn’t gone astray.
Most likely Linus would never have created Linux, and the boring MS-DOS systems would have died.
Who knows.
I would bet on AmigaOS, which had a micro-kernel, was modular, and offered pre-emptive multi-tasking.
On the other hand, Atari’s TOS operating system was bug-ridden and lived in ROM, so you had to load a pile of patches at boot time and make sure they were loaded in the correct order; it was not multi-tasking and was hard to evolve.
Atari’s fate was sealed when the ST line was replicated in several flavors (Mega, Stacy, …) without providing much more, while Atari was clearly building promising prototypes that never reached the public.
AmigaOS, MorphOS, the Cube… at least the Amiga legacy lived longer than the Atari fandom (which topped out at an overclocked 68060 @ 95 MHz and a “compatible” Coldfire-based computer that runs @ 200 MHz but costs 700 euros).
Kochise
And no memory protection (which made sense back then, of course, but also makes it hard to evolve into modern times), with Guru Meditations being quite common.
PS. Can an OS without memory protection really be called a microkernel, given that the concept is about separating components into their own memory spaces?
An interesting aspect of the scene’s history is the unexpected resurgence of the C64 scene, not infrequently featuring the same people who were the heroes 25 years ago and who came back to assembly coding once their kids reached the same age they themselves were when they started.
The best part? Present-day demos don’t just repeat old-time nostalgia with the same old techniques. They have managed to discover completely new stuff never thought of before, and that on a 30-year-old rig that has been reverse-engineered down to the gate level. Prods like Cubase64 simply blow me away. I’ve been a pretty savvy asm coder, but I have no idea how in the world these guys managed to achieve that.
This is a step back in time. I can still hear the synth in my ears. State of the Art by Spaceballs was one of the first I ever saw.
Try http://www.modarchive.org/
Kochise