“When I started writing programs in the late 80s it was pretty primitive and required a lot of study and skill. I was a young kid doing this stuff; the adults at that time had it even worse, and some of them did start in the punch card era. This was back when programmers really had to earn their keep, and us newer generations are losing appreciation for that. A generation or two ago they may have been better coders than us. More importantly they were better craftsmen, and we need to think about that.” I’m no programmer, but I do understand that the current crop of programmers could learn a whole lot from older generations. I’m not going to burn my fingers on whether they were better programmers or not, but I do believe they have a far greater understanding of the actual workings of a computer. Does the average ‘app developer’ have any clue whatsoever about low-level code, let alone something like assembly?
I’m a modern programmer that picked up assembly early on and it has given me a greater appreciation for the workings of higher level languages. Not that everyone needs to do that, but I do think the more familiar a developer is with the lower layers (whether that be OS, CPU, Assembly code, etc) the more likely they are to turn out efficient, well-designed code.
On the other hand, I know programmers who have been in the field for decades who are terrible developers who like to just go wading into the middle of a pile of code, deleting and copy/pasting stuff without any idea of what the code does or why it works the way it does.
Short version: Being a good developer isn’t a generational thing, it’s a combination of experience, education and attitude.
Personally I would argue that all developers – even Web Developers – need to have exposure to Assembly programming as it just changes the way you think, and how you approach programming. It’s just one of those things that really makes the difference between an Okay developer and a Good developer.
And what was their background? Computer Science?
The “generational” side of it is that the older good programmers were EE trained, not Computer Science trained. Computer Science has really not done much but lower the standards; and honestly we need a good software engineering program in the engineering departments that takes on the role of training programmers for industry, leaving Computer Science to the academics – kind of like the difference between a Physics Degree and a Mechanical Engineering Degree; related but not the same thing.
The author asks:
Well, the truth is that we ARE learning more, but there is so much more to learn now than in the old days that we learn a lot less about everything. For example, if you programmed on a Commodore 64 in the early 80’s, you probably coded in either BASIC or 6502 assembler. These days, there are probably 328302439230 languages to choose from, with at least as many frameworks for each language. Thus, it would literally be impossible for us to be as knowledgeable as people were back then. I’ve heard that it used to be possible to know everything there was to know about PCs. Nowadays? Perhaps if you had NZT or something.
The most you can do is to try and become an expert in a specific language. (Or maybe 2 or 3, depending on whether or not you have a life.) For example, if you’re coding web apps for a living, it probably wouldn’t serve you as well to become an expert at hardware, as much as it would if you were coding close to the metal on embedded devices.
I hear some people bragging about how much more they understood their C64 than the modern computer user, but I can do far more useful and productive stuff using just AutoHotKey than they ever could on those old machines, so as far as I’m concerned, they can suck it. I give more props to people who can actually get more productivity out of a machine than to someone who can explain in detail what each register does inside of my CPU.
Those ‘old’ programmers might have just been from a select group of people who were really interested and educated in computers. I don’t think it’s a shock that they actually were very talented and knowledgeable.
As computers expanded into a general field, more and more people entered it, of varying quality. Since software development is not a regulated field in any way, there was no way to maintain any sort of quality as other professions do (medicine, law, nursing, engineering).
The result is a valid perception that your average programmer today is not very knowledgeable and perhaps not as good as ‘ye old programmer’.
But I think if you took a good solid software developer of today and compared them to ‘ye old software developer’, they’d stand up quite well in terms of the inner workings… and they probably know a bit more in terms of rapid development and web technologies, frameworks, UI, usability, various specializations, security…
The stone age people who used moss, leaves and berries to create paint on cave walls – paintings that to this day we can tell were supposed to be various animals – were extremely skilled at using the most primitive, low-level tools possible to produce something that isn’t great art by current standards but is still instantly recognisable.
Were they better than the guy thousands of years later who can use bic pens, the culmination of many, many years of writing evolution, to draw near photorealistic renderings of people and animals?
Then there are the guys using those modern tools, the bic pens, like me who can draw a pretty mean stick man and that’s about it.
The point I’m making and my feeling about it is that the tool is irrelevant: what you produce in relation to what is possible to produce, is.
A former housemate of mine taught drawing at a major university. He told me that photorealistic drawing and painting takes very little skill. It is actually far harder to draw high quality abstract art.
The cave painters at Lascaux were far more talented artists than some modern guy drawing with a ballpoint.
Harder != better.
For all we know they could be equivalent.
I feel that last sentence is dead on, and what I was feeling throughout reading the article but couldn’t put in to words, the best I could do was an analogy:
Just because Darwin was “first” in describing evolution, doesn’t make him a better biologist than Rosalind Franklin for discovering DNA, or Boyer and Cohen who were the first to modify it in a living organism.
We think everyone in the past was great, but that is because all of the mediocres are lost to history; like all the kids that copied code from computer magazines and painstakingly, manually, “pasted” it into their BASIC machines. 🙂
Rosalind Franklin most definitely did not “discover” DNA.
My apologies, “the structure of DNA” is what I meant.
Well, you also need to add 3 more names to that discovery 😉
BTW Franklin was a chemist, not a biologist. Sorry to be so anal.
Carry on….
Franklin was researching the structure of crystals, among them DNA, Watson saw one of her x-ray photographs, and heavily changed their (his and Crick’s) model to match what Franklin had found. Luckily she died before the Nobel prize was handed out, as it would have been a mess otherwise. 😉
Source: http://profiles.nlm.nih.gov/ps/retrieve/ResourceMetadata/SCBBFW
(edited to add Nobel prize statement)
I would agree the programmers of the past were a lot better, but I would not attribute that to knowledge of the low-level workings of a computer. They just had to work things out for themselves a lot more than they need to today.
The “low levels” are just implementation details. Assembler isn’t some kind of old magic* passed down from wizard to wizard. Assembler is just another programming language. There are a lot more functions and a lot more side effects and a lot more detailed documentation about those side effects but there’s nothing that makes it different from problem solving in general.
Problem solving skills are what’s important. If your solution to a problem is shit, no amount of good coding can save it.
* JCL is though. You never write your own JCL. You copy it from a colleague who’s worked on z/OS longer than you, who has built up their own JCL book of spells in the same way. You add and remove cards as you need. There is a mystical 71 character limit and none of the wizards remember where the continuation column is.
I’ve met that generation, there were some good ones, some terrible ones, some okay ones. Just like today. They never scaled real time apps cross datacenters or dealt with the flood of data some of us deal with. I think the real thing is that more people today call themselves programmers when all they really know how to do is install drupal on godaddy.
Maybe it’s the changing kind of job markets today that lets bad programmers easily “hide” their lack of skills to gain job positions they are not really qualified for. Why am I thinking this? Because I meet them nearly every day: people occupying a precious job place described as programmer, developer, engineer, even in upper management – and not being able to deal with the most basic tasks that one would associate with those jobs. I’m sure there have been such “clever guys” in the past, but I assume it was not as easy in the past to get into a job position and stay there, no matter how incompetent one may be…
Maybe it’s also today’s job descriptions that do not require basic knowledge about how computers work; instead they emphasize expertise in today’s favourite flavour of scripting language, web framework or tool.
I’d like to suggest reading Jeff Atwood’s article “Why Can’t Programmers.. Program?” from his “Coding Horror” blog:
http://www.codinghorror.com/blog/2007/02/why-cant-programmers-progr…
There may be other follow-up blog entries that are worth reading.
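For those who haven’t read it, the central example in that post is the FizzBuzz screening question that many interview candidates reportedly fail. Just for reference (my own minimal C version, not code from the article), the whole exercise is this small:

```c
#include <stdio.h>

/* FizzBuzz: print 1..100, but "Fizz" for multiples of 3,
 * "Buzz" for multiples of 5, "FizzBuzz" for multiples of both. */
int main(void)
{
    for (int i = 1; i <= 100; i++) {
        if (i % 15 == 0)
            puts("FizzBuzz");
        else if (i % 3 == 0)
            puts("Fizz");
        else if (i % 5 == 0)
            puts("Buzz");
        else
            printf("%d\n", i);
    }
    return 0;
}
```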
No problem – it seems that that’s all they’re expected to do. For everything else… “We have a company for that!”, referring to the continuous trend of outsourcing particular tasks; even “sub-tasks of program development” fall into that category. The hope of some “decision makers” is that there will be more qualified and motivated programmers somewhere else who only cost a dime, because the “big money” is already spent on the “programmers” (quotes deserved in this example) of the company itself who cannot get the job done.
In my experience, you find good and bad programmers in the same distribution as in the past, with one difference: good programmers used to be the ones who had a job, and bad programmers lost theirs. Today, bad programmers keep the jobs, while good programmers are “cheap wage slaves” or even unemployed. One of the reasons (at least here in Germany) is that employed programmers get corporate support for attending “courses” where they earn certificates. That doesn’t imply that they actually gain new skills, learn stuff or acquire knowledge! They just get a shiny paper, costing 5,000 – 10,000 Euro or more. On the job market, that makes them “valuable”. Those programmers who invest their free time to learn new things, code, exercise, test and experiment don’t get those certificates, which makes them “inferior” on the job market. Even worse, if they don’t match particular job titles (from previous employments), they sometimes don’t even get considered for a job that would perfectly match their knowledge and experience. But no shiny papers, no job, or just a low-level wage: “Why should I pay you to solve my problem? I’m paying more than enough for my own staff of superstar programmers.” (I’ve actually been told that by a lead developer.)
Maybe job markets work differently in more developed countries – where actually present skills and knowledge count. I really hope so…
I would generally say that those who are actually interested in what they are doing have a chance to become good or even excellent programmers, while those who just do it “as a job”, without really being interested in what it takes to be a good programmer, are doomed to stay bad programmers.
I don’t think that the older generations were better; they were fewer, and most of them were doing genuinely low-level work, or at least work at a lower level than today. Now there are a lot of copy-paste programmers, a lot of mix-and-match programmers who just put things together, without the need to optimize, without the need for better algorithms. They may look representative, but they are not; SOMEONE is writing the tools for these guys, and those programmers are better. Better than the older generation, better than the current generation.
So it may look like we’re invaded by incompetent programmers, and in a sense we are. Probably 60-80% of the general programmer population wouldn’t be able to write a sorting algorithm correctly. But the absolute numbers are far greater than before, and the remaining 20% are really good programmers, in huge numbers, doing stuff that could never have fit the imagination of the typical mainframe developer.
I am afraid it’s a 99 to 1 rate, not 80 to 20. It was 50/50 fifteen years ago, before the expansion of computing to developing countries with no computer or logical-thinking background.
Even so, 1% out of 20 million is far better than 50% out of 50,000.
So, if you were in the business and looking for a developer, you’d rather find one easily, with a 99% probability that s/he’ll make a buggy product, than work a little harder to find one, with 50% confidence that s/he will do the job right?
No, if I were in the business I’d settle for 3 mediocre programmers which I would call elite for learning basic design patterns and applaud them for understanding ‘pass by value’ vs. ‘pass by reference’.
Which is what businesses actually do nowadays.
Sadly, I can only confirm.
Because, you know, the west had computer background built-in from the start even before we had computers…
I’m not even going to start on the “logical thinking” thing, it’s just too moronic.
Seriously, wtf?
I think he means that thinking jobs moved to developing countries and the know-how got severely diluted in the developed countries.
Like, for example, India, which has no logical thinking background [0].
—
[0]: http://en.wikipedia.org/wiki/Indian_mathematics
Back in the 60s and 70s you often weren’t allowed to program a computer unless you were a mathematician doing graduate-level research. It cost so much to operate computers that no one was going to let a brainless noob anywhere near a punch card or teletype.
It is incredible the number of lousy so-called developers we get assigned to our projects.
The world of Fortune 500 consulting companies is full of copy-paste developers, coding methods/functions with more than 1,000 lines, no tests, and code quality to make you cry for every written line.
They don’t have any idea how things work, sometimes I even wonder how they managed to get through university.
But this trend is unstoppable, because to the managers they have the top two qualities: they are cheap and replaceable.
It might show my age, but I miss those days like the article’s author.
Nowadays I’ve discovered that knowing the language inside-out is not enough. You may know Java but still not be able to deal with web development beyond basic servlets. It is all about framework specialisation, and it takes lots of real-world practice to become familiar with a framework. So in the enterprise Java world, for example, we can say that we have Spring programmers and Tapestry programmers and Wicket programmers etc. They all use Java, but the rules of each framework are so unimaginably different that you need to be an expert in that particular framework to do anything more than a simple hello world.
Yeah, well, the grass was greener too.
Another point of view is that in the old days people wrote their own bugs. Nowadays they spend massive amounts of time understanding frameworks and working around their features/bugs.
Y2K.
I started when people rode Penny Farthings to work. At least it feels that way.
One big issue “back in the day” was that processing was so expensive your program was scheduled to run overnight. You got ONE chance to run your new program every 24 hrs. Missed a comma, or full stop? Hard luck. Try again tomorrow.
You had to be far more confident that you’d checked everything thoroughly; not use a run-it-and-see approach (as I tend to do these days)!
I’m not a programmer, so I’d be reluctant to claim any truth to my observation… but code bloat says it all. Yes, I’m aware computers do, and have, more to do than in the past, but if the power and bloat needed today for what they could do in the past is any indication, then yes, programmers must have been better.
I don’t know about “app developers”, but I can claim for personal experience that a large part of self-proclaimed “web developers” have absolutely no clue about what programming means.
Some gems I’ve heard from this kind of people:
– Nobody ever used Java really
– C is an extinct dinosaur
– C++ is for geeks only, the Linux apps I’m using right now are written in QT
– The kernel isn’t part of the operating system
– What’s the kernel?
– You don’t need to turn strings into bytes anymore, modern CPUs can do the math with strings as well
– Javascript is as fast as assembly if not faster
– You can’t even develop in assembly anymore
– PHP doesn’t need algorithms
What a list!
I’m feeling like printing it and hanging it in my office.
I am a web developer and I develop web services for our app developers to use. I don’t agree with what you say. I’ve worked with guys that have said similarly funny stuff to what you quote (my favorite was “C has classes now in the newest version by Microsoft, C#!”), but those weren’t clueless or unskilled programmers. They were programmers that had settled on one language, in this case PHP. That made them good PHP programmers but rather helpless when it comes to other languages or other problem domains.
The problem I see with this article, and with what you and way too many others do, is to define a “good programmer” as somebody who knows X, Y and Z. Anybody who doesn’t know those things is a clueless, bad programmer.
The problem is that today we need a lot of programmers, and there are a hell of a lot of kinds of programmers too. So even though I know C/C++ and have a strong interest in operating systems, I do web stuff in dynamic languages (right now JavaScript and Ruby) for a living and I like it. The field has grown and is now so damn big that none of us has a fair chance to be competent in more than one or two problem domains, and some don’t even get past one programming language. That doesn’t make them, or rather US (me and all the other devs here), incompetent or useless. A lot of time has passed since programming meant writing batch processes for mainframes or small BASIC scripts for your home computer.
We need those legions of programmers to get all the work done that exists now (how many devices do you own that you could classify as a computer?), so the barriers to entering the job market are low, but only the good and capable make it all the way up to the good jobs.
Back in the day all programmers were the equivalent of engineers; now we have mechanics too.
Mine is just a small selection of sentences to show the situation, but talking and working with this kind of “programmer” reveals their real knowledge of, or even interest in, their job. I said “a large part”, not “all”, and I chose “web developers” because that’s the category where this effect is most visible nowadays. Of course nobody can be an expert in every field, but you can’t miss the fundamentals.
Most of the people I heard these sentences from were very good at other tasks – say web design, graphics work, management – or simply zealous workers. They also manage to make scripts that appear to work, or they wouldn’t be in this position.
But probably the best way to describe them is that they _refuse_ to learn a better method, to get a larger vision of their work. They’re happy because their products (apparently) work, they’re happy because their boss is happy that their products (apparently) work, and they’re happy because a happy boss will help them keep their job. I’ve learned that when you find yourself in this situation you should either get out as soon as possible, or work as badly as your neighbour.
Some people do programming as a day job, some do it as a vocation.
I tend to spend a bit of my own time learning new techniques; my colleague, while a very solid programmer, has a skillset that is mostly VB6 and C#/.NET 1.1 stuff. Which is fine – the code she produces works and is perfectly readable, although it is a little bit “oldschool” compared to C# 4.0.
Let me start off by saying that I am currently a linux device driver developer for SAS storage controllers…so I consider myself a “to the metal” programmer. However, I also have quite a bit of experience with test automation, so I know both sides of the coin.
So as a rejoinder to this: I’ve heard hardcore C(++) and HDL developers also show vast ignorance. To wit:
– (In regards to a distributed app) “Where is the exe that I can run to install this?”
– Recursion is stupid and should never be done because you can blow the stack (not realizing that other languages have tail-call optimization)
– Who have never heard of TDD
– Who have never heard of continuous integration
– Who think it is ok to rsh or telnet into remote machines
– Who thought that self-balancing trees were a new “invention”
I can’t tell you how much prejudice I’ve seen with the “metal” programmers deriding web app and database guys. And yet if you ask them if they have ever done any themselves…they haven’t.
My web application experience is extremely limited. I have played with a few web servers (cherrypy, pyramid, and jetty), and written some very small toy “hello world” kind of web apps. But I can tell you this…web programming is way harder than most “to the metal” programmers think it is. Same thing with database programming.
Figuring out network problems, understanding the topology of the system (are you using virtualization?), understanding either multi-threaded or asynchronous (reactor/proactor) servers handling several tens of thousands of requests simultaneously…none of that is easy.
And yes, everything I described is back-end development. But I think many “metal” programmers only think of the front-end devs (the guys writing front-end PHP or JavaScript, CSS, Adobe AIR or JavaFX, for example). But even that isn’t as easy as you think. They deride that as “scripting”.
I know EE’s and CE’s who just couldn’t grasp closures in javascript or lambdas in python. Showing them python generators or classes that dynamically generated classes or functions was a real stretch for them. In other words, their idea that “scripting” languages are toy languages was simply ignorance.
For some reason, many “to the metal” programmers somehow think that being able to do bit manipulation, access registers, time the hardware with the software, and juggle virtual-to-physical addresses somehow makes them superior. Even desktop application developers don’t have to worry about things like dependency management or networking issues.
The sad fact is, most programmers specialize and are therefore unaware of the difficulties faced by the kinds of programming others do. I am fond of an old Chinese saying about religion, which I shall paraphrase for programming:
“If you only know one programming style, you know none”
I already explained my thought better here: http://www.osnews.com/thread?537017
Programmers used to have to be more precise simply because a run was a big deal. You didn’t have your own machine at your beck and call. Instead, you shared a machine and computer time was a scarce resource (it cost money or it took you a long wait in line to get your program on the machine).
Also back in the punch card era, it wasn’t so convenient to alter your code as it is today.
So I don’t know whether old programmers were “better,” but I do believe they had to code more carefully and rely much more on desk-checking than simply running a program over and over to eliminate error.
Makes sense. Necessity and perfectionism both tend to result in more cautious coding. For example, I’m an admittedly very overworked perfectionist and, the more time I can find, the more unit tests and auditing I do.
Once I finish my current project to the point where I can make the code public without worrying about schema migration issues, I’m planning to go back and give all my older little utilities a full audit and unit test suite.
(I may be auditing my own code, but the methodical approach I use for writing comprehensive unit tests does a pretty good job of doing double duty as an effective self-audit)
Yes, you’re right. That’s how it was when I started programming in the late 1970s. If you didn’t have your own computer or a terminal on your desk, just coding sheets and punched cards, then you spent a lot of the day reading code and running programs in your head. As it happens, that is still an essential skill today. Being able to visualise a program in your head is one of the things that separates an expert programmer from a poor one.
In fact, the reason why programming hasn’t actually changed much is because you mostly do it in your head. It’s a bonus if you can edit code easily or get more than two compiles a day, but it doesn’t change what it feels like to do programming much.
I started at university using a teletype with an interactive language, which was really very similar to a Ruby or Python console that we use today. When I started my first job in 1978, it wasn’t interactive like that though – everything was batch processing; even developing online applications wasn’t done interactively.
To me what has changed is that a programmer today can teach themselves new languages and skills because we have the internet and personal computers. We can learn more by collaborating with other people to write Free Software on github. In the 1970s when I started you couldn’t do that. If you wanted to learn something new you could buy a book on ‘Jackson Structured Programming’, but you would need your employer to send you on a course if you wanted to learn a new programming language. We are much more in control of our careers than we were then in my opinion.
The article seems to believe that because programming used to be harder, it attracted better people somehow. I don’t see how that follows at all. There used to be just as many poor programmers 30 years ago as there are today.
Programming began in the head, not at the terminal’s keyboard (if you had one). I think there was more emphasis on the “pre-coding work”. Today you don’t need that approach anymore, as “trial & error” is inexpensive, all on company time. 🙂
You could be happy to even see the machine you were working on (or for) once in your life. 🙂
So you know where “accounting” in relation to computer resources (CPU time, storage, hardcopy) originates from. The UNIX and Linux operating systems still have this functionality built in.
I remember my CS professor stating: “‘Trial and error’ is not a programming concept!” 🙂
For those not familiar with this important era of computing, I suggest reading “Programming with Punched Cards” by Dale Fisk:
http://www.columbia.edu/cu/computinghistory/fisk.pdf
It’s more funny than you may think, and as a historic sidenote, it depicts the role of women in IT when IT wasn’t actually called IT (but data processing).
I admit it’s hard to compare in terms of worse or better. At least it’s much different, even though some basic elements have been shared throughout IT history. Those who are willing to read, to learn, to think and to experiment will always be superior to “code monkeys” and “typing & clicking drones” involved in so many parts of corporate IT.
I think visualising the problem first is what separates the good from the bad developers.
I am trying to get our junior developer to break down problems into a set of steps. I sat him down with me and went through what I was doing and why.
He just wrote everything down, completely missing the overall point.
While I rarely write a lot of pseudo-code anymore, I normally have a diagram, set of rough steps or something similar I wrote first to get the problem well understood in my head.
I think part of it is that they’ve dumbed down Computer Science programs over the years, in the interest of higher enrollment numbers. I’ve interviewed some co-op students, and they mostly list school assignments/projects on their resumes being that they lack real work experience. The projects they’ve done aren’t nearly as challenging as the stuff we had to do in 1st and 2nd year a decade ago.
Another part is that since computing and operating systems have matured, a lot less tinkering is needed nowadays; everything mostly just works out of the box. You don’t need to apply patches against the Linux kernel source code, and compile the kernel yourself just to have NAT nowadays. Many of the students put Linux on their resume, but it’s usually just Ubuntu and their experience is strictly limited to being a simple user, with no systems administration whatsoever.
Applying patches to a kernel makes you a systems administrator, which isn’t the same thing as being a programmer.
I have seen this argument before. It starts with a belief that somehow, the lower the level of your knowledge, the better a programmer you are. I say that while it never hurts… it’s also nonsense. I know guys who are whizzes at VHDL or Verilog, but gave up on LISP or Haskell.
I know Electrical Engineers who do VHDL or low-level C(++) programming, and yet haven’t even heard of Alan Turing or the Halting problem. And even if they have heard of, or know, what a Turing machine is or what being Turing complete means, no EE I have yet met has heard of Alonzo Church and the lambda calculus. I know many Computer Science guys working on high-level machine learning apps who don’t know what a stack pointer or a stack frame is. I honestly have even met CS people who don’t know the difference between the heap and the stack (thanks, Java). I know EE’s who think that recursion is stupid because it can blow the call stack, without realizing that this is a limitation of their chosen programming language, and not of recursion itself.
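To make the recursion point concrete, here is a minimal C sketch of my own (not from the poster): the naive form grows the call stack with every call, while the accumulator form is a tail call that GCC and Clang typically turn into a loop at -O2, although the C standard itself never guarantees tail-call elimination.

```c
#include <stdint.h>

/* Naive recursion: every call adds a stack frame, so a large n
 * can exhaust the call stack. Not a tail call: the addition
 * still has to happen after the recursive call returns. */
uint64_t sum_naive(uint64_t n)
{
    if (n == 0)
        return 0;
    return n + sum_naive(n - 1);
}

/* Tail-recursive form: the recursive call is the very last thing
 * done. GCC/Clang at -O2 usually compile this into a plain loop,
 * so the stack stays flat -- an optimisation, not a C guarantee. */
uint64_t sum_tail(uint64_t n, uint64_t acc)
{
    if (n == 0)
        return acc;
    return sum_tail(n - 1, acc + n);
}
```

Languages like Scheme require this optimisation, which is why “recursion blows the stack” says more about the language and compiler than about recursion.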
So where does it all begin? School, of course. It also depends on how you want to concentrate your study. As I mentioned, I know gurus who know ELF, DWARF, and x86_64 assembly, and yet know nothing about computer science. They do not know the Incompleteness Theorem, they did not take Algorithmic Complexity, and wouldn’t have the foggiest about the difference between a lexer, a scanner and a parser (or the difference between a context-free grammar and a regular expression).
Conversely, I know guys that can create their own toy programming languages, can compute algorithmic complexity off the tops of their heads, and yet don’t understand timing diagrams, what an interrupt is, or even know how skew, jitter or cross talk can screw up your data.
And then there are programmers who know neither of the above. They just take a (higher level) language, the libraries, and design stuff. These guys (usually) know about TDD, SDLC, requirements gathering, and other software engineering best practices.
Those EE’s who play with FPGAs or embedded firmware, who may be able to look at disassembled C code and troubleshoot it, are also the same people whose code I have looked at… and found them copying and pasting sections of code, or writing 1200+ line functions because “the price of pushing a new function on the call stack is too expensive”. They have nary a clue about databases, web programming, or for that matter more exotic data structures like AVL or red-black trees. And they wouldn’t have the foggiest how to work with huge data sets (all the NoSQL stuff).
Those Computer Science types can also fall prey to the lack of software engineering know-how. They also, to paraphrase Alan Perlis, “know the value of everything, but the cost of nothing”. Having a computer science degree myself, I have always analogized it to the difference between physicists and mechanical engineers. A physicist cares about the concepts, but glosses over the reality. “Don’t worry about friction”, or “treat it like a point object”. But rather than just plug in formulas, the objective is to understand the theory. As Edsger Dijkstra famously said, “Computer science is no more about computers than astronomy is about telescopes”. Unfortunately, academia fails CS grads in many ways because they don’t learn many “real-world” skills. Revision control? Team programming? Shared libraries? Modular design? At least at my school, these best software engineering practices were all things you had to learn on your own.
And speaking of software engineers…they have their use (and limitations) too. While they may solve problems with brute force rather than a more efficient algorithm, and they treat the underlying hardware as a black box, their insights into the “artisanship” of programming can’t be overlooked. Much of the nuts and bolts that is created are done by folks of this ilk. I knew EE’s who whined and moaned about moving to revision control systems (although if it was moving to ClearCase, then I wholeheartedly agree). I mean really? And it’s thanks to them that we even have new careers like Software Configuration Management engineers and Build/Release engineers.
So all this being said, there’s more than one criteria to be a “good” programmer. I have been fortunate. As I said, I have a BSCS degree, but I have done a little of everything even in my short 6yr career. I have done embedded firmware development, including some assembly for the Blackfin ADSP, I have done Java Swing and Eclipse RCP apps, I have done distributed computing with some enterprise Java, dabbled in MySQL, I have done test automation with python, I have had to work with logic analyzers, oscilloscopes and DVMs, I have worked with PCIe and SAS bus analyzers, and I am currently a linux driver developer. I also learn functional programming languages for fun (currently clojure and scheme) since unfortunately it’s super hard to find a job to use them. And one of these days, I am going to make a (non-compliant) version of scheme using LLVM.
So really…there are programmers of many types, and they all need to learn from each other. It’s very rare to find someone who can do it all. Consider that different problems require different knowledge sets and finding someone who can work in all these domains is a rarity.
Dude, I so want to mod you up for this post. Very insightful. Alas, I am not allowed to.
Really, can we please be adults and be allowed to mod posts even if we have posted somewhere else in the thread?
Well, for one you need to know all this stuff before being able to fully comprehend that it doesn’t really matter.
Usually when this topic pops up somewhere, it’s the people who have a firm grasp of the basics, detailed knowledge of their field, and a passing familiarity with everything else who form the most level-headed opinions.
Yes, knowing how string handling works at the low level, coupled with JIT familiarity, will tell you when to use StringBuilder/StringBuffer in Java and when to stick with Strings. And some people will not rest until they know all there is to know. But still, what counts “outside” is getting the work done. What counts “inside” is your internal pride in craftsmanship.
But if you want to compare I think you’re stuck with “outside” comparisons, lest you risk a flame war, or whatever real-life equivalent (a row? heated discussion? fist fighting?)
Sometimes these two overlap if there are some serious problems (like that guy who has been laying down bathroom tiles for 10 years, and still botched the job at my bathroom)
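Picking up the StringBuilder point above: the same low-level reasoning carries over to plain C. As a rough analogue (my own sketch, assuming a fixed-size buffer and skipping error handling), repeated strcat() rescans the growing string on every call, which is quadratic, while tracking the end yourself, the way a string builder does internally, stays linear:

```c
#include <stdio.h>
#include <string.h>

#define N 1000

int main(void)
{
    static char buf[N * 4 + 1];

    /* Quadratic: strcat() walks to the end of buf on every call. */
    buf[0] = '\0';
    for (int i = 0; i < N; i++)
        strcat(buf, "abc");

    /* Linear: remember where the end is, like a StringBuilder does. */
    size_t len = 0;
    buf[0] = '\0';
    for (int i = 0; i < N; i++) {
        memcpy(buf + len, "abc", 3);
        len += 3;
    }
    buf[len] = '\0';

    printf("%zu\n", len);
    return 0;
}
```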
No, I agree entirely with what you said. My problem is with a somewhat common perception (in my experience) that low-level programmers are somehow better.
I know exactly what you mean when you say that sometimes you just have to get things done vs. the pride in your work. In fact, that is why I switched from being an embedded FW developer to being a Test Engineer….and back to being a developer.
I became a Test Engineer precisely because I was tired of submitting half-assed, half-baked products that marketing deemed was good enough. Sadly, that first company I left didn’t even HAVE a Test or QA group. So all I knew about QA/Testing was what I read. Of course, it didn’t hurt I made a pretty good raise too.
But then I discovered it was the same thing packaged in a different form. Let test coverage slide. Make sure that checking off test cases from the test plan trumps whether we actually know we found any problems. I became a Test Engineer in the hopes that I could help build better products, but I realized it was really just the same thing in different clothing.
I did not have time to read the article, but:
1. half of the Atari ST demo coders (I believe it is the same with other platforms!) today work as lead programmers or as “optimization gurus” for games.
2. or just take a look at the demoscene today (my favorite: Farbrausch)
or take a look at creation of Second Reality:
http://www.youtube.com/watch?v=LIIBRr31DIU
or SpaceBalls:
http://www.youtube.com/watch?v=WgriMuXZ3QY
today you will hardly find kids doing stuff like this!
but that article from a few days ago very nicely explains today’s situation:
“Developers, engineers, scientists”
http://jeremyckahn.github.com/blog/2012/09/23/developers-vs-enginee…
Most demos, most demosceners were quite shitty (and forgotten). You simply display selective memory, remembering mostly just the greatest ones, the stars.
Does the average ‘app developer’ have any clue whatsoever about low-level code, let alone something like assembly?
Depends on the education. Those who got a good computer science education usually understand the workings of the computer.
You know, I had a discussion recently where my mother sent me a quote from circa 1965 about how terrible a job parents were doing raising their kids… and COMPLETELY missed the irony. I found especially hilarious that she said, “I really do think we’re headed for a Fall of Rome scenario.” Doesn’t she remember what people thought of HER generation? My commie, free-love, drug-addled, flower-child mother, who now drives a mini-van in the suburbs, going on about how kids today have no moral compass!
Point by point:
1) “Old Programmers Had to be More Resourceful and Inventive”
Yes and no. Sure, they were far more resource constrained. But then, no one expected them to launch a social network site for millions of users… by Monday. Today, developers have lots of resources, but far more is asked of them.
2) “The Barriers were Higher Back in The Day”
Again, a plus and minus. How many really smart people didn’t bother with programming because it was too difficult to get in to?
3) “CopyPasta is Rampant”
Eye of the beholder. If you write your own sort or search algorithm in production code you should be fired on the spot. I would NOT tolerate a coder who thinks he’s better than everyone else out there, and thus leaves an unsupportable mess behind. True, copy-pasting won’t get you to be the next Herb Sutter, but I’m in business to MAKE MONEY, not make better developers.
4) “People use Frameworks Too Much”
Same as point 3. If there’s a framework out there, USE IT. If you’re writing a C program, even “Hello, world!,” you’d damn well better be using the APR or Glib. Don’t you dare try to write your own XML parser. Again, fire-able offenses.
Final note) You can know a lot and still be an idiot. For example, after learning that add instructions are faster than multiply, a developer starts writing “x+x+x” instead of “3*x” in their code. But adding is not TWICE as fast as multiplying, so he just slowed down his code. And he may well have given up some optimization a later compiler might be able to do.
Bottom line: Everybody always claims it was better back in the day. You could call it “Rosy Retrospection” if you like. In Latin they called it “memoria praeteritorum bonorum.” That’s right, even in ancient Rome they thought back fondly on those times before indoor plumbing.
ingraham,
I upvoted your post because I thought it was insightful. However…
“For example, after learning that add instructions are faster than multiply, a developer starts writing ‘x+x+x’ instead of ‘3*x’ in their code. But adding is not TWICE as fast as multiplying, so he just slowed down his code.”
It is silly to manipulate source code to optimise cases that can just as easily be handled by any respectable compiler, but I found your example ironically humorous because I performed this micro benchmark just now, and x+x+x was indeed faster than 3*x on my x86 processor. Only once I reached 4 did it become slower. The fastest solution by far was (x<<1)+x (this can be done in a single opcode on x86 btw).
Edit: Programmers should be aware that the compiler already does these optimisations under the hood.
Real programmers don’t rely on nebulous optimizations that are less than optimal, as known from “The Story of Mel, a Real Programmer”. 🙂
http://www.pbm.com/~lindahl/mel.html
Doc Pain,
“Real programmers don’t rely on nebulous optimizations that are less than optimal, as known from ‘The Story of Mel, a Real Programmer’. :-)”
Haha, it’s a fun story. But that sounds more like the manifesto of a defiant programmer than anything having practical value.
Hell, even in intel processor terms I’ve seen similar patterns, like programmers who relied on the memory wrapping behaviour of the 8086 at the 1M boundary due to the processor’s original limit of 20 address lines. It was ridiculous to rely on that quirk, yet some (microsoft) programmers did and consequently there’s been a number of hardware hacks to control the A20 address line ever since.
http://www.openwatcom.org/index.php/A20_Line
I do appreciate clever tricks as a form of CS “art”, however I kind of hope the employees responsible for the A20 mess were fired over it since it was very irresponsible.
I myself have often cited shortcomings of the GCC optimiser, leaving me to contemplate whether to use non-portable assembly or to use GCC’s code as is. In most cases suboptimal code is irrelevant in the scheme of the program so it’s not even worth looking at. However in very tight loops such as those in encryption/compression/etc algorithms, hand optimisation can make an observable difference.
In any case, suffice it to say that GCC can handle the multiply-by-shift/addition on its own. So my personal preference is to see x*CONST in code.
Is (((x<<1)+x)<<1)+x faster than x*7? I don’t know without profiling it. On x86 it’d compile down to two LEA opcodes, which are darn fast. What about (x<<3)-x? Long story short, I’d rather let GCC handle it when it can since it’s architecture specific anyway.
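A quick way to see Alfman’s point for yourself is to look at what the compiler actually emits. A tiny C sketch (my own illustration, assuming GCC or Clang targeting x86-64 with optimisation enabled):

```c
/* Compile with:  gcc -O2 -S mul3.c   and inspect the assembly.
 * On x86-64, GCC and Clang typically emit a single instruction,
 * e.g.  lea eax, [rdi+rdi*2]  for BOTH functions below -- the
 * compiler already performs the strength reduction, so hand-writing
 * x+x+x or (x<<1)+x in the source buys nothing. */

int times3_mul(int x)
{
    return 3 * x;
}

int times3_shift(int x)
{
    return (x << 1) + x;
}
```

Which is exactly why leaving x*CONST in the source and letting the optimiser pick the architecture-specific encoding is usually the right call.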
Thanks, Alfman; very interesting stuff. I must admit that I did not encounter the “x+x+x” issue myself. I *THINK* I heard that story from Herb Sutter and Bjarne Stroustrup at SD West about five years ago, but I could be misremembering. The point remains, however, that you can mis-apply knowledge, even if that knowledge is accurate.
This pretty much says it all. I don’t think anyone would like to work with someone like you. Good companies not only care about making money, but also about retaining their staff by promoting learning and freedom of expression.
You seem to have a rather disrespectful opinion of people who occasionally “reinvent the wheel”. If the existing solution has deficiencies, there is every reason for people to come up with their own implementations. The call whether to use existing code or roll your own is not just for you to make; it should be made by the team and the individual developers who are responsible for delivery and maintenance of the final product.
To be honest, if someone writes a custom XML parser, I would want them out of my team.
When I used to hang out on Usenet, in the Commodore groups, people would sometimes ask how to do something in assembler.
Someone would reply with some 25 lines of code, someone else would reply to that with 18 lines, and then someone would come up with only 11, which someone else would change so it would run faster.
Although I had no idea what was going on, it always seemed very cool and almost magical. But this required a high level of assembler and C64 chip knowledge. It’s not realistic or practical to code user applications in assembler or to take the hardware into account, as hardware is very diverse.
So you have to use libs and APIs, but I think it’s rather annoying when a “programmer” makes something that doesn’t work and then blames Microsoft or someone else.
In the old days and with all that low level stuff people could at least debug their code almost all the way. Now they suggest to reboot, reinstall the application, reinstall .NET or reinstall Windows. And even then that’s no guarantee it works!
When something doesn’t work and the fault is with some library/include, the programmer should at least be able to see what he sends to an external routine and what comes back, but for some strange reason they don’t see this as an option.
Recently we had a process that ran as a Windows service. Suddenly it kept stopping until it even refused to start. The guy on the phone couldn’t figure it out (and yes, he did the reboot and reinstall thing), but lucky the expert joined in… and he didn’t know what to do either.
MOS6510,
“In the old days and with all that low level stuff people could at least debug their code almost all the way. Now they suggest to reboot, reinstall the application, reinstall .NET or reinstall Windows. And even then that’s no guarantee it works!”
To be fair, the reinstall strategy often DOES solve the problem. It’s absolutely lame, but software is so dense these days that as a *user* reinstalling is often the path of least resistance. A good developer should track down and fix the root cause, but good luck reaching one through tech support.
The situation has improved, but in the days of NT 4.0 there were admins who used to have their server reboot automatically every night, or else stuff stopped working or slowed down too much.
It’s all zeroes and ones, so when something doesn’t work right it should be possible to figure it out. It’s just that between us and the actual coders there are a lot of human shields and even when the coders can be informed they either do nothing or can’t figure it out.
MOS6510,
Absolutely…
as a developer myself I can recount three occasions where I’ve tracked down the exact cause of a bug, reported it to tech support so they could escalate the bug to developers for a fix.
With cyberpower tech support I was in contact with the project manager himself and he was very receptive & helpful, within a few days I had my hands on their new software.
With netgear, I had to fight through the tech support pyramid, going so far as supplying source code to prove the existence of a reproducible VOIP packet corruption bug. Eventually they acknowledged the bug, and promised an update, but a fix was never issued even though these devices were still “officially supported”.
With microsoft, I uncovered an XML parsing vulnerability. I must say I was impressed with their tech support competency and they were able to address the issue promptly. (It was through a paid support contract).
If you’ve got a problem with your computer’s 0’s and 1’s though, and you cannot trace it yourself, then I’m not sure many companies will be willing to dig into it to find out why their software failed. Presumably if there are a lot of mystery bug complaints queuing up the tech support lines, then they’ll investigate; otherwise it’s “reinstall” and away with you.
Very few people had such asm skills… I’d say quite possibly fewer than now (it’s just that the proportions are different; and generally, our memory tends to see old times as better than they were, while the opposite tends to be the case; also, we have written examples of ~“the end is coming / the moral decay of youth will destroy civilisation” going back to the very beginning of the preserved written word, on Mesopotamian tablets).
http://dilbert.com/strips/comic/1992-09-08/
I think I probably *am* one of these “ancient” programmers you’re talking about. Punched cards in college, started programming on Minicomputers in Macro-11 (assembler) in the “real world.”
Back in the day, people “fell into” programming because they liked it and they were naturally good at it. There were few schools that had a computer science major, and most of the CS classes were taught in the dept of mathematics (heresy!).
Were programmers BETTER then than they are now? No, not at all, NO WAY.
Back then, computer problems were SMALLER. The listing (in assembler) for a complete PDP-11 operating system (including all the system services and many utilities) would fit into a 3-ring binder. Compare that with Windows or Linux today! Back then, you could KNOW all the code in a given operating system pretty easily. You could see it in your head. It was pretty easy to be an “expert.”
Also, there were few “off the shelf” solutions for problems. If you wanted to control a basic lab peripheral you had to write the driver for it (again, in assembler language). So, naturally, lots of folks knew how to write drivers and knew how the OS worked. A complex “line of business” application comprised prompting somebody on a character-oriented terminal and printing out some clever spread-sheets with dashes and plus signs.
So, guys back then (sorry, they WERE mostly guys) knew different things… more in-depth things that were closer to the hardware… because that’s all there WAS. That was the class of problem we were solving.
But was there more ENGINEERING talent back then? No. Did people write better CODE back then? HELL no. There was ZERO concern for security back then (there was very little malicious hacking so nobody paid any attention to the threat). You’d be FIRED if you wrote code today like we used to write it “back in the day.”
Seriously… the world is different, that’s all.
Why haven’t “new” programmers “learned more” from us old guys? The biggest reason is that a lot of what we struggled with just isn’t relevant anymore. Anybody care to hear about how we used overlays to fit “big” programs into the 32KB virtual address space on a PDP-11? Or how we implemented zero-copy interprocess communication using OS memory management directives? Or how we used to do dynamic assignment of Unibus Mapping Registers to manage a scarce resource? NO? You don’t want to learn this stuff? Well, hell, *I* don’t even want to remember that shit. Nobody does. It’s not relevant anymore.
Logic is logic. Engineering is engineering. Good engineering is the same now as it was then… an elegant trade-off of factors to achieve a prioritized series of goals.
On the other hand, I *do* miss programming in Macro-11 every once in a while 😉
Working on sub-1 MHz computers did do miracles for efficient code. That is why I like the roots of Linux.
Well, some people managed to make systems that were as slow as BASIC (CP/M). Guess what the roots of Windows are.
Peace Be With You.