InfoWorld’s Peter Wayner outlines the 12 most common programming mistakes, and how to avoid them. “Certain programming practices send the majority of developers reaching for their hair upon opening a file that has been exhibiting too much ‘character’. Spend some time in a bar near any tech company, and you’ll hear the howls: Why did the programmer use that antiquated structure? Where was the mechanism for defending against attacks from the Web? Wasn’t any thought given to what a noob would do with the program?” Wayner writes. From playing it fast and loose, to delegating too much to frameworks, to relying too heavily on magic boxes, to overdetermining the user experience – each programming pitfall is accompanied by its opposing pair, lending further proof that “programming may in fact be transforming into an art, one that requires a skilled hand and a creative mind to achieve a happy medium between problematic extremes”.
Instead of saying “programming practice A can be bad, but (not A) can also be bad” six times, the author could have just quoted the Golden Mean.
…point # 5…
Nothing Earth-shattering here, and not a very incisive article. Someone let me know if I missed the panacea of good coding in one of the later points.
Yer. There is absolutely nothing here.
Made the list — #4 over-reliance on libraries.
You see that one in web design all the time — which is pure idiocy given that we’re talking about a format where size is one of the all-important factors. With fat libraries like YUI, MooTools and jQuery bloating out web pages, we’re seeing a rise in megabyte-PLUS websites that deliver tens of K of actual content… It used to be that when you saw a one-megabyte website, it was because someone hadn’t researched image compression; today that’s not it at all, with many sites having more javascript than they do IMAGES! Quite often for simple stuff like hover effects that can be done in CSS without any scripting at all!
There’s a thread over on Sitepoint I made:
http://www.sitepoint.com/forums/showthread.php?t=713376
Discussing how a friend contacted me to ask why one mainstream website was such a wreck…
http://www.nytimes.com/interactive/2010/11/08/health/20101108_thank…
He had issues navigating; I tried to access it and it took a minute and thirty seconds to load, despite my 22 Mbps connection, JUST because of the horrid ping time between here and there combined with the absurd number of separate files.
While the five MEGABYTES of 145 images is certainly a contributing factor, the 857k of javascript in 45 files is just as guilty… The CSS ALONE is larger than I would allocate for HTML+CSS+SCRIPTING+IMAGES for an entire site’s page template before adding content! And where does most of that bloated scripting blame belong? Javascript just to break conventional navigation, plus jQuery bloat.
It’s the statistics-package nonsense too — Google Analytics is bloated, pointless trash; I don’t know about you, but I have Analog and Webalizer giving me much the same information — in fact more accurate information, based on my Apache logs. So why the blue blazes do people waste the end user’s time loading some 30k of javascript tracking on top of something servers already do out of the box?!?
It gets worse when you talk CSS libraries like 960 Grid or the CSS part of YUI — since they inherently defeat one of the entire POINTS of using CSS in the first place: separation of presentation from content. They use classes like “grid_4” or “grid_6”, which basically say how something appears, when in modern markup the HTML is there to say what things ARE, not how they appear… Add in all the properties you might NEVER use on your page and it’s little more than bloat — this isn’t like an optimizing compiler where unused code in the source doesn’t end up in the executable; it’s more like an old-school BASIC program with thousands of lines of code you never even call…
The really sad part of a lot of these libraries is that many programmers are thoroughly convinced they make life ‘easier’ or ‘simpler’, when the result is nothing more than a bloated train wreck that’s an accessibility failure, a maintenance nightmare and a hosting-cost disaster… AND ends up being more work in the long term. It gets even more pathetic with things like ‘AJAX for nothing’, which is ‘sold’ to suits who know nothing of programming as a way of saving bandwidth and hosting costs, when quite often the opposite is the result.
What do I mean by ‘AJAX for nothing’? A great example is AJAX or even javascripted tabs… They prevent people from backlinking to the content (bad for promotion), often don’t get indexed by the search engines (bad for SEO), don’t work properly with normal browser navigation like middle-click or forward/back (bad usability), often don’t work with alternative browsing methods such as screen readers and handhelds (bad accessibility), don’t work at all when javascript is disabled (even worse accessibility), and often the javascript libraries used to make them work are larger than the content they are being used to serve (so much for bandwidth savings). A friend of mine who passed away this time last year used to joke that AJAX was often used as the new framesets — and no, he didn’t mean that as a compliment. (See how webmail is being flushed down the toilet with all this web-application nonsense.)
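To make that concrete, here is a minimal sketch of the progressive-enhancement alternative (hypothetical markup and class names, modern-browser javascript): the tab links are plain anchors to content already in the page, and the script is only a thin layer on top.

```javascript
// Minimal sketch (hypothetical markup, modern-browser JS): the tab "links"
// are ordinary <a href="#panel-id"> anchors pointing at content already in
// the page, so backlinking, middle-click, back/forward, search indexing and
// script-off browsing keep working. The script only layers switching on top.
(function () {
  var links = document.querySelectorAll('.tabs a[href^="#"]');
  var panels = document.querySelectorAll('.tab-panel');

  function show(id) {
    // Reveal the panel whose id matches the hash, hide the rest.
    for (var i = 0; i < panels.length; i++) {
      panels[i].style.display = (panels[i].id === id) ? '' : 'none';
    }
  }

  for (var i = 0; i < links.length; i++) {
    links[i].addEventListener('click', function (e) {
      e.preventDefault();                          // suppress the scroll jump
      location.hash = this.getAttribute('href');   // URL stays linkable
    });
  }

  // One handler covers clicks, back/forward, and a hash present on first load.
  window.addEventListener('hashchange', function () {
    show(location.hash.slice(1));
  });
  if (location.hash) show(location.hash.slice(1));
})();
```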
There’s an old saying — the less code you use, the less there is to break… It seems like a LOT of coders have lost sight of that with the endless copypasta and massive script libraries they use to implement useless trash: ‘gee ain’t it neat’ animated crap that actually interferes with the user experience (yes, Lightbox, I’m looking at YOU!), things HTML and CSS can do all on their own without scripting assistance, things that should be handled server side (and would still HAVE to be handled server side, like form validation), and page-building methods that are all-around bad ideas.
In other programming arenas there are plenty of useful libraries — SDL, OpenAL, etc… but for web development they are universally bloated trash that ruins perfectly good website concepts.
But to put that in perspective I say the same thing about fixed width layouts and px metric font sizes on content areas… which is why I use user.css in Opera to override the layout here on OSNews into something useful… but then, the WCAG says that about px fonts and fixed widths too.
Kudos to you. + 10000000000….
I’ve said the same, although less extensively, in a previous comment here on OSNews: the so-called “progress” we should have seen is actually a regression, because the multiplication of inter-site connections is making pages take longer to load. Kind of like OSes and applications getting slower with each iteration even though computing power has in fact swollen. That’s why I have a subscription to OSNews and to any site I regularly visit. That’s why I started hating Yahoo Mail (the “Info” panel in Opera says 199,605 bytes for the main page and 504,046 bytes for the inline elements… I opened the source and almost 90% of it is displayed in red! (for non-Opera users, javascript is syntax-coloured in red) and all 199 of my secondary addresses are in the source; yet trying to modify one still takes a good dozen seconds on a superfast enterprise network before the editing page displays. God, that’s stupid).
Also, there are too many network requests for stupid things like Google Analytics and Facebook (and I don’t even mention ads)… as if I were interested in any of these. Offer a subscription and I’ll pay, if the service or site is of interest to me. I’m a software engineer myself and I know software and sites don’t materialize out of the ether; I’d pay just to get rid of the lag. Oh, maybe that’s their plan? Make the free service so bad that users will prefer to pay?
This horrible trend is also the reason why I think web-oriented thingies (Google Docs, Chrome OS, cloud comsucking, javascript-heavy HTML5) won’t be a success with me, or dare I say in general: the pervasiveness of web access and its ease of use are not there yet. I can’t stand spending more time on hi-speed broadband waiting for pages to load than I did ten years ago using modems and local caches, especially since we had been used to page loads getting faster until 4 or 5 years back, when the speed started to dive. Which, btw, nullifies the “browser speed” competition, or at least turns it into a “mine is bigger than yours” contest. Anyway, shame on moronic devs and management; I wish all of them were Darwin Award winners.
What you are talking about is bad software engineering.
Yes, if the tech is used wrongly it will be a mess. That isn’t the tech’s fault; it’s the engineering’s.
Yes, if you make tons of HTTP requests, don’t minify your JS and CSS, do tons of DOM manipulation, etc., it will load slowly. Funnily enough, if you don’t do those things it loads quickly.
JS, CSS and background images are cached by the browser once they are downloaded, so the initial page load might be longer but subsequent page loads will be shorter.
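To be fair, that only pays off if the server actually tells the browser to cache. A rough sketch of what that looks like with a Node/Express static handler (an assumed stack, not anything mentioned in this thread):

```javascript
// Rough Node/Express sketch (assumed stack): serve static JS/CSS/images with
// long cache lifetimes so repeat visits hit the browser cache instead of the
// network. Paths and durations are made up for illustration.
const express = require('express');
const app = express();

app.use('/static', express.static('public', {
  maxAge: '30d',   // sends Cache-Control: public, max-age=2592000
  etag: true       // cheap revalidation once the cached copy expires
}));

app.listen(3000);
```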
jQuery is not bloat; it lets me write JS code against an extensively tested library that works cross-browser.
It abstracts away the browser and lets me get on with the logic of what I want to do, which is how libraries should be used. If I were to write my own code it would take far longer to write and be more buggy, because it wouldn’t have been extensively tested.
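For illustration, here is roughly the sort of cross-browser plumbing the library hides, next to the jQuery one-liner (the element ID and handlers are invented for the example):

```javascript
// Hand-rolled, era-appropriate cross-browser event binding (sketch):
function bindClick(el, handler) {
  if (el.addEventListener) {
    el.addEventListener('click', handler, false);   // standards browsers
  } else if (el.attachEvent) {
    el.attachEvent('onclick', function () {         // IE 8 and older
      handler.call(el, window.event);
    });
  }
}
bindClick(document.getElementById('save'), function () {
  // ... application logic ...
});

// The same intent with jQuery; the browser differences become the
// library's problem rather than the page author's:
$('#save').click(function () {
  // ... application logic ...
});
```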
The fact that Microsoft is behind jQuery and bundles it with Visual Studio is a testament to its quality.
Yes and no. I’m talking about bad frameworks and piss-poor coding methodologies — and jQuery tops that list.
… and when the tech in question is wrong in the first place?
Minification is a crutch for people who can’t write good code, especially in the world of mod_deflate and server-side caching of the compressed copy. If wasting your time white-space-stripping your CSS and/or Javascript shows actual benefits in bandwidth and CPU consumption, then there is something fundamentally flawed with your CSS and/or Javascript.
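If you want to sanity-check that for yourself, a throwaway Node sketch (the file name is hypothetical, and the whitespace stripping is a crude stand-in for real minification) comparing gzipped sizes makes the point:

```javascript
// Throwaway Node sketch: compare the gzipped size of a stylesheet against a
// crudely whitespace-stripped copy. With mod_deflate (or any gzip) already
// on, the extra saving from "minification" is the gap between the last two
// numbers printed.
const fs = require('fs');
const zlib = require('zlib');

const css = fs.readFileSync('site.css', 'utf8');   // hypothetical file
const stripped = css.replace(/\s+/g, ' ');         // crude stand-in for minify

console.log('raw bytes          :', Buffer.byteLength(css));
console.log('gzipped            :', zlib.gzipSync(css).length);
console.log('stripped + gzipped :', zlib.gzipSync(stripped).length);
```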
Given that browsers can render a billion times faster than they receive the information, that should be a non-issue. It’s part of why this whole “which browser is fastest” nonsense grinds my gears… See, when your die-hard “don’t ever use tables for anything, even tabular data” zealots who run around abusing definition lists run their mouths about tables rendering too slowly, when a 386/40 running IE4 on Win 3.1 could render a table in an entirely adequate amount of time, the ‘speed’ of a browser in the multi-GHz, multi-core era should mean exactly two things — and Jack left town.
Still doesn’t make megabyte-plus first loads acceptable… Hello, bounce rate!
jQuery is a blight upon the Internet, resulting in typically larger, more complex and needlessly cryptic scripts for no good reason. 90%+ of the garbage I’ve seen people vomit up using jQuery could have been written faster, using less code, without it! The only way you could botch your site design process worse would be to toss Dreamweaver and Fireworks into the mix.
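For the common cases, here is roughly what a few everyday jQuery idioms look like in plain javascript (modern-browser syntax, ignoring old-IE fallbacks; the IDs and class names are invented for the example):

```javascript
// A few everyday jQuery idioms next to their plain-JS equivalents (sketch).

// jQuery: $('#menu').addClass('open');
document.getElementById('menu').classList.add('open');

// jQuery: $('.note').hide();
document.querySelectorAll('.note').forEach(function (el) {
  el.style.display = 'none';
});

// jQuery: $('#contact').on('submit', validate);
document.getElementById('contact').addEventListener('submit', function (e) {
  // ... client-side checks here; the server still has to validate regardless
});
```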
Though over half the library is little more than goofy animated crap that gets in the way of the user experience — but to put that in perspective, I’m the guy who hates desktop animations. When I hit minimize I want it gone NOW, not two to three seconds from now after some goofy fade or slide animation. When I click or hover a menu item I want it open NOW, not two to three seconds from now after some goofy fade or slide animation. When I drag a window around I want the borders to retain their shape so I can see where I’m dropping the window, instead of sitting there with my thumb up my backside waiting for the deformation animation to stop jiggling like Jello.
As you can tell, I’m not a fan of OS X or Compiz/Beryl/whatever the free***’s are calling it this week… and I immediately turn all that crap off wherever possible. It’s “gee ain’t it neat” bloated garbage that actually DETRACTS from the user experience.
Lightbox is another great example — every time I come across it I usually end up screaming “Oh for **** sake just let me open the ******* IMAGE!” (especially with the many Lightbox ripoffs that don’t even let you middle-click — at least the REAL Lightbox gives you that as a way to bypass it).
… and who cares if it takes 40-90K to do 4k’s job? Because of course a black box library and twice as much code by the time you’re done using it is SO much simpler, cleaner and easier to maintain — NOT.
Worse, it’s generally used for garbage that one shouldn’t even be wasting time doing on a website in the first place!
I don’t know that I’d use Microsoft as an indicator of quality when it comes to web technologies, given they’ve missed the boat at every opportunity in site development and their websites are some of the worst on the Internet — see how over the past decade they’ve flushed Hotmail further down the crapper, especially in this latest iteration of a useless AJAX train wreck that doesn’t even work right in IE, much less alternative browsers… Said disaster probably costs more bandwidth and CPU overhead while being less useful to the end user than the old HTML 3.2 version was circa 1998!
As my friend used to say “The only thing you can learn from jQuery is how not to write javascript!” — which was usually preceded by “the only thing you can learn from WYSIWYGS is how not to write HTML”…
Going hand in hand with his sage advice: “The only things about Dreamweaver that can be considered professional-grade tools are the people promoting its use.”
Not only did you not make a lot of sense, you failed to provide any valid points.
Spring is my example. Most of the time it is far too heavy a tool for the job at hand. Where it is not, it is very easy to lose track of what the code is actually doing and when, where and how it is being executed, since the code the developer wrote (and therefore best understands) ends up being only a small fraction of the code that is being run.
As the saying goes–and as the author seemed to be indicating in his counterpoints–the extreme of virtue is the extreme of vice. That applies as well to OOP, abstraction, frameworks and black boxes as it does to most activities.
Simple is better. KISS me baby!
A heterogeneous list of 12 items is exhausting. It should be structured in nested collections of 6-7 items at most.
My happiness is complete if the code is easy to understand and things have to be changed just once. So I’d just say simplicity and DRY.
About DRY: be careful with DRY and noobs. They may believe it is a syntax issue. It is not; it is a semantic one. E.g.: if A and B are constants with different meanings, then just because they happened to have the same value at the moment of coding, they should not be defined as A = B, nor B = A, nor should one replace the other, etc.
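A tiny illustration of that point, with invented names: two constants that merely happen to share a value today should stay independent, because they can diverge tomorrow.

```javascript
// DRY is about meaning, not about values that merely coincide.
// These two limits happen to be equal today, but they mean different things,
// so neither should be defined in terms of the other.
const MAX_LOGIN_ATTEMPTS = 5;   // security policy
const ITEMS_PER_PAGE     = 5;   // presentation preference

// Wrong "deduplication": changing the page size would now silently change
// the security policy as well.
// const ITEMS_PER_PAGE = MAX_LOGIN_ATTEMPTS;
```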
Beautiful is better than ugly.
Explicit is better than implicit.
Simple is better than complex.
Complex is better than complicated.
Flat is better than nested.
Sparse is better than dense.
Readability counts.
Special cases aren’t special enough to break the rules.
Although practicality beats purity.
Errors should never pass silently.
Unless explicitly silenced.
In the face of ambiguity, refuse the temptation to guess.
There should be one– and preferably only one –obvious way to do it.
Although that way may not be obvious at first unless you’re Dutch.
Now is better than never.
Although never is often better than *right* now.
If the implementation is hard to explain, it’s a bad idea.
If the implementation is easy to explain, it may be a good idea.
Namespaces are one honking great idea — let’s do more of those!
And from personal experience:
Tunnel vision, NIH syndrome, “look, this generator generated 100K lines of code, what a timesaver!”, “let’s cut corners so we can finish on time”, “too much code already depends on the quirks”, “foreign keys make my life hard”, “I’ll remove it and clean up after we ship”.
It made me quit programming and start enjoying life again.
Shame on you for not linking to the source of this wisdom. 🙂
I looked into Ruby on Rails and Groovy & Grails and hated both for being too implicit. Struts2 is explicit, but fails everything else on that list (and then some).
The only nice web framework I have encountered is Django. If you already know about HTTP requests and responses, URLs and databases, you will easily understand Django.
One other pitfall not mentioned is insomnia and bad mental health.
No wonder some of the commenters are so cranky in here.
http://news.idg.no/cw/art.cfm?id=BB7208FE-1A64-67EA-E41662CE9F48CCA…
“Give a man a program and frustrate him for a day.
Teach a man to program and frustrate him for a lifetime”
When has the design and coding of complex software NOT been an art?
Code which simply “works” is usually not the same as easily maintainable code, and elegant code often trumps both in terms of functionality and long-term maintainability.
I don’t think the subjectivity of coding is limited to style, either. Some folks simply have a better ability to divide a problem into discrete steps and create an elegant generalized solution than the rest of us…
1. Forgetting to add customer_bill_rate++ to the build script.
2. Not removing the “this thing is so f##ked up” comments before release.
3. Not inserting the mandatory 5% bugs for the maintenance budget.
4. Omitting the 25 meg of zero-padding in the compile step.
5. Accidentally deleting the CIA back-door plug-in.
Yeah I am guilty of #2 all the time.
Besides the fact Bruce is an old acquaintance, his book is literally the best ever written on the subject of things to avoid in OO development.
http://www.amazon.com/Pitfalls-Object-Oriented-Development-Webster/…
Seems it’s cheap too!
13. Following the advice in this article too much.
14. Not following the advice in this article enough.