Derek M. Jones looks at low-level coding errors and the use of coding guidelines as a cost-effective means of avoiding some of the more common instances of such errors.
“Preventive Programming Techniques: Avoiding and Correcting Common Mistakes” by Brian Hawkins.
…but this is more than coding guidelines. It SHOULD come down to good, solid design principles. At the beginning of a design you should lay out general coding expectations (for instance, always initialize variables, or DON’T initialize and trap for null, etc.).
When you design, design for SIMPLICITY (KISS); design errors INTO the code (design code so that ONLY specific errors can occur); comment, comment, comment your code so that the next guy who has to fix your bugs knows what it was you were thinking at the time. I could go on, but I think most of what I would have to say would be common sense, except for this: Do NOT OVERDESIGN! That is the biggest PITA: you get committee after committee together to come up with this LONG RANGE PLAN of how all code in the company will be integrated, and therefore this design will have to be this or do that, blah blah blah.
You SHOULD spend a lot of time on design, but don’t pull in every latest-and-greatest design principle, tool and buzzword that you can, because you know what? It is all going to change anyway in another year.
So there.
When you design, design for SIMPLICITY (KISS)
AMEN to that one
<< When you design, design for SIMPLICITY (KISS); design errors INTO the code (design code so that ONLY specific errors can occur)>>
Why on earth would you want to design errors into your code?
I heard at military college (not from a teacher but from a friend) that the common military unit hierarchy is based on that rule (proven over the ages). So maybe it is just a legend… And as for coding guidelines, they are so personal. My own rules: always free memory right after the alloc (I spent a lot of time hunting memory leaks and “free without alloc” bugs, and now it is like religion: the delay between writing an alloc and its corresponding free must be < 2 seconds). Maybe stupid, but it works for me. And another one: I now put “;” after everything like crazy, even function closing brackets. That is because I once hunted a bug for a week after accidentally deleting a “}” inside a huge global static multidimensional array initialization, and the compiler (VC++ 6) treated every function after that point, up to the end of the file, as a constant. The compiler errors were so cryptic that they made me consider deleting the file and rewriting it from scratch.
My own rules: always free memory right after the alloc (I spent a lot of time hunting memory leaks and “free without alloc” bugs, and now it is like religion: the delay between writing an alloc and its corresponding free must be < 2 seconds). Maybe stupid, but it works for me.
This is why I only work in high-level languages (e.g. PHP), where spending time on such things is mostly irrelevant. If I’m done with a variable, I unset() it. Usually, said variable is inside a function, and when the function terminates, that’s the end of the variable.
I never understood the need to manually manage memory. The C language is great when performance is critical, but for most business apps that I write, it’s mostly irrelevant. The SQL database is almost always the bottleneck; spend time on that and save time by using PHP/Perl/Python instead!
You only need to manage memory when you’re allocating on the heap, meaning you want to be able to pass a variable outside of the context in which it was created.
Allocations on the stack are managed by the compiler.
Example:
int c[50]; // Create an array of 50 ints on the stack
int *c = malloc(50 * sizeof(int)); // Allocate memory for 50 ints on the heap, and set c to the first entry.
In the first case, no need to free() the ints. In the second, free(c) -Must- be called to avoid a memleak.
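To make the two cases above concrete, here is a minimal runnable sketch (in C++, which accepts the same malloc/free idiom with a cast; the function names are illustrative, not from the original post):

```cpp
#include <cstdlib>   // std::malloc, std::free

// Stack allocation: the array lives in the function's frame and is
// reclaimed automatically when the function returns.
int sum_stack() {
    int c[50];
    for (int i = 0; i < 50; ++i) c[i] = i;
    int s = 0;
    for (int i = 0; i < 50; ++i) s += c[i];
    return s;   // no cleanup needed
}

// Heap allocation: the memory outlives the expression that created it,
// so it must be released explicitly.
int sum_heap() {
    int *c = static_cast<int *>(std::malloc(50 * sizeof(int)));
    for (int i = 0; i < 50; ++i) c[i] = i;
    int s = 0;
    for (int i = 0; i < 50; ++i) s += c[i];
    std::free(c);   // omitting this line leaks 50 * sizeof(int) bytes per call
    return s;
}
```

Both functions compute the same sum; the only difference is who is responsible for the memory.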
I think the problems you mention stem not from bad coding guidelines or any flaws on your part, but from using an inferior language. Manual memory management and C++ syntax are both very error-prone. Instead of recommending a particular language and provoking the zealots on both sides into a language war, I’ll just say: if you spend a week hunting down a syntax error, try something cleaner than C++.
Absolutely agree. Programming in C# and Java is just so much nicer an experience than in, say, C/C++. The programmer can only handle so much complexity; some things, like memory management, should be abstracted away from the programmer. If you are, for example, writing a performance-critical OS kernel, then obviously you will be better off doing everything manually, as in C/C++. I guess there is a right weapon for every job.
KISS:
C# — YUCK!
Ada — much, MUCH better!
> Ada — much, MUCH better!
Much better for what purposes? To control a plane or a train, probably.
To write a GUI, I do not think so.
> Ada — much, MUCH better!
Much better for what purposes? To control a plane or a train, probably.
To write a GUI, I do not think so.
Why do you not think so ? How much have you actually thought about it ?
> Why do you not think so ? How much have you actually thought about it ?
How many (native) GUI libraries/frameworks are there targeting ADA?
It is not that you cannot write a GUI application with ADA (you could probably write a decent GUI app using FORTRAN or COBOL, given enough time and drugs). It is just that it was not designed for such a task; there are much better alternatives out there (C++, C#, Delphi or Java).
With respect, it seems to me that you do not know Ada very well, including how to spell it.
Ada has better facilities for interfacing to non-native APIs than most languages, including representation clauses and foreign calling conventions, defined by the language.
The area of application for Ada is anywhere where reliable and maintainable software is required.
Having said that, I love C++. It has enabled me to make a very good living as a contract programmer, sorting out the messes created by others.
Unfortunately, Ada is a “dying” language. By dying I mean it does not see much “new” usage. I do not know of anyone with Ada on their resume, and I know quite a few programmers. Neither is Ada brought up much in language wars.
Face it, Ada is not going to be adopted by any significant number of programmers.
> With respect, it seems to me that you do not know Ada very well, including how to spell it.
It is case-insensitive :]
Languages are nothing but tools. A good programmer will write excellent code in any language (well, in most languages); a bad one will write crappy code in all languages, including Ada (or Eiffel).
You can cut paper using a chainsaw; nonetheless, it is overengineering.
E.g. if you work in a company with ten C++ developers and decide to write an application of medium size in crystal-clear Ada, do you think it will be *practically* maintainable?
The answer is: it will not be. That is probably why some companies slowly moved from Ada to other (inferior, less safe) languages when the DoD announced, a few years ago, that Ada would no longer be a requirement in its contracts.
With the automotive industry driving demand for C and C++ developers and tools to develop safety-critical systems, the Ada market will slowly collapse in the years to come, not because it is technically wrong but simply because it is not economically competitive.
This change is driven as much by fashion as anything else. Ada could be revived just by changing its name and providing a large standard library package.
The finance area (investment banks etc.), in which most of my work now is, is slowly jettisoning C/C++, because of the cost (the true, overall cost), in favour of Java and C#.
C and C++ will never be as safe as Ada, which is why it remains the language of choice in embedded and safety-critical arenas.
For example, Pascal uses := for assignment and = for comparison. This makes sense really; people read the = sign as “equals”. Further, it makes assignment a statement, as opposed to an expression, so there’s no way to get confused.
Likewise, for division, Pascal’s ‘/’ operator will always perform floating-point division. You have to use the ‘div’ operator to perform integer division. Further, floats are not automatically converted to integers; you have to explicitly use a function like Ceil(), Floor(), Trunc() or Round(). This helps remove another class of errors.
The thing is, Pascal is not particularly high-level. It’s always amazed me that people insist on using the C syntax in new languages like Java, PHP and C#, even though it’s been made clear that it’s less readable than the alternatives and allows developers to commit certain classes of errors. PHP, an example a poster mentioned above, actually adds to these problems. Not only does it feature the accidental assignment error, thanks to using = and ==, but it features a whole other class of bugs, as it has two equality operators, == and ===, to work around its type-coercion facilities.
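The two pitfalls above that Pascal’s design rules out can be sketched in C-family terms (a minimal C++ illustration; the function names are mine, not from the thread):

```cpp
// Silent integer division: when both operands are int, '/' truncates.
double naive_half() {
    return 1 / 2;       // integer division yields 0, then converts to 0.0
}

double correct_half() {
    return 1.0 / 2;     // one floating-point operand forces real division
}

// Assignment is an expression, so a dropped '=' still compiles
// (typically with only a warning).
int accidental_assignment(int a) {
    if (a = 3) {        // assigns 3 to a; the condition is the value 3, i.e. true
        return a;       // always reached, and a is now 3
    }
    return -1;          // never reached
}
```

In Pascal, the first bug cannot happen (‘/’ is always real division) and the second will not compile (:= is a statement, not an expression).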
Not only does it feature the accidental assignment error, thanks to using = and ==, but it features a whole other class of bugs, as it has two equality operators, == and ===, to work around its type-coercion facilities.
This is similar to “equals()” vs. “==” in java.
Not entirely. In Java it’s always clear that == refers to object references and equals() refers to object values. This distinction is made pretty early on in most courses. The problem with == and === in PHP is that people often don’t realise when they have to use it, and because == will work most of the time, they end up laying a trap that may not be tripped till after the product is deployed.
First of all, everyone should read Code Complete. If you still write crappy code after that, some coworker should hit you in the head with said book.
To the = vs. == issue:
use if(3==a) instead of if(a==3) and you’ll get an error if you forget one = (at least in non-exotic languages).
use if(3==a) instead of if(a==3) and you’ll get an error if you forget one =
won’t work with if (a==b) though
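A small C++ sketch of both points above, the constant-on-the-left trick and its limitation (function names are illustrative):

```cpp
// With the constant on the left, dropping one '=' is a hard compile error:
//   if (3 = a)   // error: lvalue required as left operand of assignment
// while the intended comparison still works:
bool is_three(int a) {
    return 3 == a;      // "Yoda condition": constant first
}

// The guard cannot help when both operands are variables:
bool equal_buggy(int a, int b) {
    return a = b;       // dropped '=': assigns b to a, then tests the result
}
```

`equal_buggy(1, 5)` returns true even though 1 != 5, because the condition is really the assigned value 5; this is exactly the case the constant-on-the-left style cannot catch.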
Memory management really isn’t that hard. Follow these rules and you’ll be fine.
1) Create variables on the stack. <u>Any</u> usage of new/delete must be justified, if it won’t fit on the stack, for example.
2) If you must use new/delete, use auto_ptr. If you need arrays, write your own array auto_ptr in 15 minutes.
3) Never pass ownership of a pointer. Reap what you sow.
3a) Never pass by non-const reference. Too dangerous.
3b) Pass in/out parameters by pointer so you can tell it is being changed. You never pass ownership (#3), so this is safe.
If you follow these steps, you’ll have maybe one pointer per 5000 lines of code. That is quite manageable. The power of C++ is on the stack – use it.
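A sketch of rules 1, 2 and 3b above. One caveat: std::auto_ptr, which the poster suggests, was later deprecated and removed from C++; this sketch uses its modern replacement, std::unique_ptr, and the function names are illustrative:

```cpp
#include <memory>

// Rule 3b: pass in/out parameters by pointer, so the call site (&x or p.get())
// makes it visible that the argument will be modified.
void increment(int *value) {
    *value += 1;
}

// Rule 2 (modernised): when you must use the heap, let a smart pointer own it.
int heap_increment_demo() {
    auto p = std::make_unique<int>(41);  // heap allocation, automatic cleanup
    increment(p.get());                  // rule 3: ownership is never passed
    return *p;                           // p's destructor frees the int on return
}
```

Rule 1 needs no example: `int x = 41; increment(&x);` keeps the value on the stack with no cleanup at all.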
For all those demanding the use of a specific language, or specific class of languages, I have only this to say:
Learn the language before you use it. A program is only as good as the programmer(s) who wrote it, and you’re only as good as your knowledge of the language. Learn the language, whichever it is, and become intimately familiar with all of its nuances, its library set, and its features.
Personally, I prefer C/Objective-C and Python for 95% of what I do. For the rest, it’s sh, with some asm and some C++ thrown in when I can’t avoid it.
Others prefer C# and/or Java, for rather pointless reasons such as ‘you don’t have to manage memory yourself’. But somebody has to, and if that somebody’s code is not bulletproof, you’ll have problems.
We’re at a point where everyone knows how to write ‘hello world’, but nobody knows how the print() statement actually works; nobody knows what is actually involved in managing memory. Nobody knows the key parts of the system, and that is the biggest problem I see for future generations of programmers. Nobody knows anything about the system; they only know ‘I can write a complete GUI with no code!’. So who will take us into the future, if nobody knows what is actually happening beyond the following (‘I’ here means a given person advocating managed languages for their convenience):
– System.PrintLine(“Hello World”) performs some magic to get ‘Hello World’ to appear on the screen.
– Memory is plentiful, and I have no idea how to manage it.
– I write something and 20 things happen.
* Use of the term ‘nobody’ is with the long-accepted meaning of ‘a large majority’, not the actual dictionary definition.
Run the source code through a pretty printer. That shows you what the compiler will think it is, as opposed to what you think it is. That would have helped STTS, above.
– Cogito ergo sum – but somehow my registration did not succeed.