“Online Python Tutor is a free educational tool that helps students overcome a fundamental barrier to learning programming: understanding what happens as the computer executes each line of a program’s source code. Using this tool, a teacher or student can write a Python program directly in the web browser and visualize what the computer is doing step-by-step as it executes the program.”
This is really similar to how I taught myself Python while at uni.
Python was the first language I enjoyed, and it helped me learn other languages by giving me the ability to look at things in a programmatic way.
Great to see stuff like this!
It’s also similar to how Scheme is taught in SICP, though this tool is better put together and interactive.
I have a soft spot for Python, tho’ I never get to use it anymore.
While the tool and the approach itself seem nice, I still think it’s done backwards. Students first need to be taught how a computer works, how code is executed, what happens when a program runs, and what the individual instructions do; then the barrier will no longer exist. Such tools can be useful in explaining all that, for sure, but not as a programming-learning tool; they are more of a programming/language-understanding tool.
l3v1,
“While the tool and the approach itself seem nice, I still think it’s done backwards. Students first need to be taught how a computer works, how code is executed, what happens when a program runs, and what the individual instructions do; then the barrier will no longer exist.”
Personally, I think the same way you do. When I was starting out, I was not satisfied merely learning how to program in a language; I wanted to understand the technology behind my programs. I delved into assembly, software interrupts, controlling peripherals directly, etc. I wouldn’t have had it any other way. On the other hand, real CS skills are becoming marginalised. Few employers need us to work on CS problems any more; instead, CS graduates are finding employment in generic IT roles where CS skills are not especially in demand.
The following rant is my view of the evolution of this field and my pessimism about the future in terms of jobs.
In the past, most businesses needed CS people because they developed software & databases in-house from the ground up. In my very early experience as a summer intern, many of these systems were written in Pascal & C using simple binary flat records, typically reserving some empty filler in each record. These records were loaded straight into memory and displayed on a textual user interface. Some programs were simple, others much more complex, but CS developers were in high demand (although as an intern in high school, I wasn’t actually paid).
Fast forward to today: businesses use off-the-shelf software instead of writing it themselves. Now, instead of a CS guy, a business actually needs more of an IT guy. The software companies selling this software still need CS guys, but the trend has been for them to consolidate and lay people off, such that one software package could be installed at a multitude of client companies who would no longer need their own CS guy. This undoubtedly made businesses more efficient on the whole. The demand for IT went up, but CS went down.
Now we’re in the midst of another trend: the shift away from off-the-shelf products towards software-as-a-service ones. It’s being sold as a way for companies to become more cost-effective by outsourcing their in-house IT functions (email, documents, databases, etc.) to the provider. There are strong proponents on either side, and there are still doubts as to whether it will pan out. But one thing is for sure: if the promise of SaaS cost-effectiveness is to ring true, then IT spending will have to decrease proportionally to revenue. In other words, future demand for IT is likely to plunge as SaaS improves business efficiency.
I don’t necessarily think that’s the best approach for everyone. Starting at the high end helps ease the student into thinking in an appropriately structured way. There is a certain barrier in understanding just how rigidly things are interpreted, and just how generic the building blocks are – and how to structure something to get the intended results. With that in place, digging deeper is easier: if you can understand not-entirely-trivial Python programs, it’s possible to understand (roughly!) how the Python interpreter has to work, and from there on to “it’s also (maybe indirectly) (sys-)calling the kernel to get certain things done” isn’t a giant leap. The details of what makes the kernel special (memory mapping & swapping, interrupts, locking, IO, etc.) follow naturally, and somewhere in that it makes sense to look at how a CPU actually works.
Starting top-down probably leads to sloppier programmers, since they might not care as much about the lower levels when they feel they can already do something useful. Starting bottom-up will probably bore some potentially good future programmers to death (since they won’t see the use yet), and probably leads to more myopic programmers (the microöptimization-above-structure type).
And no, I’m really not sure what the best solution is. “Everything at once” would be great, but that might not be completely doable…
In a university context, one might imagine having one course on high-level programming and another on low-level computer architecture. It has been done before, and seems to work quite well…
Certainly, and I’ve been in both kinds – but that doesn’t really solve the question of “if we want programmers that care a bit about the lower-level effects of the code they write, while still being decent at high-level code and structure – what’s the optimal order to teach in?”
Well, both courses can be taught simultaneously, can’t they?
Sure, though that could also be argued to be the only way they can learn both without benefiting from already having learned the other.
I think it may depend on the student...
Me, I find learning low-level stuff first easier, because when something “magic” happens in a high-level language, you can look at how it is implemented to better understand it.
I find that a low-level description is actively harmful for most new students. It’s simply too irrelevant to their attempts at learning the language at hand and only adds to the information they have to memorize. Of course, I may be misunderstanding what you mean by low-level. I take it to mean explicitly detailing what opcodes/assembly instructions and registers are and do.
Typically, I find that functional languages are easier to teach because everyone has some experience with math and can do basic substitution (even if that’s not how things are actually evaluated, substitution is a good enough model to start with.)
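To illustrate what I mean by substitution (a made-up toy example, nothing more): a call to a pure function can be explained by literally replacing the expression with its value, step by step, just like simplifying a math expression on paper.

```python
# A pure function: the result depends only on the argument,
# so the substitution model applies directly.
def f(x):
    return x * x + 1

# Explaining f(3) by substitution, as one would on paper:
#   f(3)
# = 3 * 3 + 1    (substitute x = 3 into the body)
# = 9 + 1
# = 10
print(f(3))  # prints 10
```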
I guess low level is a bit vague, but when I use it I’m referring to understanding what the hardware is doing at a hardware level. Conversely high level for me would mean hiding the hardware behind abstractions.
As much as I appreciate the low level stuff, I think it’s hard to make a case for why most students would ever need it.
I can think of quite a few big markets that might actually need it, such as embedded stuff or hardware development. Now, I guess what you implied is that “one doesn’t really need the low-level stuff to build user-mode software”.
I’d say that this, too, is debatable. As an example, when people know so little about computer memory hierarchies and their management that they end up considering memory allocation as a magic thing that brings extra objects to their code whenever they need them, memory management overhead will cause performance problems in their software in the long run.
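A toy illustration of what I mean, in Python (a rough sketch, not a rigorous benchmark): building a string with += can allocate a new string and copy the old contents on every iteration, while "".join allocates essentially once.

```python
import timeit

# Repeated += may allocate a fresh string and copy the old contents
# each time, so the loop can approach O(n^2) work in allocations.
def concat(n):
    s = ""
    for _ in range(n):
        s += "x"
    return s

# str.join computes the total size and allocates the result once.
def join(n):
    return "".join("x" for _ in range(n))

print(timeit.timeit(lambda: concat(10000), number=100))
print(timeit.timeit(lambda: join(10000), number=100))
```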
Perhaps. Are you talking about manually-managed languages like C or ones that use reference counting and/or garbage collection like Python, Javascript, and certain approaches to using C++?
…because for something as high-level as Python or Javascript, I can easily see a “you can lead a horse to water but you can’t make him drink” problem in teaching lower-level details.
There are a lot of high-level programmers who fairly firmly see anything beyond the care and feeding of a reference count system or the tips for getting along with their particular garbage collector as too academic to be worth their precious attention.
Of course, I will definitely agree that high-level teaching needs to be improved. I’m mostly self-taught (because most of what university taught me about computer science, I’d already taught myself), and it was little more than luck that I recognized “syscalls are heavy; do as few of them as possible” as a general rule from reading Python profiler reports.
(At which point, a friend who studied CS about 15 or 20 years ago said something along the lines of “Of course. Is that no longer one of the first things you learn?”)
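For the curious, the rule is easy to see from a rough sketch like the one below (not a careful benchmark; numbers will vary by system): the same bytes written through many small write(2) calls versus a single big one.

```python
import os
import tempfile
import timeit

chunk = b"x" * 64
big = chunk * 1000
tmp = tempfile.TemporaryFile()
fd = tmp.fileno()

def many_writes():
    for _ in range(1000):
        os.write(fd, chunk)   # one write(2) syscall per 64-byte chunk

def one_write():
    os.write(fd, big)         # the same bytes in a single syscall

print("many:", timeit.timeit(many_writes, number=10))
print("one: ", timeit.timeit(one_write, number=10))
```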
Mostly the latter. As programming languages make it easier for developers to ignore the intricacies of memory management in simple programs (which is, overall, a good thing), teachers become more and more tempted to skip the under-the-hood part altogether when discussing the language.
This, in turn, leads to the phenomenon that you discussed: people who, due to either misplaced pride or ignorance, end up not knowing enough about interpreters and OSs to write efficient software.
In that case, I agree with you.
One of the ongoing problems I’ve had with learning lower-level stuff is that all the educational material I’ve found seems to be written for either programmers skilled in other low-level languages or complete and utter newbies.
There seem to be no “C/C++/D/Vala/etc. for the experienced Perl/Python/PHP/Ruby/Javascript/whatever programmer” books or articles.
What that means is that, as someone skilled in Python, mildly so in PHP, shell script, and Javascript, and having written one or two small programs in C, C++, Vala, Java, and Prolog, I end up either bored to the point of distraction or unknowingly missing important lessons whenever I try to improve my knowledge of C and C++.
For example, K&R’s “The C Programming Language” was good, but, given that it doesn’t even mention what a buffer overflow is, it’s obviously written for someone who already has some familiarity with low-level programming in some other way.
Not *that* detailed, but, for example, learning how virtual functions work by looking at vtables and the vtable pointer.
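To sketch the idea in Python (all names here are made up for illustration): a vtable is just a table of function pointers, each object carries a hidden pointer to its class’s table, and a virtual call is one indirection through it, which is roughly what a C++ compiler emits.

```python
# Base "class": a table mapping method names to plain functions.
def animal_speak(self):
    return "..."

ANIMAL_VTABLE = {"speak": animal_speak}

# Derived "class": copy the parent table, then override one slot.
def dog_speak(self):
    return "Woof"

DOG_VTABLE = {**ANIMAL_VTABLE, "speak": dog_speak}

class Obj:
    def __init__(self, vtable):
        self.vtable = vtable  # the hidden vtable pointer

def vcall(obj, name, *args):
    return obj.vtable[name](obj, *args)  # virtual dispatch: one lookup

print(vcall(Obj(DOG_VTABLE), "speak"))     # Woof, resolved at runtime
print(vcall(Obj(ANIMAL_VTABLE), "speak"))  # ...
```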
Provided you don’t want to teach about performance...
Otherwise, at some point you have to look at the generated binary, which is not too bad with C/C++, but with Haskell...
Neolander,
“In a university context, one might imagine having one course on high-level programming and another on low-level computer architecture.”
Don’t most students start programming well before they reach university? Even by high school, those who have an interest are probably already programming.
In any case, my university didn’t do it that way. All first year courses were taught in Eiffel, a relatively obscure high level language. I’m not even sure we touched C except in specialty electives. After us, they replaced Eiffel with Java in our program, but I doubt they increased emphasis on low level fundamentals.
You have to be careful not to throw too much at someone new to coding. It can quickly become overwhelming and actually make things harder for them to understand. The fine print and advanced details can come later, after they’ve gotten a minimal/basic foundation for the language and the logical thinking.
I know decent coders who don’t know every last in & out, and I know crap coders who do, so part of the equation is the actual person and not simply the quantity of detail they’ve memorized.
Both low-level and high-level skills are useful and partially overlap, but at the beginner level, if you don’t plan a career in software engineering, you just have to pick one of them. And even if you do, you should start with both simultaneously (e.g. C on an Arduino and Python on a PC).
Given that an educated person is more likely to use programming languages for automating stuff or writing simple one-off utility programs, starting from Python makes a lot of sense. In a way, an interactive environment eliminates many of the problems you are talking about.
Reversing your point: with C, you can’t (or won’t) learn programming itself well (breaking down the problem, structuring the program so that you don’t end up with spaghetti code, learning about data structures, OOP and FP idioms, etc.). Just as Python is not the best tool for learning about a PC’s memory model, C is a poor choice for getting familiar with anything else.
This is essentially akin to WinDbg, but for Python plus fancy visuals minus debugging symbols minus OS dependency.
I see how this is useful for those who have no prior programming knowledge. Back in the day, I learned C and C++ by littering my source code with #define macros, #ifdef, and #endif to print out the code execution path, which got really messy very quickly. The Internet was a luxury at the time, so we couldn’t just look something up every time we hit a wall, and there was nothing similar to CodeReview. I ended up learning some assembly to be more efficient at debugging and to better understand the implications of every single line of code I wrote.
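For what it’s worth, Python can produce that kind of execution-path trace without any macro littering; here’s a minimal sketch using the standard library’s sys.settrace hook (the demo function is made up):

```python
import sys

def tracer(frame, event, arg):
    # The interpreter calls this for scoped events; a 'line' event
    # fires just before each source line is executed.
    if event == "line":
        print("executing line", frame.f_lineno, "in", frame.f_code.co_name)
    return tracer  # keep tracing inside nested calls

def demo(n):
    total = 0
    for i in range(n):
        total += i
    return total

sys.settrace(tracer)
demo(3)
sys.settrace(None)  # turn tracing off again
```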
I like Python, but I haven’t had a chance to build real software with it. I do use it for quickly testing ideas in code form and, to a degree, for prototyping.
Whenever I look at someone else’s code, I do tend to visualize its execution.