This article explores the new Python 2.3 itertools module, and gives you a sense of the new expressive power available with combinatorial iterators and how iterators — conceived as lazy sequences — are a powerful concept that opens new styles of Python programming.
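To give a flavour of the lazy-sequence style the article is about, here is a small illustrative sketch of my own (not taken from the article) using the Python 2.3 itertools functions count, ifilter and islice to describe an infinite stream and only pay for the items actually consumed:

from itertools import count, ifilter, islice

# count(1) is an infinite lazy stream; nothing is computed until it is asked for
evens = ifilter(lambda n: n % 2 == 0, count(1))
print list(islice(evens, 5))    # -> [2, 4, 6, 8, 10]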
Man, that David Mertz is going nuts with Python and functional programming. Python is vaguely like the Mac — expensive, beautiful, makes people proud of their work because they’re working with a tool that was made with pride.
Nice thing is, you can really get rid of that “expensive” part by using tools like Psyco.
http://psyco.sourceforge.net/
Whereas, with a Mac, you’d probably need a crowbar and lockpicking set.
Yes, I know that Ruby inherits from many languages (Simula, Perl, Eiffel…), but this stuff sounds like Ruby: the Iterator .each method, yield…
Python is vaguely like the Mac — expensive, beautiful, makes people proud of their work because they’re working with a tool that was made with pride.
Where do you get expensive from Python?
Last I knew it didn’t cost anything to use it.
I'm assuming you mean slow, judging by your link. In most cases that's no big deal with Python, since programs written in Python are seldom noticeably slower than their C/C++ counterparts. So anyway, replace expensive with slow and your statement is somewhat viable.
A lot of the functional guys seem to be happy about Python. Peter Norvig, of Lisp fame, espouses Python as a language nearly as good as Lisp, but with a mainstream syntax. The functional guys see Python as a way to sneak some functional concepts into the mainstream.
Psyco is good, but it has lots of problems. It's really easy to defeat the optimizer. In particular, once you wrap something in a class, the optimizer tends to just give up. Python should do what the Lisp folks do — add type declarations to help the optimizer generate good assembly.
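For reference, here is a minimal sketch (my own example, not code from the Psyco docs) of the usual way to apply Psyco to a single hot function with psyco.bind:

import psyco

def hot_loop(n):
    # plain arithmetic loop: the kind of code Psyco specializes well
    total = 0
    for i in range(n):
        total += i * i
    return total

psyco.bind(hot_loop)    # specialize just this function
print hot_loop(100000)

Wrap that same loop in a method on a class and, as noted above, the specializer often gives up on it.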
Expensive means it uses a lot of resources/processing time. It's a common programming convention; just because you don't use it doesn't mean it's not correct.
Python should do what the Lisp folks do — add type declarations to help the optimizer generate good assembly.
Funny, I made the exact same wish on the joelonsoftware forums today.
Sussman once explained though (in the SICP videos, near the end) that adding features to a language is difficult because its syntax could prevent other features you might want in the future. For example, a simple way of declaring types in Scheme could preclude good syntax for keyword arguments.
I’m sure optional types will come at some point though, and I don’t think this is a big problem. Certainly, the Common Lisp way presents no problems I know about.
Python is hard to program against someone else's API because it is dynamically typed. That tempts some documentation to omit what each method returns, which makes debugging terribly necessary.
I've found that writing good tests ahead of time, following the eXtreme Programming conventions, has made the fact that Python is dynamically typed a non-issue. It also makes debugging a non-issue, because once I have a strong test suite for my classes I very seldom fall victim to errors at runtime. People who have issues with dynamic typing usually come from the C-style printf debugging philosophy, so it's natural for them to think the lack of strong typing is going to cause them problems.
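As a rough illustration (my sketch, not the poster's code), even a couple of trivial unittest cases pin down what a function accepts and returns long before a caller trips over it at runtime:

import unittest

def double(x):
    return x * 2

class DoubleTest(unittest.TestCase):
    def test_int(self):
        self.assertEqual(double(3), 6)
    def test_str(self):
        # the test also documents the duck-typed behaviour on strings
        self.assertEqual(double("ab"), "abab")

if __name__ == "__main__":
    unittest.main()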
Anyone who would suggest type checking in python doesn’t really understand python. If it walks like a dict, then who cares what it is. It is the programmer’s responsibility to understand what a function expects and provide that.
A lot of functions accept anything considered "iterable" or any of the "sequence types", and the lack of a specified type makes those functions much more useful.
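For instance, a quick illustrative sketch (nothing in it is a real library function): the same code happily takes a list, a tuple, or any other iterable of sized things.

def total_length(items):
    # any iterable will do: list, tuple, iterator, whatever
    n = 0
    for item in items:
        n += len(item)
    return n

print total_length(["ab", "cde"])    # 5
print total_length(("ab", "cde"))    # 5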
-Hugh
Just in case you don't understand what we're talking about: it's optional typing. Common Lisp has this, but most implementations don't enforce the declarations at compile time. Rather, they use them as hints to optimize the code better.
Or so I understand Common Lisp, anyway. I think this email by Jonathan Rees sums my feelings up well, even though he's talking about OO:
http://www.paulgraham.com/reesoo.html
The problem with dynamic typing is that it makes generating proper code very difficult. The compiler doesn’t know until runtime what types will actually be used with a particular routine, so it has to emit generic code. Take, for example, the following Python function:
def foo(bar):
    return bar * 2.0
Say bar is a float. A statically typed language knows it is multiplying two floats, so it generates fmul instructions that implement the bar*2.0. A dynamically typed language doesn't know whether bar is a float (or an integer, or even a user-defined type), so it doesn't know what code to generate. In Python each object has a pointer to a dictionary of its methods. When the multiplication actually runs, Python looks up the '__mul__' operator in bar's dictionary and calls it with an argument of 2.0. The dictionary lookup is extremely slow.

I don't know how most Lisp compilers do it, but they would have to use a similarly expensive mechanism to retain dynamic typing. There are two ways to make this fast: you can add type declarations (so the compiler knows what code to generate), or you can use a super-duper runtime optimizer like Psyco. Psyco can detect that a function is being called with the same types over and over and generate optimized ASM on the fly.
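A quick sketch of that point (my example): the same bar*2.0 dispatches to whatever '__mul__' the runtime type of bar happens to provide, which is exactly what cannot be known when the function is compiled.

class Scaled:
    # a user-defined type with its own __mul__; foo() has no idea it exists
    def __init__(self, value):
        self.value = value
    def __mul__(self, other):
        return Scaled(self.value * other)

def foo(bar):
    return bar * 2.0

print foo(3)                  # integer path -> 6.0
print foo(1.5)                # float path -> 3.0
print foo(Scaled(4)).value    # dispatches to Scaled.__mul__ -> 8.0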
I don’t see what the link has to do with anything.
As far as “The compiler doesn’t know until runtime what types will actually be used with a particular routine, so it has to emit generic code.” – This is the strength of Python, not the weakness.
If you want strict type checking, and functions that are not extensible, but only work for the specific type for which they were conceived, then use Java (which is barely faster than python).
If, however, you can manage to get your function's parameters in the right order, then type checking gains you nothing but runtime speed. In this case Python makes code easier to maintain. For instance, if you decide you need to use floats instead of ints, often you can just change where the variable was first created, and the function calls won't break.
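A tiny sketch of that (my example, nothing more): switch the literal where the value is created and nothing downstream has to change.

def area(width, height):
    return width * height

w = 3    # later changed to 3.0; area() and every caller keep working
print area(w, 4)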
How many bugs do *you* have where you meant to say something like
print "hello"
but accidentally said
print 3
?
That doesn’t happen to me much; Type checking has never made my programs less buggy (just longer because of casting).
And in the rare case where you need a function to run in 3 milliseconds instead of 300 milliseconds, you should probably just use C anyway.
-hugh
Besides, if I have a function that needs to take a specific type and do something at breakneck speed, this is exactly what C extensions are for.
def reverse(sequence):
    if isinstance(sequence, str):
        return fast_c_strrev(sequence)
    else:
        return sequence[::-1]
And it still works with non-strings!
-hugh
1) Python is nowhere near as fast as Java. Low-level code is much faster in Java thanks to the JIT. High-level code is more comparable, because much of the low-level bits of Python are written in C. Doing numeric code in Python (without Psyco) is ridiculously slow, because what is a single instruction in C degenerates to a hash table lookup and an indirect call. This usually isn't a problem in practice, because speed isn't an issue. However, Python is a wonderful language, and it would be nice not to have to go running to C every time you need some performance. From an elegance standpoint, it would also be nice to be able to write the Python VM in Python.
2) *Optional* type declarations wouldn't change anything fundamental. They allow you to keep the code quite clean, and most importantly, you can apply them only where you need them. Say you have a scientific app written in Python. Just add some type declarations to the heavily used data structures, and leave the rest alone. Very clean, very flexible.
Talking of cool languages… the only other language to get me as excited as Python did is Prolog (and I'm hard to please). Do you know how easy it is to write a meta-interpreter in Prolog, and then to modify its behaviour?! Google it – it's fantastic!
1) Python not as fast as java? Java can’t even keep up with python on
“Hello World!” http://www.bagley.org/~doug/shootout/bench/hello/
🙂 But there are other kinds of code where python kicks java’s butt:
list processing http://www.bagley.org/~doug/shootout/bench/lists/
string concatenation http://www.bagley.org/~doug/shootout/bench/strcat/
Java is still an interpreted language after all.
2) *Optional* type declarations will waste the man-hours of development time used to create them, and they will desecrate any code they are in. You never need them.
Saying “Python is nice, so i’d like to be able to write fast code in it” is like saying “C is fast, so i’d like to be able to have true OO in it with polymorphism, and multiple inheritance, and functions are objects, and integers are unbounded, but you don’t have to use these features.” When you add those features, you lose the speed. If you add stuff like this to python just to make it faster at the expense of flexibility, it won’t be “nice” anymore.
-Hugh
But the bad documentation of some projects really makes me wish for static typing. When you expect a method to return one class but it returns another, it just wastes time.
What you are looking for already exists:
“The fundamental nature of Pyrex can be summed up as follows: Pyrex is Python with C data types.
Pyrex is Python: Almost any piece of Python code is also valid Pyrex code. (There are a few limitations, but this approximation will serve for now.) The Pyrex compiler will convert it into C code which makes equivalent calls to the Python/C API. In this respect, Pyrex is similar to the former Python2C project (to which I would supply a reference except that it no longer seems to exist).
…with C data types. But Pyrex is much more than that, because parameters and variables can be declared to have C data types. Code which manipulates Python values and C values can be freely intermixed, with conversions occurring automatically wherever possible. Reference count maintenance and error checking of Python operations is also automatic, and the full power of Python’s exception handling facilities, including the try-except and try-finally statements, is available to you — even in the midst of manipulating C data”
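Here is a rough sketch of what that looks like in practice (my own example, written from the description quoted above rather than taken from the Pyrex manual): ordinary Python with a few cdef declarations in the hot spot.

def scale_sum(values, double factor):
    # cdef gives these names C types; everything else stays ordinary Python
    cdef int i, n
    cdef double total, v
    n = len(values)
    total = 0.0
    for i from 0 <= i < n:
        v = values[i]    # automatic Python-to-C conversion
        total = total + v * factor
    return total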
Rayiner: “Doing numeric code in Python (without Psyco) is ridiculously slow”
No it's not, thanks to the availability of powerful add-ons like the Numeric and DISLIN packages.
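For instance (a sketch of mine, not from the Numeric docs), a single Python-level expression pushes the whole loop down into C:

import Numeric

a = Numeric.arange(100000) * 1.0    # an array of doubles
b = a * 2.0                         # one Python operation, C loop inside
print b[:5]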
list processing http://www.bagley.org/~doug/shootout/bench/lists/
string concatenation http://www.bagley.org/~doug/shootout/bench/strcat/
>>>>>>>
Yes, because those aspects of Python are implemented in C, while those aspects of Java are written in Java. When the raw performance of the language itself comes into play, Java is orders of magnitude faster, because the JIT can generate optimized machine code for inner loops, while the Python code has to go through the bytecode VM for each operation.
Java is still an interpreted language after all.
>>>>>>>>>
Wrong. It's a JIT'ed language, or in some cases, a compiled language. Big difference.
*Optional* type declarations will waste the man-hours of development time used to create them, and they will desecrate any code they are in. You never need them.
>>>>>>
Adding type declarations will take time, but it takes a lot less time to add a few type declarations than to rewrite all the performance-critical parts in C. Just ask the Lisp people — a Lisp program can be made 50% as fast as a C program just by adding a few type declarations to critical loops. It's a process that can take as little as a few hours, because the type declarations are just hints to the compiler. And what is this about "desecrating" code? I would consider gluing C onto Python more of a desecration than a few lines of type declarations.
Saying “Python is nice, so i’d like to be able to write fast code in it” is like saying “C is fast, so i’d like to be able to have true OO in it with polymorphism, and multiple inheritance, and functions are objects, and integers are unbounded, but you don’t have to use these features.” When you add those features, you lose the speed. If you add stuff like this to python just to make it faster at the expense of flexibility, it won’t be “nice” anymore.
>>>>>>>
Bullshit. C++, for example, adds most of those features on top of C without sacrificing performance. In fact, in most cases, the C++ version is faster, because where the C code has to use (slow) function pointers and indirect calls, the C++ code can use inlined function objects. You don’t lose any flexibility through optional type declarations. You simply gain the ability to give hints to the compiler in performance-critical areas of the code.
Functional programming in Python is like playing golf with a baseball bat, but the sentiment is nice. I don’t think any of the currently available hard core FP languages are really ready to compete with Python (let alone C++ or Visual BASIC), but for an interesting perspective, check out Haskell. When Mertz talks about lazy evaluation, declarative style of programming, higher order functions, he’s talking about facts of life in Haskell. Experience the difference if that kind of thing appeals to you.
Haskell also has rigorous static typing, with type inference, so you can write a program with no explicit type information, or add it for clarity at your option. It compiles to native code if you use GHC or nhc98.
David Mertz has a good Haskell intro here:
http://gnosis.cx/publish/tech_index_dw.html
The only weirdness is it stops abruptly on p.13. Still worthwhile despite that, very clear writing that doesn’t assume much from the reader.
Ok, David will be putting up an updated version when he has time. All that’s missing is the phrase, “large-scale componentized systems.”