Some questions that people often ask about Python 3000 (and answers). (First in a series of posts.)
Q. I want to learn Python. Should I learn Python 2.6 or Python 3.0?
A. Definitely learn Python 2.x (the latest version out is 2.5). I expect it'll be two years before you'll need to learn Python 3.0, and the differences aren't that great from a beginner's perspective: most of what you'll learn about 2.x will still hold about 3.0.
Q. If you're killing reduce(), why are you keeping map() and filter()?
A. I'm not killing reduce() because I hate functional programming; I'm killing it because almost all code using reduce() is less readable than the same thing written out using a for loop and an accumulator variable. On the other hand, map() and filter() are often useful and when used with a pre-existing function (e.g. a built-in) they are clearer than a list comprehension or generator expression. (Don't use these with a lambda though; then a list comprehension is clearer and faster.)
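The tradeoff can be sketched like this (a minimal sketch, using int as the pre-existing built-in; the list() calls reflect 3.0, where map() and filter() return iterators):

```python
from functools import reduce  # where reduce() lives as of 3.0

nums = ["3", "14", "159"]

# With a pre-existing function, map() and filter() read well:
print(list(map(int, nums)))             # [3, 14, 159]
print(list(filter(str.isdigit, nums)))  # all entries are digits here

# With a lambda, a comprehension is clearer (and faster):
print([int(s) * 2 for s in nums])

# reduce() vs. the equivalent for loop with an accumulator variable:
total = reduce(lambda acc, s: acc + int(s), nums, 0)
acc = 0
for s in nums:
    acc += int(s)
print(total, acc)  # both 176
```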
Q. Multi-core processors will be standard even on laptops in the near future. Is Python 3.0 going to get rid of the GIL (Global Interpreter Lock) in order to be able to benefit from this feature?
A. No. We're not changing the CPython implementation much. Getting rid of the GIL would be a massive rewrite of the interpreter because all the internal data structures (and the reference counting operations) would have to be made thread-safe. This was tried once before (in the late '90s by Greg Stein) and the resulting interpreter ran twice as slow. If you have multiple CPUs and you want to use them all, fork off as many processes as you have CPUs. (You write your web application to be easily scalable, don't you? So if you can run several copies on different boxes it should be trivial to run several copies on the same box as well.) If you really want "true" multi-threading for Python, use Jython or IronPython; the JVM and the CLR do support multi-CPU threads. Of course, be prepared for deadlocks, live-locks, race conditions, and all the other nuisances that come with multi-threaded code.
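The "one process per CPU" approach can be sketched with the multiprocessing module (added in 2.6; any process-spawning mechanism, such as os.fork, works as well):

```python
from multiprocessing import Pool, cpu_count

def work(n):
    # CPU-bound; each call runs in its own process with its own GIL.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(work, [100000] * cpu_count())
        print(results)
```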
Q. I prefer to use the same source code for 2.x and 3.0; I really don't want to have to use the 2to3 source conversion tool. Why can't you make that work?
A. Suit yourself. The intersection of 2.6 and 3.0 is large, but there are several things you can't do: you can't use Unicode literals (2.6 only) nor bytes literals (3.0 only). The only print syntax that works the same in both versions is print(x), with a single argument. When you catch exceptions you can't inspect their values, because the 2.6 syntax uses a comma (except E, e:) while 3.0 uses as (except E as e:). You can't use .iterkeys(), and .keys() works differently in 2.6 and 3.0. You can't use xrange(), and range() works differently. You can't use metaclasses, as the syntax for specifying a metaclass is completely changed in 3.0. And so on. Restricting yourself to the intersection of the two versions is painful and limiting. We're not introducing backwards compatibility syntax in 3.0, because that would defeat the purpose (we've had backwards compatibility forever in 2.x, and the whole point of 3.0 is to clean up the mess).
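A sketch of what code confined to that intersection looks like (describe is a hypothetical helper; it avoids everything listed above and never relies on range() or .keys() returning a list):

```python
def describe(mapping):
    # print(x) with a single argument parses the same in 2.6 and 3.0.
    print("%d keys" % len(mapping))
    # No .iterkeys() or xrange(): iterate the dict directly, and wrap
    # range() in list() if an actual list is ever needed.
    keys = []
    for key in sorted(mapping):
        keys.append(key)
    return keys

print(describe({"b": 2, "a": 1}))
```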
P.S. For another batch of Q/A pairs, see the sequel post.
Q: Will Python 3000 provide a reliable binary interface for strings?
The ROX desktop (http://rox.sf.net) is largely written in Python, but we have always had to avoid using Python C modules. The reason is that some Python binaries use UCS2 and some use UCS4. Either would be fine for us, but not knowing means we can't use modules at all!
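For what it's worth, a build's flavor can be detected at runtime via sys.maxunicode (a minimal sketch; narrow builds report 0xFFFF, wide builds 0x10FFFF):

```python
import sys

# Narrow (UCS-2) builds report 0xFFFF; wide (UCS-4) builds 0x10FFFF.
if sys.maxunicode == 0xFFFF:
    print("narrow (UCS-2) build")
else:
    print("wide (UCS-4) build")
```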
I know we're not the only people with this problem, e.g. the Autopackage developers wrote this:
(First of all, thanks for Python. It's my language of choice for development.)
Q: Are there any plans to support statements in lambda, or rather, to make statements expressions so we can use def instead of lambda?
... map(def (x): for i in x: do some stuff... return z , some_list) ...
(This would require turning def into an expression, but I'd suggest turning all statements into expressions. In my humble opinion, statements are a design mistake of most programming languages; they subtract expressiveness and have no advantages over expressions.)
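Without multi-statement lambdas, the usual workaround is a named def passed to map() (a sketch; process and some_list are made-up names standing in for the question's pseudocode):

```python
def process(xs):
    # The multi-statement body the lambda could not hold:
    z = 0
    for i in xs:
        z += i  # "do some stuff"
    return z

some_list = [[1, 2], [3, 4, 5]]
print(list(map(process, some_list)))  # [3, 12]
```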
Q: Any plans to make Python implementations require tail call optimization?
Being an optimization, it's transparent and doesn't make the language any more complex than it is. In fact, it introduces no new keywords or symbols, and it's backwards-compatible. It's useful for both functional-style and non-functional-style programming, and it makes things easier for Python users who want to implement recursive algorithms, which are easier to think about than iterative algorithms, at least for some people; and it doesn't affect people not interested in it.
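CPython performs no such optimization today: even a perfectly tail-recursive function consumes a stack frame per call, as this minimal sketch shows.

```python
import sys

def count_down(n):
    # Tail-recursive, but CPython still allocates a frame per call.
    if n == 0:
        return "done"
    return count_down(n - 1)

def count_down_iter(n):
    # The iterative rewrite runs in constant stack space.
    while n:
        n -= 1
    return "done"

big = sys.getrecursionlimit() * 10
try:
    count_down(big)
except RecursionError:
    print("recursion limit hit")

print(count_down_iter(big))
```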
Q: Will Python 3000 support "real" private, protected and public object members? The "_" syntax is terrible, and I think this is the time to change it, since the language will break backwards compatibility anyway.
Q: Will Python 3000 support static typing, or something like that?
Most people's suggestions here are for horrible things like keeping reduce or whatever, and I imagine you will summarily dismiss them. I did think of one thing that should be added to Py3k that perhaps has not been done yet. Currently, if you say str(i for i in "string"), the response you get is '<generator object at ...>'. In other words, str is just returning the repr of the generator, not using the generator in a meaningful way to build a new object. I think that's inappropriate. While of course '<generator object at ...>' should be the response to repr(i for i in "string"), nevertheless since str is a class that has an __iter__ method, it should be able to take in an iterator for meaningful use as well. This would allow us to write code like:
>>> str(i for i in "string" if i not in "aeiou")
'strng'
Also, it would let us write generators that preserve the type of their initial object. Here's how it currently works:
>>> mycond = lambda i: True
>>> it = [1, 2]
>>> type(it)(i for i in it if mycond(i))
[1, 2]
>>> it = (1, 2)
>>> type(it)(i for i in it if mycond(i))
(1, 2)
>>> it = set([1, 2])
>>> type(it)(i for i in it if mycond(i))
set([1, 2])
>>> it = "abc"
>>> type(it)(i for i in it if mycond(i))
'<generator object at 0x789e0>'
Can you spot the odd item out in that list? ;-) In fact, I wouldn't be surprised if someone has done this already, but nevertheless I'd like to point it out anyway, just in case. Fix the __str__ method on generator-type objects please.
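Pending any such change, ''.join already gives the string behavior, and the type-preserving pattern can be wrapped in a hypothetical helper (a sketch, not part of any stdlib):

```python
def filtered_like(it, keep):
    # str is the odd one out: str(iterator) returns the repr, so strings
    # need ''.join; every other type here accepts an iterable directly.
    if isinstance(it, str):
        return "".join(c for c in it if keep(c))
    return type(it)(x for x in it if keep(x))

print(filtered_like("string", lambda c: c not in "aeiou"))  # strng
print(filtered_like([1, 2, 3], lambda x: x > 1))            # [2, 3]
print(filtered_like((1, 2, 3), lambda x: x > 1))            # (2, 3)
```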
Removing reduce cripples the use of Python for functional programming. I think it's a little snooty to say suggestions to keep it should be "summarily dismissed". The functional paradigm is mature and powerful, and to say you can just do the same thing as reduce with a loop and accumulator is like saying lower-level constructs are better than higher-level ones - which is the opposite of what Python stands for. Removing reduce denigrates one style of programming over another. Why not keep it if it's useful? For those of us who have invested time in FP, and who appreciate Python's multi-paradigm posture, this is a blow. If anything, filter is the least useful of the FP triad, since a list comprehension works equally as well.
Suppose I want the composition of an arbitrary number of functions. This is a nice feature to package functionality and reuse it. The following function is very handy:
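The snippet itself did not survive in this archive; a reconstruction consistent with the replies below (a reduce() over list(reversed(fs)), applying the last function first) might look like:

```python
from functools import reduce  # reduce lives in functools as of 3.0

def c(fs):
    # Compose a list of functions: c([f, g, h])(x) == f(g(h(x))).
    # The list() around reversed() matches the later reply, which
    # reports an error without it under 2.5.1.
    return reduce(lambda f, g: lambda x: g(f(x)), list(reversed(fs)))

add1 = lambda x: x + 1
dbl = lambda x: x * 2
print(c([add1, dbl])(3))  # dbl first, then add1: 7
```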
So you're going to make me write a loop when I am in FP mode? This is good FP and Pythonic at the same time. I just don't understand the push to remove reduce. I guess you can remove FP from the list of paradigms that Python supports.
Well, you've got me there. I can type this in and experiment with it until I believe it works, but I can't wrap my head around it until I've written the functionality you describe like this:
>>> def c(fs):
...     def F(x):
...         for f in reversed(fs): x = f(x)
...         return x
...     return F
...
and mentally mapped each part of my version to a component of yours. Yours may be shorter, but it doesn't fit my brain! And why exactly does it need a list() call around the reversed() call?
> So you're going to make me write a loop when I am in FP
> mode? This is good FP and Pythonic at the same time. I
> just don't understand the push to remove reduce. I guess
> you can remove FP from the list of paradigms that Python
> supports.
FP as I understand it was never a paradigm supported all that well by Python. FP works well in combination with super smart compilers. But Python's semantics are too dynamic for a compiler to have a chance at optimizing the code.
But it's a matter of familiarity. It's like the STL in C++ - people have trouble with it at first, but then it becomes a language in itself. That's the whole point. It becomes part of your thinking, the components you build solutions with. Just because *you* have to pull it apart first only means that *you* don't find the FP style natural. Why impose your preferences on the community when reduce() is already there? With all due respect (and I mean that sincerely, I care deeply about Python - it resonates with my inner programmer like no other language), this may cause dropping the B in BDFL, don't you think? Why limit us? I mean, all abstractions seem unnatural at first, but then they become tools for progressing to the next abstraction, etc.
The list wrapper is necessary because reduce() wants a list - not an iterable. I got an error with 2.5.1 before I added the list. Maybe that's an enhancement for reduce() to consider for 3.0 :-).
> Just because *you* have to pull it apart first only
> means that *you* don't find the FP style natural. Why
> impose your preferences on the community when reduce() is
> already there?
Exactly because there's only a very small minority who is capable of using it correctly; most Python users either don't use it or use it wrongly. I have done some research to back this up.
Python fits in *my* brain, and attracts users whose brain works somewhat similar to mine. I'm trying to keep the language optimal for this self-selecting audience. Trying to cater to everyone's preferences would result in a language that nobody likes.
Reduce() is *essential* in FP languages because they don't have the option of writing loops with side effects. It is not essential for Python at all; and as a mere convenience, given the evidence that most uses are abuse, it clearly falls short.
PS. It isn't really removed; it's been resurrected in functools. You can also write it yourself in three lines of course.
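Both routes in one sketch: the functools import, and a do-it-yourself version whose body is essentially the three lines mentioned.

```python
from functools import reduce

def my_reduce(f, seq, acc):
    # The do-it-yourself reduce: a loop and an accumulator variable.
    for x in seq:
        acc = f(acc, x)
    return acc

print(reduce(lambda a, b: a + b, [1, 2, 3, 4], 0))     # 10
print(my_reduce(lambda a, b: a + b, [1, 2, 3, 4], 0))  # 10
```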
Okay, but it seems similar logic can be applied to map and filter, which are higher-order functions. Again, it seems like such a small thing, to leave reduce in; I can't see how it affects the rest of the language much - but you have a much better perspective on that, so I acquiesce. FWIW, functools.partial() is cool, but not a very readable way to mimic reduce's functionality, IMO.