The Artima Developer Community

Python Buzz Forum
Paralipsis

Thomas Guest
Posted: Sep 25, 2007 12:11 PM

This post originated from an RSS feed registered with Python Buzz by Thomas Guest.
Original Post: Paralipsis
Feed Title: Word Aligned: Category Python
Feed URL: http://feeds.feedburner.com/WordAlignedCategoryPython
Feed Description: Dynamic languages in general. Python in particular. The adventures of a space sensitive programmer.

Reading between the lines, Python has a complex and lengthy relationship with functional programming. A couple of years ago Guido van Rossum wrote:

About 12 years ago, Python acquired lambda, reduce(), filter() and map(), courtesy of (I believe) a Lisp hacker who missed them and submitted working patches. But, despite of the PR value, I think these features should be cut from Python 3000.

The argument against map and filter is that list comprehensions serve the same need; similarly, local functions render lambda inessential. Python prefers there to be a single obvious way to do things. As it happens, lambda will persist – and since Python lambdas are limited to single expressions, they’re quite hard to abuse. Map and filter are out, sort of. In a subtle but inspired move they’re being overwritten by their lazier counterparts from the itertools module, imap and ifilter. So in Python 3000 list comprehensions are the way to create lists from lists (and iterables, of course), and map and filter are two of the most important iterator adaptors/stream processors. All things considered, Pythonic support for functional programming is definitely on the up.
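To make the overlap concrete, here’s a small sketch in Python 3 spelling, where map and filter return lazy iterators just as imap and ifilter did:

```python
# List comprehensions cover the common map/filter cases directly.
squares = [n * n for n in range(10)]
evens = [n for n in range(10) if n % 2 == 0]

# In Python 3, map and filter return lazy iterators (like the old
# itertools.imap and ifilter), so they work as stream processors
# rather than eager list builders.
lazy_squares = map(lambda n: n * n, range(10))
assert list(lazy_squares) == squares

lazy_evens = filter(lambda n: n % 2 == 0, range(10))
assert list(lazy_evens) == evens
```

Nothing is computed by map or filter until the iterator is consumed, which is exactly what makes them good building blocks for pipelines.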

Reduce really is departing from the language core, though I guess homesick Lisp hackers will be able to find it kicking its heels somewhere in functools. The primary argument against reduce seems to be that naughty programmers have abused it to create unreadable and inefficient code.

If I discover my children using their toys inappropriately – for hitting each other, destroying furniture, blocking plumbing, etc. – then the toys are confiscated (after fair warning, and temporarily; I’m a pretty soft dictator). Once removed, these toys become highly desirable; once returned, less so.

I don’t use reduce much (as the article goes on to point out, efficient container operations such as sum, string.join, max, min, and more recent additions to the language like any and all eliminate any common needs for it), or at least I didn’t. Now though, despite – maybe because of – its imminent demise, reduce refuses to be ignored. Here are some of its greatest hits, all of which I’ve found useful recently.
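For the record, here’s how those dedicated built-ins cover the usual reduce territory – a sketch in Python 3 spelling, where reduce itself has retired to functools:

```python
from functools import reduce  # reduce's new home in Python 3
from operator import add

values = [3, 1, 4, 1, 5]

# Each specialised built-in replaces a reduce one-liner.
assert sum(values) == reduce(add, values, 0)
assert max(values) == reduce(lambda a, b: a if a > b else b, values)
assert min(values) == reduce(lambda a, b: a if a < b else b, values)
assert any(v > 4 for v in values) == reduce(
    lambda a, b: a or b, (v > 4 for v in values), False)
assert "".join("abc") == reduce(add, "abc", "")
```

The built-ins aren’t just clearer; sum and str.join in particular avoid the quadratic behaviour a naive reduce can fall into.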

Reduce’s greatest hits
from operator import concat, mul

def unite(sets):
    # Union of an iterable of sets; an empty input gives the empty set.
    return reduce(set.union, sets, set())

def intersect(sets):
    # Intersection has no identity element, so guard against empty input.
    return reduce(set.intersection, sets) if sets else set()

def bits_to_integer(bits):
    # Fold a sequence of bits, most significant first, into an integer.
    return reduce(lambda acc, bit: acc << 1 | bit, bits, 0)

def concatenate(items, initial):
    return reduce(concat, items, initial)

def product(items):
    return reduce(mul, items, 1)
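By way of a worked example, bits_to_integer folds the bits of 0b1011 together one shift at a time: the accumulator goes 0, 1, 2, 5, 11. A self-contained sketch, repeating the definition (with the functools import Python 3 needs):

```python
from functools import reduce  # a no-op under Python 2.6+, required in 3

def bits_to_integer(bits):
    # Each step shifts the accumulator left and ORs in the next bit.
    return reduce(lambda acc, bit: acc << 1 | bit, bits, 0)

assert bits_to_integer([1, 0, 1, 1]) == 0b1011 == 11
assert bits_to_integer([]) == 0  # the initial value handles empty input
```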

Maybe we’ll see a few of these absorbed into the core language (I’m kind of surprised that set.intersection and set.union aren’t already flexible enough to accept more general inputs).
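As it turned out, later Pythons (2.6 and 3.0 onwards) did gain exactly this flexibility: union and intersection accept any number of iterable arguments, so argument unpacking does the job without reduce. A quick sketch:

```python
sets = [{1, 2, 3}, {2, 3, 4}, {3, 4, 5}]

# set.union and set.intersection take multiple iterables directly.
assert set().union(*sets) == {1, 2, 3, 4, 5}

# Intersection still needs at least one set to seed it.
assert set.intersection(*sets) == {3}
```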

In case it looks as if I’m making a case for keeping reduce as a built-in, here’s a chunk of hideously inefficient and dys-functional code which demonstrates why I deserve to have it taken away from me.

Mixing metaphors midstream
from operator import add, mul, div
from functools import partial
from itertools import count, imap, islice

accu = lambda terms: reduce(add, terms, 0)
prod = lambda terms: reduce(mul, terms, 1)
flip = partial(div, 1.0)  # x -> 1.0 / x

def fact(n):
    # n! as the product of 2..n; the empty product makes fact(0) == 1.
    return prod(islice(count(), 2, n + 1))

def sum_n(terms, n):
    return accu(islice(terms, n))

def e():
    # An infinite stream of ever-longer partial sums of 1/0! + 1/1! + ...,
    # recomputing every factorial from scratch at each step.
    def terms():
        return imap(flip, imap(fact, count()))
    return imap(lambda n: sum_n(terms(), n), count())

For what it’s worth, the (current version of) 2to3 leaves this code unchanged.
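Under Python 3 assumptions – map already lazy, reduce in functools, true division – the same stream-of-partial-sums idea might be sketched as below. The names mirror the ones above, but the translation is mine, not 2to3’s:

```python
from functools import partial, reduce
from itertools import count, islice
from operator import add, mul, truediv

accu = lambda terms: reduce(add, terms, 0)
prod = lambda terms: reduce(mul, terms, 1)
flip = partial(truediv, 1.0)  # x -> 1.0 / x

def fact(n):
    # n! as the product of 2..n (the empty product makes fact(0) == 1).
    return prod(islice(count(), 2, n + 1))

def sum_n(terms, n):
    return accu(islice(terms, n))

def e():
    # Infinite stream of partial sums of 1/0! + 1/1! + 1/2! + ...
    def terms():
        return map(flip, map(fact, count()))
    return map(lambda n: sum_n(terms(), n), count())

# A dozen terms already gets close to e = 2.71828...
approx = next(islice(e(), 12, 13))
assert abs(approx - 2.718281828459045) < 1e-6
```

Still hideously inefficient, of course – every partial sum recomputes every factorial – but it runs unchanged on Python 3.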

Copyright © 1996-2019 Artima, Inc. All Rights Reserved. - Privacy Policy - Terms of Use