Showing posts with label Python. Show all posts

Saturday, June 26, 2010

PyFilesystem and "filepath": abstracting the file-system in Python

PyFilesystem 0.3 released

Will McGugan: I am pleased to announce a new version of PyFilesystem (0.3), which is a Python module that provides a common interface to many kinds of filesystems. Basically it provides a way of working with files and directories that is exactly the same, regardless of how and where the file information is stored. Even if you don't plan on working with anything other than the files and directories on your hard-drive, PyFilesystem can simplify your code and reduce the potential for error.

PyFilesystem is a joint effort by myself and Ryan Kelly, who has created a number of new FS implementations such as Amazon S3 support and Secure FTP, and some pretty cool features such as FUSE support and Django storage integration.
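The "exactly the same, regardless of how and where the file information is stored" idea can be sketched with two toy backends behind one interface. This is an illustrative sketch only: the class and method names echo PyFilesystem's, but the implementations here are invented stand-ins, not the library's code.

```python
import os
import tempfile

class MemoryFS:
    """Toy in-memory backend (illustrative only, not PyFilesystem's real code)."""
    def __init__(self):
        self._files = {}

    def setcontents(self, path, data):
        self._files[path] = data

    def getcontents(self, path):
        return self._files[path]

class OSFS:
    """Toy on-disk backend exposing the same interface."""
    def __init__(self, root):
        self.root = root

    def setcontents(self, path, data):
        with open(os.path.join(self.root, path), "wb") as f:
            f.write(data)

    def getcontents(self, path):
        with open(os.path.join(self.root, path), "rb") as f:
            return f.read()

def save_and_reload(fs):
    # the calling code is identical no matter where the bytes actually live
    fs.setcontents("report.txt", b"hello")
    return fs.getcontents("report.txt")

assert save_and_reload(MemoryFS()) == b"hello"
assert save_and_reload(OSFS(tempfile.mkdtemp())) == b"hello"
```

Code written against the common interface, like save_and_reload above, never needs to change when the storage backend does.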

[MMMG: Compare this to "filepath 0.1" from Jp Calderone]

http://jcalderone.livejournal.com/56137.html

Jp Calderone: I'm happy to announce the initial release of filepath.

filepath is an abstract interface to the filesystem. It provides APIs for path name manipulation and for inspecting and modifying the filesystem (for example, renaming files, reading from them, etc). filepath's APIs are intended to be easier than those of the standard library os.path module to use correctly and safely.

filepath is a re-packaging of the twisted.python.filepath module independent from Twisted (except for the test suite which still depends on Twisted Trial).

The low version number of this release reflects the newness of this packaging. The implementation is almost entirely mature and well tested in real-world situations from its time as part of Twisted.

You can find the package on PyPI or Launchpad:

http://pypi.python.org/pypi/filepath/0.1
https://launchpad.net/filepath
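Part of what makes FilePath-style APIs "easier to use correctly and safely" than os.path is that child segments are validated before they touch the filesystem. A minimal sketch of that idea (ToyFilePath is invented for illustration; it is not the real filepath/twisted.python.filepath implementation):

```python
import os

class ToyFilePath:
    """Sketch of the FilePath safety idea: child() refuses path segments
    that could wander out of the base directory, which a plain
    os.path.join would happily allow."""
    def __init__(self, path):
        self.path = path

    def child(self, name):
        if "/" in name or os.sep in name or name in (os.curdir, os.pardir):
            raise ValueError("invalid path segment: %r" % name)
        return ToyFilePath(os.path.join(self.path, name))

base = ToyFilePath("/srv/uploads")
print(base.child("photo.png").path)

try:
    base.child("../../etc/passwd")   # untrusted input can't escape
except ValueError as exc:
    print(exc)
```

Compare os.path.join("/srv/uploads", "../../etc/passwd"), which silently produces a path outside the base directory.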

MMMG: This is all great stuff. From what I saw, the API of PyFilesystem seems like the winner, at least to my eyes. I will steal the best code from "filepath" to augment my personal version of PyFilesystem, then I will see what I can contribute back to these two wonderful projects.

Tuesday, January 5, 2010

Optimize python functions by marking certain promises about its behavior : Python

promise: bytecode optimisation using staticness assertions.


This is a module for applying some simple optimizations to function bytecode. By promising that a function doesn't do certain things at run-time, it's possible to apply optimizations that are not legal in the general case.


Reddit comments on "Optimize python functions by marking certain promises about its behavior":

"I gave a talk on this at our local users group a few weeks ago, the slides are online if anyone's interested:

http://wiki.python.org/moin/MelbournePUG?action=AttachFile&do=get&target=promise.odp"

The above presentation is really nice - great, simple example of the power of this technique.

I was thinking along these lines. Having code where we specify two "speeds":

(1) flexibility/expressiveness/global-mutability/side-effects-happen-globally-immediately are important (at cost to throughput and low-latency)

i.e. dispatch on pattern matching of ASTs against a global mutable list of patterns, global mutable generic functions, global mutable generic methods

Allow global mutable objects in general. All mutable state is handled like distributed revision control with write-on-change, with all communication going through a keyhole (enforcing low latency by throttling large data transfers) as asynchronous messages forming transactions (and building up transactions).

Side effects are GO! Allow global state to change, allow outside communication, all happening ASAP; the system might pre-calculate while waiting for a reply, but it waits to the bitter end for the reply nonetheless.

(2) throughput and low-latency (performance) are important (at cost to flexibility/expressiveness/global-mutability) - no side effects (all "side-effect messages" are stored, to be returned as a group when function returns, maybe each "side-effect message" is paired with a continuation)


[Image via Wikipedia: a simple directed acyclic graph]
i.e. dispatch on low-level bytecode (think: LLVM); preferred data structures are immutable and have the "shape" of a directed acyclic graph (possibly a much more restricted form of directed acyclic graph, where each node has an immutable index, and parent indexes are always less than child indexes); the limited, explicitly mutable state is local to an OS thread or green thread. No traditional asynchronous side-effects allowed.
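The restricted DAG form described above can be sketched concretely: if every edge is forced to go from a lower index to a higher one, cycles are impossible by construction, and a plain left-to-right pass over the nodes is already a topological order. (IndexedDAG is an invented name for illustration.)

```python
class IndexedDAG:
    """Sketch: a DAG where parent indexes are always less than child
    indexes, so acyclicity holds by construction."""
    def __init__(self):
        self.edges = []   # (parent, child) pairs with parent < child

    def add_edge(self, parent, child):
        if not parent < child:
            raise ValueError("parent index must be less than child index")
        self.edges.append((parent, child))

dag = IndexedDAG()
dag.add_edge(0, 1)
dag.add_edge(0, 2)
dag.add_edge(1, 3)
# dag.add_edge(3, 1) would raise ValueError: such an edge could form a cycle
```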

And there are speeds in the middle, reached by being explicit about what you are relaxing and what you are constraining. Typically, you expect to pay a "compiling" cost to get down to the low-level bytecode; if you are doing a lot of this "compiling", that same explicitness about the trade-offs is what lets you improve it.

Low-latency (the ability to best react to asynchronous signals) will always be preferred over throughput. If you want to put throughput over low-latency, you have to explicitly say so, with the knowledge that you have very little influence over that isolated code (isolated so that it can make no compromise in throughput).

[Edit]

Hey, cool.  People are trying "Promise" out on code in the wild.  Here is creator Ryan Kelly explaining how to make use of the bytecode improvements:

http://panela.blog-city.com/the_promise_of_faster_python_1.htm#1

1. Ryan Kelly left...
2010.01.05 Tue 4:25 pm :: http://www.rfk.id.au/
Matt, thanks for taking the time to put this together. The optimizations applied by promise are certainly not in the same league as something like psyco - they have to be quite well targeted to have any measurable effect.
Some clarifications: promising a function pure() doesn't optimise that function at all, but it can speed up things that call that function by inlining its bytecode at the call site. To get this to work, you have to use constant() to promise that references to the pure function won't change. Example:

@promise.pure()
def calculate(a, b):
    return a + 2*b

@promise.constant(("calculate",))
def aggregate(pairs):
    return  # body elided in the original comment

In this scenario, the bytecode for "calculate" will be inlined directly into the "aggregate" function and will save the overhead of many function calls.
By far the biggest speedup that can currently be obtained using promise is inlining pure functions that get called in a loop.
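To see what that inlining is buying, the effect can be approximated by hand: compare a pure function called in a loop against the same loop with the function's body substituted at the call site. This sketch does not use the promise module itself; aggregate_calls and aggregate_inlined are invented names for illustration.

```python
def calculate(a, b):
    # a "pure" function: result depends only on its arguments
    return a + 2 * b

def aggregate_calls(pairs):
    # pays function-call overhead on every iteration
    return sum(calculate(a, b) for a, b in pairs)

def aggregate_inlined(pairs):
    # what bytecode inlining effectively produces: the pure function's
    # body substituted at the call site, avoiding the call overhead
    return sum(a + 2 * b for a, b in pairs)

pairs = [(i, i + 1) for i in range(1000)]
assert aggregate_calls(pairs) == aggregate_inlined(pairs)
```

Timing the two with timeit shows the per-call overhead that promise's inlining removes, without any change to the code you actually write.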

Tuesday, November 10, 2009

Mark Chu-Carroll on Haskell

[Image via Wikipedia: Simon Peyton Jones]

It is always a sign of "programming maturity" when a programmer can readily admit the serious shortcomings of their preferred language. It raises respect for both the programmer and the language discussed.
Philosophizing about Programming; or "Why I'm learning to love functional programming" : Good Math, Bad Math: "

But it's not all good. Haskell has some serious problems. In particular, it's got two issues that worry me enough that I'm still a bit hesitant to recommend it for a lot of applications. Those two are what I call lazy confusion, and monad complexity.

By lazy confusion, I mean that it's often extremely difficult to predict what's going to happen in what order in a Haskell program. You can say what the result will be, but you can't necessarily say what order the steps will happen in. That's because Haskell uses lazy evaluation, which means that no computation in Haskell is really evaluated until its result is used. You can write Haskell programs that generate infinitely long lists - but it's not a problem, because no element of the list is ever evaluated until you try to use it, and you'll never use more than a finite number of elements. But lazy evaluation can be very confusing: even Haskell experts - even people who've implemented Haskell compilers! - sometimes have trouble predicting what code will be executed in what order. In order to figure out the computational complexity of algorithms or operations on data structures, people often wind up basically treating the program as if it were going to be evaluated eagerly - because analyzing the laziness is just too difficult. Laziness is not a bad thing; in fact, I'm pretty convinced that very frequently, it's a good thing, which can make code much cleaner and clearer. But the difficulty of analyzing it is a major concern.

...


Monad complexity is a very different problem. In Haskell, most code is completely stateless. It's a pure functional language, so most code can't possibly have side effects. There's no assignments, no I/O, nothing but pure functions in most Haskell code. But state is absolutely essential. To quote Simon Peyton-Jones, one of the designers of Haskell: "In the end, any program must manipulate state. A program that has no side effects whatsoever is a kind of black box. All you can tell is that the box gets hotter." The way that Haskell gets around that is with a very elegant concept called a monad. A monad is a construct in the program that allows you to create an element of state, and transparently pass it through a sequence of computations. This gives you functional semantics for a stateful computation, without having to write tons of code to pass the state around.

...

The reason that that's a problem is that there are multiple different monads, to represent different kinds of state. There are monads for mutable arrays - so that you can write efficient matrix code. There are monads for parsing, so that you can write beautiful parsers. There are monads for IO, so that you can interact with the outside world. There are monads for interacting with external libraries written in non-functional languages. There are monads for building graphical UIs. But each of them has a packet of state that needs to be passed between the steps. So if you want to be able to do more than one monadic thing - like, say, write a program with a GUI that can also read and write files - you need to be able to combine monads. And the more monads you need to combine, the more complicated and confusing things can get.

"
Python took the idea of "generators", added the easy syntax of the "yield" statement, and, most importantly, let the generator be a first-class object. It ended up more powerful than Icon, the programming language that inspired generators. Maybe a multi-paradigm language like Python - where the ideas of "state handling/managing", cross-process and intra-process global state handling, and ways of expressing different forms of lazy evaluation are added - and none of these are required and all of these are available, and all able to be wrapped up in a first-class object - would be the correct approach.
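Generators give Python exactly the "infinitely long list, evaluated only on demand" behavior described above, as a short sketch shows:

```python
from itertools import count, islice

def squares():
    # an "infinite list": nothing is computed until a value is requested
    for n in count():
        yield n * n

# only the first five elements are ever evaluated
first_five = list(islice(squares(), 5))
print(first_five)  # [0, 1, 4, 9, 16]
```

And because the generator is a first-class object, it can be passed around, stored, and consumed incrementally by different parts of a program.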


Saturday, October 24, 2009

Python News: PSF adopts Diversity Statement

What do I think about diversity statements? The Python Software Foundation has published and endorsed one:
The Python Software Foundation and the global Python community welcome and encourage participation by everyone. Our community is based on mutual respect, tolerance, and encouragement, and we are working to help each other live up to these principles. We want our community to be more diverse: whoever you are, and whatever your background, we welcome you.
Also see the main diversity page for additional resources on efforts to promote diversity within the Python community.


Diversity/multiculturalism is a means to an end, not an end in itself.
I am for diversity/multiculturalism, now, in western culture, because there are too few natural experiments in effective living. More natural experiments by different cultural, national, religious, social-sexual groups lead to better outcomes.
Society would give more people happiness, fulfilment, health, reproductive opportunities in the absence of pain and anxiety if more people copied the best of other cultural, national, religious, social-sexual groups.
I take exception to the use of the word "tolerance". To tolerate one thing, you must be intolerant of forces in opposition to that thing. "Tolerance", taken by itself, is meaningless unless the form of the corresponding intolerance is specified.
As a human, I have a built-in bias against other cultural, national, religious, and social-sexual groups. I strive to rise above these biases, and act accordingly. I am intolerant of those who would act against other cultural, religious, and social-sexual groups for their own benefit or comfort. I also take exception to the use of the word "diverse". There are some forms of diversity I value, and others I do not. I would not appreciate diversity that welcomes motivated stupidity, motivated sloth, motivated controversy, and motivated egoistic criticism. I feel strongly that those who practice motivated stupidity, motivated sloth, motivated controversy, and motivated egoistic criticism should feel unwelcome. ("Motivated" is important here; I don't mean those who do so out of understandable ignorance, or who are incapable of constructive communication because of situational stress.)
The forms of diversity I value are, again, cultural, national, religious, and social-sexual, for the reasons stated above. Not that anyone should care; I just wanted the exercise of making my thoughts more precise.


I have a terrible temper, and act like a lout, many times. I hate communities that rate politeness over technical correctness, but I cannot defend my loutish stupidity. I don't mind getting called out - I deserve it.
Python News: PSF adopts Diversity Statement: "

On October 12, 2009, the board of the Python Software Foundation voted to adopt a Diversity Statement.

"

Friday, October 16, 2009

Voidspace Techie Blog - storage.py

My comment on Fuzzyman's 'storage.py' module.


Voidspace Techie Blog - Comments: "your 'storage.py', backed by a native file system for persistence, could solve a lot of problems with untrusted users supplying filepath input. Twisted's 'filepath' module tries to do this http://glyf.livejournal.com/75142.html on a limited basis, but your storage.py as a level of abstraction between the untrusted user filepath input and the actual native file system would do this more comprehensively. Also, persistence backed by Google BigTable, etc. Great work!"
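The abstraction-layer point in that comment can be sketched: if untrusted names are mapped to opaque keys before they ever reach the real filesystem, path-traversal input becomes harmless. This is an illustrative sketch only, not Fuzzyman's actual storage.py API; SafeStore and its methods are invented names.

```python
import hashlib

class SafeStore:
    """Sketch: untrusted names never touch a real filesystem path
    directly; they are mapped to opaque keys first."""
    def __init__(self):
        self._data = {}   # a file-backed version would write under a
                          # fixed directory, using _key(name) as filename

    def _key(self, untrusted_name):
        # '../../../etc/passwd' hashes to an opaque, traversal-proof key
        return hashlib.sha1(untrusted_name.encode("utf-8")).hexdigest()

    def set(self, name, value):
        self._data[self._key(name)] = value

    def get(self, name):
        return self._data[self._key(name)]

store = SafeStore()
store.set("../../../etc/passwd", b"harmless")
print(store.get("../../../etc/passwd"))  # b'harmless', nothing escaped
```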

Thursday, August 27, 2009

Dave Beazley writes more on Python's GIL

Inside the "Inside the Python GIL" Presentation
My comment: Thanks to Dave Beazley for doing some of the very best writing and research on Python's GIL. It removed much of my own stupidity on the subject.


What seems to be the answer:

*** I/O bound threaded code, with no risk of context switches swamping the CPU: for CPython's implementation, implementing the GIL using both a condition variable and a pthreads/OS mutex lock is the way to go, so that code can be developed on a single-core machine and not blow up on a multiple-core machine (and vice versa, to a degree).

*** I/O bound threaded code, with the possibility of context switches swamping the CPU: as the number of "concurrent" events grows, there is a possibility of the CPU being swamped by the context switches involved in OS threading. The solution, in pure Python, is event-driven multi-threading and deferred objects, using the Twisted library or rolling your own. You then have the problem of avoiding writing blocking code, such as loops; code that has the possibility of blocking must regularly check/pause to handle queued events.

*** CPU intensive threaded code:

1) event-driven code and deferred objects (using Twisted or rolling your own), being careful to avoid loops and other long-running code

2) "green" threads, implemented in pure Python: maintain your own stack of tuples of ("functionname", arg0, arg1) and ("continuationname", statevar0, statevar1), and dispatch on name from that stack, being careful to avoid loops and other long-running code; if you have some code with a loop, break it up into more than one continuation ("continuationname0", "continuationname1", etc.)
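The "explicit stack of tuples, dispatched on name" idea can be sketched as a tiny trampoline. All names here are invented for illustration; the point is that a long loop is split into resumable continuation steps, so no single task can hog the interpreter.

```python
from collections import deque

def run(tasks):
    """Sketch of pure-Python 'green threads': a run queue of
    (name, state...) tuples, dispatched on the name."""
    results = {}
    queue = deque(tasks)
    while queue:
        entry = queue.popleft()
        if entry[0] == "sum_start":
            _, task_id, n = entry
            queue.append(("sum_continue", task_id, n, 0, 0))
        elif entry[0] == "sum_continue":
            _, task_id, n, i, acc = entry
            stop = min(i + 100, n)        # a bounded slice of the loop
            for j in range(i, stop):
                acc += j
            if stop < n:
                # not finished: requeue the continuation so other tasks
                # (and queued I/O events) get a turn in between
                queue.append(("sum_continue", task_id, n, stop, acc))
            else:
                results[task_id] = acc
    return results

print(run([("sum_start", "a", 1000), ("sum_start", "b", 10)]))
# a -> 499500, b -> 45; the two tasks take turns, 100 iterations at a time
```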


3) message passing between Python processes

4) the multiprocessing module from the standard library, a threading work-alike

5) a combination of the above four

6) Why not modern shared-mutable-state threading as in Java and C#, with the implementation in the virtual machine and the language and library constructs? Why not sweep away all the complexities of the above five with a single broom? I am prejudiced against general shared-mutable-state threading because it is brittle and non-deterministic. That makes it a non-starter: you are never able to make ANY guarantees about low latency and performance after ANY change in the code, no matter how small. And to regain adequate low latency and performance, your implementation could get very hairy very quickly. Of course, the penalty paid by my suggested approach is a hairy implementation right off the bat; I have to be honest about that.

It seems to me: any techniques adequate to handle CPU intensive multi-threaded code would be overkill for I/O bound multi-threaded code. So it is best to deal with the cases separately. [Use Google Books to read about "shared mutable state threading in Java": http://books.google.com/books?q=shared+mutable+state+threading+java&btnG=Search+Books ]

The biggest missing piece: in a long-running, high-availability application (a prime candidate for multi-threaded code), reloading code on the fly. Right now, a terrible solution is using Erlang as a thin layer of supervisor code, with the real work farmed out to Python. The only advantage of this approach is avoiding predictable failure modes. Armin Ronacher blogged about this problem: http://lucumr.pocoo.org/2009/7/24/singletons-and-their-problems-in-python
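Option (3), message passing between Python processes, can be sketched with the standard library: each worker process has its own interpreter and its own GIL, and the only coupling is the messages on the channel. Here the channel is a worker's stdin/stdout via subprocess (a stand-in for real channels such as pipes or sockets between long-lived processes); the function names are invented for illustration.

```python
import subprocess
import sys

# a tiny worker program: reads integers on stdin, writes squares on stdout
WORKER = "import sys\nfor line in sys.stdin:\n    print(int(line) ** 2)\n"

def squares_in_subprocess(numbers):
    proc = subprocess.Popen(
        [sys.executable, "-c", WORKER],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
    )
    # send all the messages, close the channel, collect the replies
    out, _ = proc.communicate("\n".join(str(n) for n in numbers) + "\n")
    return [int(line) for line in out.split()]

print(squares_in_subprocess([1, 2, 3]))  # [1, 4, 9]
```

The multiprocessing module (option 4) packages this same process-per-worker pattern behind a threading-like API, e.g. Pool.map.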

Monday, July 20, 2009

Dive Into Mark: Universal Encoding Detector

Character encoding auto-detection in Python 2 and 3. As smart as your browser. Open source. More great open-source code from Mark Pilgrim.
>>> import urllib
>>> urlread = lambda url: urllib.urlopen(url).read()
>>> import chardet
>>> chardet.detect(urlread("http://google.cn/"))
{'encoding': 'GB2312', 'confidence': 0.99}

>>> chardet.detect(urlread("http://yahoo.co.jp/"))
{'encoding': 'EUC-JP', 'confidence': 0.99}

>>> chardet.detect(urlread("http://amazon.co.jp/"))
{'encoding': 'SHIFT_JIS', 'confidence': 1}

>>> chardet.detect(urlread("http://pravda.ru/"))
{'encoding': 'windows-1251', 'confidence': 0.9355}

>>> chardet.detect(urlread("http://auction.co.kr/"))
{'encoding': 'EUC-KR', 'confidence': 0.99}

>>> chardet.detect(urlread("http://haaretz.co.il/"))
{'encoding': 'windows-1255', 'confidence': 0.99}

>>> chardet.detect(urlread("http://www.nectec.or.th/tindex.html"))
{'encoding': 'TIS-620', 'confidence': 0.7675}

>>> chardet.detect(urlread("http://feedparser.org/docs/"))
{'encoding': 'utf-8', 'confidence': 0.99}
Great to know!

Thursday, July 2, 2009

Armin Ronacher knows the correct way to use Python's "super"... Do you?

Tweet from "mitsuhiko" (Armin Ronacher) on the mis-uses of Python's "super".


http://twitter.com/mitsuhiko/status/2438234176

"super" is the built-in function that traverses the Method Resolution Order of base classes when delegating work inside a method. This can be non-trivial, because Python supports multiple inheritance. It is very nice to have a built-in function to do this, but this is the correct way to call "super":
class TypeOutTheClassNameHere(Class1, Class2):

    # methodname - method provided to support delegation

    def methodname(self, x, y, z):
    
        # If you wish for Side-Effects...
        heavy_lifting1(x, y, z)
        
        result = super(TypeOutTheClassNameHere, self).methodname(x, y, z)
        
        return heavy_lifting2(result)
        
# greetings to library users:
# TypeOutTheClassNameHere is ready to be sub-classed!

class OtherClass(TypeOutTheClassNameHere):

    def methodname(self, x, y, z):
    
        heavy_lifting3(x, y, z)
        
        result = super(OtherClass, self).methodname(x, y, z)
        
        return heavy_lifting4(result)
Really Important Point: without unit-testing of the implementation and of the success of delegation, using "super" is pure vanity. "super", by itself, cannot magically make your code handle multiple inheritance correctly! Not only do you have to test your class, you have to test that future users of your code, inheriting from your class, will get correct behavior. Yup, your unit-test suite may include creating one-off classes just for testing!

Armin's tweet demonstrates the prevalence of incorrect use of "super". If you fail to "TypeOutTheClassNameHere" - say, by writing super(type(self), self) instead - any sub-class of your class will break:

http://www.google.com/codesearch?q="super(type(self)"&hl=en&btnG=Search+Code

Michele Simionato has a great three-part write-up called "Things to Know About Python Super". ( Part 1, Part 2, Part 3 ) The complexity in any given case is not great; the complexity comes only from considering every single corner case. My view is that unit-testing is a must to make sure the corner cases you are interested in are correctly implemented.

Really Important Point: if you have no interest in supporting multiple inheritance, and no interest in supporting sub-classing, don't use "super". It will simply mislead the users of your code as a library into thinking that you invested in the engineering and testing.
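A sketch of that kind of unit test, creating a one-off subclass purely to check that delegation through "super" keeps working for future subclassers (all class and method names here are invented for illustration):

```python
import unittest

class Base(object):
    def record(self, log):
        log.append("Base")
        return log

class Feature(Base):
    def record(self, log):
        log.append("Feature")
        # the class name is typed out, never super(type(self), self)
        return super(Feature, self).record(log)

class OneOffSubclass(Feature):
    """A throwaway subclass, created only to test that inheriting from
    Feature still gives correct delegation."""
    def record(self, log):
        log.append("OneOffSubclass")
        return super(OneOffSubclass, self).record(log)

class TestDelegation(unittest.TestCase):
    def test_full_chain_runs_in_mro_order(self):
        log = OneOffSubclass().record([])
        self.assertEqual(log, ["OneOffSubclass", "Feature", "Base"])

# running this module under unittest exercises the whole MRO chain
```

Had Feature used super(type(self), self), the OneOffSubclass call would recurse forever, which is exactly the breakage this one-off test catches.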


"super" is an advertisement to the world that you invested in the engineering and testing to support multiple-inheritance and sub-classing! Don't use it if you don't mean it!
I like "super". I only user "super" when I am supporting multiple-inheritance and sub-classing. I write unit-tests whenever I write code with "super". Please take these issues into consideration, for your library code.

Using an inner function for breaking out of nesting

From Fuzzyman, a blog post on different ways of handling breaking out of nesting. Here is a snippet:
def find_match():
    for x in range(max_x):
        for y in range(max_y):
            if match(x, y):
                return x, y
result = find_match()
if result is None:
    pass  # match not found
else:
    x, y = result
Yup, looks clean to me, and naming the result and the inner function gives the opportunity for self-documentation, with names more descriptive than "result" and "find_match".
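Another way to sidestep the nested-loop break problem entirely is to flatten the two loops into one iterable and let next() find the first match. This is an alternative sketch, not from Fuzzyman's post; match() here is an invented stand-in predicate.

```python
from itertools import product

def match(x, y):
    # stand-in predicate for illustration
    return x * y == 12

max_x, max_y = 10, 10

# product() flattens the two loops; next() returns the first hit, or None
result = next(
    ((x, y) for x, y in product(range(max_x), range(max_y)) if match(x, y)),
    None,
)
print(result)  # (2, 6), the first pair whose product is 12
```

This trades the named inner function for a one-liner; which reads better depends on how complicated the loop bodies are.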

[Image by mikebaird via Flickr: Great Blue Heron pair preparing a nest]

For your esthetical enrichment, a picture of a nest!

Monday, June 15, 2009

Dive Into Python 3 - httplib2

Fantastic writing on a fine HTTP library, "httplib2", by Mark Pilgrim: Dive Into Python 3 - HTTP Web Services. It shows the value of the feature set of HTTP - compression, caching, getting around local and third-party caches, temporary and permanent redirects. "httplib2" supports them all, by default, so you can be a good HTTP citizen from the word GO. I read the Reddit comments on this article, and it was pretty depressing. There is a weird reaction to a well-designed piece of library code: the better designed and more comprehensive a library is, the more vocal and numerous the irrational criticism of it.


People are so terrified to learn a new library, because of a lack of unallocated healthy plastic brain neurons, and because they are terrified of their meagre mental gifts being so publicly obvious, that they have to publicly use all the odious tools of the Art of Controversy to denigrate it. "EasyInstall" and "Eggs" in Python seem to suffer the worst from this - EasyInstall is hardly perfect, but it helps smart people with tricky Python packaging problems, and it will certainly play a part, in some future incarnation, in the issue of Python packaging for all who wish to do it well. Because of that, and because of Python programmers with a lack of unallocated healthy plastic brain neurons who are terrified of their


meagre mental gifts being so publicly obvious, you get a lot of trash talk and irrational criticism of the "EasyInstall" library. (Phillip Eby's personality is no help here, and I am hardly the best apologist for his personality defects, but the man does some good work.) The Reddit comments on Mark Pilgrim's "httplib2" write-up are the nittiest of nit-picking, and a conflating of pertinent issues. Some idiot was trying to argue that local disc caching is sometimes higher latency than caching reached over a local network call - I guess on some weird local network with a massive RAM cache and only a single workstation user. Dumb stuff. Then some idiot complained that httplib2 does too many "correct" things by default, and that it was interfering with him writing a mobile device application, badly.


All you can do is call it like you see it, and hope the silent crowd of competent developers recognizes the garbage comments for what they are. And maintain healthy brain tissue.

Friday, June 12, 2009

Jp Calderone knows how to override __eq__. Do you?

Took me a while to find this, so let me blog it now, for posterity:


How to override comparison operators in Python Jp Calderone goes into much more detail than just how to write proper "__eq__" and "__ne__" methods for your own Python classes, but it is surprising how well hidden the details for correctly implementing "__eq__" and "__ne__" are. I believe the issue is less critical in Python3, because it does the correct thing when only "__eq__" is implemented. Here is the sample code:
class A(object):
    def __init__(self, foo):
        self.foo = foo
    def __eq__(self, other):
        if isinstance(other, A):
            return self.foo == other.foo
        return NotImplemented
    def __ne__(self, other):
        result = self.__eq__(other)
        if result is NotImplemented:
            return result
        return not result
If you want an immutable object that can be used as a dictionary key, you will want to implement "__hash__" along with "__eq__" and "__ne__".

If you are implementing inequality comparisons - Be Careful - supply the full complement of inequality comparisons, and take care when using "NotImplemented". The default implementations of "less-than __lt__", "less-than-or-equals __le__", "greater-than __gt__", and "greater-than-or-equals __ge__" aren't very useful: they compare by address, using id(). This default inequality comparison can introduce intermittent bugs in your comparison code. If there is no meaningful comparison between different types or classes, raise a TypeError, so there is no risk of falling back on the terrible default inequality comparison implementation. This problem is fixed in Python 3.

The fastest and most complete solution is this code from Raymond Hettinger - Python Cookbook recipe 576685: Total ordering class decorator.
def total_ordering(cls):
    'Class decorator that fills-in missing ordering methods'    
    convert = {
        '__lt__': [('__gt__', lambda self, other: other < self),
                   ('__le__', lambda self, other: not other < self),
                   ('__ge__', lambda self, other: not self < other)],
        '__le__': [('__ge__', lambda self, other: other <= self),
                   ('__lt__', lambda self, other: not other <= self),
                   ('__gt__', lambda self, other: not self <= other)],
        '__gt__': [('__lt__', lambda self, other: other > self),
                   ('__ge__', lambda self, other: not other > self),
                   ('__le__', lambda self, other: not self > other)],
        '__ge__': [('__le__', lambda self, other: other >= self),
                   ('__gt__', lambda self, other: not other >= self),
                   ('__lt__', lambda self, other: not self >= other)]
    }
    roots = set(dir(cls)) & set(convert)
    assert roots, 'must define at least one ordering operation: < > <= >='
    root = max(roots)       # prefer __lt__ to __le__ to __gt__ to __ge__
    for opname, opfunc in convert[root]:
        if opname not in roots:
            opfunc.__name__ = opname
            opfunc.__doc__ = getattr(int, opname).__doc__
            setattr(cls, opname, opfunc)
    return cls
For a lower tech solution, consider using this Mixin class for inequality comparison special methods [from Fuzzyman: http://www.voidspace.org.uk/python/articles/comparison.shtml]
class RichComparisonMixin(object):

    def __eq__(self, other):
        raise NotImplementedError("Equality not implemented")

    def __lt__(self, other):
        raise NotImplementedError("Less than not implemented")

    def __ne__(self, other):
        return not self.__eq__(other)

    def __gt__(self, other):
        return not (self.__lt__(other) or self.__eq__(other))

    def __le__(self, other):
        return self.__eq__(other) or self.__lt__(other)

    def __ge__(self, other):
        return not self.__lt__(other)


[Aside & Plug] Let me take this opportunity to give a plug to the book IronPython in Action, by Michael Foord (Fuzzyman) and Christian Muirhead. The publisher, Manning, has a great service to Python Programmers on the book's website:


Python Magic Methods

I was a little disappointed (and surprised) that this great Python magic methods reference didn't give more tips about "__eq__" and "__ne__". But otherwise, this is all great material, and all new material, not just a re-hash of the original online Python docs. The best summary I have seen; even better than Alex Martelli's Python in a Nutshell.