Python DevCenter

Introduction to Stackless Python

Micro-threads

Will Ware's microthreads, for example, have traveled a long way from the "ivory tower." Scripting languages are rightly becoming popular implementation vehicles for games. The higher-level constructs in common scripting languages encourage customization and extensibility at a pace that's crucial in gaming, where time-to-market is so important. Performance is also essential, though, and conventional threading is just too clumsy to satisfy the requirements typical in game programming.

Microthreads are a perfect answer. In a description Ware co-authored:

Microthreads are useful when you want to program many behaviors happening simultaneously. Simulations and games often want to model the simultaneous and independent behavior of many people, many businesses, many monsters ... With microthreads, you can code these behaviors as Python functions. ...

Microthreads switch faster and use much less memory than OS threads. You can run thousands of microthreads simultaneously.
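The flavor of this is easy to sketch in plain Python. The following is not the Stackless or microthreads API; it's a simulation of the idea using ordinary generators, where each `yield` acts as a cheap context switch and a simple round-robin loop plays the role of the scheduler. The names (`monster`, `run`) are illustrative only:

```python
from collections import deque

def monster(name, steps):
    """One independent behavior, coded as a generator.
    Each yield hands control back to the scheduler."""
    for i in range(steps):
        yield f"{name} step {i}"

def run(tasks):
    """Round-robin scheduler: resume each task in turn until all finish."""
    queue = deque(tasks)
    log = []
    while queue:
        task = queue.popleft()
        try:
            log.append(next(task))
            queue.append(task)  # still alive: put it back in the rotation
        except StopIteration:
            pass                # this behavior has finished
    return log

# A thousand "microthreads" cost almost nothing; a thousand OS
# threads would be a serious burden.
log = run([monster(f"m{i}", 3) for i in range(1000)])
```

Because each task is just a paused stack frame rather than an OS resource, switching is a function call and memory use is a few hundred bytes per task, which is the property the description above is pointing at.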

What's the relation between these benefits and Stackless's implementation details? Here's a quick sketch:

Continuations are the general-purpose concurrency construct. A continuation represents all the future computation of a particular program. Capturing all this control flow in a single conceptual object makes it programmable: It becomes possible to calculate with, and reason about, the control flow. In particular, there's great scope for optimizing the assignment of different calculations to different processes or threads or even hosts.

The example Tismer likes is one he credits to Jeremy Hylton: Think of the simple program

     x = 2
     y = x + 1
     z = x * 2

For this example, "the continuation of x = 2 is y = x + 1; z = x * 2. ... [E]very single line of code has a continuation that represents the entire future execution of the program."
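One way to make this concrete in ordinary Python is continuation-passing style, where each step receives "the rest of the program" as an explicit function argument. This is only an illustration of the concept, not how Stackless implements continuations; the step names and the `results` dictionary are invented for the example:

```python
results = {}

def step1(k):
    x = 2
    k(x)            # invoke the continuation: everything after "x = 2"

def step2(x, k):
    y = x + 1
    k(x, y)         # invoke the continuation of "y = x + 1"

def step3(x, y):
    results.update(x=x, y=y, z=x * 2)   # nothing left to continue to

# Wire the steps together: each one is handed its own future.
step1(lambda x: step2(x, step3))
# results == {'x': 2, 'y': 3, 'z': 4}
```

Once the future of a computation is a first-class value like `step3` here, it can be stored, passed around, or resumed somewhere else, which is exactly the leverage the text describes.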

Continuations are sufficiently general that they can model threads efficiently. As it happens, the threads native to most operating systems (OSs) have accumulated a lot of baggage through the years. Legend advertises threads as "lightweight." They're supposed to be much nimbler than processes, for example. This is no longer true, though. Common industry practice has caused threads and processes to converge in their resource demands. Threads are not particularly lightweight.

Think for a minute about an ideal thread, not the one leading OSs provide, but a minimal flow of execution burdened with as little communication and synchronization overhead as possible. That's a microthread, and continuations can model microthreads as well as they model any other concurrency structure.

This is important. It's well known among experts that common multi-threaded Web servers, for example, can gag when they spawn more than a modest number of threads -- fifty, in common cases. Stackless microthreads not only retain all the programming ease Python delivers; even mediocre equipment can handle thousands, or tens of thousands, of them.

It's not just reliability and performance that Stackless gives microthreads. Stackless should also make microthreads and related control structures "serializable," or persistent: They will have a representation that allows state to be saved and restored. At the level of game play, this means there will be a thoroughly natural and complete way to pause or save any simulation object, or even to relocate it to a different process or host. A game player might choose to pack his identity away, then restart at the same point months later. While serialization isn't available in a finished form, it's already in Tismer's plans. The Python jargon for serialize is "pickle." As Tismer puts the situation, "Microthreads are easier to pickle than general continuations. . . . [M]icrothreads are most likely to be the first things we will be able to pickle." Early experiments suggest it'll work nicely, once the API becomes more definite.
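Since the Stackless pickling API isn't finished, here is only the save-and-restore idea itself, shown with Python's standard `pickle` module and an object whose state is explicit. The `Player` class is invented for the example; what Stackless promises is the same trick applied to a running microthread, with no need to spell the state out by hand:

```python
import pickle

class Player:
    """A simulation object whose state is explicit, so plain pickle
    can save and restore it."""
    def __init__(self, name):
        self.name = name
        self.position = 0

    def step(self):
        self.position += 1

alice = Player("alice")
alice.step()
alice.step()

saved = pickle.dumps(alice)       # pack the identity away...
restored = pickle.loads(saved)    # ...and restart later, mid-game
```

The restored object picks up exactly where the original left off; a pickled microthread would carry its whole call stack the same way.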

Join in the fun

The challenge of this introduction has been to explain Stackless's promise for the future without losing contact with its present reality. To help bring the abstractions back into proper focus, tune in again next week for an article on Stackless programming. There you'll see how easy it is to start experimenting with Stackless Python in your own work.

Cameron Laird is the vice president of Phaseit, Inc. and frequently writes for the O'Reilly Network and other publications.
