Python Programming Workshops
Python Programming Workshop delivered by TechBharat
Python is Efficient
A ton of mental energy these days is going into Big Data, both into defining it and into processing it. The more data you have to process, the more important it becomes to manage the memory you use.
Python provides generators, both as expressions and from functions.
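To make those two forms concrete, here is a minimal sketch showing the same lazy sequence written both ways; the names squares and gen_squares are just illustrative:

```python
# A generator expression: squares computed lazily, one at a time.
squares = (n * n for n in range(5))

# The equivalent generator function, written with yield.
def gen_squares(limit):
    for n in range(limit):
        yield n * n  # execution pauses here between items

print(list(gen_squares(5)))  # -> [0, 1, 4, 9, 16]
```

Either form produces values only when something iterates over it; nothing is computed up front.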
Generators allow for iterative processing of things, one item at a time. This doesn't seem so fancy until you realize that normal iterative processing of a list requires the whole list, and a list takes memory. A really big list takes a lot of memory.
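You can see the memory difference directly with sys.getsizeof (a rough sketch; it measures only the container itself, not the objects inside, but the contrast is still dramatic):

```python
import sys

# A list comprehension builds all one million squares in memory at once...
big_list = [n * n for n in range(1_000_000)]

# ...while the equivalent generator expression holds only its current state.
big_gen = (n * n for n in range(1_000_000))

print(sys.getsizeof(big_list))  # several megabytes
print(sys.getsizeof(big_gen))   # a couple hundred bytes, regardless of the range
```

The generator's footprint stays constant no matter how long the sequence is, because it remembers only where it is, not everything it has produced.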
Where this becomes particularly handy is when you have a long chain of processes you need to apply to a set of data. Generators allow you to grab source data one item at a time, and pass each through the full processing chain.
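A small sketch of such a chain, with made-up stage names; each stage is a generator, so only one item is ever in flight:

```python
def read_items(source):
    # Source stage: yield raw records one at a time
    # (here from an in-memory list; it could be a file or a network feed).
    for item in source:
        yield item

def clean(items):
    # Transform stage: normalize each item as it passes through.
    for item in items:
        yield item.strip().lower()

def keep_long(items, min_len=4):
    # Filter stage: drop items that are too short.
    for item in items:
        if len(item) >= min_len:
            yield item

raw = ["  Apple ", "fig", " Banana", "kiwi  "]
pipeline = keep_long(clean(read_items(raw)))
print(list(pipeline))  # -> ['apple', 'banana', 'kiwi']
```

Nothing happens until the final list() pulls on the chain; each item is read, cleaned, and filtered before the next one is touched.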
I often face the problem of needing to migrate data from one website into another. Some of the sites I'm migrating have over a decade of history and gigabytes of content. Using the generator-based migration tool collective.transmogrifier, I can read data from the old site, make complex, interdependent updates to the data as it is being processed, and then create and store objects in the new site, all in constant memory.
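The shape of such a migration can be sketched as chained generators. This is a toy illustration of the idea, not the collective.transmogrifier API; every name here (read_old_site, rewrite_urls, store_in_new_site) is hypothetical:

```python
def read_old_site(records):
    # Hypothetical source stage; a real tool would page through the old site.
    for record in records:
        yield record

def rewrite_urls(records, old_host, new_host):
    # Hypothetical transform stage: fix up links as each record flows past.
    for record in records:
        record["url"] = record["url"].replace(old_host, new_host)
        yield record

def store_in_new_site(records, destination):
    # Hypothetical sink stage; stands in for creating objects in the new site.
    for record in records:
        destination.append(record)
        yield record

old = [{"url": "http://old.example/a"}, {"url": "http://old.example/b"}]
new_site = []

# Consuming the chain drives the whole migration, one record at a time.
for _ in store_in_new_site(
    rewrite_urls(read_old_site(old), "old.example", "new.example"), new_site
):
    pass

print([r["url"] for r in new_site])
# -> ['http://new.example/a', 'http://new.example/b']
```

However many gigabytes the source holds, the pipeline's working set is a single record plus the per-stage state.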
For applications where you are dealing with even larger data sets, this sort of tool can be indispensable. David Beazley has a great slide deck online that provides some very compelling examples of using generators for system tasks. Take a look and see what sparks start flying in your imagination!