
Misunderstanding Laziness in Python

Created on April 8, 2011, 7:11 p.m.

 

Laziness in programming languages is something that has been thrown around a fair amount recently. Certainly it seems like a lot of people want it implemented in their favourite language. This is often after having used it in Haskell, where it is extremely useful, easy to understand, and seemingly without any serious pitfalls.

I'd been playing around with decorators in Python, and laziness seemed like something that would make a good experiment. The actual concept of lazy evaluation isn't really that difficult; I'd already looked at it in ML, and understood the nature of delayed computation. Seeing as Python treats functions as first-class constructs, i.e. values, it isn't difficult to write a function that returns a function, ready to be evaluated when needed.

But this wasn't really enough. I wanted a way for the laziness to be truly transparent, preferably with nothing more than a decorator. For this the solution wasn't immediately apparent, but Python is a flexible language, so I thought there might be a chance. With all this on my mind I spent a long time thinking about lazy evaluation in Python and in the end decided on two things.

  1. It wouldn't be possible to add transparent laziness without extensively manipulating the bytecode or digging into the internals of the compiler, both things I didn't want to do.
  2. Transparent lazy evaluation isn't really a good idea in an imperative language.

I'll start with the first point. Let's look at some examples, starting with my lazy decorator.

 

def lazy(func):
    def lazyfunc(*args, **kwargs):
        wrapped = lambda : func(*args, **kwargs)
        wrapped.__name__ = "lazy-" + func.__name__
        return wrapped
    return lazyfunc

 

This is a fairly simple thing - the typical way of emulating lazy evaluation. Given a function, it returns a new function which, when called with some arguments, doesn't evaluate anything. Instead it returns yet another function which, when finally called, evaluates the original function with those arguments.

We can use this as a decorator as follows.

 

def hello_add_normal(x, y):
	print "Hello I am Normal!"
	return x + y

@lazy
def hello_add_lazy(x, y):
	print "Hello I am Lazy!"
	return x + y

 

Calling the first function will evaluate it immediately, but the second one requires an extra call, made when we require the actual value. In this way we can evaluate the result whenever we want (or whenever it is needed, in the case of laziness).

 

$ hello_add_normal(1, 2)
Hello I am Normal!
3
$ hello_add_lazy(1, 2)
<function lazy-hello_add_lazy at 0x021E9470>
$ myval = hello_add_lazy(1, 2)
$ myval()
Hello I am Lazy!
3

 

At this point, it's perfectly possible for the user programmer to add in this extra call (or a "force" function) whenever they want the evaluation, but I was looking for some way for Python to know when it is needed, and to do it automatically.
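
As an aside, such a "force" helper is simple enough to sketch. Something like the following would do - this is just a rough sketch, and the name "force", along with the check against the "lazy-" prefix set by the decorator, is my own convention:

 

def force(value):
    # Rough sketch: if the value looks like one of our lazy thunks
    # (spotted via the "lazy-" name prefix the decorator sets),
    # call it to get the real result, otherwise pass it through.
    if callable(value) and getattr(value, "__name__", "").startswith("lazy-"):
        return value()
    return value

$ myval = hello_add_lazy(1, 2)
$ force(myval)
Hello I am Lazy!
3
$ force(3)
3

 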

So really the trick here is to add this extra call at any point where the value of the function is requested. Unfortunately this requires code analysis, something that is possible in the Python runtime, but only at the bytecode level.

Still, perhaps this is enough, all in the name of lazy evaluation!? In the example above it is fairly simple. We'd want to add an extra call before the assignment to myval - just after we'd looked up a symbol we know is lazy. As Python bytecode is stack based, it looks like you could just add an extra "call" instruction after the appearance of any "lazy" symbol at the top of the stack. In reality though, things get a bit harder. It is hard to track what is on the top levels of the stack, and there are various other complications. Take a look at this example.

 

mylist = [hello_add_lazy(1,2), hello_add_lazy(2,5), hello_add_lazy(6,1)]
hello = mylist[0]
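
 

If you want to see what this looks like at the bytecode level, the standard library's dis module will disassemble a small function containing code like the above. This is purely for inspection - it makes no attempt at any rewriting - and the wrapper function build_list here is only for illustration:

 

import dis

def build_list():
    mylist = [hello_add_lazy(1, 2), hello_add_lazy(2, 5), hello_add_lazy(6, 1)]
    hello = mylist[0]
    return hello

# Prints the raw instructions the interpreter executes - roughly
# LOAD_GLOBAL, CALL_FUNCTION, BUILD_LIST, BINARY_SUBSCR, STORE_FAST
# (exact names vary between Python versions). This is the level any
# automatic rewriting would have to reason about.
dis.dis(build_list)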

 

At this point, the only place we have referenced a "lazy" symbol in the bytecode is on the list construction, nowhere near the place we actually want to do the automatic evaluation. So in this case we'd have to keep track of this "list of lazy things" and look at that too.

Sure, this example is still possible to solve and reason about, but with arbitrarily complicated data structures, built-in functions, higher-order functions and various other complicated Python constructs, it didn't feel like it was worth it.

Bytecode is just too far removed from the semantic model of the language for this kind of analysis. If you wanted to do lazy evaluation seriously it would have to be done in the compiler, which brings me to my second point: perhaps we don't even want automatic lazy evaluation.

Python has two pieces of syntax whose semantics are perhaps the strongest in the language, namely assignment "=" and calling "()". Needless to say, messing with these is probably not the best idea. Unfortunately lazy evaluation messes with both. Let's take a look at another example.

 

import time

def sleepy(x):
	time.sleep(1)
	return x

$ my_other_list = [sleepy(1), sleepy(2), sleepy(3)]
$ my_other_list
[1, 2, 3]

 

When run, this code sleeps for three seconds and then produces the list [1, 2, 3]. This isn't lazy in the slightest, but would you expect it to do anything else? Perhaps sleeping is a bad example, but substitute any number of other common side effects, including IO, and you quickly get an idea of where things could go terribly wrong. The brackets after "sleepy" mean the function is being called; you expect it to be evaluated. Lazy evaluation messes with this notion.

Assignment is made confusing too. Let's take a look at another example.

 

the_time = time.time()
time.sleep(1)
next_time = time.time()

if the_time == next_time: print "BOOM"

 

In this example, I would never want the code to print out "BOOM". With evaluation as lazy as possible, it would. Fine, you say, but what if we evaluate on assignment to the variable instead, to keep the ordering correct? Well, this is kind of like wanting a mix of the pure and the impure, and following this argument to its end means you have to evaluate on assignment to ANY data structure, not just variables, and that just isn't lazy at all! Assignment can't mean both equivalence and assignment at the same time.

In a language like Haskell, this kind of code would not be possible without monads. This is because in Haskell there are no side effects in functions, no sense of one thing happening and then another. Sleeping, printing, none of that can be expressed in a plain function, and so with lazy evaluation it isn't an issue. It doesn't confuse when things happen.

This isn't the case in Python. Programming without side effects is hard, and not always the preferred method (for me at least). There is nothing wrong with side effects provided you know what assignment and function calling actually mean semantically. And anyway, isn't it clearer and far nicer to know when your functions are being called?

Lazy evaluation is great in Haskell, but it isn't the kind of "language feature" that can or should be added to languages on a whim. Most of the behaviours of lazy evaluation can be emulated by other means, which I think are preferable for their clarity when you are not dealing with a pure, side-effect-free language such as Haskell. For example, you can simply keep functions and their arguments around separately, and apply them explicitly only when the results are needed.

 

args = [(1,), (2, 4), (3,)]
funcs = [sleepy, hello_add_normal, time.sleep]
results = [f(*x) for f, x in zip(funcs, args)]
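
 

Generator expressions are another rough sketch of the same idea, since Python only evaluates their contents on demand (the names pending, first and rest below are just for illustration):

 

# Nothing in this generator expression is evaluated yet.
pending = (f(*x) for f, x in zip(funcs, args))

first = next(pending)  # only the first call, sleepy(1), runs here
rest  = list(pending)  # the remaining two calls run here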