Painting Style Renderer

Created on July 4, 2011, 1:03 p.m.

I've always loved photorealism in graphics, but I do get the sense that people are too unwilling to experiment with it. Often a "realistic" renderer means that the art team stops bothering with anything interesting in the artistic direction. Perhaps that is unfair - but at the very least it gives them the excuse to fall back into the artistic clichés which we all love - gritty realism, bright fantasy and greeble-infested sci-fi.

Because of this, non-photorealism has always been nagging at the back of my mind, so I decided to try rolling a renderer that did something a bit different. This is what I've come up with so far:

 

 

The main inspiration comes from a couple of papers and presentations. The best paper I found is one by Barbara J. Meier from 1996, which describes an offline rendering algorithm for a painterly style. There is also a later paper by a German student, Daniel Sperl, which suggests a possible implementation in OpenGL using similar techniques. Also worth mentioning, though not something I used, is another presentation which showed a very cheap effect, but I wasn't happy with the resulting look.

The basic idea is this: generate thousands of particles at different positions on a mesh and render these as "brush strokes" in screen space. This gives a painting-style effect without much of the randomness or flickering you get in other post effects, which can be off-putting.
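To make the seeding step concrete, here is a minimal sketch (not the post's actual code, and in Python rather than the C/OpenGL the renderer presumably uses) of scattering stroke seed points over a triangle mesh, weighting each triangle by its area so strokes end up evenly distributed:

```python
import random

def tri_area(a, b, c):
    """Triangle area: half the magnitude of the cross product of two edges."""
    ux, uy, uz = b[0]-a[0], b[1]-a[1], b[2]-a[2]
    vx, vy, vz = c[0]-a[0], c[1]-a[1], c[2]-a[2]
    cx, cy, cz = uy*vz-uz*vy, uz*vx-ux*vz, ux*vy-uy*vx
    return 0.5 * (cx*cx + cy*cy + cz*cz) ** 0.5

def sample_triangle(a, b, c):
    """Uniform point on a triangle via the square-root barycentric trick."""
    r1, r2 = random.random(), random.random()
    s = r1 ** 0.5
    u, v, w = 1.0 - s, s * (1.0 - r2), s * r2
    return tuple(u*a[i] + v*b[i] + w*c[i] for i in range(3))

def generate_particles(triangles, count):
    """Scatter `count` brush-stroke seeds over the mesh, area-weighted."""
    weights = [tri_area(*t) for t in triangles]
    chosen = random.choices(triangles, weights=weights, k=count)
    return [sample_triangle(*t) for t in chosen]
```

In the real renderer these seeds would be generated once per mesh and uploaded to the GPU, since regenerating them each frame would reintroduce exactly the flickering the technique avoids.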

To know the color of these brush strokes you first render the scene as usual and then use this reference image for a color lookup.
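The lookup itself amounts to projecting each particle into screen space and sampling the reference render at that pixel. A hedged sketch, assuming clip-space input and a simple 2D array standing in for the reference texture:

```python
def stroke_color(reference, width, height, clip_pos):
    """Look up a brush stroke's colour from the reference render.
    `clip_pos` is the particle position after the model-view-projection
    transform, as (x, y, z, w) clip coordinates."""
    x, y, z, w = clip_pos
    # Perspective divide, then map NDC [-1, 1] into pixel coordinates.
    px = int((x / w * 0.5 + 0.5) * (width - 1))
    py = int((y / w * 0.5 + 0.5) * (height - 1))
    px = min(max(px, 0), width - 1)   # clamp, as a texture sampler would
    py = min(max(py, 0), height - 1)
    return reference[py][px]
```

On the GPU this is just a texture sample in the vertex or fragment shader; the sketch only shows the coordinate mapping involved.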

The offline technique presented by Meier runs on the CPU, and there were certain things I could not emulate in the graphics pipeline, though none were that important. For several of these I came up with solutions which I think work better anyway.

The implementation proposed by Daniel Sperl was also interesting but lacked some of the features from the offline version, such as the orientation of particles, which I added. I also changed how the billboarded particles are rendered to a far cheaper technique and moved the data into a VBO, which sped up draw times significantly.

After much testing it became apparent that the bottleneck in the system is not actually the number of particles but their fill time, mainly because they are alpha blended onto the scene. I could easily render several million particles when they only covered a couple of pixels each, but when their size grew and they began to overlap each other, that is when the slowdown really began. The problem was that without fairly large overlapping particles it is almost impossible to avoid gaps in between the strokes where the background shows through.

My first attempt at a solution was to do something real painters do: I rendered a base layer of much larger, fewer particles to fill the gaps. Unfortunately there were two issues with this. First, it was expensive, as the fill time was just as high as rendering lots of small particles. Second, it completely messed up the silhouettes of objects, because the brush strokes were so large.

On my second attempt I drew the reference image (the normally rendered scene) below the particles. This filled in the gaps, but the silhouette was still bad: in some places it had per-pixel sharpness, and in others it was blurred where a brush stroke crossed it.

In the end I think I came up with quite a novel solution, which is to render the reference image at a quarter of the screen size. This avoids a sharp silhouette and fills in the gaps much more seamlessly. The added advantage is that the first pass, drawing the scene as usual, becomes quite a bit faster. In fact, this pattern of taking advantage of the effect's intrinsic loss of detail seems to be the key to getting the most out of the renderer. As well as downscaling the reference image, it also makes sense to drop normal maps and texture resolutions, and any other detail which won't be seen in the final product. Hopefully this makes up somewhat for the memory-sucking horror of generating hundreds of thousands of particles per mesh.

For rendering the particles it made sense to have a LOD system. In the end I generated a set of element index buffers for each object, each skipping every second, third (and so on) particle, and chose between them based on distance.
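Building the LOD buffers is simple since the particle data never changes, only which indices are drawn. A sketch of the idea, with a hypothetical `lod_distance` threshold that would need tuning per scene:

```python
def lod_index_buffers(particle_count, steps=(1, 2, 3, 4)):
    """Build one element index buffer per LOD level; level N keeps
    every (N+1)th particle, so distant objects draw far fewer strokes."""
    return [list(range(0, particle_count, step)) for step in steps]

def pick_lod(buffers, distance, lod_distance=10.0):
    """Choose an index buffer from camera distance (hypothetical
    linear threshold; any falloff curve could be substituted)."""
    level = min(int(distance / lod_distance), len(buffers) - 1)
    return buffers[level]
```

Because only the index buffer is swapped, the particle VBO stays resident on the GPU and switching LOD costs essentially nothing per frame.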

There were some other tricks to reduce the fill time too. The surface normal is stored with each particle and its angle to the screen calculated in the vertex shader. This allowed me to shrink particles at a sharp angle to the screen and cull ones which were backfacing. The depth is also rendered to texture when rendering the reference image, and this can be used as a kind of poor man's depth sorting: you can discard particle fragments which fall behind the reference geometry.
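Both tricks reduce to a dot product and a comparison per particle. A sketch of the logic (mirroring what the shaders would do, with my own sign conventions: `view_dir` points from camera toward the particle):

```python
def stroke_scale(normal, view_dir, base_size):
    """Scale a billboarded stroke by how directly its surface normal
    faces the camera; collapse it entirely when backfacing."""
    facing = -(normal[0]*view_dir[0] +
               normal[1]*view_dir[1] +
               normal[2]*view_dir[2])
    if facing <= 0.0:
        return 0.0               # backfacing: cull by shrinking to nothing
    return base_size * facing    # grazing angles get thin strokes

def keep_fragment(frag_depth, ref_depth, bias=1e-3):
    """Poor man's depth sort: keep a stroke fragment only if it is not
    behind the reference render's depth buffer (small bias avoids
    self-occlusion on the surface the stroke sits on)."""
    return frag_depth <= ref_depth + bias
```

In GLSL the first would live in the vertex shader (scaling the billboard corners) and the second in the fragment shader as a `discard` against the reference depth texture.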

Currently the orientation is based on either the tangent or the binormal vector, depending on the size of each UV triangle, but it is perfectly possible to give the artist full control by aligning all strokes along either the UV x-axis or the UV y-axis. Stroke orientation and size are things I really want to experiment with in future, as currently the renderer has a very Monet-type feel due to the small strokes.
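One possible reading of that heuristic, sketched here as an assumption rather than the post's actual rule, is to pick whichever of the tangent or binormal corresponds to the UV axis the triangle is more stretched along:

```python
def stroke_orientation(tangent, binormal, uv_a, uv_b, uv_c):
    """Pick a stroke direction per triangle: follow the tangent when the
    triangle spans more of the UV x-axis, the binormal when it spans
    more of the UV y-axis (hypothetical version of the heuristic)."""
    du = max(abs(uv_b[0] - uv_a[0]), abs(uv_c[0] - uv_a[0]))
    dv = max(abs(uv_b[1] - uv_a[1]), abs(uv_c[1] - uv_a[1]))
    return tangent if du >= dv else binormal
```

Giving the artist control would then just mean replacing the `du >= dv` test with a per-material flag.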

The other thing that needs work is the colors. I feel there needs to be some post effect, perhaps increasing contrast and saturation. The shadows also seem off: I feel they need to be more blue and the light more orange. These are all things which can come down to tweaking, though.

Anyway, I very much welcome ideas, suggestions and feedback. In honesty I've been staring at that piano so long I hardly know what looks good any more.
