I’m not sure if this is ever going to be of any serious use for me, but I just had to give it a shot:

[flv: 480 360]

Whaddya think?

A world of sound

Just when you thought you were safe:

[flv: 480 360]

Audio done in Audacity (an open source audio recording/editing program) and synched in Blender’s video sequencer. The drops are actually just me tapping my finger on my laptop’s microphone. Sound FX to the max!

The Gimpy Fly

[flv: 480 360]

I thought I could do the whole thing with the Gimp Animation Plug-in (GAP), but the video encoder is buggy and I couldn’t get it to make an AVI file. After several attempts, I put the single frames into Blender for final processing. Let me know what you think.

Video editor

Blender continues to amaze me – it’s basically a self-contained AV production suite; you’d need at least two different pieces of high-end (read “extremely expensive”) software to replace it. Here’s a simple fade between the old and new versions of the pavilion animation, to show some very basic functionality:
[flv: 480 360]
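Under the hood, a fade like this is nothing exotic: each output pixel is a linear blend of the two clips, weighted by how far into the transition you are. Here’s a minimal sketch in plain Python – purely illustrative, the function names are mine, not Blender’s sequencer API:

```python
def crossfade_pixel(old: float, new: float, t: float) -> float:
    """Blend one pixel value; t=0 gives the old clip, t=1 the new one."""
    return (1.0 - t) * old + t * new

def fade_curve(frame: int, start: int, length: int) -> float:
    """Linear fade factor for a given frame of the transition, clamped to [0, 1]."""
    return min(1.0, max(0.0, (frame - start) / length))
```

At the midpoint of the transition the output is simply the average of the two frames, which is why a linear crossfade looks so even.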

New rendering

This really took a long time – about 4 minutes per frame, for a total render time of more than 24 hours (not sure how many exactly). However, after some tweaking of the sun/sky settings and the photon sampling rate, I think the results are at least passable. I also put this through Blender’s compositing node editor to correct Yafaray’s gamma, which came out too pale. Final output to an FLV file was done with Riva FLV Encoder – which, for the record, is free but not open source. Still, the entire workflow did not use a single piece of commercial software.
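For the curious: gamma correction is just a power curve applied to normalized pixel values, so a pale render can be darkened by raising the midtones to a power greater than one. A rough sketch in plain Python – purely illustrative, not Blender’s actual compositing node code:

```python
def gamma_correct(value: float, gamma: float) -> float:
    """Apply out = in ** gamma to a pixel value normalized to [0, 1].

    gamma > 1 darkens the midtones (handy when a render comes out too
    pale); gamma < 1 brightens them. Pure black (0) and pure white (1)
    stay put either way.
    """
    clamped = max(0.0, min(1.0, value))  # keep the input in range
    return clamped ** gamma
```

Applied per channel across the whole frame, this is essentially what a gamma node does before the image goes off to the encoder.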

Well, here are the results:

[flv: 480 360]

BTW, since I’ve been asked: the pavilion is not a full-fledged project by any means – it’s just a quickie something to allow me to test software.

Update: just checked – Riva’s website doesn’t specify its license terms, but it uses the FFmpeg and LAME modules, which are both released under the LGPL, an open-source license. So all is well.

A few hours later…

… I have an animation of the pavilion ready:

[flv: 480 360]

This took an inordinate amount of time to render in Yafaray, and I’m still not happy with the quality of the results – too much noise at the edges (the patchiness of the “grass” comes not from the renderer but from the external Flash encoder). I think Yafaray is mainly good for stills, where you can crank up the settings and wait for it to churn out the results. MentalRay in Maya still has this one beat, especially in the quality of the sky light and the cleanliness of the images.

Next I think I will look at how Big Buck Bunny was lit – it’s rendered without final gathering in Blender’s internal renderer, yet it all looks really nice. Here’s the main advantage of open source and Creative Commons licenses: they allow you to learn from the work of others. Beautiful.

I also want to test out Kerkythea – it’s a free (though not open source) renderer, which appears to be fast AND to give exceptional results.

Testing, testing…

Fur engine revved up:


Digital furballs – what’s this world coming to?

Anyways – this is just a test of Blender’s particle engine – good for making grass, as well.

And as long as I’m playing with particles, I couldn’t resist a bit of animation:

[flv: 480 360]

Blender animation test

A little something I put together with Blender – mind you, it’s just a technical test:

[flv: 400 300]

One thing I noticed while doing this is how rudimentary Blender’s support for NURBS surfaces is – I had to convert to polygons to get the model to smooth out, which seems like a contradiction in terms (you can see the original NURBS rendering in the video still). In any case, it did the job I wanted it to do.