I’m not sure if this is ever going to be of any serious use for me, but I just had to give it a shot:
[flv:http://cygielski.com/blog/wp-content/uploads/2009/07/dadadance.flv 480 360]
Whaddya think?
Just when you thought you were safe:
[flv:http://cygielski.com/blog/wp-content/uploads/2009/07/frogmovie2.flv 480 360]
Audio done in Audacity (an open source audio recording/editing program) and synced in Blender’s video sequencer. The drops are actually just me tapping my finger on my laptop’s microphone. Sound FX to the max!
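For anyone who would rather script the sync than drag strips around by hand: a sound strip can be added to the sequencer from Blender’s Python console. This is just a sketch using the current bpy API (not the 2.4x-era API), and the file name, channel and frame numbers are made-up placeholders.

import bpy

scene = bpy.context.scene
scene.sequence_editor_create()            # make sure the scene has a sequencer
seqs = scene.sequence_editor.sequences

# Add the Audacity export as a sound strip, then nudge it until it lines up
drops = seqs.new_sound(name="drops", filepath="//audio/drops.wav",
                       channel=2, frame_start=1)
drops.frame_start = 12                    # shift to sync with the picture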
[flv:http://cygielski.com/blog/wp-content/uploads/2009/07/fly.flv 480 360]
I thought I could do the whole thing with the GIMP Animation Package (GAP), but its video encoder is buggy and I couldn’t get it to produce an AVI file. After several attempts, I imported the individual frames into Blender for final processing. Let me know what you think.
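Side note: pulling a folder of single frames into the sequencer can also be scripted. Roughly like this with the current bpy API; the file names and frame count are hypothetical.

import bpy

scene = bpy.context.scene
scene.sequence_editor_create()
seqs = scene.sequence_editor.sequences

# Start an image strip from the first exported frame, then append the rest
strip = seqs.new_image(name="gap_frames", filepath="//frames/fly_0001.png",
                       channel=1, frame_start=1)
for i in range(2, 251):                   # assuming 250 frames
    strip.elements.append("fly_%04d.png" % i)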
Blender continues to amaze me. It’s basically a self-contained AV production suite; you’d need at least two different pieces of high-end (read “extremely expensive”) software to replace it. Here’s a simple fade between the old and new versions of the pavilion animation, to show some very basic functionality:
[flv:http://cygielski.com/blog/wp-content/uploads/2009/07/pav_YR_cross.flv 480 360]
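If you want to set the fade up with a script instead of in the sequencer window, a cross effect strip between two overlapping strips does the trick. A rough sketch with the current bpy API; the file names and frame ranges are invented for the example.

import bpy

bpy.context.scene.sequence_editor_create()
seqs = bpy.context.scene.sequence_editor.sequences

old = seqs.new_movie(name="old", filepath="//pav_old.avi", channel=1, frame_start=1)
new = seqs.new_movie(name="new", filepath="//pav_new.avi", channel=2, frame_start=200)

# Cross-fade over the frames where the two strips overlap
fade = seqs.new_effect(name="fade", type='CROSS', channel=3,
                       frame_start=200, frame_end=250,
                       seq1=old, seq2=new)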
This really took a long time: about 4 minutes per frame, for a total render time of more than 24 hours (I’m not sure exactly how many). However, after some tweaking of the sun/sky settings and the photon sampling rate, I think the results are at least passable. I also ran it through Blender’s compositing node editor to correct the gamma of Yafaray’s output, which came out too pale. Final output to an FLV file was done with the Riva FLV Encoder, which, for the record, is free but not open source. This means the entire workflow did not use a single piece of commercial software.
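The gamma fix itself is just one Gamma node wired between the rendered image and the composite output. Something along these lines in the current bpy API; the file path and the gamma value here are placeholders, not what I actually used.

import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
tree.nodes.clear()

# Load one rendered frame (an image sequence goes through the same node)
img = bpy.data.images.load("//yafaray/pav_0001.png")
src = tree.nodes.new('CompositorNodeImage')
src.image = img
gamma = tree.nodes.new('CompositorNodeGamma')
out = tree.nodes.new('CompositorNodeComposite')

gamma.inputs['Gamma'].default_value = 1.2    # tweak by eye
tree.links.new(src.outputs['Image'], gamma.inputs['Image'])
tree.links.new(gamma.outputs['Image'], out.inputs['Image'])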
Well, here are the results:
[flv:http://cygielski.com/blog/wp-content/uploads/2009/07/pav_YR_comp.flv 480 360]
BTW, since I’ve been asked: the pavilion is not a full-fledged project by any means; it’s just a quick something I put together to let me test software.
Update: just checked. Riva’s website doesn’t specify its license terms, but it uses FFmpeg and LAME modules, which are both released under the LGPL, an open-source license. So all is well.
… I have an animation of the pavilion ready:
[flv:http://cygielski.com/blog/wp-content/uploads/2009/07/pav_yr.flv 480 360]
This took an inordinate amount of time to render in Yafaray, and I’m still not happy with the quality of the results: too much noise at the edges (the patchiness of the “grass” is not from the renderer but from the external Flash encoder). I think Yafaray is mainly good for stills, where you can crank up the settings and wait for it to churn out the results. Mental Ray in Maya still has this one beat, especially in terms of the quality of the sky light and the cleanliness of the images.
Next I think I’ll look at how Big Buck Bunny was lit. It’s rendered without final gathering in Blender’s internal renderer, yet it all looks really nice. That’s the main advantage of open source and Creative Commons licenses: they allow you to learn from the work of others. Beautiful.
I also want to test out Kerkythea – it’s a free (though not open source) renderer, which appears to be fast AND to give exceptional results.
Here’s what I managed to put together with Blender and Yafaray today:
This took me about three hours to model and render. Not bad for a piece of free software, huh?
Fur engine revved up:
Digital furballs – what’s this world coming to?
Anyway, this is just a test of Blender’s particle engine, which is good for making grass as well.
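For the record, the fur is just a hair particle system on a mesh. A minimal sketch of the setup via the current bpy API, with the counts and lengths picked arbitrarily:

import bpy

obj = bpy.context.object                     # the mesh that grows the fur
obj.modifiers.new(name="fur", type='PARTICLE_SYSTEM')
settings = obj.particle_systems[-1].settings

settings.type = 'HAIR'                       # hair strands instead of emitter particles
settings.count = 2000                        # parent strands
settings.hair_length = 0.25
settings.child_type = 'INTERPOLATED'         # child strands fill in the fluff
settings.rendered_child_count = 50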
And as long as I’m playing with particles, I couldn’t resist a bit of animation:
[flv:http://cygielski.com/blog/wp-content/uploads/2009/07/stars.flv 480 360]
A little something I put together with Blender – mind you, it’s just a technical test:
[flv:http://cygielski.com/blog/wp-content/uploads/2009/07/test.flv 400 300]
One thing I noticed while doing this is Blender’s rudimentary support for NURBS surfaces: I had to convert to polygons to get the surface to smooth out, which seems like a contradiction in terms (you can see the original NURBS rendering in the video still). Still, it did the job I wanted it to do.
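In case it helps anyone, the conversion itself is a single operator call. A sketch with the current bpy API, assuming the NURBS surface is the active object; the resolution and subsurf levels are arbitrary.

import bpy

obj = bpy.context.object                  # the NURBS surface
obj.data.resolution_u = 12                # tessellation used for the conversion
obj.data.resolution_v = 12

bpy.ops.object.convert(target='MESH')     # NURBS -> polygons
bpy.ops.object.shade_smooth()

# A subsurf modifier smooths the converted mesh further at render time
mod = bpy.context.object.modifiers.new(name="smooth", type='SUBSURF')
mod.render_levels = 2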