Lately I've been experimenting heavily with Mandelbrot sets. Not sets computed in 2D space that give you your typical hippie fractal images, but sets computed in 3D space that give you complex and infinite 3D geometry. I've been experimenting with Mandelbulb3d - it's completely free software with support for DOF, fog, stereoscopic 3D rendering, an animation engine, etc. It could quite easily be integrated into a post pipeline utilizing Nuke/Houdini/Fusion/etc., if not generating the set math directly in Nuke with its built-in Python API, for example. The only downside I've found when generating assets directly in Mandelbulb3d is that there's no alpha support! That makes exporting renders and comping them into other work a bit difficult. Thankfully it supports exporting depth maps. What I've been doing is exporting a depth map and using it as a luma matte - this gets rid of the background for you; just tweak the depth map with curves or levels to get the key you want.
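As a rough sketch of that depth-to-matte trick: assuming the depth map is loaded as a float array (NumPy here - inside Nuke you'd do the same thing with a Grade or ColorLookup node), the levels tweak is just a remap and an invert. The near/far values below are hypothetical; you'd eyeball them per render, same as you would with a curves adjustment.

```python
import numpy as np

def depth_to_matte(depth, near=0.0, far=0.8):
    """Remap a depth map into a 0-1 alpha matte, levels-style.

    Depth is normalized over the (hypothetical) near/far range, clipped,
    then inverted so close geometry ends up opaque (1.0) and the distant
    background falls off to transparent (0.0).
    """
    norm = np.clip((depth - near) / (far - near), 0.0, 1.0)
    return 1.0 - norm

# A fake 2x2 depth map: top-left is background (far), the rest is geometry.
depth = np.array([[1.0, 0.2],
                  [0.4, 0.1]])
print(depth_to_matte(depth))
```

Multiply your render's RGB by that matte (or plug it into the alpha) and the background drops right out.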
Here's a quick render out of Mandelbulb3d. You can produce all kinds of geometry, and remember, these sit in 3D space and everything can be animated.
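For the curious, the geometry itself comes from an escape-time iteration run per point in 3D space. Here's a rough sketch of the classic power-8 "triplex" formula (the White/Nylander Mandelbulb) in Python - Mandelbulb3d ships dozens of formula variants, so treat this as illustrative rather than what any particular render here used.

```python
import math

def in_mandelbulb(cx, cy, cz, power=8, max_iter=24, bailout=2.0):
    """Escape-time test for a single 3D point using the common power-8
    triplex formula: convert to spherical coordinates, raise the radius
    to the power, multiply the angles by it, add the original point."""
    x, y, z = 0.0, 0.0, 0.0
    for _ in range(max_iter):
        r = math.sqrt(x * x + y * y + z * z)
        if r > bailout:
            return False  # point escaped - it's outside the set
        theta = math.acos(z / r) if r > 0 else 0.0
        phi = math.atan2(y, x)
        rp = r ** power
        x = rp * math.sin(theta * power) * math.cos(phi * power) + cx
        y = rp * math.sin(theta * power) * math.sin(phi * power) + cy
        z = rp * math.cos(theta * power) + cz
    return True  # never escaped - treat as inside

print(in_mandelbulb(0.0, 0.0, 0.0))  # the origin is inside -> True
```

Renderers don't brute-force a grid of these tests, of course - they ray march against a distance estimator built from the same iteration - but this is the math underneath all that geometry.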
For a great example of what these objects and spaces can look like when worked on and animated, check the video below created by Ricardo Montalban -
The past two days I've also been playing around with Mir, a new After Effects plugin from Red Giant. It uses fractal sets to create and modify geometry natively inside AE, and it's completely OpenGL-powered so it's nearly realtime. It's 100% 3D inside AE, so you can set up your scenes using AE cameras and lights. I made this quick render using Mir a couple nights ago.
So that's what I've been up to the last two weeks. I started playing with 3D fractals out of pure curiosity, but I'm coming to the belief that they have some serious potential in a VFX/3D environment for the creation of spaces and visuals. I've also begun to toy with Mandelbulber, which is similar to Mandelbulb3d but completely open source, with ray-traced 3D support, alpha support (woo!), and x64 packages available. (Render times with these things are horrible - thankfully there's an alpha build with GPU support!) In the next week I'm going to start working on getting these renders into a Nuke/Maya workflow and see what kind of spaces I can create by compositing these sets into 3D tracked live footage and a mix of live/rendered spaces. There's only rough OBJ export support and no camera data export, so I have a fun week ahead of me! Have a good weekend!