Flame RnD:  Modular/procedural approach to problem solving allows extra time for creativity.

Spend less time connecting things so you can spend more time creating. Adapting your workflow to a modular/procedural approach can reap great creative rewards. Boiling tasks down to manageable pieces brings clarity and allows sections w/o animation to be locked down, so you can continue sketching and creating. Hidden connections make this seamless, especially w multi-output nodes like Action. Something to note w the OUT/IN mux nodes: I group them so they are clearly visible from a far zoom-out of the schematic. The groups can also show proxies, meaning they can appear a bit more like clips if that helps.
Setup comes from a lively discussion w my favorite Danish Viking. I’ve labeled things to accommodate his native language. Sketching freeform involves exploring ideas, which is more or less randomly trying things. Some ideas, like color correcting z-depth, are honestly not the preferred approach, but the demo shows how the workflow can encourage experimentation, warts and all.
"Train in the Way of the Sword with your hands." –Miyamoto Musashi
When working on a complex problem, which map is easier to follow? Time spent cleaning schematics is important for maintaining clarity, and critical if someone else needs to open the setup. If you don’t need to see all the connections, the time saved can be used to play around w creative stuff. Which map is more intimidating?


Flame 2019 setup to play with:
demo movie:


Flame RnD:  ColorTransform for ARRI footage to ARRI_sceneLinear

I’ve been kicking this around for a while until I got the final CIE-XYZ piece of the puzzle.
The correct colorTransform is to create two viewing rules under the colorManage prefs. First, create one for Alexa Rendering and enable it for any log; this will display the Alexa files as they should be. Second, create a rule for Linear (gamma corrected) and enable it for any linear; this will give you viewing options in the viewport. Next, create a colorManage node as a ColorTransform. Set the Tagged space to scene-linear Alexa Wide Gamut. Hit Custom and add two layers: camera and primaries. Set camera to LogC to CIE-XYZ. Set primaries to CIE to AlexaWideGamut. The result of the sceneLinear will then match the Alexa render of the log file.
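For a sense of what the LogC-to-scene-linear leg of that transform does numerically, here's a minimal sketch of the tone curve, using the LogC3 (EI 800) constants from ARRI's published LogC description. It covers only the 1D curve; the LogC to CIE-XYZ layer also applies the camera matrix, which is omitted here.

```python
# Sketch of the LogC3 (EI 800) decode to scene-linear, constants per
# ARRI's published LogC description. Only the 1D tone curve -- Flame's
# "LogC to CIE-XYZ" layer also applies the camera matrix, omitted here.

CUT, A, B = 0.010591, 5.555556, 0.052272
C, D = 0.247190, 0.385537
E, F = 5.367655, 0.092809

def logc3_to_linear(t):
    """Decode a LogC3 (EI 800) code value to scene-linear."""
    if t > E * CUT + F:                       # logarithmic segment
        return (10.0 ** ((t - D) / C) - B) / A
    return (t - F) / E                        # linear toe segment

# 18% grey encodes to ~0.391 in LogC3 at EI 800, so:
print(round(logc3_to_linear(0.391007), 3))   # 0.18
```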
I’ve pumped the saturation to make the differences more obvious. Also included some of the many wrong answers I encountered.
The full rez ref pics are also with the setup.


Camera/Lens RnD

During recent research into lenses and cameras, these links were helpful:

The Five Pillars of Anamorphic

The Five Pillars are great explanations from Panavision’s Dan Sasaki.  The anamorphic look goes beyond flares and bokeh.  Although it means painful additional VFX work, the look can certainly be worth the effort.

Depth of Field and Bokeh Zeiss PDF

DoF/Bokeh PDF from Zeiss clearly explains the general science of lenses. Worth wrapping your head around.  Lots of diagrams and pictures in a dense 45 pages.

Ultimate Vintage Lens Test

Ultimate Anamorphic Lens Test

Especially helpful for comparing lenses.  Same stage/setup/cameras with many dozens of lenses tested.

VFX Camera Database

Huge amounts of great technical info on this site.


Flame/Nuke: Optical Flares wrap-up

Optical Flares thru Pybox wrap up:
Wanted to leave this in a useful state. I think Pybox can be somewhat useful, but many ideas would be better solved thru a Python script instead of Pybox. (Example: I’d like to figure out how to export an FBX, unwrap the UVs in Houdini, and reimport the results back into a batch script. It’s not a situation where I’d want interactivity, so the script is likely more efficient than Pybox. Also it’s only needed for one frame.)

Field notes:
– It was important to run Flame from a terminal shell, so I could see where the python errors were happening.
– Occasionally I would just create colorBars in the Nuke script and have Pybox return that image to troubleshoot connectivity.
– My use case was over black since I wanted to comp additively in Flame. For this reason, I didn’t choose to pass useful imagery to Nuke, only to enable the Pybox connection.
– Although it took overnight, I was able to set up all 193 presets and let it run.
– It’s important to set the adsk_result node explicitly to EXR to prevent clipping issues.

Overall, Pybox’s responsiveness is pretty abysmal. That said, I think it’s helpful for enabling workflows w plugins that aren’t available in Flame: in this case, set up the flare performance in Flame and export to Nuke to run the plugin, accessing its deep selection of presets. It’s actually somewhat useful to have Flame open the script in Nuke, tweak the Nuke script and then reload it into Flame. It seemed most handy to be able to launch the Nuke script from Flame, as it was already connected to the pipeline.

The Pybox controls from Flame to Nuke were wonky. I had problems getting a 3-vector axis to pass data correctly between the programs. For my needs, I chose to just export from Action as FBX: an Axis for light control as well as a Camera. I then imported the FBX data into the Nuke script. This was better for me, as I could compare the results of Flame and Nuke from a similar baseline.

I’ve uploaded some stuff to play with:
download setup

– Nuke Script used (Nuke 10.5.5 to make useful to wider audience)
– Nuke Scripts prepped to load all the presets
Important for me, as the browser in the plugin is irritating and I wanted to access the creative tools easily.
(These are Nuke 11.2 which I was running for my RnD. Simple but tedious to recreate in 10.5.5 if someone wanted to.)
– Half-rez previews for reference of the presets.
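Prepping those per-preset scripts is the kind of thing that's easier to template than to click through 193 times. A hedged sketch: the "OpticalFlares" node class and its "preset" knob name are assumptions here (verify the real knob names in your Nuke install), as is .ofp as the preset file extension.

```python
# Generate one .nk per Optical Flares preset file so each can be loaded
# straight from Flame/Pybox. The "OpticalFlares" node and "preset" knob
# names are assumptions -- verify against your Nuke install.
import os

TEMPLATE = """OpticalFlares {{
 preset "{preset_path}"
 name OpticalFlares1
}}
"""

def build_scripts(preset_dir, out_dir):
    """Write one Nuke script per .ofp preset found in preset_dir."""
    os.makedirs(out_dir, exist_ok=True)
    written = []
    for fname in sorted(os.listdir(preset_dir)):
        if not fname.endswith(".ofp"):
            continue
        nk_path = os.path.join(out_dir, os.path.splitext(fname)[0] + ".nk")
        with open(nk_path, "w") as fh:
            fh.write(TEMPLATE.format(preset_path=os.path.join(preset_dir, fname)))
        written.append(nk_path)
    return written
```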

– Create Action in Batch w an Axis and 3dCam.
– Select the Axis and 3dCam and export as FBX. Note: Exported geo seems to bake in transforms, but not the case w lights.
– Load a Pybox into batch.
– Choose the nuke_px.py setup.
– Hit the Nuke Composition button and choose the desired Nuke Script.
– Load the FBX data into Nuke.
– The result can be rendered in Nuke or piped back thru Pybox to view in Flame.

Here’s a ref clip of the 193 opticalFlares presets run thru flame:


Flame/Nuke: Pybox RnD w Optical Flares

Wanted to use the Action flares for interactivity, but then have the info render thru Nuke so I can use the OpticalFlares plugin.
Currently, I’ve exported thru FBX so I can verify camera, flare position, etc. which works great.
It’s a little sluggish, but the adsk_controller knobs do indeed work and pybox updates appropriately. With all the presets that are avail in OpticalFlares, seemed like a good solution. Use Flame for interactivity, Pybox/Nuke for tweaking the presets. Updating the comp is as easy as reloading the Nuke Script into Pybox.

Ideally I’d like to build out the custom knobs to drive pivot, scale, brightness, etc as well as 3d position of the flare from the adsk_controller.


Flame: Stab/Unstab


Demo of using perspGrid in 2D mode for stabilizing moves, w the inverse for the retrack. This tool is a Swiss Army knife. Obviously good for screens, but also useful for clean reflections, far bg fixes, etc. If the track doesn’t completely lock, adding an Action in the “FIX” area w a little old-school tracking can get you really close quickly. It’s tedious to link w expressions; instead, copy/paste new perspGrid nodes and turn “invert” on/off to refine the track.

Takes longer to watch it track than to do the work. Here’s hoping for a new Mac Pro which can take an nVidia card.

flame 2018.3 setup:

download setup


Flame: expressions let you drive an axis from Tangent Panels.


Although there are a few nodes that let you use the wheels and knobs, only the Color Corrector gives you many usable animation channels of data for expressions.
Since you need to be on a CC node, you must view the result in a Context. Further complications come from Offset, Gamma and Gain reacting differently, so each required a slightly different expression.
It would be great to wrap this in a GLSL “UI only” shader, allowing renaming of the panel readouts, but currently stuck w the default CC displays. If any Matchbox/Autodesk/Tangent wizards have any insight into a way to “trick” a UI_Only setup so that the panels will think it’s a CC node, please chime in.

Color Wheels = x, y, z translate
RGB Gamma knobs = x, y, z rotate
RGB Offset knobs = x, y, z scale
Contrast knob = proportional scale
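The reason each control class needs its own expression is that the controls rest at different defaults, so each has to be re-centered before driving the axis. A sketch of the idea; the gain constants and resting values here are illustrative assumptions, not the exact expressions in the setup.

```python
# Each CC control rests at a different default, so each mapping has to
# re-center differently before driving the axis channel. Gain constants
# and resting values below are illustrative assumptions.

def wheel_to_translate(v, gain=100.0):
    # color wheels rest at 0 -> translate 0
    return v * gain

def gamma_to_rotate(g, gain=90.0):
    # gamma knobs rest at 1 -> rotation 0 degrees
    return (g - 1.0) * gain

def offset_to_scale(o, gain=100.0):
    # offset knobs rest at 0 -> scale 100%
    return 100.0 + o * gain

def contrast_to_prop_scale(c, gain=100.0):
    # contrast knob assumed to rest at 1 -> proportional scale 100%
    return c * gain
```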

Keep in the UserBin to drag out, then copy/paste the slave axis into your 3D scene and parent it.
If you drag multiple times from the UserBin, each slave axis will retain its connections by default. Renaming the CTL axis will require updating the expressions. If the CTL axis is deleted, the animation keyframes are baked in.

Aside from being a novelty, I think the wheels and knobs are a better creative tool than sliders where they can be implemented. Especially true w cameras where controlling focus w knobs seems more intuitive.

flame 2018.3 setup:

download setup


Flame ROI for big plates updated

ROI fix for big plates.

flame 2018.3 setup:

download setup


ARRI sensor crop


More good info from that same ARRI page.
Shows how the different ARRI formats crop on the openGate sensor.

flame 2018.3 setup:
download setup


ARRI lens illumination

The ARRI website has lots of great info, among it the Lens Illumination Guide. It shows how different lenses are expected to vignette, and they’ve created a handy web app so you can preview and even download the images.

OpenClips are a really handy way of sorting arrays of data.
I downloaded the variations of focalLengths and lenses from the ARRI site and placed them in directories ready to be imported into flame as openClips. After creating the openClips, reimport them into flame; the video versions then point to the different focalLength renders, or any renders you choose.
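For reference, an openClip is just an XML file pointing a version list at different renders. The skeleton below is reconstructed from memory of Autodesk's Open Clip schema, so the exact attributes and required fields should be checked against the current schema docs; the paths are hypothetical stand-ins for the ARRI focalLength renders.

```xml
<clip type="clip" version="4">
  <name>lens_illumination</name>
  <tracks type="tracks">
    <track type="track" uid="t0">
      <trackType>video</trackType>
      <feeds currentVersion="v18mm">
        <feed type="feed" vuid="v18mm">
          <spans type="spans">
            <span type="span">
              <path>/path/to/18mm/illumination.jpg</path>
            </span>
          </spans>
        </feed>
        <feed type="feed" vuid="v25mm">
          <spans type="spans">
            <span type="span">
              <path>/path/to/25mm/illumination.jpg</path>
            </span>
          </spans>
        </feed>
      </feeds>
    </track>
  </tracks>
  <versions type="versions" currentVersion="v18mm">
    <version type="version" uid="v18mm"><name>18mm</name></version>
    <version type="version" uid="v25mm"><name>25mm</name></version>
  </versions>
</clip>
```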

After creation, I’ve moved the openClips to sort them and they retain their connections.

Flame archive 2018.3 w the intact openClips in it. It contains all the variations, so it’s big.

sourceImages: The original ARRI jpgs sorted in directories. A couple of the images aren’t good, but that’s what they have. This is what you’d use to create your own openClips.


Flame – drag nodes across the viewports.

The boundary is only in your mind. Brought to you by the hotkeys “Shift + A”, “Shift + F” and “Alt + 2”


The Tao of CTL+SHIFT+D.

Duplicate w upstream connections intact.  Use it. It’ll change yer life.  If you delete the geo and media in the Action, it’s a handy tool to drop in yer userBin.

flame 2018.3 setup:
download setup


Flame DoF continued…


Here’s a solution that addresses some of the aliasing issues w Z depth after discussion. The settings need to be tweaked, but this is where I’d start.  Works well w DoF, but not Blur3d, which can take the kernel.

flame 2018.3 setup:

download setup


Flame DoF

Made a setup to compare similar settings between the DepthOfField node and the 3DBlur node. (Also added a bokeh kernel to 3DBlur, since it’s a useful option.)

Basically, if you use the DoF node in the yellow box, it’s expression-linked to the nodes below it. Then you can compare.

FYI, something that peeps might not be aware of:
Camera near/far clipping planes are a great way to control your Z-depth pass coming out of Action. Like many out there, I used to use a CC to get it into a manageable range, but I’ve found this is a better solve since you can see it clearly from the Action Top View.
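The clipping-plane trick amounts to remapping scene depth into the 0..1 range between the two planes, which is the same job the CC used to do by hand. A sketch of that remap; whether white sits at the near or far plane is an assumption here and depends on the Action output convention.

```python
# Remap scene depth d into 0..1 between the camera near/far planes,
# clamped -- the same job the old CC-on-the-Z-pass trick did by hand.
# White-at-near is an assumption; flip the ratio if your convention differs.

def normalize_depth(d, near, far):
    t = (far - d) / (far - near)
    return max(0.0, min(1.0, t))
```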

flame 2018.3 setup:

download setup


Flame ROI for large images

Sample setup for the ROI fix. Keeping the ROI resolution divisible by 4 seems to avoid softening.  All that’s needed is to set the ROI crop and t-Click to match the orig size.
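If you're scripting the crop, snapping each dimension down to a multiple of 4 is trivial; a small helper (the divisible-by-4 behavior is an empirical observation, per above):

```python
# Snap a crop dimension down to the nearest multiple of 4, per the
# observation that ROI resolutions divisible by 4 avoid softening.

def snap4(n):
    return n - (n % 4)
```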

flame 2018.3 setup:

download setup


Flame note on OpenClip

Interesting note for Mac Flame folks creating openClips: the openClip creator app seems to use macOS keystrokes instead of Flame’s (CMD+C vs. Alt+C for copy, as an example). Also, the up/down arrow shortcut for naming doesn’t work. FYI


Camera RnD


Doing some camera RnD. Here’s a setup which uses data from the ARRI website to emulate lens illumination. OpenClip tracks contain tStop, and Mux switches handle lens, focal distance and sensor crop. Trying to decide whether the openClip image track is better used to sort the matrix by tStop or by lens.

flame 2018.3 setup:

download setup


Substance texturing in Flame

Last tests were with Mantra. These are from my mac flame. IBL and PBS are pretty cool tech.


Substance texture RnD in Houdini