
Demo Reel

Breakdown

Finding Dory -- Crowds Technical Supervisor

Supervised a team of crowd TDs to build a pipeline and execute over 600 crowd shots.

Contribution to Depicted Shots:  Created the "fish brain", ran the majority of the fish simulations, and supervised the rest.  Supervised placement of human crowds.  Collaborated on render optimization profiling and tuning.  This work will be presented at SIGGRAPH 2016

Houdini, RenderMan (RIS), Katana, Presto

The Good Dinosaur -- Crowd TD

Stepped in to simulate some challenging crowd shots of a dinosaur parting a sea of birds.

Contribution to Depicted Shots:  Ran the bird simulations and made brain modifications.  This work will be presented at SIGGRAPH 2016 

Houdini, RenderMan (REYES)

Up -- Crowd TD

Developed a canine locomotion brain in collaboration with JD Northrop and executed shots. 

Contribution to Depicted Shots: Simulated the pursuing dogs.

Massive

Ratatouille -- Crowd TD

Learned Massive on the job, collaborated with David Ryu to develop a general rat locomotion brain, and executed shots.

Contribution to Depicted Shots: Simulated the swarming rats.  This work was presented at SIGGRAPH 2007

Massive

Cars 2 -- Render Optimization TD

Refactored and improved the Cars-era "shrinkwrapping" technique, which encodes the geometric differences between a cube and an object as a displacement texture.
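
As an illustration of the idea only (not the production code), here is a minimal Python sketch that bakes one face of such a displacement "texture".  The target object is stood in by an implicit sphere so the example stays self-contained, and each texel stores how far that point on the cube must move toward the center to land on the target surface.

import numpy as np

# Hypothetical stand-in for the real shrinkwrap bake: the target object here
# is an implicit sphere rather than a production mesh, and the "texture" is a
# plain array instead of an on-disk displacement map.
def bake_shrinkwrap_face(face_axis, face_sign, resolution=64, sphere_radius=0.8):
    """Bake one cube face's displacement map.

    For every texel on a face of the cube [-1, 1]^3, cast a ray toward the
    cube's center and record the distance travelled before reaching the
    target surface.  Displacing the cube point by that distance along the
    ray reconstructs the object.
    """
    u = np.linspace(-1.0, 1.0, resolution)
    v = np.linspace(-1.0, 1.0, resolution)
    uu, vv = np.meshgrid(u, v)

    # Build the 3D position of each texel on the chosen cube face.
    p = np.zeros((resolution, resolution, 3))
    other = [a for a in range(3) if a != face_axis]
    p[..., face_axis] = face_sign
    p[..., other[0]] = uu
    p[..., other[1]] = vv

    # Distance from the texel to the sphere along the ray toward the center.
    dist_to_center = np.linalg.norm(p, axis=-1)
    displacement = dist_to_center - sphere_radius
    return displacement  # one face of the displacement "texture"

# Example: bake the +X face; corners displace further inward than the center,
# which is exactly the difference between the cube and the sphere.
disp = bake_shrinkwrap_face(face_axis=0, face_sign=1.0)
print(disp.shape, float(disp.min()), float(disp.max()))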

Contribution to Depicted Shots:  Worked on the underlying shrinkwrapping technology.  This work was presented at SIGGRAPH 2011

RenderMan Shading Language (RSL), Python

Monsters University -- Software Engineer

Developed and deployed a pipeline for baking varying BRDF parameters into brickmaps (sparse voxel octrees) and PTEX (per-face textures).  The brickmap version yielded significant speed and memory improvements on crowd shots.
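
For illustration only, the sketch below bakes averaged BRDF parameters into a toy sparse voxel table.  A real brickmap is a mip-mapped sparse voxel octree stored on disk, so the class and its names here are hypothetical stand-ins for the concept, not the RenderMan format.

from collections import defaultdict
import numpy as np

class ToyBrickmap:
    """Toy sparse voxel cache: average BRDF parameters per occupied voxel."""

    def __init__(self, voxel_size=0.05):
        self.voxel_size = voxel_size
        self._sums = defaultdict(lambda: np.zeros(4))   # accumulated params
        self._counts = defaultdict(int)                 # samples per voxel

    def _key(self, position):
        # Quantize a 3D position to an integer voxel coordinate.
        return tuple(np.floor(np.asarray(position) / self.voxel_size).astype(int))

    def bake_sample(self, position, brdf_params):
        """Accumulate one shading sample (e.g. diffuse RGB + roughness)."""
        key = self._key(position)
        self._sums[key] += np.asarray(brdf_params, dtype=float)
        self._counts[key] += 1

    def lookup(self, position):
        """Return the averaged parameters stored near this position, if any."""
        key = self._key(position)
        if key not in self._counts:
            return None
        return self._sums[key] / self._counts[key]

# Usage: bake a few samples at pre-pass time, then read them back at render
# time instead of re-evaluating the full shading network per hit.
bm = ToyBrickmap()
bm.bake_sample((0.01, 0.02, 0.0), (0.8, 0.2, 0.1, 0.4))
bm.bake_sample((0.02, 0.01, 0.0), (0.6, 0.3, 0.1, 0.5))
print(bm.lookup((0.015, 0.015, 0.0)))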

Contribution to Depicted Shots: Optimized render times and memory.  This work was presented at SIGGRAPH 2014

Brickmaps (Sparse Voxel Octrees), PTEX, RenderMan, Shader Simplification

WALL-E -- Crowd TD

Developed a method for simulating spring physics using the signal processing capabilities of brain nodes in Massive (dubbed "Brain Springs").  This work was later patented.  Created a quaternion filtering pipeline for smoothing simulation data.  Designed the general robot and human locomotion brains.  Maintained, debugged, and improved the RenderMan plugins which supported lighting and shading on agents posed by Massive.
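
The underlying recurrence is simple to show outside of Massive.  The following Python sketch is illustrative, not the patented node network: it filters a per-frame input signal through a damped spring, the kind of lag-and-settle secondary motion Brain Springs provided, using only the delay-and-arithmetic operations a brain node graph can express.

def brain_spring(signal, stiffness=0.2, damping=0.3, dt=1.0):
    """Filter a per-frame signal through a damped spring.

    The spring state (position, velocity) is updated with one semi-implicit
    integration step per frame, i.e. a simple recurrence on the last frame's
    values, which maps directly onto signal-processing style brain nodes.
    """
    position, velocity = signal[0], 0.0
    out = []
    for target in signal:
        accel = stiffness * (target - position) - damping * velocity
        velocity += accel * dt
        position += velocity * dt
        out.append(position)
    return out

# Example: a step input produces a smoothed, slightly overshooting response,
# which reads as believable lag and settle on an agent's limb or antenna.
step = [0.0] * 5 + [1.0] * 25
print([round(x, 3) for x in brain_spring(step)])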

Contribution to Depicted Shots: Simulated humans in floating chairs and service robots.  This work was presented at SIGGRAPH 2008

Massive, Python, C++, RenderMan

Brave -- Crowds Lead

Led a team of TDs to plan and build a new crowds pipeline and execute shots for the film Brave.

Contribution to Depicted Shots:  Developed the geometry caching and sequencing pipeline.  Supervised crowd TDs in building the Massive import pipeline.  Executed these shots.  This work was presented at SIGGRAPH 2012
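
A finite state machine sequencer of this kind can be sketched in a few lines of Python.  The states, clip names, and conditions below are hypothetical and only illustrate how pre-baked geometry cache clips might be chained per agent; they are not the production system.

class ClipState:
    def __init__(self, name, length, transitions):
        self.name = name                  # clip / geometry cache name
        self.length = length              # clip length in frames
        self.transitions = transitions    # {condition_name: next_state_name}

STATES = {
    "idle": ClipState("idle", 48, {"crowd_near": "walk"}),
    "walk": ClipState("walk", 32, {"crowd_near": "walk", "default": "idle"}),
}

def sequence(conditions_per_frame, start="idle"):
    """Return (frame, clip, local_frame) triples for one agent."""
    state, local = STATES[start], 0
    playback = []
    for frame, conditions in enumerate(conditions_per_frame):
        playback.append((frame, state.name, local))
        local += 1
        if local >= state.length:          # clip finished: pick the next one
            next_name = next(
                (dst for cond, dst in state.transitions.items() if cond in conditions),
                state.transitions.get("default", state.name),
            )
            state, local = STATES[next_name], 0
    return playback

# Example: the agent idles until a "crowd_near" condition starts firing, then
# switches to the walk clip at the next clip boundary.
frames = [set() if f < 60 else {"crowd_near"} for f in range(120)]
print(sequence(frames)[::24])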

Geometry Cache Pipeline, Finite State Machine Sequencer

Academy of Art University -- Instructor for Rendering and Crowd Simulation

I teach yearly classes in Rendering and Crowd Simulation, for which I developed the curriculum.   The images depicted are students' projects and belong to:

Yeong Kyeong Kang

Lalida Karnjanasirirat

Ruchirek Somrit

Kai An Chuang

Jongeon Lee

Xinran Yu

Jae Jun Yi

Samuel Felix Eugene Martono

Justin Schubert

Arnold Moon

Kexing Yang

Ahna Ko

Kai An Chuang

Jaehark Kim

Chinatip Tangsiripat

Zheng Si

RenderMan for Maya, Slim, Massive

Nerdy Side Projects

Besides film and teaching, I've been pursuing plenty of nerdy side projects.  For a sampling, check out:

# An Audulus implementation of Muse's sequencer FX in Map of the Problematique

https://www.youtube.com/watch?v=sQDHFIoM1UY

# A Unity implementation of a "Cliodynamics" simulation for the rise and fall of states:

https://www.youtube.com/watch?v=C2ZHC2mfcVg

# A test of the neural style transfer implementation found at https://github.com/jcjohnson/neural-style

https://www.youtube.com/watch?v=UNl4_nrcXHY

# Advised and aided a team of artists creating a short proof of concept film for VR in Unity

http://kaleidovr.com/project/the-last-mountain

# To demonstrate to my students that the procedural shading I teach in my class can be implemented in real time, I created a ray marching shader of an aging apple in Shadertoy (a minimal sketch of the marching loop follows this list):

https://www.shadertoy.com/view/ldB3WD

# Along similar lines, this is an implementation of an aging orange using a GLSL node network in Mental Mill:

https://www.youtube.com/watch?v=EVFxzD3LQg8
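
For reference, the Shadertoy project above boils down to the standard sphere-tracing loop.  Here is a minimal Python version of that loop; a plain sphere stands in for the procedural apple SDF, so treat the functions and constants as illustrative.

import math

def sdf(p):
    # Signed distance to a unit sphere at the origin; the apple shader swaps
    # in a more elaborate procedural distance field here.
    return math.sqrt(p[0]**2 + p[1]**2 + p[2]**2) - 1.0

def march(origin, direction, max_steps=64, eps=1e-3, max_dist=20.0):
    """Step along the ray by the SDF value until we hit the surface or give up."""
    t = 0.0
    for _ in range(max_steps):
        p = [origin[i] + t * direction[i] for i in range(3)]
        d = sdf(p)
        if d < eps:
            return t          # hit: distance along the ray
        t += d
        if t > max_dist:
            break
    return None               # miss

# Render a coarse ASCII silhouette by shooting one ray per character.
for y in range(12, -13, -2):
    row = ""
    for x in range(-24, 25, 2):
        hit = march((x / 12.0, y / 12.0, -3.0), (0.0, 0.0, 1.0))
        row += "#" if hit is not None else "."
    print(row)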

Unity, Audulus, VR, Machine Learning, Shadertoy, Mental Mill

Music

Song 1: A collaboration between Taylor Holliday (guitar) and myself (bass).  Both of us did some cheesy MIDI work for the synths.

Song 2: A short piece I did on guitar and bass, with Logic's auto-drummer.

Song 3: A cover of Capital Cities' "Safe and Sound", where I play bass and guitar.  The drums are from a publicly available MIDI file.

Bass, Guitar, Audulus, Logic

