Statement of Purpose

I will be creating a small diorama / environment of an astronaut who has crash-landed in an environment similar to Earth. It will be an exercise in various elements of particles and simulations in Houdini, and a learning experience in pyro, liquid simulation, and particles. Models will be built and textured in Maya and Substance, and some comp work will be done in Nuke.

I will also be experimenting with implementing particles and procedural effects in live-action shots. I will track footage in Nuke, bring in particles from Houdini, and composite everything in Nuke.

Responsibilities

Environment models / textures (grass, ground, trees, water)

Spaceship models / textures

All compositing work

Simulations

Schedule

● Week 1: Brainstorming

● Week 2: Proposal and approval

● Week 3: Layout and environment blocking

● Week 4: Grass simulation / particles

● Week 5: Water simulation

● Week 6: Environment texturing / beginning ship model

● Week 7: Texturing ship / beginning pyro

● Week 8: Pyro and finalizing texturing on any remaining components

● Week 9: Final renders / Comp tweaks

● Week 10: Present Final

Week 3

This week I started doing a basic layout of how I wanted the scene to look. I also started on some of the particle systems that will be used for the grass. I am following a tutorial that can be found here: https://www.youtube.com/watch?v=-q13eJOztyg

The basic setup is to copy small lines onto points scattered across a plane, with randomness applied to both the length and the orientation of each line.
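
A minimal sketch of that per-point randomization, written as a Point Wrangle run on the scattered points before the Copy to Points SOP (the channel names and ranges here are my own placeholders, not the tutorial's exact values):

// Point Wrangle on the scattered points, before Copy to Points.
// Randomize the length and orientation of each copied grass blade.
float minscale = chf("min_scale");   // e.g. 0.6
float maxscale = chf("max_scale");   // e.g. 1.4

// Per-point random length (Copy to Points reads pscale)
@pscale = fit01(rand(@ptnum), minscale, maxscale);

// Random spin around the up axis plus a small random lean (read as orient)
float spin = rand(@ptnum + 311) * 2 * PI;
float lean = fit01(rand(@ptnum + 97), -0.2, 0.2);
p@orient = qmultiply(quaternion(spin, {0, 1, 0}), quaternion(lean, {1, 0, 0}));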

I am blocking in objects with simple geometry. I have a small box to represent the main character, a simple shape to represent the ship, and a few thin boxes to represent the trees.

Week 4

This week, I started working on the textures for the environment and cleaned up the environment block so that the water will connect to it better. I used a Boolean node with a box around the environment to cut the extra edges off, which made the sides nice and clean so that everything lined up when the water block was added.

For the textures, I have been using maps from 3Dtextures.me. I have been teaching myself effective ways of UV unwrapping objects in Houdini. Since most objects in my scene are simple shapes (boxes, tubes, etc.), it was pretty easy to unwrap them using settings such as face projection and cylindrical projection. The tree's UV map had to be edited by hand to line up the seams a little better.

I didn't make much progress on the ocean simulation. The body of water I am trying to create is very small, so it is hard for the ocean solver to get a good amount of detail in such a small area. For example, the ocean foam solver isn't effective because there is not enough detail in the waves for foam to form. I will try a few more strategies this week, and I have also started looking into the shaders for the ocean.

Week 5

This week I've put the brakes on my first project a little bit. I used SideFX's tree generator to make some willow trees for my scene.

The process for setting up the trees was simple: a tree trunk generator feeds into a branch generator, and the overall system can be altered via a controller, which can add randomness to all of the parameters, producing slightly different trees that can be dropped into the scene. The generator nodes all look pretty similar. Altering branch length, taper, and how strongly gravity affects the branches was basically all I played with to achieve the effect; as the tree grew upwards I increased gravity's effect on the branches to give them more "droop."

I also polyreduced the trees by 50%. Since they won't be seen close up, I figured I could cut their polycount to improve performance and still keep their general silhouettes.

I also began a new project for this class. I wanted to try integrating a particle system into footage shot by me, which would be a good exercise in both Nuke and Houdini, two areas I want to get better in. This week I completed a preliminary pipeline to prove that I could track a face and emit particles from it. I started my test by tracking stock footage I found online.

I then exported the tracking data from Nuke to Maya (though as I've thought about it further, I'm wondering if I can skip this step completely). In Maya I converted the data from a camera track to an object track: when Nuke tracked the footage, it assumed the camera was moving, so I kept the camera static and applied that movement to a sphere representing the face.
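
If I do end up skipping Maya, the same inversion could probably be done directly in Houdini with a small wrangle. This is just a concept sketch, assuming the tracked camera were imported at /obj/tracked_cam (a made-up path):

// Point Wrangle (concept sketch): keep the camera static by baking the
// camera's inverse motion into the proxy geometry instead.
matrix cam = optransform("/obj/tracked_cam");  // tracked camera's world transform
@P *= invert(cam);                             // the proxy now moves, the camera stays put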

From there, I exported an FBX from Maya to Houdini, where I used that sphere representing the face as a particle emitter. Overall I would call it a successful test. I understand how I am going to go about my project and know a few things to watch out for when shooting my footage. I will also be putting markers on my actor's face for a better, more accurate track, and I still believe I can skip the Maya step entirely and go straight from Nuke to Houdini.

Week 6

This week I've put more time into getting a good workflow. I shot some footage over the weekend to try to track, but Nuke could not get a good track from it. I'm not sure whether the area I was trying to track was simply too small.

I went back to Blender to try its tracking, and it actually gave me a very good track. I decided to stay in Blender for the particles because the footage was shot at 23.98 fps and I couldn't get Houdini to match that framerate; it kept rounding to 24 fps, which would throw my emitter out of sync with the actress's face.

Blender's particle system felt very similar to working with Houdini's DOP networks. I tracked a face base mesh to the actress's face and began emitting particles from there. Once I put the result back over my original footage, I wasn't sure how to blend the two images together or how I wanted the end product to look. Since the sweatshirt creates a black void on the inside, I drew a roto in Nuke to turn it into a black hole for the particles to come out of. I plan on setting my particles back further so that they look more like they are emerging from this void where the person's face is. I also plan on shooting more footage this week and playing around with more experimental shots!

Week 7

This week I've shifted gears yet again to more exploration in Houdini and integration, starting with some stock footage. My own footage had a few issues, the main one being that the surface I was trying to track was too small to get a good track from. So this time I started with something that has a lot more detail and a lot more areas to track: a busy street.

I started by rotoing out the windows of the building to make them seem open and empty, since I wanted to emit particles through them. I then tracked the footage and got a good track that I could work with in Houdini.

Once in Houdini, I imported the tracked point cloud and lined up the building I wanted the particles to come out of, as well as the building on the right-hand side of the shot that would cast shadows over the particles. I then set up an emitter inside the building and simulated balloons exiting through the windows, adding some noise and wind to the simulation to give them interesting flight paths.
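
The forces themselves are simple. A POP Wrangle along these lines gives the general idea of a steady wind plus turbulence (a sketch only; my actual setup used the built-in force nodes, and the channel names here are made up):

// POP Wrangle sketch: constant drift plus curl-noise turbulence.
vector wind = chv("wind");                          // e.g. {0.3, 1.2, 0.0}
float  amp  = chf("turbulence_amp");
float  freq = chf("turbulence_freq");
v@force += wind + curlnoise(@P * freq + @Time) * amp;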

I also created a loop with a switch node so that I could randomly assign the balloons different colors as well as slightly different sizes. Soon after, I discovered that Houdini can use the Cryptomatte plugin, which will help me a ton in Nuke: I can render the scene as-is, with both buildings casting shadows on the balloons, then use Cryptomatte to pull the buildings out of the shot so they won't be seen in the final comp.
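
For reference, the same random color and size assignment could also be done in a single Point Wrangle rather than a loop and switch; a rough sketch of that equivalent approach:

// Point Wrangle sketch: pick a random color from a small palette and
// jitter the scale per balloon (equivalent to the loop + switch setup).
vector palette[] = { {1, 0.2, 0.2}, {0.2, 0.4, 1}, {1, 0.9, 0.3}, {0.3, 1, 0.4} };
int idx = int(rand(@ptnum * 13.7) * len(palette));
@Cd = palette[idx];
@pscale = fit01(rand(@ptnum + 42), 0.8, 1.2);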

Week 8

This week I set myself a new challenge: tracking and implementing CG into an interior scene. I wanted to see if I could get particles to interact accurately with the environment and the obstacles in it.

I once again started by finding stock footage that seemed like a good match. I found an interior living room on Pexels.com and decided to track it. I managed to get a good track (error < 0.8) and lined up some basic geometry that I could use as reference, then exported the scene and imported it into Houdini.

Before getting into Houdini, I wanted some sort of animation to act as my emitter this time. In the balloon project I used a simple static cube as the source; here I wanted something more dynamic, so I found an animation on Mixamo that I liked, downloaded it, and imported it into my scene as well.

I also wanted to take the simulation one step further by making the particles themselves more dynamic. I wrote a simple script that checks the age of each particle and then scales it and sets its color accordingly, which makes the whole system more interesting by creating a gradient across the particles over time. My next step this week will be to start implementing better geometry into the scene for the particles to interact and collide with.
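
The script boils down to something like this POP Wrangle (a simplified sketch; the colors and ranges here are placeholders, not my exact values):

// POP Wrangle sketch: drive each particle's scale and color from its age.
float t = clamp(@age / @life, 0, 1);               // 0 = newborn, 1 = end of life
@pscale = fit(t, 0, 1, 0.02, 0.15);                // grow over the lifetime
@Cd = lerp({1.0, 0.8, 0.2}, {0.5, 0.1, 0.9}, t);   // warm-to-cool gradient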

Week 9

This week I worked with soft bodies for the first time. Since I had yet to experiment with them, I decided to make a fun simulation with them, using my interior tracked scene because I wanted the soft bodies to interact with the furniture.

I started by making sure the furniture had proxy geometry to serve as colliders for the simulation, using the track data as well as the other furniture as reference points to line everything up.

I then created a sphere to serve as the emitter for the soft bodies, and added two attribute wrangles to give the spheres some initial velocity so they shoot off in a direction rather than just fall to the ground (credit to Christian Bohm for the code). I then created the soft body struts and constraints that control the "softness" of the spheres as well as how "pressurized" they are.
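
I won't reproduce the credited code here, but the general idea of such a wrangle is simple. A generic sketch, with made-up parameter names:

// Point Wrangle sketch: give each sphere an initial velocity so it launches
// in a chosen direction instead of simply dropping under gravity.
vector dir    = normalize(chv("launch_dir"));  // direction to shoot toward
float  speed  = chf("launch_speed");           // base speed
float  jitter = chf("jitter");                 // per-sphere variation
vector r = rand(@ptnum + 0.137);               // random vector in [0, 1)
v@v = dir * speed + (r - 0.5) * jitter;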

I also included the DOP network setup. It is a standard setup: importing the source into the DOP, adding gravity, and so on. The only thing I changed is the activation on the source. By default a sphere would be spawned every frame; I didn't want this, so I changed the activation to an expression that divides the frame number by 20 and only spawns a sphere when the remainder is 0 (something along the lines of $F % 20 == 0), which results in one sphere every 20 frames.

Everything else used the same workflow as my other experiments: Cryptomatte to mask out the spheres and color correction in Nuke.

Week 10

This final week was spent mostly wrestling with rendering. I still don't fully understand Houdini's shadow-matte workflow and how it expects you to pull CG shadows from renders. It is quite different from Maya in this regard, so a lot of experimentation and research was needed.

Another huge problem is the chair in the foreground. It is a very awkward shape to roto by hand, so I tried pulling keys to create the matte for it. That was pretty unsuccessful, so I tried tracking it and using a tracker to drive my roto, but because of the camera pan and rotation, the roto would still have to be adjusted by hand, which basically brought me back to square one.

Another issue was that the collision geometry in my Houdini scene wasn't accurate to the background plate, causing the soft bodies to collide before they hit the furniture and resulting in a weird-looking composite. I had to go back and adjust the collision geometry so the collision lines would line up better.

Overall, I am still working on this project. I think the end result will look very cool if I can solve these last few issues.

Looking back on my work over this quarter, my aim for this class changed drastically over time. What started as a modeling and look development project turned into a ton of learning and experimentation in Houdini. I am grateful to have gotten the experience of these experiments; they are going to help me so much in the future as I look to specialize in FX.
