Project #3: Project Settings and Updates

Final Video & Breakdown Link:

https://vimeo.com/208163542

frame_311

(a preview of frame 311 in Nuke)

Maya/Arnold Settings

AOVs, Initial Render Settings, and Sampling/Ray Depth/Motion Blur Values:

Render Layers Setup:

The only layers I didn’t end up using were “ground Projection” and “groundRefMask”, which was a duplicate of “justFloorReflect”.

render_layers

Render Times


Key Light & allObjectReflection Layers: ~30 – 50 minutes per frame

Fill Light Layer: ~60 – 100 minutes per frame

Shadow and Ground Bounce Layers: ~8 – 20 minutes per frame

I rendered all of my “heavier” layers on the render farm, and I opted to render many of the very inexpensive mask layers locally. For the final submission of this project, I also added motion blur and depth of field in the render process.

The fill layer was my most expensive by far, largely due to the sampling rates and the refractive material.
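For a rough sense of total cost, the per-frame ranges above can be turned into machine-hours. A small sketch, assuming (purely for illustration) a 311-frame sequence and the midpoint of each quoted range:

```python
# Rough render-time budget from the per-frame estimates above.
# The 311-frame count and midpoint minutes are assumptions for illustration.
FRAMES = 311

# (layer, assumed minutes per frame) -- midpoints of the quoted ranges
layers = {
    "key / allObjectReflection": 40,   # ~30-50 min per frame
    "fill": 80,                        # ~60-100 min per frame
    "shadow / ground bounce": 14,      # ~8-20 min per frame
}

# Convert to total machine-hours per layer for the whole sequence
hours = {name: FRAMES * mins / 60 for name, mins in layers.items()}
for name, h in hours.items():
    print(f"{name}: ~{h:.0f} machine-hours")
```

At these rates the fill layer alone is roughly double the cost of the key layer, which is consistent with it being the most expensive pass.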

Nuke Node Tree

Total Node Tree:

 

Foreground and Background Elements:

*Foreground includes key, fill, ground bounce, and object occlusion layers.

*Background includes shadow and ground occlusion layers.
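In this split, the foreground elements get merged over the background. The default behavior of Nuke's Merge (over) node is the standard premultiplied "over" operation; a minimal sketch of that math in plain Python (illustrative, not Nuke code):

```python
def over(fg, bg):
    """Premultiplied 'over' combine: fg and bg are (r, g, b, a) tuples.

    out = FG + BG * (1 - FG.alpha), the math behind the default
    'over' operation in a compositing merge.
    """
    a = fg[3]
    return tuple(f + b * (1.0 - a) for f, b in zip(fg, bg))

# A half-transparent red foreground over an opaque blue background:
result = over((0.5, 0.0, 0.0, 0.5), (0.0, 0.0, 1.0, 1.0))
print(result)  # (0.5, 0.0, 0.5, 1.0)
```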

 

Ground Reflection Elements:

 

Dancer Reflections & Mirror Bar Roto:

 

Final Elements:

*Includes Lightwrap, Final Merge, Firefly Killer plug-in, and Edge Blur.

*Special thanks to Dhruv for sending our class a link to the Firefly Killer Gizmo script which can be found here:

http://muellerstefan.de/?p=20


Project #3: Troubleshooting Alembic Files and UVs from Houdini to Maya

While working on the shading for my dancers in Maya, I noticed that I was unable to add textures, as my alembic files had no UVs when they were imported into Maya. After a little bit of research, I decided to do some tests to see if I could troubleshoot the problem and come up with a workaround.

I went back into Houdini to revisit my sphere dancer file. The original node structure for the sphere dancer looked like this:

basic_chrome_sphere_structure

Spheres were copied onto the points of a particle system attached to the motion capture character rig. I knew that the geometry and animation worked when brought into Maya, but that the UVs were missing. I first added some Houdini UV nodes to the node tree and opened the UV Layout view in Houdini to study the results.

Below is my first attempt at adjusting the node tree structure:

uv_combo_001

At the UVTexture node level (before processing any sort of UV unwrapping or layout), all of the spheres on the dancer’s body were on top of each other in the UV view:

original_uvs.png

Once UVUnwrap and UVLayout nodes were added, the result looked something like this:

uv_layout_002.png

It was evident that the spheres were unwrapping in an unnatural way. I brought a test alembic of the dancer into Maya and applied an aiStandard shader with a ramp in the diffuse color channel to study the results. In the image below, you can see how the colors in the ramp shader read in pieces rather than as a smooth ramp.

The ramp that plugged into the diffuse channel of an aiStandard shader:

ramp_shader.png

The rendered result:

new_uvs_from_houdini_001
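The blocky result makes sense once the ramp lookup is written out: a linear ramp interpolates color from the v coordinate, so when a sphere's UVs collapse into a narrow v band, every pixel on that sphere samples nearly the same color. A toy Python sketch (illustrative colors, not the actual shader):

```python
def ramp(v, start=(0.0, 0.0, 1.0), end=(1.0, 0.0, 0.0)):
    """Linear two-color ramp: blue at v=0 blending to red at v=1."""
    v = max(0.0, min(1.0, v))
    return tuple(s + (e - s) * v for s, e in zip(start, end))

# Well-laid-out UVs sweep the full 0..1 range -> a smooth gradient:
smooth = [ramp(v / 4) for v in range(5)]

# Broken UVs pin a whole sphere into a tiny v band -> one flat color:
flat = [ramp(0.48 + v * 0.01) for v in range(5)]
print(smooth)
print(flat)
```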

An additional test with some tweaking of the UV node parameters in Houdini:

side_by_side_comparison_uvs

The tweaking of values (left figure) resulted in a slight improvement, but the result still wasn’t as smooth or as responsive to the ramp as I would have liked. I revisited the Houdini file and adjusted the UV process, moving the UV nodes higher up in the node hierarchy so that the UVs were established on the base sphere before it was copied to multiple points. With a bit of trial and error, I realized I did not need the UVUnwrap node for this process, so I flagged it yellow to “bypass” it.

Adjusted node tree structure:

proper_uv_network.png
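The reordering works because copy-type operations duplicate the template geometry's attributes onto every copy, so UVs authored on the base sphere carry over unchanged. A toy sketch of that idea in plain Python (a stand-in, not Houdini's actual data model):

```python
from copy import deepcopy

# Toy stand-in for template geometry: points carry pos and uv attributes.
base_sphere = [{"pos": (0.0, 1.0, 0.0), "uv": (0.5, 1.0)},
               {"pos": (1.0, 0.0, 0.0), "uv": (0.5, 0.5)}]

def copy_to_points(template, points):
    """Copy the template once per target point, offsetting positions.
    Every attribute on the template (including uv) is carried along."""
    copies = []
    for px, py, pz in points:
        for pt in template:
            new_pt = deepcopy(pt)
            x, y, z = pt["pos"]
            new_pt["pos"] = (x + px, y + py, z + pz)
            copies.append(new_pt)
    return copies

dancer_points = [(0, 0, 0), (2, 0, 0), (0, 2, 0)]
copies = copy_to_points(base_sphere, dancer_points)

# Each copied sphere keeps the UVs authored on the base sphere:
assert all(c["uv"] in {(0.5, 1.0), (0.5, 0.5)} for c in copies)
```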

I also switched the UVTexture node’s “Texture Type” parameter from “Orthographic” to “Polar”, which gave me a different result:

polar_parameters.png

Adjusted polar UVs for one sphere:

uv_texture_polar.png

Now all of the copied spheres should have the same polar UVs. I tested this in Maya with the same ramp shader, using side-by-side comparisons to study exactly what changed.
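A polar (spherical) projection maps each point on the sphere from its longitude and latitude to u and v. A minimal sketch of that math in Python, as an illustration rather than Houdini's exact UV Texture implementation:

```python
import math

def polar_uv(x, y, z):
    """Spherical/polar projection for a point on a unit sphere:
    u from the longitude (atan2 around the vertical axis),
    v from the latitude (asin of the height)."""
    u = 0.5 + math.atan2(z, x) / (2.0 * math.pi)
    v = 0.5 + math.asin(max(-1.0, min(1.0, y))) / math.pi
    return u, v

print(polar_uv(1.0, 0.0, 0.0))   # a point on the equator: (0.5, 0.5)
print(polar_uv(0.0, 1.0, 0.0))   # top pole: v reaches 1.0
```

Because v now runs continuously from pole to pole, a ramp plugged into the diffuse channel sweeps smoothly across each sphere instead of reading in pieces.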

The figure on the far right is the dancer with polar UV coordinates for each sphere. The linear ramp was noticeably smoother in this iteration. I also added noise to study how the UVs broke up the noise across the surface of the spheres.

 

After running these tests, I used the polar UV method and re-imported the alembic into my dance studio scene.

Project #3 Update: Shaders and Tracking

Issues:

  • Reflections work in the chrome spheres but not in a plane that is made to be reflective. I tried different materials on the plane, adjusted specularity and refraction, and rotated the aiSkyDome, but I am still not sure why this is happening.
  • Scale of scene and placement of objects
  • The dancer motion capture data starts on frame 425 and I’m not sure how to adjust this so that it starts on frame 1 instead.

Update:

Professor Gaynor showed me how to offset the animation of an alembic file. The following controls are located in a tab of the Attribute Editor. By setting the frame offset to –425, I was able to start the animation much closer to frame 1.

time_offset
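The arithmetic behind the fix is simple. Assuming the offset is added to the cached frame to get the playback frame (the sign convention implied by entering –425), a clip cached from frame 425 onward plays back starting at frame 0. A small hypothetical sketch:

```python
MOCAP_START = 425          # first frame of the motion-capture cache
offset = -MOCAP_START      # the value entered in the frame offset field

def playback_frame(cache_frame, offset):
    """A cached sample at cache_frame plays back at cache_frame + offset.
    (Assumed sign convention; matches the -425 offset used above.)"""
    return cache_frame + offset

print(playback_frame(MOCAP_START, offset))  # 0 -> the clip now starts near frame 1
```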

Project #3: Shader & Effects Tests

I did some quick shader and effects tests today in class. I brought my original chrome sphere dancer back into Houdini to try to adjust the shape and form. Then, I brought in the new dancing figures as an alembic file to test in Arnold for Maya.


I also did a quick render test of about 100 frames at a lower resolution (720 x 405) so that I could see what the shaders looked like with movement added. I still have a long way to go, and there is no compositing happening in this render yet (I rendered very quickly out of the master layer). The test footage is below:

Project #3: Quick Update

Due to illness, I was unable to make a lot of progress on Project #3. I began camera tracking, started the gray ball match, and started to set up render layers in Maya. I was able to export an .fbx file from Nuke with the camera and point cloud and bring it into Maya as a test.

I plan to continue refining the camera track, keep working on the gray ball match with the still frame, and further develop the shaders. This quarter I plan to develop and integrate two motion capture effects. The first is a motion capture asset with chrome spheres, whose shaders I plan to adjust. The other is a fish-scale pattern that I was able to get working in Houdini, but not in Maya, last quarter. I’d like to get this effect functioning in Maya so I can further develop the shader and overall look, and I plan to try out refractive materials on the motion capture geometry.

  1. First image: chrome on fish-scale effect
  2. Second image: chrome spheres in a quick lighting test; the chrome spheres can be adjusted to be mesh lights
  3. Third image: chrome sphere motion capture integrated into the car project by Adrienne Johnson (Fall 2016)