Project #3 Update: Shaders and Tracking

Issues:

  • Reflections work in the chrome spheres but not on a plane made to be reflective. I tried different materials on the plane, adjusted specularity and refraction, and rotated the aiSkyDome, but I'm still not sure why this is happening.
  • The scale of the scene and the placement of objects still need work.
  • The dancer motion capture data starts on frame 425, and I'm not sure how to adjust it so that it starts on frame 1 instead.

Update:

Professor Gaynor showed me how to offset the animation of an Alembic file. The controls are located in a tab of the Attribute Editor. By setting the frame offset to -425, I was able to start the animation much closer to frame 1.

time_offset
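As a sanity check on the numbers, the remapping is just an addition: the cache's source frame plus the offset gives the frame where it plays back. A minimal sketch in plain Python (not Maya-specific; the helper names are my own):

```python
def alembic_offset(source_start, target_start=1):
    """Frame offset needed so a cache starting at source_start plays at target_start."""
    return target_start - source_start

def remapped_frame(source_frame, offset):
    """Frame at which a cached frame appears once the offset is applied."""
    return source_frame + offset

print(alembic_offset(425))         # -424 puts the first cached frame exactly on frame 1
print(remapped_frame(425, -425))   # 0: the -425 I used lands the first frame on frame 0
```

This also explains the "much closer to frame 1": an offset of -425 maps frame 425 to frame 0, while -424 would land it exactly on frame 1.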

Project #3: Shader & Effects Tests

I did some quick shader and effects tests today in class. I brought my original chrome sphere dancer back into Houdini to try to adjust the shape and form. Then, I brought in the new dancing figures as an alembic file to test in Arnold for Maya.


I also did a quick render test at a lower resolution (720 x 405) of about 100 frames so that I could see what the shaders looked like with movement added. I still have a long way to go and there is no compositing happening in this render yet (I rendered really quickly out of the master layer). The test footage is below:

Project #3: Quick Update

Due to illness, I was unable to make a lot of progress on Project #3. I began camera tracking, started the gray ball match, and started to set up render layers in Maya. I was able to export an .fbx file from Nuke with the camera and point cloud and bring it into Maya as a test.

I plan to continue refining the camera track, keep working on the gray ball match with the still frame, and further develop the shaders. This quarter I plan to develop and integrate two motion capture effects. The first is a motion capture asset with chrome spheres; I plan to adjust the shaders on the spheres. The other is a fish-scale pattern that I got working in Houdini, but not in Maya, last quarter. I'd like to get this effect functioning in Maya so I can develop the shader and overall look further, and I plan to try out refractive materials on the motion capture geometry.

  1. First image: Chrome on fish-scale effect
  2. Chrome spheres in quick lighting test — the chrome spheres can be adjusted to be mesh lights
  3. Chrome sphere motion capture integrated into the car project by Adrienne Johnson (Fall 2016)

Project #3: Footage and Troubleshooting

On Saturday, Kevin and I shot footage in a dance studio for our motion capture integration project. We took a few different angles and tried varying the speed of the shots. We also took reference photography and got as many measurements as we could so that we could better replicate the space.

Video Footage


Chrome Sphere HDR Exposures

We took two shots of the chrome sphere: one with the side of the sphere facing the mirror in the dance studio, and one with it facing the open window. We hope to stitch these two images together; even if we can't for this project, the extra reference should still be useful.

HDR Colorspace Issue

Kevin and I noticed that after using Automate > Merge to HDR Pro in Photoshop, we got a result with a color range that we liked. However, once we saved the file out as a Radiance (.hdr) file, any time we re-opened it in Photoshop the colors were dramatically different: all of the warm red tones became dingy and greenish.

We tried saving the file on two different computers, but we had the same issue. We also double-checked that we were saving as 32-bit and that the option to automatically tone map was unchecked. When Kevin saved and re-opened the files on his computer, Photoshop gave an error message about linear vs. sRGB colorspace; when I did the same thing in Photoshop on my computer, no message appeared. So far we have not been able to resolve the issue. Next we plan to bring the .hdr into Maya and compare the results when we switch the settings between linear and sRGB.

screenshot_window_01
HDR after Merge to HDR Pro
screenshot_window_02
HDR after saving and opening up the file again in Photoshop

window_hdr_test_001

When I saved the HDR as a .jpg file instead, the colors were much closer to what we saw in Photoshop before saving.
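One way to see why a linear-vs-sRGB mix-up shifts colors this dramatically is to apply the sRGB transfer function by hand. The sketch below (plain Python; the pixel values are illustrative, not from our actual files) shows that decoding already-linear data a second time moves each channel by a very different amount, which shifts hue as well as brightness:

```python
def srgb_encode(x):
    """Linear value -> sRGB-encoded (display) value."""
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1 / 2.4) - 0.055

def srgb_decode(x):
    """sRGB-encoded value -> linear value."""
    return x / 12.92 if x <= 0.04045 else ((x + 0.055) / 1.055) ** 2.4

warm = [0.85, 0.45, 0.20]  # an illustrative warm tone, already linear
# Misreading linear data as sRGB-encoded decodes it a second time:
double_decoded = [srgb_decode(c) for c in warm]
print([round(c, 3) for c in double_decoded])  # each channel shifts by a different amount
```

A round trip (encode, then decode) is lossless, so the problem is not the .hdr format itself; it is some step interpreting the data with the wrong transfer function.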

Theta 360 Camera

We also took HDR photos with the 360 camera. Kevin used two different methods to bracket the exposures with the camera. However, while uploading the pictures we noticed an odd banding issue in the lower-exposure shots.

Method #1:


Method #2:


Lighting Integration References

Below are some images of our gray cube and sphere in the environment.

Human Figure References

For size reference within the space, we also took photos of Kevin moving through the studio: some from farther away, where Kevin's reflection was also visible in the studio mirrors, and some close up. This way we can estimate the necessary size for our motion capture assets.

Item Reference

For this project, we also took reference images of a few different materials. Kevin wants to create a ribbon effect, so we took pictures of pink satin ribbon to better understand its highlights. I am still deciding which materials I would like to recreate as shaders, so I took pictures of rock (to potentially expand on refraction further in Project 3) and of shiny iridescent ribbon. The iridescent ribbon is reference for iridescent materials: I have two effects I might use for Project 3, and I considered including shaders with unique properties. However, an iridescent shader might be too complex for this project given the time we have, so we took the pictures as future reference in case we can't use them now.

Measurements of the Studio Space

Room/floor space: 20′ wide, 35′ 7″ long

Mirror: 22′ 8″ long

Large window (in HDR sphere):

  • Window sill from ground: 2′ / 2′ 1″
  • Window with frame: 6′ wide (?)
  • Window itself: 59″

Window by door (near mirror): 59″ tall, 3′ wide
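Since scene scale is one of my open issues, it helps to convert these measurements into Maya's default working unit (centimeters) before placing anything. A quick sketch in plain Python (`ft_in_to_cm` is just my own helper name):

```python
def ft_in_to_cm(feet, inches=0):
    """Convert feet + inches to centimeters (Maya's default linear unit)."""
    return (feet * 12 + inches) * 2.54

# Studio measurements from the notes above:
room_width = ft_in_to_cm(20)        # 609.6 cm
room_length = ft_in_to_cm(35, 7)    # 1084.58 cm
mirror_length = ft_in_to_cm(22, 8)  # 690.88 cm
window_height = ft_in_to_cm(0, 59)  # 149.86 cm
print(room_width, room_length, mirror_length, window_height)
```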


Project #3: Quick Intro

For my Project #3, I would like to integrate motion capture into live action footage. I have included quick shots of tests from last quarter, including a "mishap" in which the effect started deforming wildly once the model moved. I have also included my chrome sphere motion capture effect that was used in Adrienne's car project for the Mill in Fall 2016.


I am working with Kevin Johnson to take footage. We discussed several possible locations but most would require official permission from SCAD:

  • SCAD Studio
  • SCAD Museum
  • Bergen Hall
  • Poetter Hall

I plan to upload movie tests from last quarter to this blog soon.

Project #2: Final Version for Submission

final_render-copy

Nuke Node Networks

The picture on the left is the foreground node network for this project, and the picture on the right is the background node network. The two networks are connected by a Merge (over) node; beneath it, I attached an edge blur node.
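For reference, the over operation that connects the two networks is simple enough to check by hand on premultiplied pixels: result = A + B × (1 − a). A minimal sketch in plain Python (the function name and sample values are my own, not from the actual comp):

```python
def merge_over(a_rgb, a_alpha, b_rgb, b_alpha):
    """Porter-Duff 'over': composite premultiplied A on top of B."""
    rgb = tuple(ca + cb * (1.0 - a_alpha) for ca, cb in zip(a_rgb, b_rgb))
    alpha = a_alpha + b_alpha * (1.0 - a_alpha)
    return rgb, alpha

fg = ((0.4, 0.2, 0.1), 0.5)  # premultiplied foreground pixel (e.g. the statue layer)
bg = ((0.2, 0.6, 0.8), 1.0)  # background plate pixel
rgb, alpha = merge_over(fg[0], fg[1], bg[0], bg[1])
print(rgb, alpha)  # the background shows through wherever foreground alpha < 1
```

This also makes clear why the foreground layer has to be premultiplied before the merge; otherwise the edges pick up fringing from the background.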

Hypershade Information

These are the hypershade network and shader values for my purple stone shader.

hypershade_network

shader_values

Texture Maps

The more saturated map is my refraction map, and the less saturated map is my sub-surface map.

Issues I would like to address for the re-submission:

  • The shadow animation – at the moment, it is not at all accurate.
  • Ambient occlusion (both on the object and the ground)
  • Creating more of a shadow for the statue itself
  • Refining reflections to show more detail
  • Noise
  • Adding darker irregularities to the shader, with variations in specularity
  • Getting the shader even closer overall – adding areas of transparency and a ramp mask to create a balance between refraction and sub-surface
  • Double-checking light placement and shadow length
  • Adding light wrap and edge blur (beyond the simple node I threw down)

Project #2: Composite

Below is the first attempt at compositing different render layers for Project 2.

So far I have the following elements/render layers:

  • Skylight (aiSkydome)
  • Sunlight
  • Lamplight
  • Shadow Pass (from Sunlight)
  • Shadow Pass (from lamp)
  • Ball Occlusion
  • Ground Occlusion


nuke_composite_001

I have included screenshots of the node networks. Right now, I cannot seem to get a good shadow for the statue. I would also like to have a Depth of Field element and a Vector Blur element (for the animation).

I would also like to spend more time developing the shader to more closely match my stone.