The purpose of this assignment was to provide an opportunity to get familiar with Slim as a shading tool within Maya by modeling, shading and lighting three different cupcakes. I found this project to be quite challenging, as the majority of my shading experience is with hard-surface elements, and cupcakes are about as far from hard surface as you can get.
Malcolm generously provided us with a tasty cupcake to start us off with reference collection. The material properties were observed from this cupcake and various images online. It became clear that the icing had a very organic shape and a very high amount of subsurface scattering.
The first order of business was to produce a low-resolution mesh to sculpt onto in Mudbox. I made use of the PTex features added in Mudbox 2012 to avoid UVing the very organic shape of the icing. Another new technique I tried on this project was vector displacement with PTex. Working with PTex and Mudbox was a new experience for me, since I don't really have any sculpting experience.
Applying the PTex information with RenderMan is very simple. First, the PTex attributes must be added to the relevant objects, along with a subdivision scheme if working with PTex displacements:

- Shape Node -> Attributes -> RenderMan -> PTex Support
- Shape Node -> Attributes -> RenderMan -> Subdiv Scheme
In order for PTex displacements or color maps to be used correctly in Maya, a new base mesh must be exported from Mudbox reflecting any large-scale sculpting changes. PTex vector displacement was more restrictive than I expected. For example, if cusps are formed with vector displacement, it can be impossible to scale the effect to taste in Maya without various bits of geometry turning inside out.
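The inside-out problem comes from how a displacement multiplier works: each base point moves along its own stored vector, so scaling the multiplier past the point where neighbouring vectors cross flips the local geometry. A minimal sketch (with made-up positions and vectors, not data from my actual mesh):

```python
# Vector displacement: displaced = base + scale * disp, applied per point.
def displace(base, disp, scale):
    return tuple(b + scale * d for b, d in zip(base, disp))

# Two neighbouring points whose displacement vectors converge toward a cusp.
p0, d0 = (0.0, 0.0, 0.0), ( 0.3, 1.0, 0.0)
p1, d1 = (1.0, 0.0, 0.0), (-0.3, 1.0, 0.0)

for scale in (1.0, 2.0):
    a = displace(p0, d0, scale)
    b = displace(p1, d1, scale)
    # Once a's x coordinate passes b's, the edge between them has inverted,
    # and no amount of tweaking the multiplier alone will untangle it.
    print(scale, a[0] < b[0])  # -> 1.0 True, then 2.0 False
```

At scale 1.0 the points keep their order; at 2.0 the vectors have crossed and the edge is inverted, which is exactly the "turning inside out" behaviour described above.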
This cupcake represented my first experience using subsurface scattering in a shader of my own creation. Like most things with RMS 3, it is not immediately apparent how to get subsurface scattering working with RenderMan. First, I will describe in general how RenderMan performs subsurface scattering.
Since subsurface scattering was implemented in RenderMan before it supported raytracing and photon mapping, a point-cloud method is used to perform the necessary calculations. First, a point cloud is generated containing all the incident light hitting any object with subsurface scattering enabled (the SSRender pass).
Next, based on parameters in the material, the subsurface calculations are performed and a second point cloud is generated (the SSDiffuse pass). Optionally, a brick map can be generated at the end as an optimization.
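The SSDiffuse step can be thought of as a gather: each output point sums irradiance from the baked point cloud, weighted by how far the light would have to travel through the material. The sketch below uses a simple exponential falloff as a stand-in for RenderMan's actual diffusion model, and the point-cloud values are invented for illustration:

```python
import math

def scatter(shade_pt, cloud, mean_free_path):
    """Toy SSDiffuse: gather irradiance from cloud points, attenuated
    by an exponential falloff in distance (a simplification of the
    real diffusion calculation)."""
    total = 0.0
    for pt, irradiance, area in cloud:
        d = math.dist(shade_pt, pt)
        total += irradiance * area * math.exp(-d / mean_free_path)
    return total

# Hypothetical baked point cloud: (position, irradiance, area).
cloud = [((0.0, 0.0, 0.0), 1.0, 0.1),
         ((0.2, 0.0, 0.0), 0.8, 0.1),
         ((1.5, 0.0, 0.0), 0.9, 0.1)]

near = scatter((0.1, 0.0, 0.0), cloud, mean_free_path=0.5)
far  = scatter((3.0, 0.0, 0.0), cloud, mean_free_path=0.5)
print(near > far)  # light nearby dominates; distant light barely bleeds in
```

A larger mean free path lets light bleed further through the surface, which is why the icing, with its heavy scattering, needed such soft-looking results.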
With recent releases of RenderMan, subsurface scattering can also be computed with raytracing. These options can be selected in the Render Settings -> Features tab, under Shading -> Sub-Surface Baking.
There are two key ways to hook up subsurface scattering with RenderMan in Maya. First, subsurface attributes can be added to traditional materials inside Hypershade:

- Material Node -> Attributes -> RenderMan -> Add Subsurface Scattering
Subsurface attributes can also be added to the All-Purpose material inside Slim by attaching the APSubsurfaceScattering node to the Subsurface slot of the All-Purpose material. Enable Bake to generate the required passes at render time.
Specific lights can be used for the scattering via Maya sets: select a light set in the Light Set option of the RenderMan pass settings.
To better understand global illumination options, I built a simple Cornell box to test the various solutions.
The first option is a RenderRadiosity pass with the RenderMan Environment Light. I set its color to black so that it simply acted as a controller for GI. RenderRadiosity with color bleeding enabled generates a point cloud that acts as one 'bounce' of global illumination.
Next, the RenderRadiosity pass can be combined with FilterApproxGlobalDiffuse. FilterApproxGlobalDiffuse runs the RenderRadiosity point cloud through the same external application as subsurface scattering and can allow for multiple additional bounces of global illumination.
Finally, the RenderGlobalDiffuse3d pass can be used for true raytraced global illumination. The results are stored in a brick map. I was unable to get this option to a smooth result in a timely manner, so I will likely stick to FilterApproxGlobalDiffuse for most cases.
For rendering, I used three-point lighting with spotlights, deep shadows, and an HDR for extra indirect lighting and environment reflections. I attempted to use area lights, but RenderMan would consistently fail to pick up my setting changes, and it eventually became more trouble than it was worth. I matched my key to the key of my HDR, which was a bright kitchen window.
As I come from a mental ray background, where it is quite simple to create blurry tinted refractions and reflections, one issue that has always given me discomfort with RenderMan is the lack of a simple true raytraced material. For this assignment, I chose a cupcake with a sugar crystal shard made of translucent hard candy. The shard both blurs the image that refracts through it and tints it with absorption. I wrote a shader to generate these effects, and a more detailed writeup can be found here.
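The tint-with-absorption effect follows the Beer-Lambert law: transmittance falls off exponentially with the distance the refracted ray travels through the candy, and a per-channel absorption coefficient is what produces the color shift. A small sketch with illustrative coefficients (not the values from my actual shader):

```python
import math

def transmittance(sigma_rgb, distance):
    """Beer-Lambert: T = exp(-sigma * d), evaluated per channel."""
    return tuple(math.exp(-s * distance) for s in sigma_rgb)

# A reddish candy: absorb green and blue more strongly than red.
sigma = (0.2, 1.5, 1.8)

print(transmittance(sigma, 0.5))  # thin part of the shard: mild tint
print(transmittance(sigma, 2.0))  # thick part: strongly red-shifted
```

Because the falloff is exponential, thin edges of the shard read as nearly clear while thick regions saturate toward the tint color, which matches how hard candy looks in reference.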
Overall, I learned a lot on this project about shading organic objects. Dealing with all the PTex back-and-forth with Mudbox proved the most annoying part, but I believe the benefits still outweigh the time I would have spent UVing. The biggest drawback, however, is the inability to open a PTex file in Photoshop for quick fixes, or to apply image processing to quickly derive glossy/reflection maps.
RenderMan performed well, but a number of us had a big problem with the RenderMan settings window becoming disconnected from what was actually happening underneath, leaving little option but to restart Maya, and often to set up the scene again after exporting geometry. I am sure there are workarounds for this, but they were not immediately apparent.