One way to study what is going on in a render is to use Output Statistics.
To enable this feature, go to Render Settings and, under the Advanced tab, Statistics, check Output Statistics to turn it ON. Then enter a path in the Statistics XML File field:
C:/myProj/mystatistics.xml
After each render you should get a mystatistics.xml file. Drag and drop it into the Firefox browser.
Look under the Memory tab and search for Radiosity Cache to see how much memory the render used. RenderMan defaults to 102400 kilobytes (100 MB) for the Radiosity Cache. If a render uses more than 100 MB, it is better to raise the default to match the memory used; otherwise rendering will slow down.
Another piece of info to look for in the XML file is Bounds. This is where you can find out whether your displacement bound has been set too low or too high.
Other interesting stats to look at are under Texture. Here you can see how many texture hits occur at each resolution and the memory used.
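If you would rather skim those numbers in a script than in a browser, the standard library can pull them out of the XML. A hedged sketch: the element and attribute names below are my own stand-ins for whatever schema your mystatistics.xml actually uses, so inspect the real file and adapt the lookups.

```python
# Hedged sketch: extract memory figures from a statistics XML file.
# The <memory name="..."> shape below is an assumption, not PRMan's
# actual schema -- open your own mystatistics.xml to see the real tags.
import xml.etree.ElementTree as ET

SAMPLE = """<statistics>
  <memory name="Radiosity Cache">131072</memory>
  <memory name="Texture">45000</memory>
</statistics>"""

def memory_usage_kb(xml_text):
    """Return {name: kilobytes} for every <memory> element found."""
    root = ET.fromstring(xml_text)
    return {m.get("name"): int(m.text) for m in root.iter("memory")}

usage = memory_usage_kb(SAMPLE)
# Flag caches that outgrew the 102400 KB (100 MB) Radiosity Cache default.
for name, kb in usage.items():
    if kb > 102400:
        print("%s used %d KB -- consider raising its cache size" % (name, kb))
```

Once the names match your file, the same loop can flag any cache that outgrew its default.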
PRMAN Notes
Saturday, February 15, 2014
Getting displacement out correctly
Here's what I find a little weird about using grayscale versus RGB displacement maps.
If you create a 50% gray 16-bit grayscale TIF image, convert it into tex, and use it as a displacement map, the geometry still changes even though the image is 50% gray.
However, if you take the same grayscale TIF image and change it into a 16-bit RGB TIF, the geometry stays the same. That's something I have to live with at the moment.
To correctly set up displacement in Slim, here are a few things to do:
- Create a float MayaImageFile node, load the tex file, and use Luminance RGB as the Channel. Set the S and T Filtering to 0.
- Connect it to the Float Displacement handler in GPSurface. Within the GPSurface node, set Displacement Encoding to Centered. Adjust Displacement Scale accordingly.
- Remember to set the Displacement Bound in either the GPSurface or MaterialEnsemble node. Also remember to turn ON Trace Displacement (if using raytracing) inside the MaterialEnsemble node.
You can also plug a normal bump map texture into GPSurface for very fine details.
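To picture what the Centered encoding in the steps above does, here is a minimal Python sketch. The formula is the usual centered convention (0.5 maps to zero displacement), not code lifted from GPSurface itself:

```python
def centered_displacement(texel, scale):
    """Centered encoding: 0.5 -> no displacement, 0.0 -> -scale/2,
    1.0 -> +scale/2. (The common convention; GPSurface's exact
    internals may differ.)"""
    return (texel - 0.5) * scale

print(centered_displacement(0.5, 2.0))  # 50% gray: 0.0, surface untouched
print(centered_displacement(1.0, 2.0))  # pure white: pushed out by 1.0
print(centered_displacement(0.0, 2.0))  # pure black: pulled in by 1.0
```

Under this convention a 50% gray map should leave the surface alone, which is why the grayscale-vs-RGB behavior above feels weird.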
That annoying rendering problem with the camera to use
If you are like me and have trouble getting RenderMan to use the camera you want as the render camera, especially when rendering to IT, here's the command line that solves the problem:
rmanLoadPlugin; setCurrentRenderer renderMan; rmanChangeRendererUpdate; rmanRenderPass("Final", "yourCamShape");
Notice that the pass used is Final. There are in fact Final and Preview passes by default, something you can take advantage of when doing interactive rendering.
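Since that one-liner is easy to mistype, here is a small Python helper that just assembles the MEL string for a given camera shape and pass name. The helper itself is plain string formatting and runs anywhere; inside Maya you would hand the result to `maya.mel.eval`.

```python
def render_pass_mel(camera_shape, pass_name="Final"):
    """Build the MEL snippet that forces RenderMan to render the given
    camera shape through the named pass ("Final" or "Preview")."""
    return ('rmanLoadPlugin; setCurrentRenderer renderMan; '
            'rmanChangeRendererUpdate; '
            'rmanRenderPass("%s", "%s");' % (pass_name, camera_shape))

# Inside Maya: maya.mel.eval(render_pass_mel("yourCamShape", "Preview"))
print(render_pass_mel("yourCamShape", "Preview"))
```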
Wednesday, August 21, 2013
DiffuseHitCache
DiffuseHitCache is a PRMan attribute that has RenderMan do a lookup into a pointcloud or brickmap file for the indirect illumination at the ray hit point. In a normal scenario, if you have a blue floor and a grey ball sitting on it, you should see the blue color bleeding onto the base of the sphere. The result I am hoping to achieve is to have the blue floor reflect a color other than blue.
If you want the floor to bounce off a yellow color regardless of what the floor color is, you first need the floor to be yellow. Next, create an RMSGIPtcLight and set the Bake Type to Create RenderRadiosity. This generates a pointcloud file with a Cs channel representing the color of the scene objects.
We will be using this pointcloud file for RenderMan to look up during the indirect illumination pass. You could run a brickmake command to convert the pointcloud file to a brickmap, which does the same thing apart from the speed difference. For a simple scene like this, using a brickmap will not speed things up much anyway.
For this to work, the point-based way of achieving indirect illumination will not do; we need to switch over to raytracing. Delete the RMSGIPtcLight node and any Render Radiosity pass. Create an RMSGILight node, then add a few PRMan attributes to the floor object.
Diffuse Ray Shading - set this to Cache so that at indirect illumination time, RenderMan looks up a pointcloud or brickmap file.
Diffuse Hit Cache - this is where you specify the path to the pointcloud or brickmap file. Beware that you have to write it in this format:
file: D:\myFile.ptc
Pre Shape MEL - the final step is to add this attribute with the following command:
RiAttribute "shade" "string diffusehitcolorchannel" "Cs";
This instructs RenderMan to look up the Cs channel in the pointcloud file.
Now, the floor can be any color, but the color that bounces off the floor will be yellow.
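The two string formats above (the `file:` prefix for the cache path and the Pre Shape MEL line) are easy to get wrong, so here is a hedged Python sketch that just builds those strings. The helper names are mine, and the `file:` spacing simply mirrors the example above:

```python
def diffuse_hit_cache_value(path):
    """Format a pointcloud/brickmap path the way the Diffuse Hit Cache
    attribute expects it (mirroring the 'file: ...' example above)."""
    return "file: " + path

def diffuse_hit_color_channel_mel(channel="Cs"):
    """Build the Pre Shape MEL line that tells RenderMan which pointcloud
    channel to read during the diffuse hit lookup."""
    return 'RiAttribute "shade" "string diffusehitcolorchannel" "%s";' % channel

print(diffuse_hit_cache_value("D:/myFile.ptc"))
print(diffuse_hit_color_channel_mel())
```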
[Image: Point Cloud file of the scene]
Wednesday, August 7, 2013
Point Clouds and Brickmaps in RMSGIPtcLight
In RenderMan Studio there is an option to cache GI into point clouds or brickmaps, using the RMSGIPtcLight node.
There are three options for Bake Type.
The information below about point cloud and brickmap usage is to the best of my knowledge at the moment.
1) rmanRenderRadiosityPass
- This first method simply generates a point cloud file (rmanRenderRadiosityPass.0001.ptc). The point cloud stores the direct illumination data, and the actual indirect calculation (the last bounce) still needs to happen during the beauty render. Having a point cloud file allows RenderMan to load the point cloud scene into memory and use it for the indirect illumination calculation.
2) rmanPartialFilterApproxGlobalDiffusePass
- This second method first outputs a rmanRenderRadiosityPass.ptc point cloud file, the same as the first method, then runs a ptfilter command on that point cloud to generate the next one, rmanPartialFilterApproxGlobalDiffusePass.ptc, which stores indirect illumination data. The name Partial suggests that the last ray bounce still needs to be calculated at beauty render time. The benefit of this second pass is that it enables more than one bounce to be performed.
3) rmanMakeFilterApproxGlobalDiffusePass
- The third method generates two point cloud files and a brickmap file. As usual, rmanRenderRadiosityPass.0001.ptc is generated first. A ptfilter command then processes this point cloud to generate the second point cloud file, rmanFilterApproxGlobalDiffusePass.0001.ptc, which holds more data, including indirect diffuse and occlusion. Finally, brickmake is run on the second point cloud to generate a brickmap file.
So the question is which method is the best option for a given scenario. As a very early conclusion, the brickmap method seems best because it is more memory efficient: brickmap data has multiple resolution levels that can be loaded as needed. For example, buildings far away in the background use the first level, the most "light" version. However, a point cloud pre-pass is needed for brickmap generation.
A point cloud file, on the other hand, is unorganized data. A more efficient way to create a point cloud file is to use the Organized Point Cloud format, which stores its data in an octree. But then again, a pre-pass point cloud file is still needed.
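To make the multi-resolution point concrete, here is a toy Python illustration (not PRMan code) of the idea that geometry farther from camera can read a coarser, cheaper brickmap level. The level numbering is my own assumption, with 0 as the coarsest:

```python
import math

def brickmap_level(distance, near=10.0, levels=4):
    """Toy detail picker: close objects read the finest level,
    and every doubling of distance past `near` drops one level,
    down to level 0 (the 'lightest' version mentioned above)."""
    if distance <= near:
        return levels - 1                      # close by: finest data
    drop = int(math.log2(distance / near)) + 1  # levels lost with distance
    return max(levels - 1 - drop, 0)

print(brickmap_level(5.0))    # foreground object: finest level
print(brickmap_level(500.0))  # buildings far in the background: level 0
```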
Full raytracing would be the more elegant solution then :)
Friday, August 2, 2013
Sunday, July 28, 2013
Multi Displacement Setup in Slim
Well, not exactly multi, since I have not done a setup with more than two displacement maps. Usually two displacement maps are enough for me, for now.
The setup simply adds map1 to map2 and scales the final value by 0.5. Here's the image that best explains what I did.
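In other words, the combine is a plain average. A tiny Python sketch of why the 0.5 scale matters: adding two maps and halving keeps the result inside the original 0..1 range, so 50% gray in both maps still decodes to zero displacement under a Centered encoding:

```python
def combine_displacement(m1, m2):
    """Average two displacement texel values: add the maps, then halve.
    This keeps the result in 0..1, so a Centered decode afterwards
    still treats 0.5 as 'no displacement'."""
    return (m1 + m2) * 0.5

print(combine_displacement(0.5, 0.5))  # two neutral grays stay neutral: 0.5
print(combine_displacement(1.0, 0.5))  # 0.75
```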