Testing Arnold GPU with an RTX 2080 Ti 11GB, on a fairly simple scene: a couple of materials and not particularly heavy geometry, basically an interior arch-viz scene. But even with the RTX 2080 Ti, my scene crashes for no apparent reason, and I get no error messages. Is rendering a 4K preview too much? Sometimes it starts rendering, other times it doesn't render at all, and I don't hear the GPU working. I have an RTX 2060 back home, so I am going to test with that.
Also, when I populate the GPU cache, I constantly get a message asking if I want to stop the process, even though I am just waiting for the cache to finish. I have to keep clicking No.
Is there a way I can debug and see what is causing these issues? My scene is only 3 million polygons. I am using the Studio drivers, in 3ds Max.
Can you get an Arnold log?
Set the log verbosity to Info and enable file logging (both settings are in the Render Settings).
Working on the GPU again. I will let you know if any of those situations happen.
Does Arnold GPU support the denoiser output? I get this message after rendering with the GPU:
Start denoising (patch radius 3, search radius 9, variance 0.5)
Denoising RGBA
Could not find AOV with source RGBA and filter variance_filter
Could not find variance for AOV "RGBA", skipping denoise.
Finished denoising
Arnold GPU doesn't support all the filters yet, and for noice we need the variance filter, so no variance AOV is written and the denoise step is skipped.
OK, I didn't know that. I checked the GPU limitations page and it didn't mention anything. So far the scene seems stable. Will the GPU populate-cache error appear in the log?
Does this workflow work in Production Render?
Set the Max. Camera (AA) samples in the range of 30 to 50 (depending on the scene, you might go closer to 100). In general, the max samples should be a large value. With a large max samples, quality is controlled by the noise falling under the threshold, instead of by hitting the max AA cap. Set the Adaptive Threshold to something like 0.015 or 0.02. For a noise-free render, lower the threshold value, maybe even as far as 0.010. Set the Camera (AA) samples to around 3 or 4. One of the few reasons to go higher with AA is motion blur. The higher the number of Camera (AA) samples, the less of a speedup you'll get from adaptive sampling.
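The interaction of those three controls can be sketched as a toy loop. This is illustrative Python only, not Arnold's actual implementation; the constants just mirror the names of Arnold's AA_samples, AA_samples_max, and AA_adaptive_threshold render options:

```python
import random
import statistics

AA_SAMPLES = 3            # minimum Camera (AA) samples per pixel
AA_SAMPLES_MAX = 50       # large cap: quality is driven by the threshold
AA_ADAPTIVE_THRESHOLD = 0.015

def sample_pixel(shade, threshold=AA_ADAPTIVE_THRESHOLD,
                 min_spp=AA_SAMPLES, max_spp=AA_SAMPLES_MAX):
    """Keep sampling until the noise estimate drops below the threshold,
    or the max sample cap is hit (a crude stand-in for adaptive sampling)."""
    samples = [shade() for _ in range(min_spp)]
    while len(samples) < max_spp:
        # crude noise estimate: standard error of the mean
        err = statistics.stdev(samples) / len(samples) ** 0.5
        if err < threshold:
            break
        samples.append(shade())
    return sum(samples) / len(samples), len(samples)

# A noisy pixel keeps sampling toward the cap; a flat pixel stops early.
random.seed(0)
noisy = lambda: random.gauss(0.5, 0.2)
flat = lambda: 0.5
_, n_noisy = sample_pixel(noisy)
_, n_flat = sample_pixel(flat)
```

This is why a large max with a low threshold behaves well: smooth regions stop at the minimum samples, while only the noisy pixels pay for extra work, and a high minimum AA erodes that saving.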
OK, so one of the issues was not enough memory. I assume I need to flush the caches?
[Arnold]: [gpu] an error happened during rendering : Unknown error (Details: Function "_rtContextLaunch2D" caught exception: Encountered a CUDA error: cudaDriver().CuEventSynchronize( m_event ) returned (718): Invalid program counter)
Getting this error with my RTX 2060. The log doesn't say anything about it. This is what I mean about the lack of consistency: this scene was rendering on GPU before.