Hey Everyone,
managed to source a 3080 and a 3090 for my workstation and thought I'd give Arnold a try in Houdini. To my huge surprise, using both GPUs is not helping but rather slowing down rendering A LOT; I have to manually select a single GPU to get decent performance.
Tried Windows 10 and Ubuntu 20.04 with the 5.4.1 demo, same issue, no love. Could anyone shed some light on what I'm missing?
Thanks!
We think this is a bug, and we're working on confirming it (e.g. getting the right cards for a test).
Just updated to the most recent HtoA and the issue is still there. Is there any progress you could share, Stephen?
We're working with NVIDIA to get a test hardware setup; I will check in and see how that's going...
We haven't been able to reproduce this. Does this happen for all scenes? Can you make this happen with a simple scene that would be easy to share with us?
Actually, the scene is as simple as a sphere and a plane with a skydome. I've tried the latest NVIDIA Game Ready and Studio drivers as well, on Windows and Ubuntu; nothing helps, same issue.
Let me know if you need a screen share so you can see what's going on!
If you can post the scene here that would help.
If you know how, ideally export it as a .ass file, double-check you can reproduce the speed difference using kick, and supply the debug-level logs (-v 6) from the 3080, 3090, and 3080+3090 renders. Otherwise, upload what you can along with debug logs from running in HtoA.
Houdini and kick logs attached. Overall, kick is around 15% faster, but the abnormal difference is still there. Interestingly, the 3080 seems faster than the 3090 in Houdini.
Houdini 3080: 1min 1sec
Houdini 3090: 1min 2sec
Houdini 3080+3090: 2min 53sec
Kick 3080: 58sec
Kick 3090: 51sec
Kick 3080+3090: 2min 32sec
Can't upload the .ass scene file as the HDR image it needs is over the upload limit. I'll send you a download link, just let me know your email.
These logs are great, thanks.
It really does look like the slowdown is in the GPU rendering itself, as we can see from the average iteration times:
3090: 00:00:51 1727MB | pixel rendering 0:40.40 (400 sample iterations @ min: 0:00.036, avg: 0:00.104, max: 0:00.178)
3080: 00:00:58 1724MB | pixel rendering 0:47.04 (400 sample iterations @ min: 0:00.032, avg: 0:00.121, max: 0:00.200)
both: 00:02:32 1929MB | pixel rendering 2:16.57 (400 sample iterations @ min: 0:00.020, avg: 0:00.345, max: 0:00.689)
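As a quick sanity check, the averages can be pulled out of those summary lines with a short script. This is just a sketch (the function name `avg_iteration_seconds` is mine), and it assumes the `avg: M:SS.mmm` field format shown in the log lines above:

```python
import re

# Matches the "avg:" field in Arnold's pixel-rendering summary line, e.g.
#   pixel rendering 0:40.40 (400 sample iterations @ ... avg: 0:00.104, ...)
AVG_RE = re.compile(r"avg:\s*(\d+):(\d+\.\d+)")

def avg_iteration_seconds(log_line):
    """Return the average sample-iteration time in seconds from one log line."""
    m = AVG_RE.search(log_line)
    if m is None:
        raise ValueError("no 'avg:' field found in line")
    minutes, seconds = m.groups()
    return int(minutes) * 60 + float(seconds)
```

On these numbers the dual-GPU average iteration (0.345s) is over 3x the single-GPU averages, which matches the roughly 3x wall-clock slowdown, so the extra time is being spent inside rendering rather than in setup overhead.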
Do you get a slowdown when you don't use an hdri texture (set the color to white)? That might make it still easier to share your scene by posting the .ass file here.
Could it be that running both GPUs at the same time is causing them to overheat which in turn causes them to downclock and run slower? Do you have a tool you can run to measure the GPU clocks and/or temperature while rendering? Perhaps https://nvidia.custhelp.com/app/answers/detail/a_id/3751/~/useful-nvidia-smi-queries could be used to get this info?
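For example, something along these lines could log temperatures and clocks while you render. A minimal sketch, assuming the `csv,noheader,nounits` output format of nvidia-smi (the query fields come from the page linked above; the function names are mine):

```python
import subprocess

# Query fields documented by `nvidia-smi --help-query-gpu`
QUERY = "index,name,temperature.gpu,clocks.sm"

def parse_gpu_stats(csv_text):
    """Parse the csv,noheader,nounits output of nvidia-smi into dicts."""
    stats = []
    for line in csv_text.strip().splitlines():
        index, name, temp, clock = (field.strip() for field in line.split(","))
        stats.append({"index": int(index), "name": name,
                      "temp_c": int(temp), "sm_clock_mhz": int(clock)})
    return stats

def read_gpu_stats():
    """Call nvidia-smi (requires an NVIDIA driver on PATH) and parse it."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=" + QUERY, "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True).stdout
    return parse_gpu_stats(out)
```

Calling `read_gpu_stats()` in a loop every second or so during a dual-GPU render should show whether either card drops its SM clock as it heats up.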
Ran kick renders with a simple dome light, no HDR, same results:
3090 / 23sec
3080+3090 / 1min 13sec
GPU temps are not going above 60°C, pretty far from throttling temps. Obviously I'll water-cool the GPUs once the blocks are available.
Out of curiosity I ran Redshift and V-Ray benchmarks; no issues there, great performance.
I've attached the kick logs and the .ass test scene. Appreciate your help, Thiago, hope we can figure something out!
Hi Thiago,
is there any update on the issue? Let me know if you need more test scenes.
Thanks!
We've passed this over to nvidia for them to try to repro and are waiting to hear back.
Hey Thiago,
is there any update on the issue? It's been almost a month, and our licenses will expire soon. I need to decide whether to extend the subscription or look for an alternative render engine for future projects.
Thanks,
Chris.
We had one user report a similar problem, with two GPUs rendering in Arnold several times slower than one GPU, and this is how they solved it. Worth a try?
Apparently SLI needed to be turned on in my NVIDIA settings, but the option was NOT there. I tried plugging my monitor into GPU 0 / Slot 1 and the SLI setting magically showed up! Apparently that's a requirement: you must have the monitor plugged into Card 1 for SLI to appear...
Thanks Thiago!
The reason it won't work for us is that the RTX 3080 has no SLI/NVLink support, so the option won't show up in the control panel; we'd need two identical 3090s to have the SLI option available.
We've managed to reproduce it. No word yet on when it'll be fixed, but it looks like it'll be high priority because it appears to be common to users with certain GPU configurations.
About the license: you can keep testing Arnold even after it expires. It will just render with watermarks, but hopefully that's OK for testing purposes?