Maya 2020 crashing with GPU rendering

Message 1 of 3
jcwmoy
2950 Views, 2 Replies


So I have recently been wanting to try out the new GPU rendering in Arnold 6. I opened up an old file I was working on in Maya 2019 with XGen, swapped out the lights and shaders, and it seemed to work once. Ever since then it has always crashed. I am using an RTX 2080 Ti and have updated all my drivers. It seems to crash before anything even happens: I click to render, it thinks for about 30 seconds without anything happening, and then it crashes. It's as if there isn't enough memory, but I seriously doubt that. Would anyone know a way to fix this?

2 Replies
Message 2 of 3
Stephen.Blair
in reply to: jcwmoy

Are you using the latest MtoA? That's MtoA 4.0.1.1

Can you get an Arnold log?



// Stephen Blair
// Arnold Renderer Support
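
(For reference, Arnold logging can be turned on from the Diagnostics section of the Render Settings, or from the Script Editor. The Python sketch below is one way to do it; the attribute names assume a recent MtoA build and the log path is only an example, so verify both for your own setup.)

    # Run inside Maya's Script Editor (Python tab); requires MtoA to be loaded.
    import maya.cmds as cmds

    # Raise verbosity so the log captures setup details before a crash.
    cmds.setAttr("defaultArnoldRenderOptions.log_verbosity", 2)

    # Write the log to a file (path below is only an example).
    cmds.setAttr("defaultArnoldRenderOptions.log_to_file", 1)
    cmds.setAttr("defaultArnoldRenderOptions.log_filename",
                 "C:/temp/arnold_gpu.log", type="string")

After a crash, the end of that file usually shows how far Arnold got (GPU device selection, texture loading, and so on), which helps narrow down where it died.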
Message 3 of 3
sputknick
in reply to: Stephen.Blair

We've struggled with render crashes for months now, so I might have something to add. 

 

XGen and Arnold GPU are not friends, at all. We have some very complex scenes that render just fine in Maya 2020.2 and 2020.3, until we unhide the XGen hair (even a groom with a tiny amount of simple hair). Then we're lucky to get one frame before things go haywire. XGen hair made with Interactive Groom tends to behave better than XGen hair made other ways, but it still leads to crashes.

 

On much simpler scenes with one character, we have great GPU render success (thousands of frames) until we unhide any XGen asset. Then things get dicey. On a good day we'll get as far as 200 frames, but rarely more than 100, and crashes at 20 frames are not infrequent.

 

Switching to the latest versions of Maya and Arnold has made no difference.

 

So while we have chased our tails for months thinking our render crashes were the result of an Arnold GPU memory leak or bug, we now know that the bigger problem is Arnold and XGen, with perhaps a GPU memory leak or bug on top of that. It seems absolutely absurd to me to have a setup that can do 20 frames but not 30; there is no reason why rendering more frames in a row should become a problem.

 

We've also noticed that if you include a noise modifier in your XGen scene, the noise is calculated ad hoc: if you batch render, ask Maya to restart and begin rendering at, say, frame 20, and then restart again and render from frame 40, the hair will appear to jitter at those frames, because XGen recalculates the noise each time using some form of random seed. So your hair will jerk about at the beginning of every batch. Removing the noise modifier solves this problem completely, but wow, what a thing to have to figure out.
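
To illustrate the restart problem in the abstract (this is a hypothetical sketch, not XGen's actual code): if the noise for a frame depends on a seed chosen per session, two batch runs disagree about the same frame; if it depends only on a fixed seed and the frame number, restarts line up.

    # Hypothetical sketch of per-session vs. frame-stable noise seeding;
    # names and formulas here are illustrative only, not XGen internals.
    import hashlib
    import random

    def per_session_noise(frame, session_seed):
        # Seeded from a value chosen when the process starts:
        # the same frame gives a different result after every restart.
        return random.Random(session_seed * 100003 + frame).random()

    def frame_stable_noise(frame, fixed_seed=42):
        # Derived only from a fixed seed and the frame number:
        # identical across restarts, so no jitter at batch boundaries.
        digest = hashlib.sha256(f"{fixed_seed}:{frame}".encode()).hexdigest()
        return int(digest, 16) / float(16 ** 64)

    # Two batch runs that both happen to render frame 20:
    run_a, run_b = random.randrange(1 << 30), random.randrange(1 << 30)
    print(per_session_noise(20, run_a), per_session_noise(20, run_b))  # differ -> visible pop
    print(frame_stable_noise(20), frame_stable_noise(20))              # identical -> no pop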

 

Please, let's get Arnold GPU and XGen working. Please.