Hello, I just wanted to ask: approximately when will the Arnold renderer be able to render fully functionally on the GPU?
Also, when will it be possible to render with CPU and GPU simultaneously, if that is planned?
That wasn't my question. I wanted to know when you can use the GPU without limitations, and whether it will be possible to use both CPU and GPU simultaneously to render...
The thing is, I plan to invest in hardware to speed up rendering with Arnold, but I can't decide whether to add another GPU or switch my CPU from Intel to AMD...
And to be honest, I'd like to stay with a powerful single-thread CPU for the simulation engines and render with the GPU.
>I wanted to know when you can use the GPU without limitations
Which limitations? Please be specific.
You yourself linked to the list of limitations in your first answer...
I'm sorry, but I don't understand how my question is not clear enough.
Right now I'm rendering with my CPU, but I would like to use my GPU instead, which is more powerful and should speed things up. But I can't, because of the limitations. It's not fully functional like the CPU renderer and throws errors when used with my OSL maps.
Furthermore, it's noisier at the same settings as the CPU, so you need to increase them, which ends up in more render time, which makes rendering on the GPU absolutely not an option.
So I'd like to know approximately when these limitations will be fixed, and when or if Arnold will be able to use both CPU and GPU simultaneously.
>throws errors when used with my OSL maps.
There you go. Which errors? Which OSL maps? A log would be very useful for us to help fix that particular problem (please read the text on the right of this page).
We can't give dates on when exactly things will be fixed. If there is something that you consider to be a higher priority then you should let us know.
>Furthermore, it's noisier at the same settings as the CPU, so you need to increase them, which ends up in more render time, which makes rendering on the GPU absolutely not an option.
Please read the link that I sent you earlier.
There's no exact "when". With each Arnold update we add more features and improvements to Arnold GPU. It's an ongoing process.
I've been using Arnold GPU, and you can see the improvements in each release. You can switch from CPU to GPU almost without any issues. I see the GPU as a good tool for lookdev. Also keep in mind that other GPU renderers need specific workflows to be used properly, and there is a limit to what you can do.
Thanks for the answer, that was all I asked for.
But can you tell me whether it's planned to implement rendering with CPU and GPU simultaneously?
I plan to invest in hardware to speed up rendering with Arnold, but I can't decide whether to add another GPU or switch my CPU from Intel to AMD without knowing when the GPU will be as usable as the CPU.
By the way, what is the workflow for Arnold GPU?
I was in your situation too. I was always an AMD guy, but because of the GPU renderer I went to Nvidia. So far my workflow with Arnold GPU is for lookdev. Check the list of things that aren't supported; I believe it's only around 5%, and it won't stop you from working.
For Arnold GPU, keep an eye on the Render Messages window. It will tell you if something is wrong. With any kind of GPU renderer you need to play it safe: go easy on texture sizes, and check that the geometry isn't too heavy. Remember that if you have a GPU with 11 GB of memory and your scene is bigger than that, it will not render.
At the moment I am on the beta, and some bits were implemented to help when using 4K or even bigger textures. Every release brings meaningful updates and improvements.
Thanks for the info.
Do you think it will be possible to render with GPU and CPU simultaneously?
I feel like the answer to this question is being avoided.
I believe the only renderer looking into that is RenderMan. I saw their latest update, and that is something they are developing. I think once Arnold GPU gets to a good position, that option may be considered, but I am not technical enough to understand how. CPU and GPU work in different ways, so matching the noise of both would be difficult. However, if you want faster renders, there are other things you can do, like .tx textures, Arnold stand-ins, and noice.
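For the .tx route: Arnold ships with the maketx utility (from OpenImageIO), which converts images into tiled, mipmapped .tx files that render faster and use less memory. A minimal sketch of the conversion; the file names here are made up, and the guard simply skips the step if maketx isn't on your PATH:

```shell
#!/bin/sh
# Convert a texture to a tiled, mipmapped .tx file.
# "diffuse.exr" / "diffuse.tx" are placeholder names; adjust for your project.
if command -v maketx >/dev/null 2>&1; then
    # --oiio tiles and compresses in a way optimized for OpenImageIO readers
    maketx --oiio diffuse.exr -o diffuse.tx || echo "maketx failed (missing input file?)"
else
    echo "maketx not found on PATH; it ships with the Arnold SDK and OpenImageIO"
fi
```

You would normally batch this over a whole textures folder once, then point your shaders at the .tx files.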
Hybrid rendering? I don't know about matching the noise.
We may consider it later, but not now.
But even when you use Arnold GPU, the CPU still does work like displacement and subdivision.
Don't V-Ray and Cycles have that already? I'm not sure whether it's the same as hybrid rendering (I'm hearing this term for the first time), but they do render with GPU and CPU simultaneously.
And yes, it's all about the faster and of course better results that I want to achieve. I tried .tx textures, but that option somehow faded out my textures.
What are Arnold stand-ins, and what is noice (is this the OptiX denoiser)?
>What are Arnold stand-ins, and what is noice (is this the OptiX denoiser)?
Please read the documentation:
https://docs.arnoldrenderer.com/display/A5AF3DSUG/Procedural
https://docs.arnoldrenderer.com/display/A5AF3DSUG/Denoising
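For what it's worth, noice is Arnold's standalone command-line denoiser, which is separate from the interactive OptiX denoiser. A hedged sketch of a minimal invocation, with made-up file names; check the denoising documentation for the exact flags and the AOVs your render needs to contain:

```shell
#!/bin/sh
# Denoise a rendered EXR with noice (requires an EXR with the right AOVs/variance data).
# "beauty.exr" / "denoised.exr" are placeholder names.
if command -v noice >/dev/null 2>&1; then
    noice -i beauty.exr -o denoised.exr || echo "noice failed (missing input or required AOVs?)"
else
    echo "noice not found on PATH; it ships with the Arnold SDK"
fi
```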
Pixar’s new combined CPU and GPU rendering system, RenderMan XPU, is what I am talking about. As far as I know, V-Ray CPU and V-Ray GPU are distinct from each other. Maybe open another thread for the .tx issue; I've never had that happen before.