Arnold for Maya Forum
Rendering with Arnold in Maya using the MtoA plug-in.

Exposure control will never be fixed?

Message 1 of 14
kulaginb
573 Views, 13 Replies

Exposure control will never be fixed?

Is there any chance the problem with physical camera exposure will be solved? Right now the noise level is evaluated first and tone mapping is applied only afterwards. This is backwards: tone mapping should be applied first, and only then should the noise level be computed, as is done in V-Ray and other renderers.
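To illustrate the ordering issue being described, here is a minimal NumPy sketch (not Arnold code; `ev` and `aa_threshold` are hypothetical stand-ins for the camera exposure value and the adaptive-sampling noise threshold): a dark pixel whose noise estimate passes the threshold in linear radiance can fail it badly once exposure is applied first.

```python
import numpy as np

rng = np.random.default_rng(0)

# A dark pixel: 16 noisy linear-radiance samples around a small mean.
samples = rng.normal(loc=0.02, scale=0.01, size=16).clip(min=0.0)

def noise_estimate(s):
    """Standard error of the mean, a stand-in for an AS error metric."""
    return s.std(ddof=1) / np.sqrt(len(s))

ev = 4.0              # the artist brightens the image by +4 EV
aa_threshold = 0.005  # hypothetical adaptive-sampling noise threshold

# Current order (per the post): test noise on the raw linear samples.
err_linear = noise_estimate(samples)
# Requested order: apply exposure first, then test noise.
err_exposed = noise_estimate(samples * 2.0 ** ev)

# The exposed error is 2**ev times larger, so a pixel that passes the
# threshold in linear space can still need samples after exposure.
print(f"linear error : {err_linear:.4f}  vs threshold {aa_threshold}")
print(f"exposed error: {err_exposed:.4f}  vs threshold {aa_threshold}")
```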

13 REPLIES
Message 2 of 14
Message 3 of 14
kulaginb
in reply to: kulaginb

I see. So I understand the answer is: never. What I don't understand is why a "super physically correct renderer" doesn't have physically correct lighting and proper photographic exposure control. It feels like 2005...

Message 4 of 14
maxtarpini
in reply to: kulaginb

I think for a 'general' renderer it remains a delicate matter.

First, that is generally true only for adaptive sampling. However, it is not 'wrong'; it is 'unbiased'.

If you tonemap your samples you get 'biased' renderings, even if you inverse-tonemap them at the end, because you have already weighted your samples as a function of how bright they are. With low sampling, for example, you will also compress the whole HDR range. Say you have just two samples and one is a 'firefly': you can get rid of it by tonemapping, or you can keep asking for more samples until the variance of the mean of all the samples is reduced. With the former you reduce your HDR; with the latter your render times skyrocket. This is also where biased sampling reduces severe aliasing on bright edges, whereas unbiased sampling tends to 'expand' that aliasing by supplying more and more samples, trying to fully resolve it instead of averaging it. That is unbiased vs. biased.
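A minimal numeric sketch of this trade-off (plain Python, not renderer code; the Reinhard curve below is just an example tone operator): averaging the raw samples keeps the firefly's full energy, while tonemapping the samples before averaging crushes it, and inverse-tonemapping the result afterwards does not bring that energy back.

```python
import numpy as np

def tonemap(x):       # simple Reinhard operator, x / (1 + x)
    return x / (1.0 + x)

def inv_tonemap(y):   # its exact inverse
    return y / (1.0 - y)

# Two samples for one pixel: a normal value and a "firefly".
samples = np.array([0.5, 50.0])

unbiased = samples.mean()                        # 25.25 -> keeps the HDR energy
biased = inv_tonemap(tonemap(samples).mean())    # ~1.91 -> firefly mostly gone

print(unbiased, biased)
```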

On the other side, if your adaptive sampling is done without exposure and you think you can tweak that in post, then after changing the final exposure by a large amount you may begin to see under-sampled areas that were originally dark enough to slip through the AS threshold.

What to do? Properly light your scene so you don't have to rely so heavily on exposure/tonemapping. Understand that with denoising you go 'biased' anyway, so rely on that for firefly elimination, edge anti-aliasing, under-sampled areas, etc. Or keep saying you'll use V-Ray instead... 🙂

Message 5 of 14
madsd
in reply to: kulaginb

There is a distinct cultural difference between a developer who will accept almost any solution as long as it's roughly in the ballpark and gets into users' hands, and a development team with a different philosophy. I have seen the V-Ray developer state that he does not care about details; what matters to him is that some functionality gets implemented and works, disregarding any higher logic. I don't believe I have ever seen an Arnold dev write that something was OK just because it looked semi-useful or could be hacked in in some crude way to please clients. I have seen the V-Ray developer say such things.

Message 6 of 14
kulaginb
in reply to: kulaginb

For me and for many designers, "proper lighting" is based on data from catalogues of _real_ light sources. I want to place hundreds of lamps at 3600 kelvin and 700 lumens with IES distributions, and not have to go back and tweak each one of them; I want to control the image with EV and white balance.

About "v-ray guy" 🙂 - I used mental ray, iray, corona, and all ok with exposure. Only Arnold and ART ignore obvious solutions.

About developers, etc.: is Arnold a renderer for developers, or for end users? I remember a statement from Arnold's developers ten years ago: "Arnold for Max - never! Arnold is not for interiors!"

Message 7 of 14
madsd
in reply to: kulaginb

Well, you can do everything a client wants and end up with a huge mess in the codebase with no tight lines or structure, or you can work from a higher logic of unification, where things eventually get integrated or become possible through advances in hardware or coding techniques, and then interiors render well as a side benefit rather than as a main goal from the get-go.

Arnold never had prepasses to speed things up; the developer team believes in brute force all the way and did not fall into the trap of chasing speed to hoard clients.

Reality is catching up, though, and there won't be any need for cheap tricks that are extremely flicker-prone, because everything has come around and proved that brute force is the way to go.

Message 8 of 14
kulaginb
in reply to: kulaginb

Mads, aren't parameters like "Low Light Threshold", "Clamp AA Samples", and "Indirect Specular Blur" heresy against "divine brute force"? 🙂

OK, if the developers want to stay unbiased, it's simple: take all the lights and multiply/divide their levels by 2^EV with some coefficient, shift the light colors according to the white balance, and only then kick off the render 🙂
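A sketch of that workaround in plain Python (a made-up light list, not MaxScript or the Arnold API; `ev` and `wb_gain` are hypothetical values for the camera exposure and a white-balance RGB correction):

```python
lights = [
    {"name": "lamp_001", "intensity": 700.0, "color": (1.0, 0.84, 0.67)},  # ~3600 K
    {"name": "lamp_002", "intensity": 700.0, "color": (1.0, 0.84, 0.67)},
]

ev = -3.0                   # exposure value to bake into the lights
wb_gain = (0.9, 1.0, 1.15)  # white-balance gains per RGB channel

for light in lights:
    # Fold the camera exposure into each light's intensity
    # (the sign/coefficient depends on the camera model used).
    light["intensity"] *= 2.0 ** ev
    # Fold the white balance into each light's color.
    light["color"] = tuple(c * g for c, g in zip(light["color"], wb_gain))

print(lights)
```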

And I agree with Aaron Ross: https://answers.arnoldrenderer.com/questions/17011/make-denoiser-work-with-exposure-control.html 🙂

"I understand that Arnold came from an environment of motion picture production, where all tonemapping etc. is automatically assumed to be done in post. But as Arnold is now the default renderer for 3ds Max, whose user base has varying workflows and expectations, it behooves the Arnold team to consider the needs of all users. For example, architectural visualizers need full support for Exposure Control, or at least a mechanism to achieve similar functionality in Arnold. If these users are not considered important enough to support, then they will never adopt Arnold. Leaving them out in the cold means they will continue to turn to other renderers, such as V-Ray."

Message 9 of 14
aaronfross
in reply to: kulaginb

It is very clear to me that the Arnold developers never intended for the renderer to be used by anyone other than motion picture studios. And when Solid Angle was bought by Autodesk, the Arnold devs did not understand the implications of that. The facts on the ground are that Arnold is the default renderer in 3ds Max, and 3ds Max's primary use case these days is design visualization. The former Solid Angle crew have come a long way toward closing the gap between Arnold's original mission and the needs of its current user base, but there is still a very long way to go. Adaptive sampling, Noice, etc. are great. The eventual addition of light and shadow exclusion is great. But until 3ds Max Physical Camera Exposure Control is fully supported, there is always going to be a certain segment of the 3ds Max user base that has no use for Arnold, and they will continue using V-Ray, or some other renderer that actually meets their needs.

Message 10 of 14
madsd
in reply to: kulaginb

I don't think you can paint it that black and white. People are able to look at the whole package and weigh the possibilities as a whole.
It's not a question of whether a single feature is implemented or not, because with that way of thinking you can always claim that some missing feature will make everybody run away, which is not the case.

There is VFB+, which handles exposure and grading, tone mapping, and even some simple bloom and glare effects. It's currently compiled up to 2019; I suspect a 2020 version will surface at some point.

What I would much rather see than trying to blend into the current universe is a more analytic approach that integrates the render developers into the application's future design and improvement initiatives.

The environment dialog will be revamped, there is no question about it, so why start supporting something that is going to be rewritten at some point? Some feature sets could potentially be ported to a different, more relevant and current UI, a framework that works perfectly with .ass file export, and so on.

I think it is clear that the Arnold devs have the highest coding standards, and they are willing not to support something with instant fixes, because they know it's better to be robust, with very high integrity, than to fly all over the board trying to please random user requests in whatever way possible.

To me personally, you can choose to be a prostitute and do whatever the client says, or you can stand your ground and continue down the path of making very high-integrity software.

I prefer the first option if we are at a party and want to have fun for a couple of hours.
On Monday morning, when reality hits, I want high-integrity software.

Message 11 of 14
aaronfross
in reply to: kulaginb

Supporting key features such as Exposure Control, upon which thousands of users depend, is not "trying to please random user requests".

Message 12 of 14
kulaginb
in reply to: kulaginb

Mads, VFB+ is a _postprocess_, but I am talking about something else. "All thresholds should be recomputed based on their exposed alter-egos." (c) MZ 🙂
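The quoted idea can be read as the mirror image of exposing the samples: scaling the samples up by 2^EV before testing, or scaling the threshold down by the same factor, is the same comparison. A tiny sketch with hypothetical values (not actual Arnold settings):

```python
ev = 4.0
aa_threshold = 0.05
pixel_error = 0.01  # noise estimate measured in linear radiance

test_exposed_samples = (pixel_error * 2.0 ** ev) < aa_threshold
test_rescaled_threshold = pixel_error < (aa_threshold / 2.0 ** ev)

# Equivalent for any exposure, since 2**ev is always positive.
print(test_exposed_samples == test_rescaled_threshold)
```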

Message 13 of 14
madsd
in reply to: aaronfross

Correct, it will be nice when something practical surfaces.
I was addressing the general philosophy of developing tools, and the difference between different systems.
You can go about it in many different ways.

As a small detail, the Arnold material is by far the fastest to redraw in the UI; compared to the V-Ray material, for example, the V-Ray material is super slow.
And if the foundation is not super robust, or you rush things through to get them into users' hands, you will eventually end up with a ton of issues that hit back on the developers in the end.

Message 14 of 14
madsd
in reply to: madsd

(That is what we suffer from in the main application, because they just kept patching in plugins and feature sets on top of a foundation that was not future-proof, which is why they are now reverse-engineering their own code, refactoring, disassembling, and sorting things out.)

The high degree of discipline of the render developers needs to spread into general Max development. Then we will be in good shape going forward.
