Hello,
Coming from MtoA to HtoA, I've noticed there isn't any "Flush All Caches" option to force resources to unload and free up RAM.
I had memory building up and no way to release it without closing and re-opening Houdini.
Using Houdini's Cache Manager, closing and reopening the Render View, and clicking Render (i.e. regenerating shadow maps and re-rendering) didn't release the RAM.
Any reason why this feature doesn't seem to be in HtoA?
All the best,
Andrew
HtoA: 5.6.0.0
Arnold: 6.2.0.0
Houdini: 18.5.462
OS: Windows 10
Flush Cache clears the texture cache, nothing else. So by default, that's 4096 MB of RAM.
We have an open enhancement ticket for adding the same Flush Cache menus as in MtoA.
You can flush individual textures, for example:
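For instance, a minimal sketch of flushing one texture via the Arnold Python API, assuming you are in a Python session with Arnold loaded (e.g. the Houdini Python shell with HtoA); the `arnold` module is not importable outside such a session, so the import is kept inside the function:

```python
def flush_texture(path):
    """Invalidate a single texture in Arnold's texture cache so it is
    re-read from disk on the next render. Sketch only; requires a live
    Arnold session (e.g. Houdini with HtoA loaded)."""
    import arnold  # only importable inside an Arnold-aware session
    # AiTextureInvalidate drops the named texture from the cache;
    # Arnold reloads it from disk the next time it is sampled.
    arnold.AiTextureInvalidate(path)
```

This is handy when you have overwritten one texture on disk and don't want to flush everything else.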
Ok thanks for the info!
So technically, clicking "Render" in the Render View (i.e. regenerating shadow maps and re-rendering) should do the same as an "Update Full Scene" in Maya?
Any reason or ideas why the RAM wasn't being released?
It was quite a simple scene with displacement/subdivision that I was adjusting.
RAM usage built up to over 30 GB, but after restarting Houdini the same scene only used 8 GB.
Andrew
Hey,
I have asked about this before as well. It's really important to get this into HtoA / Solaris. Currently, when you export a texture from a DCC on Windows and use it in Houdini, Houdini locks the path because the file is in use. This means that while look-deving you constantly have to write new versions to disk and version up the files in the texture reads. If we could clear the caches, you could flush, overwrite the file, and restart the render. It's a workflow issue.
For flushing the caches globally you can try:
import arnold
arnold.AiUniverseCacheFlush(arnold.AI_CACHE_ALL)
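If you don't want to throw away everything, the same call accepts narrower flags. A small sketch, again assuming an Arnold-aware Python session (the `arnold` module won't import elsewhere, so the import is deferred into the function):

```python
def flush_texture_cache():
    """Flush only Arnold's texture cache, leaving other caches alone.
    Sketch only; requires a live Arnold session (e.g. Houdini + HtoA)."""
    import arnold  # only importable inside an Arnold-aware session
    # AI_CACHE_TEXTURE limits the flush to texture data;
    # AI_CACHE_ALL (as above) clears every flushable cache.
    arnold.AiUniverseCacheFlush(arnold.AI_CACHE_TEXTURE)
```

Restart the render afterwards so the freed resources are actually rebuilt from the current files on disk.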