I have an interior scene that is crashing badly. I've been trying to debug it for two days now. It's not using more than 2 GB on the GPU. There are no textures yet, just one env map. OCIO color management.
It's also crashing on CPU.
But... if I save the whole geometry as an Alembic and replace all the geos, it renders fine.
The scene renders fine as is in Mantra, and in the demo version of Redshift.
Thanks,
Did you try isolating the problem like this?
Yep, did that, and I will retry... but it crashes randomly on different geos, so why doesn't it crash when I replace the geometry with Alembic? I tried everything: deleting almost all attributes, and the shoppath attribute just in case. I recreated the scene by copying from one instance of Houdini to another one... I even saved all the geos as bgeo.sc so there is no processing before rendering. Nothing works. I even loaded the geo in Blender and it renders fine in Cycles RTX.
I will retry hiding geos just in case I missed something....
Thanks
OK, I was able to render all the geos separately but not together. If I hide half of the scene, it renders; then the other half also renders; but with all the geos together it crashes.
Thanks
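Halving the scene like that can be scripted as a bisection over the object list. A minimal sketch in plain Python, where `render_ok` is a hypothetical callback that returns True when rendering the given subset succeeds (in Houdini it might toggle display flags and launch a test render):

```python
def bisect_bad_combo(objects, render_ok):
    """Narrow a failing object list down by repeatedly splitting it.

    `render_ok(subset)` is a user-supplied callback returning True
    when that subset of objects renders without crashing.
    """
    suspects = list(objects)
    while len(suspects) > 1:
        half = len(suspects) // 2
        left, right = suspects[:half], suspects[half:]
        if not render_ok(left):
            suspects = left          # crash reproduces in the left half
        elif not render_ok(right):
            suspects = right         # crash reproduces in the right half
        else:
            # Both halves render fine on their own: the crash needs
            # objects from both halves (exactly the situation described
            # above), so simple bisection can't shrink it further.
            break
    return suspects
```

When the crash only happens with everything together, the loop stops early, which is itself a useful diagnostic: the problem is an interaction, not a single bad object.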
Can you provide the complete debug level log file (see instructions to the right of this page) for the CPU render that crashed? It might tell us what is going on.
Good morning Stephen, I did some more tests...
I created a .ass file of my scene and it renders perfectly with kick. So it looks like HtoA somehow has trouble writing the temporary .ass file and crashes?
Thanks!
Possibly. In your log we see
00:00:07 2444MB | [htoa.texture] Converted 0 textures in 0:00:00 (0 skipped, 0 errors)
00:00:09 2590MB ERROR | signal caught: error C0000005 -- access violation
and in my log I have:
[htoa.texture] Converted 0 textures in 0:00:00.00 (1 skipped, 0 errors)
[htoa.session] SOP cache cleared. 2 nodes unloaded.
[ass] writing scene to c:\users\stephe~1\appdata\local\temp\houdini_temp\htoa_Stephen_Blair\11960_8332b843-5a13-4704-b96a-0422c4270252.ass (mask=0xFFFF) ...
[ass] wrote 61497 bytes, 7 nodes in 0:00.01
HtoA writes the ass file to the HOUDINI_TEMP_DIR, which is in the Windows Temp folder. Is it possible your Temp folder is full?
I would use Process Monitor to watch what happens. Does HtoA actually try to write the ass file, or does it never get that far?
No, it never gets that far, but if I render to MPlay it renders fine.
I tried to add this:
HTOA_TMP = "D:\Temp\htoa"
HTOA_TMP_TIMEOUT = 1
to check if it was a drive or permission problem, but HtoA still uses the ~/appdata/local/temp/houdini_tmp/htoa_user folder.
So the scene renders fine with kick, and in the MPlay window directly from Houdini, but not in the Render View panel!
What version of Python does HtoA use? I tried using:
HOUDINI_USE_HFS_PYTHON = 1
In the Python shell it says 2.7.15
Best regards
HtoA uses HOUDINI_TEMP_DIR
HTOA_TMP is no longer used, we need to update the docs.
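Given that, redirecting the temp location means setting HOUDINI_TEMP_DIR rather than HTOA_TMP. A minimal houdini.env sketch (the drive and path here are just examples, pick any writable location):

```shell
# houdini.env -- example only; choose any drive/folder you can write to.
# HtoA writes its temporary .ass files under HOUDINI_TEMP_DIR,
# so redirect that instead of the obsolete HTOA_TMP:
HOUDINI_TEMP_DIR = "D:/Temp/houdini"
```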
I've been looking at it, but unfortunately I can't tell why it doesn't work. I can see that it's not working and something crashes, but there are no clues.
OK, thanks. I will investigate it further later, once the scene is done and ready for shading/texturing. I will use the Redshift demo or Mantra for my "dailies" in the meantime.
Cheers!
Hi Stephen, I had time to debug the scene, and I found that two objects had very bad UV vertices (spikes in the UVs, very far outside 0-1). It's not the first time I've had a scene fail to render in Arnold because of bad UVs. Maybe something to look into.
Cheers!
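For anyone hitting the same thing, a quick way to spot such UV spikes is to flag values far outside the 0-1 square. A minimal sketch in plain Python over a list of (u, v) tuples (in Houdini this logic could live in a Python SOP reading the `uv` attribute; the `limit` threshold is an arbitrary choice for illustration):

```python
def find_bad_uvs(uvs, limit=10.0):
    """Return the indices of UV pairs whose magnitude exceeds `limit`.

    Legitimate UVs can sit somewhat outside 0-1 (UDIMs, tiling), so the
    threshold is deliberately generous; it only catches extreme spikes.
    """
    return [i for i, (u, v) in enumerate(uvs)
            if abs(u) > limit or abs(v) > limit]
```

Running this over each object's UVs before rendering makes it easy to isolate the offending points instead of bisecting the whole scene.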
You're flipping the problem on its head. Arnold should throw an error about the UVs, not crash, and not have people blame the user.