Arnold for Houdini Forum
Rendering with Arnold in Houdini and Solaris using the HtoA plug-in.

Houdini instancing to Maya+Arnold

Message 1 of 8
kamholz.keith

Hi!

I'm working on a big destruction scene in Houdini, and the lighter is using Maya + Arnold, and a cloud rendering service. I'm trying to figure out if there's any way to efficiently get all of these glass shards & debris instances to the lighter for rendering.

Things that don't seem to work:

- Alembic export from Houdini. As raw geometry, the caches are super massive, and when using packed geometry it creates many thousands of objects in Maya, one object per debris piece, which seems too heavy to be a viable option.

- ASS files exported from Houdini. I was hoping that the ASS files could maintain some internal instancing efficiency, but it seems like HtoA is converting all of the instanced geom into regular raw data. I've tried exporting the results of a Copy To Points & an actual Instance object, and they both result in large ASS files that grow as the emission grows. So they don't seem to actually maintain any instancing internally.

Is there some way to get this to work, or some good option that I haven't considered? I really don't want to have to do a bunch of bespoke instancing work in Maya to get the results to match what I'm getting in Houdini, and that would seem to be a pretty shoddy workflow in general.

I've also exported the individual instance objects as small individual ASS files, in case there's any way to just cache out the particles to an ASS, and get them to load the little ASS instances based on particle attributes or something. Haven't seen anything in the docs along those lines though.

Any input/ideas would be much appreciated. Thanks! 🙂

~ Keith

Message 2 of 8

Hi

For Copy to Points:

  • Copy to Points will give you ginstances, but not if your geometry is alembic packed geo (we're working on that now).
  • You have to select the “Pack and Instance” check box.
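
For anyone scripting this, here's a minimal hython sketch of turning that checkbox on; the node path is made up and the parameter name ("pack") is my assumption for the Copy to Points SOP, so verify it in your Houdini build.

```python
# Minimal sketch, assuming a Copy to Points SOP at a hypothetical path and that
# its "Pack and Instance" toggle is the parameter named "pack".
import hou

copy_sop = hou.node("/obj/debris/copytopoints1")  # hypothetical node path
copy_sop.parm("pack").set(1)                      # "Pack and Instance" on
```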

For Alembic:

  • It would be good if the Alembics exported from Houdini used Alembic instancing for packed geometry; then the Alembic procedural should instance too (using make_instance).
  • To bring the Alembic into Maya, ideally you would use the gpuCache, because that creates just one node in the Maya scene. But gpuCache has a memory leak with large animated caches.
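
From Maya's script editor, the gpuCache route looks roughly like this; a minimal sketch, and the cache path is only an example.

```python
# Minimal sketch: load one Alembic as a single gpuCache node in Maya.
# Requires the gpuCache plug-in; the cache path is an example.
import maya.cmds as cmds

cmds.loadPlugin("gpuCache", quiet=True)
shape = cmds.createNode("gpuCache", name="debrisCacheShape")
cmds.setAttr(shape + ".cacheFileName", "/caches/destruction/debris.abc", type="string")
cmds.setAttr(shape + ".cacheGeomPath", "|", type="string")  # read the whole hierarchy
```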


// Stephen Blair
// Arnold Renderer Support
Message 3 of 8

Also, in Maya, the Alembic procedural (the gpuCache node) has a Make Instance parameter, which is off by default. So if your Alembic has instances, enable Make Instance.
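
If you want to flip that on for every cache in a scene, something like the following should work; the extra-attribute name (aiMakeInstance) is my assumption for how Make Instance is exposed on the gpuCache shape, so double-check it in the Attribute Editor.

```python
# Sketch, assuming Make Instance is exposed as an "aiMakeInstance" extra
# attribute on gpuCache shapes; verify the exact attribute name in your build.
import maya.cmds as cmds

for shape in cmds.ls(type="gpuCache") or []:
    if cmds.attributeQuery("aiMakeInstance", node=shape, exists=True):
        cmds.setAttr(shape + ".aiMakeInstance", 1)
```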



// Stephen Blair
// Arnold Renderer Support
Message 4 of 8

You could try exporting Alembics from Houdini using Alembic instancing to reduce the file size.

If you use the primitive attribute s@path to name your geometry, and make sure Packed Transform is not set to Deforming Geometry, then the alembic file should use instancing.

Again, you'll have to turn on make_instance in Maya (or set i@make_instance=1) to render these Alembics.
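
As a rough illustration, here's what that could look like from a Python SOP just before the Alembic ROP; the /debris/piece_* naming is made up, and make_instance is written here as a primitive attribute per the note above.

```python
# Sketch (Python SOP upstream of the Alembic ROP): give every packed piece a
# path for the Alembic hierarchy and tag make_instance so Arnold instances it.
# The naming scheme is only an example.
node = hou.pwd()
geo = node.geometry()

path_attrib = geo.addAttrib(hou.attribType.Prim, "path", "")
mi_attrib = geo.addAttrib(hou.attribType.Prim, "make_instance", 0)

for i, prim in enumerate(geo.prims()):
    prim.setAttribValue(path_attrib, "/debris/piece_%d" % i)
    prim.setAttribValue(mi_attrib, 1)
```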

Message 5 of 8

It looks like the lighter won't be using Arnold after all for this project, but I'd still love to know how this would best be handled in an Arnold situation.

I know how to export transformed packed geom to Alembic efficiently, and that's working properly for the main destruction RBDs. However, for changing topology like debris and glass shatters, Alembic doesn't seem to like the changing point count. It also doesn't save out any velocity data on the packed pieces, regardless of whether the velocity attribute is stored on points/vertices/prims.

So I guess the question expands a bit.... is there any way to get velocity motion blur working and leverage instanced geom? To use velocity motion blur are we forced to export raw unpacked geometry, which is incredibly heavy? I know this is probably a general alembic issue, with the alembic exporter discarding attributes on packed geom.

Message 6 of 8

One workaround I've seen in the past is to change the FX setup so that **all** of the glass shards & debris always exist; they're just scaled down before "emission" and scaled up to become visible when they're supposed to emit.
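
For what it's worth, here's a rough sketch of that trick as a Python SOP; the "birth" attribute name is just a stand-in for whatever your sim uses to mark when a piece should appear.

```python
# Sketch of the "everything always exists" workaround: keep topology constant
# and shrink pieces to near zero until their (assumed) birth frame.
node = hou.pwd()
geo = node.geometry()
frame = hou.frame()

pscale = geo.findPointAttrib("pscale") or geo.addAttrib(hou.attribType.Point, "pscale", 1.0)
birth_attrib = geo.findPointAttrib("birth")  # assumed name for the emission frame

for point in geo.points():
    birth = point.attribValue(birth_attrib) if birth_attrib else 0.0
    if frame < birth:
        point.setAttribValue(pscale, 0.0001)  # effectively invisible, no topology change
```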

For now I think I'll move forward with this... maybe in the future USD will make it easier to deal with such things?

Totally open to other suggestions that are less clunky than changing the FX setup to avoid changing topology! 🙂

Message 7 of 8
holst.rune
in reply to: kamholz.keith

Hi Guys

Is there a way to pick up the s@instancefile attribute on an alembic point cache in Maya?

We are trying to do a similar workflow to Houdini instancing, but in Maya.

The idea is a point cloud containing all the attributes (id, P, pscale, orient, frame, instancefile), so we can avoid heavy caches for crowds and just have Arnold pick up the individual character abc/ass files from disk and read the frame attribute for animation offsets.
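
For context, this is roughly the point cloud we mean, sketched as a Python SOP; the character file path and frame-offset logic are placeholders, and whether Maya/Arnold actually picks up instancefile from the Alembic point cache is exactly what we're asking.

```python
# Sketch of the point attributes described above; paths and offsets are examples.
node = hou.pwd()
geo = node.geometry()

inst = geo.addAttrib(hou.attribType.Point, "instancefile", "")
frm = geo.addAttrib(hou.attribType.Point, "frame", 0.0)

for i, point in enumerate(geo.points()):
    point.setAttribValue(inst, "/assets/crowd/walker_v01.ass")  # per-agent source file
    point.setAttribValue(frm, hou.frame() - i * 3.0)            # per-agent animation offset
```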

Message 8 of 8
Stephen.Blair
in reply to: holst.rune

Hi @Rune
Please ask this as a new question



// Stephen Blair
// Arnold Renderer Support
