Random Dancing Robots - The Post SigGRAPH Post


Little did I know that attending SigGRAPH would make me so busy I wouldn't have time to make a post post-SigGRAPH! Oh well, now I finally take five minutes to post a post.

First, I'd like to remind everyone of the mental ray wiki over at mymentalray.com, which is really taking off with the good help of Martin Breidt, especially his "mental ray cookbook" section.

Secondly, SigGRAPH was a blast, although I missed much due to, as always, lots of meetings. One of the funniest events happened when some random straggler (apparently not completely awake) wandered into one of our meetings and grabbed a coffee and sat down.... after some investigation we figured out the guy was in the wrong room.... he staggered out mumbling "at least I got free coffee". He had no pants on, for some reason. ;)

Thirdly, I can finally talk about mental ray 3.6 and the new toys that come with it. As shown in various posted demos, there is some new architectural stuff, like the portal lights, which allow a really quick setup for perfect-quality skylight coming in through small windows.... as well as other things.

Also included is my own little "baby", the production shader library. I promise that you'll read a lot more about that in this blog as things get released. A lot.

For now, you'll have to make do with watching random dancing robots.

But hey, random dancing robots never hurt anyone, did they? Thought so.



Glacierise (a.k.a. Hristo Velev) said...

Awesome! So there will be mr 3.6 in max 2008, and these prod shaders look very promising! Waiting for the cursed nerve-wracking veil of secrecy to be lifted!

brian-GraphiCat said...

They look interesting. Are they the same as the MIA arch. materials, so you have lots of presets?

I always found it difficult to make my own materials in MR. Now I am using Vray, but I want to switch to MR in maya.

Thanks for keeping this blog, really useful.

and sorry for my english

Master Zap said...

This isn't about the material on the robots; that's just MIA (= A&D) materials.

This is about the interaction w. the ground (i.e. the background photo), contact shadows, reflections, interaction of bounce light and so forth.


digones said...

howdy, mr zap!

I'm looking forward to seeing all this new mr 3.6 stuff. Hope the maya version will have all the toolset I saw in the 3dsmax videos (i.e. self illum. in the MIA, tonemapping presets, etc.)

cheers and keep up the good work

Master Zap said...

In Maya, the "self illum" stuff appears as its own shader, rather than being baked into the material (that's just a UI wrapper hiding some complexity from the user on the max side).

The shader deliverables for both versions should be pretty much identical; it's only a matter of what is exposed to the end user by default, and what UI layering has gone on top of the raw shaders.


Unknown said...

The dancing blue RX8 rocks! I just bought one of those, but mine doesn't transform. :P

Looking forward to MR3.6

digones said...

hi zap, it's me again... sorry for all these questions.

Have you messed enough with mayatomr 3.6? I've been through some hard times trying to figure out how to properly get the output passes from mia_material_x, and the documentation is kinda vague about that... the closest thing I could do was to create some output passes that render fine, with the correct filenames, but the generated images are no different than the beauty pass, just with different filenames...

Is there anything else to set up besides creating output passes from my camera? Where does maya (or mr) get the info to output those passes? What I did was type the name of the pass in the camera's output passes list (i.e. "refl_result").

I also tried enabling the user framebuffers from the render globals, and I got images rendering with the correct filenames, but all black.


propriétaire said...

I'm also trying to find a way to output passes from mia_material_x in maya 2008 osx. How can I go about this without ctrl_buffers? The documentation is super vague, and I see that your examples have accomplished this. Thanks for any light you can shed on this...