Friday, February 22, 2008

GDC08 Thursday

The first session I attended today was on environment design in Halo 3.  The speaker presented the idea of having two artists work on each environment.  The first artist is called the architect.  He is in charge of the layout and flow of the level - ensuring that level traversal works correctly for both the player and the AI.  The architect blocks out the level and tests the block-out to make sure it can achieve the design goals and that it's fun.  The second artist assigned to the level is called the finishing artist.  This artist is in charge of creating concept art at the beginning of the process.  The concepts are mostly focused on layout at first; once the architect starts getting the layout completed, they move more toward color and detail pieces.  Finally, the finishing artist creates the final textures for the level and adds all of the finishing details that really make it shine.  This way of working sounded really good to me, since most artists are stronger in one of these two areas than the other.

After my first session, I went down to the expo floor for a bit.  I met with a couple of companies about extending ShaderFX to support their game engines.  Everyone seemed to be excited about the idea, so I think this is something we will pursue - adding support for specific game engine export formats.  I also spoke with some guys at Natural Motion about Morpheme - a program we use at BioWare to set up our animation blend trees.  It sounds like they have a really good solution to a problem that we've been running into.  I'm excited to work with them on it.

Next I attended a session that discussed the new features in FX Composer 2.5.  The thing I'm most looking forward to is the new visual debugger that will be included.  They showed how you can highlight any pixel on your object to get information about what's happening at that specific spot, or select any intermediate variable and have its value written out as the end result of the shader, rather than the value normally returned by the full function.  It was great to see that this more robust debugging is finally going to be available to shader authors.
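To make the idea concrete, here's a toy sketch of that debugging trick - nothing to do with FX Composer's actual internals, just a made-up per-pixel "shader" in Python where any named intermediate can be routed to the output in place of the final color:

```python
def shade(uv, light=0.75, debug_output=None):
    """Toy per-pixel 'shader': computes a few intermediates, then a final
    color.  Passing debug_output="base_color" or "diffuse" returns that
    intermediate as the shader's result, mimicking a visual debugger that
    routes a selected variable to the framebuffer."""
    base_color = (uv[0], uv[1], 0.5)                    # fake texture sample
    diffuse = tuple(c * light for c in base_color)      # fake lighting term
    final = tuple(min(1.0, c + 0.1) for c in diffuse)   # fake ambient add

    intermediates = {"base_color": base_color, "diffuse": diffuse}
    if debug_output is not None:
        # Short-circuit: show this stage instead of the finished pixel
        return intermediates[debug_output]
    return final
```

Calling `shade((0.2, 0.4), debug_output="diffuse")` shows only the lighting stage, which is essentially what the debugger demo did at the pixel level.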

After lunch I attended a session by an effects artist and a programmer who worked on BioShock.  They discussed how they achieved a lot of the special effects in the game, including lit particles for smoke, specularity and per-pixel lighting on particles for water and blood, waterfalls, water caustics, and interactive ripple effects.  There were several ideas that I liked in this session.  The first is that the artist and programmer working on effects sat right next to each other for the entire project - so collaboration was natural and easy.  I also liked the idea they shared about using a scrolling normal map to distort the UVs before sampling.  They used this technique on both the projected caustic effect and the dynamic ripples.  It served to break up clean patterns and help these effects feel more random and natural.  Very cool.
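The scrolling-normal-map trick is simple enough to sketch in isolation.  This is a rough reconstruction, not their actual shader: the "normal map" here is faked with sine waves, where a real implementation would sample an actual normal texture, and the constant names are my own:

```python
import math

def distorted_uv(u, v, time, scroll_speed=0.05, strength=0.02):
    """Offset UVs by a scrolling 'normal map' before the main texture
    sample, breaking up visible tiling in effects like caustics and
    ripples.  The fake map below stands in for a normal texture lookup."""
    # Scroll the lookup coordinates over time (different rates per axis
    # so the pattern never visibly repeats in lockstep)
    su = (u + time * scroll_speed) % 1.0
    sv = (v + time * scroll_speed * 0.7) % 1.0
    # Fake normal-map sample, decoded to roughly [-1, 1]
    nx = math.sin(su * 2.0 * math.pi * 8.0)
    ny = math.cos(sv * 2.0 * math.pi * 8.0)
    # Perturb the original UVs by the 'normal', scaled by strength,
    # then sample the caustic/ripple texture at the distorted coords
    return (u + nx * strength, v + ny * strength)
```

Because the perturbation scrolls over time, the same caustic texture lands on slightly different UVs every frame, which is what kills the clean repeating pattern.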

Finally I attended a session hosted by Natural Motion on Morpheme, their animation state machine and blending software.  They showed the basic process of setting up a blend tree, and also some of the new features they're going to add in version 1.3 of the software.  I was most interested to see that you can reference a set of nodes from another file within your tree.  This is something we need to add to our own blend trees, since it would let more than one person work on a tree at the same time and also let us reuse functionality in different parts of the tree while keeping things consistent.
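Here's a rough sketch of what those referenced sub-trees buy you - an invented structure, not Morpheme's actual format.  A reference node resolves a named sub-tree from a shared library at evaluation time, so the referenced nodes can live in their own file, be edited in one place, and be reused from several trees:

```python
class Clip:
    """Leaf node: one animation clip, evaluated to a scalar 'pose' here."""
    def __init__(self, value):
        self.value = value
    def evaluate(self, library):
        return self.value

class Blend2:
    """Interior node: linear blend of two children by a fixed weight."""
    def __init__(self, a, b, weight):
        self.a, self.b, self.weight = a, b, weight
    def evaluate(self, library):
        return (self.a.evaluate(library) * (1.0 - self.weight)
                + self.b.evaluate(library) * self.weight)

class Reference:
    """Resolves a named sub-tree from a shared library when evaluated,
    so the referenced nodes can be authored in a separate file."""
    def __init__(self, name):
        self.name = name
    def evaluate(self, library):
        return library[self.name].evaluate(library)

# Shared sub-tree, conceptually authored in its own file by one person
library = {"walk_run": Blend2(Clip(0.0), Clip(1.0), 0.25)}

# Two different trees reference the same sub-tree, so its behavior
# stays consistent everywhere and only needs fixing in one place
tree_a = Blend2(Reference("walk_run"), Clip(1.0), 0.5)
tree_b = Reference("walk_run")
```

Changing the `walk_run` entry in the library updates every tree that references it, which is exactly the consistency win mentioned above.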

I'm looking forward to tomorrow.  It looks like I've got some more great sessions lined up.

