Wednesday, February 27, 2008

GDC08 Friday

My first session on Friday was another one on the characters in Drake's Fortune. This one was presented by Christian Gyrling, who was in charge of implementing the enemy characters in the game. Christian talked about how they separated the low-level mechanics of which animations to play from the high-level AI functions. That way their AI could issue simple commands like "find cover" and the lower-level system would make the character face the closest cover point, run to get there, play a transition animation once it reached a certain distance from the point, and then play the in-cover animation. Christian also talked a bit about their additive animation system. I'm excited to implement some of these layered animations in our game.
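
Just to get the idea straight in my head, here's a tiny Python sketch of how that kind of layering could look - every name, state, and the two-metre threshold here are mine, not anything from Naughty Dog's actual code:

# Rough sketch of the layering idea: the AI issues one simple command,
# and a lower-level system picks the actual animation each frame.
# All names and the 2.0 m threshold are invented for illustration.

TRANSITION_DISTANCE = 2.0  # how close to the cover point before the transition clip

def ai_update(enemy):
    # High-level AI: just decides *what* to do.
    if enemy["under_fire"]:
        return "find_cover"
    return "patrol"

def cover_animation(facing_cover, distance_to_cover, transition_finished):
    # Low-level layer: decides *which clip* to play to carry out "find_cover".
    if not facing_cover:
        return "turn_to_face_cover"
    if distance_to_cover > TRANSITION_DISTANCE:
        return "run_to_cover"
    if not transition_finished:
        return "cover_entry_transition"
    return "in_cover_idle"

enemy = {"under_fire": True}
if ai_update(enemy) == "find_cover":
    # -> "run_to_cover"
    print(cover_animation(facing_cover=True, distance_to_cover=6.0, transition_finished=False))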

My next session was by Tiago Sousa from Crytek. He talked about the effects in Crysis. I was really impressed that the Crysis guys had implemented so many effects like depth of field, motion blur, HDR tone mapping, and volumetrics, and that all of them work together so well. They really didn't cut any corners on Crysis. Visually I think it's about the most impressive game out there right now. Tiago talked about their system for water, including dynamic waves, reflection and refraction, chromatic dispersion, caustics, god rays, and that cool effect that happens on the screen when you leave the water. Next he talked about the shaders that they wrote to achieve the frozen look on objects. He said that effect was a real pain and required 4 separate iterations before they finally found a version of the effect that they were happy with. In the middle of the talk, Tiago gave a couple of guidelines for creating effects in games that I wish all programmers would follow. First he said "Never sacrifice quality for speed." I could hardly believe I was hearing that from a programmer. Most graphics programmers I know say exactly the opposite. He also said "Make the effects as subtle as possible." Usually when programmers put a new effect into a game, they really make it stand out and it ends up looking silly. Following the "subtle" advice is a good idea.

After that, I went to a session by Jeremy Vickery on Practical Light and Color. Jeremy is a lighting artist at Pixar and also a very good teacher. I loved the style of his presentation. He talked about the properties of light - diffuse, specular, reflection, refraction, subsurface scattering, etc. I especially liked his description of how light shifts in saturation as well as value when it gets darker. I learned a lot from him and I even ordered his DVD from the Gnomon Workshop so that I can really study the principles he was presenting.
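
To pin that idea down for myself, here's a toy Python snippet of what shifting saturation along with value might look like - the numbers are my own guesses, not Jeremy's formula:

# Toy illustration: as a colour gets darker, nudge its saturation up
# a little as well as dropping its value. The 0.5 boost is invented.
import colorsys

def darken(rgb, amount, saturation_boost=0.5):
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    v = max(0.0, v - amount)
    s = min(1.0, s + amount * saturation_boost)
    return colorsys.hsv_to_rgb(h, s, v)

skin_in_light = (0.9, 0.6, 0.5)
print(darken(skin_in_light, 0.4))  # darker, and noticeably more saturated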

Finally, I attended Adam Pletcher's session on Python for Artists. Adam really opened my eyes to a whole new world of possibilities with Python. I already know MaxScript and I use it a lot to create tools to speed things up in 3ds Max. Python is a lot like MaxScript - but it lives outside Max, so it can do all kinds of things. Adam showed lots of demos of what it can do. He wrote several scripts that grab things from the web. One script grabbed the current number of people playing Halo 3 online. Another grabbed the current temperature for a given zip code and added it to an auto-generated Excel spreadsheet. Another set of demos dealt with textures. The first demo scanned all of the textures in a directory and reported if any of them weren't power-of-two sizes or were saved at the wrong bit depth. For his final demo, Adam showed a script he wrote that lets you fly around a 3ds Max scene using an Xbox 360 controller! Even better than that, he had assigned level decoration assets to the buttons on the controller, so you could use one button to switch between the set of available assets and another button to place them in the level. It was controller-based interactive level creation inside Max. I was totally blown away. Needless to say, I am now learning Python. I'm pretty excited about what I'll be able to do with it to speed up our art pipeline once I get up to speed.
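
Here's a rough Python sketch of that texture-audit idea - it uses Pillow and file extensions of my own choosing, so treat it as a stand-in for whatever Adam's actual script does:

# Walk a directory and flag textures that aren't power-of-two or that
# use an unexpected image mode. Pillow and the extension list are my
# own assumptions, not Adam's actual tool.
import os
from PIL import Image

ALLOWED_MODES = {"RGB", "RGBA"}  # stand-ins for 24-bit and 32-bit images

def is_power_of_two(n):
    return n > 0 and (n & (n - 1)) == 0

def audit_textures(directory):
    for name in os.listdir(directory):
        if not name.lower().endswith((".tga", ".png", ".bmp", ".tif")):
            continue
        path = os.path.join(directory, name)
        with Image.open(path) as img:
            width, height = img.size
            if not (is_power_of_two(width) and is_power_of_two(height)):
                print(name, "is not power of two:", width, "x", height)
            if img.mode not in ALLOWED_MODES:
                print(name, "has an unexpected bit depth/mode:", img.mode)

# audit_textures(r"C:\project\art\textures")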

Thanks for reading my GDC blog posts. If you'd like more information on any of the sessions I attended, please let me know. I took a lot of notes and I'd be happy to share additional details if anyone is interested.

Friday, February 22, 2008

GDC08 Thursday

The first session I attended today was on environment design in Halo 3.  The speaker presented the idea of having two artists work on each environment.  The first artist is called the architect.  He is in charge of the layout and flow of the level - ensuring that level traversal works correctly for both the player and the AI.  The architect blocks out the level and tests the block-out to make sure it can achieve the design goals and that it's fun.  The second artist assigned to the level is called a finishing artist.  This guy is in charge of creating concept art at the beginning of the process.  The concept art is mostly focused on layout at first.  Once the architect starts getting the layout completed, the concepts move more toward color and detail pieces.  Finally, the finishing artist is in charge of creating final textures for the level and adding all of the finishing details that really make the level shine.  This way of working sounded really good to me, since some artists are better suited to one of the two areas than the other.

After my first session, I went down to the expo floor for a bit.  I met with a couple of companies about extending ShaderFX to support their game engines. Everyone seemed to be excited about the idea, so I think this is something we will pursue - adding support for specific game engine export formats.  I also spoke with some guys at Natural Motion about Morpheme - a program we use at Bioware to set up our animation blend trees.  It sounds like they have a really good solution to a problem that we've been running into.  I'm excited to work with them on it.

Next I attended a session that discussed the new features in FX Composer 2.5.  The thing I'm most looking forward to is the new visual debugger that will be included.  They showed how you can highlight any pixel on your object to get information about what's happening at that specific spot, or select any variable and have its value output as the end result of the shader instead of the value actually returned by the full function.  It was great to see that this more robust debugging is finally going to be available to shader authors.

After lunch I attended a session by an effects artist and a programmer who worked on Bioshock.  They discussed how they achieved a lot of the special effects in the game, including lit particles for smoke, specularity and per-pixel lighting on particles for water and blood, waterfalls, water caustics, and interactive ripple effects.  There were several ideas that I liked in this session.  The first is that the artist and programmer working on effects sat right next to each other for the entire project - so collaboration was natural and easy.  I also liked the idea they shared about using a scrolling normal map to distort the UVs before sampling.  They used this technique on both the projected caustic effect and the dynamic ripples.  It served to break up clean patterns and help these effects feel more random and natural.  Very cool.
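
Here's how I understand the trick, written as a little CPU-side Python/NumPy sketch rather than the actual pixel shader - the scroll speeds and distortion strength are numbers I made up:

import numpy as np

def sample(tex, u, v):
    # Point-sample a (height, width, channels) texture with wrapped 0-1 UVs.
    h, w = tex.shape[:2]
    x = int((u % 1.0) * w) % w
    y = int((v % 1.0) * h) % h
    return tex[y, x]

def distorted_caustic(caustic_tex, offset_tex, u, v, time,
                      scroll=(0.05, 0.02), strength=0.03):
    # Scroll the offset map over time so the distortion animates.
    n = sample(offset_tex, u + scroll[0] * time, v + scroll[1] * time)
    # Remap 0..1 to -1..1 and nudge the UVs before the real texture lookup.
    du = (n[0] * 2.0 - 1.0) * strength
    dv = (n[1] * 2.0 - 1.0) * strength
    return sample(caustic_tex, u + du, v + dv)

# Tiny random stand-in textures, just to show the functions run.
caustic = np.random.rand(64, 64, 3)
offsets = np.random.rand(64, 64, 3)
print(distorted_caustic(caustic, offsets, 0.5, 0.5, time=1.0))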

Finally I attended a session hosted by Natural Motion on Morpheme, their animation state machine and blending software.  They showed the basic process of setting up a blend tree, and also some of the new features they're going to add in version 1.3 of the software.  I was most interested to see that you can reference a set of nodes stored in another file from within your tree.  This is something we need to add to our blend trees because it will let more than one person work on them at a time and also allow us to reuse functionality in different parts of the tree and keep things consistent.
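
For my own notes, here's an invented little sketch of what referencing a subtree from another file could look like - it's nothing like Morpheme's real format, just an illustration of why it helps with sharing and reuse:

# Made-up blend tree data where a Reference node stands in for a subtree
# that lives in a separate file and gets resolved at load time.
class Clip:
    def __init__(self, name):
        self.name = name

class Blend:
    def __init__(self, name, children):
        self.name = name
        self.children = children

class Reference:
    # Placeholder for a subtree stored in another file, possibly edited by someone else.
    def __init__(self, path):
        self.path = path

def resolve(node, load_file):
    # Swap each Reference for the subtree it points at when the tree is loaded.
    if isinstance(node, Reference):
        return resolve(load_file(node.path), load_file)
    if isinstance(node, Blend):
        node.children = [resolve(child, load_file) for child in node.children]
    return node

# The same shared locomotion subtree could be referenced from several trees:
tree = Blend("upper_body", [Clip("aim"), Reference("shared/locomotion.tree")])
resolved = resolve(tree, load_file=lambda path: Blend("locomotion", [Clip("walk"), Clip("run")]))
print([child.name for child in resolved.children])  # ['aim', 'locomotion']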

I'm looking forward to tomorrow.  It looks like I've got some more great sessions lined up.

Thursday, February 21, 2008

GDC08 Wednesday

Wow, it was an exciting day at GDC!  I started off the day going to the Technical Artist round table hosted by Jeff Hanna of Volition.  Jeff is a great guy.  I met him in the hall before the session and he and I had a fun chat about tech art stuff.  During the session, Jeff defined a tech artist as a roadie - a stage hand - the guy working behind the scenes to make sure that everything runs smoothly.  We discussed the difference between a tech artist and a tools programmer.  We also came up with a bit of a list of good things tech artists should do: use your own tools so that you're forced to see the problems with them, write complete step-by-step docs for all of your tools, and sit with the artists as they use the tools you've created so that you get a good idea of how your tools are working in production.  Artists are our customers and we should be working right alongside them - not somewhere else in the studio.

Next I attended the Microsoft keynote.  They took advantage of the keynote to really push their platform.  The Xbox 360 is great, but I wish they hadn't turned the keynote into a "shove it down your throat" advertising pitch.  I did enjoy Peter Molyneux's demo of Fable 2 (featuring co-op play), and Tim Sweeney's demo of the latest features in the Unreal Engine (dynamic ambient occlusion, large crowds of characters, Ageia soft body physics, and realistic destructibility).  Cliffy B also made a "look how cool I am" appearance to announce that Gears of War 2 will be available this November.

After the keynote, I went to a session on the animation in Drake's Fortune given by Judd Simantov and Jeremy Yates from Naughty Dog.  It was cool to hear about all of the tools that Judd created to make it possible for the animators at Naughty Dog to mix motion capture and keyframe animation seamlessly.  At Bioware, we're using Puppetshop for animation, and I was happy to see that a lot of the tools that Judd wrote for Drake's Fortune do the same types of things that Puppetshop already handles.  I also got a few ideas for additional Puppetshop extensions to make animating characters much easier for our animators.  This is why I love GDC - I get all kinds of great ideas to use in my projects.  Jeremy talked about how they pushed their motion capture to look more like keyframe animation and how they pushed their keyframe animation to look more like motion capture.  In the end they were able to make both animation types fit together very well, so that all of the motion feels like it's part of the same style.  After playing Drake's Fortune myself for about 8 hours last weekend, I think it's probably the best console game out right now.

I had lunch with Bruce Straley and a couple of other guys from Naughty Dog and then headed off to my next session - Neil Hazzard's annual review of using real-time shaders in Max.  There wasn't a lot that was new in Neil's talk this year other than the details of how to implement real-time shadows in your shaders.

My last session of the day was one hosted by Nvidia on realistic skin rendering.  The demo they were showing off uses 17 passes and 750 megs of texture memory.  I was pretty floored to hear those figures.  They say the core concept is very scalable though.  The main idea behind the skin rendering is image-space blurring for subsurface scattering - the same technique used in the ShaderFX node that I wrote.  The new element in this demo was that they created something like seven different blurred versions of the texture and took different weighted samples from each for the R, G, and B channels to achieve the final result.  Combine that with a physically based specular term and some nice Fresnel fall-off and you've got some very expensive but very realistic skin.
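
To make sure I understood it, here's a rough NumPy version of the multi-blur mixing - the blur widths and per-channel weights below are completely made up, not the figures from Nvidia's demo:

import numpy as np
from scipy.ndimage import gaussian_filter

def skin_diffusion(diffuse,
                   widths=(1, 2, 4, 8, 16, 32, 64),
                   red_w=(0.05, 0.10, 0.15, 0.20, 0.20, 0.15, 0.15),
                   green_w=(0.15, 0.20, 0.25, 0.20, 0.10, 0.07, 0.03),
                   blue_w=(0.30, 0.30, 0.20, 0.10, 0.05, 0.03, 0.02)):
    # Blur the lit diffuse at several widths, then mix the blur levels with
    # different weights per channel (red is allowed to scatter the furthest).
    result = np.zeros_like(diffuse)
    for width, rw, gw, bw in zip(widths, red_w, green_w, blue_w):
        blurred = gaussian_filter(diffuse, sigma=(width, width, 0))
        result += blurred * np.array([rw, gw, bw])
    return result

lit_face = np.random.rand(128, 128, 3)  # stand-in for the lit diffuse texture
print(skin_diffusion(lit_face).shape)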

Next I went to the show floor and went straight to the Vicious Engine booth.  I've been really excited to see what the guys at VCS have been cooking up since I left.  What I saw didn't disappoint at all!  They've added a bunch of new features to the engine to support the new consoles including rag doll physics, a node-based material editor, and a very nice animation blending system.  I would recommend the Vicious Engine to anyone.

To finish up the night, I got together with a group of old friends and headed off to Chinatown for dinner at the Empress of China.  It was super tasty, and I especially enjoyed catching up and remembering all kinds of craziness from past game projects.  Good times!

Wednesday, February 20, 2008

GDC08 Tuesday

I arrived in San Francisco today after a couple of hours of delays at the airport.  The weather here is rainy, and I guess that was causing the airport to get behind schedule.  We ended up waiting about two extra hours in the airport to board our flight and then another hour or so on the plane before we could take off.  Oh well - nothing that a laptop with DVDs can't fix!

Once I arrived, I spent the day getting to know the area a bit.  I walked around downtown between the hotel and the convention center.  I also walked around the convention center and found where my classes are going to be tomorrow.

It was fun to bump into some old acquaintances from previous jobs while wandering around.  That's one of the things I really enjoy about the conference - catching up with people I haven't seen in several years and finding out how life is treating them.

In the evening I went to the IGDA party.  I usually don't enjoy conference parties much, but I thought I'd go anyway.  I ran into Spencer Trent, the effects artist from Vicious Cycle Software.  That was pretty amazing since I'd never actually met him in person - we'd only chatted via instant messenger.  I guess he recognized me somehow (I wasn't wearing my conference badge with my name on it at the time).  We had a great chat about creating particle effects.  I also visited the Autodesk party for a while, but didn't stay long since I'm not much of a party guy.

Looking forward to tomorrow!

Sunday, February 17, 2008

GDC 2008

Just like the last two years, I'm attending GDC this year and really looking forward to it.  Once again I'll be creating blog posts here for each day I'm at the show.  I'll be summarizing each of the classes and presentations that I attend and commenting on what I learn.

If you'd like to take a look at the sessions I'll be attending, my schedule is available online toward the bottom of this page:


You might notice that I've got lots of time slots with multiple sessions scheduled.  There's just too much to see and learn!  I wish that they'd add an extra day and spread things out a bit more.  It's hard when two classes you really want to attend are scheduled at the same time.