Siggraph 08 Thursday
My last day at Siggraph was interesting. The first session I attended was a panel discussion entitled "Games: Evolving on An Order of Magnitude." Several prominent people from various game companies discussed how advances in video game hardware are driving major changes within the industry. Because games have become so much more complex to create, they require larger teams, more elaborate management structures, and so on. The problem is that budgets are not increasing proportionally to the demand for content, which means teams have to find ways to work more efficiently - writing tools to automate tasks, for example. I took lots of notes on this discussion. Let me know if you're interested in reading more.
Next I attended a session on hair and cloth. I thought I'd be able to learn some things that would help with the projects I'm working on now, but all of the papers presented were on very expensive techniques - like simulating every single strand of hair, or simulating cloth by mimicking the behaviour of the yarn. The results were super cool, but compute times were very high - nothing near real-time, which is what we need.
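For reference, the per-strand approach these papers build on looks roughly like the sketch below. This is just my own toy mass-spring version (none of it is code from the actual papers), but it shows why the cost blows up once you multiply it by a full head of hair:

    // Rough sketch of why per-strand hair simulation gets expensive: each strand
    // is a chain of particles, and every particle of every strand has to be
    // integrated each frame. Not from any of the presented papers - just the
    // basic mass-spring idea they build on.
    #include <vector>

    struct Particle { float pos[3]; float prevPos[3]; };

    struct Strand { std::vector<Particle> particles; };   // e.g. 16-32 segments each

    void simulateHair(std::vector<Strand>& strands, float dt, const float gravity[3])
    {
        for (Strand& s : strands)                   // ~100k strands for a full head
        {
            for (Particle& p : s.particles)         // Verlet integration per particle
            {
                for (int i = 0; i < 3; ++i)
                {
                    float vel = p.pos[i] - p.prevPos[i];
                    p.prevPos[i] = p.pos[i];
                    p.pos[i] += vel + gravity[i] * dt * dt;
                }
            }
            // ...followed by several constraint-relaxation passes per strand to keep
            // segment lengths fixed, plus strand-strand collisions, which is the part
            // that pushes this far away from real-time budgets.
        }
    }

Every extra constraint pass and collision check multiplies across all of those strands, which is why these methods end up so far from game frame budgets.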
Next I attended a session on physics. There was a paper on doing hair in real-time on the GPU, but I missed it because I got there late. I'm hoping that Nvidia will post it on their developer web site, because it's pretty close to what I'm going to need to work on soon. The second paper was on using lots of GPUs in parallel to run particle systems with very high particle counts. By using 6 or 7 GPUs together, the presenter was able to run simulations with a million particles in real-time. Very impressive. Now if we could just get game consoles to ship with 7 GPUs we'd be all set! :P The final paper in the session was on the bent and broken metal effects in the Hulk movie. They used an interesting technique for allowing the Hulk to destroy metal objects: they basically turned on a cloth simulation with the stiffness set very high - but only at the moment of impact. This allowed the surfaces to deform, and then they'd turn the sim off once the impact was done so the surfaces would keep their deformed shape.
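Just to make that Hulk trick concrete, here's roughly how I picture the "stiff cloth only during impact" idea. Every type and number here is made up by me for illustration (their pipeline was an offline film setup, not game code), but it shows the on/off toggle that leaves the dents frozen in place:

    // A small, compilable sketch of "turn on a very stiff cloth sim only during
    // the impact, then freeze the result". Everything here is invented for
    // illustration - the actual film pipeline was offline and far more elaborate.
    #include <vector>

    struct Vec3 { float x = 0, y = 0, z = 0; };

    struct MetalSurface
    {
        std::vector<Vec3> positions;   // current (possibly dented) vertex positions
        std::vector<Vec3> velocities;
        bool simulating = false;       // cloth sim only runs while this is true
    };

    // One step of a placeholder "stiff cloth" update: apply the impact force and
    // heavily damp the motion so the surface bends rather than fluttering.
    void stepStiffCloth(MetalSurface& m, const Vec3& impactForce, float dt)
    {
        if (!m.simulating)             // outside the impact window the surface is rigid
            return;
        const float damping = 0.8f;    // high damping stands in for high stiffness here
        for (size_t i = 0; i < m.positions.size(); ++i)
        {
            m.velocities[i].x = (m.velocities[i].x + impactForce.x * dt) * damping;
            m.velocities[i].y = (m.velocities[i].y + impactForce.y * dt) * damping;
            m.velocities[i].z = (m.velocities[i].z + impactForce.z * dt) * damping;
            m.positions[i].x += m.velocities[i].x * dt;
            m.positions[i].y += m.velocities[i].y * dt;
            m.positions[i].z += m.velocities[i].z * dt;
        }
    }

    void handleImpact(MetalSurface& metal, const Vec3& impactForce, float duration, float dt)
    {
        metal.simulating = true;                       // enable the sim at the moment of impact
        for (float t = 0; t < duration; t += dt)
            stepStiffCloth(metal, impactForce, dt);
        metal.simulating = false;                      // turn it off again...
        // ...and simply keep the deformed positions, so the dents stay put afterwards.
    }

The key point is that nothing ever springs back: once the sim is switched off, the deformed vertex positions just become the new shape of the object.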
Finally, a session that I didn't get to attend (because I had to catch my flight back home), but that looks very interesting, was presented by Jon Olick from id Software. You can grab the paper here:
http://s08.idav.ucdavis.edu/olick-current-and-next-generation-parallelism-in-games.pdf
The first half of the paper talks about their usage of the PS3 hardware - which is nice in itself, but the second half is the really interesting part. He talks about a new way of rendering meshes that's kind of like what they're doing with mega texture - only it's for geometry instead of texture data. Basically you'd be able to create environments with an interface similar to ZBrush, where you could just create as much detail as you wanted with no concern for polygon counts. Then at run time, the software would do all of the LODing automatically - so you'd get an environment that looked super high-detailed, but the detail would only be exactly where it was needed. It's very much like the way mega texture works - but this really takes it to the next level. Exciting stuff!
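To give a feel for what that kind of run-time LODing could look like, here's a toy sketch of my own. It is not Olick's actual data structure (his talk is built around a sparse voxel octree, as I understand it); it's just the general "refine only where the detail would be visible" idea:

    // Toy sketch of automatic LOD selection: walk a hierarchy of mesh chunks and
    // refine only where the hidden detail would actually show up on screen.
    // Not Olick's approach - just the general idea for illustration.
    #include <vector>
    #include <cmath>

    struct MeshChunk
    {
        float center[3];
        float geometricError;            // how much detail this node hides
        std::vector<MeshChunk> children; // finer chunks covering the same region
    };

    float distanceTo(const float cam[3], const float p[3])
    {
        float dx = p[0] - cam[0], dy = p[1] - cam[1], dz = p[2] - cam[2];
        return std::sqrt(dx * dx + dy * dy + dz * dz);
    }

    // Draw a chunk as-is if its error is invisible at this distance, otherwise
    // recurse into its children - so full detail shows up only where it's needed.
    void drawWithLOD(const MeshChunk& chunk, const float camPos[3], float pixelTolerance)
    {
        float dist = distanceTo(camPos, chunk.center);
        float screenError = chunk.geometricError / (dist + 1e-6f);   // crude projection

        if (screenError < pixelTolerance || chunk.children.empty())
        {
            // submitChunkToGPU(chunk);   // hypothetical draw call
            return;
        }
        for (const MeshChunk& child : chunk.children)
            drawWithLOD(child, camPos, pixelTolerance);
    }

The nice thing about this kind of traversal is that the artist never has to think about it - the detail exists in the source data, and the renderer only pulls in as much of it as the current view can actually show.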
2 Comments:
http://developer.nvidia.com/object/siggraph-2008-hair.html
be sure to check out the video :)
hey, I just wanted to thank you for posting all the amazing pictures of textures. I really appreciate it :)