Cloth is hard to simulate, yet it's important in gaming, scientific analysis, and CGI. That's why researchers at Berkeley and Carnegie Mellon have spent six months exhaustively exploring the possible configurations of a single cloth robe on a cute little animated figure, reducing error and producing some of the nicest simulated cloth you'll see today. They report their findings in a paper being presented at SIGGRAPH today.
“The criticism of data-driven techniques has always been that you can’t pre-compute everything,” said Adrien Treuille, associate professor of computer science and robotics at Carnegie Mellon. “Well, that may have been true 10 years ago, but that’s not the way the world is anymore.”
The cloth you see above is made of 29,000 vertices and rendered at 60 frames per second. It flows and moves just like real cloth because every possible motion has been precomputed and organized into a sort of graph of possible vertex positions. Why is this important? Because it allows clothing to be simulated on a human body online, it can make games far cooler than they are now, and the technology can be used to see how materials will perform in various configurations, different weather patterns, and the like. In short, it gives virtual robots real clothes.
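To give a rough sense of the data-driven idea (a minimal sketch only — the pose parameterization, function names, and toy "simulation" here are illustrative assumptions, not the authors' actual method): the expensive cloth solve happens once, offline, over sampled poses; at runtime you just look up and blend the nearest precomputed configurations instead of simulating.

```python
import math

def precompute_states(poses, simulate):
    """Offline phase: run the expensive simulation once per sampled pose."""
    return {pose: simulate(pose) for pose in poses}

def playback(pose, table):
    """Online phase: interpolate between the two nearest precomputed poses."""
    keys = sorted(table)
    if pose <= keys[0]:
        return table[keys[0]]
    if pose >= keys[-1]:
        return table[keys[-1]]
    # Find the bracketing samples and blend their vertex data linearly.
    for lo, hi in zip(keys, keys[1:]):
        if lo <= pose <= hi:
            t = (pose - lo) / (hi - lo)
            a, b = table[lo], table[hi]
            return [(1 - t) * x + t * y for x, y in zip(a, b)]

# Toy stand-in for a cloth solver: vertex heights as a function of pose angle.
fake_sim = lambda pose: [math.sin(pose + i) for i in range(4)]

table = precompute_states([0.0, 0.5, 1.0], fake_sim)
frame = playback(0.25, table)  # cheap table lookup instead of a live solve
```

The runtime cost is a sorted lookup and a blend, which is why playback of tens of thousands of vertices at 60 fps becomes plausible once the offline exploration is done.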
Can we expect to see this technology in games any time soon? Not on current consoles.
It’s quite cute to see these little figures move inside robes and “casual” clothing, and I’d say we’re just a little past the uncanny valley, at least when it comes to clothing. With a few more months of rendering, I wonder what they could do with floppy bellies and arm fat?