You’re in a run-down shopping mall and the perp is running towards you, shooting as he comes. You take aim and fire off a few rounds, which hit him in the chest, punching him off his feet at the top of the escalator. Down he goes, his limbs jerking spasmodically as he crumples at each impact. Not real life of course, but just another wipeout in the gore fest that is Judge Dredd: Dredd vs. Death, a game by Rebellion that relies heavily on ragdoll physics for depicting dynamic human movement.
The technology is a high-profile development in the race to bring realism and excitement to an ever-hungry and ever more jaded game-buying public. It involves rigging a character with loosely defined constraints that allow spinal, head, arm, and leg joints to bend and flop realistically in response to forces such as bullet impacts or gravity, much like a crash-test dummy in a seat-belt advert.
Halo, the long-delayed Bungie game that eventually saw the light of day on the Microsoft Xbox, has won acclaim for its successful use of ragdoll. Current and future games such as the Hitman series, the Unreal Tournament series, Max Payne 2, Half-Life 2, Painkiller, Far Cry and S.T.A.L.K.E.R., along with a whole host of other forthcoming first-person shooters, use it for dynamic character animation.
“It’s used because it’s cool,” says Steven Collins, CTO and founder of Havok, the world’s leading provider of ragdoll technology to the game industry. “There are two key related reasons why people use it. The first is fidelity. It used to be okay to have corpses in a game lie flat, balancing on their ankles off an edge of a cliff, or have dead guys embedded in walls or in other dead guys. Now people want to see a realistic interaction with the game environment. The body should land fully on the ground and react to the shape of the ground, rolling down staircases, sliding off ledges, landing in a heap and so on. The second reason is throughput. Games are getting really complex to build with large teams and vast amounts of content. A skilled animator is very capable of hand-animating every required death animation for a game, but it will take them a very, very long time.”
According to developers like Collins, ragdoll simulation has reached the level of quality that allows the animator to let the physics engine take over and resolve the death sequence. “They still want control, but not total control,” he says.
“It significantly adds to the players’ sense of being there, the immersion,” continues Collins. “The fact that a dead guy is still there, and doesn’t just dissolve or disappear, but is persistent within the game and available for some further beating, as well as the fact that it reacts differently each time depending on the context, is great. We’ve seen players simply play with the ragdolls for ages, trying to pile them up or get them into weird positions. Immersion in the story is becoming increasingly important and huge strides have been made on the graphics side. It’s time to see similar strides in the animation and dynamics.”
Of course, like many elements of the games world, it’s been in development for some time. “Back in 1999, I began to develop the ragdoll code used in Hitman: Codename 47,” says Thomas Jakobsen, head of R&D at IO Interactive (IOI), speaking of the company’s first game, released in 2000. “I believe it might actually have been the first computer game to feature fully articulated, interactive ragdolls. We chose to implement this kind of physics in order to give Hitman its own style and to provide more variation and realism in a game where a lot of people get shot.”
“Before Hitman shipped, I hadn’t seen it in a game before,” continues Jakobsen. “However, I had seen CGI movies of falling people made with specialized offline animation tools, but they weren’t real-time and certainly not interactive. Even so, they looked sweet, so this was obviously the way to go.”
Rather than use off-the-shelf tools, Jakobsen’s ragdoll code is part of IOI’s proprietary game engine, Glacier. “The physics code itself is implemented using algorithms inspired from molecular simulation – with a twist,” he explains. “Although different from their approaches in many aspects, a lot of inspiration came from scientific articles.”
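The molecular-simulation “twist” Jakobsen mentions is widely understood to be Verlet integration with iterative constraint relaxation: particles are advanced using their previous positions in place of explicit velocities, and joints are enforced by repeatedly projecting particle pairs back to a fixed separation. The two-dimensional sketch below illustrates that idea; all names and numbers are illustrative, not taken from Glacier.

```python
# Minimal sketch of Verlet integration plus distance constraints, the
# molecular-dynamics-style scheme widely associated with Jakobsen's work.
# Illustrative only: names and numbers are not from Glacier.

GRAVITY_Y = -9.81
DT = 1.0 / 60.0

class Particle:
    def __init__(self, x, y):
        self.pos = (x, y)
        self.prev = (x, y)  # the previous position stands in for velocity

def verlet(p):
    """Advance one particle: x' = 2x - x_prev + a * dt^2."""
    x, y = p.pos
    px, py = p.prev
    p.prev = p.pos
    p.pos = (2 * x - px, 2 * y - py + GRAVITY_Y * DT * DT)

def satisfy_distance(p1, p2, rest):
    """Nudge two particles so they sit exactly `rest` apart."""
    dx = p2.pos[0] - p1.pos[0]
    dy = p2.pos[1] - p1.pos[1]
    d = (dx * dx + dy * dy) ** 0.5 or 1e-9
    corr = 0.5 * (d - rest) / d
    p1.pos = (p1.pos[0] + dx * corr, p1.pos[1] + dy * corr)
    p2.pos = (p2.pos[0] - dx * corr, p2.pos[1] - dy * corr)

# A single "bone": two particles a fixed 0.5 units apart, falling freely.
hip, knee = Particle(0.0, 1.0), Particle(0.5, 1.0)
for _ in range(60):          # one simulated second
    verlet(hip)
    verlet(knee)
    for _ in range(3):       # a few relaxation passes per frame
        satisfy_distance(hip, knee, 0.5)
```

Because velocity is stored implicitly in the two positions, penetrations can be resolved by simply moving a particle out of the obstacle, which is a large part of why the scheme is so stable in games.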
IOI is now part of Eidos. The effects are used in all the IOI games, including Hitman: Codename 47, Hitman 2: Silent Assassin, Freedom Fighters, and the forthcoming Hitman Contracts. “The nice thing is that it’s not just about eye-candy and wow effects,” says Jakobsen. “In the Hitman series, it’s also a part of the gameplay that you can actually drag dead people around. The nice death animations really add to the atmosphere. As for the development process, using ragdolls saves a lot of animation work.”
Havok has been delivering ragdolls to its clients for nearly four years, and its technology is used in over 100 titles – many of which are yet to ship. The process involved in adding the Havok technology generally starts by using a plug-in for 3DS Max or Maya to create the ragdoll in the first instance.
Character artists need to assign parameters like mass and friction to each body part (head, upper arm, lower arm, wrist, upper thigh, and so on), then define the basic biomechanics of the character – such as working out the range of motion for each body joint.
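Concretely, the data the artist authors amounts to per-part masses and frictions plus per-joint ranges of motion. The structure below is a hypothetical illustration, not Havok’s actual schema:

```python
# Hypothetical sketch of a ragdoll rig description; field names are
# illustrative, not Havok's actual data format.
from dataclasses import dataclass

@dataclass
class BodyPart:
    name: str
    mass: float      # kilograms
    friction: float  # 0..1, how readily the part slides on surfaces

@dataclass
class Joint:
    parent: str
    child: str
    min_angle: float  # degrees: the joint's biomechanical range of motion
    max_angle: float

rig = {
    "parts": [
        BodyPart("upper_arm", 2.5, 0.6),
        BodyPart("lower_arm", 1.8, 0.6),
        BodyPart("head", 5.0, 0.5),
    ],
    "joints": [
        # An elbow: bends forward up to 150 degrees, never backwards.
        Joint("upper_arm", "lower_arm", 0.0, 150.0),
    ],
}
```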
“We provide defaults, but generally the character artist tweaks these,” says Collins. “The game engineers will integrate our libraries into the game code, so that when the output from the modelling package is read into the game engine, our software can build the physics representation for the character in the game, using the parameters specified by the artist. The ideal situation is that the character rigger or animator builds the skeleton for the character in the game while in parallel they will be creating the physical rig.”
In some game studios where the use of physics is still in its early stages, it will often be the game engineers who create the physical representation for the character, and then tweak the behaviour based on feedback from the game designer, producer, or artist. “There’s still a lot of R&D underway to improve the simulation and to extend the control the animator has over the biomechanics of the character physics,” says Collins. “We expect the tools at this stage to become increasingly important.”
Collins continues: “When the game starts and our software implements the full physical model of the motion of the character, we do collision detection for each body part with the game level geometry and figure out how the character moves under gravity or reacting to shots or whatever the game requires.”
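Reduced to a single body part and a flat floor, the per-frame work Collins describes looks something like the toy loop below. A ground plane stands in for the level geometry, and all values are illustrative:

```python
# Toy per-frame update: a body part falls under gravity and is collided
# against the level geometry, here simplified to a ground plane at y = 0.

GROUND_Y = 0.0
RESTITUTION = 0.3   # fraction of speed kept after an impact
DT = 1.0 / 60.0

def step(part):
    """part holds 'y' (height) and 'vy' (vertical speed); advance one frame."""
    part["vy"] -= 9.81 * DT           # gravity
    part["y"] += part["vy"] * DT      # integrate position
    if part["y"] < GROUND_Y:          # collision with the level
        part["y"] = GROUND_Y
        part["vy"] = -part["vy"] * RESTITUTION

head = {"y": 2.0, "vy": 0.0}
for _ in range(600):                  # ten simulated seconds
    step(head)
# After the bounces die away, the part comes to rest on the ground.
```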
The ragdolls in Max Payne 2 developed by Remedy are a good example. “Having implemented the ragdolls, the guys at Remedy then experimented with many different effects applying forces to the ragdolls to get really dramatic effects for the game,” explains Collins.
IOI has a slightly different approach to implementing ragdoll constraints. “The artists simply create their characters with the usual skeleton for animation without worrying specifically about the ragdoll technology,” says Thomas Jakobsen. “The characters are modelled in the usual way using 3DS Max and in-house plug-ins for export. The game code then automatically analyzes the skeletal structure at start-up and adjusts collisions and movement accordingly. From the game code, it’s possible, for example, to apply a certain amount of force at a specific position on the body. At the moment, most behaviour is hardwired in code, but increasingly more options are becoming available to the graphics artists at design time.”
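The reason a force applied at a specific position on the body matters is that an off-centre hit imparts spin as well as translation. A toy two-dimensional illustration of that rigid-body behaviour (nothing here is IOI code; the numbers are made up):

```python
# Toy 2D rigid-body impulse: an impulse applied off the centre of mass
# changes both linear and angular velocity. Illustrative values only.

def apply_impulse(body, impulse, point):
    """Apply impulse (jx, jy) at world-space point (px, py)."""
    jx, jy = impulse
    rx = point[0] - body["com"][0]   # offset from the centre of mass
    ry = point[1] - body["com"][1]
    body["vel"] = (body["vel"][0] + jx / body["mass"],
                   body["vel"][1] + jy / body["mass"])
    # 2D torque is the cross product r x j
    body["ang_vel"] += (rx * jy - ry * jx) / body["inertia"]

torso = {"mass": 40.0, "inertia": 5.0, "com": (0.0, 1.0),
         "vel": (0.0, 0.0), "ang_vel": 0.0}

# A hit at shoulder height, 0.4 units above the centre of mass:
apply_impulse(torso, (10.0, 0.0), (0.0, 1.4))
# The torso now both translates (vel = (0.25, 0.0)) and spins
# (ang_vel = -0.4 * 10 / 5 = -0.8).
```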
“The overall key thing to consider is the impact of ragdolls on the game design,” states Steve Collins. “You should have a good idea, as a game designer, how many ragdolls are going to be required and the level of realism required of those ragdolls.” The numbers of ragdolls and the fidelity of the simulation will affect the CPU power required, so this needs to be balanced with the rest of the game requirements. Each ragdoll part could be modelled, or a simplified representation could be used.
Currently, ragdoll is considered a strong marketing feature, and is pretty much expected of a first-person shooter, but use of the technology is spreading to other game genres as the flexibility and quality of the simulation increases. “Most in the industry recognize that the realistic portrayal of game characters is the most important technical challenge we now face,” reveals Steve Collins. “At this point, we need to rely on ‘smart’ content, where much of the behaviour is built into the model, and the game developers are more or less directing the model at a higher level.”
However, there are some products and places worth investigating if you want to take a closer look. Contact the companies involved for information.
The state of play
Looking to the future, it’s clear that game developers and players alike are anticipating even more realism. Thomas Jakobsen sees the technology incorporating more biomechanical modelling, resulting in ragdolls that aren’t completely dead. “This is already seen today to some extent,” he says, adding that he is looking forward to a combination of ragdolls, animation, inverse kinematics, better character interaction, and physical AI. “Technological capabilities like these will enable the game designers to focus on more important issues like gameplay and storyline, and obtain a higher degree of overall sophistication.”
“The physical representation of the game character is going to increase in complexity,” adds Steve Collins. “We’ll start to see physical simulation being mixed with traditional keyframe animation to achieve short-term effects, like the impact of a bullet, or damage modelling – a limp dangling arm, for example. We’ll see more pervasive clothing simulations, increasingly sophisticated facial animation and hair simulation. Then we’ll start to see the use of behavioural simulation where the muscles of the game characters are being driven by AIs. Characters will react with greater sophistication to the game world. They’ll stagger and regain balance, they’ll anticipate falls and try to protect themselves. They’ll be able to pick themselves up after having fallen over.”
Collins says this will start to impact on the design of the game AI and the complexity of the interactions of the characters with the world and with other characters. “Ultimately it’s fun to think about the logical extrapolation of all this to the point where the game characters are truly virtual actors totally under the control of the game designer, who starts to assume a role very similar to that of a movie director,” he says. “We’re quite a way away from that but we’re starting to see glimpses of the technology that will contribute to making that a reality.”
One such development is the release of Endorphin, a technology that can add AI in the form of behaviours to the character. This means that characters can respond adaptively to their physical environment, trying to keep their balance, for example. According to the developers, Oxford-based NaturalMotion, Endorphin produces sophisticated animations in real time, with qualities and in quantities that are unachievable with plain ragdoll technology.
“Now that ragdolls are a mature technology, we’re able to concentrate on generating the next level of realism and believability,” says Richard Craig-McFeely, sales and marketing director for NaturalMotion. “Using technology derived from Oxford University research on the control of human and animal body motions we’re creating bio-mechanically correct models of humans, while a team of skilled AI engineers implants intelligence, giving our characters the ability to produce truly realistic human behaviour.”
“Endorphin creates animation, even for multiple characters, in real-time,” continues Craig-McFeely. “So it’s a quick process that allows experimentation with the final quality animation and no low-res previews. It really is next-generation technology and a first in the 3D animation market.”
Consumers definitely require more realism and more expressive characters from today’s games, but this is not just a technologist’s dream come true. “There are hard business reasons behind these developments,” says Steve Collins. “We’ve new consoles with ever increasing power, which the console manufacturers require the game developers to exploit. We’ve licence owners, particularly from the movie industry, who want their licences exploited appropriately, so that Harry Potter, for example, looks, walks, sounds, and acts like Harry Potter. We’ve an increasingly sophisticated gamer audience, particularly as the age profile of those playing games widens, leading to a demand for more realistic portrayal of characters and plotlines. This is not to say that all games in the future will be realistic – you’re never going to see a version of Tetris with ragdolls. But there’s a significant and increasing demand for more realism and better characters in the largest market segments – FPS, RTS, adventure, and sports games.”
Most implementations of ragdoll physics in games are developed using in-house custom coding, or middleware for modelling packages such as Discreet 3DS Max, Alias Maya or Softimage|XSI.
Havok creates middleware for games, its current offering comprising a suite of tools called Havok 2. The Havok Game Dynamics SDK includes a character kit that allows you to easily insert physical characters into your game. Localized ragdoll effects and bone-level blending of keyframed animation sequences are supported. It ships with exporters for 3DS Max and Maya. www.havok.com
Reactor is an implementation of Havok for 3DS Max, bringing a complete set of dynamics extensions to Discreet’s modeller. www.discreet.com/3dsmax
RenderWare Physics by Criterion provides the ability to add real-time dynamic behaviour to game objects, allowing characters to roll down a set of stairs. Objects can move, fall, and break in reaction to the player. True-to-life character physics is available, resulting in impressive human impact responses, such as being ‘hit’ by projectiles. RenderWare Physics is available in stand-alone form, or as a component of RenderWare Platform. www.renderware.com
There isn’t a ragdoll system as such in Kaydara’s MotionBuilder or HumanIK. You can apply certain constraints to IK, and characters can interact with their environment and pick up objects in HumanIK Middleware, but there is no built-in AI. However, Kaydara resells a plug-in as part of its Affiliate Product Program called AI.Implant that can handle ragdoll. Developed by BioGraphics, it is also available as a plug-in for Maya and 3DS Max. www.kaydara.com/products/affiliateProducts
Ragdolls for Dummies
Using reactor to add simple dynamic physics to characters.
The Havok-powered reactor technology implemented in 3DS Max 6 adds physicality to scenes. Add weight to a character by selecting all its bones and assigning a value to the Mass in the Physical Properties rollout.
The mass of the staircase is set to zero. The World rollout controls gravity in the scene: as the acceleration due to gravity is 9.81m/s², this is the default negative (downwards) value for the Z direction, displayed as –386.22 max units if you haven’t changed to metric units.
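That –386.22 figure is nothing more exotic than standard gravity converted from metres into Max’s default generic unit, the inch:

```python
# 3DS Max's default generic unit is the inch, so reactor expresses
# standard gravity (9.81 m/s^2) in inches per second squared.
G_METRIC = 9.81             # m/s^2
INCHES_PER_METRE = 39.3701
g_max_units = round(G_METRIC * INCHES_PER_METRE, 2)
print(g_max_units)          # 386.22, applied as a negative Z value
```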
Using the Create panel’s RB Collection in reactor Helpers you can add a rigid body collection to the scene. Previewing the animation at this point will show all the bones tumbling apart once the character hits the stairs. It needs some Constraints to hold it together.
Add a Constraint Solver to the scene from the reactor Helpers list. Then, apply it to the RB Collection before adding a Hinge constraint between each bone. You’ll need to add a Limited constraint to stop the joints rotating a full 360 degrees. This ties some of the bones together.
By default all collisions are enabled between objects, so you have to tell reactor to ignore the collisions between the bones in our character. Then, add Ragdoll constraints to the hips, back, neck, and shoulders, setting maximum and minimum parameters for twist and the volume and angle of rotation of the joint.
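What those minimum and maximum parameters do in the solver is simple to state: on each pass, any joint that has rotated outside its allowed range is pushed back to the nearest limit. Real ragdoll constraints do this on 3D twist and swing cones; the one-dimensional sketch below shows the idea with made-up values:

```python
# One-dimensional sketch of a joint limit: each solver pass clamps the
# joint's angle back into its allowed range. Values are illustrative.

def clamp_joint(angle, min_angle, max_angle):
    """Return the angle pushed back inside the biomechanical limits."""
    return max(min_angle, min(max_angle, angle))

# An elbow limited to 0..150 degrees:
print(clamp_joint(170.0, 0.0, 150.0))  # over-rotation pulled back to 150.0
print(clamp_joint(-20.0, 0.0, 150.0))  # hyper-extension pulled back to 0.0
print(clamp_joint(90.0, 0.0, 150.0))   # angles inside the range pass through
```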
The Ragdoll constraints are added to the Constraint Solver and the character will now fall down the stairs properly. To finish off, hide the bones by only selecting the floor, stairs and character skin to be visible, then click Create Animation.