This article, on the possibilities and problems with virtual actors, still holds good a decade later. It’s very difficult to produce realistic humans (and why bother when, as one of the Mill’s staff said, there are thousands outside the window) – they either have to be perfect, or they fall into the ‘uncanny valley’ of looking nearly, but not quite, right. Instead, virtual actors are either cartoon-like (the route taken by Pixar) or used to add digital extras in post-production.
I don’t suppose many more films have added naked figures to HELP a film get a lower certificate from US censors – as revealed in the box at the end on Stanley Kubrick’s Eyes Wide Shut (a section published in the paper but missing from the version available on the Guardian website).
The Screen Actors’ Guild, the US union to which most film stars belong, is poised to go on strike in June. But one summer release may be pioneering the way to make future thespian walkouts an irrelevance.
Final Fantasy, due to reach UK screens in August, uses state-of-the-art virtual actors – sometimes known as vactors, or synthespians. The film, a science-fiction thriller set in 2065, has been entirely computer-generated by animation specialist Square. Flesh-and-blood actors contribute only basic movement data and their voices.
But anyone hoping to be convinced by the female lead, Aki Ross, is likely to be disappointed. A preview last week in London showed that “she” may look more human than most characters of software born, but there’s still something wrong. Her skin and eyes look lifeless, and when she runs, her movements are somehow too smooth.
Some of these problems also affect Eve Solal, a computer-generated character produced by Parisian animator Attitude Studio. In stills – such as “her” recent cover shoot for women’s magazine Madame Figaro – Solal looks disconcertingly real. But in a short video interview, available on her website, she doesn’t quite convince.
This is because, when computer animators try to simulate human life, they are trying to trick one of our brain’s most reliable processes. “There is a disproportionate amount of mental capacity used for dealing with faces,” says Dr Donald Laming, lecturer in experimental psychology at Cambridge University. “If the face isn’t quite right, it’s not going to work.”
The brains behind Solal and Ross know they aren’t quite there – yet. “Her face is a bit plastic, and her eyes move a little bit too much,” says Rémy Brun, motion capture director at Attitude, of Solal.
Jun Aida, producer of Final Fantasy and president of Hawaii-based studio Square, agrees. “Our goal was not to create photo-real characters. I don’t think technically it’s possible with animation. In still photos, we can.” He sees his film as creating a new category of high-quality animation, rather than as a challenge to live-action movies.
Virtual actors are not even a cheap alternative to the real thing. Aida says the 60,000 hairs on Ross’s head took a fifth of Square’s graphic rendering capacity to produce. “She ended up costing us millions of dollars,” he says. “We should have given her shorter hair.” The resulting hairdo, although one of the more realistic things about Ross, does tend to make her look as if she’s just stepped out of the salon even when cheating death – but perhaps that’s a screen siren’s prerogative.
Hard as faces are to create, realistic body movement is at least as tricky. In Final Fantasy, the producers used a technique called motion capture: real people perform the required actions, such as running or walking, while wearing sensors spread around their bodies. Cameras track the sensors, and the recorded movements are then mapped onto the virtual actor.
But using this process to create convincing movement is fiendishly difficult, according to Chris Ford, senior product manager for Alias Wavefront’s Maya, the software used by many firms to handle three-dimensional models. “Motion capture is true to life, but it is sampled data,” says Ford. “It’s a bit like digital versus analogue [sound].” Motion-captured movements often look odd as a result of poor sampling.
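(Aside: the gist of what Ford is describing can be sketched in a few lines of code. The toy example below is mine, not Square’s or Attitude’s pipeline – the sensor data, sample rate and function names are all invented for illustration – but it shows how a joint on a virtual actor is driven by interpolating between discrete motion-capture samples, which is exactly where the “sampled data” problem creeps in.)

```python
from bisect import bisect_right

# (time in seconds, (x, y, z) position) samples for one wrist sensor, roughly
# as a 120 Hz capture rig might record them -- the numbers are made up
wrist_samples = [
    (0.000, (0.00, 1.10, 0.30)),
    (0.008, (0.01, 1.12, 0.31)),
    (0.017, (0.03, 1.15, 0.33)),
    (0.025, (0.06, 1.19, 0.36)),
    (0.033, (0.10, 1.24, 0.40)),
    (0.042, (0.15, 1.30, 0.45)),
]

def sensor_position(samples, t):
    """Linearly interpolate the sensor's position at time t.

    Anything between two samples is a guess -- the 'digital versus
    analogue' problem Ford describes.
    """
    times = [s[0] for s in samples]
    i = bisect_right(times, t)
    if i == 0:
        return samples[0][1]
    if i == len(samples):
        return samples[-1][1]
    (t0, p0), (t1, p1) = samples[i - 1], samples[i]
    w = (t - t0) / (t1 - t0)
    return tuple(a + w * (b - a) for a, b in zip(p0, p1))

# Drive the virtual actor's wrist joint at the time of film frame 1 (24 fps)
frame_time = 1 / 24
print("virtual wrist at frame 1:", sensor_position(wrist_samples, frame_time))
```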
With Solal, Attitude has got closer than most to convincing human movement, by basing almost all her actions on those of an actress of a similar build; Final Fantasy’s animators, by contrast, have used actors only to model action sequences. “We chose the actress as we would for theatre or cinema,” says Brun. “Often, people use their friends, but acting is 50% of the movement.” Brun uses 12 cameras running Vicon, a system from UK firm Oxford Metrics, and his eight years of experience in the field, which he describes as “very high-tech puppetry”.
Some animators are experimenting with virtual animals. Animals are in some respects more difficult – fur or hair is notoriously hard to get right – but our eyes are less attuned to their imperfections, and the results are often impossible to obtain in real life. Viewers of the latest surreal Guinness television ad may have noticed a group of squirrels who talk and quaff glasses of the black stuff. They are computer-generated.
Sally Goldberg is head of creature animation for Computer Film Company, one of several Soho firms working in this area. She is also virtual mother of the Guinness squirrels. “We built them from scratch,” she says. This is unusual: three-dimensional characters are more often built from a template, as they were for the firm’s work on Eyes Wide Shut.
Goldberg filmed squirrels from an animal rescue centre, and then translated their movements into the three-dimensional models used in the ad. “When he sat there and held a nut, he looked just like he was drinking beer,” she says, of the squirrel she based the main character on.
It is already possible to fill the screen with computer-generated humans and convince the eye – if you’re using thousands of virtual actors, rather than one close up. Mill Film, a few hundred yards south of Computer Film Company, last month won an Oscar for its work on Gladiator, which included a 50,000-strong virtual Roman crowd – an obvious moneysaver for the production company.
It is partly financial concerns that hold back the use of virtual actors as leads. “If money’s no object, it’s possible now,” claims Dave Throssell, head of computer animation at Mill Film. “But I’d ask, ‘are you sure?’ You want a real person, then there’s thousands of them outside this window.” He adds that computer-generating an actor for a whole film would be expensive even compared with a star demanding $20m.
And then there’s the Lara Croft effect. Virtual characters to date have been not so much Final Fantasy as Unrealistic Male Fantasy. The somewhat top-heavy star of the Tomb Raider games has recently been joined by two similarly shaped virtual pop stars, TMmy and T-Babe, created by Glaswegian animators. “Most people who sit in front of computers tend to be male, and that’s what they think about 24 hours a day,” Throssell points out.
Attitude’s Brun says Eve Solal was deliberately designed to break this stereotype. “She’s 1.62 metres [5ft 4in], which is quite short, so she’s not a top model. She’s good-looking in a way, but she’s not Pamela Anderson, or Lara Croft,” Brun says.
The French studio is, however, trying to perfect elements such as Solal’s skin, in its quest to make the character more convincing. But it may be a long task. “In the history of painting, it took several hundred years for artists to learn how to paint in three dimensions, and to draw the human figure and face,” says Cambridge’s Dr Laming, adding that science doesn’t yet fully understand how we read each other’s body language. “It’s as if computer animators are in the position of those early artists.”
Box: Don’t Believe Your Eyes
The legendary film director Stanley Kubrick died in March 1999, as Eyes Wide Shut, his sprawling film starring Tom Cruise and Nicole Kidman, neared completion – but with some vital tasks left undone.
One concerned a sequence in which Cruise cons his way into a high-society orgy, where some explicit shots were expected to stop the film gaining an R certificate in the United States (under which children must be accompanied by adults); a more restrictive rating would often spell commercial failure.
The lengthy and opulently filmed shots would be very difficult to edit, and the normal alternative – of superimposing real actors into the scenes – would be equally tricky, as the sets had been dismantled and the camera was not static.
So Sally Goldberg, whose 10-year career in computer-generated effects has included creating the lobster-like stomach creature in The Matrix, found herself working in secrecy on two computer-generated naked women – in order to mollify the US film-raters. (The figures face away from the camera, and helpfully block most of the censor-worrying action.) “They talked about pot-plants, but went for something that would stand out a bit less,” says Goldberg.
Computer Film Company bought a basic three-dimensional female figure from US model specialist Viewpoint, then “we just made them very beautiful,” Goldberg says. This took work. “Skin has a very particular way of reacting with light – a translucent quality – and it’s very difficult to reproduce.”
Furthermore, the figures could not be frozen as the camera swept past. “There are actions like weight-shifts from foot to foot,” Goldberg says. The firm booked a couple of nude models, to study these movements.
The resulting work, which also includes several cloaked figures, passes the visual Turing test: you don’t realise which figures are computerised extras.
Producing such effects takes computer power as well as design ingenuity. Many film-makers require a picture consisting of 4,000 horizontal lines of pixels, compared with the 625 used by standard UK televisions.
Goldberg says it would take roughly an hour for a 1 GHz processor to produce a single frame of this quality – and film uses 24 frames each second. CFC’s hardware can produce the images a little quicker. “We were looking at 15 hours a shot, for shots lasting a second or two,” she says.
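(A rough back-of-the-envelope check, putting together only the figures quoted above – not CFC’s actual benchmarks:)

```python
# Back-of-the-envelope check on the rendering figures quoted above; nothing
# here is CFC's real pipeline, just the article's numbers put together.

frames_per_second = 24        # film frame rate
hours_per_frame = 1.0         # Goldberg's estimate for one 1 GHz processor
shot_seconds = 2              # "shots lasting a second or two"

frames = frames_per_second * shot_seconds
single_cpu_hours = frames * hours_per_frame
print(f"{frames} frames -> about {single_cpu_hours:.0f} hours on one 1 GHz processor")

# CFC quoted roughly 15 hours for such a shot, so its hardware was
# effectively only a few times faster than a single machine.
cfc_hours = 15
print(f"implied speed-up: about {single_cpu_hours / cfc_hours:.1f}x")
```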