Motion capture is a useful tool for a wide range of applications including animation, visual effects, virtual reality and simulation. The technologies that comprise motion capture have come a long way since their inception and are now commonplace in a variety of industries.
How is a motion capture system deployed, and what are its strengths?
What is a motion capture system?
Motion capture systems traditionally comprise a specialist multi-camera array positioned strategically around a volume of space where the objects to be tracked are present. Enough cameras must cover the volume to accurately pinpoint the location of moving objects equipped with reflective tracking markers.
Tracking software enables specific targets to be recognised, from 3D glasses to hand-held navigation devices to entire human skeletons, UAVs, robots and virtually anything whose movements you wish to capture. Some systems operate with an accuracy down to a fifth of the width of a human hair, allowing very precise data to be captured and recorded for the subject of focus.
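The geometric core of such an array is triangulation: each camera reports a ray towards a reflective marker, and the best-fit intersection of rays from several cameras yields the marker’s 3D position. A minimal least-squares sketch (illustrative only — the function and data here are our own, not part of any Vicon API):

```python
import numpy as np

def triangulate(origins, directions):
    """Least-squares 3D point closest to a set of camera rays.

    origins: (N, 3) camera centres; directions: (N, 3) unit ray directions.
    Minimises the sum of squared distances from the point to each ray.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        P = np.eye(3) - np.outer(d, d)  # projector orthogonal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Two cameras, one metre either side of the origin, sighting a marker at (0, 0, 1)
origins = np.array([[-1.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
dirs = np.array([[1.0, 0.0, 1.0], [-1.0, 0.0, 1.0]])
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
marker = triangulate(origins, dirs)  # ≈ [0, 0, 1]
```

With more cameras the same least-squares form simply accumulates more rays, which is one reason redundant camera coverage of the volume improves accuracy.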
Motion capture, also known as mo-cap, helps to give animated characters a level of realism that makes them feel truer-to-life and more authentic.
To have their movements recorded and mapped, actors wear special motion capture suits, often paired with sophisticated head-mounted camera rigs that record the movements of their faces. These suits are famous for their somewhat humorous and unusual appearance: tightly fitting, covered in geometric markers that help track an actor’s movements, and often sporting white ping-pong-ball-like spheres.
These markers act as reference points that help animators match the recorded performance to simulated movement — mapping the movement of the actor’s shoulder to the character’s, for example. Modern mo-cap suits also tend to incorporate sensors to improve the tracking of their wearers.
How does a motion capture system work?
As a performance is recorded, specialised cameras track the movements of the markers and, with the help of highly specialised software, use them to generate a kind of digital skeleton. This skeleton can then be transposed onto a digital character to form the basis of their movements and expressions, sparing animators the painstaking work of studying certain motions over and over in excruciating detail in order to portray them realistically.
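One small piece of that skeleton-solving step can be illustrated by recovering a joint angle from three labelled markers — a deliberately simplified stand-in for what production solvers actually do:

```python
import numpy as np

def joint_angle(a, b, c):
    """Interior angle at marker b, in degrees.

    For example: the elbow angle given shoulder (a), elbow (b)
    and wrist (c) marker positions in metres.
    """
    u = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Shoulder, elbow and wrist markers of an arm bent at a right angle
elbow = joint_angle([0.0, 0.3, 0.0], [0.0, 0.0, 0.0], [0.25, 0.0, 0.0])  # ~90 degrees
```

Repeating this per joint, per frame, yields the time series of joint angles that drives the digital character’s limbs.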
Motion capture does not need to be a like-for-like translation from actor to character on screen. Indeed, some motion capture performances help to adapt the movements of a human to an animal or fantasy creature expressing human characteristics, for which there is no real-life reference. Similarly, not every part of the actor is transferred onto the character, who will often differ in size and appearance to the actor themselves.
One such example is Vicon’s recent collaboration with Industrial Light & Magic (ILM) on the award-winning Star Wars series The Mandalorian, which highlighted how motion capture technology can be married with in-camera visual effects. Motion capture is used to track the cameras as actors and props perform before large-scale direct-view LED video walls. These walls combine reality with digital extensions in real time, replacing much of the post-production green-screen work of the past and advancing what is possible for filmmakers.
Another film, War for the Planet of the Apes, also utilised motion capture performances by actors including famed mo-cap performer Andy Serkis. In this case, motion capture helped to enhance the expressions of the character Caesar, a highly intelligent chimpanzee, to be closer to those of a human.
Motion capture hardware and software
Motion tracking at its most accurate relies on cutting-edge technology, both in the form of hardware and the software components that support and power them.
Vicon’s motion capture cameras offer more than their standard counterparts. Powerful onboard electronics and sensors give them a range of abilities that make shooting more manageable and ultimately give VFX studios better tools for the finished product. The Vicon Vantage, for instance, detects when it has been physically bumped and recalibrates without user input.
These cameras are able to monitor themselves and provide feedback on the go, making them ideal for reliable data capture in motion tracking.
Additional hardware like calibration wands helps ensure motion capture cameras work as intended and with full accuracy, while control boxes help to connect third-party devices and harmonise the various moving parts of a cohesive motion capture system.
Supporting motion capture software, like Vicon’s Shogun, allows for low-latency motion tracking and other features that shorten the gap between recording actors and producing the immersive final result.
Shogun even helps to refine the minutiae of motion capture performances, with its finger solver features helping the animation of human hands.
The main advantage of advanced motion capture hardware and software is speed. By capturing data quickly and accurately and outputting animation in real time, motion capture workflows are significantly shortened, leaving more time for the finishing touches. This is especially helpful for productions that contain several diverse characters.
What are the applications of Vicon Motion Capture Systems?
Motion capture can be applied in several industries as a tool for scientific research as well as entertainment. Its data capturing capabilities make it as versatile as the minds behind its employment.
The ability of motion capture systems to record and analyse the movement of humans and animals makes it a valuable tool in furthering life sciences knowledge.
One use is gait analysis, which records how a person walks and can use the captured data for both quantitative and observational information. Sensors gather quantitative kinetic data that can’t be merely observed, and medical practitioners can then interpret observational data to inform the context of a patient’s medical condition and their recovery from it.
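As a simplified illustration of the quantitative side of gait analysis, the sketch below estimates stride times and lengths from a single heel marker’s trajectory, treating local minima of its height as heel strikes (real pipelines use far more robust event detection and force-plate data):

```python
import numpy as np

def stride_metrics(heel_y, heel_x, fps):
    """Estimate stride durations (s) and lengths (m) from one heel marker.

    heel_y: vertical trajectory; heel_x: forward trajectory; fps: capture rate.
    Heel strikes are approximated as local minima of the vertical trajectory.
    """
    y = np.asarray(heel_y, dtype=float)
    strikes = [i for i in range(1, len(y) - 1)
               if y[i] < y[i - 1] and y[i] <= y[i + 1]]
    times = np.diff(strikes) / fps                   # stride durations
    lengths = np.diff(np.asarray(heel_x)[strikes])   # stride lengths
    return times, lengths

# Synthetic data: heel bouncing once per second while walking at 1.2 m/s
fps = 100
t = np.arange(0, 3, 1 / fps)
heel_y = 0.05 * (1 - np.cos(2 * np.pi * t))
heel_x = 1.2 * t
times, lengths = stride_metrics(heel_y, heel_x, fps)
```

From these two arrays a clinician can derive cadence, walking speed and left/right asymmetry — the kind of quantitative kinetic data the paragraph above refers to.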
Motion capture in life sciences can further our knowledge of animals and how they move, as well as the impact of strokes and neurological diseases on the human body.
Motion capture is well-known for its uses in virtual production, used in critically acclaimed blockbusters including Avatar and numerous entries in the Marvel Cinematic Universe.
Motion capture is useful for its aforementioned accuracy in capturing the subtleties of actor movement. This helps to not only push the boundaries of VFX in film, but also aids animators in avoiding a phenomenon known as the uncanny valley.
Coined in 1970 by the Japanese roboticist Masahiro Mori, the uncanny valley is a metaphorical and theoretical idea describing the unease the human brain feels when it encounters something human-like that lacks crucial realistic details. It can be triggered by animated characters, robotic objects, and other humanoid entities.
Motion capture helps animators avoid this pitfall that lies between ‘lifelike’ and ‘not lifelike enough’. Vicon’s motion capture systems, with VR scouting, performance capture, green screen, and in-camera VFX, provide industry-leading virtual production solutions that help push VFX further away from this issue.
Motion capture in engineering shifts the focus from the movement of humans to the movement of machines, vehicles, and various components of automated systems. That said, human factors engineering benefits greatly from motion capture too.
Vicon motion capture systems can provide positional data on unmanned aerial vehicles and feed back to control systems for greater accuracy. The effects of real-world tests like earthquake simulators can be measured precisely, and the movement of robotics can be analysed and refined using motion capture data.
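To sketch how positional data might feed back into a control system, here is a minimal proportional-derivative (PD) position loop driven by simulated tracking samples — illustrative only, with made-up gains and none of the state estimation and vehicle dynamics a real UAV controller needs:

```python
class PDController:
    """Minimal PD position controller fed by mocap position samples."""

    def __init__(self, kp, kd):
        self.kp, self.kd = kp, kd
        self.prev_error = None

    def update(self, target, measured, dt):
        error = target - measured
        # Backward-difference derivative; zero on the first sample
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.kd * derivative

# Drive a 1D position toward a 1 m setpoint using simulated 50 Hz mocap samples
ctrl = PDController(kp=4.0, kd=4.0)
pos, vel, dt = 0.0, 0.0, 0.02
for _ in range(500):                       # 10 seconds of closed-loop flight
    accel = ctrl.update(target=1.0, measured=pos, dt=dt)
    vel += accel * dt
    pos += vel * dt
```

The key point the paragraph makes survives even in this toy: the accuracy of the loop is bounded by the accuracy of the position measurements, which is where a sub-millimetre tracking system earns its keep.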
Aerospace companies such as Boeing, NASA and Airbus also employ motion capture to further their research.
Perhaps the newest and least explored avenue for motion capture is virtual reality, where it can make VR experiences even more immersive. With high-precision tracking, a player’s movements in the real world can be translated into the VR experience for closely synchronised character behaviour in the world of the game or experience.
VR developers could also use the analysis of player movements to inform user reactions to events in the VR world and fine-tune the way these events play out based on tester data.
By tracking 3D stereoscopic eyewear in a virtual reality CAVE, for example, the participant can be immersed within the virtual environment with a more natural viewing experience delivered to their eyes. The tracking system ensures the visuals are rendered in relation to the user’s direction of gaze, allowing them to look around a 3D object from any angle as they would in reality and giving a real sense of presence. Furthermore, it is now possible to track multiple users’ perspectives, with innovations like multi-view 3D-capable projection technologies offering VR experiences where collaboration between users is enhanced like never before.
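Conceptually, head-tracked rendering amounts to rebuilding the view transform from the tracked eyewear pose on every frame. A simplified right-handed look-at sketch (the names and values here are our own, not from any particular engine or tracking SDK):

```python
import numpy as np

def view_matrix(eye, target, up=(0.0, 1.0, 0.0)):
    """4x4 right-handed look-at view matrix from a tracked head position."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    f = target - eye
    f /= np.linalg.norm(f)                 # forward
    s = np.cross(f, up)
    s /= np.linalg.norm(s)                 # right
    u = np.cross(s, f)                     # true up
    m = np.eye(4)
    m[0, :3], m[1, :3], m[2, :3] = s, u, -f
    m[:3, 3] = -m[:3, :3] @ eye            # translate the eye to the origin
    return m

# Each tracking frame: rebuild the view from the tracked eyewear position
head = [0.0, 1.7, 2.0]                     # metres, as reported by the tracker
V = view_matrix(head, target=[0.0, 1.0, 0.0])
```

Because the matrix is recomputed from live tracking data every frame, the rendered scene shifts with the viewer’s head, which is what produces the ability to walk around a virtual object as if it were physically present.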
Simulation and motion tracking
Spatial tracking is but one essential component of the VR solutions and other simulation systems integrated by ST Engineering Antycip.
We use a wide range of cutting-edge technologies and technology partners to deliver the most immersive and optimised interactive environments possible to businesses across Europe.
To learn more about motion capture systems and how they can be applied in your organisation, contact us today.