Motion Capture: Freshman Innovation in CIAS
by Sam Finston | published May 23rd, 2017
For the last two semesters, a group of sixteen first-year students have worked non-stop in a small lab on the third floor of the Chester F. Carlson Center for Imaging Science. They are all part of a unique class – if it can even be called a class in the traditional sense – titled the "Innovative Freshman Experience".
An outside observer might have noticed a group of students working diligently, but they certainly wouldn't have seen any lecturing or note-taking. "There's no textbooks, no tests, no quizzes, no finals," explained the course's instructor, Professor Joe Pow of the College of Imaging Arts and Sciences (CIAS).
The introductory course, which is required for first-year Imaging Science majors, also draws students from programs across the university.
The 2016-2017 Class
"Motion capture is using various techniques to track real world motion into virtual world motion."
This year’s project was the creation of a motion capture system. “Motion capture,” explained first-year Imaging Science major Greg Nero, “is using various techniques to track real world motion into virtual world motion.” More specifically, Nero and his classmates created a way to map the movements of a real person’s face and body onto a virtual counterpart.
This is the same sort of technology used in films such as the recently rebooted "Planet of the Apes" series and James Cameron’s "Avatar." Such technology can give computer-generated characters realistic movement.
"[The students] own this project. After we give them the challenge, it's up to them to organize themselves and break up into teams."
As one would expect, learning how to work with motion capture was no easy task. Yet it is something that these first-year students were charged to do in only two semesters. “They own this project,” said Pow. “After we give them the challenge, it's up to them to organize themselves and break up into teams.” Organizing themselves skillfully allows the students to use all of their resources effectively and to be prepared for whatever technical problems arise.
"My job right now is to give them the tools and the material they need to get the job done," said Pow. "I don't stand up and tell them what to do, I don't direct any activities. [The students] are in charge of this."
A variety of majors are represented in the class. According to Pow, students from 17 different majors have participated in the Innovative Freshman Experience over the years, including various Engineering students and even Game Design and Development majors. “As long as you can contribute,” said Pow, “anybody can enroll in the class.”
Creating the Motion Capture System
Since the students in the Innovative Freshman Experience class decided their own roles, they chose to organize themselves in a fashion that leaned on each other's particular strengths. First-year Imaging Science major Peter Jarvis, for instance, contributed more toward the hands-on construction aspects of the project (as opposed to the work that was done digitally), as that better suited his skills.
Jarvis described some of the ways that the class independently organized itself to build the motion capture project from scratch. "We originally broke up into two teams: the real-world team and the virtual-world team," said Jarvis. From there, the real-world team further separated into groups tackling issues such as how to track the face, how to track the body and what methods to use as proxies. Jarvis was involved in tracking the face at first.
"We had to come up with a way of how to track all of the facial features and get them to correspond with Unreal so that they're moving in real time," said Jarvis. Unreal Engine 4, known as UE4, is a game engine developed by Epic Games; it powered the virtual-world side of the project.
To properly capture facial movements, Jarvis and his classmates also utilized a unique piece of software called Faceware Live 2.0. Faceware, the company behind this software, also makes the facial capture tools used by game development companies such as Ubisoft and Crytek. Faceware Live 2.0 was critical to the project because it includes a plugin that allows tracking data to be streamed from a camera directly into the Unreal engine used in their project – all without writing any additional code.
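The class didn't have to write that streaming layer themselves – the plugin handles it – but the general idea of pushing per-frame facial values into a running engine can be sketched roughly as follows. The UDP port, the JSON payload and the blendshape names here are invented for illustration; they are not Faceware's or Unreal's actual protocol.

```python
import json
import math
import socket
import time

# Hypothetical illustration only: stream per-frame facial values as JSON over UDP.
# The port number and blendshape names are invented for this sketch.
ENGINE_ADDRESS = ("127.0.0.1", 9000)

def capture_face_frame(t: float) -> dict:
    """Stand-in for a real tracker: returns made-up blendshape weights in [0, 1]."""
    return {
        "jaw_open": 0.5 + 0.5 * math.sin(t),  # pretend the jaw opens and closes
        "brow_raise_left": 0.2,
        "brow_raise_right": 0.2,
    }

def stream_face_data(frames: int = 90, fps: float = 30.0) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for i in range(frames):
        t = i / fps
        payload = json.dumps({"time": t, "blendshapes": capture_face_frame(t)})
        sock.sendto(payload.encode("utf-8"), ENGINE_ADDRESS)  # one packet per frame
        time.sleep(1.0 / fps)
    sock.close()

if __name__ == "__main__":
    stream_face_data()
```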
Even with this software, there was still the problem of keeping a camera focused on someone’s face while they moved. This required a way to mount a camera to the wearer's head that would stay balanced in place while causing as little discomfort as possible.
In the end, their solution came in the form of a bicycle helmet with an aluminum pipe drilled through it. The contraption they devised has the camera mounted to the front end of the pipe, while the back part acts as a counterbalance.
After his work on the face-tracking headset, Jarvis turned his attention to the body-tracking side of the system.
Tracking the motion of a person’s body required that the class create what’s called "active marker body capture." In other words, an actor would wear a number of differently colored markers on their body, and cameras with color sensors would then track the changing positions of those markers.
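The article doesn't include the class's tracking code, but the core of color-based marker tracking – isolate one marker's color in a camera frame and report where it sits – can be sketched along these lines. The green HSV range and the synthetic test frame are assumptions made for this illustration, using OpenCV rather than whatever tools the class actually chose.

```python
import cv2
import numpy as np

# Illustrative sketch of color-based marker tracking: threshold a frame for one
# marker's color and report the marker's pixel coordinates. The green HSV range
# and the synthetic test frame are assumptions, not the class's actual values.
GREEN_LOW = np.array([45, 100, 100])   # lower HSV bound for a green marker
GREEN_HIGH = np.array([75, 255, 255])  # upper HSV bound for a green marker

def find_marker(frame_bgr: np.ndarray) -> tuple[int, int] | None:
    """Return the (x, y) pixel centroid of the green marker, or None if not seen."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, GREEN_LOW, GREEN_HIGH)  # 255 wherever the color matches
    moments = cv2.moments(mask)
    if moments["m00"] == 0:  # no matching pixels: marker not visible this frame
        return None
    cx = int(moments["m10"] / moments["m00"])
    cy = int(moments["m01"] / moments["m00"])
    return cx, cy

if __name__ == "__main__":
    # Synthetic test frame: a green dot standing in for a marker on an actor's body.
    frame = np.zeros((480, 640, 3), dtype=np.uint8)
    cv2.circle(frame, (250, 300), 10, (0, 255, 0), -1)  # BGR pure green
    print(find_marker(frame))  # expected output near (250, 300)
```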
The color of each marker corresponded to the part of the body it was attached to. Additionally, details like the orientation of the actor and their position on the ground were tracked by an Arduino and a rotary encoder system, respectively. An Arduino is a microcontroller – a small chip holding a functional computer that can be programmed to do just about anything. In this case, the Arduino was used to translate the yaw, pitch and roll of real-world motion into the virtual space.
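The article doesn't show how those three angles become motion in the virtual world, but a common approach is to build a rotation from the yaw, pitch and roll and apply it to the virtual character. Here is a minimal sketch, assuming the angles arrive in degrees and follow a Z-Y-X rotation order – both assumptions, since the class's conventions aren't described.

```python
import numpy as np

def rotation_from_ypr(yaw_deg: float, pitch_deg: float, roll_deg: float) -> np.ndarray:
    """Build a 3x3 rotation matrix from yaw (about Z), pitch (about Y) and roll (about X).

    The degree units and the Z-Y-X order are assumptions for this sketch; the
    convention actually used would depend on the sensor and the engine.
    """
    yaw, pitch, roll = np.radians([yaw_deg, pitch_deg, roll_deg])
    rz = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0,          0.0,         1.0]])
    ry = np.array([[ np.cos(pitch), 0.0, np.sin(pitch)],
                   [ 0.0,           1.0, 0.0          ],
                   [-np.sin(pitch), 0.0, np.cos(pitch)]])
    rx = np.array([[1.0, 0.0,           0.0          ],
                   [0.0, np.cos(roll), -np.sin(roll)],
                   [0.0, np.sin(roll),  np.cos(roll)]])
    return rz @ ry @ rx

if __name__ == "__main__":
    # A 90-degree yaw should swing a point on the +X axis around to the +Y axis.
    forward = np.array([1.0, 0.0, 0.0])
    print(rotation_from_ypr(90, 0, 0) @ forward)  # approximately [0, 1, 0]
```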
Rotary encoders are small devices that read rotational motion, which can then be converted into x and y coordinates (the arithmetic for that conversion is sketched below). All of these smaller systems worked in combination just to track the body.
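If the encoder rides on a wheel, the usual arithmetic is: distance = (ticks ÷ ticks per revolution) × wheel circumference, with that distance then split into x and y using the current heading. The wheel size, the encoder resolution and the reuse of the Arduino's yaw reading as the heading below are assumptions for illustration, not details from the project.

```python
import math

# Illustrative dead-reckoning sketch: turn encoder ticks plus a heading angle into
# x/y floor coordinates. The wheel radius, ticks-per-revolution and the idea of
# reusing the yaw reading as the heading are assumptions, not the class's design.
WHEEL_RADIUS_M = 0.03        # assumed 3 cm wheel attached to the encoder
TICKS_PER_REVOLUTION = 360   # assumed encoder resolution

def update_position(x: float, y: float, delta_ticks: int,
                    heading_deg: float) -> tuple[float, float]:
    """Advance (x, y) by the distance implied by delta_ticks along heading_deg."""
    distance = (delta_ticks / TICKS_PER_REVOLUTION) * (2 * math.pi * WHEEL_RADIUS_M)
    heading = math.radians(heading_deg)
    return x + distance * math.cos(heading), y + distance * math.sin(heading)

if __name__ == "__main__":
    x, y = 0.0, 0.0
    # One full wheel revolution while facing 90 degrees should move ~0.188 m along +y.
    x, y = update_position(x, y, delta_ticks=360, heading_deg=90.0)
    print(round(x, 3), round(y, 3))  # approximately 0.0 0.188
```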
While Jarvis focused on licensing software and the physical aspects of the headset and frame, his contributions were just one part of a much bigger project. The many technological feats accomplished in creating all of the components of the final system could not have been achieved without a full, organized team of dedicated students.
Like every project made in this class before it, the motion capture system built this year was created with plans to display it at Imagine RIT. After that, according to Pow, it was disassembled, and its parts will likely find their way into future projects. Unfortunately, the structure was simply too large to keep around. Even so, the knowledge and experience gained by all those who worked on the project will last a lifetime.