
CSUSB Motion Capture Projects

The College of William & Mary Motion Capture Project: Infinite Impact

The Multimedia & Immersive Technologies (MIT) department, xREAL Lab, and Theater Arts Department collaborated with the College of William & Mary on Infinite Impact, an immersive digital performance and VR experience developed through the Art and Science Exchange. The project was conceived by Kristi Papailler, Theater Arts Professor and Artistic Director of CSUSB's MoCap Studio, and Dr. Omiyẹmi (Artisia) Green, Director of the Art & Science Exchange at the College of William & Mary. The initiative brought together faculty, staff, and students across disciplines to explore motion capture, real-time 3D environments, and virtual production as tools for interdisciplinary storytelling, historic preservation, and experiential learning.

A woman in a blue blouse and headwrap stands on a stage, extending her arm and pointing forward while looking in that direction. Behind her, another person wearing a black motion capture suit with tracking markers and a head-mounted device looks the same way. The setting appears to be a theater or studio space, with a visible exit sign, a maximum occupancy sign, and stage flooring.

The project, led by Professor Kristi Papailler, brought theatre arts faculty and students together with the MIT/xREAL Lab's core infrastructure, technical support, experiential learning programs, and digital performance production workflows. Bobby Laudeman and Yutong Liu led the production pipeline, overseeing motion capture and live performance integration. MIT videographer and 3D modeler Francisco Casillas and student assistant Luic Semmens developed the 3D environments. Dimitry Astakhov filmed the onsite process. Theater performance student Anthony Blackwell Tallent led character development, including the delivery of a fully realized MetaHuman asset. Liliana Hernandez served as a motion capture student assistant, supporting recording and production workflows. Theatre technology students Jacob Neer and Scarlet Rodriquez served as Infinite Impact production assistants, supporting personnel during filming.

A central component of the project was the work of Lacroy “Atlas” Nixon, the Inaugural Poet Laureate of Williamsburg, Virginia, who wrote and performed Infinite Impact, a selection of three original poems, directed by Professor Kristi Papailler with assistant direction by Yutong Liu. The production featured both a live performance and a corresponding "playable" simulation with an AI MetaHuman representation, offering real and/or animated versions of the Infinite Impact experience and demonstrating how performance can extend into digital, character-driven storytelling.

The project progressed through an active production phase, with defined deliverables, technical requirements, and formal production goals guiding development. Coordination between Dr. Green and Professor Papailler ensured alignment of logistics and technical needs.

A stylized 3D-rendered scene shows a man in a black T-shirt and jeans walking through a city street at night. Tall buildings with glowing windows surround him, and the environment has a colorful, slightly distorted visual effect. The road beneath him features bold red and blue patterns, and he appears to be mid-step with one hand raised as if interacting with something or moving cautiously.

The motion capture team participated in a featured workshop at the College of William & Mary on March 23, 2026, where the team delivered a live motion capture demonstration to a full audience. This workshop was the opening experience of the 2nd Annual Arts and Science Exchange at William & Mary. The presentation included a live-streamed MoCap premiere of Lacroy “Atlas” Nixon's newest work, a cinematic rendering of Infinite Impact, and the final production featuring three poems, each set within a distinct, fully realized 3D environment. Attendees also experienced an immersive virtual reality component, allowing the performance to be explored interactively; the work was presented both as a cinematic 2D video and within VR environments.

Two people stand on a stage facing a large projection screen that displays a 3D-rendered man in a city environment. The stage is set up with motion capture equipment, including cameras on tripods and tracking devices. The surrounding space is dark with black curtains, and the individuals appear to be observing or reviewing the virtual scene.

The audience response was highly positive. The integration of live performance, motion capture, and digital visualization created a compelling format that resonated with both arts and sciences audiences. The showcase generated interest from peer institutions and initiated discussions for continued collaboration with the College of William & Mary.

This project highlights MIT’s commitment to immersive media as a platform for collaborative production, applied research, and student-centered learning. By aligning faculty leadership, technical expertise, and student participation, CSUSB continues to expand innovative approaches to digital storytelling while strengthening cross-institutional partnerships.