RealSense & ZED Support Added in OPT V2.2

Support for the Stereolabs ZED and Intel RealSense cameras has been added in V2.2 of OpenPTrack, along with preliminary support (people tracking only) for the Microsoft Azure Kinect. The Kinect V2 continues to be supported as well. V2.2 also includes updates to the underlying software stack, to Ubuntu 18.04 […]


New OPT Installation at UCLA School of Theater, Film and Television

OpenPTrack has been installed in the Little Theater at the UCLA School of Theater, Film and Television (UCLA TFT). This is the second semi-permanent OPT deployment currently at TFT; the other has been installed in the School’s TV Studio #3 since 2015, supporting software development by UCLA REMAP researchers and interactive artworks created by TFT students. […]


New Performances with OPT at Noviembre Electrónico, Buenos Aires

Three new performances using OpenPTrack as the interface between movement and media will run during the Noviembre Electrónico festival at the Centro Cultural San Martín in Buenos Aires, Argentina. The pieces were developed during UCLA REMAP and cheLA’s Laboratorio de Interactividad Corporal (Body Interactivity Laboratory), which got underway this past February. Created by […]


OPT Pose Recognition

As part of the V2 Gnocchi update, OpenPTrack now uses machine learning for pose recognition alongside person tracking and its new object tracking capabilities. OPT pose recognition extends the OpenPose skeletal tracking library to multiple cameras, and includes the ability to train the system to detect unique poses. Early adopters include UCLA and Indiana University STEP researchers, […]
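
To give a rough sense of how a system can be trained to detect unique poses from skeletal keypoints, here is a minimal, hypothetical sketch (not OpenPTrack’s actual pipeline): each pose is summarized as a vector of joint angles, training averages those vectors over labeled examples, and recognition picks the nearest template within a distance threshold. The joint names and the max_dist value are illustrative assumptions.

```python
import numpy as np

def joint_angles(kp):
    """Summarize a skeleton (dict: joint name -> (x, y)) as a few joint
    angles, so the descriptor ignores where the person stands and how
    large they appear."""
    def angle(a, b, c):
        # Angle at joint b between segments b->a and b->c, in radians.
        v1, v2 = np.subtract(a, b), np.subtract(c, b)
        cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
        return float(np.arccos(np.clip(cos, -1.0, 1.0)))

    return np.array([
        angle(kp["r_wrist"], kp["r_elbow"], kp["r_shoulder"]),  # right elbow
        angle(kp["l_wrist"], kp["l_elbow"], kp["l_shoulder"]),  # left elbow
        angle(kp["r_elbow"], kp["r_shoulder"], kp["r_hip"]),    # right shoulder
        angle(kp["l_elbow"], kp["l_shoulder"], kp["l_hip"]),    # left shoulder
    ])

def train_poses(labeled_examples):
    """labeled_examples: {pose name -> list of keypoint dicts}.
    Each pose template is simply the mean joint-angle vector of its examples."""
    return {name: np.mean([joint_angles(kp) for kp in examples], axis=0)
            for name, examples in labeled_examples.items()}

def recognize(kp, templates, max_dist=0.5):
    """Return the closest trained pose name, or None if nothing is close enough."""
    v = joint_angles(kp)
    name, dist = min(((n, np.linalg.norm(v - t)) for n, t in templates.items()),
                     key=lambda item: item[1])
    return name if dist <= max_dist else None
```

In OpenPTrack itself, the skeletons come from OpenPose and from multiple cameras; the sketch above only illustrates the template-training and matching idea in the abstract.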


Real-time Movement Analysis – OpenMoves

OpenPTrack senses where people are, but not how they move. Interpreting human motion data in real time is an important capability for mixed reality applications, and that’s where OpenMoves comes in. Being developed by UCLA REMAP, OpenMoves is a new toolset for generating higher-level short-time features, as well as learning and recognizing […]
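
As a rough illustration of what short-time movement features can look like (a hypothetical sketch, not OpenMoves itself), the snippet below keeps a sliding window of timestamped (x, y) positions for one tracked person and derives simple descriptors such as mean speed, peak speed, and path straightness. The window length and feature names are assumptions made for the example.

```python
from collections import deque
import numpy as np

class ShortTimeFeatures:
    """Sliding-window movement features for a single tracked person.
    Feed it timestamped (x, y) positions (e.g., the centroid of a track)
    and it returns a small dictionary of short-time descriptors."""

    def __init__(self, window=30):
        self.samples = deque(maxlen=window)   # recent (t, x, y) samples

    def update(self, t, x, y):
        self.samples.append((t, x, y))
        if len(self.samples) < 3:
            return None   # not enough history yet

        ts = np.array([s[0] for s in self.samples])
        pts = np.array([s[1:] for s in self.samples], dtype=float)

        steps = np.diff(pts, axis=0)          # displacement per sample
        dts = np.diff(ts) + 1e-9              # avoid division by zero
        speeds = np.linalg.norm(steps, axis=1) / dts

        path_len = float(np.linalg.norm(steps, axis=1).sum())
        net_disp = float(np.linalg.norm(pts[-1] - pts[0]))

        return {
            "mean_speed": float(speeds.mean()),
            "peak_speed": float(speeds.max()),
            "mean_accel": float(np.diff(speeds).mean()),
            # 1.0 = moving in a straight line, near 0.0 = milling in place
            "straightness": net_disp / (path_len + 1e-9),
        }
```

Feature dictionaries like these are one plausible input to the learning and recognition stage the post describes.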


OpenPTrack V2 at Laboratorio de Interactividad Corporal

OpenPTrack is now being used for the ongoing Laboratorio de Interactividad Corporal (the Body Interactivity Laboratory) at cheLA in Buenos Aires, Argentina. The Lab got underway this past February, and is developing three new performance pieces that “explore concepts, implications and technologies of physical tracking and augmented reality,” particularly, “as they concern the […]


Docker for OpenPTrack V2

OpenPTrack Version 2 (Gnocchi) can now be run as a Docker container. The newly Dockerized OPT V2 targets anyone who does not want or need to compile OpenPTrack and its dependencies from source. It provides separate, pre-built images of OpenPTrack and its dependencies for Ubuntu Linux 16.04. To run OpenPTrack, all […]


Arranging Gardens and Sending Bees to Flowers with OPT Gnocchi

Indiana University researchers used OpenPTrack’s new object tracking and pose recognition this past spring to help first and second graders learn about the complex system of honeybee foraging. Dr. Joshua Danish from the Learning Sciences program led the research team, as part of the NSF-funded Promoting Learning through Annotation of Embodiment (PLAE) project, which engages students in embodied sociodramatic play. […]