Yearly Archives: 2018


New Performances with OPT at Noviembre Electrónico, Buenos Aires

Three new performances using OpenPTrack as the interface between movement and media will run during the Noviembre Electrónico festival at the Centro Cultural San Martín in Buenos Aires, Argentina. The pieces were developed during UCLA REMAP and cheLA's Laboratorio de Interactividad Corporal (Body Interactivity Laboratory), which got underway this past February. Created by […]


OPT Pose Recognition

As part of the V2 Gnocchi update, OpenPTrack now uses machine learning for pose recognition alongside person tracking and its new object tracking capabilities. OPT pose recognition extends the OpenPose skeletal tracking library to multiple cameras, and includes the ability to train the system to detect unique poses. Early adopters include UCLA and Indiana University STEP researchers, […]
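To make the idea of "training the system to detect unique poses" concrete: a pose from a skeletal tracker arrives as a set of joint coordinates, and recognizing it means comparing those coordinates to labeled examples. The sketch below is not OpenPTrack's actual recognizer or API; it is a minimal nearest-centroid illustration in plain Python, with hypothetical three-joint toy skeletons standing in for real OpenPose keypoints.

```python
import math

def normalize_pose(keypoints):
    # Center the joints on their centroid and scale to unit norm,
    # so matching is invariant to where the person stands and how big they are.
    n = len(keypoints)
    cx = sum(x for x, _ in keypoints) / n
    cy = sum(y for _, y in keypoints) / n
    centered = [(x - cx, y - cy) for x, y in keypoints]
    scale = math.sqrt(sum(x * x + y * y for x, y in centered)) or 1.0
    return [(x / scale, y / scale) for x, y in centered]

def distance(a, b):
    # Euclidean distance between two normalized joint sets.
    return math.sqrt(sum((ax - bx) ** 2 + (ay - by) ** 2
                         for (ax, ay), (bx, by) in zip(a, b)))

def train_centroids(examples):
    # "Training" here is just averaging the normalized examples of each label.
    centroids = {}
    for label, poses in examples.items():
        normed = [normalize_pose(p) for p in poses]
        centroids[label] = [
            (sum(p[j][0] for p in normed) / len(normed),
             sum(p[j][1] for p in normed) / len(normed))
            for j in range(len(normed[0]))
        ]
    return centroids

def classify(keypoints, centroids):
    # Return the label whose centroid is nearest to the query pose.
    q = normalize_pose(keypoints)
    return min(centroids, key=lambda lbl: distance(q, centroids[lbl]))

# Hypothetical 3-joint skeletons (head, left hand, right hand): arms up vs. down.
examples = {
    "arms_up":   [[(0, 2), (-1, 3), (1, 3)], [(0, 2), (-1, 3.1), (1, 2.9)]],
    "arms_down": [[(0, 2), (-1, 1), (1, 1)], [(0, 2), (-1, 0.9), (1, 1.1)]],
}
centroids = train_centroids(examples)
print(classify([(0, 2), (-1, 2.9), (1, 3.0)], centroids))
```

A real deployment would use the full OpenPose joint set fused across cameras, but the training-by-example workflow is the same shape: collect labeled poses, build a template per label, match at runtime.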


Real-time Movement Analysis – OpenMoves

OpenPTrack senses where people are, but not how they move. That’s where OpenMoves comes in. Interpreting human motion data in real time is an important capability for mixed reality applications. Being developed by UCLA REMAP, OpenMoves is a new toolset for generating higher-level short-time features, as well as learning and recognizing […]
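To give a feel for what "higher-level short-time features" might mean: a tracker emits a stream of timestamped positions, and a feature extractor summarizes each short window of that stream. The sketch below is not OpenMoves code; the window scheme and the two features (mean speed and net displacement) are illustrative choices.

```python
import math

def short_time_features(track, window):
    # track: list of (t, x, y) samples for one person, in time order.
    # Summarize each non-overlapping window of `window` samples.
    features = []
    for i in range(0, len(track) - window + 1, window):
        w = track[i:i + window]
        # Mean speed: total path length divided by elapsed time.
        dist = sum(math.hypot(x2 - x1, y2 - y1)
                   for (_, x1, y1), (_, x2, y2) in zip(w, w[1:]))
        dt = w[-1][0] - w[0][0]
        speed = dist / dt if dt > 0 else 0.0
        # Net displacement: straight-line distance from window start to end.
        # A high speed with low net displacement suggests motion in place.
        net = math.hypot(w[-1][1] - w[0][1], w[-1][2] - w[0][2])
        features.append({"mean_speed": speed, "net_displacement": net})
    return features

# Toy track: one sample per second, moving 1 unit/s along x.
track = [(t, float(t), 0.0) for t in range(6)]
feats = short_time_features(track, window=3)
print(feats)
```

Downstream, windows of features like these could feed a classifier that learns and recognizes recurring movement patterns, which is the role the OpenMoves teaser describes.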


OpenPTrack V2 at Laboratorio de Interactividad Corporal

OpenPTrack is now being used for the ongoing Laboratorio de Interactividad Corporal (the Body Interactivity Laboratory) at cheLA in Buenos Aires, Argentina. The Lab got underway this past February, and is developing three new performance pieces that “explore concepts, implications and technologies of physical tracking and augmented reality,” particularly, “as they concern the […]


Docker for OpenPTrack V2

OpenPTrack Version 2 (Gnocchi) can now be run as a Docker container. The newly Dockerized OPT V2 targets anyone who does not want or need to compile OpenPTrack and its dependencies from source. It provides separate, pre-built images of OpenPTrack and its dependencies for Ubuntu Linux 16.04. To run OpenPTrack, all […]


Arranging Gardens and Sending Bees to Flowers with OPT Gnocchi

Indiana University researchers used OpenPTrack’s new object tracking and pose recognition this past spring to help first and second graders learn about the complex system of honeybee foraging. Dr. Joshua Danish from the Learning Sciences program led the research team, as part of the NSF-funded Promoting Learning through Annotation of Embodiment (PLAE) project, which engages students in embodied sociodramatic play. […]


OPT Object Tracking with YOLO

As part of the V2 Gnocchi update, OpenPTrack now uses the open source software YOLO V2 for object tracking alongside person tracking and its new pose recognition capabilities. YOLO can track many everyday objects off the shelf, and can also be trained to track newly introduced objects. It is already being used by UCLA and Indiana University STEP research teams, who […]
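YOLO itself is a per-frame detector; turning its detections into persistent tracks means associating boxes across frames. The sketch below is a common greedy intersection-over-union (IoU) baseline, not OpenPTrack's actual association logic; the box format and threshold are assumptions for illustration.

```python
def iou(a, b):
    # Intersection-over-union of two boxes given as (x1, y1, x2, y2).
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def associate(tracks, detections, threshold=0.3):
    # tracks: {track_id: box} from the previous frame.
    # Greedily match each track to its best-overlapping new detection;
    # unmatched detections start new tracks, unmatched tracks are dropped.
    next_id = max(tracks, default=-1) + 1
    unmatched = list(detections)
    updated = {}
    for tid, box in tracks.items():
        if not unmatched:
            break
        best = max(unmatched, key=lambda d: iou(box, d))
        if iou(box, best) >= threshold:
            updated[tid] = best
            unmatched.remove(best)
    for det in unmatched:
        updated[next_id] = det
        next_id += 1
    return updated

# One existing track; two detections in the next frame.
tracks = {0: (0, 0, 10, 10)}
updated = associate(tracks, [(1, 1, 11, 11), (50, 50, 60, 60)])
print(updated)
```

Production trackers add motion prediction and track-loss tolerance on top of association like this, but the IoU-matching step is the core of turning detections into identities over time.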


OPT V2 Gnocchi Code Available!

The OpenPTrack team is excited to announce the release of OPT V2 (Gnocchi). The code can be found on the OPT V2 GitHub page, and provides new, fundamental features (GPU acceleration required): Object Tracking. For the first time, OpenPTrack will track objects in addition to the human body. V2 will add […]