Category Archives: Updates


OpenPTrack V2 at Laboratorio de Interactividad Corporal

OpenPTrack is now being used for the ongoing Laboratorio de Interactividad Corporal (Body Interactivity Laboratory) at cheLA in Buenos Aires, Argentina. The Lab got underway this past February and will develop three new performance pieces that “explore concepts, implications and technologies of physical tracking and augmented reality,” particularly “as they concern the relationships between the body and […]



Arranging Gardens and Sending Bees to Flowers with OPT Gnocchi

Indiana University researchers used OpenPTrack’s new object tracking and pose recognition features this past spring to help first- and second-graders learn about the complex system of honeybee foraging. Dr. Joshua Danish from the Learning Sciences program led the research team as part of the NSF-funded Promoting Learning through Annotation of Embodiment (PLAE) project, which engages students in embodied sociodramatic play. While being taught by their […]


OPT Object Tracking with YOLO

As part of the V2 Gnocchi update, OpenPTrack now uses the open source software YOLO V2 for object tracking alongside person tracking and its new pose recognition capabilities. YOLO can track many everyday objects off-the-shelf, and can also be trained to track newly introduced objects. It is already being used by UCLA and Indiana University STEP research teams, who are incorporating tracking of custom […]
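For developers who want to consume the tracker’s output directly, the sketch below shows one way a ROS node might listen to OpenPTrack track messages in Python. It is a minimal sketch only: the topic name and the opt_msgs field layout shown here are assumptions for illustration, so check the message definitions shipped with your OpenPTrack install before relying on them.

```python
#!/usr/bin/env python
# Minimal sketch: listen to OpenPTrack tracking output from a ROS node.
# The topic name and message fields below are assumptions for illustration;
# consult the opt_msgs definitions in your OpenPTrack workspace for the
# actual layout of person and object track messages.
import rospy
from opt_msgs.msg import TrackArray  # assumed message type for track output


def on_tracks(msg):
    # Each track is assumed to carry a numeric id and a ground-plane position.
    for track in msg.tracks:
        rospy.loginfo("track %d at (%.2f, %.2f)", track.id, track.x, track.y)


if __name__ == "__main__":
    rospy.init_node("opt_track_listener")
    # "/tracker/tracks" is used here for person tracks; the object-tracking
    # topic introduced in V2 may differ -- adjust to your configuration.
    rospy.Subscriber("/tracker/tracks", TrackArray, on_tracks)
    rospy.spin()
```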



OPT Pose Recognition

As part of the V2 Gnocchi update, OpenPTrack now uses machine learning for pose recognition alongside person tracking and its new object tracking capabilities. OPT pose recognition extends the OpenPose skeletal tracking library to multiple cameras, and includes the ability to train the system to detect unique poses. Early adopters include UCLA and Indiana University STEP researchers, who are integrating pose recognition into […]


OPT V2 Gnocchi Code Available!

The OpenPTrack team is excited to announce the release of OPT V2 (Gnocchi). The code can be found on the OPT V2 GitHub page, and provides fundamental new features (GPU acceleration required): Object Tracking. For the first time, OpenPTrack will track objects in addition to the human body. V2 will add the capability to track objects […]

Multiview Pose Tracking

ZED Camera, CNN-based People Detection

OpenPTrack V2 “Gnocchi” Coming Soon!

The OpenPTrack team is preparing for the release of OPT V2 (Gnocchi), which will provide object tracking, pose recognition and ZED camera support, as well as enhanced real-time person detection. Gnocchi will also update the underlying software stack to Ubuntu Linux 16.04 LTS and Robot Operating System (ROS) Kinetic Kame. OPT V2 Gnocchi has been under development for the […]


PLAE with OpenPTrack in Indiana

A team of Indiana University researchers is using OpenPTrack to develop body-based cyberlearning tools in conjunction with Bloomington-area elementary schools. The partnership aims to engage students in embodied sociodramatic play in physically interactive educational spaces, to help them learn about complex science phenomena. The collaboration’s installation supports the NSF-funded Promoting Learning through Annotation of Embodiment (PLAE) project led by Dr. […]


Body-Based Drawing with OpenPTrack

UCLA REMAP researcher and GSE&IS PhD student Randy Illum led 36 elementary school students in body-based art-making exercises enabled by OpenPTrack. The project was conducted over three Fridays in March 2017 at UCLA Lab School, and compared how students collaborated when drawing with a digital body-based drawing program versus with Sharpie markers. Working in groups of six, the students developed ideas for digital drawings on backgrounds chosen […]


New TouchDesigner Components

Two open source components for Derivative’s TouchDesigner have been released for receiving person tracks streamed from OpenPTrack. The components were developed by Phoenix-based developer/stage designer Ian Shelanskey, and can be found in our GitHub repository as well as in Ian’s. The first component is a Python-based TOX that improves on the original examples in the […]
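Outside TouchDesigner, the same streamed tracks can be picked up by any small client. The sketch below is a minimal example of receiving OpenPTrack person tracks, assuming the tracker is configured to stream them as JSON over UDP; the port number and field names used here are illustrative stand-ins, not the canonical format, so match them to your own OpenPTrack streaming configuration.

```python
# Minimal sketch of receiving OpenPTrack person tracks outside TouchDesigner.
# Assumes the tracker streams tracks as JSON over UDP; the port and field
# names below are assumptions for illustration -- check your OpenPTrack
# streaming configuration for the actual message format.
import json
import socket

UDP_PORT = 21234  # hypothetical port; use the one set in your OpenPTrack config

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", UDP_PORT))

while True:
    data, _addr = sock.recvfrom(65535)        # one JSON message per datagram
    message = json.loads(data.decode("utf-8"))
    # Each message is assumed to hold a list of current tracks, each with an
    # id and a position on the ground plane.
    for track in message.get("tracks", []):
        print(track.get("id"), track.get("x"), track.get("y"))
```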


OpenPTrack at IEEE VR Los Angeles

UCLA researcher and PhD student Randy Illum presented Mixed-Reality Barriers: Person-Tracking in K-12 Schools at the IEEE VR conference in Los Angeles on March 19. The paper, co-written with GSE&IS PhD student Maggie Dahn, details the use of OpenPTrack in classrooms at two schools: a university laboratory elementary school and a public charter school. Illum’s presentation was given as part of […]