Docker for OpenPTrack V2

OpenPTrack Version 2 (Gnocchi) can now be run as a Docker container. The newly Dockerized OPT V2 targets anyone who does not want or need to compile OpenPTrack and its dependencies from source. It provides separate, pre-built images of OpenPTrack and its dependencies for Ubuntu Linux 16.04. To run OpenPTrack, all that is required is installing […]
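
The post above is truncated, but its point is that running OpenPTrack now only requires pulling pre-built images rather than compiling from source. As a rough illustration of that idea only, the Python sketch below uses the Docker SDK for Python to pull and start a container; the image name and tag are hypothetical placeholders, and in practice the same two steps are usually done with the docker command-line tool.

    import docker  # Docker SDK for Python: pip install docker

    # Rough illustration only: the image name and tag below are hypothetical,
    # not the actual OpenPTrack repository. In practice you would normally use
    # the docker CLI ("docker pull", "docker run") for the same steps.
    client = docker.from_env()

    # Pull a pre-built image instead of compiling OpenPTrack from source.
    image = client.images.pull("openptrack/opt-v2", tag="ubuntu16.04")  # hypothetical name/tag

    # Run it. OPT V2's new features require GPU acceleration, so the container
    # needs access to the host GPU (shown here via the nvidia runtime, which
    # assumes nvidia-docker2 is installed on the host).
    container = client.containers.run(
        image,
        runtime="nvidia",
        network_mode="host",  # convenient for ROS networking between nodes
        detach=True,
    )
    print(container.short_id)

Note that OPT V2's new features require GPU acceleration (see the Gnocchi release note below), which is why the sketch requests the NVIDIA container runtime.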


OpenPTrack V2 at Laboratorio de Interactividad Corporal

OpenPTrack is now being used for the ongoing Laboratorio de Interactividad Corporal (the Body Interactivity Laboratory) at cheLA in Buenos Aires, Argentina. The Lab got underway this past February, and will develop three new performance pieces that “explore concepts, implications and technologies of physical tracking and augmented reality,” particularly, “as they concern the relationships between the body and […]


Arranging Gardens and Sending Bees to Flowers with OPT Gnocchi

Indiana University researchers used OpenPTrack’s new object tracking and pose recognition this past spring to help first and second graders learn about the complex system of honeybee foraging. Dr. Joshua Danish from the Learning Sciences program led the research team, as part of the NSF-funded Promoting Learning through Annotation of Embodiment (PLAE) project, which engages students in embodied sociodramatic play. While being taught by their […]


OPT Object Tracking with YOLO

As part of the V2 Gnocchi update, OpenPTrack now uses the open source software YOLO V2 for object tracking alongside person tracking and its new pose recognition capabilities. YOLO can track many everyday objects off-the-shelf, and can also be trained to track newly introduced objects. It is already being used by UCLA and Indiana University STEP research teams, who are incorporating tracking of custom […]


OPT Pose Recognition

As part of the V2 Gnocchi update, OpenPTrack now uses machine learning for pose recognition alongside person tracking and its new object tracking capabilities. OPT pose recognition extends the OpenPose skeletal tracking library to multiple cameras, and includes the ability to train the system to detect unique poses. Early adopters include UCLA and Indiana University STEP researchers, who are integrating pose recognition into […]

[Image: Multiview Pose Tracking]

OPT V2 Gnocchi Code Available!

The OpenPTrack team is excited to announce the release of OPT V2 (Gnocchi). The code can be found on the OPT V2 GitHub page, and provides new, fundamental features (GPU acceleration required). Object Tracking: for the first time, OpenPTrack will track objects in addition to the human body. V2 will add the capability to track objects […]


OpenPTrack V2 “Gnocchi” Coming Soon!

The OpenPTrack team is preparing for the release of OPT V2 (Gnocchi), which will provide object tracking, pose recognition and ZED camera support, as well as enhanced real-time person detection.  Gnocchi will also update the underlying software stack to Ubuntu Linux 16.04 LTS and Robot Operating System (ROS) Kinetic Kame.  OPT V2 Gnocchi has been under development for the […]

[Image: ZED Camera, CNN-based People Detection]

PLAE with OpenPTrack in Indiana

A team of Indiana University researchers is using OpenPTrack to develop body-based cyberlearning tools in conjunction with Bloomington area elementary schools. The partnership aims to engage students in embodied sociodramatic play in physically interactive educational spaces, to help them learn about complex science phenomena. The collaboration’s installation supports the NSF-funded Promoting Learning through Annotation of Embodiment (PLAE) project led by Dr. […]


Body-Based Drawing with OpenPTrack

UCLA REMAP researcher/GSE&IS PhD student Randy Illum led 36 elementary school students in body-based art-making exercises enabled by OpenPTrack. The project was conducted over three Fridays in March 2017 at UCLA Lab School. It compared collaboration between students using a digital body-based drawing program and students using Sharpie markers. Working in groups of six, the students developed ideas for digital drawings on backgrounds chosen […]


New TouchDesigner Components

Two open source components for Derivative’s TouchDesigner have been released for receiving person tracks streamed from OpenPTrack. The components were developed by Phoenix-based developer/stage designer Ian Shelanskey, and can be found in our GitHub repository as well as in Ian’s. The first component is a TOX using Python that improves on the original examples in the […]
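
For context on what those components receive: OpenPTrack streams its person tracks as JSON messages over UDP, and Ian's TOX parses them inside TouchDesigner. The Python sketch below shows the same idea outside TouchDesigner. The port number and the per-track field names used here (tracks, id, x, y, height) are assumptions for illustration only; the actual values depend on how the OpenPTrack UDP output is configured.

    import json
    import socket

    # Minimal sketch of a consumer for OpenPTrack's UDP/JSON track stream.
    # Assumptions: the port below is hypothetical, and the message layout
    # (a "tracks" list with id/x/y/height per entry) is illustrative only.
    UDP_IP = "0.0.0.0"   # listen on all interfaces
    UDP_PORT = 21234     # hypothetical port; use the one set in your OPT config

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((UDP_IP, UDP_PORT))

    while True:
        data, _addr = sock.recvfrom(65535)       # one JSON message per datagram
        msg = json.loads(data.decode("utf-8"))
        for track in msg.get("tracks", []):      # assumed field name
            # Assumed per-track fields: numeric id plus ground-plane position
            print(track.get("id"), track.get("x"), track.get("y"), track.get("height"))

Within TouchDesigner itself, Ian's TOX handles this listening and parsing, so external code like this is only needed when feeding tracks into other environments.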