Jeff Burke


New OPT Installation at UCLA School of Theater, Film and Television

OpenPTrack has been installed in the Little Theater at the UCLA School of Theater, Film and Television (UCLA TFT). This is the second semi-permanent OPT deployment currently at TFT; the other has been installed in the School's TV Studio #3 since 2015, supporting software development by UCLA REMAP researchers and interactive artworks created by TFT students. […]


New Performances with OPT at Noviembre Electrónico, Buenos Aires

Three new performances using OpenPTrack as the interface between movement and media will run during the Noviembre Electrónico festival at the Centro Cultural San Martín in Buenos Aires, Argentina. The pieces were developed during UCLA REMAP and cheLA's Laboratorio de Interactividad Corporal (Body Interactivity Laboratory), which got underway this past February. Created by […]


OPT Pose Recognition

As part of the V2 Gnocchi update, OpenPTrack now uses machine learning for pose recognition alongside person tracking and its new object tracking capabilities. OPT pose recognition extends the OpenPose skeletal tracking library to multiple cameras, and includes the ability to train the system to detect unique poses. Early adopters include UCLA and Indiana University STEP researchers, […]


Real-time Movement Analysis – OpenMoves

OpenPTrack senses where people are, but not how they move. Interpreting human motion data in real time is an important capability for mixed reality applications, and that's where OpenMoves comes in. Being developed by UCLA REMAP, OpenMoves is a new toolset for generating higher-level short-time features, as well as learning and recognizing […]
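
To make "short-time features" concrete, here is a minimal, hypothetical sketch (not the OpenMoves API) of deriving a few such features from a sliding window of OpenPTrack-style (time, x, y) samples for one tracked person:

    # Hypothetical sketch, not the OpenMoves API: short-time movement
    # features over a sliding window of (t, x, y) track samples.
    from collections import deque
    from math import hypot

    class ShortTimeFeatures:
        def __init__(self, window=30):
            self.samples = deque(maxlen=window)  # most recent (t, x, y) tuples

        def update(self, t, x, y):
            self.samples.append((t, x, y))

        def features(self):
            if len(self.samples) < 2:
                return None
            pts = list(self.samples)
            (t0, x0, y0), (t1, x1, y1) = pts[0], pts[-1]
            dt = t1 - t0
            # Path length: sum of step-to-step displacements in the window.
            path = sum(hypot(b[1] - a[1], b[2] - a[2]) for a, b in zip(pts, pts[1:]))
            net = hypot(x1 - x0, y1 - y0)  # straight-line displacement
            return {
                "mean_speed": path / dt if dt > 0 else 0.0,
                "net_displacement": net,
                "straightness": net / path if path > 0 else 0.0,
            }

Features like these (speed, net displacement, path straightness over a short window) are the kind of higher-level descriptors that sit between raw track positions and application-level gesture or activity recognition.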


OpenPTrack V2 at Laboratorio de Interactividad Corporal

OpenPTrack is now being used for the ongoing Laboratorio de Interactividad Corporal (the Body Interactivity Laboratory) at cheLA in Buenos Aires, Argentina. The Lab got underway this past February, and is developing three new performance pieces that “explore concepts, implications and technologies of physical tracking and augmented reality,” particularly, “as they concern the […]


Docker for OpenPTrack V2

OpenPTrack Version 2 (Gnocchi) can now be run as a Docker container. The newly Dockerized OPT V2 targets anyone who does not want or need to compile OpenPTrack and its dependencies from source. It provides separate, pre-built images of OpenPTrack and its dependencies for Ubuntu Linux 16.04. To run OpenPTrack, all […]
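
For a rough sense of the workflow, running a pre-built image boils down to a docker pull followed by a docker run; the image name and tag below are placeholders rather than the actual published names, which are listed in the OPT documentation.

    docker pull openptrack/open_ptrack:v2    # placeholder image name/tag
    docker run -it --rm --network host openptrack/open_ptrack:v2

Host networking (or an equivalent) is typically needed so the container can reach cameras and other ROS nodes on the local network, and GPU-dependent features would additionally require NVIDIA's container runtime.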


OPT Object Tracking with YOLO

As part of the V2 Gnocchi update, OpenPTrack now uses the open source software YOLO V2 for object tracking alongside person tracking and its new pose recognition capabilities. YOLO can track many everyday objects off the shelf, and can also be trained to track newly introduced objects. It is already being used by UCLA and Indiana University STEP research teams, who […]


OpenPTrack V2 “Gnocchi” Coming Soon!

The OpenPTrack team is preparing for the release of OPT V2 (Gnocchi), which will provide object tracking, pose recognition and ZED camera support, as well as enhanced real-time person detection. Gnocchi will also update the underlying software stack to Ubuntu Linux 16.04 LTS and Robot Operating System (ROS) Kinetic Kame. OPT V2 Gnocchi has […]


New TouchDesigner Components

Two open source components for Derivative’s TouchDesigner have been released for receiving person tracks streamed from OpenPTrack. The components were developed by Phoenix-based developer/stage designer Ian Shelanskey, and can be found in our GitHub repository as well as in Ian’s. The first component is a TOX using Python that improves on […]
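
For context, OpenPTrack can stream its person tracks as JSON over UDP, which is what components like these consume. The sketch below is a minimal, hypothetical receiver in plain Python rather than the TouchDesigner components themselves; the port number and message fields are assumptions to be checked against your OPT configuration.

    # Hypothetical sketch of a plain-Python receiver for OpenPTrack's
    # UDP/JSON track stream; the port number and field names are
    # assumptions and may differ from your OPT configuration.
    import json
    import socket

    PORT = 21234  # placeholder; use the port configured in OpenPTrack

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", PORT))

    while True:
        data, _addr = sock.recvfrom(65535)
        msg = json.loads(data.decode("utf-8"))
        # Each message is expected to carry a list of tracked people,
        # each with an id and floor-plane coordinates.
        for track in msg.get("tracks", []):
            print(track.get("id"), track.get("x"), track.get("y"))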