Jeff Burke


OPT Pose Recognition

As part of the V2 Gnocchi update, OpenPTrack now uses machine learning for pose recognition alongside person tracking and its new object tracking capabilities. OPT pose recognition extends the OpenPose skeletal tracking library to multiple cameras, and includes the ability to train the system to detect unique poses. Early adopters include UCLA and Indiana University STEP researchers, […]


Real-time Movement Analysis – OpenMoves

OpenPTrack senses where people are, but not how they move. That’s where OpenMoves comes in. Interpreting human motion data in real time is an important capability for mixed reality applications. Under development at UCLA REMAP, OpenMoves is a new toolset for generating higher-level short-time features, as well as learning and recognizing […]


Docker for OpenPTrack V2

OpenPTrack Version 2 (Gnocchi) can now be run as a Docker container. The newly Dockerized OPT V2 targets anyone who does not want or need to compile OpenPTrack and its dependencies from source. It provides separate, pre-built images of OpenPTrack and its dependencies for Ubuntu Linux 16.04. To run OpenPTrack, all […]
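The container workflow might look something like the following sketch. The image name and tag here are hypothetical placeholders, not the project's published names; consult the OpenPTrack documentation for the actual commands and images.

```shell
# Pull a pre-built OpenPTrack V2 image (hypothetical name shown;
# substitute the image name published in the OpenPTrack docs)
docker pull openptrack/open_ptrack_v2:latest

# Run it interactively, sharing the host network so that ROS nodes
# on other machines in the camera network can reach the container
docker run -it --network host openptrack/open_ptrack_v2:latest
```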


OpenPTrack V2 “Gnocchi” Coming Soon!

The OpenPTrack team is preparing for the release of OPT V2 (Gnocchi), which will provide object tracking, pose recognition and ZED camera support, as well as enhanced real-time person detection. Gnocchi will also update the underlying software stack to Ubuntu Linux 16.04 LTS and Robot Operating System (ROS) Kinetic Kame. OPT V2 Gnocchi has […]


New TouchDesigner Components

Two open source components for Derivative’s TouchDesigner have been released for receiving person tracks streamed from OpenPTrack. The components were developed by Phoenix-based developer/stage designer Ian Shelanskey, and can be found in our GitHub repository as well as in Ian’s. The first component is a TOX using Python that improves on […]


OpenPTrack at IEEE VR Los Angeles

UCLA researcher and PhD student Randy Illum presented Mixed-Reality Barriers: Person-Tracking in K-12 Schools at the IEEE VR conference in Los Angeles on March 19. The paper, co-written with GSE&IS PhD student Maggie Dahn, details the use of OpenPTrack in classrooms at two schools—one a university laboratory elementary school, the other a public charter school. Illum’s presentation […]


An iSTEP for Cyberlearning with OpenPTrack

Building on the ongoing Science Through Technology Enhanced Play (STEP) project, Interactive Science Through Technology Enhanced Play (iSTEP) will begin to incorporate new OpenPTrack capabilities currently in development. STEP’s computer simulation, with OpenPTrack as the interface for body-based interaction, has been helping students understand scientific phenomena at UCLA Lab School and Indiana […]


OpenPTrack to be Presented at MW2016

UCLA REMAP researcher and UCLA GSE&IS doctoral student Randy Illum will be introducing OpenPTrack to MW2016 on April 9. In one of the conference’s Lightning Talks, Illum will present “OpenPTrack: Body-Based Group Tracking for Informal Learning Spaces,” sharing the OpenPTrack platform and its projects to date to generate conversation about the future of […]


UCLA REMAP Crowdfunding for New OpenPTrack Collaborations

UCLA REMAP has launched a campaign via UCLA’s crowdfunding platform, UCLA Spark, to raise support for three new OpenPTrack deployments and deepen the open source process. By collaborating with artists and educators not initially involved in OpenPTrack’s development, the research team will be able to expand upon efforts to make the […]