Support for the Microsoft Kinect 2, which will provide enhanced range and accuracy, is now in testing at UCLA! The detection code leverages GPU processing to handle the camera's higher-resolution RGB-D images.
We are currently seeking developers to create easy-to-use front-end interfaces for OpenPTrack—fully automating the steps of designing the camera network and setting up, configuring, calibrating, operating, and debugging the system.

Required Skills:
- Experience with C++
- Experience with Robot Operating System (ROS)

Nice To Have:
- Experience developing web interfaces
- Experience with ROS tools for developing web […]
OpenPTrack now has alpha support for publishing data via Named Data Networking (NDN), a future internet architecture project led by UCLA in collaboration with seven other campuses and a consortium including Cisco, Intel, Panasonic, and others. Please contact the developers for more information, or browse the “ndn” branch on GitHub.
OpenPTrack is being used by UCLA REMAP’s Interpretive Media Laboratory (IMLab) to develop prototype interpretive exhibits for the Los Angeles State Historic Park Welcome Pavilion. More information about this deployment is available here. Please watch the short video about IMLab and the Welcome Pavilion below.
Whorl, an installation by Damon Seeley, Eitan Mendelowitz, and David Glicksman, is the first third-party interactive work created with OpenPTrack. It is installed as a demo at the UCLA Interpretive Media Laboratory’s space in Downtown Los Angeles. Please watch the video below.
UCLA REMAP is now testing OpenPTrack as a primary interface for body-based interaction with interpretive media content for the Welcome Pavilion at the new Los Angeles State Historic Park (LASHP). A full-sized mockup of the Pavilion (to be completed with the new Park in 2015) is installed at the nearby space of the Interpretive Media Laboratory […]
In September 2014, a prototype deployment of OpenPTrack was installed semi-permanently in UCLA Lab School’s Gregg G. Juárez Community Hall, using seven Kinect 1 sensors, one stereo camera pair, and two Mesa SwissRanger 4500 imagers. The installation will support the NSF-funded Science Through Technology Enhanced Play (STEP) project, creating a physically interactive educational space for embodied play. STEP […]
Initial support for custom stereo camera rigs has been added, with drivers for Point Grey Blackfly Ethernet-based cameras. Preliminary documentation can be found here; it is currently being wikified.
The OpenPTrack project was kicked off in the winter of 2013 to provide an open source solution for scalable, multi-imager person tracking, aimed at supporting applications in education, arts, and culture. An alpha version of the code is currently available in our GitHub repository and supports multi-imager tracking using the Microsoft Kinect 360 and Mesa […]