As part of the V2 Gnocchi update, OpenPTrack now uses the open source software YOLO V2 to detect objects for tracking, alongside person tracking and its new pose recognition capabilities. YOLO can detect many everyday object classes out of the box, and it can also be trained to recognize newly introduced objects. It is already being used by the UCLA and Indiana University STEP research teams, who are incorporating tracking of custom sets of objects into elementary school classrooms.
Additionally, two new updates are in the works for OPT object tracking:
—Custom software that guides users through the YOLO training process, automatically annotating data to reduce the manual work of building machine learning training sets, and then exporting the annotated data in YOLO's format.
—Support for YOLO V3, to be incorporated in the near future.
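To give a sense of what the automatic export step involves: YOLO (darknet) expects one plain-text label file per training image, where each line holds a class index followed by the box center, width, and height, all normalized to [0, 1] by the image dimensions. The helper below is a hypothetical sketch of that conversion (the function name and signature are our own, not part of OpenPTrack), assuming pixel boxes given as (x_min, y_min, x_max, y_max):

```python
def to_yolo_label(class_id, box, img_w, img_h):
    """Convert a pixel bounding box (x_min, y_min, x_max, y_max)
    into a darknet/YOLO training label line:
    "<class> <x_center> <y_center> <width> <height>",
    with all four coordinates normalized by the image size.

    Hypothetical helper for illustration; the label format itself
    is darknet's standard one.
    """
    x_min, y_min, x_max, y_max = box
    x_center = (x_min + x_max) / 2.0 / img_w
    y_center = (y_min + y_max) / 2.0 / img_h
    width = (x_max - x_min) / img_w
    height = (y_max - y_min) / img_h
    return f"{class_id} {x_center:.6f} {y_center:.6f} {width:.6f} {height:.6f}"


# Example: a 200x200-pixel box in a 640x480 image, class 0.
print(to_yolo_label(0, (100, 200, 300, 400), 640, 480))
```

Writing one such line per annotated object, into a `.txt` file named after the image, is all YOLO's training pipeline needs to consume the data.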
The Gnocchi update also includes a second method for tracking objects, based on CamShift; it is outlined in this paper.
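For intuition about how CamShift-style tracking differs from detection-based tracking: CamShift repeatedly runs a mean-shift step, sliding a search window toward the centroid of a per-pixel probability map (typically a color-histogram back-projection), with CamShift additionally adapting the window size between steps. The sketch below implements only the core mean-shift iteration in pure Python, on a plain 2D probability grid; it is an illustration of the idea, not OpenPTrack's implementation (which would use OpenCV's CamShift on real back-projected frames):

```python
import math

def mean_shift(prob, win, max_iter=20, eps=1.0):
    """Slide a search window toward the centroid of a 2D
    probability map until the shift falls below `eps`.

    prob: 2D list, prob[row][col] = likelihood the pixel belongs
          to the tracked object (e.g. a histogram back-projection).
    win:  (x, y, w, h) window, top-left corner plus size.
    Returns the converged window. CamShift proper would also
    resize the window from the zeroth moment each iteration.
    """
    x, y, w, h = win
    rows, cols = len(prob), len(prob[0])
    for _ in range(max_iter):
        m00 = m10 = m01 = 0.0
        for j in range(max(0, y), min(rows, y + h)):
            for i in range(max(0, x), min(cols, x + w)):
                p = prob[j][i]
                m00 += p
                m10 += p * i   # weighted sum of x coordinates
                m01 += p * j   # weighted sum of y coordinates
        if m00 == 0:  # no object mass under the window: stop
            break
        cx, cy = m10 / m00, m01 / m00  # centroid of the mass
        nx = int(round(cx - w / 2))    # re-center window on it
        ny = int(round(cy - h / 2))
        moved = math.hypot(nx - x, ny - y)
        x, y = nx, ny
        if moved < eps:  # converged
            break
    return (x, y, w, h)
```

Each video frame, the tracker seeds this iteration with the previous frame's window, so the window follows the object as long as consecutive frames overlap.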
[Video: OpenPTrack v2 Object Tracking, from UCLA REMAP on Vimeo.]