**6. Conclusions and Future Work**

In this paper, we presented an AI-assisted method for the automatic assessment of human motor behavior from video recorded with a single RGB camera. Results demonstrate that the multi-stage method, which combines pose estimation, target tracking and action classification, provides accurate target-specific classification of activities in the presence of other human actors and is robust to changing environments. The work presented here focused on the classification of basic postures (sitting, standing and walking) and transitions (sitting-to-standing and standing-to-sitting), which commonly occur during many daily activities and are relevant to understanding the impact of diseases such as Parkinson's disease and stroke on the functional ability of patients.

This work lays the foundation for future research directed towards detecting and quantifying clinically meaningful information, such as the detection of emergency events (e.g., falls, seizures) and the assessment of symptom severity (e.g., gait impairments, tremor), in patients with various mobility-limiting conditions. The proposed method is mainly intended for offline processing of video recordings. To enable real-time detection of serious events (e.g., falls), future research should focus on more computationally efficient approaches to target tracking and action classification. In addition, achieving high-resolution temporal localization of actions will be necessary to ensure accurate assessment of clinical events of interest (e.g., the duration of a seizure) in certain medical applications. Lastly, the code and models developed during this work are being made available for the benefit of the broader research community (https://github.com/brezaei/PoseTrack_ActionClassification).
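To make the data flow of the three-stage method concrete, the sketch below shows how per-frame pose estimation, target tracking and windowed action classification could be chained. This is a minimal illustration under stated assumptions, not the authors' released implementation: the `Detection` layout, the component callables (`estimate_poses`, `track_target`, `classify_window`) and the windowing parameters (30-frame windows, 15-frame stride) are all hypothetical; the actual code is available at the repository linked above.

```python
from dataclasses import dataclass
from typing import Callable, List, Sequence


@dataclass
class Detection:
    """One tracked person in one frame (hypothetical data layout)."""
    person_id: int          # identity assigned by the tracker
    keypoints: List[tuple]  # 2D body joints as (x, y, confidence)


def run_pipeline(
    frames: Sequence,
    estimate_poses: Callable[[object], list],           # frame -> keypoint sets for all people
    track_target: Callable[[list], List[Detection]],    # per-frame poses -> target's track
    classify_window: Callable[[List[Detection]], str],  # pose window -> action label
    window: int = 30,  # ~1 s of frames at 30 fps (assumed)
    stride: int = 15,
) -> List[str]:
    """Return one action label per sliding window of the target's pose track."""
    # Stage 1: multi-person 2D pose estimation on every frame.
    per_frame_poses = [estimate_poses(frame) for frame in frames]
    # Stage 2: associate poses over time and keep only the target subject,
    # so other actors in the scene do not contaminate the classification.
    target_track = track_target(per_frame_poses)
    # Stage 3: classify fixed-length windows of the target's pose sequence
    # into postures and transitions (e.g., sitting, standing, walking,
    # sitting-to-standing, standing-to-sitting).
    labels = []
    for start in range(0, max(1, len(target_track) - window + 1), stride):
        labels.append(classify_window(target_track[start:start + window]))
    return labels


if __name__ == "__main__":
    # Toy demo with stub components standing in for the real models.
    frames = range(90)                                     # 90 fake frames
    poses = lambda f: [[(0.0, 0.0, 1.0)]]                  # one fake person per frame
    track = lambda seq: [Detection(0, p[0]) for p in seq]  # keep that person as the target
    label = lambda win: "sitting"                          # constant classifier
    print(run_pipeline(frames, poses, track, label))       # ['sitting', 'sitting', ...]
```

Structuring the stages as pluggable callables reflects the modularity the method implies: any off-the-shelf pose estimator or tracker could be swapped in, and moving toward real-time operation would amount to replacing individual stages with more computationally efficient components.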

**Author Contributions:** Conceptualization: B.R., S.O. and S.P.; data curation: B.R., Y.C., B.H., K.T. and K.E.; formal analysis: B.R.; investigation: B.H. and K.E.; methodology: B.R., S.O. and S.P.; project administration: K.E.; resources: B.H.; software: B.R.; supervision: S.O. and S.P.; writing—original draft: B.R. and S.P.; writing—review and editing: B.R., Y.C., B.H., K.T., K.E., S.O. and S.P.

**Funding:** Funding for this work was provided by Pfizer, Inc.

**Acknowledgments:** The authors would like to acknowledge the BlueSky project team for generating the data that made this work possible. In particular, we thank Hao Zhang, Steve Amato, Vesper Ramos, Paul Wacnik and Tairmae Kangarloo for their contributions to study design and data collection.

**Conflicts of Interest:** S.P., Y.C. and K.E. are employees of Pfizer, Inc. The remaining authors declare no conflict of interest.
