Multimodal Collaboration Analytics

This project reviews empirical publications that used high-frequency data collection tools to capture facets of small collaborative groups—that is, papers conducting Multimodal Collaboration Analytics (MMCA) research. For the scope of this review, we focus on: (1) the sensor-based metrics computed from multimodal data sources (e.g., speech, gaze, face, body, physiological, and log data); (2) outcome measures, i.e., operationalizations of collaborative constructs (e.g., group performance, conditions for effective collaboration); (3) the connections researchers found between sensor-based metrics and outcomes; and (4) how theory was used to inform these connections. An additional contribution is an interactive online visualization where researchers can explore collaborative sensor-based metrics, collaborative constructs, and the connections between the two.

References

  • Schneider, B., Sung, G., Chng, E., & Yang, S. (2021). How Can High-Frequency Sensors Capture Collaboration? A Review of the Empirical Links Between Multimodal Metrics and Collaborative Constructs. Sensors, 21(24), 8185.