Collaboration is at the heart of many complex tasks, and mixed reality (MR) offers a powerful new medium to support it. Understanding how teams coordinate in immersive environments is therefore critical for designing effective collaborative MR applications. However, existing methods rely on external observation systems and manual annotation, and offer no deployable way to capture temporal collaboration dynamics. We present MURMR, a system with two complementary modules that passively analyze multimodal interaction data from commodity MR headsets. The structural analysis module automatically constructs sociograms that reveal group organization and roles, while the temporal analysis module uses unsupervised clustering to identify moment-to-moment dyad behavior patterns. In a 48-participant study validated against egocentric video, we show that the structural module captures stable interaction patterns while the temporal module reveals substantial behavioral variability that session-level approaches miss. This dual-module architecture advances collaboration research by establishing that structural and temporal dynamics require separate analytical approaches, enabling both real-time group monitoring and fine-grained behavioral understanding in immersive collaborative environments.
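To make the temporal module's idea concrete, the sketch below clusters sliding-window dyad features with k-means. It is a minimal illustration, not MURMR's actual pipeline: the feature set (mutual gaze, interpersonal distance, speech overlap), the window length, and the cluster count are all assumptions for the example.

```python
# Illustrative sketch only: feature choices, window size, and k are
# assumptions, not details taken from the MURMR paper.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical per-second dyad telemetry derived from two headsets:
# columns = [mutual_gaze, interpersonal_distance_m, speech_overlap]
telemetry = rng.random((600, 3))  # 10 minutes sampled at 1 Hz


def windowed_features(x: np.ndarray, win: int = 10) -> np.ndarray:
    """Average raw telemetry over non-overlapping windows to obtain
    moment-to-moment dyad feature vectors."""
    n = (len(x) // win) * win
    return x[:n].reshape(-1, win, x.shape[1]).mean(axis=1)


features = StandardScaler().fit_transform(windowed_features(telemetry))

# Unsupervised clustering of windows into candidate dyad behavior patterns.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)
print(labels[:20])  # sequence of dyad behavior states over time
```

Clustering windows rather than whole sessions is what lets this style of analysis surface the moment-to-moment variability that session-level aggregates miss.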