Jaw Tracking System
An optical measurement system for open-source tracking of jaw motions, enabling the design, control, and validation of rehabilitative jaw exoskeletons.
The JawTrackingSystem (JTS) is a customizable, low-cost, precise, non-invasive, and biocompatible jaw tracking system based on optical motion capture technology. It is designed to address the need for accessible and adaptable research tools in temporomandibular disorder (TMD) research and jaw exoskeleton development.

Abstract
Precise tracking of jaw kinematics is crucial for diagnosing various musculoskeletal and neuromuscular diseases affecting the masticatory system and for advancing rehabilitative devices such as jaw exoskeletons. Our system encompasses a complete pipeline from data acquisition, processing, and kinematic analysis to filtering, visualization, and data storage.
Key Performance Metrics:
- Precision: $(182 \pm 47)\,\mu\mathrm{m}$ (translation) and $(0.126 \pm 0.034)°$ (rotation)
- Sampling Rate: Up to $200\, \mathrm{Hz}$
- Cost: $<\$100$ for hardware components, excluding the OMoCap system
- Open Source: Fully customizable and extensible
System Architecture
Hardware Components
The jaw tracking system consists of four key components:
- Mandibular Tracking Array (MTA): A reflective marker-equipped mouthpiece rigidly attached to the lower teeth via custom, single-use adapters and temporary dental adhesive
- Cranial Reference Array (CRA): A lightweight, marker-equipped headpiece positioned on the forehead to track head movements
- Digitizing Pointer (DP): A marker-equipped pointer with known geometry for digitizing anatomical tooth landmarks
- Optical Motion Capture (OMoCap) System: Tracks 3D positions of all markers with high precision
Component | Function | Material Requirements |
---|---|---|
Mouthpiece (MTA) | Jaw attachment point | Standard 3D printing materials |
Teeth Attachment | Secure mounting to dental structure | Biocompatible PETG, IBT Resin, … |
Headpiece (CRA) | Reference frame for head motion | Biocompatible PETG, IBT Resin, … |
Digitizing Pointer (DP) | Anatomical landmark identification | High precision printing |
Additional Components
- 2BA thread for dart point attachment
- Dart points for precise landmark digitization, attached to the DP
- Reflective fibers/tape for motion capture markers
- Temporary dental glue for secure attachment of the MTA to the teeth
Mathematical Framework
Coordinate System Definitions
The system employs multiple coordinate frames for precise kinematic analysis:
- $O_{\text{OMoCap}}$: Global optical motion capture coordinate system
- $CS_{\text{MTA}}$: Local coordinate system of the mandibular tracking array
- $CS_{\text{CRA}}$: Local coordinate system of the cranial reference array
- $CS_{\text{Mand}}^{\text{Anat}}$: Anatomical landmark coordinate system of the mandible
- $CS_{\text{Max}}^{\text{Anat}}$: Anatomical landmark coordinate system of the maxilla
- $O_{\text{VM}}$: Global virtual jaw model coordinate system
All transformations are represented as homogeneous matrices $\mathbf{T} \in \mathbb{R}^{4 \times 4}$.
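For illustration only (this is not part of the JTS API), homogeneous transforms of this kind can be assembled, composed, and inverted with NumPy; the closed-form inverse exploits the rigid-body structure:

```python
import numpy as np

def make_transform(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def invert_transform(T):
    """Closed-form inverse of a rigid transform: rotation R^T, translation -R^T t."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

# Example: 90° rotation about z plus a translation
Rz = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
T = make_transform(Rz, np.array([1., 2., 3.]))
assert np.allclose(T @ invert_transform(T), np.eye(4))
```

Chained frame changes such as those in the kinematic pipeline below then reduce to matrix products of these 4×4 transforms.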
Calibration and Landmark Registration
Six anatomical landmarks, three on the mandibular and three on the maxillary teeth, are used to define the anatomical coordinate systems:
- Three mandibular points: $\mathbf{P}_{\text{Mand},i}$ where $i = 1, 2, 3$
- Three maxillary points: $\mathbf{P}_{\text{Max},i}$ where $i = 1, 2, 3$
These points define the anatomical coordinate systems $CS_{\text{Mand}}^{\text{Anat}}$ and $CS_{\text{Max}}^{\text{Anat}}$.
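One common way to build such a frame from three non-collinear points is Gram-Schmidt-style orthonormalization. The axis convention below (origin at the first point, x-axis toward the second, z-axis normal to the landmark plane) is a hypothetical choice for illustration and may differ from the convention used in JTS:

```python
import numpy as np

def frame_from_points(p1, p2, p3):
    """Construct a right-handed orthonormal frame (as a 4x4 homogeneous
    transform) from three non-collinear landmark points.

    Convention (illustrative only): origin at p1, x-axis toward p2,
    z-axis normal to the plane spanned by the three points.
    """
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    x = p2 - p1
    x /= np.linalg.norm(x)
    n = np.cross(x, p3 - p1)
    z = n / np.linalg.norm(n)
    y = np.cross(z, x)  # completes the right-handed triad
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, p1
    return T

T = frame_from_points([0, 0, 0], [1, 0, 0], [0, 1, 0])
```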
Kinematic Analysis Pipeline
1. Static Transformation Determination
The constant transformation between the MTA and the anatomical mandible frame, determined once during calibration: \({}^{CS_{\text{MTA}}}\mathbf{T}_{CS_{\text{Mand}}^{\text{Anat}}}\)
2. Dynamic Tracking
At time $t$, the OMoCap system provides:
- \({}^{O_{\text{OMoCap}}}\mathbf{T}_{CS_{\text{MTA}}}(t)\): MTA pose in global frame
- \({}^{O_{\text{OMoCap}}}\mathbf{T}_{CS_{\text{CRA}}}(t)\): CRA pose in global frame
3. Relative Motion Calculation
The pose of the anatomical mandible frame relative to the CRA:
\[{}^{CS_{\text{CRA}}}\mathbf{T}_{CS_{\text{Mand}}^{\text{Anat}}}(t) = \left({}^{O_{\text{OMoCap}}}\mathbf{T}_{CS_{\text{CRA}}}(t)\right)^{-1} \cdot {}^{O_{\text{OMoCap}}}\mathbf{T}_{CS_{\text{MTA}}}(t) \cdot {}^{CS_{\text{MTA}}}\mathbf{T}_{CS_{\text{Mand}}^{\text{Anat}}}\]
4. Virtual Model Registration
The Kabsch algorithm is applied to the maxillary landmarks \(\mathbf{P}_{\text{Max},i}\), expressed in both $O_{\text{VM}}$ and $CS_{\text{CRA}}$, to find the optimal transformation \({}^{O_{\text{VM}}}\mathbf{T}_{CS_{\text{CRA}}}\).
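A minimal, generic Kabsch sketch (the actual JTS implementation lives in `calibration_controllers.py` and may differ in detail):

```python
import numpy as np

def kabsch(P, Q):
    """Optimal rigid transform mapping point set P onto Q (rows are points).

    Returns (R, t) minimizing sum_i ||R @ p_i + t - q_i||^2 via SVD
    of the cross-covariance matrix (Kabsch algorithm).
    """
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)           # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])          # guard against reflections
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t

# Sanity check: recover a known rotation and translation
P = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.], [1., 1., 1.]])
R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
t_true = np.array([1., 2., 3.])
Q = P @ R_true.T + t_true
R_est, t_est = kabsch(P, Q)
```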
The final mandible pose in the virtual model frame:
\[{}^{O_{\text{VM}}}\mathbf{T}_{CS_{\text{Mand}}^{\text{Anat}}}(t) = {}^{O_{\text{VM}}}\mathbf{T}_{CS_{\text{CRA}}} \cdot {}^{CS_{\text{CRA}}}\mathbf{T}_{CS_{\text{Mand}}^{\text{Anat}}}(t)\]
Software Implementation
Core Architecture
```python
from jts.core import JawMotionAnalysis, ConfigManager

# Load configuration
config = ConfigManager.load_config('config.json')

# Initialize analysis pipeline
analysis = JawMotionAnalysis(config)

# Run complete analysis
results = analysis.run_analysis()
```
Key Modules
- `core.py`: Main analysis pipeline and orchestration
- `qualisys.py`: Motion capture data interface with real-time streaming
- `calibration_controllers.py`: Anatomical landmark registration using the Kabsch algorithm
- `helper.py`: Coordinate transformations and data export utilities
- `plotly_visualization.py`: Interactive 3D visualization
- `precision_analysis.py`: Statistical analysis and filtering (Savitzky-Golay, Butterworth)
Experimental Validation
Study Design
A preliminary study with four participants (ethics approval: TU Darmstadt EK 3/2025) evaluated system performance across various jaw movements:
- Opening and closing motions
- Protrusion and retrusion
- Lateral movements
- Cyclic motions
Data Acquisition
- Sampling Rate: $200\, \mathrm{Hz}$
- Motion Capture System: Qualisys Oqus with three cameras
- Post-calibration Accuracy (OMoCap): $0.6\, \mathrm{mm}$ average
Signal Processing
Noise Analysis: Raw signals filtered using fourth-order Butterworth low-pass filter:
- Cutoff frequency: $(4.5 \pm 0.5)\, \mathrm{Hz}$, applied bidirectionally to prevent phase shift
- Selection criterion: $10\, \mathrm{dB}$ drop in signal-to-noise ratio
Precision Estimation: Analysis of residual signals after filtering yielded:
- Translational precision: $(182 \pm 47)\,\mu\mathrm{m}$
- Rotational precision: $(0.126 \pm 0.034)°$
Clinical Applications
Temporomandibular Disorder Research
- Quantitative analysis of jaw movement disorders
- Treatment monitoring and rehabilitation progress assessment
- Biomechanical studies of normal vs. pathological function
Jaw Exoskeleton Development
- Sensor calibration and validation for active rehabilitation devices
- Control algorithm development using precise kinematic feedback
- Safety validation through motion analysis
Diagnostic Applications
- Objective assessment of masticatory function
- Pre/post-surgical evaluation of jaw mobility
- Custom treatment protocol development
Advantages Over Commercial Systems
Feature | JTS | Commercial Systems |
---|---|---|
Cost | $<\$ 100$ + OMoCap | $>\$ 30,000$ |
Customization | Full source access | Limited |
Biocompatibility | Medical-grade materials | Often proprietary |
Real-time Capability | Yes (with streaming OMoCap) | Partially |
Open Source | Complete system | Proprietary |
Research Flexibility | Unlimited modifications | Vendor restrictions |
Technical Specifications
- Python: 3.10+ compatibility
- Motion Capture: Qualisys native support, extensible to Vicon, OptiTrack, …
- Data Formats: HDF5 export, JSON configuration
- Visualization: Plotly-based interactive 3D plots
- Filtering: Savitzky-Golay and Butterworth, applied bidirectionally
- Registration: Kabsch algorithm for optimal point-set alignment
- Testing: Comprehensive pytest suite
- Real-time: Support for real-time data streaming and processing (not yet systematically evaluated)
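As a hedged illustration of the HDF5 export format (the dataset names and layout below are assumptions for this sketch, not the actual JTS schema), pose trajectories might be stored with `h5py` like this:

```python
import os
import tempfile

import numpy as np
import h5py

# Hypothetical export layout -- group/dataset names are illustrative only
n = 100
poses = np.tile(np.eye(4), (n, 1, 1))     # (n, 4, 4) mandible poses over time
timestamps = np.arange(n) / 200.0         # 200 Hz sampling

path = os.path.join(tempfile.mkdtemp(), "jaw_motion.h5")
with h5py.File(path, "w") as f:
    grp = f.create_group("mandible")
    grp.create_dataset("poses", data=poses, compression="gzip")
    grp.create_dataset("timestamps", data=timestamps)
    grp.attrs["sampling_rate_hz"] = 200.0

# Round-trip check
with h5py.File(path, "r") as f:
    loaded = f["mandible/poses"][:]
```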
Installation and Usage
Quick Installation
```bash
# From PyPI
pip install jaw-tracking-system

# From GitHub (latest development)
pip install git+https://github.com/paulotto/jaw_tracking_system.git
```
Basic Usage
```bash
# Command line interface
python -m jts.core path/to/config.json --verbose --plot
```

```python
# Python API
from jts.core import JawMotionAnalysis

analysis = JawMotionAnalysis(config_path)
results = analysis.run_analysis()
```
Future Development
Technical Enhancements
- Real-time processing validation
- Machine learning for automated landmark detection
- Additional motion capture system integration
- Enhanced calibration using spherical tool tips
Clinical Translation
- Validation studies against gold-standard systems
- Phantom jaw models for accuracy assessment
- Clinical interface development for non-technical users
Repository and Resources
GitHub repository: JawTrackingSystem (JTS)
- Code: Documented Python codebase
- 3D models: STL files and FreeCAD projects for hardware components
- Example config files: Sample configuration files
Citation
```bibtex
@InProceedings{mueller2025jawtracking,
  title     = {An Optical Measurement System for Open-Source Tracking of Jaw Motions},
  author    = {Müller, Paul-Otto and Suppelt, Sven and Kupnik, Mario and {von Stryk}, Oskar},
  booktitle = {2025 IEEE Sensors, Vancouver, Canada},
  year      = {2025},
  publisher = {IEEE},
  note      = {Accepted}
}
```
Authors and Acknowledgments
- Paul-Otto Müller, Simulation, Systems Optimization and Robotics Group, TU Darmstadt
- Sven Suppelt, Measurement and Sensor Technology Group, TU Darmstadt
- Mario Kupnik, Measurement and Sensor Technology Group, TU Darmstadt
- Oskar von Stryk, Simulation, Systems Optimization and Robotics Group, TU Darmstadt
Partial funding: German Research Foundation (DFG) within RTG 2761 LokoAssist (Grant no. 450821862)