A micro-computer that hosts Notochord OS and the core software responsible for reading the incoming data from the sensors.
Technically a "shield" for the Raspberry Pi. It gives you access to 6 gates, to which you can connect 6 branches of KCeptor sensors.
The sensing unit of Chordata Motion. It contains a 3D gyroscope, accelerometer and magnetometer, and allows you to track the orientation of a body segment. A performer has to wear one of these for each body part to be tracked. They expose two RJ45 connectors and are daisy-chainable.
Standard RJ45 patch cables, normally used for Ethernet connectivity. They are used to connect the Hub++ and the KCeptors++.
A regular power bank, like the ones normally used to charge cellphones and other compatible devices. It's used to power the Chordata hardware. A power bank of 10000 mAh or more is recommended.
They allow you to attach the KCeptors and the rest of the hardware to the performer's body.
A custom version of Raspberry Pi OS, preloaded with all the software needed to work with Chordata Motion systems. It's the easiest way of setting up all the required software on your Raspberry Pi.
The core software unit of the Chordata system. It takes care of all the low-level operations and the main signal processing tasks. The current version of the Notochord is a Python module, which gives Chordata power users great flexibility. Regular users don't need to use it directly, since it comes installed and configured in Notochord OS.
The main and oldest client for the Chordata system. It allows you to visualize and manage the capture process from the Blender GUI, so your capture is recorded directly in one of the most powerful 3D manipulation tools available.
A new client for the Chordata software. It currently allows you to perform basic capture management tasks, such as configuring the Notochord, starting or stopping it, or updating components of the Notochord OS, from any browser. Our dev team is currently working on expanding it to include all basic motion capture tasks. It will let you manage and visualize the complete mocap pipeline, including a real-time 3D visualization of the captured avatar. This will allow users to capture from any device, including smartphones.