Overview

GitHub: https://github.com/daviddorf2023/prosthetic_project 

Traditional prosthetic devices rely on electromyography (EMG) control and lack the ability to perform finger abduction. A new approach using flexible materials and photoplethysmography (PPG) or mechanomyography (MMG) sensors could enable more complex gestures and improved dexterity when manipulating objects. Advances in 3D printing allow semi-hollow devices to be made of stretchable materials, such as thermoplastic polyurethane (TPU), without sacrificing resilience to the elements. Combining a novel MMG/PPG control system with a hand printed in a hexagonal infill pattern and biomimetic joints containing flexible MCP (metacarpophalangeal) ligaments can yield a lighter, more affordable, and more durable prosthetic device.

Biosensor Wristband Data

[Video: IMG_1527.MOV]

Figure 1: Data from one PPG sensor on the subject's forearm during flexion and extension tasks. This study was later repeated with two sensors connected through a multiplexer and mounted in a 3D printed wristband, with similar results and improved spatial resolution across the forearm tendons.

Figure 2: Time series data collected with two sensors and an I2C multiplexer to handle address conflicts on the communication bus. The new configuration significantly reduced the resolution attainable by the system, but a variety of gestures were still achievable. Plateau regions reflect pinky-side flexion, spikes above that plateau correspond to "trigger" or "tool" grip flexion, and large spikes without a plateau region indicate full hand contractions.
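
As a rough illustration of that plateau/spike interpretation, the sketch below classifies one window of baseline-subtracted, normalized PPG samples with simple amplitude thresholds. The threshold values and window handling are hypothetical and are not taken from the project code.

#include <algorithm>
#include <cstddef>
#include <string>
#include <vector>

// Hypothetical thresholds on a baseline-subtracted, normalized PPG signal: a
// sustained level above PLATEAU_LEVEL counts as a plateau, and any sample
// above SPIKE_LEVEL counts as a spike.
constexpr double PLATEAU_LEVEL = 0.3;
constexpr double SPIKE_LEVEL   = 0.7;

// Apply the Figure 2 heuristics to one window of samples: plateau only ->
// pinky-side flexion, spike on top of a plateau -> trigger/tool grip,
// spike with no plateau -> full hand contraction.
std::string classifyWindow(const std::vector<double>& window) {
    if (window.empty()) return "rest";

    const double peak = *std::max_element(window.begin(), window.end());
    // Count samples sitting in the plateau band (elevated but below spike level).
    const std::ptrdiff_t plateauSamples = std::count_if(window.begin(), window.end(),
        [](double v) { return v > PLATEAU_LEVEL && v < SPIKE_LEVEL; });
    const bool plateau = plateauSamples > static_cast<std::ptrdiff_t>(window.size() / 2);
    const bool spike   = peak >= SPIKE_LEVEL;

    if (spike && plateau) return "trigger/tool grip";
    if (spike)            return "full hand contraction";
    if (plateau)          return "pinky-side flexion";
    return "rest";
}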

Prosthetic Components

Bionic Hand

Materials: TPU 95A filament, nylon fishing line, M2.5 screws, DYNAMIXEL XC330-M288-T motors, OpenRB-150 motor driver/microcontroller, 2000mAh LiPo

Sensor Wristband

Materials: TPU 95A filament, MAX30105 PPG sensors, TCA9548A multiplexer, Qwiic connectors, M2.5 screws, wires. These components will likely be consolidated onto a much more compact PCB in the future.

Communication Architecture
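
The data path runs from the MAX30105 PPG sensors, through the TCA9548A I2C multiplexer, to the OpenRB-150, which drives the DYNAMIXEL XC330 motors. The Arduino-style sketch below outlines that path, assuming the SparkFun MAX3010x and Dynamixel2Arduino libraries; the channel count, motor ID, and sensor-to-position mapping are placeholders rather than the project's actual control law.

#include <Wire.h>
#include <MAX30105.h>            // SparkFun MAX3010x sensor library (assumed dependency)
#include <Dynamixel2Arduino.h>   // DYNAMIXEL library bundled with the OpenRB-150 board package

const uint8_t TCA_ADDR = 0x70;   // Default TCA9548A I2C address
const uint8_t DXL_ID   = 1;      // Placeholder motor ID
const int DXL_DIR_PIN  = -1;     // OpenRB-150 handles direction internally

Dynamixel2Arduino dxl(Serial1, DXL_DIR_PIN);  // DYNAMIXEL port on the OpenRB-150
MAX30105 ppg;                                 // One driver object reused across mux channels

// Route the shared I2C bus to one multiplexer channel so two sensors with the
// same fixed address can be read without conflict.
void tcaSelect(uint8_t channel) {
  Wire.beginTransmission(TCA_ADDR);
  Wire.write(1 << channel);
  Wire.endTransmission();
}

void setup() {
  Wire.begin();
  dxl.begin(57600);
  dxl.setPortProtocolVersion(2.0);
  dxl.setOperatingMode(DXL_ID, OP_POSITION);
  dxl.torqueOn(DXL_ID);

  // Initialize the sensor on each multiplexer channel.
  for (uint8_t ch = 0; ch < 2; ch++) {
    tcaSelect(ch);
    ppg.begin(Wire);
    ppg.setup();                 // Default LED current and sampling configuration
  }
}

void loop() {
  tcaSelect(0);
  uint32_t ir0 = ppg.getIR();    // Raw IR reading, wristband sensor 0
  tcaSelect(1);
  uint32_t ir1 = ppg.getIR();    // Raw IR reading, wristband sensor 1

  // Placeholder mapping from combined sensor activity to a finger motor angle.
  float angle = constrain((ir0 + ir1) / 2000.0, 0.0, 300.0);
  dxl.setGoalPosition(DXL_ID, angle, UNIT_DEGREE);
  delay(20);
}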

Conclusion

The project was a deep dive into alternatives to traditional prosthetic control. Overall, the MMG data was not very promising: it lacked spatial resolution and could only capture the onset of muscle action, not held grasps. More data from the MMG experiments can be found in the figures folder of the GitHub repository linked above. The PPG experiments proved extremely promising and could accurately control the DYNAMIXEL motors within the bionic hand.

Signal processing algorithms developed during this project could be applied to other sensor fusion applications beyond the biomedical/robotics space. Examples include a circular buffer subtraction technique, smoothing functions, and sensor output division/normalization, sketched below. Additional techniques such as FFTs, neural networks, and window functions were explored for future use, especially for larger datasets with a higher number of sensors. Other potential improvements include custom PCBs to create a PPG sensor array, accelerometer data to compensate for motion-induced sensor inaccuracies, and an exploration of other materials that could make the hand design itself more lightweight and dexterous.
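
As a rough sketch of those three techniques (not the repository's exact implementation), the example below removes slow baseline drift by subtracting the value stored N samples earlier in a circular buffer, smooths with a moving average, and normalizes one sensor channel against another.

#include <cstddef>
#include <vector>

// Baseline removal: subtract the sample recorded `delay` steps ago, held in a circular buffer.
class CircularBufferSubtractor {
public:
    explicit CircularBufferSubtractor(std::size_t delay)
        : buffer_(delay, 0.0), index_(0) {}

    double process(double sample) {
        double delayed = buffer_[index_];   // Value from `delay` samples ago
        buffer_[index_] = sample;
        index_ = (index_ + 1) % buffer_.size();
        return sample - delayed;            // High-pass-like drift removal
    }
private:
    std::vector<double> buffer_;
    std::size_t index_;
};

// Smoothing: simple moving average over the last `window` samples.
class MovingAverage {
public:
    explicit MovingAverage(std::size_t window)
        : buffer_(window, 0.0), index_(0), sum_(0.0) {}

    double process(double sample) {
        sum_ += sample - buffer_[index_];   // Replace the oldest sample in the running sum
        buffer_[index_] = sample;
        index_ = (index_ + 1) % buffer_.size();
        return sum_ / buffer_.size();
    }
private:
    std::vector<double> buffer_;
    std::size_t index_;
    double sum_;
};

// Normalization: divide one sensor's output by another's to cancel common-mode
// changes (e.g., shifts that affect both PPG channels at once).
inline double normalize(double sensorA, double sensorB) {
    const double epsilon = 1e-9;            // Avoid division by zero
    return sensorA / (sensorB + epsilon);
}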