From Robotic Legs to Tongue Controls: New Standards of Accessibility in 2026

Abstract

The trajectory of assistive technology (AT) has historically been defined by a progression from passive mechanical aids to microprocessor-controlled devices. However, the period spanning 2024 to early 2025 marks a distinct paradigm shift toward "embodied integration"—hardware that does not merely support the user but integrates computationally and biologically with the user's intent. This report provides an exhaustive analysis of recent breakthroughs in wearable accessibility hardware, focusing on three primary domains: sensory substitution via electrotactile feedback, intraoral human-computer interaction (HCI), and active robotic prosthetics. Through a detailed examination of innovations such as the "Seeing with the Hands" project from the University of Chicago, the Augmental MouthPad^, and the BionicM Bio Leg, we explore how novel sensor placements, material sciences, and artificial intelligence (AI) are redefining the boundaries of human capability. We further investigate the emerging field of soft robotics, where light-driven elastomers and sim-to-real reinforcement learning are solving long-standing latency and calibration challenges in exoskeletons. This review synthesizes technical specifications, clinical outcomes, and engineering methodologies to present a comprehensive outlook on the state of adaptive technology.

1. Introduction: The Era of Embodied Intelligence

The field of biomedical engineering and accessibility hardware is currently witnessing a renaissance driven by the convergence of miniaturized sensing, high-density battery technology, and edge-computing artificial intelligence. For decades, the primary goal of accessibility hardware was functional replacement—providing a rigid prosthetic limb to replace a missing leg or a cane to detect an obstacle. In 2025, the objective has shifted toward functional restoration and sensory augmentation, where devices actively interpret the environment and the user’s neural drive to execute complex tasks with near-biological fluidity.1

This shift is characterized by a move away from "command-and-control" interfaces, where the user must explicitly direct the device, toward "intent-recognition" systems. In these newer systems, the hardware anticipates the user's needs through sensor fusion—the aggregation of data from inertial measurement units (IMUs), electromyography (EMG), and computer vision.3 The result is a seamless loop of perception and action, reducing the cognitive load on the user.

Furthermore, the form factor of these devices is undergoing a radical transformation. We are seeing a departure from bulky, medical-looking equipment toward discreet, socially acceptable wearables. Whether it is an intraoral trackpad disguised as a dental retainer or a soft robotic exosuit that fits under clothing, the "invisibility" of the technology is becoming as important as its utility.5 This report delves into the specific engineering breakthroughs that have made this possible, analyzing the mechanical, electrical, and computational architectures that underpin the next generation of accessibility tools.

2. Sensory Substitution: The Tactile Retina

One of the most profound challenges in accessibility engineering is the restoration of vision. While retinal implants and cortical stimulation have garnered significant attention, they remain highly invasive and expensive. Sensory substitution—the process of converting information from one sensory modality (e.g., vision) into another (e.g., touch or hearing)—offers a non-invasive alternative. The years 2024 and 2025 have yielded a critical breakthrough in this domain, specifically regarding how tactile information is mapped to the body to support manual interaction.7

2.1 The "Seeing with the Hands" Project

Researchers at the University of Chicago have introduced a novel paradigm in sensory substitution that fundamentally rethinks the "viewpoint" of assistive devices. Traditional visual assistive tools, such as smart glasses, locate the camera on the head, aligning the machine's view with the user's face. While effective for navigation, this "head-centric" perspective is suboptimal for manual tasks, such as grasping a cup or finding a tool on a workbench. The head often looks in a different direction than where the hands are operating, creating a disconnect between the sensor and the effector.9

The "Seeing with the Hands" project addresses this by relocating the sensor (camera) to the wrist and the actuator (display) to the back of the hand. This "hand-centric" architecture aligns the sensory input with the motor output, allowing users to perceive the object relative to their hand's position.8

2.1.1 Electrotactile Transduction Mechanism

Unlike previous haptic devices that relied on bulky vibrotactile motors (eccentric rotating mass or linear resonant actuators), the University of Chicago team utilized an electrotactile display.

  • Actuation Principle: The system employs a 5x6 array of electrode pairs printed on a thin, flexible substrate. Instead of creating mechanical vibration, the device passes a small, controlled electrical current (typically 3-5 mA) through the skin.

  • Receptor Activation: This current stimulates the cutaneous mechanoreceptors (specifically the Meissner’s corpuscles and Merkel cell neurite complexes) directly, bypassing the need for moving parts.

  • Perceptual Output: The user perceives this stimulation as a "tingle" or pressure sensation. By modulating the frequency and pulse width of the current, the system can "draw" complex shapes and edges on the back of the hand.7

The choice of electrotactile stimulation is critical for two reasons. First, it eliminates the mechanical latency of motors, allowing for near-instantaneous feedback. Second, it drastically reduces power consumption and bulk, enabling the display to be thin (approximately 0.1 mm) and conformable to the dorsal skin of the hand.9
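As a rough sketch of how a driver might translate image intensity into stimulation parameters, the snippet below maps a 5x6 activation grid to per-electrode pulse settings. The 3-5 mA current bound comes from the figures above; the frequency range and the linear mapping itself are illustrative assumptions, not the published design.

```python
import numpy as np

# Current bound quoted above; frequency range and mapping are illustrative only.
CURRENT_MA = (3.0, 5.0)      # stimulation current bounds, mA
FREQ_HZ = (20.0, 200.0)      # assumed pulse-rate bounds, Hz

def grid_to_pulses(intensity):
    """Map a 5x6 grid of normalized intensities [0, 1] to pulse parameters.

    Stronger 'pixels' get higher pulse rates, producing a firmer percept
    at the corresponding electrode pair.
    """
    intensity = np.clip(np.asarray(intensity, dtype=float), 0.0, 1.0)
    freq = FREQ_HZ[0] + intensity * (FREQ_HZ[1] - FREQ_HZ[0])
    amp = np.full_like(intensity, CURRENT_MA[0])  # fixed, conservative amplitude
    return freq, amp  # one (frequency, amplitude) pair per electrode
```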

2.1.2 Computer Vision and Image Processing

The wrist-mounted camera captures the scene in front of the palm. The video feed is processed in real-time to extract high-contrast edges and contours—a process known as edge detection (likely using Canny or Sobel algorithms, though specific proprietary algorithms may vary). These contours are then downsampled to the 5x6 resolution of the electrode array.11

Despite the low resolution (30 pixels in total), the temporal resolution of the human tactile system allows users to "scan" the environment. By moving their hand, users integrate the low-resolution frames over time to build a mental model of the object's shape, much as sighted individuals assemble a scene from successive foveal fixations.8
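The exact pipeline is proprietary, but a minimal sketch of the chain described above (Canny edge extraction followed by downsampling to the 5x6 electrode grid), assuming OpenCV, might look like this:

```python
import cv2

ROWS, COLS = 5, 6  # electrode array resolution

def frame_to_activations(frame_bgr, canny_low=50, canny_high=150):
    """Reduce a camera frame to a 5x6 boolean electrode pattern."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, canny_low, canny_high)  # high-contrast contours
    # INTER_AREA averages edge density within each grid cell
    cells = cv2.resize(edges, (COLS, ROWS), interpolation=cv2.INTER_AREA)
    return cells > 32  # fire electrodes where edge density is significant
```

In practice, the Canny thresholds and the activation cutoff would need tuning per user and per lighting condition.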

2.2 Clinical Implications and Ergonomics

The primary innovation of this system lies in its support for "hand preshaping." In natural grasping, a sighted person shapes their hand to match the object (e.g., forming a 'C' shape for a cup) before touching it. Blind individuals typically must touch the object first to determine its shape, risking knocking it over.

In user studies conducted for the CHI 2025 conference, participants using the "Seeing with the Hands" device demonstrated the ability to preshape their hands based on the tactile feedback received while hovering over the object. This capability significantly improved the ergonomics of manual interaction. Participants reported less need to crouch or lean forward (actions often required to bring the head-mounted cameras closer to objects) because the "eye" was effectively on their hand.10

Qualitative feedback from blind and low-vision (BLV) participants highlighted a distinct preference for the "hand perspective" for manipulation tasks, noting that it allowed for exploring tight spaces and occluded areas where the head could not reach.9 This suggests a future where sensory substitution devices are not monolithic but modular—using smart glasses for global navigation and wrist-worn haptics for local manipulation.

2.3 Broader Haptic and Audio Innovations

While the University of Chicago project focuses on manual interaction, other 2025 breakthroughs target whole-body navigation and communication.

  • Strap Technologies' Ara: This device represents the evolution of the "wearable cane." Worn on the chest, it senses the user's surroundings and translates obstacle data into haptic feedback, mapping hazards detected at head, chest, and lower-body height onto the torso. This somatic mapping allows users to "feel" obstacles at different elevations (e.g., a low branch vs. a curb) without auditory distraction.13

  • Envision Glasses: In the auditory domain, Envision has integrated advanced "Scene Recognition" and multilingual support into their smart glasses. By 2025, these devices have evolved from simple text readers to context-aware AI assistants capable of describing complex environments and identifying objects in real-time, bridging the gap between visual perception and semantic understanding.1

3. Intraoral Interfaces: The Tongue as a Controller

While the hand is the primary manipulator for most people, individuals with high-level spinal cord injuries (SCI), such as quadriplegia, often lose manual dexterity. For these users, the tongue represents an underutilized motor resource. The tongue is directly connected to the brain via cranial nerves (hypoglossal for motor; trigeminal, facial, and glossopharyngeal for sensory), bypassing the spinal cord entirely. It possesses immense dexterity and a large representation in the motor cortex (the "homunculus"), making it an ideal candidate for fine motor control.14

3.1 The Augmental MouthPad^

In 2024 and 2025, the startup Augmental, a spin-off from the MIT Media Lab, commercialized the MouthPad^, a breakthrough intraoral interface. Unlike legacy "sip-and-puff" mechanisms or external chin joysticks, the MouthPad^ is fully contained within the oral cavity, resembling a dental retainer.5

3.1.1 Hardware Architecture and Sensors

The device integrates a suite of sensors into a custom-fitted dental resin body.

  • Capacitive Touchpad: Located on the roof of the mouth (the palate), a capacitive, pressure-sensitive trackpad detects the position and movement of the tongue tip. This allows the user to control a cursor on a screen with high precision.

  • Inertial Measurement Unit (IMU): The device contains accelerometers and gyroscopes to track head movements. This enables a hybrid control mode where head rotation moves the cursor, and tongue gestures perform clicks or scrolls.5

  • Pressure/Sip Detection: Specific gestures, such as a "sip" motion (creating negative pressure) or tongue presses, are mapped to mouse clicks (left click, right click, drag).
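Augmental has not published its firmware, but the hybrid mode described above reduces to a simple fusion loop: IMU head-rotation rates drive the cursor while tongue and sip events generate clicks. The following sketch is hypothetical; all gains and thresholds are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class OralInputs:
    gyro_yaw: float     # head yaw rate, deg/s (IMU)
    gyro_pitch: float   # head pitch rate, deg/s (IMU)
    tongue_press: bool  # palate trackpad contact above click threshold
    sip: bool           # negative-pressure event detected

def hybrid_step(inp: OralInputs, gain=0.8, deadband=2.0):
    """One control tick: head motion -> cursor delta, tongue/sip -> buttons."""
    dx = gain * inp.gyro_yaw if abs(inp.gyro_yaw) > deadband else 0.0
    dy = gain * inp.gyro_pitch if abs(inp.gyro_pitch) > deadband else 0.0
    return dx, dy, inp.tongue_press, inp.sip  # deltas, left click, right click
```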

3.1.2 Engineering for the Oral Environment

Designing electronics for the mouth presents unique challenges regarding biocompatibility and durability. The oral environment is wet, enzymatic, and subject to mechanical crushing forces.

  • Encapsulation: The electronics are fully encapsulated in a medical-grade, biocompatible dental resin. This resin is 3D printed based on a digital intraoral scan of the user's mouth, ensuring a perfect fit and seal against saliva.5

  • Battery and Power: The device houses a miniaturized lithium-ion battery capable of 5+ hours of continuous use. Charging is achieved via a custom case, likely utilizing pogo pins or inductive charging to maintain the sealed integrity of the device. The charge time is approximately 1.5 hours.16

  • Wireless Connectivity: The MouthPad^ utilizes Bluetooth Low Energy (BLE) to communicate with computers, smartphones, and tablets. It is recognized as a standard Human Interface Device (HID), meaning it requires no specialized driver installation—a crucial feature for accessibility across different operating systems (iOS, Android, Windows, macOS).18
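Because the device enumerates as a standard HID pointing device, its payloads take the same shape as any Bluetooth mouse. For illustration, a boot-protocol mouse report is four bytes (the MouthPad^'s actual report descriptor is not public):

```python
import struct

def mouse_report(buttons: int, dx: int, dy: int, wheel: int = 0) -> bytes:
    """Pack a boot-protocol HID mouse report:
    byte 0 = button bitmask (bit 0 left, bit 1 right, bit 2 middle),
    bytes 1-3 = signed X, Y, and wheel deltas."""
    return struct.pack("<Bbbb", buttons & 0x07, dx, dy, wheel)

# Example: small right-and-down movement with the left button held
report = mouse_report(buttons=0b001, dx=5, dy=3)
```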

3.1.3 Impact on Agency and Privacy

Beyond the technical specifications, the MouthPad^ offers a significant psychological benefit: invisibility. Traditional assistive devices for quadriplegia are often conspicuous, marking the user as "disabled" to the public. The MouthPad^ is invisible during use, and because it does not impede speech (due to its slim profile on the palate), users can converse while operating their devices.19 This restoration of privacy and social invisibility is a recurring theme in 2025's accessibility tech, moving from "medical necessity" to "stealth augmentation."

4. Active Robotics: The Bionic Leg

For amputees, particularly transfemoral (above-knee) amputees, the loss of the knee joint represents a catastrophic loss of biomechanical function. The knee is responsible for generating power during the sit-to-stand transition and providing eccentric braking during stair descent. Traditional passive prosthetic knees (even microprocessor-controlled ones) can provide resistance but cannot generate positive power. They rely on the user's hip strength to swing the leg, which is energetically inefficient and leads to asymmetry and long-term joint damage.21

4.1 BionicM Bio Leg

The Bio Leg, developed by the Japanese startup BionicM (founded by Xiaojun Sun, a Ph.D. from the University of Tokyo), represents the forefront of "active" prosthetics. Awarded "Best of Innovation" at CES 2025, the Bio Leg distinguishes itself by integrating a high-torque motor drive that actively assists the user.22

4.1.1 Actuation and Biomechanics

The core differentiator of the Bio Leg is its ability to mimic physiological muscle function.

  • Powered Extension: When a user attempts to stand up from a chair, the Bio Leg detects the shift in weight and thigh angle and engages the motor to extend the knee, actively lifting the user. This replaces the function of the quadriceps, which is lost in transfemoral amputation.24

  • Stair Ascent: Passive knees require amputees to take stairs one at a time, leading with the sound leg. The Bio Leg provides the propulsive force to lift the user up the step, enabling reciprocal stair climbing (step-over-step).24

  • Drop Foot Prevention: During the swing phase of walking, the motor actively flexes the knee to ensure toe clearance, reducing the risk of tripping.21

4.1.2 Sensor Fusion and Control Logic

The control system of the Bio Leg relies on a sophisticated array of sensors to infer user intent.

  • Sensor Suite: The device incorporates inertial sensors (IMUs) to detect limb orientation and acceleration, angle sensors to measure knee flexion (up to 132°), and load sensors (force) to detect ground contact and weight bearing.24

  • Intent Recognition: While some research prototypes utilize surface electromyography (sEMG) to read muscle signals directly from the residual limb, the commercial Bio Leg primarily utilizes mechanical sensor fusion to deduce intent from limb movement dynamics. However, research partnerships (e.g., with Tokyo University of Science) continue to explore EMG integration to reduce latency between thought and action.4

  • Finite State Machines (FSM): The software likely operates on a finite state machine model, switching between distinct modes (e.g., "Sitting," "Standing," "Level Walking," "Stair Ascent") based on the sensor data thresholds. The transition between these states must be seamless to prevent falls.4
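A deliberately simplified version of such a state machine is sketched below. The states mirror those named above, but the load and angle thresholds are invented for illustration and bear no relation to BionicM's tuned values.

```python
from enum import Enum, auto

class Gait(Enum):
    SITTING = auto()
    STANDING = auto()
    LEVEL_WALKING = auto()
    STAIR_ASCENT = auto()

def transition(state, load_n, thigh_deg, shank_accel):
    """Threshold-based mode switching (all numbers hypothetical)."""
    if state is Gait.SITTING and load_n > 250 and thigh_deg > 60:
        return Gait.STANDING       # weight shift + thigh pitch: begin powered extension
    if state is Gait.STANDING and shank_accel > 1.5:
        return Gait.LEVEL_WALKING  # forward acceleration: enter swing/stance cycling
    if state is Gait.LEVEL_WALKING and thigh_deg > 75 and load_n > 400:
        return Gait.STAIR_ASCENT   # steep thigh angle under load: stair pattern
    return state                   # otherwise remain in the current mode
```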

4.1.3 Regulatory and Market Milestones

A significant barrier to the adoption of powered prosthetics is cost and insurance coverage. In a landmark achievement for 2024, BionicM received approval for the PDAC (Pricing, Data Analysis and Coding) code L5859 in the United States. This code specifically covers "prosthetic knees with the function of assisting knee bending and extending using motor power," making the Bio Leg one of the few reimbursable powered knees in the U.S. market and the first from an Asian manufacturer.25 This regulatory breakthrough is as critical as the engineering itself, as it opens the technology to a mass market.

5. Soft Robotics and Material Intelligence

While the Bio Leg represents the pinnacle of rigid robotics, a parallel revolution is occurring in soft robotics. Rigid exoskeletons can be heavy, misaligned with biological joints, and uncomfortable. Soft robotics utilizes compliant materials (elastomers, textiles, fluids) to create actuators that conform to the body.

5.1 Light-Driven Actuators (Rice University)

In 2025, researchers at Rice University unveiled a soft robotic arm made from azobenzene liquid crystal elastomer (LCE). This material is photothermal, meaning it changes shape (contracts or bends) when exposed to light.

  • Mechanism: The researchers used a patterned laser to heat specific regions of the LCE arm. The heat causes the liquid crystals to disorder, leading to contraction. By controlling the light pattern, they could induce complex, multi-degree-of-freedom movements without any onboard wires, batteries, or motors.27

  • Implications: While currently in the proof-of-concept stage, this technology suggests a future of "tetherless" assistive devices. Imagine a surgical tool or an intra-body assist device that is powered and controlled entirely by external light, eliminating the need for heavy battery packs or risk of electrical shorts inside the body.
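To make the photothermal mechanism concrete, a toy first-order model is sketched below: absorbed laser power heats the elastomer toward a steady-state temperature, and contraction strain grows as the material approaches its nematic-isotropic transition. This is a qualitative illustration with invented constants, not the Rice group's model.

```python
import numpy as np

def lce_response(power_w, t_s, tau=2.0, k_heat=40.0, t_amb=25.0,
                 t_iso=80.0, max_strain=0.3):
    """Temperature and contraction strain of an LCE segment over time.

    First-order heating toward T_ss = T_amb + k_heat * P, followed by a
    linear strain ramp between ambient and the transition temperature.
    All constants are hypothetical.
    """
    t_ss = t_amb + k_heat * power_w                   # steady-state temperature
    temp = t_amb + (t_ss - t_amb) * (1 - np.exp(-t_s / tau))
    strain = max_strain * np.clip((temp - t_amb) / (t_iso - t_amb), 0, 1)
    return temp, strain
```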

5.2 AI-Driven Soft Exosuits (NCSU & Harvard)

A major challenge with wearable robots is calibration. Every human walks differently, and an exoskeleton must be tuned to the individual's gait to provide a metabolic benefit. Traditional "human-in-the-loop" optimization requires hours of walking in a lab, which is exhausting for patients.

  • Sim-to-Real Reinforcement Learning: Researchers at North Carolina State University (NCSU) developed a new AI framework, published in Nature (2024). They trained a neural network controller in a computer simulation (millions of virtual steps) and then transferred the learned policy to a physical exoskeleton.28

  • Results: The "embodied AI" allowed the exoskeleton to immediately assist users in walking, running, and climbing stairs without any manual calibration. It reduced metabolic energy consumption by 24.3% during walking.28 This "plug-and-play" capability is a prerequisite for consumer adoption of exosuits.
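At deployment, such a simulation-trained controller reduces to an inexpensive inference loop: a network maps recent joint kinematics to an assistive torque at each control tick. The sketch below shows only the shape of that loop; the observation layout, torque limits, and policy interface are placeholders, not the published architecture.

```python
import numpy as np

def control_tick(policy, hip_angle, hip_vel, history):
    """One exoskeleton control step with a simulation-trained policy.

    `policy` is any callable mapping an observation vector to a torque
    command, e.g. a small MLP exported from the training framework.
    """
    obs = np.concatenate(([hip_angle, hip_vel], history))  # recent kinematics
    torque = float(policy(obs))                            # assistive hip torque, N*m
    return np.clip(torque, -20.0, 20.0)                    # hypothetical safety limits
```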

At Harvard's Wyss Institute, similar soft exosuit technology is being applied to Parkinson's disease. By applying mechanical forces to the hips, the soft suit helps lengthen the stride and eliminate "freezing of gait" (a symptom where patients feel their feet are glued to the floor). The soft form factor means the device can be worn under regular clothing, reducing social stigma.6

6. Emerging Sensor Trends and Challenges

6.1 The Rise of Multimodal Sensing

The review of 2025 literature indicates a definitive move away from single-modality sensors. Where 2020-era wearables might rely solely on an inertial measurement unit (IMU) to count steps, 2025 devices fuse IMU data with physiological signals.

  • EMG + IMU: The integration of electromyography (muscle electrical activity) allows devices to detect the onset of movement intent milliseconds before the limb actually moves, reducing the "drag" feeling in prosthetics.4 A minimal onset-detection sketch follows this list.

  • Computer Vision + Haptics: As seen in the Chicago project and Envision glasses, the camera is becoming a standard sensor for non-visual tasks, feeding data into haptic or auditory outputs.1
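A common way to implement the EMG-onset detection described above is to rectify the signal, low-pass filter it into an envelope, and threshold against a resting baseline. A minimal sketch, assuming SciPy and a known rest period at the start of the recording:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def emg_onset_time(emg, fs, rest_s=0.5, k=3.0):
    """Return the time (s) when the EMG envelope first exceeds baseline.

    Rectify -> 5 Hz low-pass envelope -> threshold at mean + k*std of
    the assumed resting segment. Returns None if no onset is found.
    """
    b, a = butter(2, 5.0 / (fs / 2.0))             # 2nd-order 5 Hz low-pass
    env = filtfilt(b, a, np.abs(emg - np.mean(emg)))
    rest = env[: int(rest_s * fs)]
    thresh = rest.mean() + k * rest.std()
    above = np.nonzero(env > thresh)[0]
    return above[0] / fs if above.size else None
```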

6.2 The Battery Bottleneck

Despite these advances, power density remains the Achilles' heel of wearable robotics. The Bio Leg weighs 3.0 kg, a significant portion of which is the lithium-ion battery required to drive its high-torque motor.24 The MouthPad^ manages 5+ hours of life because it uses low-power BLE, but active mobility devices require significantly more energy. Future breakthroughs in solid-state batteries or energy harvesting (using the body's heat or motion to recharge) are desperately needed to extend the autonomy of these systems beyond a few hours.

6.3 Privacy and Ethics in the Age of Cameras

The proliferation of camera-based wearables (smart glasses, wrist cameras) raises significant privacy concerns. Bystanders may not consent to being filmed by an accessibility device. Current research explores "privacy-preserving" computer vision, where the camera processes data at the edge (on the device) and discards the images immediately, extracting only abstract features (e.g., "obstacle ahead") without storing faces or identifiable details. However, the tension between the user's need for information and the public's right to privacy remains a critical ethical debate in 2025.30
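In code, this "process-and-discard" pattern is straightforward: the frame is consumed on-device, only abstract features leave the function, and no pixels are persisted. The detector interface below is a hypothetical placeholder:

```python
def summarize_scene(frame, detect):
    """Edge-side processing that never stores or transmits raw pixels.

    `detect` is any on-device model returning (label, distance_m) pairs;
    only this abstract summary leaves the device.
    """
    features = [(label, round(dist, 1)) for label, dist in detect(frame)]
    # The raw frame goes out of scope here and is never written to storage
    # or sent over the radio; bystanders' faces are never retained.
    return {"obstacles": features}
```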

6.4 Biocompatibility of "Intimate" Computing

As wearables move inside the mouth (MouthPad^) or directly stimulate the skin (electrotactile), biocompatibility becomes paramount. Materials must not only be non-toxic but also resistant to bacterial colonization (biofouling). The use of dental-grade resins and encapsulated electronics in the MouthPad^ sets a standard for this new class of "intimate computing" devices.32

7. Conclusions

The landscape of wearable accessibility technology in 2025 is defined by the dissolution of boundaries. The boundary between user and machine is blurring through intent-recognition AI and soft interfaces. The boundary between different senses is being crossed by high-fidelity sensory substitution. And the boundary between "medical device" and "consumer electronic" is eroding as accessibility tools become sleeker, smarter, and more socially integrated.

The breakthroughs highlighted in this report—the hand-centric haptic vision of the University of Chicago, the invisible tongue control of Augmental, and the active bionic propulsion of BionicM—demonstrate that the future of hardware is not just about restoring lost function. It is about creating new, adaptive modalities of interaction that empower the human body to engage with the world in unprecedented ways. As these technologies mature and navigate the regulatory pathway to reimbursement, they promise to transform disability from a limitation of the body into a catalyst for technological innovation.

Table 1: Comparative Analysis of Key 2024-2025 Wearable Accessibility Hardware

| Device | Developer | Primary Modality | Target User | Key Innovation | Regulatory Status |
| --- | --- | --- | --- | --- | --- |
| "Seeing with the Hands" | Univ. of Chicago | Vision to Electrotactile | Blind/Low Vision | Wrist camera + dorsal-hand haptics; supports hand preshaping | Research Prototype (CHI '25) |
| MouthPad^ | Augmental | Tongue to Digital Cursor | Quadriplegia / SCI | Intraoral trackpad; invisible form factor; head tracking | Commercial Product (Consumer) |
| Bio Leg | BionicM | Active Motorized Knee | Transfemoral Amputees | Powered extension for sit-to-stand; reciprocal stair climbing | FDA Class II / PDAC L5859 |
| Exosuit (Sim-to-Real) | NCSU / Harvard | Soft Robotic Exoskeleton | Stroke / Parkinson's | AI training in simulation for instant user adaptation | Research / Clinical Trials |
| Envision Glasses | Envision | Vision to Audio (AI) | Blind/Low Vision | Multilingual text-to-speech; advanced scene description | Commercial Product |

Table 2: Technical Specifications of Selected Devices


| Feature | Augmental MouthPad^ | BionicM Bio Leg |
| --- | --- | --- |
| Weight | ~7.5 g | 3.0 kg (incl. battery) |
| Dimensions | ~30 x 50 x 80 mm (custom fit) | Build height: 283 mm |
| Battery Life | 5+ hours continuous | ~5.7 hours4 |
| Charging Time | ~1.5 hours | 3-4 hours |
| Sensor Types | Capacitive Touch, IMU | IMU, Angle, Load Cell |
| Connectivity | Bluetooth Low Energy (BLE) | Proprietary / Bluetooth |
| Actuation | Passive (User movement) | Active Motor (High Torque) |
| Max Knee Flexion | N/A | 132° |

Works cited

  1. What's Next for 2025: The Future of Assistive Technology & AI Assistants - Envision AI, accessed January 10, 2026, https://www.letsenvision.com/blog/future-assistive-technology

  2. Discover Top 10 Wearables Trends in 2025 - StartUs Insights, accessed January 10, 2026, https://www.startus-insights.com/innovators-guide/wearables-trends/

  3. Advancements in Wearable Sensor Technologies for Health Monitoring in Terms of Clinical Applications, Rehabilitation, and Disease Risk Assessment: Systematic Review - JMIR mHealth and uHealth, accessed January 10, 2026, https://mhealth.jmir.org/2026/1/e76084

  4. Design and Testing of an Emg-Controlled Semi-Active Knee Prosthesis - MDPI, accessed January 10, 2026, https://www.mdpi.com/1424-8220/25/24/7505

  5. CSUN 2025: Augmental MouthPad Smart Mouthwear - YouTube, accessed January 10, 2026, https://www.youtube.com/watch?v=dt5Vo_fxBto

  6. Soft robotic, wearable device improves walking for individual with Parkinson's disease, accessed January 10, 2026, https://www.sciencedaily.com/releases/2024/01/240105145104.htm

  7. Sensory Substitution Device Tingles Back Of Your Hand - Hackaday, accessed January 10, 2026, https://hackaday.com/2025/03/03/sensory-substitution-device-tingles-back-of-your-hand/

  8. Seeing with the Hands: A sensory substitution that supports manual interactions (CHI 2025 Live Talk) - YouTube, accessed January 10, 2026, https://www.youtube.com/watch?v=s1ebklQrWGg

  9. Seeing with the Hands | A sensory substitution that supports manual interactions, accessed January 10, 2026, https://seeingwiththehands.com/

  10. Seeing with the Hands: A Sensory Substitution That Supports Manual Interactions, accessed January 10, 2026, https://lab.plopes.org/published/2025-CHI-SeeingWithTheHands.pdf

  11. A wearable device that allows you to see things with your hands is developed - GIGAZINE, accessed January 10, 2026, https://gigazine.net/gsc_news/en/20250304-seeing-with-the-hands/

  12. Seeing with the Hands: A sensory substitution that supports manual interactions - YouTube, accessed January 10, 2026, https://www.youtube.com/watch?v=EP9Y2XWmlds

  13. Top 8 New Assistive Technologies for 2024 | Focus Care, accessed January 10, 2026, https://focuscare.com.au/blog/5-new-assistive-technologies-for-2024

  14. MouthPad^ | VML, accessed January 10, 2026, https://www.vml.com/work/mouthpad

  15. Mouth-based touchpad enables people living with paralysis to interact with computers | MIT News | Massachusetts Institute of Technology, accessed January 10, 2026, https://news.mit.edu/2024/mouth-based-touchpad-augmental-0605

  16. Augmental - Home, accessed January 10, 2026, https://www.augmental.tech/

  17. #3DStartup: Augmental's MouthPad^ Makes Technology More Accessible - 3Dnatives, accessed January 10, 2026, https://www.3dnatives.com/en/startup-of-the-month-augmentals-mouthpad-makes-technology-more-accessible-061020254/

  18. The MouthPad^ - Closing The Gap, accessed January 10, 2026, https://www.closingthegap.com/guide-product/the-mouthpad/

  19. FAQ - Augmental, accessed January 10, 2026, https://www.augmental.tech/faq

  20. Tech Tea Time: Hands-free Device Control with the MouthPad by Augmental, accessed January 10, 2026, https://www.tsbvi.edu/events/tech-tea-time-hands-free-device-control-with-the-mouthpad-by-augmental

  21. CES 2025: BionicM Bio Leg with Micro-Processor Knee - YouTube, accessed January 10, 2026, https://www.youtube.com/watch?v=j0ClpwWSsDg

  22. Bio Leg - CES, accessed January 10, 2026, https://www.ces.tech/ces-innovation-awards/2025/bio-leg/

  23. BionicM Bio Leg Gets CES Innovation Award - Rehab Management, accessed January 10, 2026, https://rehabpub.com/orthotics-prosthetics/prosthetics/bionicm-bio-leg-gets-ces-innovation-award/

  24. Product | BionicM, accessed January 10, 2026, https://bionicm.com/product/

  25. Powered Prosthetic Knee “Bio Leg” Has Obtained PDAC approval with 5 L-codes|BionicM, accessed January 10, 2026, https://db.bionicm.com/powered-prosthetic-knee-bio-leg-receives-insurance-coverage-approval-in-the-u-s/

  26. BionicM Signs Joint Research Agreement with Tokyo University of Science for “Bio Leg®”, accessed January 10, 2026, https://bionicm.com/newsDetail/9652/BionicM%20Signs%20Joint%20Research%20Agreement%20with%20Tokyo%20University%20of%20Science%20for%20%E2%80%9CBio%20Leg%C2%AE%E2%80%9D

  27. Rethinking how robots move: Light and AI drive precise motion in soft robotic arm developed at Rice | Rice News, accessed January 10, 2026, https://news.rice.edu/news/2025/rethinking-how-robots-move-light-and-ai-drive-precise-motion-soft-robotic-arm-developed

  28. Harnessing AI to enhance human mobility | NSF - U.S. National Science Foundation, accessed January 10, 2026, https://www.nsf.gov/science-matters/harnessing-ai-enhance-human-mobility

  29. AI-Powered Simulation Training Improves Human Performance in Robotic Exoskeletons, accessed January 10, 2026, https://news.ncsu.edu/2024/06/ai-training-robotic-exoskeletons/

  30. Wearable Computing - Challenges and opportunities for privacy protection, accessed January 10, 2026, https://www.priv.gc.ca/en/opc-actions-and-decisions/research/explore-privacy-research/2014/wc_201401/

  31. Privacy, ethics, transparency, and accountability in AI systems for wearable devices - PMC, accessed January 10, 2026, https://pmc.ncbi.nlm.nih.gov/articles/PMC12209263/

  32. Wearable Orofacial Technology and Orthodontics - PMC - NIH, accessed January 10, 2026, https://pmc.ncbi.nlm.nih.gov/articles/PMC9858298/

  33. Challenges in Developing Wearable Medical Devices: Navigating Biocompatibility and Material Selection - Mighty Studios, accessed January 10, 2026, https://www.mighty-studios.com/insights/biocompatibility-and-materials-selection-for-wearable-medical-devices
