Research

Haptic Shared Control 

Numerous companies and academic groups are pushing to develop autonomous vehicles with the aim of freeing up attention for drivers and improving safety on the road. However, technical, legal, and social barriers to the deployment of fully autonomous vehicles remain. Combining the best capacities of a human driver with the speed, accuracy, and tirelessness of automation will require an interface that is intuitive for the driver. In this project, we are developing an interface based on haptic shared control, wherein the human driver and automation system communicate and negotiate authority over the steering angle by modulating not only their imposed torques, but also their respective mechanical impedances. (by: Akshay Bhardwaj)
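
As a rough illustration of the idea (a hedged sketch, not the project's actual controller), the snippet below blends human and automation steering torques, where each agent's contribution comes from an impedance, a stiffness and damping that can be raised or lowered to claim or cede authority over the wheel. All function names, gains, and values are hypothetical.

```python
# Minimal sketch of haptic shared control over a steering angle.
# All parameter names and values are illustrative, not the project's controller.

def agent_torque(theta, theta_dot, theta_desired, stiffness, damping):
    """Impedance-style torque: a spring-damper pulling toward a desired angle."""
    return stiffness * (theta_desired - theta) - damping * theta_dot

def shared_steering_torque(theta, theta_dot,
                           theta_human, k_human, b_human,
                           theta_auto, k_auto, b_auto):
    """Net torque on the steering wheel when human and automation act together.

    Each agent can modulate its stiffness (k) and damping (b) to assert
    more or less authority over the steering angle.
    """
    tau_human = agent_torque(theta, theta_dot, theta_human, k_human, b_human)
    tau_auto = agent_torque(theta, theta_dot, theta_auto, k_auto, b_auto)
    return tau_human + tau_auto

# Example: the automation stiffens (higher k_auto) to pull the wheel toward
# its desired angle, while a relaxed human grip contributes little torque.
tau = shared_steering_torque(theta=0.10, theta_dot=0.0,
                             theta_human=0.10, k_human=2.0, b_human=0.5,
                             theta_auto=0.25, k_auto=8.0, b_auto=1.0)
print(f"net steering torque: {tau:.2f} N·m")
```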

Shared Control in Vehicle Steering Across Routine and Off-Nominal Conditions

Experience with increasingly autonomous systems in aviation and other complex domains has shown that performance breakdowns tend to occur at transition points and in off-nominal conditions, rather than during routine operations. In particular, operators experience ‘automation surprises’ during transitions between levels of automation, in cases where the system acts in unanticipated or unexplained ways, and when it transfers control to the operator without adequate advance warning. Further, a lack of transparency regarding the capabilities, limitations, and strategies of highly automated systems has been associated with trust miscalibration and the failure to intervene when necessary. Keeping operators involved, informed, and engaged on a continuing basis has proven critical for handling unexpected events and preventing such events from turning into accidents. This project focuses on understanding the authority transitions between human drivers and vehicle automation systems by comparing driving performance under discrete and continuous control sharing schemes and under normal (intended) and faulty (adversarial) automation behaviors. (by: Akshay Bhardwaj)
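
To make the contrast between the two sharing schemes concrete (hypothetical parameters, not the experimental implementation), a discrete scheme hands full authority to one agent at a time, while a continuous scheme blends both torques with a time-varying authority weight, which allows transitions to be ramped rather than switched.

```python
# Illustrative contrast between discrete and continuous control sharing.
# Names and numbers are hypothetical, not the study's implementation.

def discrete_sharing(tau_human, tau_auto, automation_engaged):
    """Discrete scheme: authority switches wholesale between agents."""
    return tau_auto if automation_engaged else tau_human

def continuous_sharing(tau_human, tau_auto, alpha):
    """Continuous scheme: torques are blended by an authority weight alpha in [0, 1]."""
    alpha = min(max(alpha, 0.0), 1.0)
    return (1.0 - alpha) * tau_human + alpha * tau_auto

# During a transition, the continuous scheme can ramp alpha gradually,
# whereas the discrete scheme hands over control in a single step.
for alpha in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(alpha, continuous_sharing(tau_human=1.0, tau_auto=-2.0, alpha=alpha))
```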

The Holy Braille Project

The goal of the Holy Braille project is to develop a large-area, dense-array tactile display device capable of presenting a full page of braille characters and tactile diagrams in the same format. Imagine a Kindle or an iPad where, instead of pixels, there are physical features that you can touch and interact with. Current commercial refreshable braille displays are expensive and bulky, which effectively limits them to a single line of braille. A single line of text makes reading difficult and displaying tactile graphics impossible.

We are using microfluidic technology to build and control pneumatic actuators for a full-page refreshable braille display. Pressure is routed through networks of microchannels to raise and lower bubbles on a surface. We are constructing the devices in a clean room environment using conventional microfabrication techniques. See recent updates and news about the project here. (by: Alex Russomanno)
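
On the software side of such a device (a loose illustration; the project itself concerns the microfluidic hardware), a full-page display ultimately needs a mapping from characters to the raised or lowered state of each dot. The sketch below uses the standard six-dot braille cell numbering with a small, hypothetical pattern table.

```python
# Illustrative mapping from characters to braille dot states for a refreshable display.
# Dot numbering (1-3 left column top-to-bottom, 4-6 right column) is standard braille;
# the table below is only a small hypothetical excerpt.

BRAILLE_DOTS = {
    "a": {1},
    "b": {1, 2},
    "c": {1, 4},
    "d": {1, 4, 5},
    "e": {1, 5},
}

def cell_state(char):
    """Return a 6-element tuple of booleans: True = raise the bubble, False = lower it."""
    raised = BRAILLE_DOTS.get(char.lower(), set())
    return tuple(dot in raised for dot in range(1, 7))

# A display controller would route pressure to the actuators whose state is True.
for ch in "bead":
    print(ch, cell_state(ch))
```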

Digital Hydraulic Body-Powered Exoskeletons for Stroke Rehabilitation

Traditional exoskeletons employ highly geared motors to provide enough power to move or constrain the human body. This leads to heavy, non-backdrivable exoskeletons with limited portability. We are developing an alternative style of exoskeleton in which power can be routed from one limb to another for assistance and rehabilitation via a digital hydraulic transmission. By linking the joints together through a variable transmission, we will be able to coordinate and constrain relative motions between the user’s joints. Using digital hydraulics, we will connect different cylinders through a valve bank, altering the transmission ratio to relate the user’s input motions to desired output motions. Since the user’s own body power is employed to move another limb, the device is inherently safe and backdrivable. (by: Emma Treadway)
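
To make the transmission idea concrete (a hedged sketch with hypothetical cylinder sizes, not the device's design), connecting two ideal hydraulic cylinders means the fluid displaced by the input piston drives the output piston, so the ratio of piston areas sets how input motion and force map to output motion and force; selecting among cylinders of different areas through a valve bank changes that ratio in discrete steps.

```python
# Sketch of how a cylinder-area ratio sets a hydraulic transmission ratio.
# Values are hypothetical; the actual device selects ratios through a valve bank.

def output_motion(input_displacement, input_area, output_area):
    """Conservation of fluid volume: area_in * x_in = area_out * x_out."""
    return input_displacement * input_area / output_area

def output_force(input_force, input_area, output_area):
    """Equal pressure on both pistons: F_in / area_in = F_out / area_out."""
    return input_force * output_area / input_area

# Routing the input cylinder to a smaller output cylinder amplifies motion
# at the cost of force; a larger output cylinder does the opposite.
for area_out in (1.0e-4, 2.0e-4, 4.0e-4):          # m^2
    x_out = output_motion(input_displacement=0.02,  # m
                          input_area=2.0e-4, output_area=area_out)
    f_out = output_force(input_force=50.0,          # N
                         input_area=2.0e-4, output_area=area_out)
    print(f"A_out={area_out:.0e} m^2 -> x_out={x_out*100:.1f} cm, F_out={f_out:.0f} N")
```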