Maggie Collier

I am a researcher with multidisciplinary experience in robotics, assistive technology, and biomedical device development. My current research interests include Human-Robot Interaction, Assistive Robotics, and Disability Studies.

Please excuse this messy website. It's currently under construction.

I am currently pursuing my Ph.D. in Robotics at Carnegie Mellon University (CMU), where I conduct assistive robotics research in the Human and Robot Partners Lab (HARP Lab). In the spring of 2019, I was awarded the National Defense Science and Engineering Graduate Fellowship to fund my graduate education.

In 2019, I earned two Bachelor of Science degrees (Electrical Engineering and Biomedical Engineering) from the University of Alabama at Birmingham (UAB). During my time at UAB, I conducted three years of tissue engineering research and participated in two robotics summer research programs: Georgia Tech's 2017 SURE Robotics program (in Prof. Charlie Kemp's lab) and Carnegie Mellon's 2018 Robotics Institute Summer Scholars program (in Prof. Henny Admoni's HARP Lab). Additionally, I was named a Goldwater Scholar in 2017.

Email  /  GitHub  /  Google Scholar  /  LinkedIn  /  CV

The image displays a profile photo of the researcher Maggie Collier. In the image, a woman with brown curly hair that stops just past her shoulders is sitting on a couch with her arm resting on the armrest, smiling at the camera. She is wearing a white fitted t-shirt with a red plaid skirt and black tights.

Research

I'm interested in Human-Robot Interaction, Assistive Robotics, and Disability Studies.

A flowchart to demonstrate assistive teleoperation, or shared control. The leftmost part of the chart includes an image of a person and an image of a computer displaying an image of a brain. The computer represents the agent computing the robot's actions. Actions from the human and robot are combined in the middle of the flowchart in the command arbitration step, which involves an equation: alpha (an arbitration parameter) * the robot's action + (1 - alpha) * the user's action = the output action. The rightmost part of the image shows the output command going to a robot arm.

Uncovering People's Preferences for Robot Autonomy in Assistive Teleoperation


Human and Robot Partners Lab, Carnegie Mellon University
2021
paper / poster

ABSTRACT: What factors influence people’s preferences for robot assistance? Answering this question can help roboticists formalize assistance that leads to higher user satisfaction and increased user acceptance of assistive technology. Often in assistive robotics literature, we see paradigms that aim to optimize task success metrics or measures of users’ perceived task complexity and cognitive load. However, frequently in this literature, participants express a preference for paradigms that do not perform optimally with respect to these metrics. Therefore, task success and cognitive load metrics alone do not encapsulate all of the factors that inform users’ needs or desires for robotic assistance. We focus on a subset of assistance paradigms for manipulation called assistive teleoperation in which the system combines control signals from the user and the automated assistance. In this work, we aim to study potential factors that influence users’ preferences for assistance during object manipulation tasks. We design a study to evaluate two factors (magnitude of end effector movement and the degrees of freedom being controlled) that may influence the amount of automated assistance the user wants.
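The command arbitration described in the flowchart above can be sketched as a simple linear blend. This is a minimal illustration of the general shared-control formulation, not the study's actual implementation; the function and variable names here are hypothetical.

```python
import numpy as np

def arbitrate(user_action, robot_action, alpha):
    """Blend user and robot commands linearly.

    alpha is the arbitration parameter in [0, 1]:
    alpha = 0 -> pure teleoperation (user command passes through),
    alpha = 1 -> full autonomy (robot command passes through).
    """
    user_action = np.asarray(user_action, dtype=float)
    robot_action = np.asarray(robot_action, dtype=float)
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must lie in [0, 1]")
    return alpha * robot_action + (1.0 - alpha) * user_action

# Example: 2-D end-effector velocity commands.
user = [1.0, 0.0]    # user pushes the end effector to the right
robot = [0.0, 1.0]   # assistance pulls it toward an inferred goal
blended = arbitrate(user, robot, alpha=0.5)
# -> array([0.5, 0.5]): an even mix of the two commands
```

In practice, the interesting design question (and the subject of the study above) is how alpha should be chosen: fixed, tuned per user, or varied online with the system's confidence in its goal prediction.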

Gif to demonstrate eye gaze behavior during teleoperation of a robot. The gif shows a robot arm moving rightward on a kitchen counter with an eye gaze dot overlaid in green. The gaze dot starts on the hand of the robot arm but then transitions to being focused on a few objects above the counter as the robot arm moves toward the objects.

Eye Gaze Behavior in Teleoperation of a Robot in a Multi-stage Task


Human and Robot Partners Lab, Carnegie Mellon University
2020
paper / poster

ABSTRACT: Individuals with motor disabilities can use assistive robotics to independently perform activities of daily living. Many of these activities, such as food preparation and dressing, are complex and represent numerous subtasks which can often be performed in a variety of sequences to successfully achieve a person’s overall goal. Because different kinds of tasks require different types or levels of assistance, the variety of subtask sequences which the user can choose makes robotic assistance challenging to implement in a multi-stage task. However, a system that can anticipate the user’s next intended subtask can optimize the assistance provided during a multi-stage task. Psychology research indicates that non-verbal communication, such as eye gaze, can provide clues about people’s strategies and goals while they manipulate objects. In this work, we investigate the use of eye gaze for user goal anticipation during the telemanipulation of a multi-stage task.

Image to demonstrate robot-assisted dressing. A person sits in front of a robot with their arm outstretched while the robot pulls a cloth gown over the person's arm.

Capacitive Sensing in Robot-Assisted Dressing


Healthcare Robotics Lab, Georgia Tech
2017
paper

ABSTRACT: Dressing is a fundamental task of everyday living and robots offer an opportunity to assist older adults and people with motor impairments who need assistance with dressing. While several robotic systems have explored robot-assisted dressing, few have considered how a robot can manage error due to human pose estimation or adapt to human motion during dressing assistance. In addition, estimating pose changes due to human motion can be challenging with vision-based techniques since dressing is intended to visually occlude the body with clothing. We present a method that uses proximity sensing to enable a robot to account for errors in the estimated pose of a person and adapt to the person’s motion during dressing assistance.

Image to demonstrate coil embolization. The image has three frames, each including an image of a blood vessel with an aneurysm. A gray wire indicating the coil shows it being inserted into the aneurysm with a catheter and filling up the volume of the aneurysm.

Improving Coil Embolization of Brain Aneurysms


Biomedical Engineering Dept., UAB
2017

ABSTRACT: Brain aneurysms represent a health risk that must be treated as quickly and efficiently as possible to prevent fatality and the brain damage often associated with hemorrhage. Surgical clipping and coil embolization are the two most effective treatments for ruptured brain aneurysms; yet, both are problematic. While surgical clipping is disadvantageous due to the extreme invasiveness of the craniotomy it requires, coiling carries a higher risk of rebleeding (aneurysmal rerupture) than clipping, an unfortunate outcome with a mortality rate of 70%. This higher rate of rebleeding is thought to be the result of poor aneurysmal healing in the months following initial coil placement. This project aimed to decrease rates of rebleeding in brain aneurysms treated with coiling by enhancing this healing process. The lab previously demonstrated a self-assembling nanomatrix coating that provides an endothelium-mimicking microenvironment on the surface of cardiovascular devices. This kind of environment, we hypothesized, might encourage proper aneurysmal healing. Thus, we applied the coating to the surface of the coils in an attempt to reduce the risk of rebleeding associated with coil embolization.




Other Projects

These include coursework, side projects, and unpublished research work.

The image displays three books about disability: Care Work: Dreaming Disability Justice by Leah Lakshmi Piepzna-Samarasinha, Disability Visibility by Alice Wong, and The Disability Studies Reader by Lennard J. Davis.

Assistive Applications, Accessibility, and Disability Ethics (A3DE) Workshop at HRI 2024


organizing
2024-03-01
paper

Description: This full-day workshop addresses the problems of accessibility in HRI and the interplay of ethical considerations for disability-centered design and research, accessibility concerns for disabled researchers, and the design of assistive HRI technologies. The workshop will include keynote speakers from academia and industry, a discussion with expert academic, industry, and disability advocate panelists, and activities and breakout sessions designed to facilitate conversations about the accessibility of the HRI community, assistive technology research, and research ethics from all areas of HRI (technical, social, psychological, design, etc.). We will explore guidelines around research ethics in this area of HRI, as well as new directions for this area of research.

The gif shows two frames simultaneously. In the left frame, a video of a mobile robot moving around an arena to capture blocks and balls underneath the robot. In the right frame, the localization component of the pipeline is demonstrated. A black screen shows a moving target that represents the robot. Four walls are detected on the screen to show the walls of the arena being sensed by the robot.

EE Capstone Project: Autonomous Robot for Hardware Competition


capstone
2019-04-01

Description: For my EE senior capstone, my team and I built an autonomous robot for the IEEE SouthEast Student Competition. I worked on the localization component, which can be seen in the project thumbnail.

The image shows a solid model of the alarm clock. The front face is an analog clock face of the type typically used by blind people. On top of the clock are the novel alarm- and time-setting buttons, which allow the current time or the alarm to be set using braille. The top of the clock includes space for four braille characters.

BME Capstone Project: Alarm Clock for People with Deaf-Blindness


capstone
2017-04-01

Description: A request from the Alabama Department of Rehabilitation Services led us to discover the need for an inexpensive method to provide a timekeeping and alarm device that can be set by people with deaf-blindness without assistance from a caretaker, thereby increasing the users' independence. Based on price and quality gaps between existing solutions and the desired device, we developed an alarm clock that features an analog clock face to be read by tactile stimulation, an electrical interface that matches those found in standard bed vibrators already commonly used in the deaf community, and a "reverse braille" push button mechanism to set the current time as well as the alarm. The reverse braille input mechanism is novel in itself due to its relatively minimalist design, which contrasts with modern designs that have become increasingly complex. In addition to its usage in our cost-effective alarm clock, the reverse braille input mechanism can be applied to a wide variety of independent niche uses for individuals with deaf-blindness, which could increase their independence drastically.


Design and source code from Jon Barron's website