


Workshop On Interacting with Robots Through Touch

September 13, 2016

9AM - 6PM

Social & Behavioral Sciences Gateway, Room 1517

University of California, Irvine

Schedule

Confirmed Speakers

Register Now!

Pre-registration is FREE! Registration on the day of the event is $25.

Robots and autonomous systems are increasingly becoming a part of our everyday life. In particular, co-robots, in which robots have a symbiotic relationship with people, have the potential to increase social well-being and open up new socioeconomic opportunities. For example, Human-Robot Interaction (HRI), co-robotics, and Socially Assistive Robots (SARs) are increasingly being used for entertainment, education, telepresence, rehabilitation, and therapy. SARs have the potential to help children with developmental disorders, such as autism or attention deficit disorders, because robots fall somewhere between toys, which do not elicit novel social behaviors, and people or animals, which can be a source of confusion and distress to children with developmental disorders. In the classroom, social robots can act as digital ethnographers by automatically detecting which robot-generated activities children enjoy most, monitoring the development of social structure within the classroom, tracking and improving vocabulary development, and providing useful information to parents, teachers, clinicians, and researchers. To date, most of these co-robots focus on eye contact (e.g., shared attention and shared gaze) and auditory cues (e.g., catch phrases and music), but tend to neglect other sensory systems important for social behavior, such as tactile interaction.

The purpose of this workshop is to explore the use of tactile sensing in HRI and SARs. While haptics is a well-developed research domain, we intend to focus on the kind of touch that occurs between humans, and between humans and their environment, in social learning and collaborative settings. We will examine the observed and potential roles of touch in human-to-robot and robot-to-robot interactions. We will also explore application domains, as well as different materials and technologies (e.g., artificial skin and soft robotics) for giving robots a sense of touch. Our goals are to: 1) educate the community on making tactile sensing and interaction a main focus of their co-robots, and 2) explore the possibilities of using touch and new methods of interaction in HRI and SAR systems. The workshop will be a one-day open forum with talks given by the organizers and an interactive session with demonstrations and posters given by students and other interested participants.

Organizers

 

Andrea Chiba

achiba@ucsd.edu

http://healthsciences.ucsd.edu/education/neurograd/faculty/Pages/andrea-chiba.aspx

Andrea Chiba is a Professor in the Department of Cognitive Science and in the Program for Neuroscience at the University of California, San Diego. She is Co-Director and the Science Director of the Temporal Dynamics of Learning Center, an NSF Science of Learning Center. The Center's research focuses on the importance of time and timing in various aspects of learning, from the level of the synapse to social interactions. The goal of the Center is not only to understand learning, but also to translate this understanding into the practice of education. Chiba is involved in many Center projects that allow cross-species comparisons of learning and memory, bridging from rodent to human. The Chiba Laboratory is focused on understanding the neural systems and principles underlying aspects of learning, memory, affect, and attention, with an emphasis on neural plasticity. Her group's work is highly interdisciplinary, using a variety of neurobiological, neurochemical, neurophysiological, computational, and behavioral techniques.

 

Ting-Shuo Chou

tingshuc@uci.edu

https://www.linkedin.com/in/ting-shuo-chou-a05b1955

Ting-Shuo Chou received the Ph.D. degree in computer science from the University of California, Irvine, in 2015 and the M.S. degree in computer science from National Tsing Hua University, Hsinchu, Taiwan, in 2006. He is currently an Assistant Project Scientist in the Department of Cognitive Sciences, UCI School of Social Sciences. His research interests include socially assistive robots (SARs), neurorobotics, neuromorphic engineering, and simulations of large-scale neural networks. His recent research focuses on a SAR that communicates with people through tactile interactions. Building on this robot, he explores opportunities for helping children with ASD or ADHD.

 

Deborah Forster

dforster@ucsd.edu

http://www.calit2.net/people/detail.php?id=948

Deborah Forster received a BSc in biology and a PhD in cognitive science from UC San Diego. She is an Assistant Project Scientist at the Qualcomm Institute, where she currently leads the Machine Perception Lab, and a researcher in the Temporal Dynamics of Learning Center at the Institute for Neural Computation at UC San Diego. Her research program on Technology Enhanced Learning is informed by her field studies of social complexity and distributed cognition in wild-ranging baboons, of humans in technology-rich environments such as driving, and of the use of social robots in early childhood education.

 

Jeff Krichmar

jkrichma@uci.edu

http://www.socsci.uci.edu/~jkrichma/

Jeff Krichmar received a B.S. in Computer Science in 1983 from the University of Massachusetts at Amherst, an M.S. in Computer Science from The George Washington University in 1991, and a Ph.D. in Computational Sciences and Informatics from George Mason University in 1997. He is currently a Professor in the Department of Cognitive Sciences and the Department of Computer Science at the University of California, Irvine. He has been promoting the field of neurorobotics and brain-based robotics for over 15 years. One of his more recent robots senses the environment through tactile interaction. It can sense touch on its “shell” and gives feedback through auditory cues, movement, and coloration of its shell. The robot’s control is guided by a neurobiologically plausible model of learning and somatosensory processing in the brain. It has potential applications as a SAR and in entertainment.

 

Michael T. Tolley

tolley@ucsd.edu

http://bioinspired.eng.ucsd.edu/

Mike Tolley received a BEng from McGill University, Montreal, Canada, and a PhD in mechanical engineering from Cornell University, Ithaca, New York. He is an Assistant Professor in the Department of Mechanical and Aerospace Engineering at UC San Diego, where he leads the Bioinspired Robotics and Design Lab. His research program aims to borrow ideas from nature to inspire engineered systems with new capabilities. In particular, he has ongoing projects in soft robotics, fabrication by folding, and self-assembly.

 

Participants

 

William Harwin

w.s.harwin@reading.ac.uk

https://www.reading.ac.uk/sse/about/staff/w-s-harwin.aspx

Professor William Harwin is Director of Research for the School of Systems Engineering at the University of Reading, where his research interests encompass cybernetics and the interfaces between humans and smart machines, as typified by haptic devices and medical and rehabilitation robots.

 

Guy Hoffman

hoffman@cornell.edu

http://guyhoffman.com/

Guy Hoffman holds a PhD from MIT in the field of human-robot interaction. He is an Assistant Professor in the Sibley School of Mechanical and Aerospace Engineering at Cornell University. He heads the Human-Robot Collaboration and Companionship (HRC2) group, studying the algorithms, interaction schemas, and designs enabling close interactions between people and personal robots in the workplace and at home.

 

Francis McGlone

F.P.McGlone@ljmu.ac.uk

https://www.ljmu.ac.uk/about-us/staff-profiles/faculty-of-science/natural-sciences-and-psychology/francis-mcglone

Professor Francis McGlone heads the Somatosensory & Affective Neuroscience Group at the School of Natural Sciences & Psychology, Liverpool John Moores University (LJMU), where he is Professor in Neuroscience, and he is a Visiting Professor at the University of Liverpool. His primary area of academic research is characterising the role of afferent c-fibres in humans, investigating their role in pain and itch and, in particular, the functional and affective properties of a novel class of c-fibres, C-tactile afferents, hypothesised to code for the pleasure of intimate touch. Techniques used in this research span single-unit recordings with microneurography, psychophysical measurements, functional neuroimaging, behavioural measures, and psychopharmacological approaches to investigate the role of the brain transmitter serotonin in affiliative and social touch.

 

David Reinkensmeyer

dreinken@uci.edu

http://biorobotics.eng.uci.edu/people/djr

Professor Reinkensmeyer's research interests are in neuromuscular control, motor learning, robotics, and rehabilitation. A major goal is to develop physically interacting, robotic and mechatronic devices to help the nervous system recover arm, hand, and leg movement ability after neurologic injuries such as stroke and spinal cord injury. Another goal is to understand the computational mechanisms of human motor learning, in order to provide a rational basis for designing movement training devices. Prof. Reinkensmeyer's laboratory has developed a variety of robotic devices for manipulating and measuring movement in humans and rodents. These devices are being used to investigate the role of mechanical assistance in retraining arm movement following stroke, the feasibility of providing movement training remotely using the Internet, and the role of sensory information in locomotor plasticity after spinal cord injury. Dr. Reinkensmeyer's laboratory helped develop the T-WREX arm exoskeleton for neurologic rehabilitation, commercialized by Hocoma as ArmeoSpring and now in use in over 700 facilities worldwide, and the MusicGlove, a glove that senses touch for hand rehabilitation, commercialized by Flint Rehabilitation Devices.

 

Veronica Santos

vjsantos@ucla.edu

http://BiomechatronicsLab.ucla.edu

Professor Veronica Santos is an Associate Professor in the Mechanical and Aerospace Engineering Department at the University of California, Los Angeles, and Director of the UCLA Biomechatronics Lab. She received the B.S. degree in mechanical engineering with a music minor from the University of California, Berkeley in 1999. From 2000 to 2001, she was a Quality Engineer and Research and Development Engineer at Guidant Corporation in Santa Clara, CA, specializing in life-saving cardiovascular technology. Dr. Santos received the M.S. and Ph.D. degrees in mechanical engineering with a biometry minor from Cornell University, Ithaca, NY in 2004 and 2007, respectively. From 2007 to 2008, she was a postdoctoral research associate at the Alfred E. Mann Institute for Biomedical Engineering at the University of Southern California where she worked on a team to develop a novel biomimetic tactile sensor for prosthetic hands. From 2008 to 2014, Dr. Santos was an Assistant Professor in the Mechanical and Aerospace Engineering Program at Arizona State University, where she directed the ASU Biomechatronics Lab. She currently serves as an Associate Editor of the ASME Journal of Mechanisms and Robotics.

 

Consultants

 

Carissa Cascio

carissa.cascio@vanderbilt.edu

http://www.casciolab.com/

Dr. Cascio’s graduate training was in neuroscience at Emory University, where her work centered on sensory neuroscience in human and nonhuman primates, with an emphasis on tactile perception and functional imaging. Having developed an interest in the neuroscience of autism, she pursued postdoctoral studies at the Neurodevelopmental Disorders Research Center at the University of North Carolina, where she focused on somatosensory processing in individuals with autism and on diffusion tensor imaging in young children with autism and other developmental disabilities. In 2007, she joined the Psychiatric Neuroimaging Program in Vanderbilt’s Department of Psychiatry. Her lab focuses on the neural basis of sensory and repetitive behaviors in individuals with autism spectrum disorders.

 

Javier Movellan

javier@emotient.com

www.emotient.com

Javier Movellan received his PhD from UC Berkeley, where he was a Fulbright Scholar, and was a research associate at Carnegie Mellon University before coming to UCSD. At UCSD he founded the Machine Perception Laboratory (MPLab), whose mission was to learn about intelligent behavior by developing systems that operate in the uncertain, time-constrained, and sensory-rich conditions of daily life. His research spans machine learning, machine perception (including vision and speech), automatic analysis of human behavior, and social robots. He pioneered the development of social robots and their use for early childhood education. Javier was president/CEO of Machine Perception Technologies from 2008 to 2012, and is the founder and lead researcher at Emotient.

 

Janet Wiles

j.wiles@uq.edu.au

http://staff.itee.uq.edu.au/janetw/

Janet Wiles received the BSc and PhD degrees in computer science from The University of Sydney, Sydney, Australia. She is a Professor of Complex and Intelligent Systems in the School of Information Technology and Electrical Engineering at The University of Queensland, Brisbane, Australia, and UQ-node Director of the ARC Centre of Excellence for the Dynamics of Language. Her research program uses computational modeling and social robots to understand complex systems, with particular applications in biology, neuroscience, and cognition. Robots developed by her group include the iRat, a rat-sized robot for studies in cognitive and social robotics, and a child-sized robot for education and social interactions with children.
