Research

Text for this section is not yet ready; in the meantime, please see the figure on the right for an overview of the ECHOS group's main research areas.

Research Projects

  • National Centre for Nuclear Robotics (NCNR)

    Research in multimodal, augmented interfaces for robot operators and safe tele-operation of robots in nuclear decommissioning facilities.

    This project is led by Birmingham University in association with the University of Bristol. Nuclear decommissioning and the safe disposal of nuclear waste constitute a global problem of enormous societal importance. The UK alone contains 4.9 million tonnes of legacy nuclear waste, representing the largest and most complex environmental remediation project in the whole of Europe. UK nuclear clean-up is currently estimated to take over 100 years to complete, with costs as high as £220 billion. At least 20% of this expenditure must involve remote interventions using robotics, because the materials and environments are too hazardous for human workers, even those wearing protective air-fed suits. Although the project has only recently started, the main focus for BRL-UWE will be to research and develop novel interfaces for humans who interact with and tele-operate robots in nuclear environments. A prime focus will be on methods for variable autonomy and shared control in this context (a minimal illustrative sketch of such command blending appears after the project list).

  • Robotics for Nuclear Environments (RNE)

    Research in VR and AR interfaces for robot operators in nuclear environments and interaction with robot swarms.

    This EPSRC-funded Programme Grant project is led by Manchester University, with BRL-UWE and Birmingham University as partners. The massive overall task of nuclear decommissioning is made even more daunting by the extreme environments encountered in many legacy facilities. These may contain radiological, chemical, thermal and other hazards, restricting access by humans and necessitating the use of robots to complete many jobs. Unfortunately, current robotic technology is not capable of much of what will be required: even straightforward tasks such as turning valves on and off, navigating staircases and moving over rough terrain can be problematic. The five-year research programme has been created to address these issues. The programme’s brief is both extremely clear and immensely challenging: to make major scientific and technological advances in nuclear robotics within a very short timescale. Research will be carried out across the home institutions and at the Dalton Cumbrian Facility in west Cumbria, which has strong links with the nuclear industry. BRL-UWE’s contributions to the project focus on research in heterogeneous team working, human-robot team interaction, and on-line behaviour risk assessment.

  • Remote Medical Diagnostician (ReMeDi)

    Research in the user experience of doctors who use a tele-operated robot for palpation and ultrasonography.

    The ReMeDi project designs and implements a robot system for the medical tele-examination of patients. Successful medical treatment depends on a timely and correct diagnosis, but the availability of doctors of various specializations is limited, especially in provincial hospitals or outside regular working hours. The target use cases in ReMeDi are two of the most widely used physical examination techniques: palpation and ultrasonography.

  • Christian Doppler Laboratory “Contextual Interfaces”

    Research in multimodal, augmented interfaces for industrial robot programmers.

    The Center for Human-Computer Interaction at the University of Salzburg operated the Christian Doppler Laboratory “Contextual Interfaces”, a long-term co-operation with industrial partners. The laboratory examined contextual interaction from qualitative, constructional, and methodological viewpoints. Apart from basic research activities focusing on methods and tools for contextual research, the laboratory explored the contexts “car” and “factory” from these three perspectives.

  • Joint Action for Multimodal Embodied Social Systems (JAMES)

    Research in multimodal, short-term, dynamic, socially appropriate human-robot interaction.

    The goal of JAMES was to develop a robot able to interact with humans in a socially appropriate manner in order to perform tasks. As a result, the robot required not only the physical skills necessary to operate in the world, but also the social skills to understand and respond to the needs of the people it interacted with. The main demonstration of this behaviour was a bartending scenario in which multiple users could interact with the robot bartender to order drinks.

  • Audio for Communication and Environment Perception (AudiComm)

    Research in sound localization, speaker identification, and natural language processing.

    AudiComm focused on high- and low-level sound and speech processing. The main goals of the project were to develop methods to determine who was speaking, where the speaker was located, and what the speaker was saying. The approaches to sound localization, speaker identification, and speech processing developed in AudiComm enabled robots to communicate with humans in more natural ways (a generic sound-localization sketch appears after the project list).

  • Joint Action Science and Technology (JAST)

    Research in multimodal fusion, output generation, and human-robot joint action.

    The goal of the JAST project was to build intelligent, autonomous agents that cooperate and communicate with their peers and with humans while working on a mutual task. To this end, we built a robot that had a pair of manipulator arms with grippers, mounted in a position resembling human arms, and an animatronic talking head capable of producing facial expressions, rigid head motion, and lip-synchronised synthesised speech. The robot was able to recognise and manipulate pieces of a wooden toy construction set called Baufix, which were placed on a table in front of it. A human and the robot worked together to assemble target objects from Baufix pieces, coordinating their actions through speech (English or German), gestures, and facial expressions.
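
As referenced in the NCNR description above, the idea of variable autonomy and shared control can be illustrated with a minimal, hypothetical sketch: the operator's tele-operation command is blended with an autonomous assistance command, and the blending weight stands in for the current level of autonomy. The function names, geometry, and numbers below are assumptions made purely for illustration, not code or methods from the NCNR project.

    # Minimal, hypothetical sketch of shared control / variable autonomy
    # (illustrative only; not NCNR project code).

    def blend_commands(operator_cmd, autonomy_cmd, alpha):
        """Linearly blend operator and autonomous velocity commands.
        operator_cmd, autonomy_cmd: (vx, vy, wz) velocity tuples.
        alpha: level of autonomy in [0, 1];
               0 = pure tele-operation, 1 = fully autonomous."""
        return tuple((1.0 - alpha) * o + alpha * a
                     for o, a in zip(operator_cmd, autonomy_cmd))

    def autonomy_level(distance_to_hazard, safe_distance=2.0):
        """Give the autonomous behaviour more weight as the robot
        approaches a hazard, so it can slow down or steer away."""
        return max(0.0, min(1.0, 1.0 - distance_to_hazard / safe_distance))

    # Example: the operator drives forward while the assistance
    # steers away from a hazard 0.8 m ahead.
    alpha = autonomy_level(distance_to_hazard=0.8)
    command = blend_commands(operator_cmd=(0.5, 0.0, 0.0),
                             autonomy_cmd=(0.1, 0.0, 0.4),
                             alpha=alpha)

In such a scheme the operator always retains some influence at low alpha, while the assistive behaviour dominates close to hazards; how alpha is chosen and communicated to the operator is exactly the kind of interface question studied in the project.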
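The sound-localization goal mentioned under AudiComm is commonly approached with time-difference-of-arrival estimation between microphones. The sketch below uses the textbook GCC-PHAT method and a simple two-microphone far-field geometry purely for illustration; it makes no claim about AudiComm's actual implementation, and the function names and parameters are assumptions.

    import numpy as np

    def gcc_phat_delay(sig, ref, fs):
        """Estimate the delay (seconds) of `sig` relative to `ref`
        using GCC-PHAT, a standard cross-correlation technique."""
        n = len(sig) + len(ref)
        R = np.fft.rfft(sig, n=n) * np.conj(np.fft.rfft(ref, n=n))
        R /= np.abs(R) + 1e-12                      # PHAT weighting
        cc = np.fft.irfft(R, n=n)
        cc = np.concatenate((cc[-(n // 2):], cc[:n // 2 + 1]))
        return (np.argmax(np.abs(cc)) - n // 2) / float(fs)

    def bearing_from_delay(tau, mic_distance, speed_of_sound=343.0):
        """Convert an inter-microphone delay into a coarse bearing
        angle (degrees from broadside) for a far-field source."""
        x = np.clip(tau * speed_of_sound / mic_distance, -1.0, 1.0)
        return float(np.degrees(np.arcsin(x)))

    # Example usage with two microphones 20 cm apart, sampled at 16 kHz:
    # tau = gcc_phat_delay(left_channel, right_channel, fs=16000)
    # angle = bearing_from_delay(tau, mic_distance=0.2)

A bearing estimate of this kind is only one ingredient; combining it with speaker identification and speech recognition is what allows a robot to attend to the right person and respond to what was said.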