PhD in Engineering: Intuitive Human-Robot Collaboration in Unstructured Environments
Intuitive interaction is an important aspect of human-robot collaboration.
Recently, the use of image schemas has been proposed as a means of exploring user experience and intuitive interaction. An image schema is a dynamic pattern of organism-environment interaction that structures understanding and experience, emerging from human bodily interaction with the physical world. Image schemas satisfy the two basic prerequisites for intuitive interaction: previous knowledge and subconscious processing.
A recent study has linked users’ experiences to the specific image schemas employed to complete a task. This method enables both qualitative and quantitative evaluation of designs by identifying the specific image schemas and product design features that users have received positively or negatively. User experience can thus be assessed in a systematic way, leading to a better understanding of the value associated with particular design features.
Project aims and methods
The project aims to develop advanced mechanisms for intuitive interaction that enable genuine synergy between a human and a robot, leading to increased productivity, improved efficiency and an enhanced user experience. The innovative aspects of this project are:
You will be expected to produce high-quality journal articles for scientific publication during the course of your PhD and to present your research at international conferences.
Research training environment
You will benefit from an excellent research training environment provided by the High-Value Manufacturing group led by Professor Rossi Setchi. The research will be conducted at the Autonomous Systems and Robotics Lab, which is equipped with advanced robots (Care-O-bot 4, KUKA iiwa, YouBots, TurtleBots) and equipment for intuitive interaction (e.g. Emotiv, Smart Eye).