Department of Informatics
Fellowship in Self-aware Interactive Music Systems
A position as PhD/Postdoc research fellow is available in the Robotics and Intelligent Systems (ROBIN) group at the Department of Informatics. The position is part of the Research Council of Norway funded project Engineering Predictability with Embodied Cognition (EPEC), which aims to create multimodal systems that are able to sense, learn and predict future events.
The fellowship will be for a period of 3 years, with the possibility of extending the employment period by 2-6 months, depending on background and qualifications, for additional duty work, which typically includes work on courses (lecturing, preparing and organizing exercises, etc.). The starting date is planned to be 01.1.2016. If no applicants qualify for the PhD research fellowship (position code 1017), applicants will be considered for employment as a postdoctoral researcher (position code 1352).
Job/project description:
The Robotics and Intelligent Systems research group focuses on adaptive systems research, often involving biologically inspired methods. We aim to apply these methods within robotics, programmable logic and applications such as active music. We are interested in studying human and robotic motion using motion capture analysis and applying this knowledge to the design of adaptive robotic or computing systems. This work is part of an interdisciplinary collaboration with the Dept. of Musicology and the Dept. of Psychology at the University of Oslo through the fourMs initiative. We also have a number of international collaborators, and a research stay of up to 6 months with one of them may be possible.
Active music is music that is controllable by the listener. Along an imaginary axis between an active music performer and a passive music perceiver, there is room for participation. This participation can range from directly inputting low-level control actions to the system, to indirect control through higher-level features extracted from sensor data using machine learning approaches (e.g. energy, mood, etc.). The goal of this PhD position is to develop such music systems for smartphones that can adapt the music to the given setting and to music-related user preferences, i.e. to make the systems self-aware. Interaction between multiple users is also an important part of the research: multiple users sharing a musical synthesis process based on their own musical preferences. Predictive systems would then be needed to synchronize the different users. The work should build on our earlier research in the area, involving both direct and indirect control of real-time musical synthesis on a smartphone using biologically inspired methods.
We offer:
- Salary for PhD (code 1017), pay grade 50-57 (NOK 430 500 – 483 700 per year)
- Salary for Postdoc (code 1352), pay grade 60-65 (NOK 510 100 – 560 700 per year)
- A challenging and stimulating working environment
- Attractive welfare benefits
The application must include:
- Application letter
- CV (summarizing education, positions and academic work, including scientific publications)
- Copies of educational certificates, transcript of records and letters of recommendation
- Documentation of English proficiency
- The most relevant and important publications and academic work that the applicant wishes to be considered by the evaluation committee (maximum 5)
- Names and contact details of 2-3 references (name, relation to candidate, e-mail and telephone number)
Foreign applicants are advised to attach an explanation of their university's grading system. Please note that all documents must be in English or a Scandinavian language.
In accordance with the University of Oslo's equal opportunities policy, we invite applications from all interested individuals regardless of gender or ethnicity, and would particularly like to encourage female candidates to apply.
UiO has an agreement for all employees aimed at securing rights to research results, among other things.
Job type: Contract
Working hours: Full-time
Working days: Day
Reference number: 2015/9635
Home page: http://www.ifi.uio.no