Oliver Beren Kaul
Appelstr. 9A
30167 Hannover
Germany
Room 909
+49 (511) 762-14152


Resume

Oliver Beren Kaul is a researcher in the Human-Computer Interaction Group at the University of Hannover. His main areas of interest are Augmented Reality (AR) and combining AR applications with new interaction and feedback mechanisms.

During his studies he focused on Software Engineering, Computer Vision, and Human-Computer Interaction for mobile devices. He worked for more than three years as a student research assistant on various projects in the areas of multi-agent robotic systems, mobile systems for controlling robots, and superpixel algorithms running on GPUs. After finishing his studies, he worked as a Software Engineer for iOS and Android for three months before joining the Human-Computer Interaction Group in August 2015.

Projects

HapticHead - Around-the-head tactile display


Publications

Full Papers

HapticHead: A Spherical Vibrotactile Grid around the Head for 3D Guidance in Virtual and Augmented Reality
Oliver Beren Kaul, Michael Rohs
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI '17

Workshop Papers

Wearable Head-mounted 3D Tactile Display Application Scenarios
Oliver Beren Kaul, Michael Rohs
Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct

Posters

HapticHead: 3D Guidance and Target Acquisition Through a Vibrotactile Grid
Oliver Beren Kaul, Michael Rohs
Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems
Follow the Force: Steering the Index Finger towards Targets using EMS
Oliver Beren Kaul, Max Pfeiffer, Michael Rohs
Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems
In mobile contexts, guidance towards objects is usually provided through the visual channel. Sometimes this channel is overloaded or not appropriate, yet providing a practicable form of haptic feedback instead is challenging. Electrical muscle stimulation (EMS) can generate mobile force feedback but has a number of drawbacks: for complex movements, several muscles need to be actuated in concert, and a feedback loop is necessary to control the movements. We present an approach that requires actuating only six muscles with four pairs of electrodes to guide the index finger to a 2D point and lets the user perform mid-air disambiguation gestures. In our user study, participants found invisible, static target positions on top of a physical box with a mean 2D deviation of 1.44 cm from the intended target.
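
For illustration, here is a minimal Python sketch of the closed-loop guidance idea described in the abstract above. It is not the implementation from the paper: the fixed channel-to-direction mapping is an assumed simplification, and get_finger_position and set_channel_intensity are hypothetical placeholders for a finger tracker and an EMS driver.

import math

NUM_CHANNELS = 4          # four electrode pairs, as in the paper
STOP_RADIUS_CM = 0.5      # assumed tolerance around the 2D target

# Assumed simplification: each channel pulls the finger in a fixed 2D direction.
CHANNEL_DIRECTIONS = [
    (1.0, 0.0),   # channel 0 pulls towards +x
    (-1.0, 0.0),  # channel 1 pulls towards -x
    (0.0, 1.0),   # channel 2 pulls towards +y
    (0.0, -1.0),  # channel 3 pulls towards -y
]

def guide_to_target(target, get_finger_position, set_channel_intensity):
    """Close the loop: actuate the channels whose pull direction reduces
    the remaining error until the finger is within STOP_RADIUS_CM."""
    while True:
        x, y = get_finger_position()              # tracked finger position (cm)
        ex, ey = target[0] - x, target[1] - y     # error vector towards target
        dist = math.hypot(ex, ey)
        if dist < STOP_RADIUS_CM:
            for ch in range(NUM_CHANNELS):
                set_channel_intensity(ch, 0.0)    # target reached: stop stimulation
            return
        # Project the normalized error onto each channel's pull direction and
        # drive stimulation intensity proportionally, clamped to [0, 1].
        for ch, (dx, dy) in enumerate(CHANNEL_DIRECTIONS):
            alignment = (ex * dx + ey * dy) / dist
            set_channel_intensity(ch, max(0.0, min(1.0, alignment)))

A real system would additionally need per-user calibration and strict safety limits on stimulation intensity, which this sketch omits.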