===== A collaborative research project =====
RASPUTIN is a collaborative fundamental research project (PRCE) at the intersection of « Sciences et Technologies Numérique » (digital sciences and technologies) and « Psychologie » (psychology). It aims to reduce the cognitive complexity of navigation by visually impaired people in unfamiliar interior surroundings, using digital simulations and virtual auditory reality explorations as preparation and mental map construction exercises. In doing so, the project promotes access to information for all, from anywhere.
With training, it is possible to acoustically judge the distance to walls or other sound-reflecting objects, a skill acquired by many visually impaired individuals. RASPUTIN investigates the use of perceptually realistic room acoustic simulations to assist visually impaired individuals in preparing for indoor navigation.
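The acoustic cue underlying this skill is the round-trip delay between a self-generated sound and its reflection. A minimal illustration of the physics involved (assuming a speed of sound of 343 m/s; the function name is ours, not from the project):

```python
SPEED_OF_SOUND = 343.0  # m/s, dry air at roughly 20 °C

def echo_delay(distance_m: float, c: float = SPEED_OF_SOUND) -> float:
    """Round-trip delay of a reflection from a surface distance_m away."""
    return 2.0 * distance_m / c

# A wall 2 m away returns an echo after roughly 11.7 ms;
# at 0.5 m the delay shrinks to about 2.9 ms.
for d in (0.5, 2.0):
    print(f"{d} m -> {echo_delay(d) * 1000:.1f} ms")
```

Resolving such millisecond-scale differences is precisely the kind of perceptual skill the simulations must preserve.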
===== A fundamental research project =====
RASPUTIN addresses fundamental questions of spatial perception, memory, acoustics, and signal processing related to functional VR room simulations, while also examining the psychoacoustic and cognitive impacts of rendering quality. Evaluations will consider added benefits in terms of navigation speed, precision, improved self-confidence, and sense of security.
The research goals of RASPUTIN are fourfold: first, a significant advancement in understanding the fundamental capacities of spatial architectural perception and memory through auditory experience; second, the improvement and integration of a real-time room acoustic simulation algorithm into an open-source research virtual reality platform; third, the evaluation of interactive room acoustic simulations as a planning/training aid for visually impaired individuals; and fourth, the improvement of the autonomy of visually impaired people.
The RASPUTIN project advances Human-Computer Interaction through the development of virtual auditory environments that improve training and learning for people with disabilities, as well as improving access to, and understanding of, public sites.

===== Project aims =====
The aims of the project are in full accordance with the joint challenge « Société de l’information et de la communication / Sociétés innovantes, intégrantes et adaptatives - La révolution numérique : rapports aux savoirs et à la culture » concerning « Éducation et formation », with a clear fundamental research collaboration between « Sciences et Technologies Numérique » and « Psychologie », thereby addressing « Orientation n° 33 (Innovations sociales, éducatives et culturelles) ». RASPUTIN aims to reduce the cognitive complexity of navigation by visually impaired people in unfamiliar interior surroundings through digital simulations and virtual auditory reality explorations as preparation and mental map construction exercises.

===== Project organization =====
The project is organized into the following principal research work packages:
== WP 1 - Spatial perception and memory using auditory VR exploration ==
The first scientific work package comprises fundamental studies in cognitive science concerning the spatial perception and representation that result from various presentation and exploration methods used by blind people. These studies investigate the capacities of mental map creation and memory through a series of experiments comparing physical navigation, tactile maps, verbal descriptions, and virtual navigation via auditory simulations of the architectural space.

== WP 2 - Real-time room acoustic rendering optimized for spatial understanding ==
The second scientific work package concerns the development of the real-time acoustic rendering engine, the technological heart of the project. To facilitate working in parallel, several variants of the engine are studied, each relying on a different rendering approach. The first will be an integration of current state-of-the-art developments into a functioning prototype, allowing initial evaluations. As alternative versions of the engine are developed, they can be swapped seamlessly into the prototype architecture.
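The swappable-engine design described above amounts to programming the prototype against a common interface that each engine variant implements. A minimal sketch of that architecture (all names and signatures here are illustrative assumptions, not the project's actual code):

```python
from abc import ABC, abstractmethod

class RoomAcousticRenderer(ABC):
    """Common interface that every engine variant implements."""

    @abstractmethod
    def render(self, source_pos, listener_pos) -> str:
        """Return (a stand-in for) the audio rendered for one frame."""

class StateOfTheArtRenderer(RoomAcousticRenderer):
    """First variant: integration of existing state-of-the-art methods."""
    def render(self, source_pos, listener_pos) -> str:
        return f"baseline render {source_pos}->{listener_pos}"

class ExperimentalRenderer(RoomAcousticRenderer):
    """Alternative variant developed later, drop-in compatible."""
    def render(self, source_pos, listener_pos) -> str:
        return f"experimental render {source_pos}->{listener_pos}"

class NavigationPrototype:
    """Depends only on the interface, so engines swap without other changes."""
    def __init__(self, renderer: RoomAcousticRenderer):
        self.renderer = renderer

    def swap_engine(self, renderer: RoomAcousticRenderer) -> None:
        self.renderer = renderer

    def step(self) -> str:
        return self.renderer.render((0, 0, 0), (1, 2, 0))

proto = NavigationPrototype(StateOfTheArtRenderer())
print(proto.step())                       # uses the baseline engine
proto.swap_engine(ExperimentalRenderer())
print(proto.step())                       # same prototype, new engine
```

This indirection is what allows initial evaluations to proceed with the baseline engine while alternative versions are developed in parallel.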
== WP 3 - Virtual Reality prototype for preparatory navigation planning ==
The third scientific work package addresses the construction of the navigation training prototype and the associated test scenarios. This includes developing the interface of the prototype in collaboration with ergonomics specialists in assistive devices and an established panel of potential users from the visually impaired community. Sites will be identified in collaboration with the user panel and the Conseil régional d'Île-de-France to prioritize accessibility in public spaces. The corresponding architectural-acoustical models will then be created and calibrated for the virtual navigation experiences.
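Calibrating an architectural-acoustical model typically means adjusting material absorption data until a simulated reverberation metric matches one measured on site. As a rough, hedged illustration (not the project's actual calibration procedure), the classic Sabine formula RT60 = 0.161 · V / A gives the reverberation time to check against a measurement:

```python
def sabine_rt60(volume_m3: float, surfaces) -> float:
    """Sabine reverberation time: RT60 = 0.161 * V / A, where A is the
    total absorption area (sum of surface area times absorption coefficient)."""
    absorption = sum(area * coeff for area, coeff in surfaces)
    return 0.161 * volume_m3 / absorption

# A hypothetical 10 x 8 x 3 m hall: floor and ceiling 80 m^2 each, walls 108 m^2.
room = [(80, 0.10), (80, 0.60), (108, 0.05)]  # (area m^2, absorption coeff.)
rt60 = sabine_rt60(10 * 8 * 3, room)
print(f"Simulated RT60: {rt60:.2f} s")        # about 0.63 s for these values
```

If the measured RT60 differed, the absorption coefficients would be adjusted until simulation and measurement agree.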
== WP 4 - Evaluation of the training aid in real navigation conditions ==
The fourth and final scientific work package constitutes the evaluation of the navigation prototype, employing the spatial cognition methodologies developed in the first work package and the various iterations of the acoustic rendering engine developed in the second. Evaluations will specifically investigate the extent to which virtual auditory navigations aid in preparing for indoor navigation of unfamiliar spaces, compared to more classical methods. These evaluations will be carried out at regular intervals throughout the project.
----

===== Selected References =====
  * Afonso, Blum, Katz, Tarroux, Borst, & Denis, “Structural properties of spatial representations in blind people: Scanning images constructed from haptic exploration or from locomotion in a 3-D audio virtual environment,” Memory & Cognition, 38, 2010.
  * Carpentier & Warusfel, “Twenty years of Ircam Spat: looking back, looking forward,” in Int. Computer Music Conf. (ICMC), Univ. of North Texas, 2015.
  * Carpentier, Noisternig, & Warusfel, “Hybrid Reverberation Processor with Perceptual Control,” in Int. Conf. Digital Audio Effects (DAFx), Erlangen, 2014.
  * Noisternig, Katz, Siltanen, & Savioja, “Framework for real-time auralization in architectural acoustics,” Acta Acustica united with Acustica, 94, 2008.
  * Picinali, Afonso, Denis, & Katz, “Exploration of architectural spaces by the blind using virtual auditory reality for the construction of spatial knowledge,” Int. J. Human-Computer Studies, 72(4), 2014.
  * Postma & Katz, “Creation and calibration method of virtual acoustic models for historic auralizations,” Virtual Reality, SI: Spatial Sound, 2015.
about.txt · Last modified: 2018/12/12 17:53 by 77.141.190.84