===== A collaborative research project =====
RASPUTIN is a fundamental collaborative research project (PRCE) at the intersection of « Sciences et Technologies Numérique » and « Psychologie », aiming to reduce the cognitive complexity of navigation by the visually impaired in new interior surroundings through digital simulations and virtual auditory reality explorations, used as preparation and mental-map construction exercises. The broader objective is to promote access to information for all, from anywhere.
With training, it is possible to judge the distance to walls or other sound-reflective objects acoustically, a skill learned by many visually impaired individuals. RASPUTIN investigates the use of perceptually realistic room acoustic simulations to assist visually impaired individuals in preparing for indoor navigation.
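The echo-based distance judgment described above follows from simple acoustics: a reflection off a wall at distance d arrives after a round-trip delay of 2d divided by the speed of sound. A minimal illustrative sketch (not project code; the constant assumes air at roughly 20 °C):

```python
# Illustrative only: relation between wall distance and first-reflection delay.
SPEED_OF_SOUND = 343.0  # m/s, air at about 20 °C (assumption)

def wall_distance(echo_delay_s: float) -> float:
    """Distance to a reflecting wall, given the round-trip echo delay."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

def echo_delay(distance_m: float) -> float:
    """Round-trip delay of a first-order reflection from a wall."""
    return 2.0 * distance_m / SPEED_OF_SOUND

# A wall 2 m away returns its first reflection after about 11.7 ms.
print(f"{echo_delay(2.0) * 1000:.1f} ms")
```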
  
===== A fundamental research project =====
RASPUTIN addresses fundamental questions of spatial perception, memory, acoustics, and signal processing related to functional VR room simulations, while addressing the psychoacoustic and cognitive impacts of rendering quality. Evaluations will consider added benefits in terms of navigation speed, precision, improved self-confidence, and sense of security.
The research goals of RASPUTIN are fourfold: first, a significant advancement in understanding the fundamental capacities of spatial architectural perception and memory through auditory experience; second, the improvement and integration of a real-time room acoustic simulation algorithm into an open-source research virtual reality platform; third, the evaluation of interactive room acoustic simulations as a planning/training aid for visually impaired individuals; and fourth, the improvement of the autonomy of visually impaired people.
The RASPUTIN project advances Human-Computer Interaction through the development of virtual auditory environments that improve training and learning for the handicapped, as well as improving access to and understanding of public sites.
  
===== Project aims =====
The aim of the project is in full accordance with the joint challenge « Société de l’information et de la communication / Sociétés innovantes, intégrantes et adaptatives - La révolution numérique : rapports aux savoirs et à la culture » concerning « Education et formation », with a clear fundamental research collaboration between « Sciences et Technologies Numérique » and « Psychologie », thereby addressing « Orientation n° 33 (Innovations sociales, éducatives et culturelles) ». RASPUTIN aims to reduce the cognitive complexity of navigation by the visually impaired in new interior surroundings through digital simulations and virtual auditory reality explorations as preparation and mental-map construction exercises.
  
===== Project organization =====
The project is organized into the following principal research work packages:
== WP 1 - Spatial perception and memory using auditory VR exploration ==
The first scientific work package comprises fundamental studies in cognitive science concerning spatial perception and representation resulting from the various presentation and exploration means used by the blind. These studies investigate the capacities of mental-map creation and memory through a series of experiments comparing physical navigation, tactile maps, verbal descriptions, and virtual navigation via auditory simulations of the architectural space.
  
== WP 2 - Real-time room acoustic rendering optimized for spatial understanding ==
The second scientific work package concerns the development of the real-time acoustic rendering engine, the technological heart of the project. To facilitate working in parallel, several variants of the engine are studied, relying on different rendering approaches. The first will be an integration of current state-of-the-art developments into a functioning prototype, allowing for initial evaluations. As alternate versions of the engine are developed, they can be swapped seamlessly into the prototype architecture.
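The swappable-engine idea described in this work package can be illustrated with a common interface that every rendering variant implements, so the surrounding prototype depends only on that interface. A hypothetical sketch; all class and function names here are invented for illustration and are not the project's actual API:

```python
# Hypothetical sketch of a swappable rendering-engine architecture.
from abc import ABC, abstractmethod

class RoomAcousticEngine(ABC):
    """Common interface each engine variant must satisfy (illustrative)."""

    @abstractmethod
    def render(self, source_pos, listener_pos) -> str:
        """Render audio for the given source and listener positions."""

class PrototypeEngine(RoomAcousticEngine):
    """Stand-in for the initial state-of-the-art integration."""
    def render(self, source_pos, listener_pos):
        return f"prototype render: src={source_pos} lst={listener_pos}"

class ExperimentalEngine(RoomAcousticEngine):
    """Stand-in for a later variant; a drop-in replacement."""
    def render(self, source_pos, listener_pos):
        return f"experimental render: src={source_pos} lst={listener_pos}"

def run_simulation(engine: RoomAcousticEngine) -> str:
    # The application code depends only on the interface, so any
    # variant can be swapped in without further changes.
    return engine.render((0.0, 1.0, 1.5), (2.0, 3.0, 1.5))

print(run_simulation(PrototypeEngine()))
print(run_simulation(ExperimentalEngine()))
```

This kind of interface boundary is one common way to let engine variants be exchanged "seamlessly", as the work package describes.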
  
== WP 3 - Virtual Reality prototype for preparatory navigation planning ==
The third scientific work package addresses the construction of the navigation training prototype and the associated test scenarios. This includes developing the prototype's interface in collaboration with ergonomic specialists in assistive devices and an established panel of potential users from the visually impaired community. Sites will be identified in collaboration with the user panel and the Conseil régional d'Île-de-France, prioritizing accessibility in public spaces. The corresponding architectural-acoustical models will then be created and calibrated for the virtual navigation experiences.
  
== WP 4 - Evaluation of training aid in real navigation conditions ==
The fourth and final scientific work package constitutes the evaluation of the navigation prototype, employing the spatial cognition methodologies developed in the first work package and the various iterations of the acoustic rendering engine developed in the second. Evaluations will specifically investigate the extent to which auditory virtual navigation aids in the preparation of indoor navigation in unfamiliar spaces, compared to more classical methods. These evaluations will take place at regular intervals throughout the project.
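Benefit measures of this kind, such as success in reaching the destination and time to destination, can be summarized per training condition. A hypothetical sketch; the data and condition names are invented for illustration only:

```python
# Hypothetical summary of navigation-evaluation trials (invented data).
from statistics import mean

# Each trial: (outcome, time-to-destination in seconds)
trials = {
    "vr_training": [("ok", 95.0), ("ok", 110.0), ("fail", 240.0), ("ok", 88.0)],
    "tactile_map": [("ok", 130.0), ("fail", 250.0), ("ok", 145.0), ("fail", 260.0)],
}

for condition, results in trials.items():
    success_times = [t for outcome, t in results if outcome == "ok"]
    success_rate = len(success_times) / len(results)
    print(f"{condition}: success rate {success_rate:.0%}, "
          f"mean time on successful trials {mean(success_times):.0f} s")
```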
  
----

===== References =====
  * Afonso, Blum, Katz, Tarroux, Borst, & Denis, “Structural properties of spatial representations in blind people: Scanning images constructed from haptic exploration or from locomotion in a 3–D audio virtual environment,” Memory & Cognition, 38, 2010.
  * Carpentier & Warusfel, “Twenty years of Ircam Spat: looking back, looking forward,” in Int. Computer Music Conf. (ICMC), Univ. of North Texas, 2015.
  * Carpentier, Noisternig, & Warusfel, “Hybrid Reverberation Processor with Perceptual Control,” in Int. Conf. Digital Audio Effects (DAFx), Erlangen, 2014.
  * Noisternig, Katz, Siltanen, & Savioja, “Framework for real–time auralization in architectural acoustics,” Acta Acustica united with Acustica, 94, 2008.
  * Picinali, Afonso, Denis, & Katz, “Exploration of architectural spaces by the blind using virtual auditory reality for the construction of spatial knowledge,” Int J Human–Comp Studies, 72(4), 2014.
  * Postma & Katz, “Creation and calibration method of virtual acoustic models for historic auralizations,” Virtual Reality, SI: Spatial Sound, 2015.
  
about.txt · Last modified: 2018/12/12 17:53 by 77.141.190.84