About the project:
FlashSight is a software application created for people with visual impairments. It offers a simple, intuitive way to explore, assess, and recognize the surrounding environment with the help of virtual audible signs.
A world without access to printed signs and surrounding landmarks often leaves people with vision loss without the information they need to navigate the physical environment. FlashSight helps fill the gaps left by location and routing applications:
- These applications rely on spatial language (e.g., "north", "2 o'clock"), which is cognitively taxing and leaves users unsure whether the directions are still accurate ("Is it my 2 o'clock right now, or when I first loaded the directions?").
- They focus on street intersections and don't give users a sense of context in their environment. Telling a user to turn left at the coffee shop (which they might be able to hear and smell) may be more meaningful.
- They don't let users customize categories of points of interest, or mark them with a temporary or permanent status.
- They don't let users choose how much information they receive, from high to low: hearing only a POI's name, or progressively more detail.
- They don't provide larger landmarks that can orient a user within a wider area. When sighted people see the tall buildings of downtown, they get a better sense of where they are.
- They don't address a smooth transition and connection between outdoor and indoor orientation.
- They list points of interest linearly, ordered by the developer's choice (alphabetically, by popularity, by distance, by time, etc.).
- Finally, none of these applications give users the opportunity to create, edit, share, and delete their own audible signs and landmarks.
Our mission was to create the FlashSight app, which uses an iPhone as a receiver for virtual transmitters that mark signs and landmarks in the real-world environment within "sighted" directional distance. The goal was to produce FREE software for people with visual impairment by utilizing the potential of audible augmented-reality technology, optimizing the native features of mobile devices, and integrating psychoacoustics research.
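To make the receiver idea concrete: before a virtual sign can be announced, the device must work out how far away the transmitter is and in which direction it lies relative to where the user is facing. FlashSight's actual implementation is not shown here; the following is only a minimal illustrative sketch (in Python, for readability) of that distance-and-bearing step, assuming the receiver has a GPS position and a compass heading for the user and a stored GPS position for each virtual sign.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def relative_bearing_deg(lat1, lon1, heading_deg, lat2, lon2):
    """Direction to a virtual sign relative to the user's facing direction.

    Result is in degrees, normalized to [-180, 180):
    0 = straight ahead, positive = to the right, negative = to the left.
    """
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    # True (north-referenced) bearing from user to sign.
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    true_bearing = math.degrees(math.atan2(y, x)) % 360
    # Subtract the compass heading and wrap into [-180, 180).
    return (true_bearing - heading_deg + 180) % 360 - 180
```

With the distance and relative bearing in hand, an app of this kind could decide whether a sign is within its "sighted" directional range and render the announcement in spatial audio accordingly; on iOS that would typically be done in Swift with Core Location and AVFoundation rather than in Python.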