Restoring Vision Through the Eyes of a Smartphone

For millions of people, navigating this complex world requires the assistance of a cane or a faithful companion.  With the ubiquity of smartphones today, and their ever-increasing capabilities, there is an opportunity to create an application that could help the visually impaired become functionally independent.

After much research, we have found an application, the vOICe, that promotes functional independence; however, we believe it can be improved.  This application uses the smartphone’s camera to continuously capture images, which are processed and converted to sound.  The key opportunity for enhancement is to simplify the captured image before converting it to sound: remove the background and preserve only what is important.  We will leverage existing image processing techniques to accomplish this.
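To make the idea concrete, the sketch below shows the general kind of image-to-sound mapping this style of application relies on: the image is scanned left to right, each row is assigned a pitch (higher rows sound higher), and pixel brightness controls loudness. This is our own illustration of the principle, not the vOICe’s actual code, and the sample rate, frequency range, and column duration are assumptions chosen for readability.

```python
# Illustrative column-scan sonification: rows -> sine frequencies,
# brightness -> amplitude. Constants below are assumed, not tuned.
import numpy as np

SAMPLE_RATE = 22050          # audio samples per second (assumed)
COLUMN_DURATION = 0.02       # seconds of audio per image column (assumed)

def image_to_sound(gray_image: np.ndarray) -> np.ndarray:
    """Convert a 2-D grayscale image (values 0-255) to a mono waveform."""
    rows, cols = gray_image.shape
    # Map each image row to a frequency between 200 Hz and 3000 Hz,
    # with the top row at the highest pitch.
    freqs = np.linspace(3000.0, 200.0, rows)
    samples_per_col = int(SAMPLE_RATE * COLUMN_DURATION)
    t = np.arange(samples_per_col) / SAMPLE_RATE
    waveform = []
    for c in range(cols):
        amplitudes = gray_image[:, c] / 255.0           # brightness -> loudness
        # Sum one sine per row, weighted by that row's brightness.
        column_audio = (amplitudes[:, None] *
                        np.sin(2 * np.pi * freqs[:, None] * t)).sum(axis=0)
        waveform.append(column_audio / max(rows, 1))    # simple normalization
    return np.concatenate(waveform)

if __name__ == "__main__":
    # Example: sonify a random 64x64 "scene".
    demo = (np.random.rand(64, 64) * 255).astype(np.uint8)
    audio = image_to_sound(demo)
    print(audio.shape)  # one block of samples per image column
```

Notice that every pixel in the frame contributes a sound, which is exactly why a cluttered background makes the result hard to interpret and why we want to strip it out first.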

A challenge for this technology is the learning curve required for a user to become comfortable with the application, a challenge we believe will be made significantly easier by presenting only the part of the scene that is important.  Our biggest contributor to this simplification is depth-of-field blur: by removing the blurred portions of the scene, we translate only the portion most important to the user.  Seeing through sound becomes easier when the user no longer has to distinguish relevant from irrelevant soundwaves.  Figure 1 shows the current implementation’s output, while our intended output can be seen in Figure 2.
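A minimal sketch of the kind of background removal we have in mind appears below, assuming depth-of-field blur is present: in-focus regions carry strong high-frequency detail, so a Laplacian-based sharpness estimate can mask out the blurred background before the frame is handed to the sound translation. The window size, threshold, and file names are illustrative assumptions, not values from our implementation.

```python
# Illustrative focus-based masking: estimate local sharpness with the
# Laplacian and black out pixels whose neighborhood falls below a threshold.
import cv2
import numpy as np

def keep_in_focus(gray_image: np.ndarray,
                  window: int = 15,
                  threshold: float = 50.0) -> np.ndarray:
    """Return the image with blurred (out-of-focus) regions set to black."""
    lap = cv2.Laplacian(gray_image, cv2.CV_64F)
    # Local sharpness: average absolute Laplacian response in a window.
    sharpness = cv2.blur(np.abs(lap), (window, window))
    mask = (sharpness > threshold).astype(np.uint8)
    return cv2.bitwise_and(gray_image, gray_image, mask=mask)

if __name__ == "__main__":
    # Example: simplify a captured frame before image-to-sound translation.
    frame = cv2.imread("scene.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical input
    if frame is not None:
        simplified = keep_in_focus(frame)
        cv2.imwrite("scene_simplified.jpg", simplified)
```

Only the masked frame would then be sonified, so the blurred background never reaches the user’s ears.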

Our team is excited to push forward with this project and is hopeful to have a working prototype by the end of summer 2012, at which point we will look into the possibility of increasing the translations per second to communicate a sense of motion to the user!

Figure 1 shows the current implementation of the image-to-sound translation.  Figure 2 shows our proposed implementation.

About Ciaran Hannigan

Ciaran is advised by Dr. Tyson. He has worked as a teaching assistant for several classes and taught the mobile class in Summer ’11. His main area of focus in the mobile lab is Android projects, although he has begun to look into iOS! Ciaran’s passion centers on user interface (UI) and user experience (UX) design. Designing UIs and creating new components to enrich the user’s experience is what he is all about!