In this project, we designed an iPad prototype for evaluating a finger-mounted camera with a haptic and auditory feedback system to enable blind people to read printed text. The corresponding papers were published at the ECCV Workshop on Assistive Computer Vision and Robotics (ACVR) 2014 and in ACM Transactions on Accessible Computing (TACCESS) 2016. To cite the papers, please include the following:
Stearns, L., Du, R., Oh, U., Jou, C., Findlater, L., Ross, D., & Froehlich, J. (2016). Evaluating Haptic and Auditory Directional Guidance to Assist Blind People in Reading Printed Text Using Finger-Mounted Cameras. ACM Transactions on Accessible Computing (TACCESS), 2016, 1:1-1:38. doi:10.1145/2914793
Stearns, L., Du, R., Oh, U., Wang, Y., Chellappa, R., Findlater, L., & Froehlich, J. (2014). The Design and Preliminary Evaluation of a Finger-Mounted Camera and Feedback System to Enable Reading of Printed Text for the Blind. In Proceedings of the ECCV Workshop on Assistive Computer Vision and Robotics (ACVR), 2014.
As P3 reported in the user study from our ACVR 2014 paper:
“It puts the blind reading on equal footing with rest of the society, because I am reading from the same reading material that others read, not just braille, which is limited to blind people only”.
In the ACVR paper, we introduce the preliminary design of a novel vision-augmented touch system called HandSight, intended to support activities of daily living (ADLs) by sensing and feeding back non-tactile information about the physical world as it is touched. Though we are interested in supporting a range of ADL applications, here we focus specifically on reading printed text. We discuss our vision for HandSight, describe its current implementation, and present results from an initial performance analysis of finger-based text scanning. We then present a user study with four visually impaired participants (three blind) exploring how to continuously guide a user’s finger across text using three feedback conditions (haptic, audio, and both). Though preliminary, our results show that participants valued the ability to access printed material, and that, in contrast to previous findings, audio finger guidance may result in the best reading performance.
Though I have less than one year of iOS programming experience, my code may be beneficial to you in the following aspects:
- Dynamic layout of UI controls
- Bluetooth Low Energy (BLE) connections (see the first sketch after this list)
- Text-to-speech with a speaking rate that can be changed on the fly (see the second sketch after this list)
- Two-column text layout
- Logging of touch events (see the third sketch after this list)
- Reading / loading files
- Design patterns
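For the BLE connection, here is a minimal sketch using Apple's CoreBluetooth framework, which is how an iOS app talks to a board such as the BLE Mini listed below. The class name `BLEConnection` and the advertised peripheral name `"BLE Mini"` are assumptions for illustration, not identifiers from this project:

```objc
#import <CoreBluetooth/CoreBluetooth.h>

@interface BLEConnection : NSObject <CBCentralManagerDelegate, CBPeripheralDelegate>
@property (strong, nonatomic) CBCentralManager *centralManager;
@property (strong, nonatomic) CBPeripheral *peripheral;
@end

@implementation BLEConnection

- (instancetype)init {
    if ((self = [super init])) {
        // Passing a nil queue delivers delegate callbacks on the main queue.
        _centralManager = [[CBCentralManager alloc] initWithDelegate:self queue:nil];
    }
    return self;
}

- (void)centralManagerDidUpdateState:(CBCentralManager *)central {
    // Scanning is only allowed once Bluetooth is powered on.
    if (central.state == CBCentralManagerStatePoweredOn) {
        [central scanForPeripheralsWithServices:nil options:nil];
    }
}

- (void)centralManager:(CBCentralManager *)central
 didDiscoverPeripheral:(CBPeripheral *)peripheral
     advertisementData:(NSDictionary *)advertisementData
                  RSSI:(NSNumber *)RSSI {
    // The advertised name is a placeholder; match your own device here.
    if ([peripheral.name isEqualToString:@"BLE Mini"]) {
        self.peripheral = peripheral; // retain it, or the connection is dropped
        [central stopScan];
        [central connectPeripheral:peripheral options:nil];
    }
}

- (void)centralManager:(CBCentralManager *)central didConnectPeripheral:(CBPeripheral *)peripheral {
    peripheral.delegate = self;
    [peripheral discoverServices:nil]; // then discover characteristics to read/write
}

@end
```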
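For variable-speed text-to-speech, here is a minimal sketch using AVSpeechSynthesizer from AVFoundation. The `SpeechController` class and `speakText:atRate:` method are hypothetical names; since an utterance's rate is fixed once it starts speaking, changing speed on the fly means stopping the current utterance and re-speaking the remaining text at the new rate:

```objc
#import <AVFoundation/AVFoundation.h>

@interface SpeechController : NSObject
@property (strong, nonatomic) AVSpeechSynthesizer *synthesizer;
@end

@implementation SpeechController

- (instancetype)init {
    if ((self = [super init])) {
        _synthesizer = [[AVSpeechSynthesizer alloc] init];
    }
    return self;
}

// Speak `text` at the given rate, clamped to the range AVFoundation accepts.
- (void)speakText:(NSString *)text atRate:(float)rate {
    // Cut off whatever is currently being spoken before changing speed.
    [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryImmediate];
    AVSpeechUtterance *utterance = [[AVSpeechUtterance alloc] initWithString:text];
    utterance.rate = MAX(AVSpeechUtteranceMinimumSpeechRate,
                         MIN(rate, AVSpeechUtteranceMaximumSpeechRate));
    [self.synthesizer speakUtterance:utterance];
}

@end
```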
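For touch-event logging, here is a minimal sketch of how a UIView subclass can timestamp each touch; a full logger would also override touchesMoved:withEvent: and touchesEnded:withEvent: and persist the records to a file instead of the console:

```objc
// In a UIView subclass: record the time and location of every new touch.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    for (UITouch *touch in touches) {
        CGPoint point = [touch locationInView:self];
        NSLog(@"%.3f touchesBegan at (%.1f, %.1f)",
              [[NSDate date] timeIntervalSince1970], point.x, point.y);
    }
}
```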
Using third-party libraries, the code also provides the following examples:
- HTTP requests for sending and receiving messages via AFNetworking (see the first sketch below)
- Playing sounds using SoundManager (see the second sketch below)
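A minimal sketch of the first, assuming the AFNetworking 2.x API (AFHTTPRequestOperationManager); the URL and parameter names are placeholders:

```objc
#import "AFNetworking.h"

// POST a message to a server and log the reply (or the error).
AFHTTPRequestOperationManager *manager = [AFHTTPRequestOperationManager manager];
[manager POST:@"http://example.com/api/messages"
   parameters:@{@"text": @"hello"}
      success:^(AFHTTPRequestOperation *operation, id responseObject) {
          NSLog(@"Server replied: %@", responseObject);
      }
      failure:^(AFHTTPRequestOperation *operation, NSError *error) {
          NSLog(@"Request failed: %@", error);
      }];
```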
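And a minimal sketch of the second, assuming SoundManager's shared-singleton API; the file name `beep.caf` stands in for a sound bundled with the app:

```objc
#import "SoundManager.h"

// Warm up the audio hardware once, then play a short feedback tone.
[[SoundManager sharedManager] prepareToPlay];
[[SoundManager sharedManager] playSound:@"beep.caf" looping:NO];
```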
Version 1.0.1
HandSight uses a number of open source projects to work properly:
- SoundManager - Playing sound and music in iOS or Mac apps.
- BLEMini - BLE Mini is a small, certified BLE development board for makers to do their innovative projects. It can be used for BLE development using TI CC254x SDK.
- AFNetworking - AFNetworking is a delightful networking library for iOS and Mac OS X. It's built on top of the Foundation URL Loading System, extending the powerful high-level networking abstractions built into Cocoa. It has a modular architecture with well-designed, feature-rich APIs that are a joy to use.
- Interface for restaurant menus
- Ruofei Du
- me (at) duruofei (dot) com
Free Software, Hell Yeah!