According to Wikipedia, approximately 40 million people in the world are blind. IBM’s alphaWorks Services division has embarked on a noble project aimed at benefiting these 40 million. The project, called “Virtual Worlds User Interface for the Blind”, is described in an overview here:
And an FAQ document here:
The service currently works with Second Life (only), but IBM may support additional virtual worlds in the future. If they do add such support, they’ll tie new virtual worlds into the existing client, so that users only need to learn a single application.
With the IBM application, a virtual world is rendered entirely as text (no graphics), and sighted users can annotate objects in the virtual world with text descriptions or recorded audio.
The implementers chose to leverage some open source and off-the-shelf technology:
The system also uses QuickTime (to play event sound prompts and verbal annotations) and NVDA (an open source screen reader). IBM recommends the open source software Audacity for recording the verbal narrations.
I commend IBM for this effort and admire the flexibility and openness of the implementation they’ve chosen.