Drexel Student Team Wins First Place at Hackathon to Improve Web Accessibility

Drexel University team presenting at the April 18 evoHaX hackathon in Philadelphia, Pa.

A Drexel undergraduate student team recently won first place at a Philadelphia hackathon aimed at raising awareness of Web accessibility among student developers and tackling the challenges people with disabilities face when accessing information on the Web.

On April 18, College of Computing & Informatics (CCI) student Karishma Changlani, College of Engineering students Maissoun Ksara and Michelle Lanshe, and School of Biomedical Engineering, Science and Health Systems students Anna Lu and Neha Thomas represented the Drexel University team competing at evoHaX, a 24-hour-long hackathon to develop Web accessibility solutions. The event, which brought together five university student teams from the Philadelphia area, was part of the fifth annual Philly Tech Week (April 17-25), a celebration of Philadelphia-area innovation and technology.

The winning team designed an augmented speech-reader and Bluetooth module to assist people with cognitive disabilities (such as alexia, a reading disorder that is a common side effect of stroke), as well as visual and physical disabilities, in using a computer.

Existing text-to-speech reading tools have some disadvantages: they read all text from start to finish; they do not offer the ability to highlight; they lack header, paragraph and image identification; and they stop at line breaks instead of at the end of a sentence.
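The article does not include any of the team's code, but the line-break problem is easy to illustrate. The minimal Python sketch below (an illustration, not the team's implementation) contrasts splitting on line breaks with splitting only after sentence-ending punctuation:

```python
import re

page_text = (
    "Existing readers chunk text wherever a line\n"
    "break happens, even in the middle of a sentence. A\n"
    "punctuation-aware splitter keeps sentences whole."
)

# Naive approach: split on line breaks, which can cut a sentence in half.
line_chunks = page_text.splitlines()

# Punctuation-aware approach: join the lines, then split only after
# sentence-ending punctuation (., !, ?).
flattened = " ".join(page_text.splitlines())
sentence_chunks = re.split(r"(?<=[.!?])\s+", flattened)

print(line_chunks)      # three fragments, two of them ending mid-sentence
print(sentence_chunks)  # two complete sentences
```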

The Drexel module uses voice recognition to enable entirely hands-free audio interaction from up to 30 feet away, eliminating the need for a keyboard or mouse. Using Bluetooth technology, Arduino and Python command scripts, the module lets the user dictate text and give spoken commands (e.g., “open Chrome” or “search Google: cats”), then interfaces with the text-to-speech reader to identify clickable elements and enable selective reading of search results.
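The team's command scripts are not published with the article. As a hedged illustration of the speak-a-command idea, the sketch below uses the third-party SpeechRecognition package and Python's standard webbrowser module; the command phrases come from the article's examples, while the function names, the Google recognizer backend and the fallback of opening the default browser are assumptions:

```python
from urllib.parse import quote_plus
import webbrowser

import speech_recognition as sr


def handle_command(phrase: str) -> None:
    """Dispatch a recognized voice command, mirroring the article's examples."""
    phrase = phrase.lower().strip()
    if phrase.startswith("open chrome"):
        # Opens the default browser as a stand-in; launching Chrome itself
        # would need a platform-specific path or webbrowser.get(...).
        webbrowser.open("https://www.google.com")
    elif phrase.startswith("search google"):
        # Accepts either "search google: cats" or "search google cats".
        query = phrase.split(":", 1)[1] if ":" in phrase else phrase[len("search google"):]
        webbrowser.open("https://www.google.com/search?q=" + quote_plus(query.strip()))
    else:
        print("Unrecognized command:", phrase)


def listen_once() -> None:
    """Capture one utterance from the microphone and dispatch it."""
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source)
    try:
        handle_command(recognizer.recognize_google(audio))
    except sr.UnknownValueError:
        print("Could not understand the audio.")


if __name__ == "__main__":
    listen_once()
```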

“The inspiration for the module’s voice recognition aspect came initially from watching ‘Star Trek’ and seeing how they interact with their computer system only through voice commands on the show,” Lu said. “The Bluetooth speech-to-text aspect of the project was an extension of an idea I wanted to try initially with natural language text processing.”

The Drexel module also improves upon existing solutions, such as the open-source, Microsoft Windows-based screen reader NVDA (NonVisual Desktop Access), by allowing the user to read selectively by mouse pointer location (highlighting with NVDA requires the use of a mouse), automatically drawing boxes around headers, images, paragraphs and clickable elements, and parsing text into sentences by punctuation rather than by line breaks.
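Identifying headers, images, paragraphs and clickable elements is the first step behind both the boxing and the selective-reading features. The sketch below is not the team's code; it assumes BeautifulSoup purely to show how such elements can be collected by category from a page's HTML:

```python
from bs4 import BeautifulSoup

html = """
<h1>Philly Tech Week</h1>
<p>Five university teams competed at evoHaX.</p>
<a href="/results">See the results</a>
<img src="team.jpg" alt="Drexel team on stage">
"""

soup = BeautifulSoup(html, "html.parser")

# Collect the element categories the module distinguishes: headers,
# paragraphs, clickable elements (links), and images.
elements = {
    "headers": [h.get_text(strip=True) for h in soup.find_all(["h1", "h2", "h3"])],
    "paragraphs": [p.get_text(strip=True) for p in soup.find_all("p")],
    "clickable": [a.get_text(strip=True) for a in soup.find_all("a")],
    "images": [img.get("alt", "") for img in soup.find_all("img")],
}

for category, items in elements.items():
    # A reader could announce the category before reading each item, or a
    # front end could draw a box around the matching on-screen element.
    print(category, "->", items)
```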

The team believes that the interdisciplinary perspectives brought by five different majors within the group were integral to the project’s success. “Each of us contributed our strengths toward the actual design,” Lu said. “The circuitry and hardware was put together by Neha (biomedical engineering) and Michelle (computer engineering), the software coded by Karishma (computer science) and myself (biomedical engineering), and the presentation done by Maissoun (civil engineering).”

In the wake of their success at this year’s evoHaX competition, the Drexel team has agreed to continue working on the project in their free time to further develop and improve the module.
