Principal Investigator: Wei Wang
Co-Principal Investigator(s): Leslie Neely, Kathy Ewoldt
Organization: University of Texas at San Antonio
NSF Award Information: Enhancing Programming and Machine Learning Education for Students with Visual Impairments through the Use of Compilers, AI and Cloud Technologies
Abstract:
Attractive, high-paying, and highly flexible Computer Science careers should be more readily accessible to people with blindness or visual impairments (BVI). Unfortunately, teaching the required computer programming and data science skills to students with BVI is extremely challenging due to two major difficulties. The first is the limited capability of current screen readers to properly read computer code, which is a mix of English letters, digits, and punctuation marks; the specialized characters used in programming, such as spaces and tabs, are also not conveniently conveyed by screen readers. The second is time-consuming and frustrating code navigation, whereby students with BVI must repeatedly use screen readers to read every line in order to locate the desired line for editing. Partnering with the San Antonio Lighthouse for the Blind and Vision Impaired, the project will develop new accessibility tools, including a program syntax- and semantics-aware screen reader and a voice-command-based code navigation framework, to address these two difficulties. These accessibility tools will be offered through cloud-based web interfaces to provide nationwide access to students and educators. The success of this project will improve the effectiveness of teaching computer programming and data science to students with BVI, which in turn will make high-paying Computer Science careers accessible to more individuals with BVI and could lead to a more diverse Computer Science workforce.
These accessibility tools will use compilers, Artificial Intelligence (AI), and cloud technologies to read computer code statements based on their meaning, rather than reading only one character at a time. The screen reader will articulate the information that beginning coders need and help them more easily understand the lexicon and semantics used in computer programming and data science. The voice-command-based code navigation will employ speech recognition and natural language processing so that students can use their voice to easily locate a specific statement (e.g., a variable declaration) within their code. These accessibility tools will be integrated into Jupyter Notebook and offered through the cloud, giving nationwide access to students and educators. This cloud-based solution will also allow sophisticated AI models to be employed without requiring students to own powerful and expensive computers to run these accessibility tools. The project will conduct a systematic evaluation of these accessibility tools using a single-case research design to deepen the understanding of how technologies, including compilers, AI, and cloud computing, can be applied to teaching Computer Science skills to students with BVI. The evaluation will also provide feedback on the effectiveness of different speech styles and inform future improvements of these accessibility tools.
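To illustrate the general idea of reading a statement by its meaning rather than character by character, the minimal sketch below uses Python's standard-library ast module to turn a parsed statement into a plain-English description. This is only a hypothetical example of the kind of "speech style" a syntax-aware reader might produce; it is not the project's implementation, and the function name and phrasing are assumptions made for illustration.

```python
# Minimal sketch (assumption, not the project's tool): describe a statement
# semantically using Python's compiler front end (the ast module, Python 3.9+).
import ast


def describe_statement(source: str) -> str:
    """Return a plain-English description of a single Python statement."""
    node = ast.parse(source).body[0]
    if isinstance(node, ast.Assign) and isinstance(node.targets[0], ast.Name):
        # e.g. "total = price * quantity" -> spoken as an assignment
        return f"assign {ast.unparse(node.value)} to variable {node.targets[0].id}"
    if isinstance(node, ast.For) and isinstance(node.target, ast.Name):
        # e.g. "for row in data:" -> spoken as a loop header
        return f"for loop over {ast.unparse(node.iter)} with loop variable {node.target.id}"
    if isinstance(node, ast.FunctionDef):
        args = ", ".join(a.arg for a in node.args.args)
        return f"define function {node.name} taking arguments {args or 'none'}"
    # Fall back to the literal statement text for unhandled cases.
    return ast.unparse(node)


if __name__ == "__main__":
    print(describe_statement("total = price * quantity"))
    # -> assign price * quantity to variable total
    print(describe_statement("for row in data:\n    pass"))
    # -> for loop over data with loop variable row
```

In a real system, the text produced this way would be handed to a speech synthesizer or screen reader rather than printed, and the phrasing rules would be refined through the kind of speech-style evaluation the project describes.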
This award reflects NSF’s statutory mission and has been deemed worthy of support through evaluation using the Foundation’s intellectual merit and broader impacts review criteria.