Welcome to A11y Lab

We work in the technical HCI domain. Therefore, we welcome students with a strong technical background who are passionate about accessible computing and ability-based design and development (more info)!

A11y Lab is part of the Human-Centered Design and Development (HCDD) program in the College of Information Sciences and Technology (IST) at the Pennsylvania State University (Penn State).

A11y stands for Accessibility – there are 11 letters between the “A” and the “y”. We believe accessibility is a human right; we also believe making things accessible to people with disabilities is a hard problem. Often, this problem is layered, multi-dimensional, and difficult to understand, describe, and theorize.

At A11y Lab, we aim to address the following challenges:

  • Designing technology for specific groups with certain abilities or disabilities
  • Making Assistive Technologies easy-to-learn, frictionless, and ubiquitous
  • Designing robust, efficient, and extensible accessibility APIs in Operating Systems
  • Making non-visual interaction as efficient as visual interaction
  • Building tools for students with disabilities or from marginalized communities to broaden participation in computer education
  • Building assistive technologies (ATs) for emerging areas of computing, such as AR/VR/Mixed reality and embodied interaction

Please feel free to contact us if you are also working on these challenges and are interested in collaborating.


November 1, 2022
We are on the news: New coding tool could aid computer programmers who are blind or have low vision

October 9, 2022
Our paper, Grid-Coding, received the UIST 2022 Best Paper Award! Congratulations, Ehtesham! Try Grid Coding

October 5, 2022
An NIH R01 proposal was submitted

Sept 15, 2022
Multiple papers submitted to CHI 2023

August 16, 2022
Serving as a PC member for the AAAI 2023 and IUI 2023 conferences

August 15, 2022
Welcome Mahjabin Nahar, our new lab member!

August 04, 2022
Our paper, Grid-Coding: An Accessible, Efficient, and Structured Coding Paradigm for Blind and Low-Vision Programmers is accepted to UIST 2022

June 21, 2022
One journal paper submitted to ACM Transactions on Interactive Intelligent Systems (TiiS)

June 8, 2022
One paper submitted to USENIX Security 2023

June 1, 2022
One journal paper submitted to ACM Transactions on Accessible Computing (TACCESS)

April 19, 2022
We are on the news: Tech designed to aid visually impaired could benefit from human-AI collaboration

April 06, 2022
Our paper, Helping Helpers: Supporting Volunteers in Remote Sighted Assistance with Augmented Reality Maps is accepted to DIS 2022

Feb 28, 2022
Received two seed grants ($33K and $40K) on accessible computer vision algorithm and AI-based study companion

Dec 20, 2021
Our paper, Opportunities for Human-AI Collaboration in Remote Sighted Assistance is accepted to IUI 2022

Nov 16, 2021
Our journal paper, Iterative Design and Prototyping of Computer Vision Mediated Remote Sighted Assistance is accepted to ACM Transactions on Computer-Human Interaction (TOCHI)

August 06, 2021
Our paper, Tilt-Explore: Making Tilt Gestures Usable for Low-Vision Smartphone Users is accepted to UIST 2021

June 15, 2021
Our paper, Understanding Screen Readers' Plugins is accepted to ASSETS 2021

March 18, 2021
We are on the news: New tool could help lessen bias in live television broadcasts

August 05, 2020
Our poster, Towards Multi-Wheel Input Device for Non-Visual Interaction is accepted to UIST 2020

July 24, 2020
Our paper on balancing actors' gender and skin-color in live telecasts is accepted to CSCW 2020

March 05, 2020
Our poster, Wayfinding and Navigating in Virtual Environment: Learning from Audio Games, is accepted to the Thinking Within Symposium 2020 – Exploring Immersive Technologies in Education and Research

Oct. 30, 2019
Our paper, DarkReader, was nominated for the Best Paper Award at ASSETS 2019!

June 28, 2019
One paper accepted to ASSETS 2019!

... see all News