Microsoft and Be My Eyes collaborate in a first-of-its-kind program to make AI models more inclusive by closing the “accuracy gap” for over 340 million people in the blind and low-vision community

What you need to know

  • Microsoft and Be My Eyes are building on their partnership by collaborating to make AI models more inclusive of disability.
  • Be My Eyes will provide Microsoft with video datasets to close the accuracy gap in AI models on disability.
  • The company promises to maintain users’ privacy by omitting personal details from the videos shared with Microsoft. 

As someone who has spent years working and living alongside individuals with diverse abilities, I find this collaboration between Microsoft and Be My Eyes to be not just a step forward but a giant leap for inclusivity. As technology continues to evolve, it’s crucial that it does so in a way that considers everyone, regardless of their physical capabilities.


Be My Eyes is collaborating with Microsoft to make Microsoft’s AI models work effectively for the approximately 340 million people worldwide who are blind or have low vision, making these technologies more inclusive for them. For context, Be My Eyes is a mobile app that connects blind and low-vision users with sighted volunteers for visual assistance.

According to the company, integrating AI into the app will make it more effective and efficient:

The datasets typically used to train artificial intelligence (AI) models often lack real-world context and do not accurately represent the lives of people who are blind or have low vision. This absence of disability data could lead to an AI-dominated future that is biased and inaccessible, repeating the mistakes made during the development of the internet, but with potentially greater consequences.

When it comes to developing AI models, the experiences of people with disabilities, particularly those in the blind and low-vision community, are often overlooked. As a result, these models struggle with tasks related to disability.

The AI revolution will be “more inclusive”

It’s important to recognize that how well an AI model performs depends heavily on the data it is trained on. Unfortunately, people with disabilities are not always accurately represented, or even included, in these training datasets. That gap can limit how useful the resulting models are and can introduce bias.

Be My Eyes plans to share exclusive video datasets with Microsoft to improve how AI models are trained for people with disabilities, particularly those who are blind or have low vision. The videos will show the obstacles visually impaired people encounter, helping to close the “accuracy gap” that currently makes AI models less effective for this community.

Privacy and security remain major concerns as generative AI becomes more prevalent. In this first-of-its-kind collaboration with Microsoft, Be My Eyes promises to protect users’ privacy by scrubbing all personal data from the video metadata, including user, account, and Personally Identifiable Information (PII), before sharing the content with the tech giant. Users can also opt out of the program entirely, preventing their personal information and data from being shared.
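To make the idea concrete, here is a minimal sketch of what scrubbing PII from video metadata before sharing could look like. The field names and functions are hypothetical illustrations, not Be My Eyes’ actual pipeline or data schema.

```python
# Conceptual sketch only: shows the general idea of stripping user, account, and
# PII fields from a video's metadata before it is shared. Field names and
# functions are hypothetical and do not reflect Be My Eyes' actual pipeline.
from typing import Optional

PII_FIELDS = {"user_id", "account_id", "email", "phone", "gps_location", "device_id"}


def scrub_metadata(metadata: dict) -> dict:
    """Return a copy of the metadata with all known PII fields removed."""
    return {key: value for key, value in metadata.items() if key not in PII_FIELDS}


def prepare_for_sharing(video: dict, user_opted_out: bool) -> Optional[dict]:
    """Honor the user's opt-out choice, then scrub PII before sharing."""
    if user_opted_out:
        return None  # opted-out videos are never shared
    return {
        "frames": video["frames"],
        "metadata": scrub_metadata(video["metadata"]),
    }
```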

According to Be My Eyes:

The video datasets offer insights into the daily experiences of people in the blind and low-vision community. They will be used to improve the precision and accuracy of AI systems in understanding and describing scenes, ultimately making AI more useful and accessible for this community.

In late 2023, Microsoft partnered with Be My Eyes to make its customer service more accessible to visually impaired users. That collaboration introduced a tool called ‘Be My AI’, powered by OpenAI’s GPT-4 model, which gives users detailed, vivid descriptions of images.
