How a Simple Online Game Is Teaching AI to See with Human Insight
Researchers at Brown University use an online game, Click Me, to teach AI how humans perceive images. This helps AI interpret visuals more like people, improving accuracy and trust.

Teaching AI to See Like Humans: Insights from Brown University
At Brown University, researchers are exploring a simple yet powerful approach to improve how artificial intelligence perceives images—through an online game called Click Me. This interactive game invites participants to click on parts of an image that they believe are most informative for AI recognition. The data collected helps train AI models to interpret visual information more like humans do.
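To give a concrete sense of what this kind of data looks like in practice, here is a minimal sketch of how per-image clicks could be aggregated into a smoothed importance map for training. The function name, the Gaussian smoothing width, and the normalization scheme are illustrative assumptions, not the researchers' actual pipeline.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def clicks_to_importance_map(clicks, height, width, sigma=8.0):
    """Aggregate (row, col) click coordinates from many players into a
    smoothed, normalized importance map for a single image.

    Assumption: clicks arrive as pixel coordinates; sigma and the
    normalization are illustrative choices, not the Click Me pipeline.
    """
    counts = np.zeros((height, width), dtype=np.float32)
    for r, c in clicks:
        if 0 <= r < height and 0 <= c < width:
            counts[r, c] += 1.0
    smoothed = gaussian_filter(counts, sigma=sigma)  # spread point clicks into regions
    total = smoothed.sum()
    return smoothed / total if total > 0 else smoothed  # normalize to a distribution

# Example: three players clicked near the same object part in a 224x224 image
example_clicks = [(60, 110), (62, 114), (58, 108)]
importance = clicks_to_importance_map(example_clicks, 224, 224)
```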
AI has made significant strides in image recognition, identifying animals, objects, and even medical conditions. Despite this progress, AI systems still make errors that humans rarely make. For example, an AI might mislabel a dog wearing sunglasses or fail to recognize a stop sign partially obscured by graffiti. These mistakes highlight a gap between human and AI visual perception that tends to widen as models grow in size and complexity.
Bridging the Gap with Psychology and Neuroscience
To address this challenge, the project combines psychology, neuroscience, and machine learning. The goal is to decode how humans process visual information and then embed those patterns into AI algorithms. This approach aims to improve AI’s ability to represent the visual world in ways that align with human perception.
Participants in the Click Me game help AI models learn by selectively revealing parts of images. The AI only "sees" the portions clicked by players, which rewards strategic choices rather than random clicks. Later, during a process called "neural harmonization," AI models are trained to focus on the image features that human players identified. This alignment nudges the AI's visual recognition strategies to mirror human ones more closely.
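The article does not spell out the training objective, but a harmonization-style loss can be sketched as follows: classify the image correctly while pulling the model's gradient-based saliency toward the human click map. The function signature, the choice of gradient saliency, and the alpha weighting below are assumptions for illustration, not the published method.

```python
import torch
import torch.nn.functional as F

def harmonization_loss(model, images, labels, human_maps, alpha=1.0):
    """One possible harmonization-style objective (a sketch, not the
    published method): classify correctly while aligning the model's
    gradient saliency with human click maps.

    Assumption: human_maps is a (B, H, W) tensor of normalized
    importance maps; alpha trades off the two terms.
    """
    images = images.requires_grad_(True)
    logits = model(images)
    cls_loss = F.cross_entropy(logits, labels)

    # Saliency: gradient of the correct-class score w.r.t. the input pixels.
    scores = logits.gather(1, labels.unsqueeze(1)).sum()
    grads = torch.autograd.grad(scores, images, create_graph=True)[0]
    saliency = grads.abs().mean(dim=1)  # collapse channels -> (B, H, W)

    # Compare normalized saliency with normalized human maps.
    s = saliency / (saliency.flatten(1).sum(dim=1).view(-1, 1, 1) + 1e-8)
    h = human_maps / (human_maps.flatten(1).sum(dim=1).view(-1, 1, 1) + 1e-8)
    align_loss = F.mse_loss(s, h)

    return cls_loss + alpha * align_loss
```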
Engaging the Public to Advance AI Research
The project’s success owes much to widespread public participation. Thousands have played Click Me, generating tens of millions of interactions across platforms like Reddit and Instagram. This large-scale data collection accelerates research by providing diverse insights into how people perceive images.
Alongside the game, researchers developed a computational framework that trains AI models using behavioral data from humans. This method aligns AI decision-making speed and choices with human responses, leading to more natural and interpretable AI behavior.
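As a rough illustration of what "aligning decision-making speed and choices" could mean in code, the sketch below combines two terms: agreement with the categories humans chose, and a penalty when the model's uncertainty fails to track human reaction times. All names, and the use of entropy as a proxy for decision speed, are assumptions rather than the researchers' actual framework.

```python
import torch
import torch.nn.functional as F

def behavioral_alignment_loss(logits, human_choices, human_rts, beta=0.5):
    """A hedged sketch of behavioral alignment: match the choices people
    made and ask the model's uncertainty (entropy) to track how long
    people took to respond.

    Assumptions: human_choices (B,) are class indices picked by players;
    human_rts (B,) are reaction times normalized to [0, 1]; beta weighs
    the speed-alignment term.
    """
    # Term 1: the model should prefer the same categories humans chose.
    choice_loss = F.cross_entropy(logits, human_choices)

    # Term 2: trials that were hard for humans (long reaction times)
    # should also look hard to the model, i.e. have higher entropy.
    probs = F.softmax(logits, dim=1)
    entropy = -(probs * torch.log(probs + 1e-8)).sum(dim=1)
    entropy_norm = entropy / torch.log(torch.tensor(float(logits.size(1))))
    speed_loss = F.mse_loss(entropy_norm, human_rts)

    return choice_loss + beta * speed_loss
```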
Practical Benefits Across Industries
- Medicine: AI tools that explain their diagnoses in human-like ways can build trust with doctors, improving adoption and patient care.
- Autonomous Vehicles: AI that better understands human visual decision-making can predict driver behavior more accurately, enhancing safety.
- Accessibility and Education: Human-aligned AI can improve tools that support learning and accessibility for diverse user needs.
Beyond these domains, this approach could enhance decision support systems across various fields by making AI more transparent and reliable.
Advancing Human Vision Science Through AI
The project also contributes to neuroscience by producing AI models that resemble human visual processing more closely than previous models. This synergy between AI and brain science is supported by federal research funding, including grants from the National Science Foundation (NSF). Such investments drive scientific progress and deliver practical technologies that improve the safety and effectiveness of the AI systems we rely on daily.
For those interested in expanding their AI expertise, exploring courses on human-centered AI and machine learning may be valuable. Resources like Complete AI Training’s latest courses offer practical insights into the intersection of AI and human cognition.