Second phase secured: DFG extends TRR 318 for social, context-aware AI explanations

DFG renews TRR 318 with €14M, keeping social AI research at Paderborn and Bielefeld running from Jan 2026 for 3.5 years. Phase two zeroes in on dialogue and contextual explanations.

Categorized in: AI News, Science and Research
Published on: Nov 22, 2025

DFG extends TRR 318 "Constructing Explainability": €14M for social AI research at Paderborn and Bielefeld

The German Research Foundation (DFG) has approved a second funding phase for the Transregional Collaborative Research Centre 318 "Constructing Explainability." The project at Paderborn University and Bielefeld University will continue from January 2026 for another three and a half years with around €14 million in funding.

"This decision highlights the importance of research into social artificial intelligence and demonstrates the exceptional interdisciplinary expertise at an elite international level that we are pooling here at Paderborn and Bielefeld Universities," said Professor Matthias Bauer, President of Paderborn University. "The universities are collaborating on this project as strong regional partners. This collaboration clearly showcases the strength of research and innovative capacity that sets Ostwestfalen-Lippe apart - this Collaborative Research Centre is an international flagship for the region," added Professor Angelika Epple, Principal of Bielefeld University.

What TRR 318 is building

TRR 318 brings together computer science, linguistics, media studies, philosophy, psychology, sociology, and economics. Since July 2021, the team has been asking a simple question with hard implications: how do we make AI explanations genuinely useful for people?

The approach goes beyond traditional "explainable AI." Instead of one-way justifications, the focus is on social interaction: systems that adapt to users' needs and collaboratively arrive at clarity.

Explanations work best as dialogue

Over four and a half years of research, one pattern kept showing up: explanations land only when they reflect the receiver's perspective. "Although we often want a perfect explanation, one provided in the form of a monologue may not be successful. Instead, a dialogue creates an opportunity for the people on both sides to be actively involved in shaping the process of what they understand and how," said Professor Katharina Rohlfing, spokesperson for TRR 318 and Professor of Psycholinguistics at Paderborn University.

Empirical studies examined how people use language, gestures, and reactions to signal understanding, and how those cues can guide AI systems. The team also looked at explainability in everyday contexts and incorporated current developments, such as large language models like ChatGPT, into the work early on.

Phase two: context-aware explainability

The next phase zeroes in on context: situations, settings, people, interpretations, and the shared knowledge that builds up in conversation. The goal is to adapt explanations to these conditions in real time.

"In one context, a brief, technical explanation could be helpful, whilst another requires a more detailed, everyday approach. Explanation requirements can also vary within a single setting," said Professor Philipp Cimiano, deputy spokesperson for TRR 318 and Professor of Semantic Databases at Bielefeld University. Future systems should respond to these shifts and flexibly manage the dialogue with users.

"We are delighted about the trust that the DFG have placed in us," added Professor Rohlfing. "During the second phase, we look forward to taking responsibility for developing a more social form of explainable AI and applying these findings to practical settings so that AI explanations are comprehensible, helpful and relevant for users."

Scale and structure

TRR 318 involves more than 60 researchers across 20 sub-projects spanning seven disciplines. Work is organized into three research areas and supported by a graduate school to develop early-career talent.

The extension secures a productive collaboration and strengthens Paderborn and Bielefeld as national and international hubs for AI research, with measurable outcomes for real-world applications.

Why this matters for research and practice

  • Design implication: explanations should be interactive, with feedback loops that detect and respond to user signals (verbal and nonverbal).
  • Evaluation shift: include user comprehension and task outcomes, not just model fidelity or feature attribution accuracy.
  • Context handling: build policies that switch explanation style (technical vs. everyday) based on situation, expertise, and shared history; see the sketch after this list.
  • Deployment: prioritize field studies in everyday settings to test whether explanations help people make better decisions.
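
To make the context-handling point concrete, here is a minimal Python sketch of a rule-based policy that switches between a technical and an everyday explanation style and tracks the shared knowledge built up over a dialogue. Everything in it (the ExplanationContext fields, choose_style, explain) is a hypothetical illustration of the idea, not TRR 318 software or a published method.

```python
from dataclasses import dataclass, field


@dataclass
class ExplanationContext:
    """Illustrative context features an explainer might track (assumed, not from TRR 318)."""
    user_expertise: str                 # e.g. "novice" or "expert"
    setting: str                        # e.g. "clinical", "everyday"
    shared_history: list[str] = field(default_factory=list)  # concepts already grounded in dialogue


def choose_style(ctx: ExplanationContext) -> str:
    """Pick an explanation style from coarse context cues (hand-written rule for illustration)."""
    if ctx.user_expertise == "expert" and ctx.setting != "everyday":
        return "technical"
    return "everyday"


def explain(concept: str, ctx: ExplanationContext) -> str:
    """Produce one explanation turn, reusing shared history instead of repeating old ground."""
    if concept in ctx.shared_history:
        return f"As we discussed, {concept} still applies here."
    style = choose_style(ctx)
    if style == "technical":
        text = f"{concept}: feature attribution shows which inputs drove the score."
    else:
        text = f"{concept}: put simply, the system weighed a few familiar factors."
    ctx.shared_history.append(concept)  # update the shared knowledge for later turns
    return text


if __name__ == "__main__":
    ctx = ExplanationContext(user_expertise="novice", setting="everyday")
    print(explain("model confidence", ctx))
    # A follow-up question could act as a feedback signal that shifts the context.
    ctx.user_expertise = "expert"
    print(explain("model confidence", ctx))  # now draws on the shared history
```

In a real dialogue system, the expertise, setting, and shared-history fields would be inferred from the user's verbal and nonverbal feedback rather than set by hand, and the switching policy would likely be learned rather than hard-coded.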

For background on Germany's coordinated research programs, see the German Research Foundation (DFG).

If you're building dialogue-based, explainable systems and want structured upskilling, explore practical resources by job role: AI courses by job.

