Weverse deploys Google Cloud AI to handle fan support across 245 countries and regions

Weverse has deployed Google Cloud's conversational AI to handle fan support across 245 countries and regions since March. The system manages multilingual inquiries around the clock for more than 180 artists, including BTS and Blackpink.

Published on: May 08, 2026

Weverse deploys Google Cloud AI to handle global fan support at scale

Weverse has deployed Google Cloud's conversational AI in its fan support system, handling customer inquiries across 245 countries and regions since March. The platform uses machine learning and natural language processing to provide round-the-clock responses in multiple languages about ticket reservations, merchandise purchases and platform use.

The move addresses a core operational challenge: fan communities can generate massive traffic spikes during artist events and product releases. Weverse hosts communities for more than 180 artists, including BTS, Blackpink, Seventeen, Dua Lipa and The Kid Laroi, spanning K-pop and Western pop markets.

Why this matters for support teams

Support teams managing global audiences face predictable crises. When a major artist announces a tour or drops merchandise, inquiry volumes can overwhelm human staff within hours. Weverse's system absorbs these spikes by automating responses to common questions, freeing teams to handle complex issues that require human judgment.

The deployment builds on existing infrastructure. Weverse previously moved its data analytics to Google Cloud's BigQuery to manage traffic peaks. The new support system integrates with that data layer, allowing the platform to learn from customer behavior patterns and adjust responses during high-demand periods.
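One way such a data-layer integration could work (a hypothetical sketch for illustration, not Weverse's actual implementation) is to re-rank canned support intents by how often each appeared in a recent window of inquiries pulled from the analytics warehouse, so the assistant surfaces event-relevant answers first during a spike:

```python
from collections import Counter

def rank_intents(recent_inquiries, known_intents):
    """Order support intents by recent inquiry frequency so the
    assistant prioritizes what fans are asking about right now.
    Intent names here are made up for the example."""
    counts = Counter(recent_inquiries)
    # Sort by descending recent frequency; unseen intents fall to
    # the tail in stable alphabetical order.
    return sorted(known_intents, key=lambda i: (-counts[i], i))

# During a ticket on-sale, reservation questions dominate the window.
window = ["ticket_reservation"] * 40 + ["merch_order"] * 15 + ["login_help"] * 5
priority = rank_intents(window, ["login_help", "merch_order",
                                 "ticket_reservation", "refunds"])
print(priority)  # ticket_reservation first; refunds (unseen) last
```

In a production system the inquiry window would come from a warehouse query rather than an in-memory list, but the re-ranking idea is the same.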

How the system works

The conversational AI handles inquiries in native languages without routing them through translation layers. This matters because support quality degrades when customers must wait for human translation or navigate interface delays during peak periods.
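The difference can be sketched in a few lines (a hypothetical illustration; the real system uses Google Cloud's conversational AI): answers are authored per language and served directly, with English only as a fallback, instead of translating every inquiry to English and translating the reply back.

```python
# Per-language answer store; keys and strings are invented for the example.
ANSWERS = {
    "ko": {"ticket": "티켓 예매는 공연 페이지에서 가능합니다."},
    "en": {"ticket": "You can reserve tickets on the show page."},
}

def reply(intent: str, lang: str) -> str:
    """Serve the answer authored in the fan's language; fall back to
    English only when no native-language answer exists. No translation
    hop sits between the inquiry and the response."""
    return ANSWERS.get(lang, ANSWERS["en"]).get(intent, ANSWERS["en"][intent])

print(reply("ticket", "ko"))  # native Korean answer, served directly
print(reply("ticket", "fr"))  # English fallback for an uncovered language
```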

Joon Choi, president of Weverse Company, said the system addresses operational complexity at global scale. "By collaborating with Google Cloud to implement advanced AI automation, we are now able to provide high-quality, instantaneous support to our global community in their native languages," he said.

Weverse plans to double processing efficiency within the year. The company measures success in transaction continuity: how many customer inquiries are resolved without disrupting ticket sales or merchandise purchases during busy periods.

Broader industry context

The deployment reflects a wider shift across consumer internet platforms toward AI for customer support, particularly where multilingual requirements and sudden demand spikes create operational strain. Entertainment and e-commerce platforms face sharper traffic surges than most industries.

Ruth Sun, managing director of Google Cloud Korea, framed the project in terms of infrastructure requirements. "Global cultural events that transcend borders require a support infrastructure that can handle massive scale and diverse languages instantaneously," she said.

For support professionals, the Weverse case demonstrates how generative AI and LLM systems can absorb predictable demand spikes without degrading service quality. The key difference from earlier chatbot deployments is integration with live transaction data: the system learns which questions matter most during specific periods and adjusts accordingly.

