Google bets on offline AI translation with TranslateGemma models

Google has launched TranslateGemma, a new set of open AI translation models designed to run on everyday devices, a release later highlighted on LinkedIn by Google Chief Strategist Neil Hoyne.

The new collection of models supports text translation across 55 languages, including offline use on mobile devices and other low-compute environments.

TranslateGemma is built on Google’s Gemma 3 model family and is being released openly for developers and researchers to use, adapt, and integrate into their own applications. The models are designed to run locally rather than relying on cloud infrastructure, signaling a continued shift toward on-device AI deployment.

Google develops large-scale AI models and developer tools across search, productivity, and machine learning research, with recent releases placing greater emphasis on open models that can operate outside centralized cloud systems.

Offline translation targets mobile and low-connectivity settings

TranslateGemma is available in three model sizes and is designed to run across a range of hardware, from smartphones to consumer laptops. Google said the smallest model was optimized for mobile and edge deployment, allowing translation to work without an internet connection or cloud subscription.

Following the launch, Hoyne took to LinkedIn to highlight the practical implications of the release. Hoyne wrote, “What if you could translate 55 languages on your phone – offline, for free? That’s basically what Google just made possible.”

He added that the models were “designed to run on regular devices, not just massive Cloud servers,” positioning the release as a move away from hardware-heavy AI deployment.

Google said TranslateGemma was trained and evaluated across 55 languages, spanning high-, mid-, and low-resource languages. The company also confirmed that nearly 500 language pairs were used during training, with the aim of improving translation quality for languages that are often underserved by commercial AI tools.

Hoyne highlighted this point directly in his post, writing, “55 languages, including many that usually get overlooked in AI implementations.” He added that the models were trained on “nearly 500 language pairs total,” describing the scale as significant for communities that rarely receive high-quality translation tools.

Open release lowers barriers for developers and communities

TranslateGemma is being released openly, with downloads available through platforms including Kaggle and Hugging Face, alongside deployment options via Google’s Vertex AI. Google said the models were intended to be used, modified, and built into third-party applications without licensing restrictions.

Hoyne also pointed to multimodal capabilities, noting that the models could “translate text in images too – think signs, menus, screenshots.”

Reflecting on the broader impact, Hoyne wrote, “This feels like a real step toward making AI translation accessible to everyone, not just people with expensive hardware or paid subscriptions.”

