
Cohere Targets Multilingual Edge AI With Tiny Aya Open-Weight Model Family

Cohere has introduced Tiny Aya, a new open-weight family of multilingual AI models designed to run efficiently on everyday hardware, placing the company squarely in the on-device and emerging-markets AI race. Developed by Cohere Labs, the 3.35‑billion‑parameter base model and its TinyAya‑Global and regional variants (Earth, Fire, Water) support more than 70 languages, including key South Asian languages such as Hindi, Bengali, Tamil, Telugu, Marathi, Punjabi, Urdu, and Gujarati. The variants are tailored to capture local linguistic and cultural nuance while maintaining broad multilingual coverage.

Trained on a single cluster of 64 Nvidia H100 GPUs, Tiny Aya underscores Cohere’s focus on cost-efficient training and deployment. The models are engineered to run offline on laptops and similar devices for use cases such as translation and local-language applications in connectivity‑constrained markets like India. The models, along with associated training and evaluation datasets, are being distributed through Hugging Face, the Cohere Platform, Kaggle, and Ollama, signaling a deliberate push into the open ecosystem while expanding Cohere’s developer footprint and potential enterprise adoption. The company, led by CEO Aidan Gomez, who has publicly discussed plans to go public “soon,” reportedly closed 2025 with $240 million in annual recurring revenue and 50% quarter‑over‑quarter growth, suggesting that Tiny Aya could further reinforce its revenue trajectory and competitive positioning in enterprise and edge AI.
