Trillion Labs Unveils 70 Billion-Parameter Language Model 'Tri-70B'

Trillion Labs Opens New Horizons for Domestic AI: Unveiling of the 70-Billion-Parameter 'Tri-70B' Model and a Groundbreaking Open-Source Declaration

Sep 10, 2025 - 00:00
Trillion Labs, a startup positioned to shape the future of South Korean AI technology, is making another significant impact on the domestic AI ecosystem. By releasing its 70-billion-parameter large language model, Tri-70B, the company has produced the largest language model yet designed and trained independently from scratch in South Korea.

The announcement goes beyond showcasing new technology. Trillion Labs has declared an 'Open Source Month' and is releasing its entire model lineup, ranging from 0.5B to 70B parameters, under the Apache 2.0 license. Because the license permits commercial use as well as research, the move signals a strong commitment to broadening the base of domestic AI technology and accelerating industrial innovation.

A particular highlight of the release is that Trillion Labs is sharing not only the completed models but also the intermediate checkpoints generated during training, a practice that even organizations such as the Allen Institute for AI and Hugging Face have attempted only in limited form. By doing so, the company maximizes transparency in AI research and provides valuable assets to the global developer community. This open approach is expected to accelerate the pace of AI development and strengthen the foundation for follow-on research.

The Tri-70B lineup also includes models specialized in multilingual translation, as well as variants capable of real-time search integration, adding practical value. The search variants are designed to incorporate up-to-date information by querying external search engines such as DuckDuckGo, meeting the demands of a society in which information immediacy is crucial.

Shin Jae-min, CEO of Trillion Labs, emphasized that this comprehensive disclosure, which transparently reveals even the training process and core techniques, will lay the groundwork for the growth of the AI research ecosystem and serve as a crucial starting point for demonstrating that domestic AI companies can secure global competitiveness through technological excellence and an open research culture. The release amounts to a declaration that Korean AI technology can stand out on the world stage.

Although founded only in August 2024, Trillion Labs has demonstrated its strength by assembling a team of top-tier talent from institutions such as KAIST, Oxford, and Berkeley and companies such as Amazon and Naver, and by independently designing and pre-training Korean-centric LLMs. The company secured $5.8 million in pre-seed investment last September and has consistently demonstrated its technical capability and philosophy of openness by open-sourcing smaller models such as Tri-7B and Tri-21B. Trillion Labs' release of a large-scale model, together with its comprehensive open-source strategy, is expected to be a powerful catalyst for the advancement of domestic AI technology, and its next steps will be watched closely.
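For readers curious what the search-integrated usage described above might look like in practice, the following is a minimal sketch, not an official Trillion Labs implementation. It assumes the released checkpoints behave as standard causal language models published on the Hugging Face Hub; the repository id "trillionlabs/Tri-7B", the use of the third-party duckduckgo_search package, and the prompt format are illustrative assumptions rather than details confirmed in the announcement.

# Sketch: augmenting a prompt with fresh DuckDuckGo results before generation.
# Assumes the transformers, accelerate, and duckduckgo_search packages are installed.
from duckduckgo_search import DDGS
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "trillionlabs/Tri-7B"  # assumed repository id, used here for illustration

def search_snippets(query: str, k: int = 3) -> list[str]:
    # Pull short text snippets from DuckDuckGo to ground the prompt in current information.
    with DDGS() as ddgs:
        return [hit["body"] for hit in ddgs.text(query, max_results=k)]

def answer_with_search(question: str) -> str:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    context = "\n".join(f"- {s}" for s in search_snippets(question))
    prompt = (
        "Use the search results below to answer the question.\n\n"
        f"Search results:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=200)
    # Strip the prompt tokens so only the newly generated answer is returned.
    return tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)

if __name__ == "__main__":
    print(answer_with_search("What did Trillion Labs open-source in 2025?"))

The design simply prepends retrieved snippets to the prompt; production search-integrated models typically use more elaborate retrieval and formatting than this sketch shows.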
