"Đăng ký ngay" Báo cáo bán tuần Dự án Chất lượng Bậc A, Khám phá 1% Dự án xuất sắc nhất
API Tải ứng dụng RootData

Gradient releases the Echo-2 RL framework, improving AI research efficiency by over 10 times

Feb 12, 2026 23:14:51


The distributed AI lab Gradient today released Echo-2, a distributed reinforcement learning framework designed to break the training-efficiency bottleneck in AI research. By fully decoupling the Learner and Actor at the architectural level, Echo-2 reduces the post-training cost of a 30B model from $4,500 to $425, delivering more than 10x the research throughput for the same budget.

The framework uses compute-storage separation for asynchronous RL training (Async RL), offloading the heavy sampling workload to unstable GPU instances and Parallax-based heterogeneous GPUs. Combined with bounded staleness, fault-tolerant instance scheduling, and the in-house Lattica communication protocol, it substantially improves training efficiency while preserving model accuracy. Alongside the framework release, Gradient will soon launch Logits, an RLaaS platform, promoting a shift in AI research from a "capital accumulation" paradigm to an "efficiency iteration" paradigm. Logits is now open for reservations by students and researchers worldwide.
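The article does not publish Echo-2's internals, but the core idea it names, asynchronous RL with bounded staleness between a decoupled Learner and Actor, can be illustrated with a minimal sketch. Everything here (the `Learner` class, the `STALENESS_BOUND` value, the `submit`/`update` methods) is a hypothetical illustration, not Gradient's actual API: actors generate rollouts tagged with the policy version they sampled from, and the learner accepts a rollout only if that version lags its current policy by at most a fixed bound.

```python
STALENESS_BOUND = 2  # hypothetical: max policy-version lag a rollout may have


class Learner:
    """Toy learner that enforces bounded staleness on incoming rollouts."""

    def __init__(self):
        self.version = 0   # current policy version
        self.buffer = []   # accepted rollouts awaiting the next update

    def submit(self, sample_version, sample):
        # Bounded staleness: accept a rollout only if the policy that
        # generated it is at most STALENESS_BOUND versions behind.
        # This lets actors run asynchronously (even on flaky instances)
        # without feeding the learner arbitrarily stale data.
        if self.version - sample_version <= STALENESS_BOUND:
            self.buffer.append(sample)
            return True
        return False

    def update(self):
        # Consume buffered rollouts and advance the policy version.
        self.buffer.clear()
        self.version += 1


learner = Learner()
fresh_ok = learner.submit(0, "rollout-a")   # lag 0 <= 2, accepted
for _ in range(3):
    learner.update()                        # learner is now at version 3
stale_ok = learner.submit(0, "rollout-b")   # lag 3 > 2, rejected
print(fresh_ok, stale_ok)
```

In a real system the rejected rollout would typically be discarded or re-weighted (e.g. by importance sampling) rather than silently dropped; the sketch only shows the admission rule that keeps asynchronous actors within a bounded lag of the learner.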
