Baidu Ships ERNIE 5.0: A 2.4-Trillion Parameter MoE Model That Almost No One Outside China Will Notice

Baidu has launched ERNIE 5.0, a natively omni-modal mixture-of-experts model with 2.4 trillion total parameters — a frontier-class release that highlights the widening gap between Chinese and Western AI discourse.
