@percyliang
We did a very careful study of 10 optimizers with no horse in the race. Despite all the excitement about Muon, Mars, Kron, Soap, etc., once you tune the hyperparameters rigorously and scale up, the speedup over AdamW shrinks to only ~10% :-( Experiments were made possible by Marin (https://t.co/UgEjGM0HPY); anyone developing new optimizers: please come try your method on this benchmark!
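To make the "speedup" claim concrete, here is a minimal sketch (not the Marin benchmark protocol) of the kind of comparison described above: tune each optimizer over its own hyperparameter grid, train to a fixed target loss, and report the relative speedup in steps over AdamW. The toy model, data, grid, and the use of SGD as a stand-in candidate are all hypothetical placeholders.

```python
# Minimal sketch of a per-optimizer-tuned speedup comparison.
# Assumptions: toy linear model and synthetic data stand in for an LM;
# SGD stands in for a candidate optimizer (Muon etc. are not in torch.optim).
import torch
import torch.nn as nn

def steps_to_target(optimizer_cls, lr, target_loss=0.1, max_steps=2000):
    torch.manual_seed(0)
    model = nn.Linear(32, 1)              # toy model
    opt = optimizer_cls(model.parameters(), lr=lr)
    x = torch.randn(256, 32)
    y = x @ torch.randn(32, 1)            # synthetic regression target
    for step in range(1, max_steps + 1):
        loss = nn.functional.mse_loss(model(x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
        if loss.item() < target_loss:
            return step
    return max_steps

# Tune each optimizer over its own grid: comparing at one shared
# learning rate would bias the result toward whichever optimizer
# happens to prefer it.
def best_steps(optimizer_cls, lrs=(1e-3, 3e-3, 1e-2, 3e-2)):
    return min(steps_to_target(optimizer_cls, lr) for lr in lrs)

adamw_steps = best_steps(torch.optim.AdamW)
candidate_steps = best_steps(torch.optim.SGD)  # hypothetical candidate
print(f"speedup over AdamW: {adamw_steps / candidate_steps:.2f}x")
```

The key design point is that the hyperparameter search happens independently per optimizer before any comparison; the post's finding is that with this kind of rigorous per-method tuning at scale, the gap over AdamW largely closes.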