@natolambert
Google dropped 4 different Gemma open-weight models! I'm most excited that they're finally adopting a standard Apache 2.0 open source license. This'll massively boost adoption. The standard for better licenses was set mostly by Chinese open model labs, and now U.S. labs are following suit. The models are roughly a 31B dense, a 26B MoE with 4B active, an 8B, and a 5B dense (called smaller for some reason). Base models too. Good sizes for tinkering, some local uses, and research (the 8B/5B). ~30B is a particularly great size range for building useful tools (which is why we made Olmo 3 that size too). Gemini doesn't release bad models, so I'm excited to try these! Congrats, Googlers.