@omarsar0
IBM presents Granite 3.0, a family of lightweight foundation models ranging from 400M to 8B parameters. The models support coding, RAG, reasoning, and function calling, with a focus on enterprise use cases, including on-premise and on-device settings.

The technical report contains detailed discussions of how they collect synthetic datasets for code, reasoning, RAG, tool use, and more.

"Granite 3.0 language models demonstrate strong performance across a battery of academic benchmarks for language understanding, reasoning, coding, function calling, and safety."

The release includes pre-trained and post-trained versions of all Granite 3.0 models under a permissive Apache 2.0 license. A very strong release by the Granite Team at IBM.