🐦 Twitter Post Details


@osanseviero

Some fun things people may have missed from Gemma 3 270M:

1. Out of 270M params, 170M are embedding params and 100M are transformer blocks. BERT from 2018 was larger 🤯
2. The vocabulary is quite large (262,144 tokens). This makes Gemma 3 270M a very good model to hyper-specialize in a task or a specific language, as the model will work well even with less common tokens.
3. We released both a pre-trained and an instruct model, enabling you to fine-tune for your needs.
4. We collaborated closely with the developer ecosystem to get this out, allowing you to use Hugging Face transformers and transformers.js, Ollama, Kaggle, LM Studio, Docker, LiteRT, Vertex, llama.cpp, Keras, MLX, Gemma.cpp, Unsloth, JAX, Cloud Run, and more.

https://t.co/CLciq44qOS
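The 170M/100M split in point 1 follows almost entirely from the vocabulary size in point 2: the input embedding table alone is vocab_size × hidden_size parameters. A minimal back-of-the-envelope sketch, assuming the publicly reported hidden size of 640 for Gemma 3 270M (the hidden size is an assumption, not stated in the post):

```python
# Rough parameter accounting for Gemma 3 270M.
# vocab_size is from the post; hidden_size = 640 is an assumed
# value based on public model configs, used here for illustration.
vocab_size = 262_144
hidden_size = 640  # assumption

embedding_params = vocab_size * hidden_size
print(f"embedding params: {embedding_params:,}")   # ~168M, i.e. the "170M" in the post
print(f"share of 270M total: {embedding_params / 270_000_000:.0%}")
```

This is why such a large vocabulary dominates a model this small: roughly two-thirds of all weights sit in the embedding table, leaving only ~100M for the transformer blocks themselves.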

Media 1

📊 Media Metadata

{
  "media": [
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/1956258657483534803/media_0.jpg?",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/1956258657483534803/media_0.jpg?",
      "type": "photo",
      "filename": "media_0.jpg"
    }
  ],
  "processed_at": "2025-08-15T22:36:33.755162",
  "pipeline_version": "2.0"
}

🔧 Raw API Response

{
  "type": "tweet",
  "id": "1956258657483534803",
  "url": "https://x.com/osanseviero/status/1956258657483534803",
  "twitterUrl": "https://twitter.com/osanseviero/status/1956258657483534803",
  "text": "Some fun things people may have missed from Gemma 3 270M:\n\n1. Out of 270M params, 170M are embedding params and 100M are transformers blocks. Bert from 2018 was  larger 🤯\n2. The vocabulary is quite large (262144 tokens). This makes Gemma 3 270M very good model to be hyper specialized in a task or a specific language, as the model will work very well even with less common tokens.\n3. We released both a pre-trained and an instruct model, enabling you to fine-tune for your needs.\n4. We collaborated closely with the developer ecosystem to get this out, allowing you to use Hugging Face transformers and transformers.js, Ollama, Kaggle, LM Studio, Docker, LiteRT, Vertex, llama.cpp, Keras, MLX, Gemma.cpp, UnSloth, JAX, Cloud Run, and more.\n\nhttps://t.co/CLciq44qOS",
  "source": "Twitter for iPhone",
  "retweetCount": 86,
  "replyCount": 32,
  "likeCount": 843,
  "quoteCount": 3,
  "viewCount": 64890,
  "createdAt": "Fri Aug 15 07:36:24 +0000 2025",
  "lang": "en",
  "bookmarkCount": 290,
  "isReply": false,
  "inReplyToId": null,
  "conversationId": "1956258657483534803",
  "displayTextRange": [
    0,
    273
  ],
  "inReplyToUserId": null,
  "inReplyToUsername": null,
  "author": {
    "type": "user",
    "userName": "osanseviero",
    "url": "https://x.com/osanseviero",
    "twitterUrl": "https://twitter.com/osanseviero",
    "id": "207744565",
    "name": "Omar Sanseviero",
    "isVerified": false,
    "isBlueVerified": true,
    "verifiedType": null,
    "profilePicture": "https://pbs.twimg.com/profile_images/1676696716693700608/t4kv-MrC_normal.jpg",
    "coverPicture": "https://pbs.twimg.com/profile_banners/207744565/1559310065",
    "description": "",
    "location": "Zurich",
    "followers": 49476,
    "following": 2602,
    "status": "",
    "canDm": true,
    "canMediaTag": false,
    "createdAt": "Mon Oct 25 23:29:03 +0000 2010",
    "entities": {
      "description": {
        "urls": []
      },
      "url": {}
    },
    "fastFollowersCount": 0,
    "favouritesCount": 24680,
    "hasCustomTimelines": true,
    "isTranslator": false,
    "mediaCount": 1348,
    "statusesCount": 10565,
    "withheldInCountries": [],
    "affiliatesHighlightedLabel": {},
    "possiblySensitive": false,
    "pinnedTweetIds": [],
    "profile_bio": {
      "description": "Making ML go brr at Google\n\nex-Chief Llama Officer @huggingface 🦙\nFounder @AI_Learners.\n100% Hacker Llama🇵🇪🇲🇽",
      "entities": {
        "description": {
          "user_mentions": [
            {
              "id_str": "0",
              "indices": [
                51,
                63
              ],
              "name": "",
              "screen_name": "huggingface"
            },
            {
              "id_str": "0",
              "indices": [
                74,
                86
              ],
              "name": "",
              "screen_name": "AI_Learners"
            }
          ]
        },
        "url": {
          "urls": [
            {
              "display_url": "osanseviero.github.io/hackerllama/",
              "expanded_url": "https://osanseviero.github.io/hackerllama/",
              "indices": [
                0,
                23
              ],
              "url": "https://t.co/3LrSRqlhYK"
            }
          ]
        }
      }
    },
    "isAutomated": false,
    "automatedBy": null
  },
  "extendedEntities": {},
  "card": null,
  "place": {},
  "entities": {
    "urls": [
      {
        "display_url": "huggingface.co/google/gemma-3…",
        "expanded_url": "https://huggingface.co/google/gemma-3-270m",
        "indices": [
          742,
          765
        ],
        "url": "https://t.co/CLciq44qOS"
      }
    ]
  },
  "quoted_tweet": null,
  "retweeted_tweet": null,
  "article": null
}