🐦 Twitter Post Details


@osanseviero

Introducing T5Gemma 2, the next generation of encoder-decoder models 🚀

Built on top of Gemma 3, we were able to build compact models at sizes of 270m-270m, 1B-1B, and 4B-4B sizes.

While most models today are decoder-only, T5Gemma 2 is the first (I'm aware of) multimodal, long-context, and heavily multilingual (140 languages) encoder-decoder model out there.

We hope this model enables the model research community as well as the community of devs ready to explore with new architectures.

Blog: https://t.co/12ScxYcjxa
Models: https://t.co/D38wNFo5Bc
Paper: https://t.co/2rypSQ7Bf6

Media 1
Media 2
Media 3

📊 Media Metadata

{
  "media": [
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2001723652635541566/media_0.jpg?",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2001723652635541566/media_0.jpg?",
      "type": "photo",
      "filename": "media_0.jpg"
    },
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2001723652635541566/media_1.jpg?",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2001723652635541566/media_1.jpg?",
      "type": "photo",
      "filename": "media_1.jpg"
    },
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2001723652635541566/media_2.jpg?",
      "media_url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/2001723652635541566/media_2.jpg?",
      "type": "photo",
      "filename": "media_2.jpg"
    }
  ],
  "processed_at": "2025-12-18T20:55:25.153520",
  "pipeline_version": "2.0"
}
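The media-metadata block above follows a simple shape: a `media` list whose entries carry `url`, `type`, and `filename` fields. A minimal sketch of iterating over it with the standard library (the inline JSON below is a toy stand-in with hypothetical example URLs, mirroring the field names printed above):

```python
import json

# Toy metadata mirroring the structure shown above; the URLs here are
# hypothetical placeholders, not the real Supabase storage links.
metadata = json.loads("""
{
  "media": [
    {"url": "https://example.com/media_0.jpg", "type": "photo", "filename": "media_0.jpg"},
    {"url": "https://example.com/media_1.jpg", "type": "photo", "filename": "media_1.jpg"},
    {"url": "https://example.com/media_2.jpg", "type": "photo", "filename": "media_2.jpg"}
  ],
  "pipeline_version": "2.0"
}
""")

# Keep only photo entries and list filename -> URL pairs.
photos = [m for m in metadata["media"] if m["type"] == "photo"]
for m in photos:
    print(f'{m["filename"]} -> {m["url"]}')
```

From here, each `url` could be fetched and saved under its `filename`; the actual download step is omitted since it depends on network access to the storage bucket.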

🔧 Raw API Response

{
  "type": "tweet",
  "id": "2001723652635541566",
  "url": "https://x.com/osanseviero/status/2001723652635541566",
  "twitterUrl": "https://twitter.com/osanseviero/status/2001723652635541566",
  "text": "Introducing T5Gemma 2, the next generation of encoder-decoder models 🚀\n\nBuilt on top of Gemma 3, we were able to build compact  models at sizes of 270m-270m, 1B-1B, and 4B-4B sizes. \n\nWhile most models today are decoder-only, T5Gemma 2 is the first (I'm aware of) multimodal, long-context, and heavily multilingual (140 languages) encoder-decoder model out there. \n\nWe hope this model enables the model research community as well as the community of devs ready to explore with new architectures.\n\nBlog: https://t.co/12ScxYcjxa\nModels: https://t.co/D38wNFo5Bc\nPaper: https://t.co/2rypSQ7Bf6",
  "source": "Twitter for iPhone",
  "retweetCount": 50,
  "replyCount": 19,
  "likeCount": 492,
  "quoteCount": 10,
  "viewCount": 26932,
  "createdAt": "Thu Dec 18 18:38:03 +0000 2025",
  "lang": "en",
  "bookmarkCount": 169,
  "isReply": false,
  "inReplyToId": null,
  "conversationId": "2001723652635541566",
  "displayTextRange": [
    0,
    276
  ],
  "inReplyToUserId": null,
  "inReplyToUsername": null,
  "author": {
    "type": "user",
    "userName": "osanseviero",
    "url": "https://x.com/osanseviero",
    "twitterUrl": "https://twitter.com/osanseviero",
    "id": "207744565",
    "name": "Omar Sanseviero",
    "isVerified": false,
    "isBlueVerified": true,
    "verifiedType": null,
    "profilePicture": "https://pbs.twimg.com/profile_images/1676696716693700608/t4kv-MrC_normal.jpg",
    "coverPicture": "https://pbs.twimg.com/profile_banners/207744565/1559310065",
    "description": "Developer Experience Lead at @GoogleDeepMind\n\nBuilding Gemini API, Gemma, AI Studio and more AI products. My views\n\nex-Chief Llama Officer @huggingface 🇵🇪🇲🇽",
    "location": "Zurich",
    "followers": 58719,
    "following": 2692,
    "status": "",
    "canDm": true,
    "canMediaTag": false,
    "createdAt": "Mon Oct 25 23:29:03 +0000 2010",
    "entities": {
      "description": {
        "urls": []
      },
      "url": {
        "urls": [
          {
            "display_url": "osanseviero.github.io/hackerllama/",
            "expanded_url": "https://osanseviero.github.io/hackerllama/",
            "url": "https://t.co/3LrSRqlhYK",
            "indices": [
              0,
              23
            ]
          }
        ]
      }
    },
    "fastFollowersCount": 0,
    "favouritesCount": 25610,
    "hasCustomTimelines": false,
    "isTranslator": false,
    "mediaCount": 1403,
    "statusesCount": 10891,
    "withheldInCountries": [],
    "affiliatesHighlightedLabel": {
      "label": {
        "url": {
          "url": "https://twitter.com/GoogleAIStudio",
          "urlType": "DeepLink"
        },
        "badge": {
          "url": "https://pbs.twimg.com/profile_images/1957558782067896323/6jXpPKD4_bigger.png"
        },
        "description": "Google AI Studio",
        "userLabelType": "BusinessLabel",
        "userLabelDisplayType": "Badge"
      }
    },
    "possiblySensitive": false,
    "pinnedTweetIds": [],
    "profile_bio": {
      "description": "Developer Experience Lead at @GoogleDeepMind\n\nBuilding Gemini API, Gemma, AI Studio and more AI products. My views\n\nex-Chief Llama Officer @huggingface 🇵🇪🇲🇽"
    },
    "isAutomated": false,
    "automatedBy": null
  },
  "extendedEntities": {
    "media": [
      {
        "display_url": "pic.x.com/gnxk9p8me6",
        "expanded_url": "https://x.com/osanseviero/status/2001723652635541566/photo/1",
        "id_str": "2001722131109863424",
        "indices": [
          277,
          300
        ],
        "media_key": "3_2001722131109863424",
        "media_url_https": "https://pbs.twimg.com/media/G8eLrDPXYAAnXmm.jpg",
        "type": "photo",
        "url": "https://t.co/gnxk9p8me6",
        "ext_media_availability": {
          "status": "Available"
        },
        "features": {
          "large": {
            "faces": []
          },
          "medium": {
            "faces": []
          },
          "small": {
            "faces": []
          },
          "orig": {
            "faces": []
          }
        },
        "sizes": {
          "large": {
            "h": 607,
            "w": 1080,
            "resize": "fit"
          },
          "medium": {
            "h": 607,
            "w": 1080,
            "resize": "fit"
          },
          "small": {
            "h": 382,
            "w": 680,
            "resize": "fit"
          },
          "thumb": {
            "h": 150,
            "w": 150,
            "resize": "crop"
          }
        },
        "original_info": {
          "height": 607,
          "width": 1080,
          "focus_rects": [
            {
              "x": 0,
              "y": 0,
              "w": 1080,
              "h": 605
            },
            {
              "x": 263,
              "y": 0,
              "w": 607,
              "h": 607
            },
            {
              "x": 300,
              "y": 0,
              "w": 532,
              "h": 607
            },
            {
              "x": 414,
              "y": 0,
              "w": 304,
              "h": 607
            },
            {
              "x": 0,
              "y": 0,
              "w": 1080,
              "h": 607
            }
          ]
        },
        "media_results": {
          "result": {
            "media_key": "3_2001722131109863424"
          }
        }
      }
    ]
  },
  "card": null,
  "place": {},
  "entities": {
    "hashtags": [],
    "symbols": [],
    "urls": [
      {
        "display_url": "blog.google/technology/dev…",
        "expanded_url": "https://blog.google/technology/developers/t5gemma-2",
        "url": "https://t.co/12ScxYcjxa",
        "indices": [
          503,
          526
        ]
      },
      {
        "display_url": "huggingface.co/collections/go…",
        "expanded_url": "https://huggingface.co/collections/google/t5gemma-2",
        "url": "https://t.co/D38wNFo5Bc",
        "indices": [
          535,
          558
        ]
      },
      {
        "display_url": "arxiv.org/abs/2512.14856",
        "expanded_url": "https://arxiv.org/abs/2512.14856",
        "url": "https://t.co/2rypSQ7Bf6",
        "indices": [
          566,
          589
        ]
      }
    ],
    "user_mentions": []
  },
  "quoted_tweet": null,
  "retweeted_tweet": null,
  "article": null
}
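In the raw response above, `entities.urls` pairs each shortened `t.co` link in `text` with its `expanded_url` (the blog, Hugging Face collection, and arXiv links). A minimal sketch of substituting the expanded links back into the tweet text, assuming the response has been loaded as a Python dict; `expand_urls` is a hypothetical helper, and plain `str.replace` is used here instead of the `indices` offsets to keep the sketch simple:

```python
def expand_urls(tweet: dict) -> str:
    """Replace t.co short links in the tweet text with their expanded URLs,
    using the "url" / "expanded_url" pairs from entities.urls."""
    text = tweet["text"]
    for u in tweet.get("entities", {}).get("urls", []):
        text = text.replace(u["url"], u["expanded_url"])
    return text

# Toy example mirroring the relevant slice of the response above.
tweet = {
    "text": "Blog: https://t.co/12ScxYcjxa",
    "entities": {
        "urls": [
            {
                "url": "https://t.co/12ScxYcjxa",
                "expanded_url": "https://blog.google/technology/developers/t5gemma-2",
            }
        ]
    },
}
```

For exact reconstruction (e.g. when two short links share a prefix), the `indices` code-point offsets in each entry would be the safer way to splice, working right-to-left so earlier offsets stay valid.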