🐦 Twitter Post Details

@PetarV_93

Round and Round we Go! 🔄 Rotary Positional Encodings (RoPE) are a common staple of frontier LLMs. _Why_ do they work so well, and _how_ do LLMs make advantage of them? The results might surprise you, as they challenge commonly-held wisdom! Read on ↩️ Work led by @fedzbar! https://t.co/C61UvK5zOb
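For context on the technique the tweet names: RoPE rotates each consecutive pair of query/key features by an angle proportional to the token's position, so that attention dot products depend only on the *relative* distance between tokens. A minimal NumPy sketch of the standard construction (using the common "rotate-half" pairing convention; this is an illustration, not code from the linked paper):

```python
import numpy as np

def rope(x, positions, base=10000.0):
    """Apply rotary positional encoding to x of shape (seq, dim).

    Feature i in the first half is paired with feature i in the second
    half; each pair is rotated by angle position * base**(-i/half),
    i.e. a frequency that decays with the pair index.
    """
    seq, dim = x.shape
    half = dim // 2
    freqs = base ** (-np.arange(half) / half)     # (half,) per-pair frequencies
    angles = positions[:, None] * freqs[None, :]  # (seq, half) rotation angles
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    # 2-D rotation applied independently to each (x1[i], x2[i]) pair
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=-1)

# Key property: the q.k dot product is invariant to shifting both
# positions by the same offset (only relative position matters).
rng = np.random.default_rng(0)
q = rng.normal(size=(1, 8))
k = q.copy()
a = rope(np.vstack([q, k]), np.array([3, 5]))    # positions 3 and 5
b = rope(np.vstack([q, k]), np.array([10, 12]))  # both shifted by +7
# a[0] @ a[1] equals b[0] @ b[1] (up to float error)
```

Because each pair is rotated by an orthogonal 2-D rotation, vector norms are preserved and the score between a query at position m and a key at position n depends only on n − m, which is the property the tweet's thread interrogates.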

Media 1

📊 Media Metadata

{
  "media": [
    {
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/1844319204062933165/media_0.png",
      "type": "photo",
      "original_url": "https://pbs.twimg.com/media/GZhTIuSXkBAKluu.png",
      "recovered_from_supabase": true
    }
  ],
  "conversion_date": "2025-08-13T00:27:38.660737",
  "format_converted": true,
  "original_structure": "had_media_only"
}

🔧 Raw API Response

{
  "user": {
    "created_at": "2013-01-08T18:29:17.000Z",
    "default_profile_image": false,
    "description": "Senior Staff Research Scientist @GoogleDeepMind | Affiliated Lecturer @Cambridge_Uni | Assoc @clarehall_cam | GDL Scholar @ELLISforEurope. Monoids. πŸ‡·πŸ‡ΈπŸ‡²πŸ‡ͺπŸ‡§πŸ‡¦",
    "fast_followers_count": 0,
    "favourites_count": 2929,
    "followers_count": 35565,
    "friends_count": 553,
    "has_custom_timelines": false,
    "is_translator": false,
    "listed_count": 372,
    "location": "London πŸ‡¬πŸ‡§",
    "media_count": 432,
    "name": "Petar VeličkoviΔ‡",
    "normal_followers_count": 35565,
    "possibly_sensitive": false,
    "profile_banner_url": "https://pbs.twimg.com/profile_banners/1071640880/1670548783",
    "profile_image_url_https": "https://pbs.twimg.com/profile_images/1843767750130298880/J1cl0hF0_normal.jpg",
    "screen_name": "PetarV_93",
    "statuses_count": 2515,
    "translator_type": "none",
    "url": "https://t.co/PAsT7CMhth",
    "verified": false,
    "withheld_in_countries": [],
    "id_str": "1071640880"
  },
  "id": "1844319204062933165",
  "conversation_id": "1844319204062933165",
  "full_text": "Round and Round we Go! πŸ”„\n\nRotary Positional Encodings (RoPE) are a common staple of frontier LLMs.\n\n_Why_ do they work so well, and _how_ do LLMs make advantage of them?\n\nThe results might surprise you, as they challenge commonly-held wisdom! Read on ↩️\n\nWork led by @fedzbar! https://t.co/C61UvK5zOb",
  "reply_count": 9,
  "retweet_count": 109,
  "favorite_count": 616,
  "hashtags": [],
  "symbols": [],
  "user_mentions": [
    {
      "id_str": "1073302912854564870",
      "name": "Federico Barbero",
      "screen_name": "fedzbar",
      "profile": "https://twitter.com/fedzbar"
    }
  ],
  "urls": [],
  "media": [
    {
      "media_url": "https://pbs.twimg.com/media/GZhTIuSXkBAKluu.png",
      "type": "photo"
    }
  ],
  "url": "https://twitter.com/PetarV_93/status/1844319204062933165",
  "created_at": "2024-10-10T10:08:58.000Z",
  "#sort_index": "1844319204062933165",
  "view_count": 76192,
  "quote_count": 6,
  "is_quote_tweet": false,
  "is_retweet": false,
  "is_pinned": false,
  "is_truncated": false,
  "startUrl": "https://x.com/petarv_93/status/1844319204062933165"
}