🐦 Twitter Post Details

@LightningAI

Looking for a small but very capable LLM to avoid out-of-memory errors?

Lit-GPT now supports phi-1.5, a SOTA 1.3B parameter LLM!
⚡️ Finetune it on a single GPU
⚡️ Pretrain it on a cluster
⚡️ Just use it on a MacBook

More info: https://t.co/Yk8dKmpedk

#LLMs #MachineLearning #DeepLearning

Media 1

📊 Media Metadata

{
  "media": [
    {
      "type": "photo",
      "url": "https://pbs.twimg.com/media/F64CuX5XUAE00D0.jpg",
      "media_url": "https://pbs.twimg.com/media/F64CuX5XUAE00D0.jpg",
      "filename": "media_0.jpg"
    }
  ],
  "conversion_date": "2025-08-13T00:34:30.922763",
  "format_converted": true,
  "original_structure": "had_media_only",
  "enhanced_from_raw_response": true,
  "enhanced_at": "2025-08-13T17:20:00Z"
}

🔧 Raw API Response

{
  "user": {
    "created_at": "2019-08-02T13:33:42.000Z",
    "default_profile_image": false,
    "description": "Train, deploy and build AI with PyTorch - Lightning fast ⚡️. Creators of PyTorch Lightning, TorchMetrics, Fabric, Lit-GPT, Lit-LLaMA 🚀",
    "fast_followers_count": 0,
    "favourites_count": 2320,
    "followers_count": 32724,
    "friends_count": 283,
    "has_custom_timelines": true,
    "is_translator": false,
    "listed_count": 815,
    "location": "New York, NY",
    "media_count": 728,
    "name": "Lightning AI ⚡️",
    "normal_followers_count": 32724,
    "possibly_sensitive": false,
    "profile_banner_url": "https://pbs.twimg.com/profile_banners/1157283331509235713/1685633294",
    "profile_image_url_https": "https://pbs.twimg.com/profile_images/1683914901784231936/r1qSgdcS_normal.jpg",
    "screen_name": "LightningAI",
    "statuses_count": 2230,
    "translator_type": "none",
    "url": "https://t.co/hXSE3Zh1Vf",
    "verified": false,
    "withheld_in_countries": [],
    "id_str": "1157283331509235713"
  },
  "id": "1706305169602826295",
  "conversation_id": "1706305169602826295",
  "full_text": "Looking for a small but very capable LLM to avoid out-of-memory errors?\n\nLit-GPT now supports phi-1.5, a SOTA 1.3B parameter LLM!\n⚡️ Finetune it on a single GPU\n⚡️ Pretrain it on a cluster\n⚡️ Just use it on a MacBook\n\nMore info: https://t.co/Yk8dKmpedk\n\n#LLMs #MachineLearning #DeepLearning",
  "reply_count": 2,
  "retweet_count": 33,
  "favorite_count": 128,
  "hashtags": [
    "LLMs",
    "MachineLearning"
  ],
  "symbols": [],
  "user_mentions": [],
  "urls": [
    {
      "url": "https://t.co/gz4Ozu5RSi",
      "expanded_url": "https://github.com/Lightning-AI/lit-gpt/blob/main/tutorials/download_phi15.md",
      "display_url": "github.com/Lightning-AI/l…"
    }
  ],
  "media": [
    {
      "media_url": "https://pbs.twimg.com/media/F64CuX5XUAE00D0.jpg",
      "type": "photo"
    }
  ],
  "url": "https://twitter.com/LightningAI/status/1706305169602826295",
  "created_at": "2023-09-25T13:50:28.000Z",
  "#sort_index": "1706305169602826295",
  "view_count": 23332,
  "quote_count": 0,
  "is_quote_tweet": false,
  "is_retweet": false,
  "is_pinned": false,
  "is_truncated": true,
  "startUrl": "https://twitter.com/lightningai/status/1706305169602826295"
}