🐦 Twitter Post Details

@perplexity_ai

Code LLaMA is now on Perplexity’s LLaMa Chat!

Try asking it to write a function for you, or explain a code snippet: 🔗 https://t.co/rwcPzknBgE

This is the fastest way to try @MetaAI’s latest code-specialized LLM. With our model deployment expertise, we are able to provide you with this model less than 24 hours of it’s release.

What’s next?
We’ll integrate code LLaMA into Perplexity, all in service of providing you with the best answers to your most technical questions!

📊 Media Metadata

{
  "links_checked": true,
  "checked_at": "2025-08-10T10:33:08.784528",
  "media": [
    {
      "type": "video",
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/1694845231936557437/media_0.mp4?",
      "filename": "media_0.mp4"
    }
  ],
  "reprocessed_at": "2025-08-12T15:28:01.621176",
  "reprocessed_reason": "missing_media_array"
}

🔧 Raw API Response

{
  "user": {
    "created_at": "2022-12-05T02:12:13.000Z",
    "default_profile_image": false,
    "description": "Your Research Assistant. Available wherever you are: 🤖 https://t.co/hehYkpI3k5 // 📱 https://t.co/wMsGq0AOdT",
    "fast_followers_count": 0,
    "favourites_count": 562,
    "followers_count": 55875,
    "friends_count": 13,
    "has_custom_timelines": false,
    "is_translator": false,
    "listed_count": 958,
    "location": "San Francisco, CA",
    "media_count": 106,
    "name": "Perplexity",
    "normal_followers_count": 55875,
    "possibly_sensitive": false,
    "profile_banner_url": "https://pbs.twimg.com/profile_banners/1599587232175849472/1692212928",
    "profile_image_url_https": "https://pbs.twimg.com/profile_images/1691929847784980491/zeYyB1xu_normal.jpg",
    "screen_name": "perplexity_ai",
    "statuses_count": 205,
    "translator_type": "none",
    "url": "https://t.co/xJkINlk2Bd",
    "verified": false,
    "verified_type": "Business",
    "withheld_in_countries": [],
    "id_str": "1599587232175849472"
  },
  "id": "1694845231936557437",
  "conversation_id": "1694845231936557437",
  "full_text": "Code LLaMA is now on Perplexity’s LLaMa Chat!\n\nTry asking it to write a function for you, or explain a code snippet: 🔗 https://t.co/rwcPzknBgE\n\nThis is the fastest way to try @MetaAI’s latest code-specialized LLM. With our model deployment expertise, we are able to provide you with this model less than 24 hours of it’s release.\n\nWhat’s next? \nWe’ll integrate code LLaMA into Perplexity, all in service of providing you with the best answers to your most technical questions!",
  "reply_count": 26,
  "retweet_count": 153,
  "favorite_count": 682,
  "hashtags": [],
  "symbols": [],
  "user_mentions": [
    {
      "id_str": "1034844617261248512",
      "name": "Meta AI",
      "screen_name": "MetaAI",
      "profile": "https://twitter.com/MetaAI"
    }
  ],
  "urls": [
    {
      "url": "https://t.co/gyiDw6u6IJ",
      "expanded_url": "https://labs.pplx.ai/code-llama",
      "display_url": "labs.pplx.ai/code-llama"
    }
  ],
  "media": [
    {
      "media_url": "https://pbs.twimg.com/ext_tw_video_thumb/1694845069113659393/pu/img/S8hJCckTT5H5DNpb.jpg",
      "type": "video",
      "video_url": "https://video.twimg.com/ext_tw_video/1694845069113659393/pu/vid/784x720/MyawxO05fAXBM0y2.mp4?tag=12"
    }
  ],
  "url": "https://twitter.com/perplexity_ai/status/1694845231936557437",
  "created_at": "2023-08-24T22:52:46.000Z",
  "#sort_index": "1694845231936557437",
  "view_count": 250230,
  "quote_count": 32,
  "is_quote_tweet": false,
  "is_retweet": false,
  "is_pinned": false,
  "is_truncated": true,
  "startUrl": "https://twitter.com/perplexity_ai/status/1694845231936557437"
}