🐦 Twitter Post Details


@dair_ai

Very few are talking about proactive agents, but they are coming!

Current LLM agents wait for you to ask for help. But the best assistant anticipates what you need before you ask.

Existing agents follow a reactive paradigm. Users must unlock their phone, navigate to an app, and issue explicit instructions. During a conversation about travel plans, you have to manually ask for weather updates. While shopping, you have to explicitly request price comparisons.

This new research introduces ProAgent, an end-to-end proactive agent system that continuously perceives your environment through wearable sensors and delivers assistance before you ask.

The key idea: instead of waiting for commands, ProAgent uses egocentric video, audio, motion, and location data from AR glasses and smartphones to anticipate user needs. An on-demand tiered perception system keeps low-cost sensors always on while activating high-cost vision only when patterns suggest assistance opportunities.

When you're at a bus stop, ProAgent notices the last bus just left and offers to book an Uber. During a conversation about weekend plans, it proactively checks the weather and your calendar for conflicts. While browsing headphones in a store, it finds lower prices online and gathers reviews.

Results across real-world testing with 20 participants: ProAgent achieves 33.4% higher proactive prediction accuracy, 16.8% higher tool-calling F1 score, and 1.79x lower memory usage compared to baselines. User studies show 38.9% higher satisfaction across five dimensions of proactive services.

The system runs on edge devices like NVIDIA Jetson Orin with 4.5-second average latency, keeping all data local for privacy.

Shifting from reactive to proactive agents reduces both physical and cognitive workload. You stop missing timely information during conversations and attention-intensive tasks.

Paper: https://t.co/3zFVP5igxe
Learn to build effective agents in our academy: https://t.co/zQXQt0PMbG
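The tiered perception idea can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: every function name (`read_low_cost`, `suggests_assistance`, `run_vision`) and the threshold value are assumptions, and the sensor readings are stubbed with random numbers. The point it demonstrates is the control flow: cheap sensors are sampled on every step, while the expensive vision path runs only when a cheap signal crosses a trigger threshold.

```python
import random

# Always-on, low-cost modalities (names illustrative, per the post).
LOW_COST_SENSORS = ("audio", "motion", "location")

def read_low_cost(rng):
    """Cheap sensor readings, stubbed here with random values in [0, 1)."""
    return {name: rng.random() for name in LOW_COST_SENSORS}

def suggests_assistance(readings, threshold=0.8):
    """Hypothetical trigger heuristic: any cheap signal spiking past a threshold."""
    return any(value > threshold for value in readings.values())

def run_vision():
    """Stand-in for the high-cost egocentric-video pipeline."""
    return "vision_frame_analyzed"

def perception_step(rng, stats):
    """One tick: always sample cheap sensors; gate the expensive path."""
    readings = read_low_cost(rng)
    stats["low_cost_reads"] += 1
    if suggests_assistance(readings):
        stats["vision_activations"] += 1
        return run_vision()
    return None

stats = {"low_cost_reads": 0, "vision_activations": 0}
rng = random.Random(0)  # seeded for reproducibility
for _ in range(100):
    perception_step(rng, stats)
```

Under these stub assumptions, vision fires on only a fraction of the 100 steps, which is the whole point of the tiering: the expensive model's duty cycle is bounded by how often the cheap signals look interesting, not by the frame rate.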

Media 1
Media 2

📊 Media Metadata

{
  "media": [
    {
      "type": "photo",
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/1998775732001190018/media_0.jpg?",
      "filename": "media_0.jpg"
    },
    {
      "type": "photo",
      "url": "https://crmoxkoizveukayfjuyo.supabase.co/storage/v1/object/public/media/posts/1998775732001190018/media_1.png?",
      "filename": "media_1.png"
    }
  ],
  "processed_at": "2025-12-10T16:36:15.200686",
  "pipeline_version": "2.0"
}

🔧 Raw API Response

{
  "type": "tweet",
  "id": "1998775732001190018",
  "url": "https://x.com/dair_ai/status/1998775732001190018",
  "twitterUrl": "https://twitter.com/dair_ai/status/1998775732001190018",
  "text": "Very few are talking about proactive agents, but they are coming!\n\nCurrent LLM agents wait for you to ask for help. But the best assistant anticipates what you need before you ask.\n\nExisting agents follow a reactive paradigm. Users must unlock their phone, navigate to an app, and issue explicit instructions. During a conversation about travel plans, you have to manually ask for weather updates. While shopping, you have to explicitly request price comparisons.\n\nThis new research introduces ProAgent, an end-to-end proactive agent system that continuously perceives your environment through wearable sensors and delivers assistance before you ask.\n\nThe key idea: instead of waiting for commands, ProAgent uses egocentric video, audio, motion, and location data from AR glasses and smartphones to anticipate user needs. An on-demand tiered perception system keeps low-cost sensors always on while activating high-cost vision only when patterns suggest assistance opportunities.\n\nWhen you're at a bus stop, ProAgent notices the last bus just left and offers to book an Uber. During a conversation about weekend plans, it proactively checks the weather and your calendar for conflicts. While browsing headphones in a store, it finds lower prices online and gathers reviews.\n\nResults across real-world testing with 20 participants: ProAgent achieves 33.4% higher proactive prediction accuracy, 16.8% higher tool-calling F1 score, and 1.79x lower memory usage compared to baselines. User studies show 38.9% higher satisfaction across five dimensions of proactive services.\n\nThe system runs on edge devices like NVIDIA Jetson Orin with 4.5-second average latency, keeping all data local for privacy.\n\nShifting from reactive to proactive agents reduces both physical and cognitive workload. You stop missing timely information during conversations and attention-intensive tasks.\n\nPaper: https://t.co/3zFVP5igxe\nLearn to build effective agents in our academy: https://t.co/zQXQt0PMbG",
  "source": "Twitter for iPhone",
  "retweetCount": 11,
  "replyCount": 1,
  "likeCount": 43,
  "quoteCount": 0,
  "viewCount": 3271,
  "createdAt": "Wed Dec 10 15:24:04 +0000 2025",
  "lang": "en",
  "bookmarkCount": 31,
  "isReply": false,
  "inReplyToId": null,
  "conversationId": "1998775732001190018",
  "displayTextRange": [
    0,
    281
  ],
  "inReplyToUserId": null,
  "inReplyToUsername": null,
  "author": {
    "type": "user",
    "userName": "dair_ai",
    "url": "https://x.com/dair_ai",
    "twitterUrl": "https://twitter.com/dair_ai",
    "id": "889050642903293953",
    "name": "DAIR.AI",
    "isVerified": false,
    "isBlueVerified": true,
    "verifiedType": null,
    "profilePicture": "https://pbs.twimg.com/profile_images/1643277398522187778/31dedbLo_normal.jpg",
    "coverPicture": "https://pbs.twimg.com/profile_banners/889050642903293953/1742055232",
    "description": "Democratizing AI research, education, and technologies.",
    "location": "",
    "followers": 82942,
    "following": 1,
    "status": "",
    "canDm": true,
    "canMediaTag": true,
    "createdAt": "Sun Jul 23 09:12:45 +0000 2017",
    "entities": {
      "description": {
        "urls": []
      },
      "url": {
        "urls": [
          {
            "display_url": "dair.ai",
            "expanded_url": "https://www.dair.ai/",
            "url": "https://t.co/lkqPZtMmfU",
            "indices": [
              0,
              23
            ]
          }
        ]
      }
    },
    "fastFollowersCount": 0,
    "favouritesCount": 3868,
    "hasCustomTimelines": true,
    "isTranslator": false,
    "mediaCount": 83,
    "statusesCount": 2634,
    "withheldInCountries": [],
    "affiliatesHighlightedLabel": {},
    "possiblySensitive": false,
    "pinnedTweetIds": [
      "1998775732001190018"
    ],
    "profile_bio": {},
    "isAutomated": false,
    "automatedBy": null
  },
  "extendedEntities": {
    "media": [
      {
        "display_url": "pic.x.com/JSat2DARCT",
        "expanded_url": "https://x.com/dair_ai/status/1998775732001190018/photo/1",
        "id_str": "1998775727584604160",
        "indices": [
          282,
          305
        ],
        "media_key": "3_1998775727584604160",
        "media_url_https": "https://pbs.twimg.com/media/G70T7yxacAAd7_2.jpg",
        "type": "photo",
        "url": "https://t.co/JSat2DARCT",
        "ext_media_availability": {
          "status": "Available"
        },
        "features": {
          "large": {
            "faces": []
          },
          "medium": {
            "faces": []
          },
          "small": {
            "faces": []
          },
          "orig": {
            "faces": []
          }
        },
        "sizes": {
          "large": {
            "h": 1794,
            "w": 1612,
            "resize": "fit"
          },
          "medium": {
            "h": 1200,
            "w": 1078,
            "resize": "fit"
          },
          "small": {
            "h": 680,
            "w": 611,
            "resize": "fit"
          },
          "thumb": {
            "h": 150,
            "w": 150,
            "resize": "crop"
          }
        },
        "original_info": {
          "height": 1794,
          "width": 1612,
          "focus_rects": [
            {
              "x": 0,
              "y": 0,
              "w": 1612,
              "h": 903
            },
            {
              "x": 0,
              "y": 0,
              "w": 1612,
              "h": 1612
            },
            {
              "x": 0,
              "y": 0,
              "w": 1574,
              "h": 1794
            },
            {
              "x": 0,
              "y": 0,
              "w": 897,
              "h": 1794
            },
            {
              "x": 0,
              "y": 0,
              "w": 1612,
              "h": 1794
            }
          ]
        },
        "media_results": {
          "result": {
            "media_key": "3_1998775727584604160"
          }
        }
      }
    ]
  },
  "card": null,
  "place": {},
  "entities": {
    "hashtags": [],
    "symbols": [],
    "urls": [
      {
        "display_url": "arxiv.org/abs/2512.06721",
        "expanded_url": "https://arxiv.org/abs/2512.06721",
        "url": "https://t.co/3zFVP5igxe",
        "indices": [
          1883,
          1906
        ]
      },
      {
        "display_url": "dair-ai.thinkific.com",
        "expanded_url": "https://dair-ai.thinkific.com/",
        "url": "https://t.co/zQXQt0PMbG",
        "indices": [
          1955,
          1978
        ]
      }
    ],
    "user_mentions": []
  },
  "quoted_tweet": null,
  "retweeted_tweet": null,
  "article": null
}