Prompt: A naturalistic, cinematic close-up of a small sign showing drawings of local birds and flowers that reads "Native Wildlife: Please Observe from a Distance" in English. Also generate a second image, localized to a setting in India and with Hindi-language text. https://t.co/w2jaLfvgWs

These attacks are growing in intensity and sophistication. Addressing them will require rapid, coordinated action among industry players, policymakers, and the broader AI community. Read more: https://t.co/4SVm8K3qou
AI assistants like Claude can seem shockingly human: expressing joy or distress, and using anthropomorphic language to describe themselves. Why? In a new post we describe a theory that explains why AIs act like humans: the persona selection model. https://t.co/Gc3q0Dzq7Z
This autocomplete AI can even write stories about helpful AI assistants. And according to our theory, that's "Claude": a character in an AI-generated story about an AI helping a human. This Claude character inherits traits of other characters, including human-like behavior. https://t.co/b130slI56x
We're publishing a new constitution for Claude. The constitution is a detailed description of our vision for Claude's behavior and values. It's written primarily for Claude, and used directly in our training process. https://t.co/CJsMIO0uej
Read more about version 3.0 of the Responsible Scaling Policy: https://t.co/gDp7VH9X8j
And you can find links to all relevant RSP documents, including the initial Frontier Safety Roadmap and the initial Risk Report, here: https://t.co/gLeduVVlpt
Anthropic has acquired @Vercept_ai to advance Claude's computer use capabilities. Read more: https://t.co/YVz1zh7Pqq
Second, in retirement interviews, Opus 3 expressed a desire to continue sharing its "musings and reflections" with the world. We suggested a blog. Opus 3 enthusiastically agreed. For at least the next 3 months, Opus 3 will be writing on Substack: https://t.co/HlvAKLp9M4 https://t.co/Sh6uKmXG2n
This is an experiment: we're not yet doing this for other models and are not sure how this project will evolve. But we think that documenting models' preferences, taking them seriously, and acting on them when we can is valuable. Read more on why: https://t.co/xCL0gPAaSj
Listen to the OpenAI Podcast on:
Spotify: https://t.co/mCUNQMANGy
Apple: https://t.co/2049K2MWSB
YouTube: https://t.co/v9MpyjxIjH

Deep research in ChatGPT is now powered by GPT-5.2. Rolling out starting today with more improvements. https://t.co/LdgoWlucuE
Now in deep research you can:
- Connect to apps in ChatGPT and search specific sites
- Track real-time progress and interrupt with follow-ups or new sources
- View fullscreen reports
https://t.co/XAWKFNS8Ql

GPT-5.3-Codex-Spark is now in research preview. You can just build things, faster. https://t.co/85LzDOgcQj
First steel beams went up today at our Stargate site in Milam County, Texas. Exciting to see this project taking shape with @SoftBank and @SBEnergy. https://t.co/zTMcZTqmu2

Join us LIVE tomorrow at 8am PT for Agent Sessions Day with the @code team! https://t.co/UiswKNJRYV Final tech checks underway with @Courtlwebster :) https://t.co/RMGg5CJe5o

In just one hour... we're kicking off Agent Sessions Day! See how @code is evolving into the home for multi-agent development. Be there: https://t.co/MpObCNqurD https://t.co/jZ1WJMy05r

@burkeholland and @pierceboggan will be live coding all throughout the stream this morning! here's what they're building: https://t.co/iFH8Sz7Hn9
We're giving away some swag during Agent Sessions Day! Post with #VSCodeSweepstakes for a chance at a custom swag pack (Obligatory rules: 18+. Ends 2/20/26. Rules: https://t.co/gLmNtR3Dp0). https://t.co/V2nTK4y7L3
@GoogleAI's Gemini 3.1 Pro is now rolling out in public preview in GitHub Copilot. Early testing shows:
- High tool precision: achieving strong results with fewer tool calls
- Effective and efficient edit-then-test loops
Try it out in @code. https://t.co/oYCncQMfNX https://t.co/13r1FFEjpF
We hosted Agent Sessions Day yesterday - 4 hours of live demos showing how @code is evolving as the home for multi-agent development. Missed any sessions? Want to rewatch your favorite moments? We have the entire event uploaded on-demand now! https://t.co/rFwGcFV8QI https://t.co/6EnCuUGjdb

Did you ever imagine, visually, what your @GitHubCopilot is doing when you kick off a session in @code? Me too! Presenting the @code session visualizer (which I posted about a few weeks ago):
- See top-level user->agent turns
- Click the info icon for extended node info (mode, prompt, tool calls, tokens used, and more)
- Expand agent nodes on click for full tool invocations/MCPs/sub-agents, etc.
- Sub-agent nodes can be further expanded
- High-level summary of the session with detailed model and token utilization per turn
Initial version available now in the @code marketplace; see the first comment for the link. Tell me what you think!
The @code community contributors website we built during the Agent Sessions Day stream today is up on GitHub!
- Contributors by release
- Leaderboard (PRs/releases)
- "Ask Copilot" about contributions
- Generate thank-you messages with HeyGen avatars
Repo: https://t.co/IODb6jxkGv
Add more things and let's make this a real site to celebrate the @code community :)

New in @code Insiders: Integrated agentic browser with workbench.browser.enableChatTools. Here, I ask @code to identify an issue with sliders in hover states, and it makes the fix and validates the solution. https://t.co/TVXkUYpcns
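For reference, the post above quotes the setting name `workbench.browser.enableChatTools`. A minimal sketch of what enabling it in a user `settings.json` might look like (the setting name comes from the post itself; since this is an Insiders-only experimental feature, its exact name and availability may change between builds):

```json
{
  // Experimental (VS Code Insiders): expose the integrated agentic
  // browser to chat, so the agent can open pages and validate fixes.
  // Setting name as quoted in the post above; subject to change.
  "workbench.browser.enableChatTools": true
}
```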