Your curated collection of saved posts and media
"I did not succeed in life by intelligence. I succeeded because I have a long attention span." - Charlie Munger https://t.co/B9eXSAX9jJ
Ben Horowitz: "Was Steve Jobs so mean to people because that just goes with being so brilliant? Or because he could get away with it, because he was so brilliant?" "I think it's a little bit more the latter. He didn't have to do that." "The problem with founders modeling themselves after Jobs... Well, if you're not Jobs or Elon, maybe you can't get away with that." "It's very hard to have the highest performing talent without them coming with some of those behaviors." @bhorowitz with @bhalligan
Charlie Munger's last interview before his passing at age 99: "There are things you give up with time." Wise words from a wise old man. https://t.co/30BzQshwtW
A decade ago, a young intern helped build the AI that stunned the world. Chris Maddison worked on the Go-playing system that became AlphaGo and defeated champion Lee Sedol. The moment didn't just change a game, it signaled the arrival of a new era for AI. https://t.co/owpnku8A9L @newscientist
A corporate treasury transfer just happened at blockchain speed. Circle shifted $68 million between its own entities in about 30 minutes using its stablecoin infrastructure, replacing bank wires that usually take days. Programmable money is starting to change how companies move capital. https://t.co/92G5UcGUId @circle @sndr_krisztian @Aoyon_A @coindesk
The next wave of AI disruption may hit unexpected professions first. Peter Thiel suggests roles built around mathematics and quantitative analysis could be more exposed than language-heavy work. Banks are already hinting that advances in AI could mean leaner teams in finance. https://t.co/PdCYL0oCjj @fortunemagazine
The full video from Cortical Labs explaining how they put 200,000 brain cells onto a silicon chip and had it play Doom is wild: "When a demon appears on the left of the screen, specific electrodes stimulate the sensory area of the neural culture on the left side. The neurons react to that stimulation. We then listen to their response, the spikes, and interpret that activity as motor commands. If the neurons fire in a specific pattern, the Doom guy shoots."
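The quote describes a closed stimulate-read-decode loop. A minimal sketch of that loop, with all function and electrode names being illustrative placeholders (not Cortical Labs' actual API):

```python
import random

# Hypothetical electrode layout: sensory regions receive stimulation,
# a separate motor population is read out for spikes.
SENSORY_LEFT = range(0, 8)    # stimulated when an event appears on the left
SENSORY_RIGHT = range(8, 16)  # stimulated for events on the right
MOTOR = range(16, 24)         # electrodes whose spikes we decode

def stimulate(electrodes):
    """Placeholder: deliver a stimulation pulse to the given electrodes."""
    pass

def read_spikes():
    """Placeholder: return simulated spike counts per motor electrode."""
    return {e: random.randint(0, 5) for e in MOTOR}

def decode_action(spikes, threshold=4):
    """If the motor population fires in a strong enough pattern, shoot."""
    return "shoot" if max(spikes.values()) >= threshold else "idle"

def game_tick(demon_on_left):
    # 1. Encode the visual event as stimulation of the matching sensory side.
    stimulate(SENSORY_LEFT if demon_on_left else SENSORY_RIGHT)
    # 2. Listen to the neurons' response (the spikes).
    spikes = read_spikes()
    # 3. Interpret that activity as a motor command.
    return decode_action(spikes)

print(game_tick(demon_on_left=True) in ("shoot", "idle"))
```

The real system replaces the random placeholder with multi-electrode-array hardware, but the control flow per game tick is the same three steps the quote lists.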
Fetchquest is a must-have for VR devs and content creators. It's a better video and photo sync tool for the Meta Quest that solves most of the pain points of using Meta Quest Developer Hub and natively integrates with cloud storage and NAS. https://t.co/OSG5xJEKUp
Last week at #UpfrontSummit, had a great discussion with Clay Bavor at @SierraPlatform, @lsturdy1 at @CapitalG, and @shiringhaffary at @business talking about the next era of AI agents. This moment in time feels a lot like early mobile. Here's what I mean: https://t.co/ZpSVKWXw96
People underestimate how foundational some articles from Anthropic and OpenAI are. We just don't have time to read anything anymore. History has been made with things like these Agent Skills: https://t.co/QqMhoR0UJ2 Harness Engineering: https://t.co/2o9RTicSvD
Google AntiGravity is amazing. In less than 2.5 hours, I've built this Solar System Explorer playground with Gemini 3.1 Pro and 0 lines of code. The cherry on top: you can chat with the Gemini Live API as you're exploring the solar system and get fun facts about each planet and satellite in our system. Check it out here: https://t.co/2hkXx0ol0r
@HamelHusain https://t.co/hzrkSFskvL
New research from Yann LeCun and collaborators at NYU. It's a really good read for anyone working on efficient Transformer inference. The paper dissects two recurring phenomena in Transformer language models: massive activations (where a few tokens exhibit extreme outlier values) and attention sinks (where certain tokens attract disproportionate attention regardless of semantic relevance). They show the co-occurrence is largely an architectural artifact of pre-norm design, not a fundamental property. Massive activations function as implicit model parameters. Attention sinks modulate outputs locally. Why does it matter? These phenomena directly impact quantization, pruning, and KV-cache management. Understanding their root cause could enable better engineering decisions for efficient inference at scale. Paper: https://t.co/wfzeDpfu4x
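An attention sink is easy to see in a toy example: if the logits for one token carry a large content-independent bias, softmax concentrates attention mass on it from every query. The numbers below are illustrative, not taken from the paper:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
seq_len = 8
scores = rng.normal(size=(seq_len, seq_len))  # random query-key logits
scores[:, 0] += 5.0  # large bias toward token 0 mimics a sink token

attn = softmax(scores, axis=-1)
sink_mass = attn[:, 0].mean()  # average attention every query pays to token 0
print(sink_mass > 0.5)  # most attention mass lands on the sink
```

This is why the phenomenon matters for quantization and KV-cache management: a few outlier values dominate the dynamic range, and the sink token's KV entries can't be evicted without degrading every downstream query.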
Meet Vestaboard Note in White. A radiant new finish, to celebrate the ones who make every day brighter. Available to pre-order now: https://t.co/EnpLLvhQl1 https://t.co/f3IwfB5Zla

the $200/mo C-suite: https://t.co/gSOPG8rxBt
genuinely how it feels https://t.co/cFjD6o90Fp
SemiAnalysis founder @dylan522p said that their current annual Claude Code run rate is $5m. What is he cooking? https://t.co/WLrGK8kZmt
Saint Tibo: Giver of Tokens and Resetter of Limits https://t.co/vpi4BEQSjf
here you go @jxnlco, she went the extra mile and put together a demo of her workflow :') https://t.co/awXb4O7rbo
Thanks, AK @_akhaliq !!! We release the Gradio Demo and Code here: Code: https://t.co/F5K6iWzN7m Demo: https://t.co/z5LoWYkWOL
RealWonder: Real-Time Physical Action-Conditioned Video Generation. Paper: https://t.co/U8RM31zcVD https://t.co/GEMCJ14Yda