Your curated collection of saved posts and media

Showing 32 posts · last 14 days · by score
DerekJGrossman
@DerekJGrossman
Dec 14, 2025

“Pax Silica” is now a thing. It’s Trump admin’s attempt to establish a group of critical partners to maintain and expand global supply chains and develop AI. In the Indo-Pacific, here’s the list:
—Australia
—Japan
—Singapore
—South Korea
—Taiwan (guest)
https://t.co/cjQeromtlV

Media 1

dair_ai
@dair_ai
Dec 15, 2025

NEW Research from Meta Superintelligence Labs and collaborators.

The default approach to improving LLM reasoning today remains extending chain-of-thought sequences. But longer reasoning traces aren't always better: they conflate reasoning depth with sequence length and inherit long-context failure modes.

This new research introduces Parallel-Distill-Refine (PDR), a framework that treats LLMs as improvement operators rather than single-pass reasoners. Instead of one long reasoning chain, PDR operates in phases:
- Generate diverse drafts in parallel.
- Distill them into a bounded textual workspace.
- Refine conditioned on this workspace.
- Repeat.

Context length becomes controllable via the degree of parallelism, no longer conflated with total tokens generated. The model accumulates wisdom across rounds through compact summaries rather than replaying full histories.

On AIME 2024, PDR achieves 93.3% accuracy compared to 79.4% for standard long chain-of-thought at matched latency budgets. For o3-mini at 49k effective tokens, accuracy improves from 76.9% (long CoT) to 86.7% (PDR), a 9.8 percentage point gain. PDR also matches the accuracy of sequential refinement with a 2.57x smaller sequential budget, converting parallel compute into accuracy without lengthening per-call context.

The researchers also trained an 8B model with operator-consistent RL so that training matches the PDR inference interface. Mixing standard and operator RL yields an additional 5% improvement on both AIME benchmarks.

Bounded-memory iteration can substitute for long reasoning traces while holding latency fixed. Strategic parallelism and distillation are shown to beat brute-force sequence extension.

Paper: https://t.co/EviERpmTu7
Learn to build effective AI Agents in our academy: https://t.co/zQXQt0PMbG
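The phased loop described above can be sketched in a few lines. This is a minimal sketch, not the paper's implementation: `call_model` stands in for a real LLM API (here a toy stub so the control flow runs), and the draft/round counts are illustrative defaults.

```python
# Sketch of the Parallel-Distill-Refine (PDR) control flow.
# `call_model` is a placeholder for a real LLM call (e.g. an API request).

def call_model(prompt: str) -> str:
    # Toy stub so the loop is runnable end-to-end.
    return f"answer({hash(prompt) % 100})"

def pdr(question: str, n_parallel: int = 4, n_rounds: int = 3) -> str:
    workspace = ""  # bounded textual workspace carried across rounds
    answer = ""
    for _ in range(n_rounds):
        # 1) Generate diverse drafts in parallel (independent calls).
        drafts = [
            call_model(f"{question}\nNotes so far:\n{workspace}\nDraft #{i}")
            for i in range(n_parallel)
        ]
        # 2) Distill the drafts into a compact summary; per-call context
        #    stays bounded because full histories are never replayed.
        workspace = call_model("Summarize key ideas:\n" + "\n".join(drafts))
        # 3) Refine: answer conditioned only on the distilled workspace.
        answer = call_model(f"{question}\nUse these notes:\n{workspace}")
    return answer
```

Note how context length is controlled by `n_parallel` (how many drafts feed one distillation) rather than by ever-growing history, which is the point the thread makes.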

Media 1 · Media 2

AdinaYakup
@AdinaYakup
Dec 15, 2025

Alibaba @TONGYI_SpeechAI just released a new TTS 🔥
✨ 0.5B, Apache 2.0
✨ 9 languages / 18+ Chinese dialects, with cross-lingual zero-shot voice cloning
✨ Low latency: bi-streaming (text-in / audio-out) with ~150ms latency
✨ Leading performance in content accuracy, speaker similarity, and prosody

Media 1

osanseviero
@osanseviero
Dec 15, 2025

PSA: https://t.co/n1RdT94ij3 is a good link to bookmark 🤗(and refresh)

Media 1

natolambert
@natolambert
Dec 14, 2025

Open models year in review

What a year! We're back with an updated open model builder tier list, our top models of the year, and our predictions for 2026.

First, the winning models:
1. DeepSeek R1 (@deepseek_ai): Transformed the AI world
2. Qwen 3 Family (@AlibabaGroup): The new default open models
3. Kimi K2 Family (@Kimi_Moonshot): Models that convinced the world that DeepSeek wasn't special and China would produce numerous leading models.

Runner-up models: MiniMax M2 (@MiniMax__AI), GLM 4.5 (@Zai_org), GPT-OSS (@OpenAI), Gemma 3 (@GoogleAI), Olmo 3 (@allen_ai)

Honorable mentions: Nvidia's (@nvidia) Parakeet speech-to-text model & Nemotron 2 LLM, Moondream 3 VLM (@moondreamai), Granite 4 LLMs (@IBMResearch), and Hugging Face's (@huggingface) SmolLM3.

Updated tier list:
Frontier open labs: DeepSeek (@deepseek_ai), Qwen (@AlibabaGroup), and Kimi Moonshot (@Kimi_Moonshot)
Close behind: https://t.co/d5wmnd2o3C (@Zai_org) & MiniMax AI (@MiniMax__AI) (notably, none from the U.S. here and up)
Noteworthy (a mix of US & China): StepFun AI (@StepFun_ai), Ant Group's Inclusion AI (@AntGroup / @TheInclusionAI), Meituan (@Meituan_LongCat), Tencent (@TencentHunyuan), IBM (@IBMResearch), Nvidia (@nvidia), Google (@GoogleAI), & Mistral (@MistralAI)
Then a bunch more below that, which we detail.

Predictions for 2026:
1. Scaling will continue with open models.
2. No substantive changes in the open model safety narrative.
3. Participation will continue to grow.
4. Ongoing general trends will continue w/ MoEs, hybrid attention, dense for fine-tuning.
5. The open and closed frontier gap will stay roughly the same on any public benchmarks.
6. No Llama-branded open model releases from Meta in 2026.

Read the full post on @interconnectsai -- link below.

Media 1

npaka123
@npaka123
Dec 15, 2025

Today's AI news, delivered casually: <Episode 18: Reachy Mini> #ゆるふわAI研究所 https://t.co/7F4l3Gtza8

Media 1

patloeber
@patloeber
Dec 15, 2025

my reachy mini is here, can’t wait to play around with it! https://t.co/wyEp2lXk4X

Media 1

satyanadella
@satyanadella
Dec 11, 2025

2/ In GitHub Copilot, GPT-5.2 is a fantastic multi-purpose model, especially when it comes to long-context and reasoning when coding or investigating a complex code base. Also available today. https://t.co/VMnJP1TXdp

newstart_2024
@newstart_2024
Dec 14, 2025

Imagine: Two animals run the exact same distance. One chooses to. One is forced. The voluntary runner gets healthier: better heart rate, blood pressure, glucose. The forced runner? Gets sicker.

Andrew Huberman says the same rule destroys or upgrades your stress, your workouts, even your life.

And the craziest proof: people who watched hours of Boston bombing news coverage suffered MORE acute stress than those who were actually there. Your mind doesn’t know the difference between experiencing and relentlessly consuming.

So… what “have-to” in your life are you ready to reframe as a choice? This 1:58 clip just rewired my brain.

omarsar0
@omarsar0
Dec 15, 2025

NEW Research from Apple.

When you think about it, RAG systems are fundamentally broken. Retrieval and generation are optimized separately: retrieval selects documents based on surface-level similarity, while generators produce answers without feedback about what information is actually needed.

There is an architectural mismatch. Dense retrievers rank documents in embedding space while generators consume raw text. This creates inconsistent representation spaces that prevent end-to-end optimization, redundant text processing that causes context overflow, and duplicated encoding for both retrieval and generation.

This new research introduces CLaRa, a unified framework that performs retrieval and generation over shared continuous document representations. Documents are encoded once into compact memory-token representations that serve both purposes. Instead of maintaining separate embeddings and raw text, documents are compressed into dense vectors that both the retriever and generator operate on directly.

This enables something previously impossible: gradients flowing from the generator back to the retriever through a differentiable top-k selector using Straight-Through estimation. The retriever learns which documents truly enhance answer generation rather than relying on surface similarity.

To make compression work, they introduce SCP, a pretraining framework that synthesizes QA pairs and paraphrases to teach the compressor which information is essential. Simple QA captures atomic facts, complex QA promotes relational reasoning, and paraphrases preserve semantics while altering surface form.

Results: At 16x compression, CLaRa-Mistral-7B surpasses the text-based DRO-Mistral-7B on NQ (51.41 vs 51.01 F1) and 2Wiki (47.18 vs 43.65 F1) while processing far less context. At 4x compression, it exceeds uncompressed text baselines by 2.36% on average on Mistral-7B.

Most notably, CLaRa trained with only weak supervision from next-token prediction outperforms fully supervised retrievers with ground-truth relevance labels. On HotpotQA, it achieves 96.21% Recall@5, exceeding BGE-Reranker (85.93%) by over 10 points despite using no annotated relevance data.

Well-trained soft compression can retain essential reasoning information while substantially reducing input length. The compressed representations filter out irrelevant content and focus the generator on reasoning-relevant context, leading to better generalization than raw text inputs.

Great read for AI devs. (Bookmark it.)

Paper: https://t.co/JtMukGVNwV
Learn to build with RAG and AI Agents in my academy: https://t.co/JBU5beIoD0
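The differentiable top-k selection mentioned above can be sketched as follows. This is an illustrative sketch of the general Straight-Through pattern, not CLaRa's actual code: the function names are invented, and the backward-pass trick is shown only as a comment since a real implementation would use an autograd framework.

```python
# Sketch of top-k document selection with a Straight-Through estimator.
import math

def softmax(scores):
    # Numerically stable softmax over retrieval scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def straight_through_topk(scores, k):
    """Hard 0/1 selection over documents, plus the soft distribution that
    would carry gradients in the backward pass."""
    soft = softmax(scores)  # differentiable relaxation of the selection
    top = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]
    hard = [1.0 if i in top else 0.0 for i in range(len(scores))]
    # Straight-Through trick (in an autograd framework):
    #   out = soft + stop_gradient(hard - soft)
    # Forward: out == hard (discrete top-k). Backward: gradients flow as if
    # out were soft, so the generator's loss can update retrieval scores.
    return hard, soft

hard, soft = straight_through_topk([0.2, 2.0, -1.0, 1.5], k=2)
# hard selects the two highest-scoring documents: [0.0, 1.0, 0.0, 1.0]
```

The design point is that the discrete selection used at inference stays exact, while training still gets a usable gradient signal from generation quality back to the retriever.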

Media 1

iScienceLuvr
@iScienceLuvr
Dec 15, 2025

NVIDIA introduces the Nemotron 3 family of models! Nano is released today; Super and Ultra will be released later.
* Mixture-of-Experts hybrid Mamba–Transformer architecture
Super and Ultra models:
* are trained with NVFP4
* use LatentMoE (project token embeddings to a smaller latent dimension for experts to process)
* use multi-token prediction
Full pretraining + post-training data and code will be made open-source.
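The LatentMoE idea (experts operating in a smaller latent dimension) can be sketched as a toy computation. This is not NVIDIA's architecture: the dimensions, the norm-based router, and the uniform mixing rule are all invented here purely to illustrate the down-project / expert / up-project shape.

```python
# Toy sketch of a latent-dimension MoE block: project tokens down to a
# smaller latent space, run the selected experts there, project back up.
import random

def matvec(W, x):
    # Plain matrix-vector product over nested lists.
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def rand_matrix(rows, cols):
    return [[random.gauss(0, 0.1) for _ in range(cols)] for _ in range(rows)]

d_model, d_latent, n_experts = 8, 4, 4
random.seed(0)
W_down = rand_matrix(d_latent, d_model)          # shared down-projection
W_up = rand_matrix(d_model, d_latent)            # shared up-projection
experts = [rand_matrix(d_latent, d_latent) for _ in range(n_experts)]

def latent_moe(x, top_k=2):
    z = matvec(W_down, x)                        # d_model -> d_latent
    # Toy router: score experts by output norm, keep the top_k responses.
    outs = [matvec(E, z) for E in experts]
    scores = [sum(v * v for v in o) for o in outs]
    chosen = sorted(range(n_experts), key=scores.__getitem__, reverse=True)[:top_k]
    mixed = [sum(outs[e][i] for e in chosen) / top_k for i in range(d_latent)]
    return matvec(W_up, mixed)                   # d_latent -> d_model

y = latent_moe([1.0] * d_model)
# y is back in the model dimension even though the experts ran in d_latent
```

The payoff hinted at in the tweet is that expert weight matrices are d_latent × d_latent rather than d_model × d_model, shrinking per-expert parameter and compute cost.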

Media 1 · Media 2

AWSstartups
@AWSstartups
Dec 15, 2025

🧠⚙️ Building the world’s first autonomous science operating system requires extraordinary scale, speed & security. Find out how @LilaSciences is partnering with #AWS to support their mission of accelerating scientific discovery. 👉 https://t.co/HBCCvY0wLu https://t.co/waU86oEeVi

hardmaru
@hardmaru
Dec 15, 2025

“iRobot Corp., the company that revolutionized robot vacuum cleaners in the early 2000s with its Roomba model, filed for bankruptcy and proposed handing over control to its main Chinese supplier.” 😥 https://t.co/sSifua7Tjn

Media 1

_NativeInLA
@_NativeInLA
May 28, 2020

INDIGENOUS PEOPLES FOR BLACK LIVES #JusticeForAhmadArbery #JusticeForBigFloyd #JusticeforBreonnaTaylor #JusticeForTonyMcDade https://t.co/dFNZ7JfLH3

Media 1 · Media 2

whoissd
@whoissd
Jun 13, 2020

Art https://t.co/4sVDaplWGe

Media 1

_NativeInLA
@_NativeInLA
Nov 04, 2020

Went to bed at 4am. Anxious. Tired. Heavy heart. I keep thinking about my relatives, the communities I’m grateful to be part of & advocate for, hoping all our heart work isn’t for nothing. Praying for a future where Indigenous & Black relatives aren’t murdered, missing & ignored. https://t.co/8ssQRNw7e1

Media 1

_NativeInLA
@_NativeInLA
Jun 12, 2021

Being a voice, trying to make an impact & educate has been feeling pretty tokenizing & draining. Feeling really down lately. After all the heart work we do as Indigenous Ppls, the demands for our emotional labor not being implemented is frustrating. Staying focused & positive. https://t.co/ad0M0zqDQj

Media 1

nareavera
@nareavera
Oct 04, 2023

—— JORDAN: a collection of AU tweets about the second child https://t.co/StZ0fORv6W

Media 1

AttorneyCrump
@AttorneyCrump
Dec 15, 2025

New data reveals over 75,000 people with no criminal record were arrested by ICE during the first 9 months of the Trump admin. The administration claimed they were targeting criminals, but the truth is clear: undocumented people with no record were detained & made to disappear. https://t.co/DDJxScMQno

PhilWMagness
@PhilWMagness
Dec 15, 2025

Trump claims he's taken in over $18 trillion in tariffs in the last 10 months. To give you an idea of just how nonsensical and detached from reality this claim is, federal taxes in 2024 brought in just under $5 trillion. United States GDP in 2024 was $29 trillion. https://t.co/Z1RwR7JIOR

commonsenseplay
@commonsenseplay
Dec 15, 2025

BREAKING: TRUMP LIES TO THE ENTIRE COUNTRY

Trump’s tariff claim is completely false.

Trump said: "Because of the tariffs, we've taken in $18 TRILLION. There's never been anything like it! The Biden admin took in less than $1 trillion in 4 years. We took in more than $18T in 10 months. That's good!"

Reality:
- The U.S. has NEVER collected $18T in tariffs, not in 10 months, not EVER.
- Total U.S. federal revenue from all taxes combined is only $4–5T per year.
- Tariffs historically bring in $70–100B per year... billions, not trillions!

How wrong is $18T?
- It’s 180× larger than actual annual tariff revenue
- It’s 4× the entire federal budget
- It’s economically impossible!!!

Tariffs are paid by U.S. importers & consumers, not “free money” from abroad, and they’ve never generated trillions.

Has TRUMP completely lost it!

Akash722602
@Akash722602
Dec 12, 2025

📢 Introducing: NATIVE Recap, Beeliever Edition 🐝🔥

Starting this week, I’ll be dropping a weekly @goNativeCC Recap every Sunday: a simple, clear update covering:
🔸 Key BTCFi highlights
🔸 Native’s latest progress
🔸 My personal take on the narrative

Short, sharp, and easy to follow. BTCFi is growing fast, and Native is building quietly but confidently. This recap will help you stay ahead without the noise.

If you’re a Beeliever 🐝, you’ll love this. If not… you might become one soon.

See you Sunday. NATIVE Recap begins. 🐝🔥

Media 1

vinaygodara0001
@vinaygodara0001
Dec 14, 2025

Everyone keeps asking what Bitcoin can become. Native quietly asked something weirder — “What if Bitcoin already knows the answer?” 🐝🧠 Native doesn’t force behavior onto BTC. It listens, verifies, and lets the rules speak for themselves. Turns out… Bitcoin is way smarter when you stop interrupting it. ⚡🧡 Some protocols add opinions. Native adds ears. That’s a very different kind of progress. @goNativeCC @Tommy_The_Gods #Native #goNative #BTCFi #nBTC #BeeLiever #CTEnergy #beeartist

Media 1

LoicSharma
@LoicSharma
Dec 14, 2025

@SebastianRoehl On native integration: this is an area that Flutter is investing in heavily! Some things we have in the current pipeline: 1. We're revamping Flutter's threading model to make interop easier: https://t.co/BjcDGEGPgz

Media 1

nativeantidote
@nativeantidote
Jan 09, 2023

A wrapped package has just arrived! We can't wait for you to see what's inside! 👀 https://t.co/TZx8OTvw61 https://t.co/2SQt0ITR4f https://t.co/H7tdsmFFK9

Media 2

NativeLover11
@NativeLover11
Jul 14, 2023

I know I'm not pretty😔so everyone dislikes me💔 https://t.co/C0uWv13iuV

Media 1

Native3rd
@Native3rd
Jul 15, 2024

“We meet people we haven’t ever seen before, but deep inside, there is a feeling that we have known them for thousands of years.” ~ L. Agni, https://t.co/tHxsXbuIBD

Media 1

JamesTate121
@JamesTate121
Dec 15, 2025

ALARM bells, people. 🚨🚨🚨

Neoconservative David Brooks wrote this in an op-ed in the NY Times:

"Trumpism… is primarily about the acquisition of power — power for its own sake. It is a multifront assault to make the earth a playground for ruthless men, so of course any institutions that might restrain power must be weakened or destroyed. Trumpism is about ego, appetite and acquisitiveness and is driven by a primal aversion to the higher elements of the human spirit — learning, compassion, scientific wonder, the pursuit of justice.

… What is happening now is not normal politics. We’re seeing an assault on the fundamental institutions of our civic life, things we should all swear loyalty to — Democrat, independent or Republican. It’s time for a comprehensive national civic uprising. It’s time for Americans in universities, law, business, nonprofits and the scientific community, and civil servants and beyond to form one coordinated mass movement. Trump is about power. The only way he’s going to be stopped is if he’s confronted by some movement that possesses rival power.

… I’m really not a movement guy. I don’t naturally march in demonstrations or attend rallies that I’m not covering as a journalist. But this is what America needs right now."

Media 1

franceinfo
@franceinfo
Dec 13, 2025

🔴 Society: "What is happening today astounds me: scientific truths that have been objectively established can be contested with political arguments. I believed that was impossible," observes Étienne Klein, physicist and philosopher of science. #ToutEstPolitique #canal16 https://t.co/bW4GuQSbhj
