Your curated collection of saved posts and media

Showing 24 posts · last 7 days · quality filtered
AnthropicAI (@AnthropicAI) · πŸ“… Feb 26, 2026 · 17d ago · πŸ†” 75528261

A statement from Anthropic CEO, Dario Amodei, on our discussions with the Department of War. https://t.co/rM77LJejuk

πŸ–ΌοΈ Media (1)
PeteHegseth (@PeteHegseth) · πŸ“… Feb 27, 2026 · 17d ago · πŸ†” 95832410

Thank you for your attention to this matter. cc: @AnthropicAI @DarioAmodei https://t.co/FLCByLHF73

πŸ–ΌοΈ Media (2)
AnthropicAI (@AnthropicAI) · πŸ“… Feb 28, 2026 · 16d ago · πŸ†” 99446918

A statement on the comments from Secretary of War Pete Hegseth. https://t.co/Gg7Zb09IMR

πŸ–ΌοΈ Media (1)
iScienceLuvr (@iScienceLuvr) · πŸ“… Feb 28, 2026 · 16d ago · πŸ†” 69381979

"We will challenge any supply chain risk designation in court" - Anthropic They are saying Department of War cannot restrict customers' use of Claude outside of Dep of War contract work. https://t.co/3FDsXmmcZi

@AnthropicAI • Sat Feb 28 01:24

A statement on the comments from Secretary of War Pete Hegseth. https://t.co/Gg7Zb09IMR

πŸ–ΌοΈ Media (1)
iScienceLuvr (@iScienceLuvr) · πŸ“… Feb 28, 2026 · 16d ago · πŸ†” 02900284

huh, why hasn't Wikipedia been updated in more than 2 years on HuggingFace? https://t.co/DxbO8RpM6G

πŸ–ΌοΈ Media (1)
charliebcurran (@charliebcurran) · πŸ“… Feb 28, 2026 · 16d ago · πŸ†” 49446722

Marco Rubio finding out he has to run Anthropic now too. https://t.co/Ffc5jsvzLi

πŸ–ΌοΈ Media
Daractenus (@Daractenus) · πŸ“… Jan 15, 2026 · 60d ago · πŸ†” 46581585

I know we don't do facts anymore, but here's the "dangerous and collapsing" EU that Elon Musk and MAGA influencers keep warning you about. https://t.co/RZrT5p9p9D

πŸ–ΌοΈ Media (1)
ProjectLincoln (@ProjectLincoln) · πŸ“… Jan 15, 2026 · 60d ago · πŸ†” 42431511

This has been the plan all along.
- Foment violence and chaos in the streets of MN
- Implement the Insurrection Act
- Declare Martial Law
- Suspend elections
https://t.co/Ky3Jf3awqc

πŸ–ΌοΈ Media (1)
RepJasonCrow (@RepJasonCrow) · πŸ“… Jan 16, 2026 · 59d ago · πŸ†” 67783177

As a three-time combat veteran, I get pretty damn hot when a five-time draft dodger like @realDonaldTrump pounds his chest and bangs the war drums. America is over it. No more sending our sons & daughters to fight for oil. https://t.co/sgyXKme3h3

πŸ–ΌοΈ Media
Joe G (@EastEndJoe) · πŸ“… Jan 16, 2026 · 59d ago · πŸ†” 57690090 · πŸ” ylecun retweeted

Where's the outrage? https://t.co/Hd0Br38yOF

❀️ 17,228 likes · πŸ” 6,941 retweets
πŸ–ΌοΈ Media
guillaumgrallet (@guillaumgrallet) · πŸ“… Jan 15, 2026 · 60d ago · πŸ†” 42727236

The headquarters of AMI Labs, the company of Yann LeCun, who was for a long time chief scientist at Meta (Facebook, WhatsApp and Instagram), will be in Paris. The researcher @ylecun, who decidedly has top reading material 😊, remains a professor at @nyuniversity https://t.co/g7cAHoF7zw

πŸ–ΌοΈ Media (1)
ChrisMurphyCT (@ChrisMurphyCT) · πŸ“… Jan 17, 2026 · 58d ago · πŸ†” 66446869

We told you the Venezuela invasion was just corruption. It took one whole week to get the proof. Trump took Venezuela's oil at gunpoint, and gave it to one of his biggest campaign donors. 1/ But when you learn the details, it's even worse. A short 🧡 on this corruption story. https://t.co/ZExM5S89VK

πŸ–ΌοΈ Media (1)
PessimistsArc (@PessimistsArc) · πŸ“… Jan 17, 2026 · 58d ago · πŸ†” 18913424

Nostalgia has tricked people into thinking the 1990s and early 2000s were a time of cutesy and comfortable digital disruption. https://t.co/yPMO0Ab3Ga

πŸ–ΌοΈ Media (4)
Daractenus (@Daractenus) · πŸ“… Jan 17, 2026 · 58d ago · πŸ†” 21143693 · πŸ” ylecun retweeted

From the country that lectures Europe daily on its supposed "lack of free speech": https://t.co/6nLs1Ykfwt

❀️ 2,859 likes · πŸ” 683 retweets
πŸ–ΌοΈ Media
RichardHanania (@RichardHanania) · πŸ“… Jan 18, 2026 · 57d ago · πŸ†” 33737171

Citizen: "Why are you asking for my paperwork?"
Border patrol agent: "Because of your accent."
Citizen: "You have an accent too!"
Jesus Christ. This might be the funniest form of fascism in history. https://t.co/ocvFDeQzof

πŸ–ΌοΈ Media
SchmidhuberAI (@SchmidhuberAI) · πŸ“… Jan 15, 2026 · 60d ago · πŸ†” 01574264

1st and 34th author of @GoogleDeepMind's paper [1] each got 1/4 Nobel Prize for protein structure prediction through AlphaFold. Who invented that? (Disclaimer: a student from my lab co-founded DeepMind.)

The 2021 paper [1] failed to cite important prior work [2] by Baldi and Pollastri (2002): at a time when compute was roughly ten thousand times more expensive than in 2021, [2] introduced a pipeline very similar to the one of AlphaFold 2, using multiple sequence alignment (MSA) to predict the secondary protein structure with the help of a position-specific scoring matrix (PSSM) or a profile matrix, going beyond even earlier work of 1988 [5][6][10]. The extra step (absent in AlphaFold 2) was to predict the protein's topology, too. See also the follow-up work of 2012 [3].

[1] didn't cite @HochreiterSepp et al.'s first successful application [7] of deep learning to protein folding (2007, using LSTM instead of MSA to construct a profile). [1] also failed to cite the essential prior work by Golkov et al. (2016) [4][8], which had crucial aspects of AlphaFold: (1) identify homologous sequences in a database of proteins with known structure, (2) compute the co-evolution statistics using the homologous sequences, (3) train a graph NN to predict the protein contact map (which determines its 3D structure) directly from the co-evolution statistics, (4) demonstrate experimentally a significant boost in performance on the CASP dataset [4][9]. See the attached image!

Instead of the contact map, DeepMind (2021) predicted the distance map, and instead of graph CNNs, they used the quadratic Transformer published in 2017 (the unnormalized linear Transformer had existed since 1991 [11]). DeepMind also used more training data and much more compute for hyperparameter tuning etc. Image credits: [4][8]

REFERENCES
[1] J. Jumper, R. Evans, A. Pritzel, T. Green, M. Figurnov, O. Ronneberger, K. Tunyasuvunakool, R. Bates, A. Zidek, A. Potapenko, A. Bridgland, C. Meyer, S. A. A. Kohl, A. J. Ballard, A. Cowie, B. Romera-Paredes, S. Nikolov, R. Jain, J. Adler, T. Back, S. Petersen, D. Reiman, E. Clancy, M. Zielinski, M. Steinegger, M. Pacholska, T. Berghammer, S. Bodenstein, D. Silver, O. Vinyals, A. W. Senior, K. Kavukcuoglu, P. Kohli & D. Hassabis. Highly accurate protein structure prediction with AlphaFold. Nature 596, 583-589, 2021.
[2] P. Baldi, G. Pollastri. A machine learning strategy for protein analysis. IEEE Intelligent Systems 17.2 (2002): 28-35.
[3] P. Di Lena, K. Nagata, and P. Baldi. Deep architectures for protein contact map prediction. Bioinformatics, 28, 2449-2457, 2012.
[4] V. Golkov, M. J. Skwark, A. Golkov, A. Dosovitskiy, T. Brox, J. Meiler, D. Cremers. Protein contact prediction from amino acid co-evolution using convolutional networks for graph-valued images. NeurIPS, Barcelona, 2016.
[5] N. Qian and T. J. Sejnowski. Predicting the secondary structure of globular proteins using neural network models. J. Mol. Biol. 202, 865-884, 1988.
[6] H. Bohr, J. Bohr, S. Brunak, R. M. J. Cotterill, B. Lautrup, L. Norskov, O. H. Olsen, S. B. Petersen. Protein secondary structure and homology by neural networks. The α-helices in rhodopsin. FEBS Lett. 241, 223-228, 1988.
[7] S. Hochreiter, M. Heusel, K. Obermayer. Fast model-based protein homology detection without alignment. Bioinformatics 23(14):1728-36, 2007. Successful application of deep learning to protein folding problems, through an LSTM that was orders of magnitude faster than competing methods.
[8] D. Cremers (July 2025). LinkedIn post on the Nobel Prize for AlphaFold.
[9] A Nobel Prize for Plagiarism. Technical Report IDSIA-24-24, 2024 (updated 2025). https://t.co/u9YxfBuqNf Popular tweets on this: https://t.co/heYSuPQDxp https://t.co/QQU9FKpqAh
[10] The Nobel Committee for Chemistry (2024). Scientific Background to the Nobel Prize in Chemistry 2024.
[11] Annotated History of Modern AI and Deep Learning. Technical Report IDSIA-22-22, IDSIA, Switzerland, 2022 (updated 2025). Preprint https://t.co/YZrEphq1qx. This extends the 2015 award-winning deep learning survey in the journal "Neural Networks."

πŸ–ΌοΈ Media (1)
jeremyphoward (@jeremyphoward) · πŸ“… Jan 16, 2026 · 59d ago · πŸ†” 86742240

The answer turns out to be "yes, kinda". After spending a few minutes clicking "like" on posts I liked and "show less like this" on those I didn't, here's what my feed now contains. https://t.co/4E96OQKee6

@jeremyphoward • Sun Jan 11 03:06

Is this true?

πŸ–ΌοΈ Media (4)
RitchieVink (@RitchieVink) · πŸ“… Jan 16, 2026 · 59d ago · πŸ†” 70149716

So there was quite a sensational rant post titled "DuckDB beats Polars for 1TB of data", and the video "Polars Got Destroyed by DuckDB in this 1TB Test", that were shared a lot. No code was shared for Polars, and our request for it was ignored. These posts were conveniently shared in posts and newsletters because they fit a narrative.

In any case, I went through the effort of reproducing the dataset and running the exact benchmark. The post mentioned 64GB RAM, so I ran on a 5a.8xlarge (32 vCPU / 64GB RAM). Polars did not go OOM, but finished the query in 14 minutes, never exceeding 14GB RAM usage. On the same machine DuckDB also took 14 minutes. Both tools hit the bandwidth limit: 1 TB / 10 Gbps = 13.3 min, but that makes less of a title πŸ˜‰.

The whole benchmark was just hard to reproduce; the 1TB part made it unwieldy but didn't matter. It could have been done with a 100GB benchmark, as the cardinality of the groups was just ~1800. Here is the Polars query: https://t.co/62oLSctcSd

So I guess... code or it didn't happen.

πŸ–ΌοΈ Media (2)
fleetwood___ (@fleetwood___) · πŸ“… Jan 16, 2026 · 59d ago · πŸ†” 22485811

CuTe algebra is extremely elegant. I can't believe we've all been writhing around on the floor like Terence Tao to do tile indexing. https://t.co/4puH9HSefw

πŸ–ΌοΈ Media (1)
BlinkDL_AI (@BlinkDL_AI) · πŸ“… Jan 16, 2026 · 59d ago · πŸ†” 68198711

Meta paper 2601.10639 is exactly the same as the original version of RWKV DeepEmbed πŸ˜‚ We have an improved version in https://t.co/28vbKfGXdn (lines 261-262)

@BlinkDL_AI • Mon May 26 10:00

RWKV-8 "Heron" preview (1) - DeepEmbed. Seems Gemma3n is trying similar tricks (Per-Layer Embedding), so I will discuss it first πŸͺΆ It's essentially free performance - lots of params, but can be offloaded to RAM/SSD, and simple to train and deployπŸš€ https://t.co/UY1rhh0JLQ

πŸ–ΌοΈ Media (1)
himanshustwts (@himanshustwts) · πŸ“… Jan 17, 2026 · 58d ago · πŸ†” 98155686

Wait, this is actually big. This DeepSeek research used LogitLens (lets you see what the model is 'thinking' at each layer) and CKA (compares what different layers are actually learning) to figure out why the new Engram architecture works. This is the first time I have seen mech interpretability research used in a capabilities paper. Feels like a shift.

@scaling01 • Mon Jan 12 16:19

DeepSeek is back! "Conditional Memory via Scalable Lookup: A New Axis of Sparsity for Large Language Models" They introduce Engram, a module that adds an O(1) lookup-style memory based on modernized hashed N-gram embeddings Mechanistic analysis suggests Engram reduces the need

πŸ–ΌοΈ Media (1)
kyliebytes (@kyliebytes) · πŸ“… Nov 20, 2023 · 847d ago · πŸ†” 13279733

OpenAI is nothing without its people!!!! This is incredible to witness from so many OpenAI employees https://t.co/wmcKMC8gIJ

πŸ–ΌοΈ Media