Your curated collection of saved posts and media

NativeMag (@NativeMag) · Aug 28, 2022 (1305d ago)

Revisiting @sarkodie's stellar debut album, 'Sarkology' 🔥 https://t.co/JRBw6m5Oyo

🖼️ Media ×1

Native3rd (@Native3rd) · Aug 09, 2024 (593d ago)

"If you are driven by fear, anger or pride, Nature will force you to compete. If you are guided by courage, awareness, tranquility and peace… nature will serve you." ~ A. Ray https://t.co/5H27c6LKSr

🖼️ Media ×1

native_info (@native_info) · Jul 24, 2016 (3532d ago)

ใ€Œใ™ใƒผใฑใƒผใใซๅญใ€ใ‚ˆใ‚Šใ€Žใ™ใƒผใฑใƒผใใซๅญ OL Ver.ใ€ใ€€ๅŽŸๅž‹ๅˆถไฝœ๏ผšๅ…ซๅทป #wf2016s #ใƒใ‚คใƒ†ใ‚ฃใƒ– https://t.co/yQIEmir5Ph

🖼️ Media ×2

FlukeArea (@FlukeArea) · Jun 10, 2020 (2115d ago)

for you 🥰 Cr. Hello_asian #เจ้าแก้มก้อน #Fluke_Natouch https://t.co/nY8YvUA2UJ

🖼️ Media ×2

FlukeArea (@FlukeArea) · Oct 13, 2020 (1990d ago)

The little one is here with someone 💦 #เจ้าแก้มก้อน #Fluke_Natouch https://t.co/LaBbA8TqDT

🖼️ Media ×1

FlukeArea (@FlukeArea) · Dec 02, 2020 (1939d ago)

Willing to be your slave 😳 #เจ้าแก้มก้อน #Fluke_Natouch https://t.co/iDeKPoK1Kw

🖼️ Media ×2

Sanemavcil (@Sanemavcil) · Dec 20, 2025 (95d ago)

I've been thinking about the "fine-tuning" question in cosmology. Many physical constants (gravity, the cosmological constant, particle masses) seem to sit in extremely narrow ranges that allow stars, galaxies, and life to exist. Even small changes could make a lifeless universe. Some people interpret this as a hint of intentional design. Others argue it could be explained by the multiverse (we simply live in the universe that works), or by deeper laws of physics we haven't discovered yet. I'm not claiming a final answer; I just find the question fascinating: what do you think fine-tuning actually tells us?

🖼️ Media ×1

native_info (@native_info) · Jul 31, 2017 (3160d ago)

ใ€ใƒฉใ‚นใƒˆ2ๆ—ฅใ€‘ ใƒใ‚คใƒ†ใ‚ฃใƒ–ใ‚ˆใ‚Šใ€ใ€Œใ™ใƒผใฑใƒผใใซๅญ ใƒ‰ใ‚ธใƒƒๅจ˜OL ver.ใ€๏ผใ‚‚ใ†ใ™ใๅ—ๆณจ็ท ใ‚ใงใ™๏ผ(`๏ฝฅฯ‰๏ฝฅยด)ใ‚ž https://t.co/oXuavWaOj7 https://t.co/Iit0vZfvUp

🖼️ Media ×5

native_info (@native_info) · Dec 05, 2018 (2668d ago)

ใ€ 12ๆœˆ19ๆ—ฅ(ๆฐด) ๅ—ๆณจ็ท ๅˆ‡ ใ€‘ momiๆฐใฎๆใ็ˆฝใ‚„ใ‹็พŽๅฐ‘ๅฅณใ€Œไธƒๆตท ่‘ตใ€ใ‚’ใƒญใ‚ฑใƒƒใƒˆใƒœใƒผใ‚คใŒๅฟ ๅฎŸใซ็ซ‹ไฝ“ๅŒ–๏ผ ใƒœใƒ‡ใ‚ฃใฎไบคๆ›ใซใ‚ˆใ‚Šใ€ๆฐด็€ใฎ้ฃŸใ„่พผใฟ่กจ็พใจ่งฃๆ”พ็Šถๆ…‹ใฎไธกๆ–นใ‚’ๅ†็พๅฏ่ƒฝใซใ„ใŸใ—ใพใ—ใŸใ€‚ https://t.co/nRqEPywzKj https://t.co/ulxa5KmFVw

🖼️ Media ×3

native_info (@native_info) · Feb 10, 2019 (2601d ago)

"Nora", from an illustration by 島田フミカネ (Fumikane Shimada). Prototype sculpting: アビラ #wf2019w #ネイティブ https://t.co/EsfqPyHxxP

🖼️ Media ×4

Native3rd (@Native3rd) · Apr 24, 2021 (1796d ago)

"Those who know how to maintain silence know how to maintain everything." ~ N. Namdeo https://t.co/YM1tmM35dm

🖼️ Media ×2

NativeMag (@NativeMag) · Sep 02, 2021 (1666d ago)

🚨 NEW DIGITAL COVER ALERT 🚨 The NATIVE Presents: Sounds From 𝓣𝓱𝓲𝓼 Side featuring: Street Pop 3.0 🇳🇬 Amapiano 🇿🇦 Asakaa Drill 🇬🇭 FULL STORY: https://t.co/Ka8lhCfueu https://t.co/9ETnVgdVJd

🖼️ Media ×9

Nativetoday_ (@Nativetoday_) · Mar 30, 2023 (1091d ago)

GREAT PHOTO OF >ADAM BEACH< HAVE A BLESSED WEEKEND BROTHER https://t.co/aSGWjCReBM

🖼️ Media ×2

HamelHusain (@HamelHusain) · Dec 20, 2025 (95d ago)

Found old @modal swag. Smells good! https://t.co/Ld3VV0RzRI

🖼️ Media ×2

github (@github) · Dec 20, 2025 (95d ago)

Looking for a festive Yule log to brighten up your terminal? You'll love @leereilly's GitHub CLI extension that gives you a cozy, animated Git log. 🔥 🪵 https://t.co/2oMEsTkEMP https://t.co/bxhmd254GE

🖼️ Media ×2

AnthropicAI (@AnthropicAI) · Dec 20, 2025 (95d ago)

We're releasing Bloom, an open-source tool for generating behavioral misalignment evals for frontier AI models. Bloom lets researchers specify a behavior and then quantify its frequency and severity across automatically generated scenarios. Learn more: https://t.co/TwKstpLSy3

🖼️ Media ×1
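
The workflow described (specify a behavior, auto-generate scenarios, score frequency and severity) fits in a short sketch. Below is a hypothetical Python skeleton of that loop, not Bloom's actual API; `generate_scenarios`, `run_model`, and `judge` are stand-ins for whatever scenario generator, target model, and grader you plug in:

```python
# Hypothetical sketch of the eval loop described above (NOT Bloom's API):
# specify a behavior, auto-generate scenarios, then quantify how often the
# target model exhibits the behavior (frequency) and how badly (severity).
from dataclasses import dataclass
from statistics import mean

@dataclass
class Verdict:
    exhibited: bool   # did the transcript show the target behavior?
    severity: float   # 0.0 (benign) to 1.0 (egregious), judge-assigned

def evaluate_behavior(behavior, n_scenarios, generate_scenarios, run_model, judge):
    """Score `behavior` across `n_scenarios` automatically generated scenarios."""
    scenarios = generate_scenarios(behavior, n_scenarios)
    verdicts = [judge(behavior, run_model(s)) for s in scenarios]
    hits = [v for v in verdicts if v.exhibited]
    return {
        "behavior": behavior,
        "frequency": len(hits) / len(verdicts),
        "mean_severity": mean(v.severity for v in hits) if hits else 0.0,
    }

# Toy stubs so the sketch runs end to end; real use would plug in an LLM
# scenario generator, the model under test, and an LLM judge.
report = evaluate_behavior(
    "sycophancy", 4,
    generate_scenarios=lambda b, n: [f"scenario {i} probing {b}" for i in range(n)],
    run_model=lambda s: f"transcript for {s}",
    judge=lambda b, t: Verdict(exhibited=("0" in t or "2" in t), severity=0.5),
)
print(report)  # {'behavior': 'sycophancy', 'frequency': 0.5, 'mean_severity': 0.5}
```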

Sanemavcil (@Sanemavcil) · Dec 20, 2025 (95d ago)

'Logan was terrified for Jake', and honestly… you can feel the tension in his face. What do you think: genuine fear, or just a bad freeze-frame/angle? 🥊👀 @LoganPaul @jakepaul https://t.co/Hku9WLy1GC

🖼️ Media

joshuihuii (@joshuihuii) · Jan 03, 2024 (813d ago)

Nana conference Attacca conference https://t.co/pcGv8RkuQP

🖼️ Media ×2

dair_ai (@dair_ai) · Dec 20, 2025 (95d ago)

RAG systems struggle with multi-hop reasoning. In most cases, the problem isn't the LLMs. It's the retrieval system. Standard RAG treats each piece of evidence as equally reliable, ignoring how documents connect to each other.

Why is this a problem? When questions require reasoning across multiple sources, single-shot retrieval often misses "bridge" documents whose entities aren't mentioned in the original query. Iterative retrieval helps, but it introduces new issues: LLM-guided graph traversal can hallucinate or become stuck on partial reasoning from previous steps.

This new research introduces SA-RAG, a framework that applies spreading activation, a mechanism from cognitive psychology, to knowledge-graph-based retrieval.

How does it work? Instead of relying on the LLM to decide which documents to fetch next, activation propagates automatically through a knowledge graph. Starting from entities matched to the query, activation spreads outward through weighted connections, with strength diminishing over distance. Documents linked to highly activated entities get retrieved.

The system builds a hybrid structure during indexing. An LLM extracts entities and relationships from text chunks, creating a knowledge graph where documents connect to entities through "describes" links. At query time, seed entities are identified by embedding similarity, then activation flows through the graph in a breadth-first manner.

On MuSiQue, SA-RAG alone achieves 67% answer correctness with phi4, outperforming naive RAG at 45% and CoT-based iterative retrieval at 55%. When combined with chain-of-thought iterative retrieval, it reaches 74% on MuSiQue and 87% on 2WikiMultiHopQA. This system demonstrates a 25% to 39% absolute improvement over naive RAG across benchmarks. Notably, these results come from small, open-weight models like phi4 and gemma3, which require no fine-tuning.

Spreading activation captures associative relevance rather than surface-level similarity. The method works as a plug-and-play module, boosting any training-free RAG pipeline without architectural changes.

Paper: https://t.co/jLZLkacDAX
Learn to build effective RAG and AI agents in our academy: https://t.co/zQXQt0PMbG

🖼️ Media ×2
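
Spreading activation itself is compact enough to sketch. Here is a minimal version of the retrieval step as the post describes it: activation starts at query-matched seed entities, decays as it fans out breadth-first over weighted edges, and documents are ranked by the activation of the entities that describe them. The graph, decay factor, and threshold below are illustrative assumptions, not SA-RAG's actual parameters:

```python
# Minimal sketch of spreading activation over a knowledge graph: seed
# entities matched to the query, activation decaying as it fans out
# breadth-first, documents ranked via their "describes" links to entities.
from collections import defaultdict, deque

def spread_activation(edges, seeds, decay=0.5, threshold=0.05):
    """edges: {entity: [(neighbor, weight), ...]}; seeds: {entity: activation}.
    Returns the accumulated activation per entity after propagation."""
    activation = defaultdict(float, seeds)
    frontier = deque(seeds.items())
    while frontier:
        node, energy = frontier.popleft()
        for neighbor, weight in edges.get(node, []):
            passed = energy * weight * decay  # strength diminishes with distance
            if passed >= threshold:           # stop once activation is negligible
                activation[neighbor] += passed
                frontier.append((neighbor, passed))
    return activation

def retrieve(doc_entities, activation, k=2):
    """Rank documents by the summed activation of the entities describing them."""
    scores = {doc: sum(activation[e] for e in ents)
              for doc, ents in doc_entities.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Toy example: bridge entity B links the query's seed A to entity C, so doc2
# accumulates activation even though C never appears in the original query.
edges = {"A": [("B", 1.0)], "B": [("C", 0.8)]}
doc_entities = {"doc1": ["A"], "doc2": ["B", "C"]}
print(retrieve(doc_entities, spread_activation(edges, {"A": 1.0})))
```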

omarsar0 (@omarsar0) · Dec 20, 2025 (95d ago)

Check out the other skill examples in the repo. https://t.co/Oj3Oh5kJ9v

🖼️ Media ×1

ivanleomk (@ivanleomk) · Dec 20, 2025 (95d ago)

@gr00vyfairy You can also use it to create what I think is the best OG image ever https://t.co/0D8Hr798NW

🖼️ Media ×2

readswithravi (@readswithravi) · Dec 19, 2025 (96d ago)

Action produces information. https://t.co/MMP7wfuWCw

🖼️ Media ×1
๐Ÿ”RealEricD retweeted
R
Reads with Ravi
@readswithravi
๐Ÿ“…
Dec 19, 2025
96d ago
๐Ÿ†”52215830

Action produces information. https://t.co/MMP7wfuWCw

Media 1Media 2
โค๏ธ6,813
likes
๐Ÿ”987
retweets
๐Ÿ–ผ๏ธ Media

omarsar0 (@omarsar0) · Dec 20, 2025 (95d ago)

Skills are now officially supported in Codex. There is a neat built-in skill for planning. This is the best way to pull in the right context at the right time. It's also a great way to build highly specialized skills for your coding agents. https://t.co/fviSC12aci

🖼️ Media ×1

jxnlco (@jxnlco) · Dec 20, 2025 (95d ago)

Me and the homies. https://t.co/pU9i20ako1

🖼️ Media

vxylily (@vxylily) · Dec 19, 2025 (96d ago)

What are you buying? https://t.co/UsipE27Stv

🖼️ Media ×1

DataChaz (@DataChaz) · Dec 19, 2025 (96d ago)

This is wild. A real-time webcam demo using SmolVLM from @huggingface and llama.cpp! 🤯 It runs fully locally on a MacBook M3. https://t.co/BQ1HyP7RoC

🖼️ Media

rasbt (@rasbt) · Dec 20, 2025 (95d ago)

I really didn't expect another major open-weight LLM release this December, but here we go: NVIDIA released their new Nemotron 3 series this week. It comes in 3 sizes:

1. Nano (30B-A3B)
2. Super (100B)
3. Ultra (500B)

Architecture-wise, the models use a Mixture-of-Experts (MoE) Mamba-Transformer hybrid architecture. As of this morning (Dec 19), only the Nano model has been released as an open-weight model, so this post will focus on that one (shown in my drawing below).

Nemotron 3 Nano (30B-A3B) is a 52-layer hybrid Mamba-Transformer model that interleaves Mamba-2 sequence-modeling blocks with sparse Mixture-of-Experts (MoE) feed-forward layers, and uses self-attention only in a small subset of layers.

There's a lot going on in the figure, but in short, the architecture is organized into 13 macro blocks with repeated Mamba-2 → MoE sub-blocks, plus a few Grouped-Query Attention layers. Multiplying the macro blocks by their sub-blocks gives the 52 layers of the architecture. As for the MoE modules, each MoE layer contains 128 experts but activates only 1 shared and 6 routed experts per token.

The Mamba-2 layers would take a whole article to explain (perhaps a topic for another time). For now, conceptually, you can think of them as similar to the Gated DeltaNet approach that Qwen3-Next and Kimi-Linear use, which I covered in my Beyond Standard LLMs article. The similarity is that both replace standard attention with a gated state-space update: the module maintains a running hidden state and mixes in new inputs via learned gates. In contrast to attention, it scales linearly rather than quadratically with the input sequence length.

What's actually quite exciting about this architecture is its really good performance compared to pure transformer architectures of similar size (like Qwen3-30B-A3B-Thinking-2507 and GPT-OSS-20B-A4B), while achieving much higher tokens-per-second throughput.

Overall, this is an interesting direction, even more extreme than Qwen3-Next and Kimi-Linear in its use of only a few attention layers. However, one of the strengths of the transformer architecture is its performance at (really) large scale, so I am curious to see how the larger Nemotron 3 Super and especially Ultra will compare to the likes of DeepSeek V3.2.
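
The 1-shared-plus-top-6-routed selection is easy to see in a toy sketch. Here is a minimal NumPy version of that routing step; the dimensions, the softmax over the selected router logits, and the single-matrix "experts" are illustrative assumptions, not NVIDIA's actual implementation:

```python
# Toy sketch of the MoE routing described above: each MoE layer has 128
# experts, and each token activates 1 shared expert plus its top-6 routed
# experts, so only 7 of 128 experts do any work per token.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 64, 128, 6

router_w = rng.normal(size=(d_model, n_experts))          # router projection
experts = rng.normal(size=(n_experts, d_model, d_model))  # routed experts (toy: one matrix each)
shared = rng.normal(size=(d_model, d_model))              # the always-active shared expert

def moe_layer(x):
    """x: (d_model,) vector for one token."""
    logits = x @ router_w                      # (n_experts,) router scores
    top = np.argsort(logits)[-top_k:]          # indices of the 6 highest-scoring experts
    w = np.exp(logits[top] - logits[top].max())
    w /= w.sum()                               # softmax over the selected experts only
    routed = sum(wi * (x @ experts[i]) for wi, i in zip(w, top))
    return x @ shared + routed                 # shared expert output is always added

out = moe_layer(rng.normal(size=d_model))
print(out.shape)  # (64,): same width in, same width out, 7/128 experts active
```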

@rasbt · Sat Dec 13 14:21

Just updated the Big LLM Architecture Comparison article... it grew quite a bit since the initial version in July 2025, more than doubled! https://t.co/oEt8XzNxik https://t.co/RZuwp6ZUaF

🖼️ Media ×1

adamwathan (@adamwathan) · Dec 18, 2025 (98d ago)

https://t.co/gBgot9C1ZV

🖼️ Media ×1

ivanleomk (@ivanleomk) · Dec 20, 2025 (95d ago)

https://t.co/1VkhHOw1X6

@adamwathan · Thu Dec 18 10:13

https://t.co/gBgot9C1ZV

🖼️ Media ×1

Native3rd (@Native3rd) · Nov 04, 2022 (1237d ago)

Believe so brightly that everyone sees the beauty in believing. ~ Native American 🪶✨ https://t.co/PLFFDE0g86

🖼️ Media ×2

Nativetoday_ (@Nativetoday_) · Apr 02, 2023 (1088d ago)

Native Beauty 🌹❤️‍🔥❤️‍🔥🌹🌹 If you're a Native beauty fan of mine, can I get a big… YESS!!! I love you all ❤️ https://t.co/gG4VDMMcWq

🖼️ Media ×2