@_akhaliq
RT @alvarobartt: 🌐 pplx-embed is @perplexity_ai's new collection of state-of-the-art multilingual embedding models optimized for real-world, web-scale retrieval tasks!

- Built on Qwen3 with diffusion-based pretraining and bidirectional attention
- Available at 0.6B and 4B parameters with native INT8 quantization
- pplx-embed-v1 for independent text embeddings
- pplx-embed-context-v1 for document chunks in RAG
- Validated on real-world search scenarios over tens of millions of documents
- Permissive MIT License
- Available on the @huggingface Hub, with support in Text Embeddings Inference, Sentence Transformers, and Transformers.js