MTEB: Massive Text Embedding Benchmark (Updated Jun 30, 2024, Python)
Train and Infer Powerful Sentence Embeddings with AnglE | 🔥 SOTA on STS and MTEB Leaderboard
Generative Representational Instruction Tuning
Examples and guides for using the Preternatural SDK.
Topic Embedding, Text Generation and Modeling using diffusion
Artificial Intelligence Services
PL-MTEB: Polish Massive Text Embedding Benchmark
cUrl examples for the Rosette API
Codebase for RetroMAE and beyond.
A Cocktail Recommendation System that uses Retrieval-Augmented Generation (RAG) to give users personalized cocktail recommendations based on their queries.
Go module for fetching embeddings from embeddings providers
Retrieval-Augmented Generation using Azure OpenAI
Mind-X is my intelligent alter ego that understands me best. It handles and resolves my tedious tasks, growing in real time as a next-generation PersonAI system.
[ACL 2023] One Embedder, Any Task: Instruction-Finetuned Text Embeddings
Rosette API Client Library for PHP
Rosette API Client Library for C#
Rosette API Client Library for Ruby
The MS-marco-MiniLM-L-12-v2 model can be used for information retrieval: given a query, encode the query together with all candidate passages (e.g., retrieved with ElasticSearch), then sort the passages by score in decreasing order.
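The re-ranking pattern described above can be sketched as follows. This is a minimal illustration only: the `score` function below is a stand-in based on term overlap, whereas in practice the scores would come from a cross-encoder such as MS-marco-MiniLM-L-12-v2 (e.g., via the sentence-transformers `CrossEncoder.predict` API).

```python
# Re-ranking sketch: score each (query, passage) pair, then sort the
# passages by score in decreasing order. The scorer here is only a
# placeholder for a real cross-encoder model.

def score(query: str, passage: str) -> float:
    """Placeholder relevance score: fraction of query terms found in the passage."""
    q_terms = set(query.lower().split())
    p_terms = set(passage.lower().split())
    return len(q_terms & p_terms) / max(len(q_terms), 1)

def rerank(query: str, passages: list[str]) -> list[str]:
    """Return the candidate passages sorted by relevance score, highest first."""
    scored = [(score(query, p), p) for p in passages]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [p for _, p in scored]

passages = [
    "The capital of France is Paris.",
    "Python is a programming language.",
    "Paris is known for the Eiffel Tower.",
]
print(rerank("capital of France", passages)[0])
# → "The capital of France is Paris."
```

Swapping the placeholder for a real cross-encoder changes only the `score` function; the pairing and descending sort stay the same.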
This is a sentence embedding model, initialized from xlm-roberta-large and continually trained on a mixture of multilingual datasets. It supports 100 languages from xlm-roberta, but low-resource languages may see performance degradation.