Text embedding models
📄️ Aleph Alpha
There are two possible ways to use Aleph Alpha's semantic embeddings. If you have texts with dissimilar structures (e.g., a document and a query), you should use asymmetric embeddings. Conversely, for texts with comparable structures, symmetric embeddings are the suggested approach.
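As a rough sketch of the two variants (assuming the `langchain.embeddings` import path and that an Aleph Alpha API key is already configured in the environment):

```python
from langchain.embeddings import (
    AlephAlphaAsymmetricSemanticEmbedding,
    AlephAlphaSymmetricSemanticEmbedding,
)

# Asymmetric: documents and queries are embedded differently.
asymmetric = AlephAlphaAsymmetricSemanticEmbedding()
doc_vectors = asymmetric.embed_documents(["This is the content of the document"])
query_vector = asymmetric.embed_query("What is the content of the document?")

# Symmetric: both texts share the same representation.
symmetric = AlephAlphaSymmetricSemanticEmbedding()
text_vector = symmetric.embed_query("This is a test text")
```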
📄️ AzureOpenAI
Let's load the OpenAI Embedding class with environment variables set to use Azure endpoints.
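A minimal sketch of that setup (the resource name, key, API version, and deployment name below are placeholders, and the exact environment variables may differ for your account):

```python
import os
from langchain.embeddings import OpenAIEmbeddings

# Point the OpenAI client at an Azure OpenAI resource.
os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_BASE"] = "https://<your-resource>.openai.azure.com/"
os.environ["OPENAI_API_KEY"] = "<your-azure-openai-key>"
os.environ["OPENAI_API_VERSION"] = "2023-05-15"

# "deployment" is the name you gave the embedding model deployment in Azure.
embeddings = OpenAIEmbeddings(deployment="<your-embeddings-deployment-name>")
query_vector = embeddings.embed_query("This is a test document.")
```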
📄️ Bedrock Embeddings
📄️ Clarifai
Clarifai is an AI platform that covers the full AI lifecycle, from data exploration and data labeling to model training, evaluation, and inference.
📄️ Cohere
Let's load the Cohere Embedding class.
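A minimal sketch, assuming the `langchain.embeddings` import path and a placeholder API key:

```python
from langchain.embeddings import CohereEmbeddings

# The key can also be provided via the COHERE_API_KEY environment variable.
embeddings = CohereEmbeddings(cohere_api_key="<your-cohere-api-key>")

query_vector = embeddings.embed_query("This is a test document.")
doc_vectors = embeddings.embed_documents(["This is a test document."])
```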
📄️ DashScope
Let's load the DashScope Embedding class.
📄️ DeepInfra
DeepInfra is a serverless inference-as-a-service that provides access to a variety of LLMs and embedding models. This notebook goes over how to use LangChain with DeepInfra for text embeddings.
📄️ Elasticsearch
Walkthrough of how to generate embeddings using a hosted embedding model in Elasticsearch
📄️ Embaas
embaas is a fully managed NLP API service that offers features such as embedding generation, document text extraction, document-to-embedding conversion, and more. You can choose from a variety of pre-trained models.
📄️ Fake Embeddings
LangChain also provides a fake embedding class. You can use this to test your pipelines.
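For example, a sketch of how the fake class might be used in a test (class name and the required `size` argument are taken from the LangChain integration):

```python
from langchain.embeddings import FakeEmbeddings

# Returns random vectors of the requested dimensionality; useful for exercising
# a pipeline without calling a real embedding provider.
embeddings = FakeEmbeddings(size=1352)
query_vector = embeddings.embed_query("This is a test query")
doc_vectors = embeddings.embed_documents(["This is a test document"] * 2)
```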
📄️ Google Cloud Platform Vertex AI PaLM
Note: This is separate from the Google PaLM integration. Google has chosen to offer an enterprise version of PaLM through GCP, and this integration supports the models made available there.
📄️ GPT4All
This notebook explains how to use GPT4All embeddings with LangChain.
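A minimal sketch, assuming the `gpt4all` Python package is installed and that the default embedding model is acceptable (it is downloaded on first use):

```python
from langchain.embeddings import GPT4AllEmbeddings

embeddings = GPT4AllEmbeddings()
query_vector = embeddings.embed_query("This is a test document.")
```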
📄️ Hugging Face Hub
Let's load the Hugging Face Embedding class.
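A minimal sketch; the `model_name` below is an illustrative default, and the class runs a sentence-transformers model locally:

```python
from langchain.embeddings import HuggingFaceEmbeddings

embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
query_vector = embeddings.embed_query("This is a test document.")
```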
📄️ InstructEmbeddings
Let's load the HuggingFace instruct Embeddings class.
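A minimal sketch; Instructor models accept an instruction string that conditions the embedding, and the instruction below is only an example:

```python
from langchain.embeddings import HuggingFaceInstructEmbeddings

embeddings = HuggingFaceInstructEmbeddings(
    query_instruction="Represent the query for retrieval: "
)
query_vector = embeddings.embed_query("This is a test document.")
```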
📄️ Jina
Let's load the Jina Embedding class.
📄️ Llama-cpp
This notebook goes over how to use Llama-cpp embeddings within LangChain
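A minimal sketch, assuming the `llama-cpp-python` package is installed; the model path is a placeholder for a locally downloaded model file:

```python
from langchain.embeddings import LlamaCppEmbeddings

embeddings = LlamaCppEmbeddings(model_path="/path/to/model.bin")
query_vector = embeddings.embed_query("This is a test document.")
```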
📄️ LocalAI
Let's load the LocalAI Embedding class. To use it, you need the LocalAI service hosted somewhere, with embedding models configured. See the documentation at https://localai.io/features/embeddings/index.html.
📄️ MiniMax
MiniMax offers an embeddings service.
📄️ ModelScope
Let's load the ModelScope Embedding class.
📄️ MosaicML embeddings
MosaicML offers a managed inference service. You can use a variety of open-source models or deploy your own.
📄️ NLP Cloud
NLP Cloud is an artificial intelligence platform that allows you to use the most advanced AI engines, and even train your own engines with your own data.
📄️ OpenAI
Let's load the OpenAI Embedding class.
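A minimal sketch, assuming the OpenAI API key is available in the `OPENAI_API_KEY` environment variable:

```python
from langchain.embeddings import OpenAIEmbeddings

embeddings = OpenAIEmbeddings()

query_vector = embeddings.embed_query("This is a test document.")
doc_vectors = embeddings.embed_documents(["This is a test document."])
```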
📄️ SageMaker Endpoint Embeddings
Let's load the SageMaker Endpoints Embeddings class. The class can be used if you host, for example, your own Hugging Face model on SageMaker.
📄️ Self Hosted Embeddings
Let's load the SelfHostedEmbeddings, SelfHostedHuggingFaceEmbeddings, and SelfHostedHuggingFaceInstructEmbeddings classes.
📄️ Sentence Transformers Embeddings
SentenceTransformers embeddings are called using the HuggingFaceEmbeddings integration. We have also added an alias for SentenceTransformerEmbeddings for users who are more familiar with directly using that package.
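A short sketch of the alias (the `model_name` below is an illustrative default):

```python
from langchain.embeddings import HuggingFaceEmbeddings, SentenceTransformerEmbeddings

embeddings = HuggingFaceEmbeddings(model_name="all-MiniLM-L6-v2")
# SentenceTransformerEmbeddings is an alias for the same class.
embeddings = SentenceTransformerEmbeddings(model_name="all-MiniLM-L6-v2")

query_vector = embeddings.embed_query("This is a test document.")
```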
📄️ Spacy Embedding
Loading the Spacy embedding class to generate and query embeddings
📄️ TensorflowHub
Let's load the TensorflowHub Embedding class.
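A minimal sketch, assuming `tensorflow` and `tensorflow-hub` are installed; the default TensorFlow Hub sentence encoder is downloaded on first use:

```python
from langchain.embeddings import TensorflowHubEmbeddings

embeddings = TensorflowHubEmbeddings()
query_vector = embeddings.embed_query("This is a test document.")
```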