This page covers all LangChain integrations with the Hugging Face Hub and libraries such as transformers, sentence-transformers, and datasets.
Chat models
ChatHuggingFace
We can use the Hugging Face LLM classes or directly use the ChatHuggingFace class.
See a usage example.
LLMs
HuggingFaceEndpoint
We can use the HuggingFaceEndpoint class to run open source models via serverless Inference Providers or via dedicated Inference Endpoints.
See a usage example.
HuggingFacePipeline
We can use the HuggingFacePipeline class to run open source models locally.
See a usage example.
Embedding models
HuggingFaceEmbeddings
We can use the HuggingFaceEmbeddings class to run open source embedding models locally.
See a usage example.
HuggingFaceEndpointEmbeddings
We can use the HuggingFaceEndpointEmbeddings class to run open source embedding models via a dedicated Inference Endpoint.
See a usage example.
Text Embeddings Inference (TEI)
For self-hosted production serving of Sentence Transformers models, Hugging Face publishes Text Embeddings Inference, a dedicated inference server with batching and GPU support. Point LangChain at a TEI deployment via HuggingFaceEndpointEmbeddings or see the dedicated TEI integration guide.
BGE embedding models
BGE models on Hugging Face are a strong open-source embedding family from the Beijing Academy of Artificial Intelligence (BAAI). BGE models are Sentence Transformers models, so use HuggingFaceEmbeddings with encode_kwargs={"normalize_embeddings": True}. See a usage example.
Legacy embedding classes
The following classes from langchain-community predate langchain-huggingface. Prefer HuggingFaceEmbeddings or HuggingFaceEndpointEmbeddings for new projects:
HuggingFaceInferenceAPIEmbeddings (langchain_community.embeddings): deprecated since langchain-community==0.2.2 in favor of HuggingFaceEndpointEmbeddings from langchain-huggingface, which covers both Inference Providers (provider="hf-inference", etc.) and dedicated Inference Endpoints.
HuggingFaceInstructEmbeddings (langchain_community.embeddings): use HuggingFaceEmbeddings with a modern instruction-aware model and encode_kwargs={"prompt": ...}. See Instructor embeddings.
HuggingFaceBgeEmbeddings (langchain_community.embeddings): use HuggingFaceEmbeddings with encode_kwargs={"normalize_embeddings": True}, and set query_encode_kwargs={"prompt": "..."} when the model needs a query prefix (e.g., the older BAAI/bge-*-en-v1.5 family). See BGE on Hugging Face.
Document loaders
Hugging Face dataset
Hugging Face Hub is home to over 75,000 datasets in more than 100 languages that can be used for a broad range of tasks across NLP, Computer Vision, and Audio. They are used for a diverse range of tasks such as translation, automatic speech recognition, and image classification. We need to install the datasets Python package.
Hugging Face model loader
Load model information from Hugging Face Hub, including README content. This loader interfaces with the Hugging Face Models API to fetch and load model metadata and README files. The API allows you to search and filter models based on specific criteria such as model tags, authors, and more.
Image captions
It uses Hugging Face models to generate image captions. We need to install several Python packages.
Tools
Hugging Face hub tools
Hugging Face Tools support text I/O and are loaded using the load_huggingface_tool function.
We need to install several Python packages.
Hugging Face Text-to-Speech model inference
It is a wrapper around the Hugging Face text-to-speech inference API.

