
Embedding

EmbeddingFactory Class Documentation


Overview

The EmbeddingFactory class returns a LangChain Embeddings class according to the specified embedding type. It simplifies the process of creating an embedding instance and provides the common embeddings used in RAG workflows.

Usage

from RAGchain.utils.embed import EmbeddingFactory

openai_embedding = EmbeddingFactory(embed_type='openai', device_type='cuda').get()
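The object returned by get() is a LangChain Embeddings instance, so you can call the standard embed_query and embed_documents methods on it, or pass it to any component that accepts a LangChain embedding. Below is a minimal sketch assuming the OpenAI example above; the query and passage strings are only illustrative.

from RAGchain.utils.embed import EmbeddingFactory

# Build a LangChain Embeddings instance for OpenAI embeddings
embedding = EmbeddingFactory(embed_type='openai', device_type='cuda').get()

# Standard LangChain Embeddings interface
query_vector = embedding.embed_query("What is RAGchain?")
passage_vectors = embedding.embed_documents(["first passage", "second passage"])

print(len(query_vector), len(passage_vectors))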

Supporting Embedding Models

You can use LangChain's Embeddings classes directly; you don't have to use EmbeddingFactory. However, with EmbeddingFactory you can easily get the embedding models listed below.

Model Name                       embed_type
OpenAI text-embedding-ada-002    openai
Contriever                       contriever
Multilingual-e5                  multilingual_e5
Ko-sroberta-multitask            ko_sroberta_multitask
KoSimCSE                         kosimcse
Pass the embed_type string of the model you want to use when initializing EmbeddingFactory.
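For example, to get a non-OpenAI model you only change the embed_type string. A minimal sketch, assuming multilingual_e5 from the table above and assuming 'cpu' is an accepted device_type value (this page only shows 'cuda'):

from RAGchain.utils.embed import EmbeddingFactory

# Multilingual-e5 embeddings; device_type='cpu' is an assumption here
e5_embedding = EmbeddingFactory(embed_type='multilingual_e5', device_type='cpu').get()

vector = e5_embedding.embed_query("Queries in multiple languages can be embedded as well.")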
