BGE-EN-ICL

BAAI

Introduction

BGE-EN-ICL is a text-embedding model developed by the Beijing Academy of Artificial Intelligence (BAAI). It supports in-context learning, adapting to new tasks from a handful of few-shot examples, and achieves state-of-the-art results on benchmarks such as BEIR and AIR-Bench.

Architecture

The BGE-EN-ICL model applies in-context learning to text embedding: few-shot examples are included in the input prompt at encoding time, so the resulting embedding reflects the task demonstrated by those examples. This conditioning improves accuracy across retrieval evaluations without task-specific fine-tuning.

Training

The model was trained on publicly available data as well as the authors' full training dataset. The technical report "Making Text Embedders Few-Shot Learners" details the training procedure and evaluation results.

Guide: Running Locally

  • Clone the Repository: git clone https://github.com/FlagOpen/FlagEmbedding.git
  • Navigate to Directory: cd FlagEmbedding
  • Install Dependencies: pip install -e .
  • Using FlagEmbedding: Import FlagICLModel to encode queries and documents and compute similarity scores (see the first sketch after this list).
  • Using Transformers: Encode inputs with the transformers package and pool embeddings from the last hidden state (see the second sketch after this list).
  • GPU Recommendation: It's recommended to use cloud-based GPUs (e.g., AWS EC2, Google Cloud GPU) for efficient computation.
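
The FlagEmbedding route looks roughly like the sketch below, following the usage documented in the FlagEmbedding repository. The queries, documents, and the few-shot demonstration are illustrative placeholders.

    from FlagEmbedding import FlagICLModel

    # Few-shot demonstrations that condition the query encoder.
    # The instruct/query/response contents here are illustrative only.
    examples = [
        {
            "instruct": "Given a web search query, retrieve relevant passages that answer the query.",
            "query": "what is a virtual interface",
            "response": "A virtual interface is a software-defined abstraction that mimics a physical network interface.",
        },
    ]

    model = FlagICLModel(
        "BAAI/bge-en-icl",
        query_instruction_for_retrieval="Given a web search query, retrieve relevant passages that answer the query.",
        examples_for_task=examples,
        use_fp16=True,  # halves memory usage with a small precision trade-off
    )

    # Queries are encoded with the instruction and few-shot examples; documents are encoded as-is.
    query_embeddings = model.encode_queries(["how do neural networks learn?"])
    doc_embeddings = model.encode_corpus(["Neural networks learn by adjusting weights through backpropagation."])

    # Inner products of the embeddings serve as similarity scores.
    scores = query_embeddings @ doc_embeddings.T
    print(scores)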

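The plain transformers route extracts embeddings from the last hidden state. Below is a minimal sketch that assumes last-token pooling (common for decoder-style embedding models) and omits the instruction and few-shot prompt formatting for brevity; the input strings are placeholders.

    import torch
    import torch.nn.functional as F
    from transformers import AutoModel, AutoTokenizer

    def last_token_pool(last_hidden_states, attention_mask):
        # Select the hidden state of each sequence's final non-padding token
        # (assumes right-padded inputs).
        sequence_lengths = attention_mask.sum(dim=1) - 1
        batch_indices = torch.arange(last_hidden_states.shape[0], device=last_hidden_states.device)
        return last_hidden_states[batch_indices, sequence_lengths]

    tokenizer = AutoTokenizer.from_pretrained("BAAI/bge-en-icl")
    model = AutoModel.from_pretrained("BAAI/bge-en-icl")
    model.eval()

    texts = [
        "how do neural networks learn?",
        "Neural networks learn by adjusting weights through backpropagation.",
    ]
    batch = tokenizer(texts, max_length=512, padding=True, truncation=True, return_tensors="pt")

    with torch.no_grad():
        outputs = model(**batch)
        embeddings = last_token_pool(outputs.last_hidden_state, batch["attention_mask"])

    # Normalize so that dot products equal cosine similarity.
    embeddings = F.normalize(embeddings, p=2, dim=1)
    print(embeddings[:1] @ embeddings[1:].T)
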
License

FlagEmbedding is licensed under the MIT License.
