BGE-Multilingual-Gemma2
Introduction
BGE-Multilingual-Gemma2 is an embedding model developed by the Beijing Academy of Artificial Intelligence (BAAI). It targets feature extraction and sentence-similarity tasks, integrates with libraries such as sentence-transformers, and ships its weights in the safetensors format.
Architecture
The model builds on the Gemma2 architecture and plugs into common embedding workflows, including sentence-transformers for sentence similarity and feature extraction.
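In practice the checkpoint is consumed as an embedding model. The snippet below is a minimal sentence-similarity sketch using sentence-transformers; it assumes the checkpoint loads directly through the SentenceTransformer class, as the library support mentioned above suggests, and everything beyond the repository id is illustrative rather than taken from the official card.

from sentence_transformers import SentenceTransformer, util

# A Gemma2-sized checkpoint is large; expect to need a GPU with ample memory.
model = SentenceTransformer("BAAI/bge-multilingual-gemma2")

# Multilingual inputs: two paraphrases (English/French) and one unrelated line.
sentences = [
    "The weather is lovely today.",
    "Il fait beau aujourd'hui.",
    "The stock market fell sharply.",
]
embeddings = model.encode(sentences)

# Cosine similarity of the first sentence against the other two; the French
# paraphrase should score noticeably higher than the unrelated sentence.
print(util.cos_sim(embeddings[0], embeddings[1:]))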
Training
Details of the specific training methodology, datasets, and hyperparameters for BGE-Multilingual-Gemma2 are not provided in the documentation snippet; as the name suggests, training likely drew on multilingual datasets so the model works across languages.
Guide: Running Locally
- Clone the repository from Hugging Face:
git clone https://huggingface.co/BAAI/bge-multilingual-gemma2
- Install the necessary dependencies, likely the transformers and sentence-transformers libraries:
pip install transformers sentence-transformers
- Load the model in your Python environment (a fuller end-to-end sketch follows this list):
from transformers import AutoModel
model = AutoModel.from_pretrained("BAAI/bge-multilingual-gemma2")
- Consider using cloud GPU services such as AWS, Google Cloud, or Azure for efficient model inference, especially for larger datasets.
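Expanding on the loading step above, here is a hedged end-to-end sketch of an embedding pipeline using only transformers. The last-token pooling and L2 normalization used here are assumptions, chosen because they are typical for decoder-based embedding models; verify them against the official model card before relying on the numbers.

import torch
from transformers import AutoModel, AutoTokenizer

repo = "BAAI/bge-multilingual-gemma2"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModel.from_pretrained(repo)
model.eval()

sentences = ["What is BGE?", "BGE is a family of embedding models from BAAI."]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Assumed pooling: take the hidden state of each sequence's last
# non-padding token (works with the default right-side padding).
hidden = outputs.last_hidden_state                      # (batch, seq, dim)
last = inputs["attention_mask"].sum(dim=1) - 1          # index per sequence
embeddings = hidden[torch.arange(hidden.size(0)), last]
embeddings = torch.nn.functional.normalize(embeddings, p=2, dim=1)

# Normalized embeddings make the dot product a cosine similarity.
print(embeddings @ embeddings.T)

On CPU this will be slow for a model of this size, which is why the cloud-GPU suggestion in the final step above is worth taking seriously.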
License
The model is distributed under the Gemma license, though the specific terms are not detailed in the provided snippet.