Introduction

This project is a T5-based model fine-tuned for translation between Spanish and Wayuunaiki. Wayuunaiki is the language of the Wayuu people, the largest indigenous group in Colombia, who live mainly in the La Guajira region in the country's north.

Architecture

The model uses the T5 architecture, a text-to-text transformer in which every task, including translation, is framed as mapping an input text string to an output text string. It is implemented in PyTorch and can be loaded with the Hugging Face Transformers library for local or hosted inference.
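
As a rough illustration of that text-to-text interface (using the public t5-small checkpoint rather than this project's fine-tuned weights), the task is stated inside the input string and the answer comes back as generated text:

```python
# Minimal sketch of T5's text-to-text interface with the public t5-small
# checkpoint (not this project's weights): the task prefix is part of the
# input text, and the model's answer is decoded back into plain text.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

inputs = tokenizer("translate English to German: The house is small.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```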

Training

The model was fine-tuned on a specialized corpus of Spanish and Wayuunaiki text, adapting the pre-trained T5 checkpoint to generate translations between the two languages.
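
The actual training script, corpus, and hyperparameters are not published in this card. The snippet below is only a minimal sketch of how such a fine-tuning run could look with the Transformers Seq2SeqTrainer, assuming a CSV parallel corpus with es and guc columns, a t5-small starting checkpoint, and placeholder hyperparameters:

```python
# Illustrative fine-tuning sketch, NOT the authors' script. The base checkpoint,
# data file, column names, task prefix, and hyperparameters are all placeholders.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

base_model = "t5-small"  # placeholder base checkpoint
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSeq2SeqLM.from_pretrained(base_model)

# Assume a CSV file with "es" (Spanish) and "guc" (Wayuunaiki) columns.
dataset = load_dataset("csv", data_files={"train": "parallel_corpus.csv"})

def preprocess(batch):
    # Prefix the source text with the task description, tokenize both sides.
    inputs = ["translate Spanish to Wayuunaiki: " + s for s in batch["es"]]
    model_inputs = tokenizer(inputs, max_length=128, truncation=True)
    labels = tokenizer(text_target=batch["guc"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset["train"].map(preprocess, batched=True, remove_columns=["es", "guc"])

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(
        output_dir="t5-es-guc",
        per_device_train_batch_size=8,
        num_train_epochs=3,
        learning_rate=3e-4,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```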

Guide: Running Locally

  1. Clone the Repository: Copy the repository to your local machine, for example with git clone.

  2. Install Dependencies: Make sure the required libraries, such as Transformers and PyTorch, are installed in your environment (for example, pip install transformers torch sentencepiece).

  3. Load the Model: Use the Transformers library to load the model, specifying the model name from the Hugging Face model hub.

  4. Run Inference: Use the loaded model to translate from Spanish to Wayuunaiki or vice versa, as in the sketch after this list.
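
The model's exact Hub identifier and expected task prefix are not stated in this card, so the following sketch of steps 3 and 4 uses placeholder values; substitute the real repository name before running it:

```python
# Sketch of steps 3-4. The Hub identifier and the "translate Spanish to
# Wayuunaiki:" prefix are placeholders/assumptions, not values from this card.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "your-username/t5-spanish-wayuunaiki"  # placeholder Hub ID
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id).to(device)

text = "translate Spanish to Wayuunaiki: Buenos días"
inputs = tokenizer(text, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```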

For faster inference and fine-tuning, a GPU is recommended; cloud GPUs from AWS, GCP, or Azure are a convenient option.

License

Use, distribution, and modification of the model are governed by the project's license; refer to the repository for the specific terms.
