AraElectra-Arabic-SQuADv2-QA

ZeyadAhmed

Introduction

The AraElectra-Arabic-SQuADv2-QA model performs extractive question answering in Arabic. It is based on the AraElectra language model fine-tuned on the Arabic-SQuADv2.0 dataset, and it handles both answerable and unanswerable questions: a companion AraElectra classifier predicts whether a question is unanswerable before an answer span is returned.

Architecture

  • Language Model: AraElectra
  • Language: Arabic
  • Task: Extractive Question Answering
  • Training Data: Arabic-SQuADv2.0
  • Evaluation Data: Arabic-SQuADv2.0
  • Test Data: Arabic-SQuADv2.0
  • Infrastructure: 1x Tesla K80 GPU for training

Training

  • Batch Size: 8
  • Number of Epochs: 4
  • Learning Rate: 3e-5
  • Optimizer: AdamW
  • Base Model: AraElectra
  • Padding: Dynamic
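The hyperparameters above might map onto a Hugging Face Trainer configuration like the following sketch. This is a reconstruction for illustration, not the authors' actual training script; the output directory is a placeholder.

```python
# Sketch of the fine-tuning configuration implied by the list above.
# output_dir is a placeholder; dataset and model wiring are omitted.
from transformers import TrainingArguments, DataCollatorWithPadding

args = TrainingArguments(
    output_dir="araelectra-squadv2-qa",  # placeholder path
    per_device_train_batch_size=8,       # Batch Size: 8
    num_train_epochs=4,                  # Number of Epochs: 4
    learning_rate=3e-5,                  # Learning Rate: 3e-5
    optim="adamw_torch",                 # Optimizer: AdamW
)

# "Dynamic" padding means each batch is padded to its own longest
# sequence rather than to a fixed maximum length, e.g. via:
# collator = DataCollatorWithPadding(tokenizer)
```

Dynamic padding keeps batches small when most sequences are short, which matters on a memory-constrained GPU such as the K80 used here.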

Guide: Running Locally

  1. Prerequisites: Install the transformers library and the arabert package, which provides the AraBert text preprocessor.
  2. Load Preprocessor and Models:
    • Use the AraBert preprocessor for text preprocessing.
    • Load the ElectraForQuestionAnswering and ElectraForSequenceClassification models from Hugging Face.
  3. Inference:
    • Set up a pipeline for question-answering.
    • Input your question and context in Arabic.
    • Use the classifier to determine if the question is answerable.
  4. Recommended Setup:
    • For optimal performance, consider using cloud GPUs such as AWS EC2 instances with Tesla K80 or equivalent.
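Steps 1–3 might be wired together as in the sketch below. The companion classifier's Hub ID, its label names ("answerable"/"unanswerable"), and the "[SEP]"-joined input format are assumptions made for illustration, not verified values from this card.

```python
# Sketch of local inference for the answerable/unanswerable QA flow.
# Model IDs and classifier label names below are assumptions.
from typing import Optional

QA_MODEL = "ZeyadAhmed/AraElectra-Arabic-SQuADv2-QA"
CLS_MODEL = "ZeyadAhmed/AraElectra-Arabic-SQuADv2-CLS"  # assumed classifier ID

def pick_answer(cls_label: str, qa_result: dict) -> Optional[str]:
    """Return the extracted span, or None when the classifier
    marks the question as unanswerable."""
    if cls_label.lower() == "unanswerable":
        return None
    return qa_result["answer"]

def answer(question: str, context: str) -> Optional[str]:
    # Imports are local so pick_answer can be used without
    # transformers/arabert installed.
    from arabert.preprocess import ArabertPreprocessor
    from transformers import pipeline

    # Apply the AraBert preprocessor to both inputs (step 2).
    prep = ArabertPreprocessor(
        model_name="aubmindlab/araelectra-base-discriminator")
    question = prep.preprocess(question)
    context = prep.preprocess(context)

    # Classifier gates the QA model (step 3); input format is assumed.
    clf = pipeline("text-classification", model=CLS_MODEL)
    qa = pipeline("question-answering", model=QA_MODEL)
    label = clf(question + " [SEP] " + context)[0]["label"]
    return pick_answer(label, qa(question=question, context=context))
```

Gating the extractive model with the classifier prevents it from returning a spurious span for questions the context cannot answer.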

License

The model and related components are available under the MIT License, which allows for free use, modification, and distribution.
