henryk/bert-base-multilingual-cased-finetuned-polish-squad1
Introduction
This document provides an overview of a BERT-based model fine-tuned for Polish question answering. The base model is Google's multilingual BERT, which has been adapted by fine-tuning on a Polish machine translation of the SQuAD1.1 dataset.
Architecture
The model architecture is based on bert-base-multilingual-cased, with 12 layers, a hidden size of 768, 12 attention heads, and roughly 110 million parameters. The base checkpoint was pretrained on cased text from the 104 languages with the largest Wikipedias.
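As a quick sanity check, these dimensions can be read directly from the base checkpoint's configuration (a minimal sketch; it assumes the transformers library is installed and the Hugging Face Hub is reachable):

from transformers import AutoConfig

# Fetch the configuration of the multilingual base checkpoint.
config = AutoConfig.from_pretrained("bert-base-multilingual-cased")

print(config.num_hidden_layers)    # 12 layers
print(config.hidden_size)          # hidden size of 768
print(config.num_attention_heads)  # 12 attention heads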
Training
The SQuAD1.1 dataset was machine-translated into Polish using the mtranslate module. To recover each answer's start position, the translated answer was searched for in the corresponding translated paragraph; because an answer translated without its surrounding context can differ from the paragraph wording, pairs whose answer could not be located were dropped. The model was trained on a Tesla V100 GPU for 2 epochs with a maximum sequence length of 384 and a document stride of 128. It achieved an Exact Match (EM) score of 60.67 and an F1 score of 71.89.
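The translation-and-alignment step can be sketched as follows. This is a minimal illustration, not the author's actual preprocessing script; it assumes the mtranslate package's translate() helper and uses a single hypothetical SQuAD-style pair:

from mtranslate import translate

# One SQuAD-style pair (hypothetical example for illustration).
context_en = "Warsaw is the largest city in Poland."
answer_en = "Warsaw"

# Translate the paragraph and the answer independently.
context_pl = translate(context_en, "pl", "en")
answer_pl = translate(answer_en, "pl", "en")

# Recover the answer start position by searching the translated paragraph.
start = context_pl.find(answer_pl)
if start == -1:
    # Translated without its context, the answer may not match the
    # paragraph wording; such question-answer pairs were dropped.
    print("dropped: answer not found in translated context")
else:
    print({"text": answer_pl, "answer_start": start})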
Guide: Running Locally
To run the model locally, follow these steps:
- Install the Transformers library: Ensure the Hugging Face Transformers library is installed (for example, pip install transformers).
- Initialize the pipeline: Use the pipeline function from the Transformers library to build a question-answering pipeline with the model:

from transformers import pipeline

qa_pipeline = pipeline(
    "question-answering",
    model="henryk/bert-base-multilingual-cased-finetuned-polish-squad1",
    tokenizer="henryk/bert-base-multilingual-cased-finetuned-polish-squad1"
)

result = qa_pipeline({
    # "Warsaw is the largest city in Poland in terms of population and area"
    'context': "Warszawa jest największym miastem w Polsce pod względem liczby ludności i powierzchni",
    # "What is the largest city in Poland?"
    'question': "Jakie jest największe miasto w Polsce?"
})

- Output: The model returns the answer span together with a confidence score, as shown in the sketch below.
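For reference, the pipeline returns a dictionary containing the answer text, its character offsets, and a score. On the example above, the output has roughly the following shape (the score shown is illustrative, not reproduced from a run):

print(result)
# {'score': 0.99, 'start': 0, 'end': 8, 'answer': 'Warszawa'}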
For faster inference or fine-tuning, cloud GPUs from providers such as AWS, Google Cloud, or Azure can be used.
License
The model and its usage adhere to the licensing agreements as specified by the Hugging Face platform and the respective model creators. Please review the specific license details on the Hugging Face model page.