# xlm-roberta-large-qa-multilingual-finedtuned-ru

**Repository Path**: modelee/xlm-roberta-large-qa-multilingual-finedtuned-ru

## Basic Information

- **Project Name**: xlm-roberta-large-qa-multilingual-finedtuned-ru
- **Description**: Pretrained model using a masked language modeling (MLM) objective.
- **Primary Language**: Unknown
- **License**: Not specified
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2023-05-23
- **Last Updated**: 2024-04-02

## Categories & Tags

**Categories**: llm

**Tags**: None

## README

---
language:
- en
- ru
- multilingual
license: apache-2.0
---

# XLM-RoBERTa large model whole word masking finetuned on SQuAD

Pretrained model using a masked language modeling (MLM) objective, fine-tuned on English and Russian QA datasets.

## Used QA Datasets

SQuAD + SberQuAD

The [SberQuAD original paper](https://arxiv.org/pdf/1912.09723.pdf) is here. Recommended reading!

## Evaluation results

The results obtained on SberQuAD are the following:

```
f1 = 84.3
exact_match = 65.3
```
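A minimal usage sketch for extractive QA with this checkpoint via the Hugging Face `transformers` pipeline. The model identifier below is assumed from the repository path and may differ on the Hub; the question/context pair is an illustrative example, not from the model card.

```python
# Sketch: extractive question answering with the transformers pipeline.
# NOTE: the model id is inferred from the repository path (assumption).
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="modelee/xlm-roberta-large-qa-multilingual-finedtuned-ru",
)

# The pipeline returns a dict with "answer", "score", "start", and "end".
result = qa(
    question="Where is the Hermitage located?",
    context="The State Hermitage Museum is located in Saint Petersburg, Russia.",
)
print(result["answer"])
```

Because the model was fine-tuned on both SQuAD (English) and SberQuAD (Russian), the same pipeline can be called with Russian questions and contexts without any configuration change.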
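For reference, the `f1` and `exact_match` numbers above follow the SQuAD evaluation convention. A minimal sketch of those two metrics (the official script additionally strips punctuation and articles during normalization, which is omitted here):

```python
# Sketch of SQuAD-style metrics: exact_match is string identity after
# light normalization; f1 is the harmonic mean of token-level precision
# and recall between the predicted and reference answer spans.
from collections import Counter

def normalize(text: str) -> str:
    # Lowercase and collapse whitespace (official script also removes
    # punctuation and English articles).
    return " ".join(text.lower().split())

def exact_match(prediction: str, truth: str) -> float:
    return float(normalize(prediction) == normalize(truth))

def f1_score(prediction: str, truth: str) -> float:
    pred_tokens = normalize(prediction).split()
    truth_tokens = normalize(truth).split()
    common = Counter(pred_tokens) & Counter(truth_tokens)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(truth_tokens)
    return 2 * precision * recall / (precision + recall)

print(exact_match("Saint Petersburg", "saint petersburg"))          # 1.0
print(round(f1_score("in Saint Petersburg", "Saint Petersburg"), 2))  # 0.8
```

Corpus-level scores are the average of these per-example values over the dataset (taking the max over multiple reference answers where provided).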