AIhub 11:46 am on June 4, 2024
In an AIhub interview, Henok Biadglign Ademtew describes fine-tuning a distilled pre-trained model with 600 million parameters, achieving strong Geez translation scores. Future plans include expanding the dataset and digitizing religious texts.
- Fine-Tuning Model: Fine-tuned a distilled version of a pre-trained model, achieving high BLEU scores for the Amharic-to-Geez, Geez-to-Amharic, and Geez-to-English directions (a hedged fine-tuning sketch follows this list).
- Future Plans: Expand datasets and digitize religious texts from the Ethiopian Orthodox Tewahedo Church.
- Platform Development: Created a platform for translator submissions and review.
- Acknowledgments: AIhub's work is supported by donations, including from MBZUAI.
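The sketch below shows one way such a fine-tuning run could be set up with Hugging Face Transformers. The checkpoint name (facebook/nllb-200-distilled-600M is assumed only because it matches the "distilled, 600 million parameters" description), the language handling, and the toy sentence pairs are all assumptions for illustration; the summary does not specify the model, data format, or hyperparameters.

```python
# Minimal sketch: fine-tuning a distilled ~600M-parameter seq2seq model on an
# Amharic -> Geez parallel corpus. Checkpoint, language codes, data, and
# hyperparameters are assumptions, not details from the interview.
from datasets import Dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

CHECKPOINT = "facebook/nllb-200-distilled-600M"  # assumed distilled 600M model

# NLLB-200 has no Geez language code, so a real run would need to add or reuse
# a target-language token; this sketch glosses over that detail.
tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT, src_lang="amh_Ethi")
model = AutoModelForSeq2SeqLM.from_pretrained(CHECKPOINT)

# Toy parallel pairs standing in for the real Amharic-Geez dataset.
raw = Dataset.from_dict({
    "amh": ["ሰላም ለዓለም", "እንዴት ነህ?"],
    "gez": ["ሰላም ለኩሉ ዓለም", "እፎ ሀለውከ?"],
})

def preprocess(batch):
    # Tokenize the source side; `text_target` tokenizes the target as labels.
    return tokenizer(
        batch["amh"],
        text_target=batch["gez"],
        max_length=128,
        truncation=True,
    )

tokenized = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="amh-gez-finetune",
    per_device_train_batch_size=8,
    learning_rate=2e-5,
    num_train_epochs=3,
    predict_with_generate=True,
    logging_steps=10,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)

trainer.train()
```

In practice, evaluation of a setup like this would report BLEU on held-out pairs for each translation direction, which is how the results in the interview are summarized.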
https://aihub.org/2024/06/04/interview-with-henok-biadglign-ademtew-creating-an-amharic-geez-and-english-parallel-dataset/