LINE DistilBERT Japanese

DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher model is a BERT-base model built in-house at LINE.
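As a quick illustration, here is a minimal sketch of loading a distilled Japanese checkpoint with Hugging Face Transformers and extracting a sentence embedding. The model ID line-corporation/line-distilbert-base-japanese and the trust_remote_code flag are assumptions about how the weights are published, not details confirmed by this listing.

```python
# Minimal sketch: encode a Japanese sentence with the distilled model.
# Assumption: the weights are on the Hugging Face Hub under
# "line-corporation/line-distilbert-base-japanese"; adjust the ID if not.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "line-corporation/line-distilbert-base-japanese"  # assumed Hub ID

# trust_remote_code=True is assumed because the model may ship a custom
# Japanese tokenizer; drop it if the stock tokenizer loads cleanly.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_ID)

inputs = tokenizer("今日はいい天気ですね。", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Use the [CLS] token's hidden state as a simple sentence embedding.
sentence_embedding = outputs.last_hidden_state[:, 0, :]
print(sentence_embedding.shape)  # e.g. torch.Size([1, 768])
```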
