LINE DistilBERT Japanese Resources

DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher model is a BERT-base model built in-house at LINE.
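
For reference, a minimal sketch of loading the model with Hugging Face Transformers. The Hub checkpoint identifier `line-corporation/line-distilbert-base-japanese` and the `trust_remote_code` flag are assumptions based on common usage, not confirmed by this page:

```python
# Minimal sketch: load LINE's DistilBERT Japanese model via Hugging Face
# Transformers. The checkpoint name and trust_remote_code flag are assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "line-corporation/line-distilbert-base-japanese"  # assumed Hub ID

# trust_remote_code is assumed to be required for the custom Japanese tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModel.from_pretrained(model_name)

text = "こんにちは、世界。"  # "Hello, world."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Last hidden states: (batch, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```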
