ModuleFormer

ModuleFormer is a Mixture-of-Experts (MoE) architecture that combines two types of experts: stick-breaking attention heads and feedforward experts. We released MoLM, a collection of ModuleFormer-based language models ranging in scale from 4 billion to 8 billion parameters.
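Since ModuleFormer is a custom architecture, loading a MoLM checkpoint typically requires executing the model code shipped with it. Below is a minimal sketch using Hugging Face transformers; the checkpoint name `ibm/MoLM-350M-4B` and the use of `trust_remote_code=True` are assumptions about how the release is published, not confirmed by this page.

```python
# Hypothetical sketch: load a MoLM checkpoint and generate text.
# The model ID below is an assumed example; adjust to the actual release.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "ibm/MoLM-350M-4B"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # half precision to reduce memory use
    trust_remote_code=True,      # ModuleFormer is a custom architecture
)

prompt = "Mixture-of-Experts models are"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```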
