ModuleFormer Resources

ModuleFormer is a mixture-of-experts (MoE) architecture that combines two types of experts: stick-breaking attention heads and feedforward experts. We released a collection of ModuleFormer-based Language Models (MoLM) ranging in scale from 4 billion to 8 billion parameters.
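
Below is a minimal sketch of how a released MoLM checkpoint might be loaded and sampled with the Hugging Face transformers library. The model identifier and the need for trust_remote_code (to pull in the custom ModuleFormer modeling code) are assumptions about how the checkpoints are packaged, not confirmed by this page.

```python
# Minimal sketch: load and sample from a MoLM checkpoint via transformers.
# Assumptions: the checkpoint name "ibm/MoLM-350M-4B" is hypothetical, and the
# custom ModuleFormer architecture is shipped as remote code on the Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm/MoLM-350M-4B"  # hypothetical model ID for illustration
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Generate a short continuation from a prompt.
inputs = tokenizer("ModuleFormer is a mixture-of-experts architecture that", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```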

No resources for this project.
