Basic UI For GPT-J-6B With Low VRAM

A repository to run GPT-J-6B on low-VRAM machines (4.2 GB minimum VRAM for a 2000-token context, 3.5 GB for a 1000-token context). Loading the model requires 12 GB of free RAM.
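As a rough illustration of how a 6B-parameter model can fit in a few gigabytes of VRAM, the sketch below loads GPT-J-6B in fp16 with most weights kept in CPU RAM and only a GPU-sized slice resident on the card, using Hugging Face Transformers with Accelerate offloading. This is a generic low-VRAM loading approach, not necessarily this repository's exact method; the memory limits passed to `max_memory` are illustrative assumptions based on the figures quoted above.

```python
# Minimal sketch (assumption: generic Transformers + Accelerate offloading,
# not necessarily this repo's own layer-management code).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-j-6B"

tokenizer = AutoTokenizer.from_pretrained(model_name)

# fp16 halves the ~24 GB fp32 footprint to roughly 12 GB, matching the
# "12 GB free RAM" figure above. max_memory caps what is placed on GPU 0;
# layers that do not fit are kept in CPU RAM and streamed in as needed.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    device_map="auto",
    max_memory={0: "3.5GiB", "cpu": "12GiB"},  # illustrative limits
)

prompt = "GPT-J-6B can run on a low-VRAM GPU by"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

With this kind of setup, a longer context mainly grows the activation and KV-cache memory on the GPU, which is why the VRAM requirement above rises from 3.5 GB at 1000 tokens to 4.2 GB at 2000 tokens.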
