Deploy your own GPT-3 API endpoint with ease using Docker and Fly!
OpenAI does NOT like this, and it WILL get patched (likely soon). I wrote this for hobby projects and as proof of work.
As long as you have heard the big red alarm bells, you may continue.
If you happen to be in the market for an engineer, please peep my resume. 💖
This repo is built with waylaidwanderer's awesome node-chatgpt-api wrapper and compiles into a 100 MB production-ready Docker image.
These instructions will give you a copy of the project up and running on your local machine for development and testing purposes. See deployment for notes on deploying the project on a live system.
Clone the repo
git clone https://github.com/queercat/gpt-api-docker
cd gpt-api-docker
Build the dockerfile
docker build -t gpt-api .
Run it!
docker run -p 3000:3000 -e GPT_API_KEY={YOUR_GPT_API_KEY} -t gpt-api
Enjoy!
For example, to deploy with Fly.io you can just use fly launch and edit the fly.toml so that the ports section reflects the server config (default: 3000).
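A sketch of what the relevant section of the generated fly.toml might look like (exact keys depend on your flyctl version; 3000 matches the server default mentioned above):

```toml
[[services]]
  internal_port = 3000   # must match the port the server listens on
  protocol = "tcp"

  [[services.ports]]
    handlers = ["tls", "http"]  # SSL termination at the Fly edge
    port = 443
```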
Clone everything.
git clone https://github.com/queercat/gpt-api-docker
cd gpt-api-docker
Set up Fly.
fly launch
vim fly.toml
To deploy make sure to set your secret with
fly secrets set GPT_API_KEY={YOUR_API_KEY_HERE}
fly deploy
If you run into issues, make sure SSL is enabled and your ports are correct.
Assuming you've set everything up correctly, send a POST request to your hostname at the path /conversation (e.g. localhost:3000/conversation).
Here is an example query.
curl {hostname:port}/conversation -X POST -H "Content-Type: application/json" -d "{\"message\": \"what is the square root of 100?\"}"
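The same request can be made from code. Here's a minimal Python sketch (the ask helper and the response shape are assumptions; check the wrapper repo for the exact schema the endpoint returns):

```python
import json
import urllib.request


def ask(base_url: str, message: str) -> dict:
    """POST a message to the /conversation endpoint and return the parsed JSON reply."""
    payload = json.dumps({"message": message}).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/conversation",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))


# Usage (point base_url at wherever you deployed):
# reply = ask("http://localhost:3000", "what is the square root of 100?")
```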
If you want more information, check out the wrapper repo.
Feel free to open a pull request or issue and I'll try to get to it as quickly as I can.
This project is licensed under the MIT License. Do with it what you want!