🐝 GPTSwarm: LLM agents as (Optimizable) Graphs

🐝 GPTSwarm is a graph-based framework for LLM-based agents, providing two high-level features:

  • It lets you build LLM-based agents from graphs.
  • It enables the customized and automatic self-organization of agent swarms with self-improvement capabilities.

News

  • 🔥 [05/01] GPTSwarm has been accepted by ICML 2024.

  • 🔥 [03/01] GPTSwarm can now be installed via pip: pip install gptswarm

  • 🔥 [02/27] Our academic paper, Language Agents as Optimizable Graphs, is released.

Edge optimization example

Below is the edge optimization process, which updates edge probabilities to improve the benchmark score. Within an agent the edges are fixed, whereas the inter-agent connections are optimized toward either pruning (value 0, blue) or creation (value 1, red).

[Figure: edge optimization]
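
To make the idea concrete, here is a minimal toy sketch of probability-based edge optimization in plain Python. It is purely illustrative: benchmark_score and the REINFORCE-style update are hypothetical stand-ins, not the optimizer implemented in swarm.optimizer.

import random

def benchmark_score(edge_mask):
    # Hypothetical stand-in for evaluating the swarm on a benchmark
    # with a given set of active inter-agent edges.
    return sum(edge_mask) / len(edge_mask)

# One probability per candidate inter-agent edge; intra-agent edges stay fixed.
edge_probs = [0.5, 0.5, 0.5, 0.5]
learning_rate = 0.1

for step in range(200):
    # Sample a concrete swarm graph by switching each edge on or off.
    edge_mask = [1 if random.random() < p else 0 for p in edge_probs]
    score = benchmark_score(edge_mask)
    # REINFORCE-style update: nudge each probability toward the sampled value
    # in proportion to the achieved score, so edges drift toward 0 (pruned)
    # or 1 (kept) as the benchmark score improves.
    edge_probs = [
        min(1.0, max(0.0, p + learning_rate * score * (e - p)))
        for p, e in zip(edge_probs, edge_mask)
    ]

print(edge_probs)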

About GPTSwarm

Framework

At a granular level, GPTSwarm is a library that includes the following components:

  • swarm.environment: Domain-specific operations, agents, tools, and tasks
  • swarm.graph: Graph-related functions for creating and executing agent graphs and swarm composite graphs
  • swarm.llm: Interface for selecting LLM backends and calculating their operational costs
  • swarm.memory: Index-based memory
  • swarm.optimizer: Optimization algorithms designed to enhance agent performance and overall swarm efficiency

Quickstart

Clone the repo

git clone https://github.com/metauto-ai/GPTSwarm.git
cd GPTSwarm/

Install packages

conda create -n swarm python=3.10
conda activate swarm
pip install poetry
poetry install

Add your API keys to .env.template and rename the file to .env:

OPENAI_API_KEY="" # for OpenAI LLM backend
SEARCHAPI_API_KEY="" # for Web Search
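
To verify that the keys are picked up before running anything, a minimal sketch using the python-dotenv package (an assumption; GPTSwarm itself may load the file differently) looks like this:

import os
from dotenv import load_dotenv  # assumes the python-dotenv package is installed

load_dotenv()  # reads the .env file from the current directory
print("OPENAI_API_KEY set:", bool(os.getenv("OPENAI_API_KEY")))
print("SEARCHAPI_API_KEY set:", bool(os.getenv("SEARCHAPI_API_KEY")))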

Getting started with GPTSwarm is easy. Quickly run a predefined swarm:

from swarm.graph.swarm import Swarm

swarm = Swarm(["IO", "IO", "IO"], "gaia")  # three IO agents on the "gaia" domain
task = "What is the capital of Jordan?"
inputs = {"task": task}
answer = await swarm.arun(inputs)  # async execution; await requires an async context
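
The await call above works directly in a notebook or any other async context; from a plain Python script, one way to run it is to wrap the call with asyncio.run, for example:

import asyncio

from swarm.graph.swarm import Swarm

async def main():
    swarm = Swarm(["IO", "IO", "IO"], "gaia")
    inputs = {"task": "What is the capital of Jordan?"}
    answer = await swarm.arun(inputs)
    print(answer)

asyncio.run(main())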

You can also make use of tools, such as the file analyzer:

from swarm.graph.swarm import Swarm

swarm = Swarm(["IO", "TOT"], "gaia")  # an IO agent plus a Tree-of-Thoughts agent
task = "Tell me more about this image and summarize it in 3 sentences."
files = ["./datasets/demos/js.png"]
inputs = {"task": task, "files": files}
answer = swarm.run(inputs)  # synchronous entry point

Check out the minimal Swarm example in Colab.

See how to create a custom Agent and run a Swarm with it in Colab.

There is also a YouTube video on how to run the demo notebooks.

🔥🔥🔥 See our experiments for more advanced use of our framework.

Class diagram

[Figure: class diagram]

Example of the Swarm

[Figure: example of a swarm]

More Visualizations

[Figure: additional visualizations]

Running with a local LLM

We support local LM inference via LM Studio. Download their desktop app for Mac or Windows, choose a model from the Hugging Face repository, and start the server. Use model_name='lmstudio' in GPTSwarm code to run with the local LLM.
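
As a minimal sketch, assuming the Swarm constructor accepts the model_name keyword mentioned above and LM Studio's local server is already running:

from swarm.graph.swarm import Swarm

# Assumes LM Studio's local server is running and that model_name selects
# the LLM backend, as described above.
swarm = Swarm(["IO", "IO"], "gaia", model_name="lmstudio")
answer = swarm.run({"task": "What is the capital of Jordan?"})
print(answer)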

[Figure: running with a local LLM]

Contributors

Please read our developer document if you are interested in contributing.

Citation

Please cite our paper if you find the library useful or interesting.

@article{zhuge2024language,
  title={Language Agents as Optimizable Graphs},
  author={Zhuge, Mingchen and Wang, Wenyi and Kirsch, Louis and Faccio, Francesco and Khizbullin, Dmitrii and Schmidhuber, J{\"u}rgen},
  journal={arXiv preprint arXiv:2402.16823},
  year={2024}
}