
LLM Group Chat Framework: chat with multiple LLMs at the same time.


InternLM/OpenAOE


Latest Progress 🎉

  • [March 2024] Add gemma-7b and qwen-7b models (based on Ollama)
  • [February 2024] Add mistral-7b model (based on Ollama)
  • [February 2024] Add gemini-pro model (based on Open API)
  • [January 2024] Refactored config-template.yaml to control the backend and the frontend settings at the same time; see the Tip in the Quick Run section for more about config-template.yaml
  • [January 2024] Add internlm2-chat-7b model (based on LMDeploy)
  • [January 2024] Released version v0.0.1, officially open source!

Introduction

What is OpenAOE?

AOE, an acronym from DOTA2 for Area Of Effect, denotes an ability that can affect a group of targets within a certain area. Here, AOE in AI means that users can obtain parallel outputs from multiple LLMs with a single prompt, all at the same time.

What problem does OpenAOE want to solve?

Currently, there are many open-source chat frameworks built around ChatGPT, but an LGC (LLM Group Chat) framework has not yet emerged.

The emergence of OpenAOE fills this gap: OpenAOE helps LLM researchers, evaluators, engineering developers, and even non-professionals quickly access well-known commercial and open-source LLMs on the market, providing both a single-model serial response mode and a multi-model parallel response mode.

What can you get from OpenAOE?

OpenAOE can:

  1. return one or more LLMs' answers at the same time for a single prompt.
  2. provide access to commercial LLM APIs, with default support for gpt3.5, gpt4, Google Palm, Minimax, Claude, Spark, etc., and also support user-defined access to other large model APIs. (API keys need to be prepared in advance)
  3. provide access to open-source LLM APIs. (We recommend using LMDeploy to deploy with one click)
  4. provide backend APIs and a web UI to meet different requirements.
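The parallel response mode above amounts to fanning a single prompt out to several model backends concurrently. A minimal sketch in TypeScript, using hypothetical stand-in provider functions rather than OpenAOE's actual backend API:

```typescript
// Sketch of "one prompt, many models" fan-out. The Provider functions
// here are hypothetical stand-ins for real LLM API calls; OpenAOE's
// actual backend is implemented in Python.
type Provider = (prompt: string) => Promise<string>;

async function fanOut(
  prompt: string,
  providers: Record<string, Provider>,
): Promise<Record<string, string>> {
  const names = Object.keys(providers);
  // Fire all requests concurrently; a failing provider yields an
  // error string instead of rejecting the whole batch.
  const answers = await Promise.all(
    names.map((name) => providers[name](prompt).catch((e) => `error: ${e}`)),
  );
  return Object.fromEntries(names.map((name, i) => [name, answers[i]]));
}

// Demo with stand-in providers (real code would call actual LLM APIs):
const demo = await fanOut("Hello!", {
  "model-a": async (p) => `A says: ${p}`,
  "model-b": async (p) => `B says: ${p}`,
});
console.log(demo);
```

In serial response mode, the same providers would simply be awaited one after another instead of via Promise.all.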

Quick Run

Tip

Require Python >= 3.9

We provide three different ways to run OpenAOE: by pip, by docker, and from source code.

Run by pip

Install

pip install -U openaoe

Start

openaoe -f /path/to/your/config-template.yaml

Run by docker

Install

There are two ways to get the OpenAOE docker image:

  1. pull the OpenAOE docker image
docker pull opensealion/openaoe:latest
  2. or build a docker image
git clone https://github.com/InternLM/OpenAOE
cd OpenAOE
docker build . -f docker/Dockerfile -t opensealion/openaoe:latest

Start

docker run -p 10099:10099 -v /path/to/your/config-template.yaml:/app/config.yaml --name OpenAOE opensealion/openaoe:latest

Run by source code

Install

  1. clone this project
git clone https://github.com/InternLM/OpenAOE
  2. [optional] build the frontend project when the frontend codes are changed
cd OpenAOE/openaoe/frontend
npm install
npm run build

Start

cd OpenAOE  # this OpenAOE is the clone directory
pip install -r openaoe/backend/requirements.txt
python -m openaoe.main -f /path/to/your/config-template.yaml

Tip

/path/to/your/config-template.yaml is a configuration file loaded by OpenAOE at startup, which contains the relevant configuration information for the LLMs, including API URLs, AKSKs, tokens, etc. A template configuration yaml file can be found in openaoe/backend/config/config-template.yaml. Note that this config-template.yaml DOES NOT contain any API access data; you should add them yourself.
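For orientation only, a hypothetical fragment of what such a file might contain — the field names below are illustrative guesses, not OpenAOE's actual schema, so copy the real template from openaoe/backend/config/config-template.yaml instead of this sketch:

```yaml
# Illustrative only: field names are assumptions, not the real schema.
models:
  gpt-3.5-turbo:
    api_base: https://api.openai.com/v1
    api_key: sk-...        # fill in your own key; never commit it
```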

Tech Report

You are always welcome to fork this project to contribute your work and pick up the TODOs for future work.

If you want to add more LLMs' APIs or features based on OpenAOE, the following info might be helpful.

Tech Stack

The technology stack we use includes:

  1. Backend framework based on Python + FastAPI;
  2. Frontend framework based on TypeScript + Sealion-Client (a wrapper built on React) + Sealion-UI.
  3. Build tools:
    1. conda: quickly create a virtual Python env to install necessary packages
    2. npm: build the frontend project

Tip

The build tools can be installed quickly by pip install -U sealion-cli

Organization of the Repo

  • Frontend codes are in openaoe/frontend
  • Backend codes are in openaoe/backend
  • Project entry-point is openaoe/main.py

How to add a new model

Frontend

  • Add new model info like name, avatar, provider, etc. in openaoe/frontend/src/config/model-config.ts
  • Add a new model basic API request payload configuration in openaoe/frontend/src/config/api-config.ts
  • Modify your new model's payload specifically in openaoe/frontend/src/services/fetch.ts; you may need to change the payload structure and handle corner cases according to your model's API definition.
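As a concrete but hypothetical illustration, a new entry in model-config.ts might look roughly like the sketch below. The field names are assumptions; mirror the existing entries in that file rather than this sketch.

```typescript
// Hypothetical shape of a model entry; check the real entries in
// openaoe/frontend/src/config/model-config.ts for the actual fields.
interface ModelInfo {
  name: string;     // internal model id shown in the chat UI
  avatar: string;   // path to the model's icon asset
  provider: string; // which backend provider serves this model
}

const myNewModel: ModelInfo = {
  name: "my-llm-7b",
  avatar: "assets/my-llm.png",
  provider: "my-provider",
};
```

After registering the model here, the matching request payload goes in api-config.ts, and any provider-specific quirks are handled in fetch.ts as described above.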
