Using SmartGPT.
Prerequisites: Rust and Cargo
Clone the Repository.
git clone https://github.com/Cormanz/smartgpt.git
Alternatively, create a GitHub Codespace and run it there.
Run it for the first time with cargo run --release, which will auto-generate a config.yml.
Fill in and optionally modify your config.yml, then run it again.
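For reference, a filled-in LLM section of config.yml might look roughly like the sketch below. The provider key (chatgpt) and field names (api key, model) are illustrative assumptions; match whatever keys appear in your auto-generated file.

llm:
    chatgpt:
        api key: YOUR_OPENAI_API_KEY  # assumption: replace with the key name your generated config uses
        model: gpt-3.5-turbo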
You can also run SmartGPT with Docker. Install Docker Compose, preferably the latest stable version.
Clone the repository with git clone https://github.com/Cormanz/smartgpt.git && cd smartgpt.
Build the Docker image with docker compose build.
Run it in release mode with docker compose run --rm smartgpt. This will create a config.yml for you.
Adjust the config to your liking, and execute it once again.
SmartGPT has experimental support for local models through our llm-rs integration. Anywhere you see llm:, you can try swapping in a local model:
local:
    type: llama # llama / bloom / gpt2 / gptj / neox
    model path: PATH
    context tokens: 2048
    mmap: true # optional
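For example, assuming the generated config nests providers under llm: (as suggested above), swapping in a local LLaMA-style model might look like the sketch below; the model path is just a placeholder.

llm:
    local:
        type: llama
        model path: ./models/your-model.bin # placeholder: point this at your downloaded model file
        context tokens: 2048
        mmap: true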