Install FlawFerret AI Server
Repo: flawferretAIserver
https://github.com/rgmichaels/flawferretAIserver
API Server docs: https://ai.flawferret.com/docs
Local Node run
Clone/download https://github.com/rgmichaels/flawferretAIserver.
Install dependencies:
npm install
Set environment variables:
OPENAI_API_KEY (required for OpenAI mode)
optional OPENAI_MODEL (default: gpt-4o-mini)
optional PORT (default: 8787)
Start:
npm run start
The server runs at http://localhost:8787 (or at the PORT you set).
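The env vars above can be set inline before npm run start. How the defaults are assumed to resolve (OPENAI_MODEL and PORT fall back; OPENAI_API_KEY has no default and must be supplied for OpenAI mode):

```shell
# Sketch of the assumed env-var defaulting; values mirror the docs above.
MODEL="${OPENAI_MODEL:-gpt-4o-mini}"   # optional, falls back to gpt-4o-mini
PORT="${PORT:-8787}"                   # optional, falls back to 8787
echo "model=$MODEL port=$PORT"
```

With neither variable set, this prints model=gpt-4o-mini port=8787.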
Docker run
Set env vars (OPENAI_API_KEY, optional OPENAI_MODEL, optional PORT).
Start:
docker compose up --build
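docker compose can read these variables from the shell or from a .env file next to the compose file; a minimal sketch (the key value is a placeholder, and the optional lines can be dropped to use the defaults):

```shell
# .env — consumed by docker compose; placeholder values
OPENAI_API_KEY=sk-your-key-here
# optional overrides (defaults per the docs above)
OPENAI_MODEL=gpt-4o-mini
PORT=8787
```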
Connect Extension to AI Server
In FlawFerret extension options:
Set AI Server URL
local default: http://localhost:8787
Choose AI Provider
openai (uses server-side OpenAI key), or
ollama (uses your Ollama URL, usually http://localhost:11434; the codellama:7b model is required, so pull it first with 'ollama pull codellama:7b')
Set model as needed.
Use Test AI Connection to confirm the extension can reach the server.
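The connection settings above can also be sanity-checked from a shell. The URLs and model name below come from this doc; the probe commands are left commented out because they need the servers running:

```shell
# Defaults from the extension options above.
AI_SERVER_URL="${AI_SERVER_URL:-http://localhost:8787}"
OLLAMA_URL="${OLLAMA_URL:-http://localhost:11434}"
OLLAMA_MODEL="codellama:7b"

# Pull the required model for the ollama provider (needs Ollama installed):
# ollama pull "$OLLAMA_MODEL"

# Reachability probes (need the respective server running):
# curl -s -o /dev/null -w "%{http_code}\n" "$AI_SERVER_URL"
# curl -s "$OLLAMA_URL/api/tags"   # lists locally pulled Ollama models

echo "targets: $AI_SERVER_URL / $OLLAMA_URL ($OLLAMA_MODEL)"
```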
