Install FlawFerret AI Server

Repo: flawferretAIserver

https://github.com/rgmichaels/flawferretAIserver

API Server docs: https://ai.flawferret.com/docs

Local Node run
  1. Clone/download https://github.com/rgmichaels/flawferretAIserver.

  2. Install:

    npm install

  3. Set environment variables:

    • OPENAI_API_KEY (required for OpenAI mode)

    • OPENAI_MODEL (optional; default: gpt-4o-mini)

    • PORT (optional; default: 8787)

  4. Start:

    npm run start

  5. The server runs at http://localhost:8787 (or the PORT you set).
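The steps above can be sketched as a single shell session. The key value is a placeholder, and the actual npm commands are left commented since they must run inside the cloned repo; the variable names match the env vars listed above.

```shell
#!/bin/sh
# Sketch of the local Node run: set env vars, then install and start.
export OPENAI_API_KEY="sk-your-key-here"            # placeholder, required for OpenAI mode
export OPENAI_MODEL="${OPENAI_MODEL:-gpt-4o-mini}"  # optional; falls back to the default
export PORT="${PORT:-8787}"                         # optional; falls back to the default

echo "FlawFerret AI Server will listen on http://localhost:${PORT}"

# Run these from the cloned flawferretAIserver directory:
#   npm install
#   npm run start
```

With no overrides set, the echo line reports port 8787, matching step 5.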

Docker run
  1. Set env vars (OPENAI_API_KEY, optional OPENAI_MODEL, optional PORT).

  2. Start:

    docker compose up --build
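One way to pass the env vars to Docker, as a minimal sketch: Compose automatically reads a .env file in the project directory, so the variables from step 1 can live there. The key is a placeholder; only OPENAI_API_KEY is required.

```shell
#!/bin/sh
# Sketch: write the env vars to a .env file that docker compose picks up.
cat > .env <<'EOF'
OPENAI_API_KEY=sk-your-key-here
OPENAI_MODEL=gpt-4o-mini
PORT=8787
EOF

# Then, with Docker installed, from the repo directory:
#   docker compose up --build
```

Exporting the variables in your shell before running docker compose works as well.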

Connect Extension to AI Server

In FlawFerret extension options:

  1. Set AI Server URL

  2. Choose AI Provider

    • openai (uses server-side OpenAI key), or

    • ollama (uses your Ollama URL, usually http://localhost:11434); the

      codellama:7b model is required, so fetch it first with 'ollama pull codellama:7b'

  3. Set the model as needed.

  4. Use Test AI Connection to verify the setup.
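If Test AI Connection fails, the same checks can be run by hand. This is a sketch using the default URLs from the steps above; the curl and ollama commands are commented out since they need the respective servers running.

```shell
#!/bin/sh
# Manual connectivity checks mirroring "Test AI Connection".
AI_SERVER_URL="${AI_SERVER_URL:-http://localhost:8787}"   # default from the Node/Docker setup
OLLAMA_URL="${OLLAMA_URL:-http://localhost:11434}"        # default Ollama URL

echo "AI server: ${AI_SERVER_URL}"
echo "Ollama:    ${OLLAMA_URL}"

# With the servers running:
#   curl -s "${AI_SERVER_URL}"              # is the AI server reachable?
#   curl -s "${OLLAMA_URL}/api/tags"        # is Ollama up? lists pulled models
#   ollama pull codellama:7b                # required model for ollama mode
```

If the Ollama check does not list codellama:7b, pull it before retrying the extension's test.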