README.md

@@ -1,21 +1,11 @@

# UniFi UDM Pro SE Security Copilot

LAN-only TypeScript monorepo for:

- realtime UDM Pro SE log/event visibility
- policy-based security posture auditing
- GPT-5.3-Codex-gated recommendations
- queued low-risk remediation with manual apply, backup, verify, rollback

Docker-only security copilot for UniFi UDM Pro SE:

- realtime logs/events
- config posture checks
- AI recommendations and remediation queue

## Monorepo layout

- `apps/api` - Fastify HTTP + WebSocket API
- `apps/worker` - background posture/recommendation/execution loops
- `apps/web` - React dashboard
- `packages/contracts` - shared zod schemas + types

## Beginner install (Ubuntu Server 24.04)

This is the easiest and recommended path for a new setup.

### 1) Install Docker

## 1) Install Docker (Ubuntu 24.04)

```bash
sudo apt update
sudo apt install -y ca-certificates curl git
@@ -36,25 +26,19 @@ sudo apt update
sudo apt install -y docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
```

Optional (use Docker without `sudo`):

Optional (run Docker without sudo):

```bash
sudo usermod -aG docker $USER
newgrp docker
```

Verify:

```bash
docker --version
docker compose version
```

### 2) Get the project

## 2) Get the project

```bash
git clone <YOUR_REPO_URL> unfiAgent
cd unfiAgent
```

### 3) Create app config + encryption key

## 3) Create env + encryption key

```bash
cp .env.example .env
mkdir -p secrets
@@ -62,53 +46,37 @@ openssl rand -base64 32 > secrets/unfi_encryption_key
chmod 600 secrets/unfi_encryption_key
```

### 4) Pick your LLM mode in `.env`

For your setup (OpenWebUI + Ollama), use:

```env
LLM_PROVIDER=local_ollama
OPENAI_MODEL=qwen3-coder-next
LOCAL_LLM_BASE_URL=http://YOUR_OPENWEBUI_HOST:8080
LOCAL_LLM_MODELS_PATH=/ollama/api/tags
LOCAL_LLM_CHAT_PATH=/ollama/api/chat
LOCAL_LLM_API_KEY=
```

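Before starting the stack, it is worth confirming that the endpoint in `.env` actually answers. A minimal check, assuming the host, port, and `/ollama/api/tags` path from the example above (adjust them to your OpenWebUI install):

```bash
# Should return JSON listing the models OpenWebUI exposes from Ollama.
# YOUR_OPENWEBUI_HOST is a placeholder; use the same value as in .env.
curl -s "http://YOUR_OPENWEBUI_HOST:8080/ollama/api/tags"
```

If the model named in `OPENAI_MODEL` does not appear in that list, pull it in Ollama first.
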
### 5) Start the app

## 4) Start the app

```bash
docker compose up --build -d
docker compose ps
docker compose logs -f api
```

### 6) Open UI and bootstrap account

- Open `http://<SERVER_IP>:5173`
- Create the first owner account via **Bootstrap owner**
- Add the TOTP URI to your authenticator app
- Sign in

## 5) Open the app

- UI: `http://<SERVER_IP>:5173`
- First login: use **Bootstrap owner** once, then sign in with MFA.

## If commands are missing (`npm` / `codex` not found)

You only need this if you want `LLM_PROVIDER=codex_oauth` or a local Node runtime.

## 6) Configure AI provider in the app (interactive)

In the UI:

1. Open **AI Setup**.
2. Pick a provider (`Local Ollama / OpenWebUI`, `OpenAI API`, or `Codex OAuth`).
3. Enter the model and provider connection settings.
4. Save and confirm the provider status turns `ready`.

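If you prefer scripting this over clicking through the UI, the same settings can be written through the owner-only `PUT /api/v1/llm/settings` endpoint added in this commit. A hedged sketch: the API port and the `Authorization: Bearer` header format are assumptions about your deployment, while the field names come from `llmSettingsUpdateSchema` (every field is optional, so send only what you change):

```bash
# $TOKEN is a placeholder for a session token obtained after signing in.
curl -s -X PUT "http://<SERVER_IP>:<API_PORT>/api/v1/llm/settings" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
        "provider": "local_ollama",
        "model": "qwen3-coder-next",
        "localBaseUrl": "http://YOUR_OPENWEBUI_HOST:8080",
        "localModelsPath": "/ollama/api/tags",
        "localChatPath": "/ollama/api/chat"
      }'
```

The response contains the saved `settings` plus a fresh `dependency` check, so you can confirm the provider is `ready` without opening the UI.
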
Install Node.js 22:

```bash
curl -fsSL https://deb.nodesource.com/setup_22.x | sudo -E bash -
sudo apt install -y nodejs
node -v
npm -v
```

You do not need to edit provider settings in `.env` after this.

Install the Codex CLI and log in:

```bash
sudo npm install -g @openai/codex
codex --version
codex login
codex login status
```

Expected output includes: `Logged in using ChatGPT`.

## Recommended for your setup (OpenWebUI + Ollama)

In **AI Setup**, use:

- Provider: `Local Ollama / OpenWebUI`
- Model: `qwen3-coder-next`
- Base URL: `http://YOUR_OPENWEBUI_HOST:8080`
- Models Path: `/ollama/api/tags`
- Chat Path: `/ollama/api/chat`
- API key: only if your OpenWebUI endpoint requires it

## Common first-run errors (and fixes)

### Error: `permission denied while trying to connect to the Docker daemon socket`

Cause: your Linux user is not allowed to use Docker yet.

## If Docker says permission denied

Error example: `permission denied while trying to connect to the Docker daemon socket`

Fix:

```bash
@@ -116,84 +84,5 @@ sudo usermod -aG docker $USER
newgrp docker
docker compose up --build -d
```

If it still fails, log out and log back in, then retry.

### Error during API image build: `Cannot find module /workspace/apps/api/scripts/copy-sql.mjs`

Cause: an older repo revision is missing the script-copy step in `apps/api/Dockerfile`.

Fix:

```bash
git pull
docker compose build --no-cache api
docker compose up -d
```

### `codex login` opens a URL and then returns `Not logged in`

Cause: the login flow was interrupted (for example with `Ctrl+C`) or was not completed in the browser.

Fix:

1. Run `codex login` and complete the browser flow fully.
2. Verify with `codex login status`.
3. Only use this if `LLM_PROVIDER=codex_oauth`. For `local_ollama`, Codex login is not required.

## Local Node runtime (optional, advanced)

Use this if you do not want Docker:

```bash
npm install
npm run dev
```

## LLM modes

- `LLM_PROVIDER=codex_oauth` (default): uses the local `codex` CLI and your ChatGPT login session, not API-key billing.
- `LLM_PROVIDER=openai_api`: uses `OPENAI_API_KEY` and metered `/v1/chat/completions` API calls.
- `LLM_PROVIDER=local_ollama`: uses local or OpenWebUI-served Ollama-style endpoints for open-source models.

Environment knobs:

- `OPENAI_MODEL=gpt-5.3-codex`
- `CODEX_CLI_PATH=codex`
- `OPENAI_API_KEY` - only required in `openai_api` mode
- `LOCAL_LLM_BASE_URL` - default `http://localhost:11434`
- `LOCAL_LLM_MODELS_PATH` - default `/api/tags`
- `LOCAL_LLM_CHAT_PATH` - default `/api/chat`
- `LOCAL_LLM_API_KEY` - optional bearer token for protected local gateways

Important:

- In `codex_oauth` mode, the API process must be able to execute `codex` and access your Codex login session.
- If you run via Docker, either install/mount the Codex CLI and its credentials into the API container, or run the API locally.

OpenWebUI + Ollama example:

- Set `LLM_PROVIDER=local_ollama`
- Set `OPENAI_MODEL=qwen3-coder-next`
- Set `LOCAL_LLM_BASE_URL=http://YOUR_OPENWEBUI_HOST:8080`
- Set `LOCAL_LLM_MODELS_PATH=/ollama/api/tags`
- Set `LOCAL_LLM_CHAT_PATH=/ollama/api/chat`
- If your OpenWebUI endpoint is protected, set `LOCAL_LLM_API_KEY` (a quick check is sketched below)
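
To verify that a protected endpoint accepts the token before the app relies on it, you can replay the same style of request by hand. The bearer header form is implied by the knob description above; the host and path are the example values, so adjust them:

```bash
# Replays the models request the app would make, with the optional bearer token attached.
curl -s -H "Authorization: Bearer $LOCAL_LLM_API_KEY" \
  "http://YOUR_OPENWEBUI_HOST:8080/ollama/api/tags"
```
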
## First-run auth flow

1. Open the web UI at `http://localhost:5173`.
2. Use **Bootstrap owner** once to create the first local account and get the TOTP URI (an equivalent API call is sketched after this list).
3. Add the TOTP URI to your authenticator app.
4. Sign in with username/password/TOTP code.
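
The bootstrap step can also be done directly against the API. A hedged sketch: the API port is an assumption about your deployment, the request body matches `bootstrapSchema` (username and password), and the response shape follows the web client's `api.bootstrap` typing:

```bash
# Creates the first owner account. The JSON response includes an
# otpProvisioningUri to load into your authenticator app.
curl -s -X POST "http://<SERVER_IP>:<API_PORT>/api/v1/auth/bootstrap" \
  -H "Content-Type: application/json" \
  -d '{"username": "owner", "password": "a-long-unique-passphrase"}'
```

Bootstrap only works once; after the first owner exists, use the normal sign-in flow.
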
## Security defaults

- AI/remediation features are blocked if the configured model/provider is unavailable.
- Queue execution is manual-trigger only (`POST /api/v1/remediation/queue/apply`).
- Low-risk, non-disruptive actions only.
- Mandatory backup and rollback attempt for each execution.
- Burn-in gate of 7 days before execution enablement.

## API endpoints

- `POST /api/v1/udm/connect`
- `GET /api/v1/health/dependencies`
- `GET /api/v1/logs/realtime` (WebSocket upgrade)
- `GET /api/v1/security/posture`
- `GET /api/v1/security/recommendations`
- `POST /api/v1/remediation/queue`
- `POST /api/v1/remediation/queue/apply`
- `GET /api/v1/remediation/executions/:id`
- `GET /api/v1/audit/events`
- `POST /api/v1/alerts/test-email`
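
For quick checks or scripting, the read endpoints return JSON and the queue-apply endpoint is the manual trigger mentioned under security defaults. A hedged example; the API port and the `Authorization: Bearer` header format are assumptions, so adjust them to your deployment:

```bash
# Fetch the current posture report.
curl -s -H "Authorization: Bearer $TOKEN" \
  "http://<SERVER_IP>:<API_PORT>/api/v1/security/posture"

# Manually trigger execution of queued low-risk remediation actions.
curl -s -X POST -H "Authorization: Bearer $TOKEN" \
  "http://<SERVER_IP>:<API_PORT>/api/v1/remediation/queue/apply"
```
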
## Notes

- MVP targets a single UDM Pro SE on the UniFi stable channel.
- Syslog fallback listens on UDP 5514 when enabled by the worker.

If still failing, log out and log in again.

@@ -16,6 +16,7 @@ import { runMigrations } from "./lib/migrations.js";
import { AlertService } from "./services/alertService.js";
import { DependencyService } from "./services/dependencyService.js";
import { CodexClient } from "./services/llmOrchestrator.js";
import { LlmSettingsService, llmSettingsUpdateSchema } from "./services/llmSettingsService.js";
import { LogIngestor } from "./services/logIngestor.js";
import { PolicyEngine } from "./services/policyEngine.js";
import { RemediationManager } from "./services/remediationManager.js";
@@ -75,6 +76,18 @@ const codexClient = new CodexClient({
  localModelsPath: config.LOCAL_LLM_MODELS_PATH,
  localChatPath: config.LOCAL_LLM_CHAT_PATH
});
const llmSettingsService = new LlmSettingsService(repository, codexClient, encryption, {
  provider: config.LLM_PROVIDER,
  model: config.OPENAI_MODEL,
  openAiApiKey: config.OPENAI_API_KEY,
  openAiBaseUrl: config.OPENAI_BASE_URL,
  codexCliPath: config.CODEX_CLI_PATH,
  localBaseUrl: config.LOCAL_LLM_BASE_URL,
  localApiKey: config.LOCAL_LLM_API_KEY,
  localModelsPath: config.LOCAL_LLM_MODELS_PATH,
  localChatPath: config.LOCAL_LLM_CHAT_PATH
});
await llmSettingsService.initialize();
const policyEngine = new PolicyEngine();
const dependencyService = new DependencyService(unifiAdapter, codexClient);
const alertService = new AlertService(repository, {
@@ -138,6 +151,45 @@ app.get("/api/v1/health/dependencies", async () => {
  return dependencyService.check();
});

app.get(
  "/api/v1/llm/settings",
  {
    preHandler: ensureRole("owner")
  },
  async () => {
    return llmSettingsService.getSettings();
  }
);

app.put(
  "/api/v1/llm/settings",
  {
    preHandler: ensureRole("owner")
  },
  async (request, reply) => {
    const body = llmSettingsUpdateSchema.parse(request.body);
    const settings = await llmSettingsService.updateSettings(body);
    const dependency = await dependencyService.check();
    await audit.append({
      actor: request.user!.username,
      actorRole: request.user!.role,
      eventType: "llm.settings.updated",
      entityType: "llm_settings",
      entityId: "singleton",
      payload: {
        provider: settings.provider,
        model: settings.model,
        hasOpenAiApiKey: settings.hasOpenAiApiKey,
        hasLocalApiKey: settings.hasLocalApiKey
      }
    });
    return reply.status(200).send({
      settings,
      dependency
    });
  }
);

app.post("/api/v1/auth/bootstrap", async (request, reply) => {
  const body = bootstrapSchema.parse(request.body);
  const result = await authService.bootstrapOwner(body.username, body.password);
@@ -7,7 +7,7 @@ import { recommendationSetSchema, type Recommendation, type RecommendationSet, t

const SECRET_KEY_PATTERN = /(password|secret|token|cookie|api[_-]?key|private[_-]?key)/i;

type LlmProvider = "codex_oauth" | "openai_api" | "local_ollama";
export type LlmProvider = "codex_oauth" | "openai_api" | "local_ollama";

interface CommandResult {
  stdout: string;
@@ -31,35 +31,72 @@ export interface CodexClientOptions {
  fetchFn?: typeof fetch;
}

export type CodexClientRuntimeOptions = Omit<CodexClientOptions, "commandRunner" | "fetchFn">;

export class CodexClient {
  private availableModels = new Set<string>();
  private modelsCheckedAt = 0;
  private codexLoginCheckedAt = 0;
  private codexLoginLooksGood = false;
  private readonly provider: LlmProvider;
  private readonly model: string;
  private readonly openAiApiKey: string;
  private readonly openAiBaseUrl: string;
  private readonly codexCliPath: string;
  private readonly localBaseUrl: string;
  private readonly localApiKey: string;
  private readonly localModelsPath: string;
  private readonly localChatPath: string;
  private provider: LlmProvider;
  private model: string;
  private openAiApiKey: string;
  private openAiBaseUrl: string;
  private codexCliPath: string;
  private localBaseUrl: string;
  private localApiKey: string;
  private localModelsPath: string;
  private localChatPath: string;
  private readonly commandRunner: CommandRunner;
  private readonly fetchFn: typeof fetch;

  constructor(options: CodexClientOptions) {
    this.commandRunner = options.commandRunner ?? runCommand;
    this.fetchFn = options.fetchFn ?? fetch;
    this.provider = "local_ollama";
    this.model = "";
    this.openAiApiKey = "";
    this.openAiBaseUrl = "https://api.openai.com/v1";
    this.codexCliPath = "codex";
    this.localBaseUrl = "http://localhost:11434";
    this.localApiKey = "";
    this.localModelsPath = "/api/tags";
    this.localChatPath = "/api/chat";
    this.configure(options);
  }

  configure(options: CodexClientRuntimeOptions): void {
    this.provider = options.provider;
    this.model = options.model;
    this.openAiApiKey = options.openAiApiKey;
    this.openAiBaseUrl = options.openAiBaseUrl;
    this.openAiBaseUrl = trimTrailingSlash(options.openAiBaseUrl);
    this.codexCliPath = options.codexCliPath;
    this.localBaseUrl = trimTrailingSlash(options.localBaseUrl);
    this.localApiKey = options.localApiKey;
    this.localModelsPath = options.localModelsPath;
    this.localChatPath = options.localChatPath;
    this.commandRunner = options.commandRunner ?? runCommand;
    this.fetchFn = options.fetchFn ?? fetch;
    this.localModelsPath = normalizePath(options.localModelsPath);
    this.localChatPath = normalizePath(options.localChatPath);
    this.resetCaches();
  }

  getRuntimeOptions(): CodexClientRuntimeOptions {
    return {
      provider: this.provider,
      model: this.model,
      openAiApiKey: this.openAiApiKey,
      openAiBaseUrl: this.openAiBaseUrl,
      codexCliPath: this.codexCliPath,
      localBaseUrl: this.localBaseUrl,
      localApiKey: this.localApiKey,
      localModelsPath: this.localModelsPath,
      localChatPath: this.localChatPath
    };
  }

  private resetCaches(): void {
    this.availableModels.clear();
    this.modelsCheckedAt = 0;
    this.codexLoginCheckedAt = 0;
    this.codexLoginLooksGood = false;
  }

  getConfiguredModel(): string {
@@ -464,6 +501,13 @@ function trimTrailingSlash(value: string): string {
  return value;
}

function normalizePath(value: string): string {
  if (!value.startsWith("/")) {
    return `/${value}`;
  }
  return value;
}

function sanitizeForModel(value: unknown): unknown {
  if (Array.isArray(value)) {
    return value.map((item) => sanitizeForModel(item));
apps/api/src/services/llmSettingsService.ts (new file, 135 lines)
@@ -0,0 +1,135 @@
import { z } from "zod";
import type { EncryptionService } from "../lib/crypto.js";
import type { Repository } from "./repository.js";
import type { CodexClient, CodexClientRuntimeOptions, LlmProvider } from "./llmOrchestrator.js";

const LLM_SETTINGS_KEY = "llm_settings";

const providerSchema = z.enum(["codex_oauth", "openai_api", "local_ollama"]);

const storedSettingsSchema = z.object({
  provider: providerSchema,
  model: z.string().min(1),
  openAiBaseUrl: z.string().url(),
  codexCliPath: z.string().min(1),
  localBaseUrl: z.string().url(),
  localModelsPath: z.string().min(1),
  localChatPath: z.string().min(1),
  openAiApiKeyEncrypted: z.string().default(""),
  localApiKeyEncrypted: z.string().default("")
});

export const llmSettingsUpdateSchema = z.object({
  provider: providerSchema.optional(),
  model: z.string().min(1).optional(),
  openAiBaseUrl: z.string().url().optional(),
  codexCliPath: z.string().min(1).optional(),
  localBaseUrl: z.string().url().optional(),
  localModelsPath: z.string().min(1).optional(),
  localChatPath: z.string().min(1).optional(),
  openAiApiKey: z.string().optional(),
  localApiKey: z.string().optional()
});

export type LlmSettingsUpdate = z.infer<typeof llmSettingsUpdateSchema>;

export interface LlmSettingsView {
  provider: LlmProvider;
  model: string;
  openAiBaseUrl: string;
  codexCliPath: string;
  localBaseUrl: string;
  localModelsPath: string;
  localChatPath: string;
  hasOpenAiApiKey: boolean;
  hasLocalApiKey: boolean;
}

export class LlmSettingsService {
  constructor(
    private readonly repository: Repository,
    private readonly client: CodexClient,
    private readonly encryption: EncryptionService,
    private readonly defaults: CodexClientRuntimeOptions
  ) {}

  async initialize(): Promise<void> {
    const state = await this.repository.getAppState(LLM_SETTINGS_KEY);
    if (!state) {
      this.client.configure(this.defaults);
      await this.persist(this.defaults);
      return;
    }

    try {
      const resolved = this.fromStored(state);
      this.client.configure(resolved);
    } catch {
      this.client.configure(this.defaults);
      await this.persist(this.defaults);
    }
  }

  getSettings(): LlmSettingsView {
    const runtime = this.client.getRuntimeOptions();
    return {
      provider: runtime.provider,
      model: runtime.model,
      openAiBaseUrl: runtime.openAiBaseUrl,
      codexCliPath: runtime.codexCliPath,
      localBaseUrl: runtime.localBaseUrl,
      localModelsPath: runtime.localModelsPath,
      localChatPath: runtime.localChatPath,
      hasOpenAiApiKey: runtime.openAiApiKey.length > 0,
      hasLocalApiKey: runtime.localApiKey.length > 0
    };
  }

  async updateSettings(update: LlmSettingsUpdate): Promise<LlmSettingsView> {
    const current = this.client.getRuntimeOptions();
    const merged: CodexClientRuntimeOptions = {
      provider: update.provider ?? current.provider,
      model: update.model ?? current.model,
      openAiBaseUrl: update.openAiBaseUrl ?? current.openAiBaseUrl,
      codexCliPath: update.codexCliPath ?? current.codexCliPath,
      localBaseUrl: update.localBaseUrl ?? current.localBaseUrl,
      localModelsPath: update.localModelsPath ?? current.localModelsPath,
      localChatPath: update.localChatPath ?? current.localChatPath,
      openAiApiKey: update.openAiApiKey ?? current.openAiApiKey,
      localApiKey: update.localApiKey ?? current.localApiKey
    };

    this.client.configure(merged);
    await this.persist(merged);
    return this.getSettings();
  }

  private async persist(options: CodexClientRuntimeOptions): Promise<void> {
    await this.repository.upsertAppState(LLM_SETTINGS_KEY, {
      provider: options.provider,
      model: options.model,
      openAiBaseUrl: options.openAiBaseUrl,
      codexCliPath: options.codexCliPath,
      localBaseUrl: options.localBaseUrl,
      localModelsPath: options.localModelsPath,
      localChatPath: options.localChatPath,
      openAiApiKeyEncrypted: options.openAiApiKey ? this.encryption.encrypt(options.openAiApiKey) : "",
      localApiKeyEncrypted: options.localApiKey ? this.encryption.encrypt(options.localApiKey) : ""
    });
  }

  private fromStored(state: Record<string, unknown>): CodexClientRuntimeOptions {
    const parsed = storedSettingsSchema.parse(state);
    return {
      provider: parsed.provider,
      model: parsed.model,
      openAiBaseUrl: parsed.openAiBaseUrl,
      codexCliPath: parsed.codexCliPath,
      localBaseUrl: parsed.localBaseUrl,
      localModelsPath: parsed.localModelsPath,
      localChatPath: parsed.localChatPath,
      openAiApiKey: parsed.openAiApiKeyEncrypted ? this.encryption.decrypt(parsed.openAiApiKeyEncrypted) : "",
      localApiKey: parsed.localApiKeyEncrypted ? this.encryption.decrypt(parsed.localApiKeyEncrypted) : ""
    };
  }
}
@@ -3,11 +3,13 @@ import { NavLink, Navigate, Route, Routes } from "react-router-dom";
import { api, clearToken, getToken, setToken } from "./lib/api";
import { AuditPage } from "./pages/AuditPage";
import { LogsPage } from "./pages/LogsPage";
import { LlmSetupPage } from "./pages/LlmSetupPage";
import { PosturePage } from "./pages/PosturePage";
import { QueuePage } from "./pages/QueuePage";
import { RecommendationsPage } from "./pages/RecommendationsPage";

const navItems = [
  { to: "/ai-setup", label: "AI Setup" },
  { to: "/logs", label: "Realtime Logs" },
  { to: "/posture", label: "Security Posture" },
  { to: "/recommendations", label: "Recommendations" },
@@ -127,6 +129,7 @@ export default function App() {
  <section className="app-content">
    <Routes>
      <Route path="/" element={<Navigate to="/logs" replace />} />
      <Route path="/ai-setup" element={<LlmSetupPage />} />
      <Route path="/logs" element={<LogsPage />} />
      <Route path="/posture" element={<PosturePage />} />
      <Route path="/recommendations" element={<RecommendationsPage />} />
@@ -43,6 +43,30 @@ async function request<T>(path: string, init?: RequestInit): Promise<T> {
  return response.json() as Promise<T>;
}

export interface LlmSettingsView {
  provider: "codex_oauth" | "openai_api" | "local_ollama";
  model: string;
  openAiBaseUrl: string;
  codexCliPath: string;
  localBaseUrl: string;
  localModelsPath: string;
  localChatPath: string;
  hasOpenAiApiKey: boolean;
  hasLocalApiKey: boolean;
}

export interface LlmSettingsUpdate {
  provider: "codex_oauth" | "openai_api" | "local_ollama";
  model: string;
  openAiBaseUrl: string;
  codexCliPath: string;
  localBaseUrl: string;
  localModelsPath: string;
  localChatPath: string;
  openAiApiKey?: string;
  localApiKey?: string;
}

export const api = {
  async bootstrap(username: string, password: string): Promise<{ username: string; otpProvisioningUri: string }> {
    return request("/api/v1/auth/bootstrap", {
@@ -62,6 +86,17 @@ export const api = {
    return request("/api/v1/health/dependencies");
  },

  async getLlmSettings(): Promise<LlmSettingsView> {
    return request("/api/v1/llm/settings");
  },

  async updateLlmSettings(input: LlmSettingsUpdate): Promise<{ settings: LlmSettingsView; dependency: unknown }> {
    return request("/api/v1/llm/settings", {
      method: "PUT",
      body: JSON.stringify(input)
    });
  },

  async posture(): Promise<unknown> {
    return request("/api/v1/security/posture");
  },
apps/web/src/pages/LlmSetupPage.tsx (new file, 166 lines)
@@ -0,0 +1,166 @@
import { FormEvent, useEffect, useState } from "react";
import { api, type LlmSettingsView } from "../lib/api";

interface DependencyHealth {
  modelGateReady: boolean;
  llmProvider: string;
  llmModel: string;
  issues: string[];
}

const defaultForm: LlmSettingsView = {
  provider: "local_ollama",
  model: "qwen3-coder-next",
  openAiBaseUrl: "https://api.openai.com/v1",
  codexCliPath: "codex",
  localBaseUrl: "http://localhost:11434",
  localModelsPath: "/api/tags",
  localChatPath: "/api/chat",
  hasOpenAiApiKey: false,
  hasLocalApiKey: false
};

export function LlmSetupPage() {
  const [form, setForm] = useState<LlmSettingsView>(defaultForm);
  const [openAiApiKey, setOpenAiApiKey] = useState("");
  const [localApiKey, setLocalApiKey] = useState("");
  const [dependency, setDependency] = useState<DependencyHealth | null>(null);
  const [message, setMessage] = useState("");
  const [error, setError] = useState("");

  useEffect(() => {
    void (async () => {
      try {
        const [settings, deps] = await Promise.all([api.getLlmSettings(), api.dependencies()]);
        setForm(settings);
        setDependency(deps as DependencyHealth);
      } catch (caught) {
        setError((caught as Error).message);
      }
    })();
  }, []);

  async function onSubmit(event: FormEvent): Promise<void> {
    event.preventDefault();
    setError("");
    setMessage("");
    try {
      const payload = {
        provider: form.provider,
        model: form.model,
        openAiBaseUrl: form.openAiBaseUrl,
        codexCliPath: form.codexCliPath,
        localBaseUrl: form.localBaseUrl,
        localModelsPath: form.localModelsPath,
        localChatPath: form.localChatPath,
        openAiApiKey: openAiApiKey.length > 0 ? openAiApiKey : undefined,
        localApiKey: localApiKey.length > 0 ? localApiKey : undefined
      } as const;
      const result = await api.updateLlmSettings(payload);
      setForm(result.settings);
      setDependency(result.dependency as DependencyHealth);
      setOpenAiApiKey("");
      setLocalApiKey("");
      setMessage("AI settings saved.");
    } catch (caught) {
      setError((caught as Error).message);
    }
  }

  function setField<K extends keyof LlmSettingsView>(field: K, value: LlmSettingsView[K]): void {
    setForm((previous) => ({ ...previous, [field]: value }));
  }

  return (
    <article className="panel">
      <header className="panel-head">
        <h2>AI Setup</h2>
        {dependency ? <span className={`pill ${dependency.modelGateReady ? "pill-ok" : "pill-warn"}`}>{dependency.modelGateReady ? "ready" : "blocked"}</span> : null}
      </header>

      {dependency?.issues?.length ? (
        <section className="card">
          <h3>Provider Checks</h3>
          <ul>
            {dependency.issues.map((issue) => (
              <li key={issue}>{issue}</li>
            ))}
          </ul>
        </section>
      ) : null}

      <form onSubmit={onSubmit}>
        <label>
          Provider
          <select value={form.provider} onChange={(event) => setField("provider", event.target.value as LlmSettingsView["provider"])}>
            <option value="local_ollama">Local Ollama / OpenWebUI</option>
            <option value="openai_api">OpenAI API key</option>
            <option value="codex_oauth">ChatGPT Codex OAuth</option>
          </select>
        </label>

        <label>
          Model
          <input value={form.model} onChange={(event) => setField("model", event.target.value)} placeholder="qwen3-coder-next or gpt-5.3-codex" />
        </label>

        {form.provider === "openai_api" ? (
          <>
            <label>
              OpenAI Base URL
              <input value={form.openAiBaseUrl} onChange={(event) => setField("openAiBaseUrl", event.target.value)} />
            </label>
            <label>
              OpenAI API Key {form.hasOpenAiApiKey ? "(already stored)" : "(not set)"}
              <input
                type="password"
                value={openAiApiKey}
                onChange={(event) => setOpenAiApiKey(event.target.value)}
                placeholder="Leave empty to keep existing"
              />
            </label>
          </>
        ) : null}

        {form.provider === "local_ollama" ? (
          <>
            <label>
              Local Base URL
              <input value={form.localBaseUrl} onChange={(event) => setField("localBaseUrl", event.target.value)} />
            </label>
            <label>
              Models Path
              <input value={form.localModelsPath} onChange={(event) => setField("localModelsPath", event.target.value)} />
            </label>
            <label>
              Chat Path
              <input value={form.localChatPath} onChange={(event) => setField("localChatPath", event.target.value)} />
            </label>
            <label>
              Local API Key {form.hasLocalApiKey ? "(already stored)" : "(not set)"}
              <input
                type="password"
                value={localApiKey}
                onChange={(event) => setLocalApiKey(event.target.value)}
                placeholder="Leave empty to keep existing"
              />
            </label>
          </>
        ) : null}

        {form.provider === "codex_oauth" ? (
          <label>
            Codex CLI Path
            <input value={form.codexCliPath} onChange={(event) => setField("codexCliPath", event.target.value)} />
          </label>
        ) : null}

        <div className="button-row">
          <button type="submit">Save AI Settings</button>
        </div>
      </form>
      {message ? <p className="success-text">{message}</p> : null}
      {error ? <p className="error-text">{error}</p> : null}
    </article>
  );
}
@@ -41,11 +41,13 @@ p {
}

button,
input {
input,
select {
  font-family: "IBM Plex Mono", monospace;
}

input {
input,
select {
  width: 100%;
  margin-top: 0.35rem;
  padding: 0.65rem 0.8rem;
@@ -111,7 +113,7 @@ label {

.main-nav {
  display: grid;
  grid-template-columns: repeat(5, minmax(0, 1fr));
  grid-template-columns: repeat(6, minmax(0, 1fr));
  gap: 0.55rem;
  margin-bottom: 1rem;
}