orchai/docs/superpowers/plans/2026-04-17-graylog-auto-resolve-implementation.md


# Graylog Auto-Resolve Implementation Plan
> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** Add a project-scoped Graylog module that detects recurring error/warning patterns, scores them deterministically, and automatically routes high-score subjects through the existing `analyst -> developer` worktree pipeline.

**Architecture:** Keep one unified processing queue (`processed_tickets`) and extend it to support multiple sources (`tuleap`, `graylog`). Add dedicated Graylog storage (`graylog_credentials`, `graylog_subjects`, `graylog_detections`), a Graylog polling service, and a queue bridge that inserts Graylog tickets only when the score exceeds the threshold and dedup allows it. The frontend gets a Graylog project page, module toggle visibility, and live activity hooks.

**Tech Stack:** Rust (Tauri, rusqlite, reqwest, tokio), SQLite migrations, React + TypeScript.

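
The bridge's admission rule ("insert only when score exceeds threshold and dedup allows it") reduces to one pure predicate; a runnable sketch (Task 6 introduces this same check as `should_trigger_subject`):

```rust
/// A Graylog subject becomes a queue ticket only when its score clears the
/// project's threshold AND no earlier ticket for the same subject is still in flight.
fn should_trigger_subject(score: i32, threshold: i32, has_active_ticket: bool) -> bool {
    score >= threshold && !has_active_ticket
}

fn main() {
    assert!(should_trigger_subject(83, 70, false)); // high score, no open ticket -> enqueue
    assert!(!should_trigger_subject(83, 70, true)); // dedup: an active ticket blocks re-entry
    assert!(!should_trigger_subject(60, 70, false)); // below threshold -> ignore
}
```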
---
## Scope Check
The spec is one cohesive subsystem (Graylog detection + queue bridge + existing pipeline reuse). It can be implemented as a single plan without splitting into sub-plans.
## File Structure
```text
src-tauri/
  migrations/
    009_graylog_auto_resolve.sql  # create: Graylog tables + processed_tickets multi-source migration
  src/
    db.rs                  # modify: register migration 009, update schema tests
    lib.rs                 # modify: start graylog poller, register graylog commands
    models/
      mod.rs               # modify: export graylog model
      module.rs            # modify: add Graylog module default
      ticket.rs            # modify: multi-source ticket fields/query/insert helpers
      graylog.rs           # create: credentials/subjects/detections model methods
    services/
      mod.rs               # modify: export graylog services
      graylog_client.rs    # create: Graylog HTTP API client + parsing helpers
      graylog_scoring.rs   # create: normalization/grouping/scoring logic
      graylog_poller.rs    # create: periodic poller + dedup + queue insertion
      orchestrator.rs      # modify: source-aware agent resolution + prompt context
    commands/
      mod.rs               # modify: export graylog commands
      graylog.rs           # create: set/get/delete/test/manual poll/list subjects/list detections
      orchestrator.rs      # modify: retry path resolves project_id from the ticket, with a safe fallback
src/
  lib/
    types.ts               # modify: Graylog types + ProcessedTicket new fields
    api.ts                 # modify: Graylog API wrappers
  components/
    projects/
      ProjectDashboard.tsx # modify: Graylog activity events + Graylog entry card
      ProjectModules.tsx   # modify: show module config link for Graylog card
      ProjectGraylog.tsx   # create: Graylog config + subjects + detections UI
    tickets/
      TicketList.tsx       # modify: source badge + filters remain stable
      TicketDetail.tsx     # modify: source/project info rendering for Graylog tickets
  App.tsx                  # modify: add /projects/:projectId/graylog route
```
---
### Task 1: Add migration 009 and bump DB schema version
**Files:**
- Create: `src-tauri/migrations/009_graylog_auto_resolve.sql`
- Modify: `src-tauri/src/db.rs`
- [ ] **Step 1: Add failing DB tests for new version and new tables**
In `src-tauri/src/db.rs`, update expected table list and user_version assertion:
```rust
assert_eq!(
    tables,
    vec![
        "agents",
        "graylog_credentials",
        "graylog_detections",
        "graylog_subjects",
        "notifications",
        "processed_tickets",
        "project_agent_tasks",
        "project_live_messages",
        "project_live_sessions",
        "project_modules",
        "projects",
        "tuleap_credentials",
        "watched_trackers",
        "worktrees",
    ]
);
```
```rust
assert_eq!(version, 9);
```
- [ ] **Step 2: Run DB tests to verify failure before migration wiring**
Run:
```bash
cd /home/leclere/Projets/IA/orchai/src-tauri
cargo test --lib db::tests -- --nocapture
```
Expected: FAIL because migration 009 is not registered yet.
- [ ] **Step 3: Create migration 009 with processed_tickets multi-source migration and Graylog tables**
Create `src-tauri/migrations/009_graylog_auto_resolve.sql`:
```sql
-- PRAGMA foreign_keys is a no-op inside a transaction, so it must be
-- toggled before BEGIN and restored after COMMIT.
PRAGMA foreign_keys = OFF;
BEGIN;
DROP INDEX IF EXISTS idx_processed_tickets_tracker_artifact_unique;
CREATE TABLE processed_tickets_new (
    id TEXT PRIMARY KEY,
    tracker_id TEXT REFERENCES watched_trackers(id) ON DELETE CASCADE,
    project_id TEXT REFERENCES projects(id) ON DELETE CASCADE,
    source TEXT NOT NULL DEFAULT 'tuleap',
    source_ref TEXT,
    artifact_id INTEGER NOT NULL,
    artifact_title TEXT NOT NULL,
    artifact_data TEXT NOT NULL,
    status TEXT NOT NULL DEFAULT 'Pending',
    analyst_report TEXT,
    developer_report TEXT,
    worktree_path TEXT,
    branch_name TEXT,
    detected_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%fZ', 'now')),
    processed_at TEXT
);
INSERT INTO processed_tickets_new (
    id,
    tracker_id,
    project_id,
    source,
    source_ref,
    artifact_id,
    artifact_title,
    artifact_data,
    status,
    analyst_report,
    developer_report,
    worktree_path,
    branch_name,
    detected_at,
    processed_at
)
SELECT
    pt.id,
    pt.tracker_id,
    wt.project_id,
    'tuleap',
    NULL,
    pt.artifact_id,
    pt.artifact_title,
    pt.artifact_data,
    pt.status,
    pt.analyst_report,
    pt.developer_report,
    pt.worktree_path,
    pt.branch_name,
    pt.detected_at,
    pt.processed_at
FROM processed_tickets pt
LEFT JOIN watched_trackers wt ON wt.id = pt.tracker_id;
DROP TABLE processed_tickets;
ALTER TABLE processed_tickets_new RENAME TO processed_tickets;
CREATE UNIQUE INDEX idx_processed_tickets_tracker_artifact_unique
    ON processed_tickets(tracker_id, artifact_id)
    WHERE tracker_id IS NOT NULL;
CREATE INDEX idx_processed_tickets_project_detected
    ON processed_tickets(project_id, detected_at DESC);
CREATE INDEX idx_processed_tickets_source_ref
    ON processed_tickets(source, source_ref);
CREATE TABLE graylog_credentials (
    id TEXT PRIMARY KEY,
    project_id TEXT NOT NULL UNIQUE REFERENCES projects(id) ON DELETE CASCADE,
    base_url TEXT NOT NULL,
    api_token_encrypted TEXT NOT NULL,
    analyst_agent_id TEXT NOT NULL REFERENCES agents(id),
    developer_agent_id TEXT NOT NULL REFERENCES agents(id),
    stream_id TEXT,
    query_filter TEXT NOT NULL DEFAULT '',
    polling_interval_minutes INTEGER NOT NULL DEFAULT 10,
    lookback_minutes INTEGER NOT NULL DEFAULT 30,
    score_threshold INTEGER NOT NULL DEFAULT 70,
    created_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%fZ', 'now')),
    updated_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%fZ', 'now'))
);
CREATE TABLE graylog_subjects (
    id TEXT PRIMARY KEY,
    project_id TEXT NOT NULL REFERENCES projects(id) ON DELETE CASCADE,
    subject_key TEXT NOT NULL,
    source TEXT NOT NULL,
    normalized_message TEXT NOT NULL,
    first_seen_at TEXT NOT NULL,
    last_seen_at TEXT NOT NULL,
    last_score INTEGER NOT NULL DEFAULT 0,
    active_ticket_id TEXT REFERENCES processed_tickets(id),
    status TEXT NOT NULL DEFAULT 'idle',
    created_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%fZ', 'now')),
    updated_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%fZ', 'now')),
    UNIQUE(project_id, subject_key)
);
CREATE TABLE graylog_detections (
    id TEXT PRIMARY KEY,
    subject_id TEXT NOT NULL REFERENCES graylog_subjects(id) ON DELETE CASCADE,
    window_start TEXT NOT NULL,
    window_end TEXT NOT NULL,
    critical_count INTEGER NOT NULL DEFAULT 0,
    error_count INTEGER NOT NULL DEFAULT 0,
    warning_count INTEGER NOT NULL DEFAULT 0,
    total_count INTEGER NOT NULL DEFAULT 0,
    last_seen_at TEXT NOT NULL,
    score INTEGER NOT NULL,
    triggered INTEGER NOT NULL DEFAULT 0,
    triggered_ticket_id TEXT REFERENCES processed_tickets(id),
    created_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%fZ', 'now'))
);
CREATE INDEX idx_graylog_detections_subject_created
    ON graylog_detections(subject_id, created_at DESC);
COMMIT;
PRAGMA foreign_keys = ON;
```
- [ ] **Step 4: Register migration 009 in `db.rs`**
Add constants and version bump:
```rust
const MIGRATION_009: &str = include_str!("../migrations/009_graylog_auto_resolve.sql");
```
```rust
if version < 9 {
    conn.execute_batch(MIGRATION_009)?;
    conn.pragma_update(None, "user_version", 9)?;
}
```
- [ ] **Step 5: Run DB tests and full backend tests**
Run:
```bash
cd /home/leclere/Projets/IA/orchai/src-tauri
cargo test --lib db::tests -- --nocapture
cargo test --lib -- --nocapture
```
Expected:
- DB tests PASS with version `9`.
- No migration regression in existing model tests.
- [ ] **Step 6: Commit**
```bash
cd /home/leclere/Projets/IA/orchai
git add src-tauri/migrations/009_graylog_auto_resolve.sql src-tauri/src/db.rs
git commit -m "feat(db): add graylog schema and processed_tickets multi-source migration"
```
---
### Task 2: Extend `ProcessedTicket` for multi-source queue
**Files:**
- Modify: `src-tauri/src/models/ticket.rs`
- [ ] **Step 1: Add failing tests for new ticket shape and project-scoped queries**
Add tests in `ticket.rs` (these assume a small `project_id_for_tracker` test helper alongside the existing `setup()` fixture that reads the tracker's `project_id` from `watched_trackers`):
```rust
#[test]
fn test_insert_if_new_sets_project_and_source_for_tuleap() {
    let (conn, tracker_id) = setup();
    let project_id = project_id_for_tracker(&conn, &tracker_id);
    let ticket = ProcessedTicket::insert_if_new(
        &conn,
        &project_id,
        &tracker_id,
        777,
        "Tracker issue",
        "{}",
    )
    .unwrap()
    .unwrap();
    assert_eq!(ticket.project_id, project_id);
    assert_eq!(ticket.source, "tuleap");
    assert_eq!(ticket.source_ref, None);
    assert_eq!(ticket.tracker_id.as_deref(), Some(tracker_id.as_str()));
}

#[test]
fn test_insert_external_graylog_ticket() {
    let (conn, tracker_id) = setup();
    let project_id = project_id_for_tracker(&conn, &tracker_id);
    let ticket = ProcessedTicket::insert_external(
        &conn,
        &project_id,
        "graylog",
        Some("subject-1"),
        -42,
        "[Graylog] api - timeout",
        "{\"score\":83}",
    )
    .unwrap();
    assert_eq!(ticket.project_id, project_id);
    assert_eq!(ticket.source, "graylog");
    assert_eq!(ticket.source_ref.as_deref(), Some("subject-1"));
    assert!(ticket.tracker_id.is_none());
}
```
- [ ] **Step 2: Run ticket model tests and confirm compilation fails**
Run:
```bash
cd /home/leclere/Projets/IA/orchai/src-tauri
cargo test --lib models::ticket::tests -- --nocapture
```
Expected: FAIL because `project_id/source/source_ref` fields and new methods do not exist yet.
- [ ] **Step 3: Implement multi-source fields and insert/query helpers**
Update `ProcessedTicket` and SQL mapping:
```rust
pub struct ProcessedTicket {
    pub id: String,
    pub tracker_id: Option<String>,
    pub project_id: String,
    pub source: String,
    pub source_ref: Option<String>,
    pub artifact_id: i32,
    pub artifact_title: String,
    pub artifact_data: String,
    pub status: String,
    pub analyst_report: Option<String>,
    pub developer_report: Option<String>,
    pub worktree_path: Option<String>,
    pub branch_name: Option<String>,
    pub detected_at: String,
    pub processed_at: Option<String>,
}
```
```rust
const SELECT_ALL_COLS: &str = "SELECT id, tracker_id, project_id, source, source_ref, \
    artifact_id, artifact_title, artifact_data, status, analyst_report, developer_report, \
    worktree_path, branch_name, detected_at, processed_at FROM processed_tickets";
```
```rust
pub fn insert_if_new(
    conn: &Connection,
    project_id: &str,
    tracker_id: &str,
    artifact_id: i32,
    artifact_title: &str,
    artifact_data: &str,
) -> Result<Option<ProcessedTicket>> {
    let id = Uuid::new_v4().to_string();
    let now = chrono::Utc::now().to_rfc3339();
    let inserted_rows = conn.execute(
        "INSERT OR IGNORE INTO processed_tickets \
         (id, tracker_id, project_id, source, source_ref, artifact_id, artifact_title, artifact_data, status, detected_at) \
         VALUES (?1, ?2, ?3, 'tuleap', NULL, ?4, ?5, ?6, 'Pending', ?7)",
        params![id, tracker_id, project_id, artifact_id, artifact_title, artifact_data, now],
    )?;
    if inserted_rows == 0 {
        return Ok(None);
    }
    Ok(Some(ProcessedTicket {
        id,
        tracker_id: Some(tracker_id.to_string()),
        project_id: project_id.to_string(),
        source: "tuleap".to_string(),
        source_ref: None,
        artifact_id,
        artifact_title: artifact_title.to_string(),
        artifact_data: artifact_data.to_string(),
        status: "Pending".to_string(),
        analyst_report: None,
        developer_report: None,
        worktree_path: None,
        branch_name: None,
        detected_at: now,
        processed_at: None,
    }))
}
```
```rust
pub fn insert_external(
    conn: &Connection,
    project_id: &str,
    source: &str,
    source_ref: Option<&str>,
    artifact_id: i32,
    artifact_title: &str,
    artifact_data: &str,
) -> Result<ProcessedTicket> {
    let id = Uuid::new_v4().to_string();
    let now = chrono::Utc::now().to_rfc3339();
    conn.execute(
        "INSERT INTO processed_tickets \
         (id, tracker_id, project_id, source, source_ref, artifact_id, artifact_title, artifact_data, status, detected_at) \
         VALUES (?1, NULL, ?2, ?3, ?4, ?5, ?6, ?7, 'Pending', ?8)",
        params![id, project_id, source, source_ref, artifact_id, artifact_title, artifact_data, now],
    )?;
    Ok(ProcessedTicket {
        id,
        tracker_id: None,
        project_id: project_id.to_string(),
        source: source.to_string(),
        source_ref: source_ref.map(str::to_string),
        artifact_id,
        artifact_title: artifact_title.to_string(),
        artifact_data: artifact_data.to_string(),
        status: "Pending".to_string(),
        analyst_report: None,
        developer_report: None,
        worktree_path: None,
        branch_name: None,
        detected_at: now,
        processed_at: None,
    })
}
```
```rust
pub fn list_by_project(conn: &Connection, project_id: &str) -> Result<Vec<ProcessedTicket>> {
    let sql = format!(
        "{} WHERE project_id = ?1 ORDER BY detected_at DESC",
        SELECT_ALL_COLS
    );
    let mut stmt = conn.prepare(&sql)?;
    let rows = stmt.query_map(params![project_id], from_row)?;
    rows.collect()
}
```
```rust
pub fn get_project_throughput_stats(
    conn: &Connection,
    project_id: &str,
) -> Result<ProjectThroughputStats> {
    let window_start = (chrono::Utc::now() - chrono::Duration::hours(24)).to_rfc3339();
    conn.query_row(
        "SELECT
            COALESCE(SUM(CASE WHEN status NOT IN ('Done','Error','Cancelled') THEN 1 ELSE 0 END), 0),
            COALESCE(SUM(CASE WHEN status = 'Done' AND processed_at IS NOT NULL AND julianday(processed_at) >= julianday(?2) THEN 1 ELSE 0 END), 0),
            COALESCE(SUM(CASE WHEN status = 'Error' AND processed_at IS NOT NULL AND julianday(processed_at) >= julianday(?2) THEN 1 ELSE 0 END), 0),
            AVG(CASE WHEN status IN ('Done','Error') AND processed_at IS NOT NULL AND julianday(processed_at) >= julianday(?2)
                THEN (julianday(processed_at) - julianday(detected_at)) * 86400.0
                ELSE NULL END)
        FROM processed_tickets
        WHERE project_id = ?1",
        params![project_id, window_start],
        |row| {
            Ok(ProjectThroughputStats {
                backlog_count: row.get(0)?,
                done_last_24h: row.get(1)?,
                error_last_24h: row.get(2)?,
                avg_lead_time_seconds: row.get(3)?,
            })
        },
    )
}
```
- [ ] **Step 4: Update all call sites using `insert_if_new`**
Adjust:
- `src-tauri/src/services/poller.rs`
- `src-tauri/src/commands/poller.rs`
Snippet to apply at both call sites:
```rust
if let Some(ticket) = ProcessedTicket::insert_if_new(
    &db,
    &tracker.project_id,
    &tracker.id,
    artifact_id,
    &artifact_title,
    &artifact_data,
)? {
    // existing flow
}
```
- [ ] **Step 5: Run ticket tests and backend regression tests**
Run:
```bash
cd /home/leclere/Projets/IA/orchai/src-tauri
cargo test --lib models::ticket::tests -- --nocapture
cargo test --lib commands::poller -- --nocapture
```
Expected: PASS.
- [ ] **Step 6: Commit**
```bash
cd /home/leclere/Projets/IA/orchai
git add src-tauri/src/models/ticket.rs src-tauri/src/services/poller.rs src-tauri/src/commands/poller.rs
git commit -m "feat(queue): support multi-source processed tickets"
```
---
### Task 3: Add Graylog model layer and module defaults
**Files:**
- Create: `src-tauri/src/models/graylog.rs`
- Modify: `src-tauri/src/models/mod.rs`
- Modify: `src-tauri/src/models/module.rs`
- [ ] **Step 1: Add failing tests for Graylog credentials and subjects**
In new `graylog.rs` test module, add:
```rust
#[test]
fn test_upsert_graylog_credentials_for_project() {
    let conn = db::init_in_memory().unwrap();
    let project = Project::insert(&conn, "Gray", "/tmp/gray", None, "main").unwrap();
    let analyst = Agent::insert(&conn, "A", AgentRole::Analyst, AgentTool::Codex, "").unwrap();
    let developer = Agent::insert(&conn, "D", AgentRole::Developer, AgentTool::ClaudeCode, "").unwrap();
    let creds = GraylogCredentials::upsert_for_project(
        &conn,
        &project.id,
        "https://graylog.local",
        "enc-token",
        &analyst.id,
        &developer.id,
        Some("stream-1"),
        "level:(critical OR error)",
        10,
        30,
        70,
    )
    .unwrap();
    assert_eq!(creds.project_id, project.id);
    assert_eq!(creds.base_url, "https://graylog.local");
}
```
- [ ] **Step 2: Implement `graylog.rs` with credentials, subjects, detections**
Create `src-tauri/src/models/graylog.rs` with these core structs and methods:
```rust
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct GraylogCredentials {
    pub id: String,
    pub project_id: String,
    pub base_url: String,
    pub api_token_encrypted: String,
    pub analyst_agent_id: String,
    pub developer_agent_id: String,
    pub stream_id: Option<String>,
    pub query_filter: String,
    pub polling_interval_minutes: i32,
    pub lookback_minutes: i32,
    pub score_threshold: i32,
    pub created_at: String,
    pub updated_at: String,
}
```
```rust
pub fn upsert_for_project(
    conn: &Connection,
    project_id: &str,
    base_url: &str,
    api_token_encrypted: &str,
    analyst_agent_id: &str,
    developer_agent_id: &str,
    stream_id: Option<&str>,
    query_filter: &str,
    polling_interval_minutes: i32,
    lookback_minutes: i32,
    score_threshold: i32,
) -> Result<GraylogCredentials> {
    let now = chrono::Utc::now().to_rfc3339();
    conn.execute(
        "INSERT INTO graylog_credentials (
            id, project_id, base_url, api_token_encrypted, analyst_agent_id, developer_agent_id,
            stream_id, query_filter, polling_interval_minutes, lookback_minutes, score_threshold,
            created_at, updated_at
        ) VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?7, ?8, ?9, ?10, ?11, ?12, ?13)
        ON CONFLICT(project_id) DO UPDATE SET
            base_url = excluded.base_url,
            api_token_encrypted = excluded.api_token_encrypted,
            analyst_agent_id = excluded.analyst_agent_id,
            developer_agent_id = excluded.developer_agent_id,
            stream_id = excluded.stream_id,
            query_filter = excluded.query_filter,
            polling_interval_minutes = excluded.polling_interval_minutes,
            lookback_minutes = excluded.lookback_minutes,
            score_threshold = excluded.score_threshold,
            updated_at = excluded.updated_at",
        params![
            Uuid::new_v4().to_string(),
            project_id,
            base_url,
            api_token_encrypted,
            analyst_agent_id,
            developer_agent_id,
            stream_id,
            query_filter,
            polling_interval_minutes,
            lookback_minutes,
            score_threshold,
            now,
            now
        ],
    )?;
    GraylogCredentials::get_by_project(conn, project_id)?.ok_or(rusqlite::Error::QueryReturnedNoRows)
}
```
```rust
pub fn upsert_subject(
    conn: &Connection,
    project_id: &str,
    subject_key: &str,
    source: &str,
    normalized_message: &str,
    last_seen_at: &str,
    last_score: i32,
) -> Result<GraylogSubject> {
    let now = chrono::Utc::now().to_rfc3339();
    conn.execute(
        "INSERT INTO graylog_subjects (
            id, project_id, subject_key, source, normalized_message, first_seen_at, last_seen_at,
            last_score, status, created_at, updated_at
        ) VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?6, ?7, 'idle', ?8, ?8)
        ON CONFLICT(project_id, subject_key) DO UPDATE SET
            source = excluded.source,
            normalized_message = excluded.normalized_message,
            last_seen_at = excluded.last_seen_at,
            last_score = excluded.last_score,
            updated_at = excluded.updated_at",
        params![Uuid::new_v4().to_string(), project_id, subject_key, source, normalized_message, last_seen_at, last_score, now],
    )?;
    GraylogSubject::get_by_project_and_key(conn, project_id, subject_key)?
        .ok_or(rusqlite::Error::QueryReturnedNoRows)
}
```
- [ ] **Step 3: Wire model exports and module default**
In `src-tauri/src/models/mod.rs`:
```rust
pub mod graylog;
```
In `src-tauri/src/models/module.rs`:
```rust
pub const MODULE_GRAYLOG_AUTO_RESOLVE: &str = "graylog_polling_auto_resolve";
```
And add default insertion:
```rust
insert_default(
    conn,
    project_id,
    MODULE_GRAYLOG_AUTO_RESOLVE,
    "Polling Graylog + auto-resolve",
    "Surveille Graylog, score les sujets, et déclenche le pipeline analyste/développeur.",
)?;
```
- [ ] **Step 4: Run focused model tests**
Run:
```bash
cd /home/leclere/Projets/IA/orchai/src-tauri
cargo test --lib models::graylog -- --nocapture
cargo test --lib models::module -- --nocapture
```
Expected: PASS.
- [ ] **Step 5: Commit**
```bash
cd /home/leclere/Projets/IA/orchai
git add src-tauri/src/models/graylog.rs src-tauri/src/models/mod.rs src-tauri/src/models/module.rs
git commit -m "feat(graylog): add model layer and project module default"
```
---
### Task 4: Implement Graylog scoring engine (normalize/group/score)
**Files:**
- Create: `src-tauri/src/services/graylog_scoring.rs`
- Modify: `src-tauri/src/services/mod.rs`
- Modify: `src-tauri/Cargo.toml`
- [ ] **Step 1: Add failing tests for normalization and score**
In `graylog_scoring.rs` tests:
```rust
#[test]
fn test_normalize_message_replaces_dynamic_values() {
    let message = "User 42 failed from 10.0.0.1 at 2026-04-17T10:00:00Z request=9f8b-12ab";
    let normalized = normalize_message(message);
    assert!(normalized.contains("<num>"));
    assert!(normalized.contains("<ip>"));
    assert!(normalized.contains("<ts>"));
}

#[test]
fn test_compute_score_critical_recent_is_high() {
    let score = compute_score(SeverityCounts { critical: 1, error: 0, warning: 0 }, 2, 30);
    assert!(score >= 70);
}
```
- [ ] **Step 2: Implement scoring primitives and grouping**
Create `graylog_scoring.rs`:
```rust
use regex::Regex;
use serde::{Deserialize, Serialize};
use std::collections::HashMap;

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct GraylogEvent {
    pub timestamp: String,
    pub source: String,
    pub service: Option<String>,
    pub level: String,
    pub message: String,
    pub raw: serde_json::Value,
}

#[derive(Debug, Clone, Copy, Default)]
pub struct SeverityCounts {
    pub critical: i32,
    pub error: i32,
    pub warning: i32,
}

#[derive(Debug, Clone)]
pub struct SubjectAggregate {
    pub subject_key: String,
    pub source: String,
    pub normalized_message: String,
    pub counts: SeverityCounts,
    pub total_count: i32,
    pub last_seen_age_minutes: i64,
    pub last_seen_at: String,
    pub sample_events: Vec<serde_json::Value>,
    pub score: i32,
}

pub fn normalize_message(input: &str) -> String {
    let mut value = input.to_lowercase();
    let uuid_re = Regex::new(r"\b[0-9a-f]{8}-[0-9a-f]{4}-[1-5][0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}\b").unwrap();
    let ip_re = Regex::new(r"\b(?:\d{1,3}\.){3}\d{1,3}\b").unwrap();
    let ts_re = Regex::new(r"\b\d{4}-\d{2}-\d{2}t\d{2}:\d{2}:\d{2}(?:\.\d+)?z\b").unwrap();
    let hash_re = Regex::new(r"\b[0-9a-f]{12,}\b").unwrap();
    let num_re = Regex::new(r"\b\d+\b").unwrap();
    let ws_re = Regex::new(r"\s+").unwrap();
    value = uuid_re.replace_all(&value, "<uuid>").to_string();
    value = ip_re.replace_all(&value, "<ip>").to_string();
    value = ts_re.replace_all(&value, "<ts>").to_string();
    value = hash_re.replace_all(&value, "<hash>").to_string();
    value = num_re.replace_all(&value, "<num>").to_string();
    ws_re.replace_all(&value, " ").trim().to_string()
}

pub fn compute_score(counts: SeverityCounts, total_count: i32, last_seen_age_minutes: i64) -> i32 {
    let severity_score = if counts.critical > 0 {
        50
    } else if counts.error > 0 {
        35
    } else if counts.warning > 0 {
        20
    } else {
        0
    };
    let frequency_score = match total_count {
        0 => 0,
        1 => 5,
        2..=3 => 12,
        4..=7 => 22,
        8..=15 => 30,
        _ => 35,
    };
    let recency_score = if last_seen_age_minutes <= 2 {
        15
    } else if last_seen_age_minutes <= 10 {
        12
    } else if last_seen_age_minutes <= 30 {
        8
    } else if last_seen_age_minutes <= 120 {
        4
    } else {
        0
    };
    severity_score + frequency_score + recency_score
}

pub fn group_subjects(events: &[GraylogEvent], now: chrono::DateTime<chrono::Utc>) -> Vec<SubjectAggregate> {
    let mut map: HashMap<String, SubjectAggregate> = HashMap::new();
    for event in events {
        let source = event.service.clone().unwrap_or_else(|| event.source.clone());
        let normalized_message = normalize_message(&event.message);
        let subject_key = format!("{source}|{normalized_message}");
        let event_time = chrono::DateTime::parse_from_rfc3339(&event.timestamp)
            .map(|dt| dt.with_timezone(&chrono::Utc))
            .unwrap_or(now);
        let age_minutes = now.signed_duration_since(event_time).num_minutes().max(0);
        let entry = map.entry(subject_key.clone()).or_insert_with(|| SubjectAggregate {
            subject_key: subject_key.clone(),
            source: source.clone(),
            normalized_message: normalized_message.clone(),
            counts: SeverityCounts::default(),
            total_count: 0,
            last_seen_age_minutes: age_minutes,
            last_seen_at: event.timestamp.clone(),
            sample_events: Vec::new(),
            score: 0,
        });
        entry.total_count += 1;
        let level = event.level.to_lowercase();
        if level.contains("critical") {
            entry.counts.critical += 1;
        } else if level.contains("error") {
            entry.counts.error += 1;
        } else if level.contains("warn") {
            entry.counts.warning += 1;
        }
        if age_minutes < entry.last_seen_age_minutes {
            entry.last_seen_age_minutes = age_minutes;
            entry.last_seen_at = event.timestamp.clone();
        }
        if entry.sample_events.len() < 5 {
            entry.sample_events.push(event.raw.clone());
        }
    }
    let mut out: Vec<SubjectAggregate> = map
        .into_values()
        .map(|mut aggregate| {
            aggregate.score = compute_score(
                aggregate.counts,
                aggregate.total_count,
                aggregate.last_seen_age_minutes,
            );
            aggregate
        })
        .collect();
    out.sort_by(|a, b| b.score.cmp(&a.score).then_with(|| b.total_count.cmp(&a.total_count)));
    out
}
```
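To make the weighting concrete, the three components can be traced by hand. The sketch below duplicates the `compute_score` arithmetic in a self-contained form; the 70 cutoff is the plan's default `score_threshold`:

```rust
// Standalone trace of the additive score: severity + frequency + recency,
// with the same weights as compute_score above.
fn score(critical: i32, error: i32, warning: i32, total: i32, age_min: i64) -> i32 {
    let severity = if critical > 0 { 50 } else if error > 0 { 35 } else if warning > 0 { 20 } else { 0 };
    let frequency = match total { 0 => 0, 1 => 5, 2..=3 => 12, 4..=7 => 22, 8..=15 => 30, _ => 35 };
    let recency = if age_min <= 2 { 15 } else if age_min <= 10 { 12 } else if age_min <= 30 { 8 } else if age_min <= 120 { 4 } else { 0 };
    severity + frequency + recency
}

fn main() {
    // One critical seen twice, last seen 30 min ago: 50 + 12 + 8 = 70,
    // exactly the boundary the failing test above asserts.
    assert_eq!(score(1, 0, 0, 2, 30), 70);
    // A burst of 8 errors, last seen 5 min ago: 35 + 30 + 12 = 77 -> triggers.
    assert_eq!(score(0, 8, 0, 8, 5), 77);
    // A single stale warning: 20 + 5 + 0 = 25 -> stays well below threshold.
    assert_eq!(score(0, 0, 1, 1, 200), 25);
}
```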
- [ ] **Step 3: Add Rust dependencies required by scoring/client utilities**
In `src-tauri/Cargo.toml` under `[dependencies]`:
```toml
regex = "1"
urlencoding = "2"
```
- [ ] **Step 4: Export service module**
In `src-tauri/src/services/mod.rs`:
```rust
pub mod graylog_scoring;
```
- [ ] **Step 5: Run scoring tests**
Run:
```bash
cd /home/leclere/Projets/IA/orchai/src-tauri
cargo test --lib services::graylog_scoring -- --nocapture
```
Expected: PASS.
- [ ] **Step 6: Commit**
```bash
cd /home/leclere/Projets/IA/orchai
git add src-tauri/src/services/graylog_scoring.rs src-tauri/src/services/mod.rs src-tauri/Cargo.toml
git commit -m "feat(graylog): add deterministic scoring and subject grouping"
```
---
### Task 5: Implement Graylog API client and backend commands
**Files:**
- Create: `src-tauri/src/services/graylog_client.rs`
- Create: `src-tauri/src/commands/graylog.rs`
- Modify: `src-tauri/src/commands/mod.rs`
- Modify: `src-tauri/src/lib.rs`
- [ ] **Step 1: Add failing tests for Graylog payload parsing**
In `graylog_client.rs` tests:
```rust
#[test]
fn test_parse_search_response_extracts_events() {
    let payload = serde_json::json!({
        "messages": [
            { "message": { "timestamp": "2026-04-17T10:00:00.000Z", "source": "api-1", "level": "error", "message": "timeout id=42", "service": "api" } }
        ]
    });
    let events = parse_search_response(&payload);
    assert_eq!(events.len(), 1);
    assert_eq!(events[0].source, "api-1");
    assert_eq!(events[0].service.as_deref(), Some("api"));
}
```
- [ ] **Step 2: Implement Graylog client**
Create `src-tauri/src/services/graylog_client.rs`:
```rust
use crate::services::graylog_scoring::GraylogEvent;
use serde_json::Value;
use std::time::Instant;
use tokio::time::{sleep, Duration};

pub struct GraylogClient {
    http: reqwest::Client,
    base_url: String,
    token: String,
}

impl GraylogClient {
    pub fn new(http: &reqwest::Client, base_url: &str, token: &str) -> Self {
        Self {
            http: http.clone(),
            base_url: base_url.trim_end_matches('/').to_string(),
            token: token.to_string(),
        }
    }

    async fn send_get(&self, url: &str) -> Result<reqwest::Response, String> {
        const MAX_ATTEMPTS: u32 = 3;
        const BASE_DELAY_MS: u64 = 500;
        for attempt in 1..=MAX_ATTEMPTS {
            let started_at = Instant::now();
            let response = self
                .http
                .get(url)
                // Graylog API access tokens authenticate via HTTP Basic auth,
                // with the token as username and the literal string "token" as password.
                .basic_auth(&self.token, Some("token"))
                .header("X-Requested-By", "orchai")
                .send()
                .await;
            match response {
                Ok(resp) => {
                    let status = resp.status();
                    if (status == reqwest::StatusCode::TOO_MANY_REQUESTS || status.is_server_error()) && attempt < MAX_ATTEMPTS {
                        let delay_ms = BASE_DELAY_MS * 2u64.pow(attempt - 1);
                        eprintln!("[graylog] retry GET {} after {}ms (status={})", url, delay_ms, status);
                        sleep(Duration::from_millis(delay_ms)).await;
                        continue;
                    }
                    eprintln!("[graylog] GET {} status={} {}ms", url, status, started_at.elapsed().as_millis());
                    return Ok(resp);
                }
                Err(err) => {
                    if attempt < MAX_ATTEMPTS {
                        let delay_ms = BASE_DELAY_MS * 2u64.pow(attempt - 1);
                        eprintln!("[graylog] retry GET {} after {}ms ({})", url, delay_ms, err);
                        sleep(Duration::from_millis(delay_ms)).await;
                        continue;
                    }
                    return Err(format!("graylog request failed: {}", err));
                }
            }
        }
        Err("graylog request failed after retries".to_string())
    }

    pub async fn test_connection(&self) -> Result<(), String> {
        let url = format!("{}/api/system", self.base_url);
        let resp = self.send_get(&url).await?;
        if resp.status().is_success() {
            Ok(())
        } else {
            Err(format!("graylog connection test failed: HTTP {}", resp.status()))
        }
    }

    pub async fn search_relative(
        &self,
        query: &str,
        stream_id: Option<&str>,
        range_seconds: i32,
    ) -> Result<Vec<GraylogEvent>, String> {
        let mut url = format!(
            "{}/api/search/universal/relative?query={}&range={}&limit=500",
            self.base_url,
            urlencoding::encode(query),
            range_seconds
        );
        if let Some(stream_id) = stream_id {
            url.push_str(&format!("&streams={}", urlencoding::encode(stream_id)));
        }
        let resp = self.send_get(&url).await?;
        if !resp.status().is_success() {
            return Err(format!("graylog search failed: HTTP {}", resp.status()));
        }
        let body: Value = resp.json().await.map_err(|e| format!("invalid graylog JSON: {}", e))?;
        Ok(parse_search_response(&body))
    }
}

fn level_to_string(value: &Value) -> String {
    match value.as_str() {
        Some(s) => s.to_string(),
        // Graylog usually exposes `level` as the numeric syslog severity;
        // map it onto the strings the scorer matches (0-2 critical, 3 error, 4 warning).
        None => match value.as_i64() {
            Some(0..=2) => "critical".to_string(),
            Some(3) => "error".to_string(),
            Some(4) => "warning".to_string(),
            Some(n) => n.to_string(),
            None => value.to_string(),
        },
    }
}

pub fn parse_search_response(body: &Value) -> Vec<GraylogEvent> {
    let rows = body.get("messages").and_then(|v| v.as_array()).cloned().unwrap_or_default();
    rows.into_iter()
        .filter_map(|row| {
            let message = row.get("message")?;
            let timestamp = message.get("timestamp").and_then(|v| v.as_str()).unwrap_or("").to_string();
            let source = message.get("source").and_then(|v| v.as_str()).unwrap_or("").to_string();
            let level = message.get("level").map(level_to_string).unwrap_or_default();
            let msg = message.get("message").and_then(|v| v.as_str()).unwrap_or("").to_string();
            if timestamp.is_empty() || source.is_empty() || msg.is_empty() {
                return None;
            }
            let service = message.get("service").and_then(|v| v.as_str()).map(str::to_string);
            Some(GraylogEvent {
                timestamp,
                source,
                service,
                level,
                message: msg,
                raw: message.clone(),
            })
        })
        .collect()
}
```
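The retry loop in `send_get` uses plain exponential backoff; a self-contained sketch of the delay schedule those constants produce:

```rust
// Delays from BASE_DELAY_MS * 2^(attempt - 1) with BASE_DELAY_MS = 500 and
// MAX_ATTEMPTS = 3. Only the first two delays are ever slept: after the third
// failed attempt send_get returns the error instead of retrying.
fn backoff_ms(attempt: u32) -> u64 {
    500 * 2u64.pow(attempt - 1)
}

fn main() {
    let delays: Vec<u64> = (1..=3).map(backoff_ms).collect();
    assert_eq!(delays, vec![500, 1000, 2000]);
}
```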
- [ ] **Step 3: Implement Graylog commands skeleton with credential CRUD and test/manual operations**
Create `src-tauri/src/commands/graylog.rs` with:
```rust
#[tauri::command]
pub fn set_graylog_credentials(
    state: State<'_, AppState>,
    project_id: String,
    base_url: String,
    api_token: String,
    analyst_agent_id: String,
    developer_agent_id: String,
    stream_id: Option<String>,
    query_filter: String,
    polling_interval_minutes: i32,
    lookback_minutes: i32,
    score_threshold: i32,
) -> Result<GraylogCredentialsSafe, AppError> {
    let token_encrypted = crypto::encrypt(&state.encryption_key, api_token.trim())
        .map_err(AppError::from)?;
    let db = state.db.lock().map_err(|e| AppError::from(format!("Database lock failed: {}", e)))?;
    let creds = GraylogCredentials::upsert_for_project(
        &db,
        &project_id,
        base_url.trim(),
        &token_encrypted,
        analyst_agent_id.trim(),
        developer_agent_id.trim(),
        stream_id.as_deref(),
        query_filter.trim(),
        polling_interval_minutes,
        lookback_minutes,
        score_threshold,
    )?;
    Ok(creds.to_safe())
}

#[tauri::command]
pub async fn test_graylog_connection(
    state: State<'_, AppState>,
    project_id: String,
) -> Result<String, AppError> {
    let (base_url, token) = {
        let db = state.db.lock().map_err(|e| AppError::from(format!("Database lock failed: {}", e)))?;
        let creds = GraylogCredentials::get_by_project(&db, &project_id)?
            .ok_or_else(|| AppError::from("No Graylog credentials configured".to_string()))?;
        let token = crypto::decrypt(&state.encryption_key, &creds.api_token_encrypted)
            .map_err(AppError::from)?;
        (creds.base_url, token)
    };
    let client = GraylogClient::new(&state.http_client, &base_url, &token);
    client.test_connection().await.map_err(AppError::from)?;
    Ok("Connection successful".to_string())
}

#[tauri::command]
pub async fn manual_graylog_poll(
    state: State<'_, AppState>,
    app_handle: tauri::AppHandle,
    project_id: String,
) -> Result<i32, AppError> {
    let count = crate::services::graylog_poller::poll_project_once(
        &state.db,
        &state.encryption_key,
        &state.http_client,
        &app_handle,
        &project_id,
    )
    .await
    .map_err(AppError::from)?;
    Ok(count)
}
```
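The commands return a `GraylogCredentialsSafe` via `to_safe()`; neither is defined in this part of the plan, so the exact fields below are an assumption. The intent is a token-redacted view of the credentials row, roughly:

```rust
// Hypothetical sketch only: the real structs belong in models/graylog.rs and
// carry more fields. The point is that api_token_encrypted never crosses the
// IPC boundary to the frontend.
struct GraylogCredentials {
    base_url: String,
    api_token_encrypted: String,
}

struct GraylogCredentialsSafe {
    base_url: String,
    has_token: bool, // the frontend only learns whether a token is stored
}

impl GraylogCredentials {
    fn to_safe(&self) -> GraylogCredentialsSafe {
        GraylogCredentialsSafe {
            base_url: self.base_url.clone(),
            has_token: !self.api_token_encrypted.is_empty(),
        }
    }
}

fn main() {
    let creds = GraylogCredentials {
        base_url: "https://graylog.local".into(),
        api_token_encrypted: "enc-token".into(),
    };
    let safe = creds.to_safe();
    assert!(safe.has_token);
    assert_eq!(safe.base_url, "https://graylog.local");
}
```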
- [ ] **Step 4: Register commands and service exports**
In `src-tauri/src/commands/mod.rs`:
```rust
pub mod graylog;
```
In `src-tauri/src/services/mod.rs`:
```rust
pub mod graylog_client;
```
In `src-tauri/src/lib.rs` invoke handler:
```rust
commands::graylog::set_graylog_credentials,
commands::graylog::get_graylog_credentials,
commands::graylog::delete_graylog_credentials,
commands::graylog::test_graylog_connection,
commands::graylog::manual_graylog_poll,
commands::graylog::list_graylog_subjects,
commands::graylog::list_graylog_detections,
```
- [ ] **Step 5: Run command and client tests**
Run:
```bash
cd /home/leclere/Projets/IA/orchai/src-tauri
cargo test --lib services::graylog_client -- --nocapture
cargo test --lib commands::graylog -- --nocapture
```
Expected: PASS.
- [ ] **Step 6: Commit**
```bash
cd /home/leclere/Projets/IA/orchai
git add src-tauri/src/services/graylog_client.rs src-tauri/src/commands/graylog.rs src-tauri/src/commands/mod.rs src-tauri/src/lib.rs src-tauri/src/services/mod.rs
git commit -m "feat(graylog): add client and tauri commands"
```
---
### Task 6: Implement Graylog poller and orchestrator source-aware behavior
**Files:**
- Create: `src-tauri/src/services/graylog_poller.rs`
- Modify: `src-tauri/src/services/orchestrator.rs`
- Modify: `src-tauri/src/services/mod.rs`
- Modify: `src-tauri/src/commands/orchestrator.rs`
- Modify: `src-tauri/src/lib.rs`
- [ ] **Step 1: Add failing tests for dedup strictness**
In `graylog_poller.rs` tests:
```rust
#[test]
fn test_should_trigger_subject_respects_active_ticket() {
assert!(should_trigger_subject(82, 70, false));
assert!(!should_trigger_subject(82, 70, true));
assert!(!should_trigger_subject(60, 70, false));
}
```
With helper:
```rust
fn should_trigger_subject(score: i32, threshold: i32, has_active_ticket: bool) -> bool {
score >= threshold && !has_active_ticket
}
```
- [ ] **Step 2: Implement `graylog_poller.rs` service**
Create `src-tauri/src/services/graylog_poller.rs`:
```rust
use crate::models::graylog::{GraylogCredentials, GraylogDetection, GraylogSubject};
use crate::models::module::{ProjectModule, MODULE_GRAYLOG_AUTO_RESOLVE};
use crate::models::ticket::ProcessedTicket;
use crate::services::graylog_client::GraylogClient;
use crate::services::graylog_scoring::{group_subjects, SubjectAggregate};
use crate::services::crypto;
use rusqlite::Connection;
use std::sync::{Arc, Mutex};
use tauri::{AppHandle, Emitter};
use tokio::time::{interval, Duration};
fn should_trigger_subject(score: i32, threshold: i32, has_active_ticket: bool) -> bool {
score >= threshold && !has_active_ticket
}
fn synthetic_artifact_id(subject_key: &str) -> i32 {
    use std::hash::{Hash, Hasher};
    let mut hasher = std::collections::hash_map::DefaultHasher::new();
    subject_key.hash(&mut hasher);
    // Mask to 30 bits so the hash fits in i32, floor at 1 so the result is
    // never zero, then negate: synthetic Graylog ids stay strictly negative
    // and cannot collide with real (positive) Tuleap artifact ids.
    let raw = (hasher.finish() & 0x3fff_ffff) as i32;
    -(raw.max(1))
}
fn subject_payload(aggregate: &SubjectAggregate) -> String {
serde_json::json!({
"source": aggregate.source,
"normalized_message": aggregate.normalized_message,
"counts": {
"critical": aggregate.counts.critical,
"error": aggregate.counts.error,
"warning": aggregate.counts.warning,
"total": aggregate.total_count
},
"last_seen_at": aggregate.last_seen_at,
"score": aggregate.score,
"samples": aggregate.sample_events
}).to_string()
}
```
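A quick property check pins down the two invariants the queue bridge relies on: the synthetic id is always strictly negative (so it can never collide with a real Tuleap artifact id) and it is stable for a given subject key. Note that `DefaultHasher::new()` happens to be deterministic in current std but that is not a documented stability guarantee across Rust versions; this is a sketch under that assumption, with the helper mirrored from the plan:

```rust
use std::hash::{Hash, Hasher};

// Mirror of the planned helper: hash the subject key, mask to 30 bits,
// and negate so the synthetic id stays in the negative i32 range.
fn synthetic_artifact_id(subject_key: &str) -> i32 {
    let mut hasher = std::collections::hash_map::DefaultHasher::new();
    subject_key.hash(&mut hasher);
    let raw = (hasher.finish() & 0x3fff_ffff) as i32;
    -(raw.max(1))
}

fn main() {
    let a = synthetic_artifact_id("app-server|NullPointerException in OrderService");
    let b = synthetic_artifact_id("app-server|NullPointerException in OrderService");
    assert_eq!(a, b); // stable for the same subject key within a process
    assert!(a < 0); // never overlaps positive Tuleap artifact ids
    assert!(synthetic_artifact_id("") <= -1); // even the empty key maps below zero
    println!("ok");
}
```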
And core poll entrypoint:
```rust
pub async fn poll_project_once(
db: &Arc<Mutex<Connection>>,
encryption_key: &[u8; 32],
http_client: &reqwest::Client,
app_handle: &AppHandle,
project_id: &str,
) -> Result<i32, String> {
let creds = {
let conn = db.lock().map_err(|e| format!("DB lock failed: {}", e))?;
let enabled = ProjectModule::is_enabled(&conn, project_id, MODULE_GRAYLOG_AUTO_RESOLVE)
.map_err(|e| format!("module lookup failed: {}", e))?;
if !enabled {
return Ok(0);
}
GraylogCredentials::get_by_project(&conn, project_id)
.map_err(|e| format!("credentials lookup failed: {}", e))?
.ok_or_else(|| "No Graylog credentials configured".to_string())?
};
let token = crypto::decrypt(encryption_key, &creds.api_token_encrypted)
.map_err(|e| format!("token decrypt failed: {}", e))?;
let client = GraylogClient::new(http_client, &creds.base_url, &token);
let query = if creds.query_filter.trim().is_empty() {
"level:(critical OR error OR warning)".to_string()
} else {
creds.query_filter.clone()
};
    // Announce the poll before the network call so the UI sees it immediately.
    let _ = app_handle.emit("graylog-polling-started", serde_json::json!({ "project_id": project_id }));
    let events = client
        .search_relative(&query, creds.stream_id.as_deref(), creds.lookback_minutes * 60)
        .await
        .map_err(|err| {
            // Mirror the failure to the frontend listener before propagating it.
            let _ = app_handle.emit("graylog-polling-error", serde_json::json!({
                "project_id": project_id,
                "error": err
            }));
            err
        })?;
    let now = chrono::Utc::now();
    let aggregates = group_subjects(&events, now);
    let mut triggered_count = 0i32;
for aggregate in aggregates {
let (subject, active_in_progress) = {
let conn = db.lock().map_err(|e| format!("DB lock failed: {}", e))?;
let subject = GraylogSubject::upsert_subject(
&conn,
project_id,
&aggregate.subject_key,
&aggregate.source,
&aggregate.normalized_message,
&aggregate.last_seen_at,
aggregate.score,
).map_err(|e| format!("upsert subject failed: {}", e))?;
let active = match &subject.active_ticket_id {
Some(ticket_id) => ProcessedTicket::get_by_id(&conn, ticket_id)
.map(|t| matches!(t.status.as_str(), "Pending" | "Analyzing" | "Developing"))
.unwrap_or(false),
None => false,
};
(subject, active)
};
let should_trigger = should_trigger_subject(
aggregate.score,
creds.score_threshold,
active_in_progress,
);
let triggered_ticket_id = if should_trigger {
            let conn = db.lock().map_err(|e| format!("DB lock failed: {}", e))?;
let ticket = ProcessedTicket::insert_external(
&conn,
project_id,
"graylog",
Some(&subject.id),
synthetic_artifact_id(&aggregate.subject_key),
&format!("[Graylog] {} - {}", aggregate.source, aggregate.normalized_message),
&subject_payload(&aggregate),
).map_err(|e| format!("insert graylog ticket failed: {}", e))?;
GraylogSubject::set_active_ticket(&conn, &subject.id, Some(&ticket.id), "queued")
.map_err(|e| format!("set active ticket failed: {}", e))?;
triggered_count += 1;
let _ = app_handle.emit("graylog-subject-triggered", serde_json::json!({
"project_id": project_id,
"subject_id": subject.id,
"ticket_id": ticket.id,
"score": aggregate.score
}));
Some(ticket.id)
} else {
None
};
{
let conn = db.lock().map_err(|e| format!("DB lock failed: {}", e))?;
GraylogDetection::insert(
&conn,
&subject.id,
&(now - chrono::Duration::minutes(creds.lookback_minutes as i64)).to_rfc3339(),
&now.to_rfc3339(),
aggregate.counts.critical,
aggregate.counts.error,
aggregate.counts.warning,
aggregate.total_count,
&aggregate.last_seen_at,
aggregate.score,
should_trigger,
triggered_ticket_id.as_deref(),
).map_err(|e| format!("insert detection failed: {}", e))?;
}
}
let _ = app_handle.emit("graylog-polling-finished", serde_json::json!({
"project_id": project_id,
"triggered_count": triggered_count
}));
Ok(triggered_count)
}
```
- [ ] **Step 3: Start background poller and export module**
In `src-tauri/src/services/mod.rs`:
```rust
pub mod graylog_poller;
```
In `src-tauri/src/lib.rs` setup:
```rust
services::graylog_poller::start(
db_arc.clone(),
encryption_key,
http_client.clone(),
app.handle().clone(),
);
```
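The plan calls `graylog_poller::start` without spelling it out. A minimal sketch of the piece worth getting right up front is the interval derivation, which should clamp bad stored values so a zero or negative `polling_interval_minutes` can never produce a busy loop; the surrounding spawn/loop shape is indicated in comments only, and all names there are assumptions, not a definitive implementation:

```rust
use std::time::Duration;

// Clamp the configured interval: floor at one minute so a zero or
// negative value from the DB cannot turn the poller into a busy loop.
fn poll_interval(minutes: i32) -> Duration {
    Duration::from_secs(minutes.max(1) as u64 * 60)
}

// start() would then spawn roughly (hypothetical shape):
//   tauri::async_runtime::spawn(async move {
//       let mut ticker = tokio::time::interval(poll_interval(cfg_minutes));
//       loop {
//           ticker.tick().await;
//           for project_id in enabled_project_ids {
//               let _ = poll_project_once(&db, &key, &http, &handle, &project_id).await;
//           }
//       }
//   });

fn main() {
    assert_eq!(poll_interval(10), Duration::from_secs(600));
    assert_eq!(poll_interval(0), Duration::from_secs(60)); // floored to one minute
    assert_eq!(poll_interval(-5), Duration::from_secs(60));
    println!("ok");
}
```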
- [ ] **Step 4: Make orchestrator source-aware**
In `src-tauri/src/services/orchestrator.rs`, resolve the project and the analyst/developer agents from the ticket's `source`:
```rust
let (project, analyst_agent, developer_agent) = {
let conn = db.lock().map_err(|e| format!("DB lock failed: {}", e))?;
let project = crate::models::project::Project::get_by_id(&conn, &ticket.project_id)
.map_err(|e| format!("project lookup failed: {}", e))?;
if ticket.source == "graylog" {
let cfg = crate::models::graylog::GraylogCredentials::get_by_project(&conn, &project.id)
.map_err(|e| format!("graylog credentials lookup failed: {}", e))?
.ok_or_else(|| "Graylog credentials are missing".to_string())?;
let analyst_agent = Agent::get_by_id(&conn, &cfg.analyst_agent_id)
.map_err(|_| "Configured graylog analyst not found".to_string())?;
let developer_agent = Agent::get_by_id(&conn, &cfg.developer_agent_id)
.map_err(|_| "Configured graylog developer not found".to_string())?;
(project, analyst_agent, developer_agent)
} else {
let tracker_id = ticket.tracker_id.as_deref().ok_or_else(|| "Missing tracker_id for tuleap ticket".to_string())?;
let tracker = WatchedTracker::get_by_id(&conn, tracker_id)
.map_err(|e| format!("get tracker failed: {}", e))?;
let analyst_id = tracker.analyst_agent_id.as_deref().ok_or_else(|| "Tracker has no analyst".to_string())?;
let developer_id = tracker.developer_agent_id.as_deref().ok_or_else(|| "Tracker has no developer".to_string())?;
let analyst_agent = Agent::get_by_id(&conn, analyst_id).map_err(|_| "Configured analyst not found".to_string())?;
let developer_agent = Agent::get_by_id(&conn, developer_id).map_err(|_| "Configured developer not found".to_string())?;
(project, analyst_agent, developer_agent)
}
};
```
Also update the retry-cleanup path in `commands/orchestrator.rs` so it resolves the project directly from the ticket:
```rust
let project = crate::models::project::Project::get_by_id(&conn, &ticket.project_id)?;
```
- [ ] **Step 5: Run service and orchestrator tests**
Run:
```bash
cd /home/leclere/Projets/IA/orchai/src-tauri
cargo test --lib services::graylog_poller -- --nocapture
cargo test --lib services::orchestrator -- --nocapture
cargo test --lib commands::orchestrator -- --nocapture
```
Expected: PASS with dedup behavior covered.
- [ ] **Step 6: Commit**
```bash
cd /home/leclere/Projets/IA/orchai
git add src-tauri/src/services/graylog_poller.rs src-tauri/src/services/orchestrator.rs src-tauri/src/commands/orchestrator.rs src-tauri/src/services/mod.rs src-tauri/src/lib.rs
git commit -m "feat(graylog): add poller bridge and source-aware orchestrator flow"
```
---
### Task 7: Frontend API/types wiring for Graylog
**Files:**
- Modify: `src/lib/types.ts`
- Modify: `src/lib/api.ts`
- [ ] **Step 1: Add TS types**
In `src/lib/types.ts`:
```ts
export interface GraylogCredentialsSafe {
id: string;
project_id: string;
base_url: string;
analyst_agent_id: string;
developer_agent_id: string;
stream_id: string | null;
query_filter: string;
polling_interval_minutes: number;
lookback_minutes: number;
score_threshold: number;
}
export interface GraylogSubject {
id: string;
project_id: string;
subject_key: string;
source: string;
normalized_message: string;
first_seen_at: string;
last_seen_at: string;
last_score: number;
active_ticket_id: string | null;
status: string;
}
export interface GraylogDetection {
id: string;
subject_id: string;
window_start: string;
window_end: string;
critical_count: number;
error_count: number;
warning_count: number;
total_count: number;
last_seen_at: string;
score: number;
triggered: boolean;
triggered_ticket_id: string | null;
created_at: string;
}
```
Also extend `ProcessedTicket`:
```ts
tracker_id: string | null;
project_id: string;
source: string;
source_ref: string | null;
```
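Since `source` on the extended `ProcessedTicket` is a plain string, a small type guard keeps the Graylog-specific UI branches explicit instead of scattering string comparisons. The helper and the narrowed interface below are illustrative, not part of the plan's API:

```typescript
// Illustrative subset of the extended ProcessedTicket shape.
interface TicketSourceFields {
  source: string;
  source_ref: string | null;
}

// Narrow a ticket to the Graylog source so source_ref handling stays explicit.
function isGraylogTicket(ticket: TicketSourceFields): boolean {
  return ticket.source === "graylog";
}

const graylogTicket: TicketSourceFields = { source: "graylog", source_ref: "subject-123" };
const tuleapTicket: TicketSourceFields = { source: "tuleap", source_ref: null };
console.log(isGraylogTicket(graylogTicket), isGraylogTicket(tuleapTicket)); // → true false
```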
- [ ] **Step 2: Add Graylog API wrappers**
In `src/lib/api.ts`:
```ts
export async function getGraylogCredentials(projectId: string): Promise<GraylogCredentialsSafe | null> {
return invoke("get_graylog_credentials", { projectId });
}
export async function setGraylogCredentials(
projectId: string,
baseUrl: string,
apiToken: string,
analystAgentId: string,
developerAgentId: string,
streamId: string | null,
queryFilter: string,
pollingIntervalMinutes: number,
lookbackMinutes: number,
scoreThreshold: number
): Promise<GraylogCredentialsSafe> {
return invoke("set_graylog_credentials", {
projectId,
baseUrl,
apiToken,
analystAgentId,
developerAgentId,
streamId,
queryFilter,
pollingIntervalMinutes,
lookbackMinutes,
scoreThreshold,
});
}
export async function listGraylogSubjects(projectId: string): Promise<GraylogSubject[]> {
return invoke("list_graylog_subjects", { projectId });
}
export async function listGraylogDetections(projectId: string, subjectId?: string): Promise<GraylogDetection[]> {
return invoke("list_graylog_detections", { projectId, subjectId });
}
export async function testGraylogConnection(projectId: string): Promise<string> {
return invoke("test_graylog_connection", { projectId });
}
export async function manualGraylogPoll(projectId: string): Promise<number> {
return invoke("manual_graylog_poll", { projectId });
}
```
- [ ] **Step 3: Run frontend typecheck**
Run:
```bash
cd /home/leclere/Projets/IA/orchai
npm run qa:frontend
```
Expected: PASS.
- [ ] **Step 4: Commit**
```bash
cd /home/leclere/Projets/IA/orchai
git add src/lib/types.ts src/lib/api.ts
git commit -m "feat(frontend): add graylog types and API bindings"
```
---
### Task 8: Build Graylog project UI and live activity hooks
**Files:**
- Create: `src/components/projects/ProjectGraylog.tsx`
- Modify: `src/components/projects/ProjectDashboard.tsx`
- Modify: `src/components/projects/ProjectModules.tsx`
- Modify: `src/components/tickets/TicketList.tsx`
- Modify: `src/components/tickets/TicketDetail.tsx`
- Modify: `src/App.tsx`
- [ ] **Step 1: Add route and dashboard entry**
In `src/App.tsx` add:
```tsx
import ProjectGraylog from "./components/projects/ProjectGraylog";
```
```tsx
<Route path="/projects/:projectId/graylog" element={<ProjectGraylog />} />
```
In `ProjectDashboard.tsx` add Graylog card:
```tsx
<Link
to={`/projects/${project.id}/graylog`}
className="rounded-lg border border-gray-200 bg-white p-4 hover:border-gray-300"
>
<div className="text-sm font-semibold text-gray-900">Graylog</div>
<div className="mt-1 text-xs text-gray-500">
Configure le polling Graylog et surveille les sujets scorés.
</div>
</Link>
```
- [ ] **Step 2: Add Graylog live events on dashboard**
In `ProjectDashboard.tsx` add listeners:
```tsx
listen("graylog-polling-started", (event) => {
const payload = event.payload as { project_id: string };
if (payload.project_id !== projectId) return;
appendActivity("info", "Polling Graylog lancé.");
});
listen("graylog-subject-triggered", (event) => {
const payload = event.payload as { project_id: string; score: number; subject_id: string };
if (payload.project_id !== projectId) return;
appendActivity("success", `Sujet Graylog déclenché (score ${payload.score}).`);
void loadData();
});
listen("graylog-polling-error", (event) => {
const payload = event.payload as { project_id: string; error: string };
if (payload.project_id !== projectId) return;
appendActivity("error", `Erreur Graylog: ${payload.error}`);
});
```
- [ ] **Step 3: Create `ProjectGraylog.tsx` page**
Create `src/components/projects/ProjectGraylog.tsx` with:
```tsx
import { FormEvent, useEffect, useState } from "react";
import { Link, useParams } from "react-router-dom";
import {
getGraylogCredentials,
listAgents,
listGraylogDetections,
listGraylogSubjects,
manualGraylogPoll,
setGraylogCredentials,
testGraylogConnection,
} from "../../lib/api";
import type { Agent, GraylogCredentialsSafe, GraylogDetection, GraylogSubject } from "../../lib/types";
import { getErrorMessage } from "../../lib/errors";
export default function ProjectGraylog() {
const { projectId } = useParams<{ projectId: string }>();
const [agents, setAgents] = useState<Agent[]>([]);
const [credentials, setCredentials] = useState<GraylogCredentialsSafe | null>(null);
const [subjects, setSubjects] = useState<GraylogSubject[]>([]);
const [detections, setDetections] = useState<GraylogDetection[]>([]);
const [baseUrl, setBaseUrl] = useState("");
const [apiToken, setApiToken] = useState("");
const [analystAgentId, setAnalystAgentId] = useState("");
const [developerAgentId, setDeveloperAgentId] = useState("");
const [streamId, setStreamId] = useState("");
const [queryFilter, setQueryFilter] = useState("level:(critical OR error OR warning)");
const [pollingIntervalMinutes, setPollingIntervalMinutes] = useState(10);
const [lookbackMinutes, setLookbackMinutes] = useState(30);
const [scoreThreshold, setScoreThreshold] = useState(70);
const [loading, setLoading] = useState(false);
const [saving, setSaving] = useState(false);
const [error, setError] = useState<string | null>(null);
const [success, setSuccess] = useState<string | null>(null);
async function refresh() {
if (!projectId) return;
setLoading(true);
setError(null);
try {
const [agentList, creds, subjectList, detectionList] = await Promise.all([
listAgents(),
getGraylogCredentials(projectId),
listGraylogSubjects(projectId),
listGraylogDetections(projectId),
]);
setAgents(agentList);
setCredentials(creds);
setSubjects(subjectList);
setDetections(detectionList);
if (creds) {
setBaseUrl(creds.base_url);
setAnalystAgentId(creds.analyst_agent_id);
setDeveloperAgentId(creds.developer_agent_id);
setStreamId(creds.stream_id ?? "");
setQueryFilter(creds.query_filter);
setPollingIntervalMinutes(creds.polling_interval_minutes);
setLookbackMinutes(creds.lookback_minutes);
setScoreThreshold(creds.score_threshold);
}
} catch (err: unknown) {
setError(getErrorMessage(err));
} finally {
setLoading(false);
}
}
useEffect(() => {
void refresh();
}, [projectId]);
async function handleSave(event: FormEvent) {
event.preventDefault();
if (!projectId) return;
setSaving(true);
setError(null);
setSuccess(null);
try {
const saved = await setGraylogCredentials(
projectId,
baseUrl,
apiToken,
analystAgentId,
developerAgentId,
streamId.trim() || null,
queryFilter,
pollingIntervalMinutes,
lookbackMinutes,
scoreThreshold
);
setCredentials(saved);
setApiToken("");
setSuccess("Configuration Graylog sauvegardée.");
await refresh();
} catch (err: unknown) {
setError(getErrorMessage(err));
} finally {
setSaving(false);
}
}
async function handleTest() {
if (!projectId) return;
setError(null);
setSuccess(null);
try {
const message = await testGraylogConnection(projectId);
setSuccess(message);
} catch (err: unknown) {
setError(getErrorMessage(err));
}
}
async function handleManualPoll() {
if (!projectId) return;
setError(null);
setSuccess(null);
try {
const triggered = await manualGraylogPoll(projectId);
setSuccess(`Polling Graylog manuel terminé: ${triggered} sujet(s) déclenché(s).`);
await refresh();
} catch (err: unknown) {
setError(getErrorMessage(err));
}
}
const analysts = agents.filter((a) => a.role === "analyst");
const developers = agents.filter((a) => a.role === "developer");
return (
<div className="space-y-6 p-8">
<div className="flex items-center justify-between">
<h2 className="text-xl font-bold">Graylog</h2>
{projectId && <Link to={`/projects/${projectId}`} className="rounded bg-gray-200 px-4 py-2 text-sm text-gray-700 hover:bg-gray-300">Retour</Link>}
</div>
{error && <div className="rounded border border-red-200 bg-red-50 p-3 text-sm text-red-700">{error}</div>}
{success && <div className="rounded border border-green-200 bg-green-50 p-3 text-sm text-green-700">{success}</div>}
<form onSubmit={handleSave} className="rounded-lg border border-gray-200 bg-white p-4 space-y-3">
<h3 className="text-sm font-semibold text-gray-900">Configuration</h3>
<input className="w-full rounded border border-gray-300 px-3 py-2 text-sm" value={baseUrl} onChange={(e) => setBaseUrl(e.target.value)} placeholder="https://graylog.example.com" required />
<input className="w-full rounded border border-gray-300 px-3 py-2 text-sm" value={apiToken} onChange={(e) => setApiToken(e.target.value)} placeholder={credentials ? "Laisser vide pour conserver le token actuel" : "Token API Graylog"} required={!credentials} />
<div className="grid gap-3 md:grid-cols-2">
<select className="rounded border border-gray-300 px-3 py-2 text-sm" value={analystAgentId} onChange={(e) => setAnalystAgentId(e.target.value)} required>
<option value="">Analyst agent</option>
{analysts.map((agent) => <option key={agent.id} value={agent.id}>{agent.name}</option>)}
</select>
<select className="rounded border border-gray-300 px-3 py-2 text-sm" value={developerAgentId} onChange={(e) => setDeveloperAgentId(e.target.value)} required>
<option value="">Developer agent</option>
{developers.map((agent) => <option key={agent.id} value={agent.id}>{agent.name}</option>)}
</select>
</div>
<div className="grid gap-3 md:grid-cols-4">
<input className="rounded border border-gray-300 px-3 py-2 text-sm" value={streamId} onChange={(e) => setStreamId(e.target.value)} placeholder="stream_id (optionnel)" />
<input className="rounded border border-gray-300 px-3 py-2 text-sm md:col-span-3" value={queryFilter} onChange={(e) => setQueryFilter(e.target.value)} />
</div>
<div className="grid gap-3 md:grid-cols-3">
<input type="number" className="rounded border border-gray-300 px-3 py-2 text-sm" value={pollingIntervalMinutes} onChange={(e) => setPollingIntervalMinutes(Number(e.target.value))} min={1} />
<input type="number" className="rounded border border-gray-300 px-3 py-2 text-sm" value={lookbackMinutes} onChange={(e) => setLookbackMinutes(Number(e.target.value))} min={1} />
<input type="number" className="rounded border border-gray-300 px-3 py-2 text-sm" value={scoreThreshold} onChange={(e) => setScoreThreshold(Number(e.target.value))} min={1} max={100} />
</div>
<div className="flex gap-2">
<button type="submit" disabled={saving} className="rounded bg-blue-600 px-4 py-2 text-sm text-white hover:bg-blue-700 disabled:opacity-50">{saving ? "Sauvegarde..." : "Sauvegarder"}</button>
<button type="button" onClick={() => void handleTest()} className="rounded bg-gray-200 px-4 py-2 text-sm text-gray-800 hover:bg-gray-300">Tester la connexion</button>
<button type="button" onClick={() => void handleManualPoll()} className="rounded bg-gray-900 px-4 py-2 text-sm text-white hover:bg-black">Poll manuel</button>
</div>
</form>
<div className="rounded-lg border border-gray-200 bg-white p-4">
<h3 className="mb-3 text-sm font-semibold text-gray-900">Sujets détectés</h3>
{loading ? <div className="text-sm text-gray-500">Chargement...</div> : (
<div className="space-y-2">
{subjects.map((subject) => (
<div key={subject.id} className="rounded border border-gray-100 p-3">
<div className="text-sm font-medium text-gray-900">{subject.source}</div>
<div className="text-xs text-gray-600">{subject.normalized_message}</div>
<div className="mt-1 text-xs text-gray-500">Score: {subject.last_score} | Statut: {subject.status} | Last seen: {new Date(subject.last_seen_at).toLocaleString()}</div>
</div>
))}
{subjects.length === 0 && <div className="text-sm text-gray-400">Aucun sujet détecté.</div>}
</div>
)}
</div>
<div className="rounded-lg border border-gray-200 bg-white p-4">
<h3 className="mb-3 text-sm font-semibold text-gray-900">Dernières détections</h3>
<div className="space-y-2">
{detections.slice(0, 20).map((detection) => (
<div key={detection.id} className="rounded border border-gray-100 p-3 text-xs text-gray-700">
score={detection.score} total={detection.total_count} triggered={String(detection.triggered)} at={new Date(detection.created_at).toLocaleString()}
</div>
))}
{detections.length === 0 && <div className="text-sm text-gray-400">Aucune détection enregistrée.</div>}
</div>
</div>
</div>
);
}
```
- [ ] **Step 4: Add Graylog shortcut in module cards**
In `src/components/projects/ProjectModules.tsx`, inside each module card:
```tsx
{projectId && mod.module_key === "graylog_polling_auto_resolve" && (
<Link
to={`/projects/${projectId}/graylog`}
className="mt-2 inline-flex rounded bg-gray-100 px-2 py-1 text-xs text-gray-700 hover:bg-gray-200"
>
Configurer
</Link>
)}
```
- [ ] **Step 5: Add source awareness in ticket views**
In `TicketList.tsx` ticket row header:
```tsx
<span className="rounded bg-gray-100 px-2 py-0.5 text-[10px] uppercase tracking-wide text-gray-600">
{ticket.source}
</span>
```
In `TicketDetail.tsx` info tab:
```tsx
<div>
<span className="text-sm text-gray-500">Source:</span>
<span className="ml-2 text-sm uppercase">{ticket.source}</span>
</div>
{ticket.source_ref && (
<div>
<span className="text-sm text-gray-500">Source ref:</span>
<span className="ml-2 font-mono text-sm">{ticket.source_ref}</span>
</div>
)}
```
- [ ] **Step 6: Run frontend QA and production build**
Run:
```bash
cd /home/leclere/Projets/IA/orchai
npm run qa:frontend
npm run build
```
Expected: PASS (typecheck + production bundle build).
- [ ] **Step 7: Commit**
```bash
cd /home/leclere/Projets/IA/orchai
git add src/components/projects/ProjectGraylog.tsx src/components/projects/ProjectDashboard.tsx src/components/projects/ProjectModules.tsx src/components/tickets/TicketList.tsx src/components/tickets/TicketDetail.tsx src/App.tsx
git commit -m "feat(ui): add graylog project page and live subject visibility"
```
---
### Task 9: End-to-end verification and cleanup
**Files:**
- Modify: any files that need QA fixes discovered during this task
- [ ] **Step 1: Run full backend QA**
Run:
```bash
cd /home/leclere/Projets/IA/orchai
npm run qa:backend:check
npm run qa:backend:test
```
Expected: PASS.
- [ ] **Step 2: Run full project QA**
Run:
```bash
cd /home/leclere/Projets/IA/orchai
npm run qa
```
Expected: PASS with no clippy warnings.
- [ ] **Step 3: Manual validation checklist**
Run app:
```bash
cd /home/leclere/Projets/IA/orchai
npm run tauri dev
```
Validate:
1. Module `Polling Graylog + auto-resolve` appears in project modules.
2. Graylog config page saves credentials and tests connection.
3. Manual Graylog poll creates subject rows and detection history.
4. A high-score subject creates at most one queue item while its ticket is active (strict dedup).
5. Ticket detail shows `source=graylog` and still supports retry/cancel path.
- [ ] **Step 4: Final commit for follow-up fixes**
```bash
cd /home/leclere/Projets/IA/orchai
git status --short
# If non-empty, continue:
git add -A
git commit -m "fix(graylog): complete integration polish and qa fixes"
```
---
## Spec Coverage Checklist (self-review)
1. **Project-scoped credentials:** covered by Tasks 1, 3, 5, 8.
2. **Subject grouping (`source + normalized_message`):** covered by Task 4 and consumed in Task 6.
3. **Deterministic scoring (`severity + frequency + recency`):** covered by Task 4.
4. **Strict dedup (single active branch per subject):** covered by Task 6 tests and logic.
5. **Reuse existing `analyst -> developer` pipeline:** covered by Task 2 + Task 6 orchestrator source-aware updates.
6. **UI module/config/visibility:** covered by Tasks 7 and 8.
7. **Observability events (`graylog-*`):** covered by Tasks 6 and 8.
8. **Rollout safety via full QA:** covered by Task 9.