docs/superpowers/plans/2026-04-15-drop-verify-and-jobs-page.md (new file, 813 lines)

# Drop verify/checkmarks, merge jobs view into item details — Implementation Plan

> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** Rip out the post-job verification path entirely (DB column, SSE event, handoff function), delete the standalone `/execute` page, and surface per-item job info (status, command, log, run/cancel) on the item details page. Batch queue controls move into the Pipeline column headers.

**Architecture:** Rescan becomes the single source of truth for "is this file still done?" — the verified flag and the Jellyfin refresh handoff are no longer needed. The Jobs page disappears; its per-item info is enriched onto `GET /api/review/:id` and rendered inline on the details page. Batch controls (`Run all`, `Clear queue`, `Clear`) sit in the existing `ColumnShell` `actions` slot.

**Tech Stack:** Bun + Hono (server), React 19 + TanStack Router (client), bun:sqlite.

---

## File Structure

**Backend:**
- `server/db/index.ts` — add `DROP COLUMN verified` migration
- `server/db/schema.ts` — remove `verified` from `review_plans` DDL
- `server/services/rescan.ts` — remove `verified` from INSERT/UPDATE logic
- `server/api/review.ts` — drop `rp.verified` from pipeline SELECT, drop `verified = 0` from unapprove, enrich `loadItemDetail` with latest job
- `server/api/execute.ts` — delete `handOffToJellyfin`, `emitPlanUpdate`, `POST /verify-unverified`, `GET /` (list endpoint) and the plan_update emissions from job lifecycle
- `server/types.ts` — drop `verified` from `ReviewPlan`, add job shape on detail response
- `server/services/__tests__/webhook.test.ts` — delete the `webhook_verified flag` describe block

**Frontend:**
- `src/routes/execute.tsx` — delete (file)
- `src/features/execute/ExecutePage.tsx` — delete (file)
- `src/shared/lib/types.ts` — drop `verified` from `PipelineJobItem` and `ReviewPlan`; add `job` field to `DetailData`
- `src/routes/__root.tsx` — remove `Jobs` nav link
- `src/features/pipeline/PipelinePage.tsx` — remove `plan_update` SSE listener, remove the `Start queue` header button
- `src/features/pipeline/DoneColumn.tsx` — remove verify button, `unverifiedCount`, `verified`/`✓✓` glyph
- `src/features/pipeline/QueueColumn.tsx` — add `Run all` + `Clear queue` actions
- `src/features/review/AudioDetailPage.tsx` — add JobSection

**Plan ordering rationale:** Backend DB migration first (Task 1) so the schema drift from the `verified` column doesn't break tests. Then server logic deletions (Task 2). Then server additions (Task 3). Frontend follows in dependency order: types → route deletion → column updates → details enrichment.

---

## Task 1: Drop `verified` column from DB + backend references

**Files:**
- Modify: `server/db/index.ts` (migration block)
- Modify: `server/db/schema.ts:77`
- Modify: `server/services/rescan.ts:233-270`
- Modify: `server/api/review.ts:330, 773`
- Modify: `server/types.ts` (ReviewPlan interface)
- Modify: `server/services/__tests__/webhook.test.ts:186-240`

- [ ] **Step 1: Add idempotent migration in `server/db/index.ts`**

Locate the existing block of `alter(...)` calls (around line 76 where `webhook_verified` was added and renamed). Append a new call at the end so it runs on the next startup for existing databases:

```ts
alter("ALTER TABLE review_plans DROP COLUMN verified");
```

The `alter()` helper wraps each statement in try/catch, so on a fresh DB (where `verified` never existed because we'll remove it from schema.ts) the DROP is a no-op, and on an existing DB it removes the column once.

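The try/catch idempotency pattern described above can be sketched in isolation like this (a minimal sketch: `makeAlter` and the fake `Db` are hypothetical stand-ins, not the real bun:sqlite wiring in `server/db/index.ts`):

```typescript
// Each ALTER runs once; any "no such column" / "duplicate column" error
// from a re-run is swallowed, so startup can replay the full list safely.
type Db = { run: (sql: string) => void };

function makeAlter(db: Db) {
  return (sql: string) => {
    try {
      db.run(sql);
    } catch {
      // Already applied on a previous startup: ignore and continue.
    }
  };
}

// Stand-in "database": throws on the second attempt, the way SQLite
// throws once the column is already dropped.
const applied = new Set<string>();
const db: Db = {
  run(sql) {
    if (applied.has(sql)) throw new Error("no such column: verified");
    applied.add(sql);
  },
};

const alter = makeAlter(db);
alter("ALTER TABLE review_plans DROP COLUMN verified"); // first startup: drops the column
alter("ALTER TABLE review_plans DROP COLUMN verified"); // later startups: error swallowed
console.log(applied.size); // 1
```

The same shape covers both fresh databases (statement fails harmlessly) and existing ones (statement applies exactly once).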
- [ ] **Step 2: Remove `verified` from schema.ts**

Open `server/db/schema.ts`. Find the `review_plans` CREATE TABLE block (around line 77) and delete the line:

```ts
verified INTEGER NOT NULL DEFAULT 0,
```

- [ ] **Step 3: Remove `verified` from rescan.ts INSERT/UPDATE**

Open `server/services/rescan.ts` around lines 223–272.

First, trim the block comment immediately above the `db.prepare(...)` call. Delete the paragraph that starts `` `verified` tracks whether we have independent confirmation... `` (lines 232–238 in the current file). Keep the "Status transition rules" paragraph above it.

Then replace the INSERT/ON CONFLICT statement and its `.run(...)` args with the variant that has no `verified` column:

```ts
db
  .prepare(`
    INSERT INTO review_plans (item_id, status, is_noop, confidence, apple_compat, job_type, notes)
    VALUES (?, 'pending', ?, ?, ?, ?, ?)
    ON CONFLICT(item_id) DO UPDATE SET
      status = CASE
        WHEN excluded.is_noop = 1 THEN 'done'
        WHEN review_plans.status = 'done' AND ? = 'webhook' THEN 'pending'
        WHEN review_plans.status = 'done' THEN 'done'
        WHEN review_plans.status = 'error' THEN 'pending'
        ELSE review_plans.status
      END,
      is_noop = excluded.is_noop,
      confidence = excluded.confidence,
      apple_compat = excluded.apple_compat,
      job_type = excluded.job_type,
      notes = excluded.notes
  `)
  .run(
    itemId,
    analysis.is_noop ? 1 : 0,
    confidence,
    analysis.apple_compat,
    analysis.job_type,
    analysis.notes.length > 0 ? analysis.notes.join("\n") : null,
    source, // for the CASE WHEN ? = 'webhook' branch
  );
```

Note: the parameter list drops the two `verified`-related bindings (previously `analysis.is_noop ? 1 : 0` was passed twice for the verified CASE, and `source` was passed twice). Verify that the number of `?` placeholders in the SQL (7) matches the `.run()` argument count (7).

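The 7-and-7 check can also be done mechanically (the SQL string below is an abridged stand-in for the real statement, and the argument values are illustrative):

```typescript
// Count '?' placeholders and compare against the bound-argument count,
// the same check the step above asks you to do by eye.
const sql = `
  INSERT INTO review_plans (item_id, status, is_noop, confidence, apple_compat, job_type, notes)
  VALUES (?, 'pending', ?, ?, ?, ?, ?)
  ON CONFLICT(item_id) DO UPDATE SET
    status = CASE WHEN review_plans.status = 'done' AND ? = 'webhook' THEN 'pending'
                  ELSE review_plans.status END`;
const placeholders = (sql.match(/\?/g) ?? []).length;
const args = [1, 0, 0.95, 1, "remux", null, "webhook"]; // itemId … source (stand-in values)
console.log(placeholders, args.length); // 7 7
```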
- [ ] **Step 4: Remove `rp.verified` from the pipeline SELECT**

Open `server/api/review.ts` around line 330. In the `done` query, change:

```ts
SELECT j.*, mi.name, mi.series_name, mi.type,
       rp.job_type, rp.apple_compat, rp.verified
```

to:

```ts
SELECT j.*, mi.name, mi.series_name, mi.type,
       rp.job_type, rp.apple_compat
```

- [ ] **Step 5: Remove `verified = 0` from unapprove UPDATE**

Open `server/api/review.ts` around line 773. Change:

```ts
db.prepare("UPDATE review_plans SET status = 'pending', verified = 0, reviewed_at = NULL WHERE id = ?").run(plan.id);
```

to:

```ts
db.prepare("UPDATE review_plans SET status = 'pending', reviewed_at = NULL WHERE id = ?").run(plan.id);
```

- [ ] **Step 6: Remove `verified` from the `ReviewPlan` type**

Open `server/types.ts`. Find the `ReviewPlan` interface and delete the `verified: number;` line (around line 68).

- [ ] **Step 7: Delete the webhook_verified test block**

Open `server/services/__tests__/webhook.test.ts`. Find the block `describe("processWebhookEvent — webhook_verified flag", …)` starting at line 186 and delete through its closing `});` at line 240.

- [ ] **Step 8: Run the test suite**

Run: `bun test`
Expected: PASS with the full suite green (the remaining webhook tests, analyzer tests, etc.).

If any test fails with "no such column: verified", grep for remaining references:
```bash
rg "verified" server/
```
and remove each occurrence.

- [ ] **Step 9: Commit**

```bash
git add server/db/index.ts server/db/schema.ts server/services/rescan.ts server/api/review.ts server/types.ts server/services/__tests__/webhook.test.ts
git commit -m "drop review_plans.verified column and all its references"
```

---

## Task 2: Delete verification path in `server/api/execute.ts`

**Files:**
- Modify: `server/api/execute.ts`

- [ ] **Step 1: Delete `handOffToJellyfin` function**

Open `server/api/execute.ts`. Delete the entire `handOffToJellyfin` function and its JSDoc, spanning roughly lines 28–98 (from the block comment starting `/**\n * Post-job verification…` through the closing brace of the function).

Also delete the now-unused imports at the top that only this function used:

```ts
import { getItem, refreshItem } from "../services/jellyfin";
import { loadRadarrLibrary, radarrUsable } from "../services/radarr";
import { loadSonarrLibrary, sonarrUsable } from "../services/sonarr";
import { upsertJellyfinItem } from "../services/rescan";
import type { RescanConfig } from "../services/rescan";
import { getAllConfig } from "../db";
```

(Only delete the ones not used elsewhere in the file. Run the TS check in Step 6 to catch any that are still needed.)

- [ ] **Step 2: Delete `emitPlanUpdate` function**

In the same file, find `emitPlanUpdate` (around line 183) and delete the function and its block comment (lines 176–186).

- [ ] **Step 3: Remove calls to `handOffToJellyfin` from the job lifecycle**

There are two call sites at lines 492 and 609, each wrapped in `.catch(...)`. Find both instances that look like:

```ts
handOffToJellyfin(job.item_id).catch((err) =>
  logError(`handOffToJellyfin for item ${job.item_id} failed:`, err),
);
```

Delete both blocks entirely.

- [ ] **Step 4: Delete the `/verify-unverified` endpoint**

In the same file, find and delete the whole block starting with the comment `// ─── Verify all unverified done plans ───` and the `app.post("/verify-unverified", …)` handler below it (approximately lines 357–389).

- [ ] **Step 5: Delete the `GET /` list endpoint**

Find the handler mounted at `app.get("/", (c) => { ... })` that returns the filtered jobs list (the one used by the Execute page). Delete the whole block including its preceding comment.

To locate: it reads `filter` from `c.req.query("filter")`, runs a SELECT joining `jobs` with `media_items`, and returns `{ jobs, filter, totalCounts }`.

- [ ] **Step 6: Run TypeScript compile**

Run: `bun --bun tsc --noEmit --project tsconfig.server.json`
Expected: PASS with no unused import warnings.

If the compiler complains about unused imports, remove them.

- [ ] **Step 7: Run lint**

Run: `bun run lint`
Expected: PASS.

- [ ] **Step 8: Commit**

```bash
git add server/api/execute.ts
git commit -m "rip out jellyfin handoff verification path and verify-unverified endpoint"
```

---

## Task 3: Enrich `loadItemDetail` with the latest job

**Files:**
- Modify: `server/api/review.ts:111-126`
- Modify: `server/types.ts` (add exported `DetailJob` shape or similar if helpful)

- [ ] **Step 1: Add latest-job query to `loadItemDetail`**

Open `server/api/review.ts` around line 111. Replace the body with the job enrichment:

```ts
function loadItemDetail(db: ReturnType<typeof getDb>, itemId: number) {
  const item = db.prepare("SELECT * FROM media_items WHERE id = ?").get(itemId) as MediaItem | undefined;
  if (!item) return { item: null, streams: [], plan: null, decisions: [], command: null, job: null };

  const streams = db
    .prepare("SELECT * FROM media_streams WHERE item_id = ? ORDER BY stream_index")
    .all(itemId) as MediaStream[];
  const plan = db.prepare("SELECT * FROM review_plans WHERE item_id = ?").get(itemId) as ReviewPlan | undefined | null;
  const decisions = plan
    ? (db.prepare("SELECT * FROM stream_decisions WHERE plan_id = ?").all(plan.id) as StreamDecision[])
    : [];

  const command = plan && !plan.is_noop ? buildCommand(item, streams, decisions) : null;

  const job = db
    .prepare(
      `SELECT id, item_id, command, job_type, status, output, exit_code,
              created_at, started_at, completed_at
         FROM jobs WHERE item_id = ? ORDER BY created_at DESC LIMIT 1`,
    )
    .get(itemId) as Job | undefined;

  return { item, streams, plan: plan ?? null, decisions, command, job: job ?? null };
}
```

Add the `Job` type import at the top of the file if not already imported:

```ts
import type { Job, MediaItem, MediaStream, ReviewPlan, StreamDecision } from "../types";
```

- [ ] **Step 2: Run the test suite**

Run: `bun test`
Expected: PASS.

- [ ] **Step 3: Smoke-test the endpoint manually**

Start the server: `bun run dev:server`
In another terminal:
```bash
curl -s http://localhost:3000/api/review/1 | jq '.job'
```
Expected: either `null` (no jobs ever) or a job object with the fields above.

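For orientation, a successful smoke test might print a `.job` object shaped roughly like this (all field values below are illustrative, not real output; the keys match the columns selected in Step 1):

```json
{
  "id": 42,
  "item_id": 1,
  "command": "ffmpeg -i ...",
  "job_type": "remux",
  "status": "done",
  "output": "...tail of the job log...",
  "exit_code": 0,
  "created_at": "2026-04-15 10:00:00",
  "started_at": "2026-04-15 10:00:05",
  "completed_at": "2026-04-15 10:01:12"
}
```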
Kill the dev server with Ctrl-C after confirming.

- [ ] **Step 4: Commit**

```bash
git add server/api/review.ts
git commit -m "enrich GET /api/review/:id with the latest job row"
```

---

## Task 4: Update client types (drop verified, add job on DetailData)

**Files:**
- Modify: `src/shared/lib/types.ts`
- Modify: `src/features/review/AudioDetailPage.tsx:13-19` (local `DetailData` interface)

- [ ] **Step 1: Remove `verified` from `ReviewPlan`**

In `src/shared/lib/types.ts`, read the `ReviewPlan` interface (lines 44–56). This client-side type shouldn't currently include `verified`; if it does, delete that line. If not, skip this sub-step.

- [ ] **Step 2: Remove `verified` from `PipelineJobItem`**

In the same file around line 161, delete:

```ts
// 1 when an independent post-hoc check confirms the on-disk file matches
// the plan (ffprobe after a job, or is_noop=1 on the very first scan).
// Renders as the second checkmark in the Done column.
verified?: number;
```

- [ ] **Step 3: Update `DetailData` in `AudioDetailPage.tsx`**

Open `src/features/review/AudioDetailPage.tsx` at line 13 and replace the interface with:

```ts
interface DetailData {
  item: MediaItem;
  streams: MediaStream[];
  plan: ReviewPlan | null;
  decisions: StreamDecision[];
  command: string | null;
  job: Job | null;
}
```

Add `Job` to the imports at line 9:

```ts
import type { Job, MediaItem, MediaStream, ReviewPlan, StreamDecision } from "~/shared/lib/types";
```

- [ ] **Step 4: Run lint**

Run: `bun run lint`
Expected: PASS.

- [ ] **Step 5: Commit**

```bash
git add src/shared/lib/types.ts src/features/review/AudioDetailPage.tsx
git commit -m "client types: drop verified, add job on DetailData"
```

---

## Task 5: Delete the Execute page, route, and nav link

**Files:**
- Delete: `src/features/execute/ExecutePage.tsx`
- Delete: `src/routes/execute.tsx`
- Modify: `src/routes/__root.tsx:72`
- Delete (if empty after file removal): `src/features/execute/`

- [ ] **Step 1: Delete the files**

```bash
rm src/features/execute/ExecutePage.tsx src/routes/execute.tsx
rmdir src/features/execute 2>/dev/null || true
```

- [ ] **Step 2: Remove the `Jobs` nav link**

Open `src/routes/__root.tsx` at line 72 and delete:

```tsx
<NavLink to="/execute">Jobs</NavLink>
```

- [ ] **Step 3: Regenerate the TanStack Router tree**

The router typegen runs in dev. Start the dev client briefly to regenerate `src/routeTree.gen.ts`:

Run: `bun run dev:client &`
Wait 3 seconds for Vite to finish the initial build and regenerate the tree, then kill it:
```bash
sleep 3 && kill %1
```

Alternatively, if the TanStack Router CLI is available, `bunx @tanstack/router-cli generate` does the same thing.

- [ ] **Step 4: Run build to confirm no dangling imports**

Run: `bun run build`
Expected: PASS with no errors about missing `/execute` route or missing `ExecutePage` import.

- [ ] **Step 5: Commit**

```bash
git add -A src/
git commit -m "delete /execute page, route, and Jobs nav link"
```

---

## Task 6: Simplify DoneColumn (remove verify button + checkmark glyph)

**Files:**
- Modify: `src/features/pipeline/DoneColumn.tsx`

- [ ] **Step 1: Rewrite DoneColumn with the glyph and verify button removed**

Replace the entire file contents with:

```tsx
import { Link } from "@tanstack/react-router";
import { Badge } from "~/shared/components/ui/badge";
import { api } from "~/shared/lib/api";
import type { PipelineJobItem } from "~/shared/lib/types";
import { ColumnShell } from "./ColumnShell";

interface DoneColumnProps {
  items: PipelineJobItem[];
  onMutate: () => void;
}

export function DoneColumn({ items, onMutate }: DoneColumnProps) {
  const clear = async () => {
    await api.post("/api/execute/clear-completed");
    onMutate();
  };

  const reopen = async (itemId: number) => {
    await api.post(`/api/review/${itemId}/reopen`);
    onMutate();
  };

  const actions = items.length > 0 ? [{ label: "Clear", onClick: clear }] : undefined;

  return (
    <ColumnShell title="Done" count={items.length} actions={actions}>
      {items.map((item) => (
        <div key={item.id} className="group rounded border bg-white p-2">
          <Link
            to="/review/audio/$id"
            params={{ id: String(item.item_id) }}
            className="text-xs font-medium truncate block hover:text-blue-600 hover:underline"
          >
            {item.name}
          </Link>
          <div className="flex items-center gap-1.5 mt-0.5">
            <Badge variant={item.status === "done" ? "done" : "error"}>{item.status}</Badge>
            <div className="flex-1" />
            <button
              type="button"
              onClick={() => reopen(item.item_id)}
              title="Send this item back to the Review column to redecide and re-queue"
              className="text-[0.68rem] px-1.5 py-0.5 rounded border border-gray-300 bg-white text-gray-700 hover:bg-gray-100 opacity-0 group-hover:opacity-100 transition-opacity shrink-0"
            >
              ← Back to review
            </button>
          </div>
        </div>
      ))}
      {items.length === 0 && <p className="text-sm text-gray-400 text-center py-8">No completed items</p>}
    </ColumnShell>
  );
}
```

- [ ] **Step 2: Run lint**

Run: `bun run lint`
Expected: PASS.

- [ ] **Step 3: Commit**

```bash
git add src/features/pipeline/DoneColumn.tsx
git commit -m "done column: drop checkmark glyph and verify-unverified button"
```

---

## Task 7: Add batch controls to QueueColumn header, remove Start queue from Pipeline header

**Files:**
- Modify: `src/features/pipeline/QueueColumn.tsx`
- Modify: `src/features/pipeline/PipelinePage.tsx`

- [ ] **Step 1: Update QueueColumn to expose `Run all` + `Clear queue`**

Replace the entire file with:

```tsx
import { api } from "~/shared/lib/api";
import type { PipelineJobItem } from "~/shared/lib/types";
import { ColumnShell } from "./ColumnShell";
import { PipelineCard } from "./PipelineCard";

interface QueueColumnProps {
  items: PipelineJobItem[];
  jellyfinUrl: string;
  onMutate: () => void;
}

export function QueueColumn({ items, jellyfinUrl, onMutate }: QueueColumnProps) {
  const runAll = async () => {
    await api.post("/api/execute/start");
    onMutate();
  };
  const clear = async () => {
    if (!confirm(`Cancel all ${items.length} pending jobs?`)) return;
    await api.post("/api/execute/clear");
    onMutate();
  };
  const unapprove = async (itemId: number) => {
    await api.post(`/api/review/${itemId}/unapprove`);
    onMutate();
  };

  const actions =
    items.length > 0
      ? [
          { label: "Run all", onClick: runAll, primary: true },
          { label: "Clear", onClick: clear },
        ]
      : undefined;

  return (
    <ColumnShell title="Queued" count={items.length} actions={actions}>
      <div className="space-y-2">
        {items.map((item) => (
          <PipelineCard key={item.id} item={item} jellyfinUrl={jellyfinUrl} onUnapprove={() => unapprove(item.item_id)} />
        ))}
        {items.length === 0 && <p className="text-sm text-gray-400 text-center py-8">Queue empty</p>}
      </div>
    </ColumnShell>
  );
}
```

- [ ] **Step 2: Remove `Start queue` button from PipelinePage header**

Open `src/features/pipeline/PipelinePage.tsx`. In the header JSX around lines 89–97, delete the `Start queue` `<Button>` and the `startQueue` callback (around lines 34–37). The header should become:

```tsx
<div className="flex items-center justify-between px-6 py-3 border-b shrink-0">
  <h1 className="text-lg font-semibold">Pipeline</h1>
  <span className="text-sm text-gray-500">{data.doneCount} files in desired state</span>
</div>
```

Also remove the `Button` import at the top of the file if it's no longer used:

```tsx
import { Button } from "~/shared/components/ui/button";
```

- [ ] **Step 3: Run lint**

Run: `bun run lint`
Expected: PASS (no unused-import errors).

- [ ] **Step 4: Commit**

```bash
git add src/features/pipeline/QueueColumn.tsx src/features/pipeline/PipelinePage.tsx
git commit -m "pipeline: batch controls move to queued column header"
```

---

## Task 8: Remove `plan_update` SSE listener

**Files:**
- Modify: `src/features/pipeline/PipelinePage.tsx`

- [ ] **Step 1: Delete the plan_update listener**

In `src/features/pipeline/PipelinePage.tsx` around lines 67–72, delete:

```tsx
// plan_update lands ~15s after a job finishes — the post-job jellyfin
// verification writes verified=1 (or flips the plan back to pending).
// Without refreshing here the Done column would never promote ✓ to ✓✓.
es.addEventListener("plan_update", () => {
  scheduleReload();
});
```

The other listeners (`job_update`, `job_progress`, `queue_status`) stay untouched.

- [ ] **Step 2: Run lint**

Run: `bun run lint`
Expected: PASS.

- [ ] **Step 3: Commit**

```bash
git add src/features/pipeline/PipelinePage.tsx
git commit -m "pipeline: remove plan_update SSE listener (feature gone)"
```

---

## Task 9: Add JobSection to AudioDetailPage

**Files:**
- Modify: `src/features/review/AudioDetailPage.tsx`

- [ ] **Step 1: Add a `JobSection` component in the same file**

Near the bottom of `src/features/review/AudioDetailPage.tsx`, after the `TitleInput` component and before the `AudioDetailPage` export, add:

```tsx
interface JobSectionProps {
  itemId: number;
  job: Job;
  onMutate: () => void;
}

function JobSection({ itemId, job, onMutate }: JobSectionProps) {
  const [showCmd, setShowCmd] = useState(false);
  const [showLog, setShowLog] = useState(job.status === "error");
  const [liveStatus, setLiveStatus] = useState(job.status);
  const [liveOutput, setLiveOutput] = useState(job.output ?? "");
  const [progress, setProgress] = useState<{ seconds: number; total: number } | null>(null);

  // Keep local state in sync when parent fetches fresh data
  useEffect(() => {
    setLiveStatus(job.status);
    setLiveOutput(job.output ?? "");
  }, [job.status, job.output, job.id]);

  // Subscribe to SSE for live updates on this specific job id
  useEffect(() => {
    const es = new EventSource("/api/execute/events");
    es.addEventListener("job_update", (e) => {
      const d = JSON.parse((e as MessageEvent).data) as { id: number; status: string; output?: string };
      if (d.id !== job.id) return;
      setLiveStatus(d.status as Job["status"]);
      if (d.output !== undefined) setLiveOutput(d.output);
      if (d.status === "done" || d.status === "error") onMutate();
    });
    es.addEventListener("job_progress", (e) => {
      const d = JSON.parse((e as MessageEvent).data) as { id: number; seconds: number; total: number };
      if (d.id !== job.id) return;
      setProgress({ seconds: d.seconds, total: d.total });
    });
    return () => es.close();
  }, [job.id, onMutate]);

  const runJob = async () => {
    await api.post(`/api/execute/job/${job.id}/run`);
    onMutate();
  };
  const cancelJob = async () => {
    await api.post(`/api/execute/job/${job.id}/cancel`);
    onMutate();
  };
  const stopJob = async () => {
    await api.post("/api/execute/stop");
    onMutate();
  };

  const typeLabel = job.job_type === "transcode" ? "Audio Transcode" : "Audio Remux";
  const exitBadge = job.exit_code != null && job.exit_code !== 0 ? job.exit_code : null;

  return (
    <div className="mt-6 pt-4 border-t border-gray-200">
      <div className="text-gray-400 text-[0.75rem] uppercase tracking-[0.05em] mb-2">Job</div>
      <div className="flex items-center gap-2 flex-wrap mb-3">
        <Badge variant={liveStatus}>{liveStatus}</Badge>
        <Badge variant={job.job_type === "transcode" ? "manual" : "noop"}>{typeLabel}</Badge>
        {exitBadge != null && <Badge variant="error">exit {exitBadge}</Badge>}
        {job.started_at && (
          <span className="text-gray-500 text-[0.72rem]">started {job.started_at}</span>
        )}
        {job.completed_at && (
          <span className="text-gray-500 text-[0.72rem]">completed {job.completed_at}</span>
        )}
        <div className="flex-1" />
        <Button size="sm" variant="secondary" onClick={() => setShowCmd((v) => !v)}>
          Cmd
        </Button>
        {liveOutput && (
          <Button size="sm" variant="secondary" onClick={() => setShowLog((v) => !v)}>
            Log
          </Button>
        )}
        {liveStatus === "pending" && (
          <>
            <Button size="sm" onClick={runJob}>
              ▶ Run
            </Button>
            <Button size="sm" variant="secondary" onClick={cancelJob}>
              ✕ Cancel
            </Button>
          </>
        )}
        {liveStatus === "running" && (
          <Button size="sm" variant="secondary" onClick={stopJob}>
            ✕ Stop
          </Button>
        )}
      </div>
      {liveStatus === "running" && progress && progress.total > 0 && (
        <div className="h-1.5 bg-gray-200 rounded mb-3 overflow-hidden">
          <div
            className="h-full bg-blue-500 transition-[width] duration-500"
            style={{ width: `${Math.min(100, (progress.seconds / progress.total) * 100).toFixed(1)}%` }}
          />
        </div>
      )}
      {showCmd && (
        <div className="font-mono text-[0.74rem] bg-gray-50 text-gray-700 px-3 py-2 rounded max-h-[120px] overflow-y-auto whitespace-pre-wrap break-all mb-2">
          {job.command}
        </div>
      )}
      {showLog && liveOutput && (
        <div className="font-mono text-[0.74rem] bg-[#1a1a1a] text-[#d4d4d4] px-3 py-2 rounded max-h-[260px] overflow-y-auto whitespace-pre-wrap break-all">
          {liveOutput}
        </div>
      )}
    </div>
  );
}
```

Note: `Badge`'s `variant` prop must accept each of `"pending" | "running" | "done" | "error" | "manual" | "noop"`. Verify by opening `src/shared/components/ui/badge.tsx` — these variants already exist per the Execute page's use. If any are missing, add them there.

- [ ] **Step 2: Render `JobSection` inside `AudioDetailPage`**

In the same file, in the `AudioDetailPage` component's JSX, place the JobSection between the FFmpeg command textarea and the Approve/Skip buttons. Locate the existing block around lines 338–348 (the `{command && (...)}` section with the textarea) and add immediately below it:

```tsx
{data.job && <JobSection itemId={item.id} job={data.job} onMutate={load} />}
```

- [ ] **Step 3: Run lint**

Run: `bun run lint`
Expected: PASS.

- [ ] **Step 4: Run the dev server and verify manually**

Run: `bun run dev`
Open `http://localhost:5173`:
- Navigate to an item that has a pending job (approve one from Review, then go to its details page via the Queued card link) → confirm the Job section shows status `pending` and working `▶ Run` / `✕ Cancel` buttons.
- Click `▶ Run` → the status badge flips to `running` and the progress bar appears.
- When the job finishes → status flips to `done` and the Log button becomes available.
- Navigate to a done item → confirm Job section shows status `done`, `Cmd` and `Log` toggles work.

Kill the dev server with Ctrl-C.

- [ ] **Step 5: Commit**

```bash
git add src/features/review/AudioDetailPage.tsx
git commit -m "details: surface job status, command, log, and run/cancel inline"
```

---

## Task 10: Version bump, final build, CalVer commit

**Files:**
- Modify: `package.json` (version field)

- [ ] **Step 1: Bump the CalVer version**

Today is 2026-04-15. Read the current version in `package.json`; if it's already `2026.04.15.N`, increment `N`. Otherwise, set it to `2026.04.15.1`.

Edit `package.json`:
```json
"version": "2026.04.15.1"
```
(Use the next free `.N` suffix if `.1` was already used today.)
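The bump rule above can be expressed as a small pure helper. This is an illustrative sketch only; `nextCalVer` is a hypothetical name, not a script in the repo:

```typescript
// Sketch of the CalVer bump rule: a same-day version increments the trailing
// .N counter; any other version resets to <today>.1.
// `today` uses the same "YYYY.MM.DD" shape as the version prefix.
function nextCalVer(current: string, today: string): string {
  const prefix = `${today}.`;
  if (!current.startsWith(prefix)) return `${prefix}1`;
  const n = Number.parseInt(current.slice(prefix.length), 10);
  return `${prefix}${(Number.isNaN(n) ? 0 : n) + 1}`;
}
```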

- [ ] **Step 2: Run the full build**

Run: `bun run build`
Expected: PASS — Vite produces `dist/` cleanly.

- [ ] **Step 3: Run tests once more**

Run: `bun test`
Expected: PASS.

- [ ] **Step 4: Run lint**

Run: `bun run lint`
Expected: PASS.

- [ ] **Step 5: Commit**

```bash
git add package.json
git commit -m "v2026.04.15.1 — drop verify/checkmarks, merge jobs view into item details"
```

---

## Guided Gates (user-verified after deploy)

- **GG-1:** Done column shows cards with only a `done`/`error` badge — no ✓ or ✓✓ glyph.
- **GG-2:** Clicking a Done item → details page shows Job section below the FFmpeg command box, with `Cmd` and `Log` toggles.
- **GG-3:** Clicking a Queued item → details page shows a pending job with working `▶ Run` and `✕ Cancel`; running it updates the badge live.
- **GG-4:** `/execute` returns 404 in the browser.
- **GG-5:** `Run all` + `Clear` buttons appear in the Queued column header; `Clear` stays in the Done column header; the previous `Start queue` button in the Pipeline page header is gone.
- **GG-6:** `PRAGMA table_info(review_plans);` in the SQLite DB no longer lists `verified`.

---

New file: `docs/superpowers/plans/2026-04-15-review-lazy-load.md` (857 lines)

# Review column lazy-load + season grouping — Implementation Plan

> **For agentic workers:** Use superpowers:subagent-driven-development. Checkbox (`- [ ]`) syntax tracks progress.

**Goal:** Replace the 500-item review cap with group-paginated infinite scroll; nest season sub-groups inside series when they have pending work across >1 season; wire the existing `/season/:key/:season/approve-all` endpoint into the UI.

**Architecture:** Move the grouping logic from the client to the server so groups are always returned complete. New `GET /api/review/groups?offset=N&limit=25` endpoint. Client's ReviewColumn becomes a stateful list that extends itself via `IntersectionObserver` on a sentinel.

**Tech Stack:** Bun + Hono (server), React 19 + TanStack Router (client), bun:sqlite.

---

## Task 1: Server — build grouped data structure + new endpoint

**Files:**
- Modify: `server/api/review.ts`

- [ ] **Step 1: Add shared types + builder**

At the top of `server/api/review.ts` (near the other type definitions), add exported types:

```ts
export type ReviewGroup =
  | { kind: "movie"; item: PipelineReviewItem }
  | {
      kind: "series";
      seriesKey: string;
      seriesName: string;
      seriesJellyfinId: string | null;
      episodeCount: number;
      minConfidence: "high" | "low";
      originalLanguage: string | null;
      seasons: Array<{ season: number | null; episodes: PipelineReviewItem[] }>;
    };

export interface ReviewGroupsResponse {
  groups: ReviewGroup[];
  totalGroups: number;
  totalItems: number;
  hasMore: boolean;
}
```

Add a helper after the existing `enrichWithStreamsAndReasons` helper:

```ts
function buildReviewGroups(db: ReturnType<typeof getDb>): {
  groups: ReviewGroup[];
  totalItems: number;
} {
  // Fetch ALL pending non-noop items. Grouping + pagination happen in memory.
  const rows = db
    .prepare(`
      SELECT rp.*, mi.name, mi.series_name, mi.series_jellyfin_id,
             mi.jellyfin_id,
             mi.season_number, mi.episode_number, mi.type, mi.container,
             mi.original_language, mi.orig_lang_source, mi.file_path
      FROM review_plans rp
      JOIN media_items mi ON mi.id = rp.item_id
      WHERE rp.status = 'pending' AND rp.is_noop = 0
      ORDER BY
        CASE rp.confidence WHEN 'high' THEN 0 ELSE 1 END,
        COALESCE(mi.series_name, mi.name),
        mi.season_number, mi.episode_number
    `)
    .all() as PipelineReviewItem[];

  const movies: PipelineReviewItem[] = [];
  const seriesMap = new Map<
    string,
    {
      seriesName: string;
      seriesJellyfinId: string | null;
      seasons: Map<number | null, PipelineReviewItem[]>;
      originalLanguage: string | null;
      minConfidence: "high" | "low";
    }
  >();

  for (const row of rows) {
    if (row.type === "Movie") {
      movies.push(row);
      continue;
    }
    const key = row.series_jellyfin_id ?? row.series_name ?? String(row.item_id);
    let entry = seriesMap.get(key);
    if (!entry) {
      entry = {
        seriesName: row.series_name ?? "",
        seriesJellyfinId: row.series_jellyfin_id,
        seasons: new Map(),
        originalLanguage: row.original_language,
        minConfidence: row.confidence,
      };
      seriesMap.set(key, entry);
    }
    const season = row.season_number;
    let bucket = entry.seasons.get(season);
    if (!bucket) {
      bucket = [];
      entry.seasons.set(season, bucket);
    }
    bucket.push(row);
    // A single low-confidence episode makes the whole group low-confidence,
    // so low-confidence groups sort after fully high-confidence ones.
    if (row.confidence === "low") entry.minConfidence = "low";
  }

  // Sort season keys within each series (nulls last), episodes by episode_number.
  const seriesGroups: ReviewGroup[] = [];
  for (const [seriesKey, entry] of seriesMap) {
    const seasonKeys = [...entry.seasons.keys()].sort((a, b) => {
      if (a === null) return 1;
      if (b === null) return -1;
      return a - b;
    });
    const seasons = seasonKeys.map((season) => ({
      season,
      episodes: (entry.seasons.get(season) ?? []).sort(
        (a, b) => (a.episode_number ?? 0) - (b.episode_number ?? 0),
      ),
    }));
    const episodeCount = seasons.reduce((sum, s) => sum + s.episodes.length, 0);
    seriesGroups.push({
      kind: "series",
      seriesKey,
      seriesName: entry.seriesName,
      seriesJellyfinId: entry.seriesJellyfinId,
      episodeCount,
      minConfidence: entry.minConfidence,
      originalLanguage: entry.originalLanguage,
      seasons,
    });
  }

  // Interleave movies + series, sort by (minConfidence, name).
  const movieGroups: ReviewGroup[] = movies.map((m) => ({ kind: "movie" as const, item: m }));
  const allGroups = [...movieGroups, ...seriesGroups].sort((a, b) => {
    const confA = a.kind === "movie" ? a.item.confidence : a.minConfidence;
    const confB = b.kind === "movie" ? b.item.confidence : b.minConfidence;
    const rankA = confA === "high" ? 0 : 1;
    const rankB = confB === "high" ? 0 : 1;
    if (rankA !== rankB) return rankA - rankB;
    const nameA = a.kind === "movie" ? a.item.name : a.seriesName;
    const nameB = b.kind === "movie" ? b.item.name : b.seriesName;
    return nameA.localeCompare(nameB);
  });

  // Every pending row is either a movie or an episode of some series.
  const totalItems = rows.length;
  return { groups: allGroups, totalItems };
}
```

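The season-ordering rule from the loop above (numeric seasons ascending, the `null` "no season" bucket last) can be checked in isolation. A standalone sketch for illustration, not code the plan adds:

```typescript
// Season keys sort ascending; null (episodes with no season) always lands last.
function sortSeasonKeys(keys: Array<number | null>): Array<number | null> {
  return [...keys].sort((a, b) => {
    if (a === null) return 1;
    if (b === null) return -1;
    return a - b;
  });
}
```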
- [ ] **Step 2: Add the `/groups` endpoint**

Add before `app.get("/pipeline", …)`:

```ts
app.get("/groups", (c) => {
  const db = getDb();
  const offset = Math.max(0, Number.parseInt(c.req.query("offset") ?? "0", 10) || 0);
  const limit = Math.max(1, Math.min(200, Number.parseInt(c.req.query("limit") ?? "25", 10) || 25));

  const { groups, totalItems } = buildReviewGroups(db);
  const page = groups.slice(offset, offset + limit);

  // Enrich each visible episode/movie with audio streams + transcode reasons
  // (same shape the existing UI expects — reuse the helper already in this file).
  const flatItemsForEnrichment: Array<{ id: number; plan_id?: number; item_id: number; transcode_reasons?: string[]; audio_streams?: PipelineAudioStream[] }> = [];
  for (const g of page) {
    if (g.kind === "movie") flatItemsForEnrichment.push(g.item as never);
    else for (const s of g.seasons) for (const ep of s.episodes) flatItemsForEnrichment.push(ep as never);
  }
  enrichWithStreamsAndReasons(flatItemsForEnrichment);

  return c.json<ReviewGroupsResponse>({
    groups: page,
    totalGroups: groups.length,
    totalItems,
    hasMore: offset + limit < groups.length,
  });
});
```

`PipelineAudioStream` should already be imported; if not, add it to the existing import block.
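The offset/limit clamping and `hasMore` math above distill into a pure helper that is easy to sanity-check. An illustrative sketch; `paginateGroups` is a hypothetical name, not a function the plan adds:

```typescript
// Mirrors the endpoint's paging math: offset clamped to >= 0, limit to 1..200
// (non-numeric input falls back to 0 / 25), hasMore true while the slice
// stops short of the end of the group list.
function paginateGroups<T>(groups: T[], offsetRaw: string, limitRaw: string) {
  const offset = Math.max(0, Number.parseInt(offsetRaw, 10) || 0);
  const limit = Math.max(1, Math.min(200, Number.parseInt(limitRaw, 10) || 25));
  return {
    page: groups.slice(offset, offset + limit),
    hasMore: offset + limit < groups.length,
  };
}
```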

- [ ] **Step 3: Modify `/pipeline` to drop `review`/`reviewTotal`**

In the existing `app.get("/pipeline", …)` handler (around line 270):

- Delete the `review` SELECT (lines ~278–293) and the enrichment of `review` rows.
- Delete the `reviewTotal` count query (lines ~294–296).
- Add in its place: `const reviewItemsTotal = (db.prepare("SELECT COUNT(*) as n FROM review_plans WHERE status = 'pending' AND is_noop = 0").get() as { n: number }).n;`
- In the final `return c.json({...})` (line ~430), replace `review, reviewTotal` with `reviewItemsTotal`.

- [ ] **Step 4: Run tests + lint + tsc**

```
mise exec bun -- bun test
mise exec bun -- bun run lint
mise exec bun -- bunx tsc --noEmit --project tsconfig.server.json
```

All must pass. If tests that hit `/pipeline` fail because they expect `review[]`, update them in the same commit (they need to migrate anyway).

- [ ] **Step 5: Commit**

```bash
git add server/api/review.ts
git commit -m "review: add /groups endpoint with server-side grouping + pagination"
```

---

## Task 2: Server — test `/groups` endpoint

**Files:**
- Create: `server/api/__tests__/review-groups.test.ts`

- [ ] **Step 1: Write the test file**

```ts
import { describe, expect, test } from "bun:test";
import { Hono } from "hono";
import reviewRoutes from "../review";
// Placeholder helpers — replace with the project's real test setup; see the note below.
import { setupTestDb, seedItem, seedPlan } from "./test-helpers";

const app = new Hono();
app.route("/api/review", reviewRoutes);

describe("GET /api/review/groups", () => {
  test("returns complete series even when total items exceed limit", async () => {
    const db = setupTestDb();
    // Seed 1 series with 30 episodes, all pending non-noop
    for (let i = 1; i <= 30; i++) seedItem(db, { type: "Episode", seriesName: "Breaking Bad", seasonNumber: 1, episodeNumber: i });
    for (const row of db.prepare("SELECT id FROM media_items").all() as { id: number }[]) seedPlan(db, row.id, { pending: true, isNoop: false });

    const res = await app.request("/api/review/groups?offset=0&limit=25");
    const body = await res.json();

    expect(body.groups).toHaveLength(1);
    expect(body.groups[0].kind).toBe("series");
    expect(body.groups[0].episodeCount).toBe(30);
    expect(body.groups[0].seasons[0].episodes).toHaveLength(30);
    expect(body.totalItems).toBe(30);
    expect(body.hasMore).toBe(false);
  });

  test("paginates groups with hasMore=true", async () => {
    const db = setupTestDb();
    for (let i = 1; i <= 50; i++) seedItem(db, { type: "Movie", name: `Movie ${String(i).padStart(2, "0")}` });
    for (const row of db.prepare("SELECT id FROM media_items").all() as { id: number }[]) seedPlan(db, row.id, { pending: true, isNoop: false });

    const page1 = await (await app.request("/api/review/groups?offset=0&limit=25")).json();
    const page2 = await (await app.request("/api/review/groups?offset=25&limit=25")).json();

    expect(page1.groups).toHaveLength(25);
    expect(page1.hasMore).toBe(true);
    expect(page2.groups).toHaveLength(25);
    expect(page2.hasMore).toBe(false);
    const ids1 = page1.groups.map((g: { item: { item_id: number } }) => g.item.item_id);
    const ids2 = page2.groups.map((g: { item: { item_id: number } }) => g.item.item_id);
    expect(ids1.filter((id: number) => ids2.includes(id))).toHaveLength(0);
  });

  test("buckets episodes by season, nulls last", async () => {
    const db = setupTestDb();
    for (let ep = 1; ep <= 3; ep++) seedItem(db, { type: "Episode", seriesName: "Lost", seasonNumber: 1, episodeNumber: ep });
    for (let ep = 1; ep <= 2; ep++) seedItem(db, { type: "Episode", seriesName: "Lost", seasonNumber: 2, episodeNumber: ep });
    seedItem(db, { type: "Episode", seriesName: "Lost", seasonNumber: null, episodeNumber: null });
    for (const row of db.prepare("SELECT id FROM media_items").all() as { id: number }[]) seedPlan(db, row.id, { pending: true, isNoop: false });

    const body = await (await app.request("/api/review/groups?offset=0&limit=25")).json();
    const lost = body.groups[0];
    expect(lost.kind).toBe("series");
    expect(lost.seasons.map((s: { season: number | null }) => s.season)).toEqual([1, 2, null]);
  });
});
```

Important: this test file needs the project's actual test-helpers pattern. Before writing, look at `server/services/__tests__/webhook.test.ts` (the 60-line one that's still in the repo after the verified-flag block was removed) and **copy its setup style** — including how it creates a test DB, how it seeds media_items and review_plans, and how it invokes the Hono app. Replace the placeholder `setupTestDb`, `seedItem`, `seedPlan` calls with whatever the real helpers are.

- [ ] **Step 2: Run the tests**

```
mise exec bun -- bun test server/api/__tests__/review-groups.test.ts
```

Expected: 3 passes.

- [ ] **Step 3: Commit**

```bash
git add server/api/__tests__/review-groups.test.ts
git commit -m "test: /groups endpoint — series completeness, pagination, season buckets"
```

---

## Task 3: Client types + PipelinePage

**Files:**
- Modify: `src/shared/lib/types.ts`
- Modify: `src/features/pipeline/PipelinePage.tsx`

- [ ] **Step 1: Update shared types**

In `src/shared/lib/types.ts`, replace the `PipelineData` interface's `review` and `reviewTotal` fields with `reviewItemsTotal: number`. Add types for the new groups response:

```ts
export type ReviewGroup =
  | { kind: "movie"; item: PipelineReviewItem }
  | {
      kind: "series";
      seriesKey: string;
      seriesName: string;
      seriesJellyfinId: string | null;
      episodeCount: number;
      minConfidence: "high" | "low";
      originalLanguage: string | null;
      seasons: Array<{ season: number | null; episodes: PipelineReviewItem[] }>;
    };

export interface ReviewGroupsResponse {
  groups: ReviewGroup[];
  totalGroups: number;
  totalItems: number;
  hasMore: boolean;
}
```

The `PipelineData` interface becomes:
```ts
export interface PipelineData {
  reviewItemsTotal: number;
  queued: PipelineJobItem[];
  processing: PipelineJobItem[];
  done: PipelineJobItem[];
  doneCount: number;
  jellyfinUrl: string;
}
```

- [ ] **Step 2: Update PipelinePage**

Change `PipelinePage.tsx`:

- Add state for the initial groups page: `const [initialGroups, setInitialGroups] = useState<ReviewGroupsResponse | null>(null);`
- In `load()`, fetch both in parallel:

```ts
const [pipelineRes, groupsRes] = await Promise.all([
  api.get<PipelineData>("/api/review/pipeline"),
  api.get<ReviewGroupsResponse>("/api/review/groups?offset=0&limit=25"),
]);
setData(pipelineRes);
setInitialGroups(groupsRes);
```

- Wait for both before rendering (loading gate: `if (loading || !data || !initialGroups) return <Loading />`).
- Pass to ReviewColumn: `<ReviewColumn initialResponse={initialGroups} totalItems={data.reviewItemsTotal} jellyfinUrl={data.jellyfinUrl} onMutate={load} />` — drop `items` and `total` props.

- [ ] **Step 3: Tsc + lint**

```
mise exec bun -- bunx tsc --noEmit
mise exec bun -- bun run lint
```

Expected: errors in `ReviewColumn.tsx` because its props type hasn't been updated yet — that's fine, Task 4 fixes it. For this step, only verify that types.ts and PipelinePage.tsx themselves compile internally. If the build breaks because of ReviewColumn, commit these two files anyway and proceed to Task 4 immediately.

- [ ] **Step 4: Commit**

```bash
git add src/shared/lib/types.ts src/features/pipeline/PipelinePage.tsx
git commit -m "pipeline: fetch review groups endpoint in parallel with pipeline"
```

---

## Task 4: Client — ReviewColumn with infinite scroll

**Files:**
- Modify: `src/features/pipeline/ReviewColumn.tsx`

- [ ] **Step 1: Rewrite ReviewColumn**

Replace the file contents with:

```tsx
import { useCallback, useEffect, useRef, useState } from "react";
import { api } from "~/shared/lib/api";
import type { ReviewGroup, ReviewGroupsResponse } from "~/shared/lib/types";
import { ColumnShell } from "./ColumnShell";
import { PipelineCard } from "./PipelineCard";
import { SeriesCard } from "./SeriesCard";

const PAGE_SIZE = 25;

interface ReviewColumnProps {
  initialResponse: ReviewGroupsResponse;
  totalItems: number;
  jellyfinUrl: string;
  onMutate: () => void;
}

export function ReviewColumn({ initialResponse, totalItems, jellyfinUrl, onMutate }: ReviewColumnProps) {
  const [groups, setGroups] = useState<ReviewGroup[]>(initialResponse.groups);
  const [hasMore, setHasMore] = useState(initialResponse.hasMore);
  const [loadingMore, setLoadingMore] = useState(false);
  const sentinelRef = useRef<HTMLDivElement | null>(null);

  // Reset when parent passes a new initial page (onMutate refetch)
  useEffect(() => {
    setGroups(initialResponse.groups);
    setHasMore(initialResponse.hasMore);
  }, [initialResponse]);

  const loadMore = useCallback(async () => {
    if (loadingMore || !hasMore) return;
    setLoadingMore(true);
    try {
      const res = await api.get<ReviewGroupsResponse>(`/api/review/groups?offset=${groups.length}&limit=${PAGE_SIZE}`);
      setGroups((prev) => [...prev, ...res.groups]);
      setHasMore(res.hasMore);
    } finally {
      setLoadingMore(false);
    }
  }, [groups.length, hasMore, loadingMore]);

  useEffect(() => {
    if (!hasMore || !sentinelRef.current) return;
    const observer = new IntersectionObserver(
      (entries) => {
        if (entries[0]?.isIntersecting) loadMore();
      },
      { rootMargin: "200px" },
    );
    observer.observe(sentinelRef.current);
    return () => observer.disconnect();
  }, [hasMore, loadMore]);

  const skipAll = async () => {
    if (!confirm(`Skip all ${totalItems} pending items? They won't be processed unless you unskip them.`)) return;
    await api.post("/api/review/skip-all");
    onMutate();
  };

  const autoApprove = async () => {
    const res = await api.post<{ ok: boolean; count: number }>("/api/review/auto-approve");
    onMutate();
    if (res.count === 0) alert("No high-confidence items to auto-approve.");
  };

  const approveItem = async (itemId: number) => {
    await api.post(`/api/review/${itemId}/approve`);
    onMutate();
  };
  const skipItem = async (itemId: number) => {
    await api.post(`/api/review/${itemId}/skip`);
    onMutate();
  };
  const approveBatch = async (itemIds: number[]) => {
    if (itemIds.length === 0) return;
    await api.post<{ ok: boolean; count: number }>("/api/review/approve-batch", { itemIds });
    onMutate();
  };

  // Compute ids per visible group for "Approve above"
  const idsByGroup: number[][] = groups.map((g) =>
    g.kind === "movie" ? [g.item.item_id] : g.seasons.flatMap((s) => s.episodes.map((ep) => ep.item_id)),
  );
  const priorIds = (index: number): number[] => idsByGroup.slice(0, index).flat();

  const actions =
    totalItems > 0
      ? [
          { label: "Auto Review", onClick: autoApprove, primary: true },
          { label: "Skip all", onClick: skipAll },
        ]
      : undefined;

  return (
    <ColumnShell title="Review" count={totalItems} actions={actions}>
      <div className="space-y-2">
        {groups.map((group, index) => {
          const prior = index > 0 ? priorIds(index) : null;
          const onApproveUpToHere = prior && prior.length > 0 ? () => approveBatch(prior) : undefined;
          if (group.kind === "movie") {
            return (
              <PipelineCard
                key={group.item.id}
                item={group.item}
                jellyfinUrl={jellyfinUrl}
                onToggleStream={async (streamId, action) => {
                  await api.patch(`/api/review/${group.item.item_id}/stream/${streamId}`, { action });
                  onMutate();
                }}
                onApprove={() => approveItem(group.item.item_id)}
                onSkip={() => skipItem(group.item.item_id)}
                onApproveUpToHere={onApproveUpToHere}
              />
            );
          }
          return (
            <SeriesCard
              key={group.seriesKey}
              seriesKey={group.seriesKey}
              seriesName={group.seriesName}
              jellyfinUrl={jellyfinUrl}
              seriesJellyfinId={group.seriesJellyfinId}
              seasons={group.seasons}
              episodeCount={group.episodeCount}
              originalLanguage={group.originalLanguage}
              onMutate={onMutate}
              onApproveUpToHere={onApproveUpToHere}
            />
          );
        })}
        {groups.length === 0 && <p className="text-sm text-gray-400 text-center py-8">No items to review</p>}
        {hasMore && (
          <div ref={sentinelRef} className="py-4 text-center text-xs text-gray-400">
            {loadingMore ? "Loading more…" : ""}
          </div>
        )}
      </div>
    </ColumnShell>
  );
}
```

- [ ] **Step 2: Tsc + lint**

```
mise exec bun -- bunx tsc --noEmit
mise exec bun -- bun run lint
```

Expected: the call site in ReviewColumn passes `seasons`, `episodeCount`, `originalLanguage` props to SeriesCard — this will fail until Task 5 updates SeriesCard. Same handling as Task 3 step 3: commit and proceed.

- [ ] **Step 3: Commit**

```bash
git add src/features/pipeline/ReviewColumn.tsx
git commit -m "review column: infinite scroll with IntersectionObserver sentinel"
```

---

## Task 5: Client — SeriesCard season nesting
|
||||
|
||||
**Files:**
|
||||
- Modify: `src/features/pipeline/SeriesCard.tsx`
|
||||
|
||||
- [ ] **Step 1: Rewrite SeriesCard**
|
||||
|
||||
Replace the file contents with:
|
||||
|
||||
```tsx
|
||||
import { useState } from "react";
|
||||
import { api } from "~/shared/lib/api";
|
||||
import { LANG_NAMES } from "~/shared/lib/lang";
|
||||
import type { PipelineReviewItem } from "~/shared/lib/types";
|
||||
import { PipelineCard } from "./PipelineCard";
|
||||
|
||||
interface SeriesCardProps {
|
||||
seriesKey: string;
|
||||
seriesName: string;
|
||||
jellyfinUrl: string;
|
||||
seriesJellyfinId: string | null;
|
||||
seasons: Array<{ season: number | null; episodes: PipelineReviewItem[] }>;
|
||||
episodeCount: number;
|
||||
originalLanguage: string | null;
|
||||
onMutate: () => void;
|
||||
onApproveUpToHere?: () => void;
|
||||
}
|
||||
|
||||
export function SeriesCard({
|
||||
seriesKey,
|
||||
seriesName,
|
||||
jellyfinUrl,
|
||||
seriesJellyfinId,
|
||||
seasons,
|
||||
episodeCount,
|
||||
originalLanguage,
|
||||
onMutate,
|
||||
onApproveUpToHere,
|
||||
}: SeriesCardProps) {
|
||||
const [expanded, setExpanded] = useState(false);
|
||||
|
||||
const flatEpisodes = seasons.flatMap((s) => s.episodes);
|
||||
const highCount = flatEpisodes.filter((e) => e.confidence === "high").length;
|
||||
const lowCount = flatEpisodes.filter((e) => e.confidence === "low").length;
|
||||
const multipleSeasons = seasons.length > 1;
|
||||
|
||||
const setSeriesLanguage = async (lang: string) => {
|
||||
await api.patch(`/api/review/series/${encodeURIComponent(seriesKey)}/language`, { language: lang });
|
||||
onMutate();
|
||||
};
|
||||
|
||||
const approveSeries = async () => {
|
||||
await api.post(`/api/review/series/${encodeURIComponent(seriesKey)}/approve-all`);
|
||||
onMutate();
|
||||
};
|
||||
|
||||
const approveSeason = async (season: number | null) => {
|
||||
if (season == null) return;
|
||||
await api.post(`/api/review/season/${encodeURIComponent(seriesKey)}/${season}/approve-all`);
|
||||
onMutate();
|
||||
};
|
||||
|
||||
const jellyfinLink =
|
||||
jellyfinUrl && seriesJellyfinId ? `${jellyfinUrl}/web/index.html#!/details?id=${seriesJellyfinId}` : null;
|
||||
|
||||
return (
|
||||
<div className="group/series rounded-lg border bg-white overflow-hidden">
|
||||
{/* Title row */}
|
||||
<div
|
||||
className="flex items-center gap-2 px-3 pt-3 pb-1 cursor-pointer hover:bg-gray-50 rounded-t-lg"
|
||||
onClick={() => setExpanded(!expanded)}
|
||||
>
|
||||
<span className="text-xs text-gray-400 shrink-0">{expanded ? "▼" : "▶"}</span>
|
||||
{jellyfinLink ? (
|
||||
<a
|
||||
href={jellyfinLink}
|
||||
target="_blank"
|
||||
rel="noopener noreferrer"
|
||||
className="text-sm font-medium truncate hover:text-blue-600 hover:underline"
|
||||
onClick={(e) => e.stopPropagation()}
|
||||
>
|
||||
{seriesName}
|
||||
</a>
|
||||
) : (
|
||||
<p className="text-sm font-medium truncate">{seriesName}</p>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Controls row */}
|
||||
<div className="flex items-center gap-2 px-3 pb-3 pt-1">
|
||||
<span className="text-xs text-gray-500 shrink-0">{episodeCount} eps</span>
|
||||
{multipleSeasons && <span className="text-xs text-gray-500 shrink-0">· {seasons.length} seasons</span>}
|
||||
{highCount > 0 && <span className="text-xs text-green-600 shrink-0">{highCount} ready</span>}
|
||||
{lowCount > 0 && <span className="text-xs text-amber-600 shrink-0">{lowCount} review</span>}
|
||||
<div className="flex-1" />
|
||||
<select
|
||||
className="h-6 text-xs border border-gray-300 rounded px-1 bg-white shrink-0"
|
||||
value={originalLanguage ?? ""}
|
||||
onChange={(e) => {
|
||||
e.stopPropagation();
|
||||
setSeriesLanguage(e.target.value);
|
||||
}}
|
||||
>
|
||||
<option value="">unknown</option>
|
||||
{Object.entries(LANG_NAMES).map(([code, name]) => (
|
||||
<option key={code} value={code}>
|
||||
{name}
|
||||
</option>
|
||||
))}
|
||||
</select>
|
||||
{onApproveUpToHere && (
|
||||
<button
|
||||
onClick={(e) => {
|
||||
e.stopPropagation();
|
||||
onApproveUpToHere();
|
||||
}}
|
||||
title="Approve every card listed above this one"
|
||||
className="text-xs px-2 py-1 rounded border border-blue-600 text-blue-700 bg-white hover:bg-blue-50 cursor-pointer whitespace-nowrap shrink-0 opacity-0 group-hover/series:opacity-100 transition-opacity"
|
||||
>
|
||||
↑ Approve above
|
||||
</button>
|
||||
)}
|
||||
<button
|
||||
onClick={(e) => {
|
||||
e.stopPropagation();
|
||||
approveSeries();
|
||||
}}
|
||||
className="text-xs px-2 py-1 rounded bg-blue-600 text-white hover:bg-blue-700 cursor-pointer whitespace-nowrap shrink-0"
|
||||
>
|
||||
Approve series
|
||||
</button>
|
||||
</div>
|
||||
|
||||
{expanded && (
|
||||
<div className="border-t">
|
||||
{multipleSeasons
|
||||
? seasons.map((s) => (
|
||||
<SeasonGroup
|
||||
key={s.season ?? "unknown"}
|
||||
season={s.season}
|
||||
episodes={s.episodes}
|
||||
jellyfinUrl={jellyfinUrl}
|
||||
onApproveSeason={() => approveSeason(s.season)}
|
||||
onMutate={onMutate}
|
||||
/>
|
||||
))
|
||||
: flatEpisodes.map((ep) => (
|
||||
<EpisodeRow key={ep.id} ep={ep} jellyfinUrl={jellyfinUrl} onMutate={onMutate} />
|
||||
))}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
function SeasonGroup({
|
||||
season,
|
||||
episodes,
|
||||
jellyfinUrl,
|
||||
  onApproveSeason,
  onMutate,
}: {
  season: number | null;
  episodes: PipelineReviewItem[];
  jellyfinUrl: string;
  onApproveSeason: () => void;
  onMutate: () => void;
}) {
  const [open, setOpen] = useState(false);
  const highCount = episodes.filter((e) => e.confidence === "high").length;
  const lowCount = episodes.filter((e) => e.confidence === "low").length;
  const label = season == null ? "No season" : `Season ${String(season).padStart(2, "0")}`;

  return (
    <div className="border-t first:border-t-0">
      <div
        className="flex items-center gap-2 px-3 py-2 cursor-pointer hover:bg-gray-50"
        onClick={() => setOpen(!open)}
      >
        <span className="text-xs text-gray-400 shrink-0">{open ? "▼" : "▶"}</span>
        <span className="text-xs font-medium shrink-0">{label}</span>
        <span className="text-xs text-gray-500 shrink-0">· {episodes.length} eps</span>
        {highCount > 0 && <span className="text-xs text-green-600 shrink-0">{highCount} ready</span>}
        {lowCount > 0 && <span className="text-xs text-amber-600 shrink-0">{lowCount} review</span>}
        <div className="flex-1" />
        {season != null && (
          <button
            onClick={(e) => {
              e.stopPropagation();
              onApproveSeason();
            }}
            className="text-xs px-2 py-1 rounded border border-blue-600 text-blue-700 bg-white hover:bg-blue-50 cursor-pointer whitespace-nowrap shrink-0"
          >
            Approve season
          </button>
        )}
      </div>
      {open && (
        <div className="px-3 pb-3 space-y-2 pt-2">
          {episodes.map((ep) => (
            <EpisodeRow key={ep.id} ep={ep} jellyfinUrl={jellyfinUrl} onMutate={onMutate} />
          ))}
        </div>
      )}
    </div>
  );
}

function EpisodeRow({ ep, jellyfinUrl, onMutate }: { ep: PipelineReviewItem; jellyfinUrl: string; onMutate: () => void }) {
  return (
    <div className="px-3 py-1">
      <PipelineCard
        item={ep}
        jellyfinUrl={jellyfinUrl}
        onToggleStream={async (streamId, action) => {
          await api.patch(`/api/review/${ep.item_id}/stream/${streamId}`, { action });
          onMutate();
        }}
        onApprove={async () => {
          await api.post(`/api/review/${ep.item_id}/approve`);
          onMutate();
        }}
        onSkip={async () => {
          await api.post(`/api/review/${ep.item_id}/skip`);
          onMutate();
        }}
      />
    </div>
  );
}
```
(The `EpisodeRow` wrapper keeps the padding consistent whether episodes render directly under the series or under a season group.)

- [ ] **Step 2: Lint + tsc + test + build**

```
mise exec bun -- bun run lint
mise exec bun -- bunx tsc --noEmit
mise exec bun -- bun test
mise exec bun -- bun run build
```

All must pass now that the whole pipeline (server → types → PipelinePage → ReviewColumn → SeriesCard) is consistent.

- [ ] **Step 3: Manual smoke test**

```
mise exec bun -- bun run dev
```

Navigate to the Pipeline page:
- Confirm no "Showing first 500 of N" banner.
- Scroll the Review column to the bottom; new groups auto-load.
- Find a series with pending work in >1 season; expand it; confirm nested seasons with a working `Approve season` button.
- Find a series with pending work in a single season; expand it; confirm a flat episode list (no season nesting).
- Click `Approve series` on a series with many pending episodes; confirm the whole series vanishes from the column.

Kill the dev server.

- [ ] **Step 4: Commit**

```bash
git add src/features/pipeline/SeriesCard.tsx
git commit -m "series card: nest seasons when >1 pending, add Approve season button"
```

---

## Task 6: Version bump + final push

- [ ] **Step 1: Bump CalVer**

In `package.json`, set the version to today's next free dot-suffix (today is 2026-04-15; prior releases are `.1` and `.2`, so use `.3` unless already taken).

- [ ] **Step 2: Final checks**

```
mise exec bun -- bun run lint
mise exec bun -- bunx tsc --noEmit
mise exec bun -- bunx tsc --noEmit --project tsconfig.server.json
mise exec bun -- bun test
mise exec bun -- bun run build
```

- [ ] **Step 3: Commit + push**

```bash
git add package.json
git commit -m "v2026.04.15.3 — review column lazy-load + season grouping"
git push gitea main
```

---

## Guided Gates (user-verified)

- **GG-1:** No "Showing first 500 of N" banner.
- **GG-2:** A series with episodes previously split across the cap now shows the correct episode count.
- **GG-3:** A series with >1 pending season expands into nested season groups, each with a working `Approve season` button.
- **GG-4:** A series with 1 pending season expands flat (no extra nesting).
- **GG-5:** Scrolling to the bottom of Review auto-loads the next page; no scroll = no extra fetch.
47
docs/superpowers/plans/2026-04-15-scan-page-rework.md
Normal file
@@ -0,0 +1,47 @@

# Scan Page Rework Implementation Plan

> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** Rework the Scan page to prioritize progress + fresh ingest visibility, and add a scalable filterable/lazy-loaded library table.

**Architecture:** Keep `/api/scan` lightweight for status/progress and compact recent ingest rows. Add `/api/scan/items` for paginated/filterable DB browsing. Update `ScanPage` to render: scan card header count, a compact 5-row recent ingest table, then a filterable lazy-loaded library table.

**Tech Stack:** Bun + Hono, React 19 + TanStack Router, bun:test, Biome.

---

### Task 1: Backend scan payload + items endpoint (TDD)

**Files:**
- Modify: `server/api/__tests__/scan.test.ts`
- Modify: `server/db/schema.ts`
- Modify: `server/db/index.ts`
- Modify: `server/services/rescan.ts`
- Modify: `server/api/scan.ts`

- [ ] Add failing tests for scan item query parsing/normalization and SQL filter behavior helpers.
- [ ] Run targeted tests to verify failure.
- [ ] Add `media_items.ingest_source` schema + migration, set value on upsert (`scan`/`webhook`).
- [ ] Extend `GET /api/scan` recent item shape with timestamp + ingest source and clamp to 5 rows.
- [ ] Add `GET /api/scan/items` with filters (`q,status,type,source`) + pagination (`offset,limit`), returning `{ rows,total,hasMore }`.
- [ ] Run targeted and full backend tests.
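
The query parsing/normalization step above can be sketched as a small pure helper. This is a hypothetical shape for `parseScanLimit`, assuming the rules the tests exercise (positive integers only, nullish/empty meaning "no cap", numeric strings coerced for the env-var path):

```typescript
// Hypothetical sketch — the real helper lives in server/api/scan.ts and its
// exact result shape may differ.
type ParsedLimit = { ok: true; value: number | null } | { ok: false };

function parseScanLimit(raw: unknown): ParsedLimit {
  // null / undefined / "" mean "no cap at all".
  if (raw == null || raw === "") return { ok: true, value: null };
  // Coerce numeric strings (the env-var path); Number("12abc") is NaN.
  const n = typeof raw === "string" ? Number(raw) : raw;
  // Reject NaN, Infinity, floats, zero, and negatives — each would silently
  // break a "processed >= limit" cap check downstream.
  if (typeof n !== "number" || !Number.isInteger(n) || n <= 0) return { ok: false };
  return { ok: true, value: n };
}
```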
### Task 2: Scan page UI rework + lazy table

**Files:**
- Modify: `src/features/scan/ScanPage.tsx`

- [ ] Refactor the scan box header to show the scanned count in the top-right.
- [ ] Replace the large recent-items table with a compact 5-row recent ingest list directly under the progress bar.
- [ ] Add filter controls for the library table (`q,status,type,source`) with default "All".
- [ ] Add a lazy loading flow (initial fetch + load more) against `/api/scan/items`.
- [ ] Render the new table with useful file metadata columns and consistent truncation/tooltips.

### Task 3: Verification

**Files:**
- Modify: none

- [ ] Run `bun test`.
- [ ] Run `bun run lint` and format if needed.
- [ ] Confirm no regressions in scan start/stop/progress behavior.
@@ -0,0 +1,114 @@

# Drop verify/checkmarks, merge jobs view into item details

Date: 2026-04-15

## Summary

Remove the post-job Jellyfin verification path and its associated `verified` flag entirely. Delete the standalone `/execute` jobs page. Surface per-item job info (status, command, log, run/cancel actions) on the item details page instead. Batch queue controls (Run all / Clear) move into the Pipeline column headers.

Rescan becomes the sole source of truth for "is this file still done?" — if a file drifts off-noop, the next scan flips its plan back to pending and the card reappears in Review.

## Motivation

The verify feature tried to promote done cards from ✓ to ✓✓ after ffprobe/Jellyfin cross-checked the on-disk file. In practice the Jellyfin refresh path is fragile (it times out silently), the ✓/✓✓ distinction adds UI noise without user value, and rescan already catches drift. The separate Jobs page duplicates info that belongs on the item details page and forces users to jump between views to answer "what happened to this file?".

## Backend changes

### Remove verification path
- Delete `handOffToJellyfin()` in `server/api/execute.ts` (≈lines 38–98) and both callers at `:492` and `:609`. Post-job handling is now just the existing `jobs.status` update.
- Delete `emitPlanUpdate()` and the `plan_update` SSE event emission.
- Delete `POST /api/execute/verify-unverified` (≈lines 357–389).

### Drop `verified` column
- Add an idempotent migration in `server/db/index.ts` following the existing try/catch `alter()` pattern:

```ts
alter("ALTER TABLE review_plans DROP COLUMN verified");
```

  Supported on Bun's bundled SQLite (≥3.35).
- Remove `verified` from `server/db/schema.ts:77` in the `review_plans` CREATE TABLE.
- In `server/services/rescan.ts`, remove `verified` from the INSERT column list and the `verified = CASE ...` branch in the ON CONFLICT DO UPDATE clause.
- In `server/api/review.ts`:
  - Remove `rp.verified` from the pipeline SELECT (≈line 330).
  - Remove `verified = 0` from the unapprove UPDATE (≈line 773).
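
The try/catch `alter()` pattern referenced above can be sketched like this. A minimal sketch under the assumption that the helper simply swallows the error a second run produces; the real helper in `server/db/index.ts` may differ:

```typescript
// Hypothetical sketch of the idempotent-migration pattern: run the ALTER and
// swallow the error a repeat run throws (e.g. "no such column: verified").
// `run` stands in for db.run from bun:sqlite so the pattern is testable.
function alter(run: (sql: string) => void, sql: string): boolean {
  try {
    run(sql);
    return true; // migration applied on this run
  } catch {
    return false; // already applied (or column never existed) — safe to ignore
  }
}
```

Because failures are swallowed, the migration can run unconditionally on every boot.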

### Remove jobs-list endpoint
- Delete `GET /api/execute` (the filtered list used only by the Execute page).
- Keep: `/start`, `/clear`, `/clear-completed`, `/job/:id/run`, `/job/:id/cancel`, `/stop`, `/events`, and the SSE events `job_update`, `job_progress`, `queue_status`.

### Enrich item details endpoint
- Extend `GET /api/review/:id` to include the latest job row for this item (if any):

```ts
job: {
  id: number;
  status: Job["status"];
  job_type: "copy" | "transcode";
  command: string | null;
  output: string | null;
  exit_code: number | null;
  started_at: string | null;
  completed_at: string | null;
} | null
```

- "Latest" = most recent by `jobs.created_at DESC LIMIT 1` for the item. A single additional prepared statement.
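
That prepared statement might look like the following. The column list is taken from the response shape above; the exact table layout is an assumption, so treat this as a sketch rather than the final query:

```typescript
// Sketch of the "latest job for this item" statement, prepared once and
// reused per request. Column names follow the job shape described above.
const LATEST_JOB_SQL = `
  SELECT id, status, job_type, command, output, exit_code, started_at, completed_at
  FROM jobs
  WHERE item_id = ?
  ORDER BY created_at DESC
  LIMIT 1
`;
```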

## Frontend changes

### Deletions
- `src/features/execute/ExecutePage.tsx`
- `src/routes/execute.tsx`
- The nav link to `/execute` in `src/routes/__root.tsx`
- The `plan_update` SSE listener in `src/features/pipeline/PipelinePage.tsx:70-72`
- The Verify button, `verifyUnverified()`, `unverifiedCount`, and the `✓`/`✓✓` glyph span in `src/features/pipeline/DoneColumn.tsx`
- The `verified` field on `PipelineJobItem` in `src/shared/lib/types.ts:161`

### DoneColumn simplification
Each card keeps its title link, `← Back to review` hover button, and status `<Badge>` (`done` or `error`). The mark glyph and its `title` attribute go away. Column actions stay: `Clear` when items exist.

### Pipeline column headers (batch controls)
`ColumnShell` already accepts an `actions: ColumnAction[]` array. Move the existing batch controls off the `/execute` page into the headers:
- **Queued column** — `Run all` (primary, when at least one job is pending) + `Clear queue` (when items exist)
- **Done column** — `Clear` (existing)
- **Processing column** — no batch controls

Wire these to the existing endpoints: `/api/execute/start`, `/api/execute/clear`, `/api/execute/clear-completed`.
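
The Queued column's actions can be derived from queue state along these lines. The `ColumnAction` field names here are an assumption; the real type lives next to `ColumnShell`:

```typescript
// Hypothetical ColumnAction shape — label/onClick/primary fields assumed.
type ColumnAction = { label: string; onClick: () => void; primary?: boolean };

// Derive the Queued column's header actions. `runAll` and `clearQueue` would
// POST to /api/execute/start and /api/execute/clear respectively.
function queuedColumnActions(
  pendingCount: number,
  totalCount: number,
  runAll: () => void,
  clearQueue: () => void,
): ColumnAction[] {
  const actions: ColumnAction[] = [];
  if (pendingCount > 0) actions.push({ label: "Run all", onClick: runAll, primary: true });
  if (totalCount > 0) actions.push({ label: "Clear queue", onClick: clearQueue });
  return actions;
}
```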

### AudioDetailPage job section
New section, rendered only when `data.job` is non-null. Placement: between the FFmpeg command textarea and the Approve/Skip button row.

Contents:
- Header row: status `<Badge>`, job-type badge (`Audio Remux`/`Audio Transcode`), started/completed timestamps, exit-code badge (only when non-zero)
- `Cmd` toggle button — reveals the job's recorded command (the `jobs.command` column)
- `Log` toggle button — reveals `jobs.output`; auto-expanded when `status === "error"`
- Action buttons based on `job.status`:
  - `pending` → `▶ Run` (calls `POST /api/execute/job/:id/run`), `✕ Cancel` (calls `POST /api/execute/job/:id/cancel`)
  - `running` → `✕ Stop` (calls `POST /api/execute/stop`)
  - `done` / `error` → no actions
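
The status-to-actions mapping in the last bullet reduces to one small helper. Labels and endpoint paths come from the bullets above; the function itself is illustrative:

```typescript
type JobStatus = "pending" | "running" | "done" | "error";

// Map a job's status to the action buttons the section renders. Per-job
// routes interpolate the job id; /stop is queue-global.
function jobActions(status: JobStatus, jobId: number): Array<{ label: string; endpoint: string }> {
  switch (status) {
    case "pending":
      return [
        { label: "▶ Run", endpoint: `/api/execute/job/${jobId}/run` },
        { label: "✕ Cancel", endpoint: `/api/execute/job/${jobId}/cancel` },
      ];
    case "running":
      return [{ label: "✕ Stop", endpoint: "/api/execute/stop" }];
    default:
      return []; // done / error → no actions
  }
}
```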

### Live updates on details page
The details page gets its own scoped `EventSource` subscription to `/api/execute/events`, filtering for events where `id === data.job?.id`:
- `job_update` → merge into local state; re-fetch details on terminal status (`done`/`error`) to pick up the refreshed `jobs` row
- `job_progress` → update a progress bar for the active job
- Close the connection on unmount
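
The rules above reduce to a pure decision function. The SSE payload shape here is an assumption based on the `job_update`/`job_progress` events; the real listener would wrap this in a `useEffect` that opens and closes the `EventSource`:

```typescript
// Decide what the details page does with an incoming SSE message.
type SseEvent = { type: "job_update" | "job_progress"; id: number; status?: string };

function handleJobEvent(ev: SseEvent, currentJobId: number | null): "ignore" | "progress" | "merge" | "refetch" {
  if (currentJobId == null || ev.id !== currentJobId) return "ignore"; // other items' jobs
  if (ev.type === "job_progress") return "progress"; // drive the progress bar
  // job_update: on terminal states, re-fetch the details payload so the
  // refreshed jobs row (exit code, completed_at) is picked up.
  return ev.status === "done" || ev.status === "error" ? "refetch" : "merge";
}
```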

## Data flow after the change

1. User approves a plan in Review → `plan.status = approved`
2. User clicks `Run all` in the Queued column header → queued jobs start
3. The Processing column shows the running job with live progress (unchanged)
4. Job finishes → `jobs.status = done`, `review_plans.status = done`. No Jellyfin refresh, no verified flip.
5. The card lands in the Done column with a `done` badge. No ✓/✓✓ glyph.
6. The next scan (automatic or manual) re-analyzes the file. If still `is_noop = 1`, the plan stays `done`; if not, the plan returns to `pending` and the card reappears in Review.

## Testing

- Delete `server/services/__tests__/webhook.test.ts:186-240` — the "webhook_verified flag" describe block. The remaining webhook tests (status transitions, upserts) stay.
- No new tests required: this spec removes features; it does not add behavior.

## Guided Gates

- **GG-1:** After deploy, confirm the Done column shows cards with only a `done`/`error` badge — no ✓ or ✓✓ glyph.
- **GG-2:** Click an item in Done → the details page shows the job section below the FFmpeg command box, with `Cmd` and `Log` toggles.
- **GG-3:** Click an item in Queued → the details page shows a pending job with working `▶ Run` and `✕ Cancel` buttons; running the job updates the badge live.
- **GG-4:** `/execute` in the browser returns a 404 (the route is gone).
- **GG-5:** `Run all` and `Clear queue` buttons appear in the Queued column header; `Clear` stays in the Done column header.
- **GG-6:** `PRAGMA table_info(review_plans);` in the SQLite DB no longer lists a `verified` column.
111
docs/superpowers/specs/2026-04-15-review-lazy-load-design.md
Normal file
@@ -0,0 +1,111 @@

# Review column lazy-load + season grouping

Date: 2026-04-15

## Summary

Replace the Review column's 500-item hard cap with server-side group-paginated lazy loading. Series are always returned complete (every pending non-noop episode, grouped by season), eliminating the "2 eps" mirage caused by groups getting split across the cap. When a series has pending work in more than one season, the UI nests seasons as collapsible sub-groups, each with its own "Approve season" button.

## Motivation

`server/api/review.ts:277` caps the pipeline's review list at 500 items. ReviewColumn groups client-side, so any series whose episodes spill beyond the cap shows a wrong episode count and a partial episode list. The banner "Showing first 500 of N" is present but misleading — the *groups* don't survive the cut, not just the tail.

The existing "Approve all" button on a series card already calls `/series/:seriesKey/approve-all`, which operates on the DB directly and does approve every pending episode — so the functionality works; only the display is wrong. Still, partial groups are confusing and the 500 cap forces users to approve in waves.

## Server changes

### New endpoint `GET /api/review/groups?offset=0&limit=25`

Response:
```ts
{
  groups: ReviewGroup[];
  totalGroups: number;
  totalItems: number;
  hasMore: boolean;
}

type ReviewGroup =
  | { kind: "movie"; item: PipelineReviewItem }
  | {
      kind: "series";
      seriesKey: string;
      seriesName: string;
      seriesJellyfinId: string | null;
      episodeCount: number;
      minConfidence: "high" | "low";
      originalLanguage: string | null;
      seasons: Array<{ season: number | null; episodes: PipelineReviewItem[] }>;
    };
```

Ordering:
- Groups ordered by (min confidence across the group ASC — `high` < `low`), then (series name or movie name ASC)
- Within a series, seasons ordered by `season_number` ASC (`null` last)
- Within a season, episodes ordered by `episode_number` ASC

Implementation outline:
1. Query all pending non-noop plans joined to `media_items` (the existing `review` query minus the LIMIT).
2. Walk once in sort order, producing groups: a Movie becomes a one-shot `{ kind: "movie" }`; consecutive Episodes sharing `series_jellyfin_id` (or the `series_name` fallback) accumulate into a `{ kind: "series" }` with `seasons` bucketed by `season_number`.
3. Apply `.slice(offset, offset + limit)` over the full group list, then enrich per-episode audio streams + transcode reasons for the episodes that survive (reuse the existing `enrichWithStreamsAndReasons`).
4. `totalGroups` = the full group count before slicing. `totalItems` = the sum of episode counts + movie count (unchanged from today's `reviewTotal`). `hasMore` = `offset + limit < totalGroups`.
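
Step 2's single pass can be sketched like this. The row and group shapes are simplified for illustration; the real code carries the full `PipelineReviewItem`:

```typescript
// One pass over rows already sorted by (confidence, name, season, episode).
// Consecutive episodes sharing a series key fold into one series group with
// per-season buckets; movies become one-shot groups.
type Row = { type: "Movie" | "Episode"; name: string; seriesKey: string | null; season: number | null };
type Group =
  | { kind: "movie"; name: string }
  | { kind: "series"; seriesKey: string; seasons: Map<number | null, Row[]> };

function walkGroups(rows: Row[]): Group[] {
  const groups: Group[] = [];
  for (const row of rows) {
    if (row.type === "Movie" || row.seriesKey == null) {
      groups.push({ kind: "movie", name: row.name });
      continue;
    }
    // Only merge into the *last* group: input is sorted, so a series' rows
    // are consecutive and a non-adjacent match means a different group.
    const last = groups[groups.length - 1];
    let series = last?.kind === "series" && last.seriesKey === row.seriesKey ? last : null;
    if (!series) {
      series = { kind: "series", seriesKey: row.seriesKey, seasons: new Map() };
      groups.push(series);
    }
    const bucket = series.seasons.get(row.season) ?? [];
    bucket.push(row);
    series.seasons.set(row.season, bucket);
  }
  return groups;
}
```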

### `GET /api/review/pipeline` changes

Drop `review` and `reviewTotal` from the response. Add `reviewItemsTotal: number` so the column header can show a count before the groups endpoint resolves. Queue / Processing / Done / `doneCount` stay unchanged.

### Kept as-is

- `POST /api/review/series/:seriesKey/approve-all` (`review.ts:529`)
- `POST /api/review/season/:seriesKey/:season/approve-all` (`review.ts:549`) — already implemented, just unused by the UI until now

## Client changes

### PipelinePage

Fetches `/api/review/pipeline` for the queue columns (existing) and, separately, `/api/review/groups?offset=0&limit=25` for the Review column's initial page. `onMutate` refetches both. Pass `reviewGroups`, `reviewGroupsTotalItems`, `reviewHasMore` into `ReviewColumn`.

### ReviewColumn

Replace the hard-cap rendering with infinite scroll:
- Render the currently loaded groups.
- Append a sentinel `<div>` at the bottom when `hasMore`. An `IntersectionObserver` attached to it triggers a fetch of the next page when it enters the scroll viewport.
- Pagination state (`offset`, `groups`, `hasMore`, `loading`) lives locally in ReviewColumn — the parent passes `initialGroups` on mount and whenever the filter changes (`onMutate` → parent refetches page 0).
- Remove the "Showing first N of M" banner and the `truncated` logic.
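
The pagination state transition those bullets describe is small enough to sketch outright (names are illustrative):

```typescript
// Pagination state kept inside ReviewColumn. Appending a fetched page moves
// the offset forward by the page size and adopts the server's hasMore flag.
type PageState<T> = { groups: T[]; offset: number; hasMore: boolean };

function appendPage<T>(
  state: PageState<T>,
  page: { groups: T[]; hasMore: boolean },
  limit: number,
): PageState<T> {
  return {
    groups: [...state.groups, ...page.groups],
    offset: state.offset + limit,
    hasMore: page.hasMore,
  };
}
```

The `IntersectionObserver` callback would fetch only while `hasMore` is true and no request is already in flight.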

### SeriesCard

When `seasons.length > 1`:
- Render seasons as collapsible sub-groups inside the expanded series body.
- Each season header: `S{NN} — {episodeCount} eps · {high} high / {low} low` + an `Approve season` button.

When `seasons.length === 1`:
- Render the current flat episode list (no extra nesting).

Rename the existing header button `Approve all` → `Approve series`.

### "Approve above"

Keeps its current "approve every group currently visible above this card" semantic. With lazy loading, that means "everything the user has scrolled past". Compute the item ids client-side across the loaded groups, as today. No endpoint change.

## Data flow

1. PipelinePage mounts → parallel fetch of `/pipeline` + `/groups?offset=0&limit=25`.
2. User scrolls; the sentinel becomes visible → fetch `/groups?offset=25&limit=25`; the result is appended to the list.
3. User clicks `Approve series` on a card → `POST /series/:key/approve-all` → `onMutate` → the parent refetches `/pipeline` + `/groups?offset=0&limit=25`. The series is gone from the list.
4. User clicks `Approve season S02` on a nested season → `POST /season/:key/2/approve-all` → `onMutate` → same refetch.

## Testing

- Server unit test: the `/groups` endpoint returns a series with all pending episodes even when the total item count exceeds `limit * offset_pages`.
- Server unit test: offset/limit/`hasMore` correctness across the group boundary.
- Server unit test: the seasons array is populated and sorted, with `null` season_number ordered last.
- Manual: scroll through the Review column on a library with >1000 pending items and confirm episode counts match `SELECT COUNT(*) ... WHERE pending AND is_noop=0` scoped per series.

## Guided Gates

- **GG-1:** No "Showing first 500 of N" banner ever appears.
- **GG-2:** A series whose episodes previously split across the cap now shows the correct episode count immediately on first page load (if the series is in the first page) or after scroll (if not).
- **GG-3:** A series with pending episodes in 2+ seasons expands into nested season sub-groups, each with an `Approve season` button that approves only that season.
- **GG-4:** A series with pending episodes in exactly one season expands into the flat episode list as before.
- **GG-5:** Scrolling to the bottom of the Review column auto-fetches the next page without a click; fetching stops when `hasMore` is false.
@@ -1,6 +1,6 @@
 {
   "name": "netfelix-audio-fix",
-  "version": "2026.04.14.13",
+  "version": "2026.04.15.10",
   "scripts": {
     "dev:server": "NODE_ENV=development bun --hot server/index.tsx",
     "dev:client": "vite",
85
server/api/__tests__/execute.test.ts
Normal file
@@ -0,0 +1,85 @@

import { describe, expect, test } from "bun:test";
import { enqueueUnseenJobs, extractErrorSummary, shouldSendLiveUpdate, yieldAfterChunk } from "../execute";

describe("extractErrorSummary", () => {
  test("pulls the real error line out of ffmpeg's banner", () => {
    const lines = [
      "[stderr] ffmpeg version 7.1.3 ...",
      "[stderr] built with gcc 14",
      "[stderr] Stream #0:2(eng): Subtitle: dvd_subtitle (dvdsub), 1280x720",
      "[stderr] Stream mapping:",
      "[stderr] Stream #0:2 -> #0:0 (copy)",
      "[stderr] [srt @ 0x55] Unsupported subtitles codec: dvd_subtitle",
      "[stderr] [out#0/srt @ 0x55] Could not write header (incorrect codec parameters ?): Invalid argument",
      "[stderr] Conversion failed!",
    ];
    const summary = extractErrorSummary(lines, new Error("FFmpeg exited with code 234"));
    expect(summary).toContain("Unsupported subtitles codec: dvd_subtitle");
    expect(summary).toContain("Invalid argument");
    expect(summary).toContain("Conversion failed!");
    // Should NOT include the banner lines.
    expect(summary).not.toContain("ffmpeg version");
    expect(summary).not.toContain("Stream #0:2");
  });

  test("dedupes identical fatal lines (e.g. repeated warnings)", () => {
    const lines = ["[stderr] Conversion failed!", "[stderr] Conversion failed!", "[stderr] Conversion failed!"];
    const summary = extractErrorSummary(lines);
    expect(summary?.split("\n").length).toBe(1);
  });

  test("falls back to the thrown error when no fatal line is found", () => {
    const lines = ["[stderr] ffmpeg version 7", "[stderr] Duration: 00:10:00"];
    const summary = extractErrorSummary(lines, new Error("FFmpeg exited with code 1"));
    expect(summary).toBe("Error: FFmpeg exited with code 1");
  });

  test("returns null when neither a fatal line nor a thrown error is available", () => {
    expect(extractErrorSummary([])).toBe(null);
    expect(extractErrorSummary(["[stderr] ffmpeg version 7"])).toBe(null);
  });

  test("only scans the tail — a banner from a prior run doesn't leak through", () => {
    // 70 filler lines, real error at the very end; scan window is 60.
    const filler = Array.from({ length: 70 }, (_, i) => `[stderr] banner line ${i}`);
    const lines = [...filler, "[stderr] Error: no space left on device"];
    const summary = extractErrorSummary(lines);
    expect(summary).toBe("Error: no space left on device");
  });
});

describe("shouldSendLiveUpdate", () => {
  test("throttles updates until interval passes", () => {
    expect(shouldSendLiveUpdate(1_000, 800, 500)).toBe(false);
    expect(shouldSendLiveUpdate(1_301, 800, 500)).toBe(true);
  });
});

describe("yieldAfterChunk", () => {
  test("yields once threshold is reached, resets chunk counter", async () => {
    let yieldCalls = 0;
    const sleep = async (_ms: number) => {
      yieldCalls += 1;
    };
    let chunks = 0;
    chunks = await yieldAfterChunk(chunks, 3, sleep);
    expect(chunks).toBe(1);
    chunks = await yieldAfterChunk(chunks, 3, sleep);
    expect(chunks).toBe(2);
    chunks = await yieldAfterChunk(chunks, 3, sleep);
    expect(chunks).toBe(0);
    expect(yieldCalls).toBe(1);
  });
});

describe("enqueueUnseenJobs", () => {
  test("appends only unseen job ids to the active queue", () => {
    const queue = [{ id: 1 }, { id: 2 }] as { id: number }[];
    const seen = new Set([1, 2]);
    const added = enqueueUnseenJobs(queue, seen, [{ id: 2 }, { id: 3 }, { id: 4 }] as { id: number }[]);
    expect(added).toBe(2);
    expect(queue.map((j) => j.id)).toEqual([1, 2, 3, 4]);
    expect(seen.has(3)).toBeTrue();
    expect(seen.has(4)).toBeTrue();
  });
});
161
server/api/__tests__/review-groups.test.ts
Normal file
@@ -0,0 +1,161 @@

import { Database } from "bun:sqlite";
import { describe, expect, test } from "bun:test";
import { SCHEMA } from "../../db/schema";
import { buildReviewGroups } from "../review";

function makeDb(): Database {
  const db = new Database(":memory:");
  for (const stmt of SCHEMA.split(";")) {
    const trimmed = stmt.trim();
    if (trimmed) db.run(trimmed);
  }
  return db;
}

interface SeedOpts {
  id: number;
  type: "Movie" | "Episode";
  name?: string;
  seriesName?: string | null;
  seriesJellyfinId?: string | null;
  seasonNumber?: number | null;
  episodeNumber?: number | null;
  confidence?: "high" | "low";
}

function seed(db: Database, opts: SeedOpts) {
  const {
    id,
    type,
    name = `Item ${id}`,
    seriesName = null,
    seriesJellyfinId = null,
    seasonNumber = null,
    episodeNumber = null,
    confidence = "high",
  } = opts;
  db
    .prepare(
      "INSERT INTO media_items (id, jellyfin_id, type, name, series_name, series_jellyfin_id, season_number, episode_number, file_path) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)",
    )
    .run(id, `jf-${id}`, type, name, seriesName, seriesJellyfinId, seasonNumber, episodeNumber, `/x/${id}.mkv`);
  db
    .prepare(
      "INSERT INTO review_plans (item_id, status, is_noop, confidence, apple_compat, job_type, notes) VALUES (?, 'pending', 0, ?, 'direct_play', 'copy', NULL)",
    )
    .run(id, confidence);
}

describe("buildReviewGroups", () => {
  test("returns a complete series with every pending episode", () => {
    const db = makeDb();
    for (let i = 1; i <= 30; i++) {
      seed(db, {
        id: i,
        type: "Episode",
        seriesName: "Breaking Bad",
        seriesJellyfinId: "bb",
        seasonNumber: 1,
        episodeNumber: i,
      });
    }

    const { groups, totalItems } = buildReviewGroups(db);

    expect(groups).toHaveLength(1);
    const series = groups[0];
    expect(series.kind).toBe("series");
    if (series.kind !== "series") throw new Error("expected series");
    expect(series.episodeCount).toBe(30);
    expect(series.seasons).toHaveLength(1);
    expect(series.seasons[0].episodes).toHaveLength(30);
    expect(totalItems).toBe(30);
  });

  test("buckets episodes by season with null ordered last", () => {
    const db = makeDb();
    for (let ep = 1; ep <= 3; ep++) {
      seed(db, {
        id: ep,
        type: "Episode",
        seriesName: "Lost",
        seriesJellyfinId: "lost",
        seasonNumber: 1,
        episodeNumber: ep,
      });
    }
    for (let ep = 1; ep <= 2; ep++) {
      seed(db, {
        id: 10 + ep,
        type: "Episode",
        seriesName: "Lost",
        seriesJellyfinId: "lost",
        seasonNumber: 2,
        episodeNumber: ep,
      });
    }
    seed(db, { id: 99, type: "Episode", seriesName: "Lost", seriesJellyfinId: "lost", seasonNumber: null });

    const { groups } = buildReviewGroups(db);
    expect(groups).toHaveLength(1);
    const lost = groups[0];
    if (lost.kind !== "series") throw new Error("expected series");
    expect(lost.seasons.map((s) => s.season)).toEqual([1, 2, null]);
    expect(lost.seasons[0].episodes).toHaveLength(3);
    expect(lost.seasons[1].episodes).toHaveLength(2);
    expect(lost.seasons[2].episodes).toHaveLength(1);
  });

  test("sorts groups: high-confidence first, then by name", () => {
    const db = makeDb();
    seed(db, { id: 1, type: "Movie", name: "Zodiac", confidence: "high" });
    seed(db, { id: 2, type: "Movie", name: "Arrival", confidence: "low" });
    seed(db, { id: 3, type: "Movie", name: "Blade Runner", confidence: "high" });

    const { groups } = buildReviewGroups(db);
    const names = groups.map((g) => (g.kind === "movie" ? g.item.name : g.seriesName));
    expect(names).toEqual(["Blade Runner", "Zodiac", "Arrival"]);
  });

  test("minConfidence is low when any episode in the series is low", () => {
    const db = makeDb();
    seed(db, {
      id: 1,
      type: "Episode",
      seriesName: "Show",
      seriesJellyfinId: "s",
      seasonNumber: 1,
      episodeNumber: 1,
      confidence: "high",
    });
    seed(db, {
      id: 2,
      type: "Episode",
      seriesName: "Show",
      seriesJellyfinId: "s",
      seasonNumber: 1,
      episodeNumber: 2,
      confidence: "low",
    });

    const { groups } = buildReviewGroups(db);
    expect(groups).toHaveLength(1);
    if (groups[0].kind !== "series") throw new Error("expected series");
    expect(groups[0].minConfidence).toBe("low");
  });

  test("excludes plans that are not pending or are is_noop=1", () => {
    const db = makeDb();
    seed(db, { id: 1, type: "Movie", name: "Pending" });
    seed(db, { id: 2, type: "Movie", name: "Approved" });
    db.prepare("UPDATE review_plans SET status = 'approved' WHERE item_id = ?").run(2);
    seed(db, { id: 3, type: "Movie", name: "Noop" });
    db.prepare("UPDATE review_plans SET is_noop = 1 WHERE item_id = ?").run(3);

    const { groups, totalItems } = buildReviewGroups(db);
    expect(groups).toHaveLength(1);
    expect(totalItems).toBe(1);
    if (groups[0].kind !== "movie") throw new Error("expected movie");
    expect(groups[0].item.name).toBe("Pending");
  });
});
101
server/api/__tests__/scan.test.ts
Normal file
@@ -0,0 +1,101 @@
|
||||
import { describe, expect, test } from "bun:test";
import { buildScanItemsWhere, parseScanItemsQuery, parseScanLimit } from "../scan";

describe("parseScanLimit", () => {
  test("accepts positive integers and nullish/empty as no-limit", () => {
    expect(parseScanLimit(5)).toEqual({ ok: true, value: 5 });
    expect(parseScanLimit(1)).toEqual({ ok: true, value: 1 });
    expect(parseScanLimit(10_000)).toEqual({ ok: true, value: 10_000 });
    expect(parseScanLimit(null)).toEqual({ ok: true, value: null });
    expect(parseScanLimit(undefined)).toEqual({ ok: true, value: null });
    expect(parseScanLimit("")).toEqual({ ok: true, value: null });
  });

  test("coerces numeric strings (env var path) but rejects garbage", () => {
    expect(parseScanLimit("7")).toEqual({ ok: true, value: 7 });
    expect(parseScanLimit("abc")).toEqual({ ok: false });
    expect(parseScanLimit("12abc")).toEqual({ ok: false });
  });

  test("rejects the footguns that would silently disable the cap", () => {
    // NaN: processed >= NaN never trips → cap never fires.
    expect(parseScanLimit(Number.NaN)).toEqual({ ok: false });
    // Negative: off-by-one bugs in Math.min(limit, total).
    expect(parseScanLimit(-1)).toEqual({ ok: false });
    expect(parseScanLimit(0)).toEqual({ ok: false });
    // Float: Math.min is fine but percentage math breaks on non-integers.
    expect(parseScanLimit(1.5)).toEqual({ ok: false });
    // Infinity is technically a number but has no business as a cap.
    expect(parseScanLimit(Number.POSITIVE_INFINITY)).toEqual({ ok: false });
  });
});

describe("parseScanItemsQuery", () => {
  test("normalizes default filters and pagination", () => {
    const q = parseScanItemsQuery({});
    expect(q).toEqual({
      offset: 0,
      limit: 50,
      search: "",
      status: "all",
      type: "all",
      source: "all",
    });
  });

  test("clamps limit and offset, trims and lowercases values", () => {
    const q = parseScanItemsQuery({
      offset: "-12",
      limit: "5000",
      q: " The Wire ",
      status: "SCANNED",
      type: "EPISODE",
      source: "WEBHOOK",
    });
    expect(q).toEqual({
      offset: 0,
      limit: 200,
      search: "The Wire",
      status: "scanned",
      type: "episode",
      source: "webhook",
    });
  });

  test("falls back to all for unknown enum values", () => {
    const q = parseScanItemsQuery({ status: "zzz", type: "cartoon", source: "mqtt" });
    expect(q.status).toBe("all");
    expect(q.type).toBe("all");
    expect(q.source).toBe("all");
  });
});

describe("buildScanItemsWhere", () => {
  test("builds combined where clause + args in stable order", () => {
    const where = buildScanItemsWhere({
      offset: 0,
      limit: 50,
      search: "blade",
      status: "scanned",
      type: "movie",
      source: "webhook",
    });
    expect(where.sql).toBe(
      "WHERE scan_status = ? AND lower(type) = ? AND ingest_source = ? AND (lower(name) LIKE ? OR lower(file_path) LIKE ?)",
    );
    expect(where.args).toEqual(["scanned", "movie", "webhook", "%blade%", "%blade%"]);
  });

  test("returns empty where when all filters are broad", () => {
    const where = buildScanItemsWhere({
      offset: 0,
      limit: 50,
      search: "",
      status: "all",
      type: "all",
      source: "all",
    });
    expect(where.sql).toBe("");
    expect(where.args).toEqual([]);
  });
});
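Taken together, the assertions above fully pin down the parser's contract; here is a minimal sketch that satisfies them (the real `parseScanLimit` in `server/api/scan.ts` may be written differently):

```typescript
// Minimal sketch of parseScanLimit matching the tests above: nullish/empty
// means "no cap", positive integers pass through, everything else is rejected.
type ScanLimit = { ok: true; value: number | null } | { ok: false };

function parseScanLimit(raw: unknown): ScanLimit {
  if (raw === null || raw === undefined || raw === "") return { ok: true, value: null };
  const n = typeof raw === "number" ? raw : Number(raw); // Number("12abc") → NaN
  // Number.isInteger rejects NaN, Infinity, and floats in one check.
  if (!Number.isInteger(n) || n <= 0) return { ok: false };
  return { ok: true, value: n };
}
```

The single `Number.isInteger(n) && n > 0` gate is what makes the footgun cases (NaN, `-1`, `0`, `1.5`, `Infinity`) fall out for free.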
@@ -1,10 +1,10 @@
import { accessSync, constants } from "node:fs";
import { Hono } from "hono";
import { stream } from "hono/streaming";
import { getAllConfig, getDb } from "../db/index";
import { getDb } from "../db/index";
import { log, error as logError, warn } from "../lib/log";
import { parseId } from "../lib/validate";
import { predictExtractedFiles } from "../services/ffmpeg";
import { refreshItem } from "../services/jellyfin";
import {
  getScheduleConfig,
  isInProcessWindow,
@@ -16,29 +16,6 @@ import {
import { verifyDesiredState } from "../services/verify";
import type { Job, MediaItem, MediaStream } from "../types";

/**
 * Fire-and-forget hand-off to Jellyfin after a successful job: ask Jellyfin
 * to re-scan the file and return immediately. The MQTT webhook subscriber
 * closes the loop once Jellyfin finishes its rescan and publishes an event.
 */
async function handOffToJellyfin(itemId: number): Promise<void> {
  const db = getDb();
  const row = db.prepare("SELECT jellyfin_id FROM media_items WHERE id = ?").get(itemId) as
    | { jellyfin_id: string }
    | undefined;
  if (!row) return;

  const cfg = getAllConfig();
  const jellyfinCfg = { url: cfg.jellyfin_url, apiKey: cfg.jellyfin_api_key, userId: cfg.jellyfin_user_id };
  if (!jellyfinCfg.url || !jellyfinCfg.apiKey) return;

  try {
    await refreshItem(jellyfinCfg, row.jellyfin_id);
  } catch (err) {
    warn(`Jellyfin refresh for item ${itemId} failed: ${String(err)}`);
  }
}

const app = new Hono();

// ─── Sequential local queue ──────────────────────────────────────────────────
@@ -46,6 +23,36 @@ const app = new Hono();
let queueRunning = false;
let runningProc: ReturnType<typeof Bun.spawn> | null = null;
let runningJobId: number | null = null;
let activeQueue: Job[] | null = null;
let activeSeen: Set<number> | null = null;
const LIVE_UPDATE_INTERVAL_MS = 500;
const STREAM_CHUNKS_BEFORE_YIELD = 24;

export function shouldSendLiveUpdate(now: number, lastSentAt: number, intervalMs = LIVE_UPDATE_INTERVAL_MS): boolean {
  return now - lastSentAt > intervalMs;
}

export async function yieldAfterChunk(
  chunksSinceYield: number,
  chunksBeforeYield = STREAM_CHUNKS_BEFORE_YIELD,
  sleep: (ms: number) => Promise<unknown> = (ms) => Bun.sleep(ms),
): Promise<number> {
  const next = chunksSinceYield + 1;
  if (next < chunksBeforeYield) return next;
  await sleep(0);
  return 0;
}

export function enqueueUnseenJobs<T extends { id: number }>(queue: T[], seen: Set<number>, jobs: T[]): number {
  let added = 0;
  for (const job of jobs) {
    if (seen.has(job.id)) continue;
    queue.push(job);
    seen.add(job.id);
    added += 1;
  }
  return added;
}

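These helpers are deliberately pure (no DB, no timers), which is what makes them unit-testable; a usage sketch showing the de-duplication and throttle behavior (job shapes are simplified):

```typescript
// Simplified copies of the two pure queue helpers shown above.
function shouldSendLiveUpdate(now: number, lastSentAt: number, intervalMs = 500): boolean {
  return now - lastSentAt > intervalMs;
}

function enqueueUnseenJobs<T extends { id: number }>(queue: T[], seen: Set<number>, jobs: T[]): number {
  let added = 0;
  for (const job of jobs) {
    if (seen.has(job.id)) continue; // already queued once; skip duplicates
    queue.push(job);
    seen.add(job.id);
    added += 1;
  }
  return added;
}

// Job 1 is already tracked, so re-enqueueing [1, 2] only adds job 2.
const queue = [{ id: 1 }];
const seen = new Set<number>([1]);
const added = enqueueUnseenJobs(queue, seen, [{ id: 1 }, { id: 2 }]);
// added === 1, queue.length === 2

// 600 ms since the last update exceeds the 500 ms throttle; 400 ms does not.
// shouldSendLiveUpdate(1000, 400) === true, shouldSendLiveUpdate(1000, 600) === false
```

The shared `seen` set is why `/start` can safely fold newly approved jobs into a queue that is already running: a job id is only ever enqueued once per run.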
function emitQueueStatus(
  status: "running" | "paused" | "sleeping" | "idle",
@@ -55,12 +62,19 @@ function emitQueueStatus(
  for (const l of jobListeners) l(line);
}

async function runSequential(jobs: Job[]): Promise<void> {
async function runSequential(initial: Job[]): Promise<void> {
  if (queueRunning) return;
  queueRunning = true;
  try {
    let first = true;
    for (const job of jobs) {
    const queue: Job[] = [...initial];
    const seen = new Set<number>(queue.map((j) => j.id));
    activeQueue = queue;
    activeSeen = seen;

    while (queue.length > 0) {
      const job = queue.shift() as Job;

      // Pause outside the processing window
      if (!isInProcessWindow()) {
        emitQueueStatus("paused", {
@@ -94,8 +108,18 @@ async function runSequential(jobs: Job[]): Promise<void> {
      } catch (err) {
        logError(`Job ${job.id} failed:`, err);
      }

      // When the local queue drains, re-check the DB for jobs that were
      // approved mid-run. Without this they'd sit pending until the user
      // manually clicks "Run all" again.
      if (queue.length === 0) {
        const more = db.prepare("SELECT * FROM jobs WHERE status = 'pending' ORDER BY created_at").all() as Job[];
        enqueueUnseenJobs(queue, seen, more);
      }
    }
  } finally {
    activeQueue = null;
    activeSeen = null;
    queueRunning = false;
    emitQueueStatus("idle");
  }
@@ -161,77 +185,17 @@ function loadJobRow(jobId: number) {
  return { job: row as unknown as Job, item };
}

// ─── List ─────────────────────────────────────────────────────────────────────

app.get("/", (c) => {
  const db = getDb();
  const filter = (c.req.query("filter") ?? "pending") as "all" | "pending" | "running" | "done" | "error";

  const validFilters = ["all", "pending", "running", "done", "error"];
  const whereClause = validFilters.includes(filter) && filter !== "all" ? `WHERE j.status = ?` : "";
  const params = whereClause ? [filter] : [];

  const jobRows = db
    .prepare(`
      SELECT j.*, mi.name, mi.type, mi.series_name, mi.season_number, mi.episode_number, mi.file_path
      FROM jobs j
      LEFT JOIN media_items mi ON mi.id = j.item_id
      ${whereClause}
      ORDER BY j.created_at DESC
      LIMIT 200
    `)
    .all(...params) as (Job & {
    name: string;
    type: string;
    series_name: string | null;
    season_number: number | null;
    episode_number: number | null;
    file_path: string;
  })[];

  const jobs = jobRows.map((r) => ({
    job: r as unknown as Job,
    item: r.name
      ? ({
          id: r.item_id,
          name: r.name,
          type: r.type,
          series_name: r.series_name,
          season_number: r.season_number,
          episode_number: r.episode_number,
          file_path: r.file_path,
        } as unknown as MediaItem)
      : null,
  }));

  const countRows = db.prepare("SELECT status, COUNT(*) as cnt FROM jobs GROUP BY status").all() as {
    status: string;
    cnt: number;
  }[];
  const totalCounts: Record<string, number> = { all: 0, pending: 0, running: 0, done: 0, error: 0 };
  for (const row of countRows) {
    totalCounts[row.status] = row.cnt;
    totalCounts.all += row.cnt;
  }

  return c.json({ jobs, filter, totalCounts });
});

// ─── Param helpers ────────────────────────────────────────────────────────────

function parseId(raw: string | undefined): number | null {
  if (!raw) return null;
  const n = Number.parseInt(raw, 10);
  return Number.isFinite(n) && n > 0 ? n : null;
}

// ─── Start all pending ────────────────────────────────────────────────────────

app.post("/start", (c) => {
  const db = getDb();
  const pending = db.prepare("SELECT * FROM jobs WHERE status = 'pending' ORDER BY created_at").all() as Job[];
  if (queueRunning && activeQueue && activeSeen) {
    const queued = enqueueUnseenJobs(activeQueue, activeSeen, pending);
    return c.json({ ok: true, started: 0, queued });
  }
  runSequential(pending).catch((err) => logError("Queue failed:", err));
  return c.json({ ok: true, started: pending.length });
  return c.json({ ok: true, started: pending.length, queued: pending.length });
});

// ─── Run single ───────────────────────────────────────────────────────────────
@@ -398,14 +362,16 @@ async function runJob(job: Job): Promise<void> {
  const updateOutput = db.prepare("UPDATE jobs SET output = ? WHERE id = ?");

  const flush = (final = false) => {
    const text = outputLines.join("\n");
    const now = Date.now();
    if (final || now - lastFlushAt > 500) {
    if (!final && !shouldSendLiveUpdate(now, lastFlushAt)) {
      pendingFlush = true;
      return;
    }
    const text = outputLines.join("\n");
    if (final || shouldSendLiveUpdate(now, lastFlushAt)) {
      updateOutput.run(text, job.id);
      lastFlushAt = now;
      pendingFlush = false;
    } else {
      pendingFlush = true;
    }
    emitJobUpdate(job.id, "running", text);
  };
@@ -418,7 +384,7 @@ async function runJob(job: Job): Promise<void> {
    const progressed = parseFFmpegProgress(line);
    if (progressed != null && totalSeconds > 0) {
      const now = Date.now();
      if (now - lastProgressEmit > 500) {
      if (shouldSendLiveUpdate(now, lastProgressEmit)) {
        emitJobProgress(job.id, progressed, totalSeconds);
        lastProgressEmit = now;
      }
@@ -433,6 +399,7 @@ async function runJob(job: Job): Promise<void> {
  const reader = readable.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  let chunksSinceYield = 0;
  try {
    while (true) {
      const { done, value } = await reader.read();
@@ -446,6 +413,8 @@ async function runJob(job: Job): Promise<void> {
        consumeProgress(line);
      }
      flush();
      // Let pending HTTP requests run even when ffmpeg floods stdout/stderr.
      chunksSinceYield = await yieldAfterChunk(chunksSinceYield);
    }
    if (buffer.trim()) {
      outputLines.push(prefix + buffer);
@@ -487,29 +456,58 @@ async function runJob(job: Job): Promise<void> {

    log(`Job ${job.id} completed successfully`);
    emitJobUpdate(job.id, "done", fullOutput);

    // Fire-and-forget: tell Jellyfin to rescan the file. The MQTT subscriber
    // will pick up Jellyfin's resulting Library event and re-analyze the
    // item — flipping the plan back to 'pending' if the on-disk streams
    // don't actually match the plan. We don't await that; the job queue
    // moves on.
    handOffToJellyfin(job.item_id).catch((err) =>
      warn(`Jellyfin hand-off for item ${job.item_id} failed: ${String(err)}`),
    );
  } catch (err) {
    logError(`Job ${job.id} failed:`, err);
    const fullOutput = `${outputLines.join("\n")}\n${String(err)}`;
    const summary = extractErrorSummary(outputLines, err);
    // Prepend the scraped summary so the job log starts with what broke.
    // ffmpeg's 200-line stream+config banner buries the real error; this
    // gives the UI a crisp hook for the failure cause.
    const annotatedOutput = summary ? `${summary}\n\n---\n\n${fullOutput}` : fullOutput;
    db
      .prepare("UPDATE jobs SET status = 'error', exit_code = 1, output = ?, completed_at = datetime('now') WHERE id = ?")
      .run(fullOutput, job.id);
    emitJobUpdate(job.id, "error", fullOutput);
    db.prepare("UPDATE review_plans SET status = 'error' WHERE item_id = ?").run(job.item_id);
      .run(annotatedOutput, job.id);
    emitJobUpdate(job.id, "error", annotatedOutput);
    db
      .prepare("UPDATE review_plans SET status = 'error', notes = ? WHERE item_id = ?")
      .run(summary ?? String(err), job.item_id);
  } finally {
    runningProc = null;
    runningJobId = null;
  }
}

/**
 * Extract a short, human-readable reason from a failed job's stderr.
 *
 * ffmpeg prints a ~200-line banner (version, config, every stream in the
 * input file) before the real error shows up. We scan the tail of the
 * output for the last line matching fatal keywords, plus anything ffmpeg
 * explicitly labels "Error:" or "Conversion failed!". Returns up to three
 * lines so the UI can show a crisp summary without users scrolling the
 * full log.
 */
export function extractErrorSummary(outputLines: string[], thrown?: unknown): string | null {
  const FATAL =
    /(Error:|Conversion failed!|Unsupported\b|Invalid argument|Permission denied|No such file|Cannot allocate|No space left|Killed|Segmentation fault)/;
  // Only scan the last 60 lines — anything earlier is the banner or stream
  // mapping. The real cause sits near the end.
  const tail = outputLines.slice(-60).filter((l) => l.trim());
  const hits: string[] = [];
  for (const line of tail) {
    if (FATAL.test(line)) hits.push(line.replace(/^\[stderr]\s*/, ""));
  }
  const unique = [...new Set(hits)].slice(-3);
  if (unique.length === 0) {
    // Fell off the end with no recognisable fatal line — fall back to the
    // thrown error (usually "FFmpeg exited with code N"). Better than
    // showing nothing, since the exit code at least tells someone *where*
    // to look.
    return thrown ? String(thrown) : null;
  }
  return unique.join("\n");
}
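As an illustration, here is the same tail-scan reduced to its core (a simplified sketch with a subset of the real fatal pattern; the sample log lines are invented):

```typescript
// Keep the last lines matching a fatal pattern, strip the "[stderr]" prefix,
// de-duplicate, and cap the summary at three lines.
const FATAL = /(Error:|Conversion failed!|No space left)/;

function summarize(outputLines: string[]): string | null {
  const hits = outputLines
    .slice(-60)
    .filter((l) => FATAL.test(l))
    .map((l) => l.replace(/^\[stderr]\s*/, ""));
  const unique = [...new Set(hits)].slice(-3);
  return unique.length > 0 ? unique.join("\n") : null;
}

const sampleLog = [
  "[stderr] ffmpeg version 6.0 banner noise, ignored",
  "[stderr] Error: unsupported codec",
  "[stderr] Conversion failed!",
];
// summarize(sampleLog) → "Error: unsupported codec\nConversion failed!"
```

Returning `null` (rather than an empty string) when nothing matches is what lets the caller fall back to the thrown error.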

// Scheduler endpoints live on /api/settings/schedule now — see server/api/settings.ts.

// ─── FFmpeg progress parsing ───────────────────────────────────────────────────

@@ -4,7 +4,7 @@ import { isOneOf, parseId } from "../lib/validate";
import { analyzeItem, assignTargetOrder } from "../services/analyzer";
import { buildCommand } from "../services/ffmpeg";
import { getItem, mapStream, normalizeLanguage, refreshItem } from "../services/jellyfin";
import type { MediaItem, MediaStream, ReviewPlan, StreamDecision } from "../types";
import type { Job, MediaItem, MediaStream, ReviewPlan, StreamDecision } from "../types";

const app = new Hono();

@@ -110,7 +110,7 @@ function rowToPlan(r: RawRow): ReviewPlan | null {

function loadItemDetail(db: ReturnType<typeof getDb>, itemId: number) {
  const item = db.prepare("SELECT * FROM media_items WHERE id = ?").get(itemId) as MediaItem | undefined;
  if (!item) return { item: null, streams: [], plan: null, decisions: [], command: null };
  if (!item) return { item: null, streams: [], plan: null, decisions: [], command: null, job: null };

  const streams = db
    .prepare("SELECT * FROM media_streams WHERE item_id = ? ORDER BY stream_index")
@@ -122,7 +122,15 @@ function loadItemDetail(db: ReturnType<typeof getDb>, itemId: number) {

  const command = plan && !plan.is_noop ? buildCommand(item, streams, decisions) : null;

  return { item, streams, plan: plan ?? null, decisions, command };
  const job = db
    .prepare(
      `SELECT id, item_id, command, job_type, status, output, exit_code,
              created_at, started_at, completed_at
       FROM jobs WHERE item_id = ? ORDER BY created_at DESC LIMIT 1`,
    )
    .get(itemId) as Job | undefined;

  return { item, streams, plan: plan ?? null, decisions, command, job: job ?? null };
}

/**
@@ -130,11 +138,16 @@ function loadItemDetail(db: ReturnType<typeof getDb>, itemId: number) {
 * composite of (type, language, stream_index, title) so user overrides
 * survive stream-id changes when Jellyfin re-probes metadata.
 */
function titleKey(s: { type: string; language: string | null; stream_index: number; title: string | null }): string {
export function titleKey(s: {
  type: string;
  language: string | null;
  stream_index: number;
  title: string | null;
}): string {
  return `${s.type}|${s.language ?? ""}|${s.stream_index}|${s.title ?? ""}`;
}

function reanalyze(db: ReturnType<typeof getDb>, itemId: number, preservedTitles?: Map<string, string>): void {
export function reanalyze(db: ReturnType<typeof getDb>, itemId: number, preservedTitles?: Map<string, string>): void {
  const item = db.prepare("SELECT * FROM media_items WHERE id = ?").get(itemId) as MediaItem;
  if (!item) return;

@@ -262,36 +275,246 @@ interface PipelineAudioStream {
  action: "keep" | "remove";
}

type EnrichableRow = { id?: number; plan_id?: number; item_id: number } & {
  transcode_reasons?: string[];
  audio_streams?: PipelineAudioStream[];
};

/**
 * Enrich review/queued rows with transcode-reason badges and pre-checked audio
 * streams. Works for both the Review column (where `id` is the plan id) and
 * the Queued column (where `plan_id` is explicit and `id` is the job id).
 */
function enrichWithStreamsAndReasons(db: ReturnType<typeof getDb>, rows: EnrichableRow[]): void {
  if (rows.length === 0) return;
  const planIdFor = (r: EnrichableRow): number => (r.plan_id ?? r.id) as number;
  const planIds = rows.map(planIdFor);
  const itemIds = rows.map((r) => r.item_id);

  const reasonPh = planIds.map(() => "?").join(",");
  const allReasons = db
    .prepare(`
      SELECT DISTINCT sd.plan_id, ms.codec, sd.transcode_codec
      FROM stream_decisions sd
      JOIN media_streams ms ON ms.id = sd.stream_id
      WHERE sd.plan_id IN (${reasonPh}) AND sd.transcode_codec IS NOT NULL
    `)
    .all(...planIds) as { plan_id: number; codec: string | null; transcode_codec: string }[];
  const reasonsByPlan = new Map<number, string[]>();
  for (const r of allReasons) {
    if (!reasonsByPlan.has(r.plan_id)) reasonsByPlan.set(r.plan_id, []);
    reasonsByPlan.get(r.plan_id)!.push(`${(r.codec ?? "").toUpperCase()} → ${r.transcode_codec.toUpperCase()}`);
  }

  const streamPh = itemIds.map(() => "?").join(",");
  const streamRows = db
    .prepare(`
      SELECT ms.id, ms.item_id, ms.language, ms.codec, ms.channels, ms.title,
             ms.is_default, sd.action
      FROM media_streams ms
      JOIN review_plans rp ON rp.item_id = ms.item_id
      LEFT JOIN stream_decisions sd ON sd.plan_id = rp.id AND sd.stream_id = ms.id
      WHERE ms.item_id IN (${streamPh}) AND ms.type = 'Audio'
      ORDER BY ms.item_id, ms.stream_index
    `)
    .all(...itemIds) as {
    id: number;
    item_id: number;
    language: string | null;
    codec: string | null;
    channels: number | null;
    title: string | null;
    is_default: number;
    action: "keep" | "remove" | null;
  }[];
  const streamsByItem = new Map<number, PipelineAudioStream[]>();
  for (const r of streamRows) {
    if (!streamsByItem.has(r.item_id)) streamsByItem.set(r.item_id, []);
    streamsByItem.get(r.item_id)!.push({
      id: r.id,
      language: r.language,
      codec: r.codec,
      channels: r.channels,
      title: r.title,
      is_default: r.is_default,
      action: r.action ?? "keep",
    });
  }

  for (const r of rows) {
    r.transcode_reasons = reasonsByPlan.get(planIdFor(r)) ?? [];
    r.audio_streams = streamsByItem.get(r.item_id) ?? [];
  }
}

// ─── Review groups (paginated, always returns complete series) ──────────────

interface ReviewItemRow {
  id: number;
  item_id: number;
  status: string;
  is_noop: number;
  confidence: "high" | "low";
  apple_compat: ReviewPlan["apple_compat"];
  job_type: "copy" | "transcode";
  name: string;
  series_name: string | null;
  series_jellyfin_id: string | null;
  jellyfin_id: string;
  season_number: number | null;
  episode_number: number | null;
  type: "Movie" | "Episode";
  container: string | null;
  original_language: string | null;
  orig_lang_source: string | null;
  file_path: string;
  transcode_reasons?: string[];
  audio_streams?: PipelineAudioStream[];
}

type ReviewGroup =
  | { kind: "movie"; item: ReviewItemRow }
  | {
      kind: "series";
      seriesKey: string;
      seriesName: string;
      seriesJellyfinId: string | null;
      episodeCount: number;
      minConfidence: "high" | "low";
      originalLanguage: string | null;
      seasons: Array<{ season: number | null; episodes: ReviewItemRow[] }>;
    };

export function buildReviewGroups(db: ReturnType<typeof getDb>): { groups: ReviewGroup[]; totalItems: number } {
  const rows = db
    .prepare(`
      SELECT rp.*, mi.name, mi.series_name, mi.series_jellyfin_id,
             mi.jellyfin_id,
             mi.season_number, mi.episode_number, mi.type, mi.container,
             mi.original_language, mi.orig_lang_source, mi.file_path
      FROM review_plans rp
      JOIN media_items mi ON mi.id = rp.item_id
      WHERE rp.status = 'pending' AND rp.is_noop = 0
      ORDER BY
        CASE rp.confidence WHEN 'high' THEN 0 ELSE 1 END,
        COALESCE(mi.series_name, mi.name),
        mi.season_number, mi.episode_number
    `)
    .all() as ReviewItemRow[];

  const movieGroups: ReviewGroup[] = [];
  interface SeriesAccum {
    seriesName: string;
    seriesJellyfinId: string | null;
    seasons: Map<number | null, ReviewItemRow[]>;
    originalLanguage: string | null;
    hasLow: boolean;
  }
  const seriesMap = new Map<string, SeriesAccum>();

  for (const row of rows) {
    if (row.type === "Movie") {
      movieGroups.push({ kind: "movie", item: row });
      continue;
    }
    const key = row.series_jellyfin_id ?? row.series_name ?? String(row.item_id);
    let entry = seriesMap.get(key);
    if (!entry) {
      entry = {
        seriesName: row.series_name ?? "",
        seriesJellyfinId: row.series_jellyfin_id,
        seasons: new Map(),
        originalLanguage: row.original_language,
        hasLow: false,
      };
      seriesMap.set(key, entry);
    }
    let bucket = entry.seasons.get(row.season_number);
    if (!bucket) {
      bucket = [];
      entry.seasons.set(row.season_number, bucket);
    }
    bucket.push(row);
    if (row.confidence === "low") entry.hasLow = true;
  }

  const seriesGroups: ReviewGroup[] = [];
  for (const [seriesKey, entry] of seriesMap) {
    const seasonKeys = [...entry.seasons.keys()].sort((a, b) => {
      if (a === null) return 1;
      if (b === null) return -1;
      return a - b;
    });
    const seasons = seasonKeys.map((season) => ({
      season,
      episodes: (entry.seasons.get(season) ?? []).sort((a, b) => (a.episode_number ?? 0) - (b.episode_number ?? 0)),
    }));
    const episodeCount = seasons.reduce((sum, s) => sum + s.episodes.length, 0);
    seriesGroups.push({
      kind: "series",
      seriesKey,
      seriesName: entry.seriesName,
      seriesJellyfinId: entry.seriesJellyfinId,
      episodeCount,
      minConfidence: entry.hasLow ? "low" : "high",
      originalLanguage: entry.originalLanguage,
      seasons,
    });
  }

  const allGroups = [...movieGroups, ...seriesGroups].sort((a, b) => {
    const confA = a.kind === "movie" ? a.item.confidence : a.minConfidence;
    const confB = b.kind === "movie" ? b.item.confidence : b.minConfidence;
    const rankA = confA === "high" ? 0 : 1;
    const rankB = confB === "high" ? 0 : 1;
    if (rankA !== rankB) return rankA - rankB;
    const nameA = a.kind === "movie" ? a.item.name : a.seriesName;
    const nameB = b.kind === "movie" ? b.item.name : b.seriesName;
    return nameA.localeCompare(nameB);
  });

  const totalItems =
    movieGroups.length + seriesGroups.reduce((sum, g) => sum + (g.kind === "series" ? g.episodeCount : 0), 0);
  return { groups: allGroups, totalItems };
}
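One detail worth calling out from the grouping pass above is the season-ordering rule: season keys sort numerically with `null` (unknown season / specials) pushed to the end. A small sketch isolating that comparator:

```typescript
// Mirrors the seasonKeys sort in buildReviewGroups: numeric ascending,
// null keys always last.
function sortSeasonKeys(keys: Array<number | null>): Array<number | null> {
  return [...keys].sort((a, b) => {
    if (a === null) return 1;
    if (b === null) return -1;
    return a - b;
  });
}

const ordered = sortSeasonKeys([2, null, 1]);
// ordered is [1, 2, null]
```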

app.get("/groups", (c) => {
  const db = getDb();
  const offset = Math.max(0, Number.parseInt(c.req.query("offset") ?? "0", 10) || 0);
  const limit = Math.max(1, Math.min(200, Number.parseInt(c.req.query("limit") ?? "25", 10) || 25));

  const { groups, totalItems } = buildReviewGroups(db);
  const page = groups.slice(offset, offset + limit);

  // Enrich each visible episode/movie with audio streams + transcode reasons.
  const flat: EnrichableRow[] = [];
  for (const g of page) {
    if (g.kind === "movie") flat.push(g.item as EnrichableRow);
    else for (const s of g.seasons) for (const ep of s.episodes) flat.push(ep as EnrichableRow);
  }
  enrichWithStreamsAndReasons(db, flat);

  return c.json({
    groups: page,
    totalGroups: groups.length,
    totalItems,
    hasMore: offset + limit < groups.length,
  });
});

app.get("/pipeline", (c) => {
  const db = getDb();
  const jellyfinUrl = getConfig("jellyfin_url") ?? "";

  // Cap the review column to keep the page snappy at scale; pipelines
  // with thousands of pending items would otherwise ship 10k+ rows on
  // every refresh and re-render every card.
  const REVIEW_LIMIT = 500;
  const review = db
    .prepare(`
      SELECT rp.*, mi.name, mi.series_name, mi.series_jellyfin_id,
             mi.jellyfin_id,
             mi.season_number, mi.episode_number, mi.type, mi.container,
             mi.original_language, mi.orig_lang_source, mi.file_path
      FROM review_plans rp
      JOIN media_items mi ON mi.id = rp.item_id
      WHERE rp.status = 'pending' AND rp.is_noop = 0
      ORDER BY
        CASE rp.confidence WHEN 'high' THEN 0 ELSE 1 END,
        COALESCE(mi.series_name, mi.name),
        mi.season_number, mi.episode_number
      LIMIT ${REVIEW_LIMIT}
    `)
    .all();
  const reviewTotal = (
  // Review items ship via GET /groups (paginated, always returns complete
  // series). The pipeline payload only carries the total count so the column
  // header can render immediately.
  const reviewItemsTotal = (
    db.prepare("SELECT COUNT(*) as n FROM review_plans WHERE status = 'pending' AND is_noop = 0").get() as { n: number }
  ).n;

  // Queued gets the same enrichment as review so the card can render
  // streams + transcode reasons read-only (with a "Back to review" button).
  // Queued carries stream + transcode-reason enrichment so the card renders
  // read-only with a "Back to review" button.
  const queued = db
    .prepare(`
      SELECT j.id, j.item_id, j.status, j.started_at, j.completed_at,
@@ -342,79 +565,9 @@ app.get("/pipeline", (c) => {
|
||||
};
|
||||
const doneCount = noopRow.n + doneRow.n;
|
||||
|
||||
// Enrich rows that have (plan_id, item_id) with the transcode-reason
|
||||
// badges and pre-checked audio streams. Used for both review and queued
|
||||
// columns so the queued card can render read-only with the same info.
|
||||
type EnrichableRow = { id?: number; plan_id?: number; item_id: number } & {
|
||||
transcode_reasons?: string[];
|
||||
audio_streams?: PipelineAudioStream[];
|
||||
};
|
||||
const enrichWithStreamsAndReasons = (rows: EnrichableRow[]) => {
|
||||
if (rows.length === 0) return;
const planIdFor = (r: EnrichableRow): number => (r.plan_id ?? r.id) as number;
const planIds = rows.map(planIdFor);
const itemIds = rows.map((r) => r.item_id);
enrichWithStreamsAndReasons(db, queued as EnrichableRow[]);

const reasonPh = planIds.map(() => "?").join(",");
const allReasons = db
  .prepare(`
    SELECT DISTINCT sd.plan_id, ms.codec, sd.transcode_codec
    FROM stream_decisions sd
    JOIN media_streams ms ON ms.id = sd.stream_id
    WHERE sd.plan_id IN (${reasonPh}) AND sd.transcode_codec IS NOT NULL
  `)
  .all(...planIds) as { plan_id: number; codec: string | null; transcode_codec: string }[];
const reasonsByPlan = new Map<number, string[]>();
for (const r of allReasons) {
  if (!reasonsByPlan.has(r.plan_id)) reasonsByPlan.set(r.plan_id, []);
  reasonsByPlan.get(r.plan_id)!.push(`${(r.codec ?? "").toUpperCase()} → ${r.transcode_codec.toUpperCase()}`);
}

const streamPh = itemIds.map(() => "?").join(",");
const streamRows = db
  .prepare(`
    SELECT ms.id, ms.item_id, ms.language, ms.codec, ms.channels, ms.title,
           ms.is_default, sd.action
    FROM media_streams ms
    JOIN review_plans rp ON rp.item_id = ms.item_id
    LEFT JOIN stream_decisions sd ON sd.plan_id = rp.id AND sd.stream_id = ms.id
    WHERE ms.item_id IN (${streamPh}) AND ms.type = 'Audio'
    ORDER BY ms.item_id, ms.stream_index
  `)
  .all(...itemIds) as {
    id: number;
    item_id: number;
    language: string | null;
    codec: string | null;
    channels: number | null;
    title: string | null;
    is_default: number;
    action: "keep" | "remove" | null;
  }[];
const streamsByItem = new Map<number, PipelineAudioStream[]>();
for (const r of streamRows) {
  if (!streamsByItem.has(r.item_id)) streamsByItem.set(r.item_id, []);
  streamsByItem.get(r.item_id)!.push({
    id: r.id,
    language: r.language,
    codec: r.codec,
    channels: r.channels,
    title: r.title,
    is_default: r.is_default,
    action: r.action ?? "keep",
  });
}

for (const r of rows) {
  r.transcode_reasons = reasonsByPlan.get(planIdFor(r)) ?? [];
  r.audio_streams = streamsByItem.get(r.item_id) ?? [];
}
};

enrichWithStreamsAndReasons(review as EnrichableRow[]);
enrichWithStreamsAndReasons(queued as EnrichableRow[]);

return c.json({ review, reviewTotal, queued, processing, done, doneCount, jellyfinUrl });
return c.json({ reviewItemsTotal, queued, processing, done, doneCount, jellyfinUrl });
});

// ─── List ─────────────────────────────────────────────────────────────────────
@@ -570,6 +723,44 @@ app.post("/approve-all", (c) => {
  return c.json({ ok: true, count: pending.length });
});

// ─── Batch approve (by item id list) ─────────────────────────────────────────
// Used by the "approve up to here" affordance in the review column. The
// client knows the visible order (movies + series sort-key) and passes in
// the prefix of item ids it wants approved in one round-trip. Items that
// aren't pending (already approved / skipped / done) are silently ignored
// so the endpoint is idempotent against stale client state.
app.post("/approve-batch", async (c) => {
  const db = getDb();
  const body = await c.req.json<{ itemIds?: unknown }>().catch(() => ({ itemIds: undefined }));
  if (
    !Array.isArray(body.itemIds) ||
    !body.itemIds.every((v) => typeof v === "number" && Number.isInteger(v) && v > 0)
  ) {
    return c.json({ ok: false, error: "itemIds must be an array of positive integers" }, 400);
  }
  const ids = body.itemIds as number[];
  if (ids.length === 0) return c.json({ ok: true, count: 0 });

  const placeholders = ids.map(() => "?").join(",");
  const pending = db
    .prepare(
      `SELECT rp.*, mi.id as item_id FROM review_plans rp JOIN media_items mi ON mi.id = rp.item_id
       WHERE rp.status = 'pending' AND rp.is_noop = 0 AND mi.id IN (${placeholders})`,
    )
    .all(...ids) as (ReviewPlan & { item_id: number })[];

  let count = 0;
  for (const plan of pending) {
    db.prepare("UPDATE review_plans SET status = 'approved', reviewed_at = datetime('now') WHERE id = ?").run(plan.id);
    const { item, streams, decisions } = loadItemDetail(db, plan.item_id);
    if (item) {
      enqueueAudioJob(db, plan.item_id, buildCommand(item, streams, decisions));
      count++;
    }
  }
  return c.json({ ok: true, count });
});
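The `itemIds` guard above can be exercised in isolation. A minimal sketch — a standalone re-implementation for illustration, not the route itself:

```typescript
// Standalone sketch of the approve-batch input guard: accept only an
// array of positive integers, reject everything else (the route maps
// the rejection to a 400 response).
function validateItemIds(raw: unknown): { ok: true; ids: number[] } | { ok: false } {
  if (!Array.isArray(raw)) return { ok: false };
  if (!raw.every((v) => typeof v === "number" && Number.isInteger(v) && v > 0)) {
    return { ok: false };
  }
  return { ok: true, ids: raw as number[] };
}

console.log(validateItemIds([1, 2, 3]).ok);   // true
console.log(validateItemIds([1, -2]).ok);     // false — negatives rejected
console.log(validateItemIds("1,2,3").ok);     // false — strings rejected
console.log(validateItemIds([1.5]).ok);       // false — non-integers rejected
```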

// ─── Auto-approve high-confidence ────────────────────────────────────────────
// Approves every pending plan whose original language came from an authoritative
// source (radarr/sonarr). Anything with low confidence keeps needing a human.
@@ -709,6 +900,30 @@ app.post("/:id/retry", (c) => {
  return c.json({ ok: true });
});

// Reopen a completed or errored plan: flip it back to pending so the user
// can adjust decisions and re-approve. Used by the Done column's hover
// "Back to review" affordance. Unlike /unapprove (which rolls back an
// approved-but-not-yet-running plan), this handles the post-job states
// and drops the lingering job row so the pipeline doesn't show leftover
// history for an item that's about to be re-queued.
app.post("/:id/reopen", (c) => {
  const db = getDb();
  const id = parseId(c.req.param("id"));
  if (id == null) return c.json({ error: "invalid id" }, 400);
  const plan = db.prepare("SELECT * FROM review_plans WHERE item_id = ?").get(id) as ReviewPlan | undefined;
  if (!plan) return c.notFound();
  if (plan.status !== "done" && plan.status !== "error") {
    return c.json({ ok: false, error: "Can only reopen plans with status done or error" }, 409);
  }
  db.transaction(() => {
    // Leave plan.notes alone so the user keeps any ffmpeg error summary
    // from the prior run — useful context when revisiting decisions.
    db.prepare("UPDATE review_plans SET status = 'pending', reviewed_at = NULL WHERE id = ?").run(plan.id);
    db.prepare("DELETE FROM jobs WHERE item_id = ? AND status IN ('done', 'error')").run(id);
  })();
  return c.json({ ok: true });
});

app.post("/:id/unapprove", (c) => {
  const db = getDb();
  const id = parseId(c.req.param("id"));

@@ -10,6 +10,91 @@ import { loadLibrary as loadSonarrLibrary, isUsable as sonarrUsable } from "../s

const app = new Hono();

/**
 * Validate a scan `limit` input. Must be a positive integer or absent —
 * NaN/negatives/non-numerics would disable the progress cap
 * (`processed >= NaN` never trips) or produce bogus totals via
 * `Math.min(NaN, …)`. Exported for unit tests.
 */
export function parseScanLimit(raw: unknown): { ok: true; value: number | null } | { ok: false } {
  if (raw == null || raw === "") return { ok: true, value: null };
  const n = typeof raw === "number" ? raw : Number(raw);
  if (!Number.isInteger(n) || n <= 0) return { ok: false };
  return { ok: true, value: n };
}
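The doc comment explains the failure mode this guards against; a small self-contained demonstration (the function is re-stated here so the sketch runs standalone):

```typescript
// Re-implementation of parseScanLimit for illustration, plus the NaN
// failure mode it prevents: every comparison against NaN is false, so an
// unvalidated cap of NaN would never stop the scan loop.
function parseScanLimit(raw: unknown): { ok: true; value: number | null } | { ok: false } {
  if (raw == null || raw === "") return { ok: true, value: null };
  const n = typeof raw === "number" ? raw : Number(raw);
  if (!Number.isInteger(n) || n <= 0) return { ok: false };
  return { ok: true, value: n };
}

const processed = 500;
const unvalidated = Number("abc");      // NaN
console.log(processed >= unvalidated);  // false — the cap silently never trips

console.log(parseScanLimit("10"));      // { ok: true, value: 10 }
console.log(parseScanLimit("abc"));     // { ok: false }
console.log(parseScanLimit(""));        // { ok: true, value: null } — absent means "no cap"
```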

type ScanStatusFilter = "all" | "pending" | "scanned" | "error";
type ScanTypeFilter = "all" | "movie" | "episode";
type ScanSourceFilter = "all" | "scan" | "webhook";

export interface ScanItemsQuery {
  offset: number;
  limit: number;
  search: string;
  status: ScanStatusFilter;
  type: ScanTypeFilter;
  source: ScanSourceFilter;
}

function parsePositiveInt(raw: unknown, fallback: number): number {
  const n = typeof raw === "number" ? raw : Number(raw);
  if (!Number.isFinite(n)) return fallback;
  if (!Number.isInteger(n)) return fallback;
  return n;
}

function clamp(n: number, min: number, max: number): number {
  if (n < min) return min;
  if (n > max) return max;
  return n;
}

function parseOneOf<T extends readonly string[]>(raw: unknown, allowed: T, fallback: T[number]): T[number] {
  if (typeof raw !== "string") return fallback;
  const lowered = raw.toLowerCase();
  return (allowed as readonly string[]).includes(lowered) ? (lowered as T[number]) : fallback;
}
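`parseOneOf` gives every enum-ish query param one shared fallback path. A quick usage sketch (the function is copied here so it runs standalone):

```typescript
// Copy of parseOneOf for illustration: case-insensitive membership test
// against an allow-list, with a typed fallback for everything else.
function parseOneOf<T extends readonly string[]>(raw: unknown, allowed: T, fallback: T[number]): T[number] {
  if (typeof raw !== "string") return fallback;
  const lowered = raw.toLowerCase();
  return (allowed as readonly string[]).includes(lowered) ? (lowered as T[number]) : fallback;
}

const STATUSES = ["all", "pending", "scanned", "error"] as const;
console.log(parseOneOf("Pending", STATUSES, "all")); // "pending" — case-folded
console.log(parseOneOf("bogus", STATUSES, "all"));   // "all" — unknown values fall back
console.log(parseOneOf(42, STATUSES, "all"));        // "all" — non-strings fall back
```

The `readonly string[]` constraint plus the `as const` tuple means the return type is the literal union `"all" | "pending" | "scanned" | "error"`, not plain `string`.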

export function parseScanItemsQuery(raw: Record<string, unknown>): ScanItemsQuery {
  const limit = clamp(parsePositiveInt(raw.limit, 50), 1, 200);
  const offset = Math.max(0, parsePositiveInt(raw.offset, 0));
  const search = typeof raw.q === "string" ? raw.q.trim() : "";
  return {
    offset,
    limit,
    search,
    status: parseOneOf(raw.status, ["all", "pending", "scanned", "error"] as const, "all"),
    type: parseOneOf(raw.type, ["all", "movie", "episode"] as const, "all"),
    source: parseOneOf(raw.source, ["all", "scan", "webhook"] as const, "all"),
  };
}

export function buildScanItemsWhere(query: ScanItemsQuery): { sql: string; args: string[] } {
  const clauses: string[] = [];
  const args: string[] = [];
  if (query.status !== "all") {
    clauses.push("scan_status = ?");
    args.push(query.status);
  }
  if (query.type !== "all") {
    clauses.push("lower(type) = ?");
    args.push(query.type);
  }
  if (query.source !== "all") {
    clauses.push("ingest_source = ?");
    args.push(query.source);
  }
  if (query.search.length > 0) {
    clauses.push("(lower(name) LIKE ? OR lower(file_path) LIKE ?)");
    const needle = `%${query.search.toLowerCase()}%`;
    args.push(needle, needle);
  }
  return {
    sql: clauses.length > 0 ? `WHERE ${clauses.join(" AND ")}` : "",
    args,
  };
}
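Given these helpers, a filtered request composes into one parameterized WHERE clause. A trimmed sketch — only the status and search branches, for illustration:

```typescript
// Trimmed sketch of buildScanItemsWhere: each active filter contributes
// one clause plus its bound argument, so user input never reaches the
// SQL text itself.
interface Query { status: string; search: string }

function buildWhere(query: Query): { sql: string; args: string[] } {
  const clauses: string[] = [];
  const args: string[] = [];
  if (query.status !== "all") {
    clauses.push("scan_status = ?");
    args.push(query.status);
  }
  if (query.search.length > 0) {
    clauses.push("(lower(name) LIKE ? OR lower(file_path) LIKE ?)");
    const needle = `%${query.search.toLowerCase()}%`;
    args.push(needle, needle);
  }
  return { sql: clauses.length > 0 ? `WHERE ${clauses.join(" AND ")}` : "", args };
}

console.log(buildWhere({ status: "error", search: "Alien" }));
// sql:  "WHERE scan_status = ? AND (lower(name) LIKE ? OR lower(file_path) LIKE ?)"
// args: ["error", "%alien%", "%alien%"]
console.log(buildWhere({ status: "all", search: "" }).sql); // "" — no filters, no WHERE
```

The same `{ sql, args }` pair feeds both the page query and the COUNT query, so pagination totals always match the applied filters.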

// ─── State ────────────────────────────────────────────────────────────────────

let scanAbort: AbortController | null = null;
@@ -47,12 +132,84 @@ app.get("/", (c) => {
  const errors = (db.prepare("SELECT COUNT(*) as n FROM media_items WHERE scan_status = 'error'").get() as { n: number })
    .n;
  const recentItems = db
    .prepare("SELECT name, type, scan_status, file_path FROM media_items ORDER BY last_scanned_at DESC LIMIT 50")
    .all() as { name: string; type: string; scan_status: string; file_path: string }[];
    .prepare(
      "SELECT name, type, scan_status, file_path, last_scanned_at, ingest_source FROM media_items ORDER BY COALESCE(last_scanned_at, created_at) DESC, id DESC LIMIT 5",
    )
    .all() as {
      name: string;
      type: string;
      scan_status: string;
      file_path: string;
      last_scanned_at: string | null;
      ingest_source: string | null;
    }[];

  return c.json({ running, progress: { scanned, total, errors }, recentItems, scanLimit: currentScanLimit() });
});

app.get("/items", (c) => {
  const db = getDb();
  const query = parseScanItemsQuery({
    offset: c.req.query("offset"),
    limit: c.req.query("limit"),
    q: c.req.query("q"),
    status: c.req.query("status"),
    type: c.req.query("type"),
    source: c.req.query("source"),
  });
  const where = buildScanItemsWhere(query);
  const rows = db
    .prepare(
      `
      SELECT id, jellyfin_id, name, type, series_name, season_number, episode_number,
             scan_status, original_language, orig_lang_source, container, file_size, file_path,
             last_scanned_at, ingest_source
      FROM media_items
      ${where.sql}
      ORDER BY COALESCE(last_scanned_at, created_at) DESC, id DESC
      LIMIT ? OFFSET ?
      `,
    )
    .all(...where.args, query.limit, query.offset) as Array<{
      id: number;
      jellyfin_id: string;
      name: string;
      type: string;
      series_name: string | null;
      season_number: number | null;
      episode_number: number | null;
      scan_status: string;
      original_language: string | null;
      orig_lang_source: string | null;
      container: string | null;
      file_size: number | null;
      file_path: string;
      last_scanned_at: string | null;
      ingest_source: string | null;
      audio_codecs: string | null;
    }>;

  // Audio codecs per item, batched into one query for the current page.
  // A per-row scalar subquery over media_streams was O(page × streams)
  // and could block the event loop for minutes on large libraries.
  if (rows.length > 0) {
    const placeholders = rows.map(() => "?").join(",");
    const codecRows = db
      .prepare(
        `SELECT item_id, GROUP_CONCAT(DISTINCT LOWER(codec)) AS codecs
         FROM media_streams
         WHERE item_id IN (${placeholders}) AND type = 'Audio' AND codec IS NOT NULL
         GROUP BY item_id`,
      )
      .all(...rows.map((r) => r.id)) as { item_id: number; codecs: string | null }[];
    const byItem = new Map(codecRows.map((r) => [r.item_id, r.codecs]));
    for (const row of rows) row.audio_codecs = byItem.get(row.id) ?? null;
  }

  const total = (db.prepare(`SELECT COUNT(*) as n FROM media_items ${where.sql}`).get(...where.args) as { n: number }).n;
  return c.json({ rows, total, hasMore: query.offset + rows.length < total, query });
});

// ─── Start ────────────────────────────────────────────────────────────────────

app.post("/start", async (c) => {
@@ -63,10 +220,14 @@ app.post("/start", async (c) => {
    return c.json({ ok: false, error: "Scan already running" }, 409);
  }

  const body = await c.req.json<{ limit?: number }>().catch(() => ({ limit: undefined }));
  const formLimit = body.limit ?? null;
  const envLimit = process.env.SCAN_LIMIT ? Number(process.env.SCAN_LIMIT) : null;
  const limit = formLimit ?? envLimit ?? null;
  const body = await c.req.json<{ limit?: unknown }>().catch(() => ({ limit: undefined }));
  const formLimit = parseScanLimit(body.limit);
  const envLimit = parseScanLimit(process.env.SCAN_LIMIT);
  if (!formLimit.ok || !envLimit.ok) {
    db.prepare("UPDATE config SET value = '0' WHERE key = 'scan_running'").run();
    return c.json({ ok: false, error: "limit must be a positive integer" }, 400);
  }
  const limit: number | null = formLimit.value ?? envLimit.value ?? null;
  setConfig("scan_limit", limit != null ? String(limit) : "");

  runScan(limit).catch((err) => {
@@ -239,8 +400,11 @@ async function runScan(limit: number | null = null): Promise<void> {
        db
          .prepare("UPDATE media_items SET scan_status = 'error', scan_error = ? WHERE jellyfin_id = ?")
          .run(String(err), jellyfinItem.Id);
      } catch {
        /* ignore */
      } catch (dbErr) {
        // Failed to persist the error status — log it so the incident
        // doesn't disappear silently. We can't do much more; the outer
        // loop continues so the scan still finishes.
        logError(`Failed to record scan error for ${jellyfinItem.Id}:`, dbErr);
      }
      emitSse("log", { name: jellyfinItem.Name, type: jellyfinItem.Type, status: "error", file: jellyfinItem.Path });
    }

@@ -8,16 +8,37 @@ import { testConnection as testSonarr } from "../services/sonarr";

const app = new Hono();

// Config keys that hold credentials. `GET /` returns these as "***" when set,
// "" when unset. Real values only reach the client via the explicit
// GET /reveal?key=<key> endpoint (eye-icon toggle in the settings UI).
const SECRET_KEYS = new Set(["jellyfin_api_key", "radarr_api_key", "sonarr_api_key", "mqtt_password"]);

app.get("/", (c) => {
  const config = getAllConfig();
  for (const key of SECRET_KEYS) {
    if (config[key]) config[key] = "***";
  }
  const envLocked = Array.from(getEnvLockedKeys());
  return c.json({ config, envLocked });
});

app.get("/reveal", (c) => {
  const key = c.req.query("key") ?? "";
  if (!SECRET_KEYS.has(key)) return c.json({ error: "not a secret key" }, 400);
  return c.json({ value: getConfig(key) ?? "" });
});

// The UI sends "***" as a sentinel meaning "user didn't touch this field,
// keep the stored value". Save endpoints call this before writing a secret.
function resolveSecret(incoming: string | undefined, storedKey: string): string {
  if (incoming === "***") return getConfig(storedKey) ?? "";
  return incoming ?? "";
}
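The masking round-trip is easiest to see end-to-end. A minimal sketch with an in-memory map standing in for the config table (the store shape is an assumption, purely for illustration):

```typescript
// In-memory stand-in for the config table, to show the "***" sentinel
// round-trip: GET masks the stored secret, and a POST carrying "***"
// keeps the stored value instead of overwriting it with the mask.
const store = new Map<string, string>([["radarr_api_key", "real-secret"]]);

function resolveSecret(incoming: string | undefined, storedKey: string): string {
  if (incoming === "***") return store.get(storedKey) ?? "";
  return incoming ?? "";
}

// User saved the form without touching the masked field:
console.log(resolveSecret("***", "radarr_api_key"));      // "real-secret" — kept
// User pasted a new key:
console.log(resolveSecret("new-key", "radarr_api_key"));  // "new-key" — replaced
// Field cleared entirely:
console.log(resolveSecret(undefined, "radarr_api_key"));  // "" — treated as empty
```

Without this resolution step, a save of an untouched form would persist the literal string `"***"` as the API key.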

app.post("/jellyfin", async (c) => {
  const body = await c.req.json<{ url: string; api_key: string }>();
  const url = body.url?.replace(/\/$/, "");
  const apiKey = body.api_key;
  const apiKey = resolveSecret(body.api_key, "jellyfin_api_key");

  if (!url || !apiKey) return c.json({ ok: false, error: "URL and API key are required" }, 400);

@@ -26,12 +47,15 @@ app.post("/jellyfin", async (c) => {
  // { ok, saved, testError } shape to decide what message to show.
  setConfig("jellyfin_url", url);
  setConfig("jellyfin_api_key", apiKey);
  setConfig("setup_complete", "1");

  const result = await testJellyfin({ url, apiKey });

  // Best-effort admin discovery only when the connection works; ignore failures.
  // Only mark setup complete when the connection actually works. Setting
  // setup_complete=1 on a failing test would let the user click past the
  // wizard into an app that then dies on the first Jellyfin call.
  if (result.ok) {
    setConfig("setup_complete", "1");
    // Best-effort admin discovery only when the connection works; ignore failures.
    try {
      const users = await getUsers({ url, apiKey });
      const admin = users.find((u) => u.Name === "admin") ?? users[0];
@@ -51,7 +75,7 @@ app.post("/jellyfin", async (c) => {
app.post("/radarr", async (c) => {
  const body = await c.req.json<{ url?: string; api_key?: string }>();
  const url = body.url?.replace(/\/$/, "");
  const apiKey = body.api_key;
  const apiKey = resolveSecret(body.api_key, "radarr_api_key");

  if (!url || !apiKey) {
    setConfig("radarr_enabled", "0");
@@ -69,7 +93,7 @@ app.post("/radarr", async (c) => {
app.post("/sonarr", async (c) => {
  const body = await c.req.json<{ url?: string; api_key?: string }>();
  const url = body.url?.replace(/\/$/, "");
  const apiKey = body.api_key;
  const apiKey = resolveSecret(body.api_key, "sonarr_api_key");

  if (!url || !apiKey) {
    setConfig("sonarr_enabled", "0");
@@ -124,9 +148,10 @@ app.post("/mqtt", async (c) => {
  setConfig("mqtt_url", url);
  setConfig("mqtt_topic", topic || "jellyfin/events");
  setConfig("mqtt_username", username);
  // Only overwrite password when a non-empty value is sent, so the UI can
  // leave the field blank to indicate "keep the existing one".
  if (password) setConfig("mqtt_password", password);
  // Only overwrite password when a real value is sent. The UI leaves the
  // field blank or sends "***" (masked placeholder) when the user didn't
  // touch it — both mean "keep the existing one".
  if (password && password !== "***") setConfig("mqtt_password", password);

  // Reconnect with the new config. Best-effort; failures surface in status.
  startMqttClient().catch(() => {});

@@ -6,6 +6,7 @@ import { error as logError } from "../lib/log";
import { parseId } from "../lib/validate";
import { getItem, mapStream, normalizeLanguage, refreshItem } from "../services/jellyfin";
import type { MediaItem, MediaStream, ReviewPlan, StreamDecision, SubtitleFile } from "../types";
import { reanalyze, titleKey } from "./review";

const app = new Hono();

@@ -405,6 +406,30 @@ app.post("/:id/rescan", async (c) => {

  await refreshItem(jfCfg, item.jellyfin_id);

  // Snapshot custom_titles before the DELETE cascades stream_decisions away,
  // so reanalyze() can re-attach them to the corresponding new stream rows.
  // Without this, rescanning also wipes the per-audio-stream title
  // overrides the user made in the review UI.
  const preservedTitles = new Map<string, string>();
  const oldTitleRows = db
    .prepare(`
      SELECT ms.type, ms.language, ms.stream_index, ms.title, sd.custom_title
      FROM stream_decisions sd
      JOIN media_streams ms ON ms.id = sd.stream_id
      JOIN review_plans rp ON rp.id = sd.plan_id
      WHERE rp.item_id = ? AND sd.custom_title IS NOT NULL
    `)
    .all(id) as {
      type: string;
      language: string | null;
      stream_index: number;
      title: string | null;
      custom_title: string;
    }[];
  for (const r of oldTitleRows) {
    preservedTitles.set(titleKey(r), r.custom_title);
  }

  const fresh = await getItem(jfCfg, item.jellyfin_id);
  if (fresh) {
    const insertStream = db.prepare(`
@@ -412,6 +437,10 @@ app.post("/:id/rescan", async (c) => {
        title, is_default, is_forced, is_hearing_impaired, channels, channel_layout, bit_rate, sample_rate)
      VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    `);
    // DELETE cascades to stream_decisions via FK. reanalyze() below
    // rebuilds them from the fresh streams; without it the plan would
    // keep status='done'/'approved' but reference zero decisions, and
    // ffmpeg would emit a no-op command.
    db.prepare("DELETE FROM media_streams WHERE item_id = ?").run(id);
    for (const jStream of fresh.MediaStreams ?? []) {
      if (jStream.IsExternal) continue;
@@ -435,6 +464,8 @@ app.post("/:id/rescan", async (c) => {
    }
  }

  reanalyze(db, id, preservedTitles);

  const detail = loadDetail(db, id);
  if (!detail) return c.notFound();
  return c.json(detail);

@@ -54,10 +54,34 @@ export function getDb(): Database {
  if (_db) return _db;
  _db = new Database(dbPath, { create: true });
  _db.exec(SCHEMA);
  migrate(_db);
  seedDefaults(_db);
  return _db;
}

/**
 * Idempotent ALTER TABLE migrations for columns added after the initial
 * CREATE TABLE ships. Each block swallows "duplicate column" errors so the
 * same code path is safe on fresh and existing databases. Do not remove old
 * migrations — databases in the wild may be several versions behind.
 */
function migrate(db: Database): void {
  const alter = (sql: string) => {
    try {
      db.exec(sql);
    } catch (_err) {
      // column already present — ignore
    }
  };
  alter("ALTER TABLE review_plans ADD COLUMN webhook_verified INTEGER NOT NULL DEFAULT 0");
  // 2026-04-14: renamed webhook_verified → verified once we realized the
  // signal would come from our own ffprobe, not from a Jellyfin webhook.
  // RENAME COLUMN preserves values; both alters are no-ops on fresh DBs.
  alter("ALTER TABLE review_plans RENAME COLUMN webhook_verified TO verified");
  alter("ALTER TABLE review_plans DROP COLUMN verified");
  alter("ALTER TABLE media_items ADD COLUMN ingest_source TEXT NOT NULL DEFAULT 'scan'");
}
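The swallow-duplicate-column pattern can be simulated without a real database. A sketch using a fake `exec` that throws the way SQLite does when an ADD COLUMN is re-applied (the fake store is an assumption for illustration only):

```typescript
// Fake single-table column store whose exec() throws like SQLite's
// "duplicate column name" error when an ADD COLUMN is re-applied.
const columns = new Set<string>(["id", "status"]);

function exec(sql: string): void {
  const m = sql.match(/ADD COLUMN (\w+)/);
  if (!m) return;
  if (columns.has(m[1])) throw new Error(`duplicate column name: ${m[1]}`);
  columns.add(m[1]);
}

// Same shape as migrate()'s helper: try the ALTER, ignore "already applied".
const alter = (sql: string) => {
  try {
    exec(sql);
  } catch {
    // column already present — ignore
  }
};

alter("ALTER TABLE t ADD COLUMN ingest_source");
alter("ALTER TABLE t ADD COLUMN ingest_source"); // second run is a harmless no-op
console.log(columns.has("ingest_source")); // true
```

This is why the migration list is append-only: re-running every historical ALTER on startup costs nothing, and deleting one would strand databases that never received it.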

function seedDefaults(db: Database): void {
  const insert = db.prepare("INSERT OR IGNORE INTO config (key, value) VALUES (?, ?)");
  for (const [key, value] of Object.entries(DEFAULT_CONFIG)) {

@@ -31,12 +31,13 @@ CREATE TABLE IF NOT EXISTS media_items (
  tvdb_id TEXT,
  jellyfin_raw TEXT,
  external_raw TEXT,
  scan_status TEXT NOT NULL DEFAULT 'pending',
  scan_error TEXT,
  last_scanned_at TEXT,
  last_executed_at TEXT,
  created_at TEXT NOT NULL DEFAULT (datetime('now'))
);
  scan_status TEXT NOT NULL DEFAULT 'pending',
  scan_error TEXT,
  last_scanned_at TEXT,
  ingest_source TEXT NOT NULL DEFAULT 'scan',
  last_executed_at TEXT,
  created_at TEXT NOT NULL DEFAULT (datetime('now'))
);

CREATE TABLE IF NOT EXISTS media_streams (
  id INTEGER PRIMARY KEY AUTOINCREMENT,

@@ -15,8 +15,12 @@ describe("parseId", () => {
    expect(parseId(undefined)).toBe(null);
  });

  test("parses leading integer from mixed strings (parseInt semantics)", () => {
    expect(parseId("42abc")).toBe(42);
  test("rejects mixed alphanumeric strings (strict — route params must be wholly numeric)", () => {
    expect(parseId("42abc")).toBe(null);
    expect(parseId("abc42")).toBe(null);
    expect(parseId("42 ")).toBe(null);
    expect(parseId("+42")).toBe(null);
    expect(parseId("42.0")).toBe(null);
  });
});

@@ -1,8 +1,12 @@
import type { Context } from "hono";

/** Parse a route param as a positive integer id. Returns null if invalid. */
/**
 * Parse a route param as a positive integer id. Returns null if invalid.
 * Strict: rejects mixed strings like "42abc" that Number.parseInt would
 * accept — route params must be wholly numeric or the request is bad.
 */
export function parseId(raw: string | undefined): number | null {
  if (!raw) return null;
  if (!raw || !/^\d+$/.test(raw)) return null;
  const n = Number.parseInt(raw, 10);
  return Number.isFinite(n) && n > 0 ? n : null;
}
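The strict regex makes the accepted grammar easy to state: one or more ASCII digits, nothing else, and the parsed value must be positive. For illustration:

```typescript
// Strict route-param id parser, as above: wholly numeric or rejected.
function parseId(raw: string | undefined): number | null {
  if (!raw || !/^\d+$/.test(raw)) return null;
  const n = Number.parseInt(raw, 10);
  return Number.isFinite(n) && n > 0 ? n : null;
}

console.log(parseId("42"));      // 42
console.log(parseId("42abc"));   // null — parseInt alone would have said 42
console.log(parseId("+42"));     // null — sign characters are not digits
console.log(parseId("0"));       // null — ids start at 1
console.log(parseId(undefined)); // null
```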

@@ -182,6 +182,50 @@ describe("buildCommand", () => {
    expect(cmd).toContain("-metadata:s:a:2 language=und");
  });

  test("writes canonical 'ENG - CODEC · CHANNELS' title on every kept audio stream", () => {
    const streams = [
      stream({ id: 1, type: "Video", stream_index: 0 }),
      stream({
        id: 2,
        type: "Audio",
        stream_index: 1,
        codec: "ac3",
        channels: 6,
        language: "eng",
        title: "Audio Description",
      }),
      stream({ id: 3, type: "Audio", stream_index: 2, codec: "dts", channels: 1, language: "deu" }),
      stream({ id: 4, type: "Audio", stream_index: 3, codec: "aac", channels: 2, language: null }),
    ];
    const decisions = [
      decision({ stream_id: 1, action: "keep", target_index: 0 }),
      decision({ stream_id: 2, action: "keep", target_index: 0 }),
      decision({ stream_id: 3, action: "keep", target_index: 1 }),
      decision({ stream_id: 4, action: "keep", target_index: 2 }),
    ];
    const cmd = buildCommand(ITEM, streams, decisions);
    // Original "Audio Description" title is replaced with the harmonized form.
    expect(cmd).toContain("-metadata:s:a:0 title='ENG - AC3 · 5.1'");
    // Mono renders as 1.0 (not the legacy "mono" string).
    expect(cmd).toContain("-metadata:s:a:1 title='DEU - DTS · 1.0'");
    // Stereo renders as 2.0.
    expect(cmd).toContain("-metadata:s:a:2 title='AAC · 2.0'");
  });

  test("custom_title still overrides the auto-generated audio title", () => {
    const streams = [
      stream({ id: 1, type: "Video", stream_index: 0 }),
      stream({ id: 2, type: "Audio", stream_index: 1, codec: "ac3", channels: 6, language: "eng" }),
    ];
    const decisions = [
      decision({ stream_id: 1, action: "keep", target_index: 0 }),
      decision({ stream_id: 2, action: "keep", target_index: 0, custom_title: "Director's Cut" }),
    ];
    const cmd = buildCommand(ITEM, streams, decisions);
    expect(cmd).toContain("-metadata:s:a:0 title='Director'\\''s Cut'");
    expect(cmd).not.toContain("ENG - AC3");
  });

  test("sets first kept audio as default, clears others", () => {
    const streams = [
      stream({ id: 1, type: "Video", stream_index: 0 }),

@@ -43,41 +43,43 @@ describe("processWebhookEvent — acceptance", () => {
  beforeEach(() => _resetDedupe());
  afterEach(() => _resetDedupe());

  test("rejects unknown events", async () => {
  test("rejects playback-related NotificationTypes (the plugin publishes many, we only want ItemAdded)", async () => {
    const db = makeDb();
    const res = await processWebhookEvent(
      { event: "PlaybackStart", itemId: "jf-1", itemType: "Movie" },
      { db, jellyfin: JF, rescanCfg: RESCAN_CFG, getItemFn: async () => fakeItem() },
    );
    expect(res.accepted).toBe(false);
    expect(res.reason).toContain("event");
    for (const nt of ["PlaybackStart", "PlaybackProgress", "UserDataSaved", "ItemUpdated"]) {
      const res = await processWebhookEvent(
        { NotificationType: nt, ItemId: "jf-1", ItemType: "Movie" },
        { db, jellyfin: JF, rescanCfg: RESCAN_CFG, getItemFn: async () => fakeItem() },
      );
      expect(res.accepted).toBe(false);
      expect(res.reason).toContain("NotificationType");
    }
  });

  test("rejects non-Movie/Episode types", async () => {
    const db = makeDb();
    const res = await processWebhookEvent(
      { event: "ItemUpdated", itemId: "jf-1", itemType: "Trailer" },
      { NotificationType: "ItemAdded", ItemId: "jf-1", ItemType: "Trailer" },
      { db, jellyfin: JF, rescanCfg: RESCAN_CFG, getItemFn: async () => fakeItem({ Type: "Trailer" }) },
    );
    expect(res.accepted).toBe(false);
    expect(res.reason).toContain("itemType");
    expect(res.reason).toContain("ItemType");
  });

  test("rejects missing itemId", async () => {
  test("rejects missing ItemId", async () => {
    const db = makeDb();
    const res = await processWebhookEvent(
      { event: "ItemUpdated", itemType: "Movie" },
      { NotificationType: "ItemAdded", ItemType: "Movie" },
      { db, jellyfin: JF, rescanCfg: RESCAN_CFG, getItemFn: async () => fakeItem() },
    );
    expect(res.accepted).toBe(false);
    expect(res.reason).toContain("itemId");
    expect(res.reason).toContain("ItemId");
  });

  test("dedupes bursts within 5s and accepts again after", async () => {
    const db = makeDb();
    let fakeNow = 1_000_000;
    const getItemFn = async () => fakeItem();
    const payload = { event: "ItemUpdated", itemId: "jf-1", itemType: "Movie" };
    const payload = { NotificationType: "ItemAdded", ItemId: "jf-1", ItemType: "Movie" };

    const first = await processWebhookEvent(payload, {
      db,
@@ -113,7 +115,7 @@ describe("processWebhookEvent — acceptance", () => {
  test("drops when Jellyfin returns no item", async () => {
    const db = makeDb();
    const res = await processWebhookEvent(
      { event: "ItemUpdated", itemId: "jf-missing", itemType: "Movie" },
      { NotificationType: "ItemAdded", ItemId: "jf-missing", ItemType: "Movie" },
      { db, jellyfin: JF, rescanCfg: RESCAN_CFG, getItemFn: async () => null },
    );
    expect(res.accepted).toBe(false);
@@ -126,7 +128,7 @@ describe("processWebhookEvent — done-status override", () => {

async function runWebhook(db: Database, item: JellyfinItem, cfg: RescanConfig = RESCAN_CFG) {
  return processWebhookEvent(
    { event: "ItemUpdated", itemId: item.Id, itemType: item.Type as "Movie" | "Episode" },
    { NotificationType: "ItemAdded", ItemId: item.Id, ItemType: item.Type as "Movie" | "Episode" },
    { db, jellyfin: JF, rescanCfg: cfg, getItemFn: async () => item },
  );
}

@@ -1,5 +1,6 @@
import type { MediaItem, MediaStream, PlanResult } from "../types";
import { computeAppleCompat, isAppleCompatible, transcodeTarget } from "./apple-compat";
import { isExtractableSubtitle } from "./ffmpeg";
import { normalizeLanguage } from "./jellyfin";

export interface AnalyzerConfig {
@@ -92,6 +93,22 @@ export function analyzeItem(
    notes.push("Original language unknown — audio tracks not filtered; manual review required");
  }

  // Surface image-based subtitles that can't be written to a sane
  // single-file sidecar. They'll still be stripped from the container,
  // but won't land on disk anywhere — the user sees this in the plan
  // notes so nothing vanishes silently.
  const nonExtractable = streams.filter((s) => s.type === "Subtitle" && !isExtractableSubtitle(s.codec));
  if (nonExtractable.length > 0) {
    const grouped = new Map<string, string[]>();
    for (const s of nonExtractable) {
      const codec = (s.codec ?? "unknown").toLowerCase();
      if (!grouped.has(codec)) grouped.set(codec, []);
      grouped.get(codec)!.push(s.language ?? "und");
    }
    const summary = [...grouped.entries()].map(([codec, langs]) => `${codec} (${langs.join(", ")})`).join("; ");
    notes.push(`${nonExtractable.length} subtitle(s) dropped: ${summary} — not extractable to sidecar`);
  }

  return { is_noop, has_subs: hasSubs, confidence: "low", apple_compat, job_type, decisions, notes };
}
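The grouping logic above reduces to a small, testable unit. A standalone sketch of the codec → languages summary (types trimmed for illustration):

```typescript
// Standalone sketch of the dropped-subtitle summary: group languages by
// codec, then render "codec (lang, lang); codec (lang)".
interface Sub { codec: string | null; language: string | null }

function summarizeDropped(subs: Sub[]): string {
  const grouped = new Map<string, string[]>();
  for (const s of subs) {
    const codec = (s.codec ?? "unknown").toLowerCase();
    if (!grouped.has(codec)) grouped.set(codec, []);
    grouped.get(codec)!.push(s.language ?? "und");
  }
  return [...grouped.entries()].map(([codec, langs]) => `${codec} (${langs.join(", ")})`).join("; ");
}

const note = summarizeDropped([
  { codec: "dvd_subtitle", language: "eng" },
  { codec: "dvd_subtitle", language: "fra" },
  { codec: null, language: null },
]);
console.log(note); // "dvd_subtitle (eng, fra); unknown (und)"
```

Map preserves insertion order, so codecs appear in the order they were first seen in the stream list.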
|
||||
|
||||
|
||||
@@ -47,39 +47,44 @@ const ISO639_1: Record<string, string> = {
 	est: "et",
 };

-/** Subtitle codec → external file extension. */
-const SUBTITLE_EXT: Record<string, string> = {
-	subrip: "srt",
-	srt: "srt",
-	ass: "ass",
-	ssa: "ssa",
-	webvtt: "vtt",
-	vtt: "vtt",
-	hdmv_pgs_subtitle: "sup",
-	pgssub: "sup",
-	dvd_subtitle: "sub",
-	dvbsub: "sub",
-	mov_text: "srt",
-	text: "srt",
+/**
+ * Subtitle codecs we can reliably extract to a single-file sidecar. Mapped
+ * to {ext, codecArg} for the ffmpeg output. Anything NOT in this map is
+ * deliberately skipped — ffmpeg's srt/text muxers reject image-based
+ * codecs like dvd_subtitle/dvb_subtitle with "Unsupported subtitles
+ * codec", crashing the whole job. VobSub extraction would produce a
+ * .sub + .idx pair and complicate the predicted-files contract, so for
+ * now those are stripped from the container but not written out. A plan
+ * note records what was dropped (see analyzer.ts).
+ *
+ * Jellyfin returns short codec names (dvdsub, pgssub) while ffmpeg's own
+ * output uses the long form (dvd_subtitle, hdmv_pgs_subtitle). Both are
+ * accepted here to keep alias drift harmless.
+ */
+const EXTRACTABLE: Record<string, { ext: string; codecArg: string }> = {
+	subrip: { ext: "srt", codecArg: "copy" },
+	srt: { ext: "srt", codecArg: "copy" },
+	ass: { ext: "ass", codecArg: "copy" },
+	ssa: { ext: "ssa", codecArg: "copy" },
+	webvtt: { ext: "vtt", codecArg: "copy" },
+	vtt: { ext: "vtt", codecArg: "copy" },
+	mov_text: { ext: "srt", codecArg: "subrip" },
+	text: { ext: "srt", codecArg: "copy" },
+	hdmv_pgs_subtitle: { ext: "sup", codecArg: "copy" },
+	pgssub: { ext: "sup", codecArg: "copy" },
 };

+export function isExtractableSubtitle(codec: string | null): boolean {
+	if (!codec) return false;
+	return codec.toLowerCase() in EXTRACTABLE;
+}
+
 function subtitleLang2(lang: string | null): string {
 	if (!lang) return "und";
 	const n = normalizeLanguage(lang);
 	return ISO639_1[n] ?? n;
 }

-/** Returns the ffmpeg codec name to use when extracting this subtitle stream. */
-function subtitleCodecArg(codec: string | null): string {
-	if (!codec) return "copy";
-	return codec.toLowerCase() === "mov_text" ? "subrip" : "copy";
-}
-
-function subtitleExtForCodec(codec: string | null): string {
-	if (!codec) return "srt";
-	return SUBTITLE_EXT[codec.toLowerCase()] ?? "srt";
-}
-
 /**
  * Build ffmpeg output args for extracting ALL subtitle streams
  * to external sidecar files next to the video.
@@ -106,7 +111,12 @@ function computeExtractionEntries(allStreams: MediaStream[], basePath: string):
 		if (s.type === "Subtitle") subTypeIdx.set(s.id, subCount++);
 	}

-	const allSubs = allStreams.filter((s) => s.type === "Subtitle").sort((a, b) => a.stream_index - b.stream_index);
+	// Only extract codecs we can route to a sane single-file sidecar. Image
+	// formats like dvd_subtitle crash the job if we try — see EXTRACTABLE.
+	const allSubs = allStreams
+		.filter((s) => s.type === "Subtitle")
+		.filter((s) => isExtractableSubtitle(s.codec))
+		.sort((a, b) => a.stream_index - b.stream_index);

 	if (allSubs.length === 0) return [];

@@ -116,8 +126,9 @@ function computeExtractionEntries(allStreams: MediaStream[], basePath: string):
 	for (const s of allSubs) {
 		const typeIdx = subTypeIdx.get(s.id) ?? 0;
 		const langCode = subtitleLang2(s.language);
-		const ext = subtitleExtForCodec(s.codec);
-		const codecArg = subtitleCodecArg(s.codec);
+		const spec = EXTRACTABLE[(s.codec ?? "").toLowerCase()];
+		const ext = spec.ext;
+		const codecArg = spec.codecArg;

 		const nameParts = [langCode];
 		if (s.is_forced) nameParts.push("forced");
@@ -207,6 +218,20 @@ const LANG_NAMES: Record<string, string> = {
 	est: "Estonian",
 };

+/**
+ * Channel count → "N.M" layout string (5.1, 7.1, 2.0, 1.0).
+ * Falls back to "Nch" for anything outside the common consumer layouts.
+ */
+function formatChannels(n: number | null): string | null {
+	if (n == null) return null;
+	if (n === 1) return "1.0";
+	if (n === 2) return "2.0";
+	if (n === 6) return "5.1";
+	if (n === 7) return "6.1";
+	if (n === 8) return "7.1";
+	return `${n}ch`;
+}
+
 function trackTitle(stream: MediaStream): string | null {
 	if (stream.type === "Subtitle") {
 		// Subtitles always get a clean language-based title so Jellyfin displays
@@ -220,12 +245,21 @@ function trackTitle(stream: MediaStream): string | null {
 		if (stream.is_hearing_impaired) return `${base} (CC)`;
 		return base;
 	}
-	// For audio and other stream types: preserve any existing title
-	// (e.g. "Director's Commentary") and fall back to language name.
-	if (stream.title) return stream.title;
-	if (!stream.language) return null;
-	const lang = normalizeLanguage(stream.language);
-	return LANG_NAMES[lang] ?? lang.toUpperCase();
+	// Audio: harmonize to "ENG - AC3 · 5.1". Overrides whatever the file had
+	// (e.g. "Audio Description", "Director's Commentary") — the user uses
+	// the review UI to drop unwanted tracks before we get here, so by this
+	// point every kept audio track is a primary track that deserves a clean
+	// canonical label. If a user wants a different title, custom_title on
+	// the decision still wins (see buildStreamFlags).
+	const lang = stream.language ? normalizeLanguage(stream.language) : null;
+	const langPart = lang ? lang.toUpperCase() : null;
+	const codecPart = stream.codec ? stream.codec.toUpperCase() : null;
+	const channelsPart = formatChannels(stream.channels);
+	const tail = [codecPart, channelsPart].filter((v): v is string => !!v).join(" · ");
+	if (langPart && tail) return `${langPart} - ${tail}`;
+	if (langPart) return langPart;
+	if (tail) return tail;
+	return null;
 }

 const TYPE_SPEC: Record<string, string> = { Video: "v", Audio: "a", Subtitle: "s" };

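The EXTRACTABLE map above serves double duty: `codec in EXTRACTABLE` is the capability check, and the same entry later supplies `{ext, codecArg}` for the output args. A minimal sketch of the two uses together (map trimmed to three entries; `sidecarSpec` is an illustrative helper name, not the repo's):

```typescript
// Trimmed copy of the EXTRACTABLE shape from the hunk above.
const EXTRACTABLE: Record<string, { ext: string; codecArg: string }> = {
  subrip: { ext: "srt", codecArg: "copy" },
  mov_text: { ext: "srt", codecArg: "subrip" }, // mp4 timed text must be converted
  pgssub: { ext: "sup", codecArg: "copy" },
};

function isExtractableSubtitle(codec: string | null): boolean {
  if (!codec) return false;
  return codec.toLowerCase() in EXTRACTABLE;
}

// Because streams are filtered through isExtractableSubtitle first,
// the later lookup can safely assume the entry exists.
function sidecarSpec(codec: string): { ext: string; codecArg: string } {
  return EXTRACTABLE[codec.toLowerCase()];
}
```

Image-based codecs outside the map (`dvd_subtitle`, `dvbsub`) now fail the check instead of reaching ffmpeg and crashing the job.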
@@ -139,7 +139,18 @@ export async function getItem(cfg: JellyfinConfig, jellyfinId: string): Promise<
- * Trigger a Jellyfin metadata refresh for a single item and wait until it completes.
- * Polls DateLastRefreshed until it changes (or timeout is reached).
- */
-export async function refreshItem(cfg: JellyfinConfig, jellyfinId: string, timeoutMs = 15000): Promise<void> {
+/**
+ * Trigger a Jellyfin metadata refresh and poll until the item's
+ * `DateLastRefreshed` advances. Returns true when the probe actually ran;
+ * false on timeout (caller decides whether to trust the item's current
+ * metadata or treat it as unverified — verification paths should NOT
+ * proceed on false, since a stale snapshot would give a bogus verdict).
+ */
+export async function refreshItem(
+	cfg: JellyfinConfig,
+	jellyfinId: string,
+	timeoutMs = 15000,
+): Promise<{ refreshed: boolean }> {
 	const itemUrl = `${cfg.url}/Items/${jellyfinId}`;

 	// 1. Snapshot current DateLastRefreshed
@@ -164,9 +175,11 @@ export async function refreshItem(cfg: JellyfinConfig, jellyfinId: string, timeo
 		const checkRes = await fetch(itemUrl, { headers: headers(cfg.apiKey) });
 		if (!checkRes.ok) continue;
 		const check = (await checkRes.json()) as { DateLastRefreshed?: string };
-		if (check.DateLastRefreshed && check.DateLastRefreshed !== beforeDate) return;
+		if (check.DateLastRefreshed && check.DateLastRefreshed !== beforeDate) {
+			return { refreshed: true };
+		}
 	}
 	// Timeout reached — proceed anyway (refresh may still complete in background)
+	return { refreshed: false };
 }

 /** Case-insensitive hints that a track is a dub / commentary, not the original. */

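The new `refreshItem` contract — poll a snapshot, report `{ refreshed }` instead of silently proceeding on timeout — can be sketched as a generic helper (the name `pollUntilChanged` and its signature are illustrative, not the repo's):

```typescript
// Generic poll-until-changed helper mirroring the refreshItem contract:
// resolve { refreshed: true } as soon as probe() returns something that
// differs from the pre-refresh snapshot, { refreshed: false } once the
// deadline passes without an observed change.
async function pollUntilChanged(
  probe: () => Promise<string | undefined>,
  before: string | undefined,
  timeoutMs: number,
  intervalMs = 50,
): Promise<{ refreshed: boolean }> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const current = await probe();
    if (current && current !== before) return { refreshed: true };
    await new Promise((r) => setTimeout(r, intervalMs));
  }
  return { refreshed: false };
}
```

The caller decides what `false` means; as the hunk's doc comment says, a verification path should treat it as "snapshot is stale, don't render a verdict".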
@@ -133,11 +133,11 @@ export async function upsertJellyfinItem(
 			season_number, episode_number, year, file_path, file_size, container,
 			runtime_ticks, date_last_refreshed,
 			original_language, orig_lang_source, needs_review,
-			imdb_id, tmdb_id, tvdb_id,
-			jellyfin_raw, external_raw,
-			scan_status, last_scanned_at${opts.executed ? ", last_executed_at" : ""}
-		) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, 'scanned', datetime('now')${opts.executed ? ", datetime('now')" : ""})
-		ON CONFLICT(jellyfin_id) DO UPDATE SET
+			imdb_id, tmdb_id, tvdb_id,
+			jellyfin_raw, external_raw,
+			scan_status, last_scanned_at, ingest_source${opts.executed ? ", last_executed_at" : ""}
+		) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, 'scanned', datetime('now'), ?${opts.executed ? ", datetime('now')" : ""})
+		ON CONFLICT(jellyfin_id) DO UPDATE SET
 			type = excluded.type, name = excluded.name, original_title = excluded.original_title,
 			series_name = excluded.series_name, series_jellyfin_id = excluded.series_jellyfin_id,
 			season_number = excluded.season_number, episode_number = excluded.episode_number,
@@ -145,12 +145,13 @@ export async function upsertJellyfinItem(
 			file_size = excluded.file_size, container = excluded.container,
 			runtime_ticks = excluded.runtime_ticks, date_last_refreshed = excluded.date_last_refreshed,
 			original_language = excluded.original_language, orig_lang_source = excluded.orig_lang_source,
-			needs_review = excluded.needs_review, imdb_id = excluded.imdb_id,
-			tmdb_id = excluded.tmdb_id, tvdb_id = excluded.tvdb_id,
-			jellyfin_raw = excluded.jellyfin_raw, external_raw = excluded.external_raw,
-			scan_status = 'scanned', last_scanned_at = datetime('now')
-			${opts.executed ? ", last_executed_at = datetime('now')" : ""}
-	`);
+			needs_review = excluded.needs_review, imdb_id = excluded.imdb_id,
+			tmdb_id = excluded.tmdb_id, tvdb_id = excluded.tvdb_id,
+			jellyfin_raw = excluded.jellyfin_raw, external_raw = excluded.external_raw,
+			scan_status = 'scanned', last_scanned_at = datetime('now'),
+			ingest_source = excluded.ingest_source
+			${opts.executed ? ", last_executed_at = datetime('now')" : ""}
+	`);
 	upsertItem.run(
 		jellyfinItem.Id,
 		jellyfinItem.Type === "Episode" ? "Episode" : "Movie",
@@ -174,6 +175,7 @@ export async function upsertJellyfinItem(
 		tvdbId,
 		jellyfinRaw,
 		externalRawJson,
+		source,
 	);

 	const itemRow = db.prepare("SELECT id FROM media_items WHERE jellyfin_id = ?").get(jellyfinItem.Id) as {
@@ -254,7 +256,7 @@ export async function upsertJellyfinItem(
 		analysis.apple_compat,
 		analysis.job_type,
 		analysis.notes.length > 0 ? analysis.notes.join("\n") : null,
-		source,
+		source, // for the CASE WHEN ? = 'webhook' branch
 	);

 	const planRow = db.prepare("SELECT id FROM review_plans WHERE item_id = ?").get(itemId) as { id: number };

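The hunk above threads `opts.executed` through three template slots — the column list, the VALUES list, and the UPDATE clause — which must stay in sync or the placeholder count breaks. A reduced sketch of that assembly pattern (column set trimmed; this is not the repo's full statement):

```typescript
// Conditional column/value assembly, mirroring the ${opts.executed ? ... : ""}
// slots in the upsert. All three slots toggle together.
function buildUpsertSql(opts: { executed: boolean }): string {
  return `INSERT INTO media_items (
    jellyfin_id, scan_status, last_scanned_at, ingest_source${opts.executed ? ", last_executed_at" : ""}
  ) VALUES (?, 'scanned', datetime('now'), ?${opts.executed ? ", datetime('now')" : ""})
  ON CONFLICT(jellyfin_id) DO UPDATE SET
    scan_status = 'scanned', last_scanned_at = datetime('now'), ingest_source = excluded.ingest_source
    ${opts.executed ? ", last_executed_at = datetime('now')" : ""}`;
}
```

Both branches keep the `?` placeholder count identical (two), so the same `upsertItem.run(...)` argument list works either way.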
@@ -7,14 +7,19 @@ import { type RescanConfig, type RescanResult, upsertJellyfinItem } from "./resc
 import { loadLibrary as loadSonarrLibrary, isUsable as sonarrUsable } from "./sonarr";

 /**
- * Events we care about. Jellyfin's Webhook plugin emits many event types;
- * Library.ItemAdded and Library.ItemUpdated are the only ones that signal
- * an on-disk file mutation. We ignore user-data changes, playback, etc.
+ * Events we care about. Jellyfin's Webhook plugin (jellyfin-plugin-webhook)
+ * only exposes ItemAdded as a library-side notification — there is no
+ * ItemUpdated or Library.ItemUpdated. File-rewrites on existing items
+ * produce zero MQTT traffic, so we can't observe them here; the UI's
+ * post-job verification runs off our own ffprobe instead.
+ *
+ * Payload fields are PascalCase (NotificationType, ItemId, ItemType) — the
+ * earlier camelCase in this handler matched nothing the plugin ever sends.
 */
-const ACCEPTED_EVENTS = new Set(["ItemAdded", "ItemUpdated", "Library.ItemAdded", "Library.ItemUpdated"]);
+const ACCEPTED_EVENTS = new Set(["ItemAdded"]);
 const ACCEPTED_TYPES = new Set(["Movie", "Episode"]);

-/** 5-second dedupe window: Jellyfin fires ItemUpdated multiple times per rescan. */
+/** 5-second dedupe window: Jellyfin can fire the same ItemAdded twice when multiple libraries share a path. */
 const DEDUPE_WINDOW_MS = 5000;
 const dedupe = new Map<string, number>();

@@ -29,9 +34,9 @@ function parseLanguageList(raw: string | null | undefined, fallback: string[]):
 }

 export interface WebhookPayload {
-	event?: string;
-	itemId?: string;
-	itemType?: string;
+	NotificationType?: string;
+	ItemId?: string;
+	ItemType?: string;
 }

 export interface WebhookHandlerDeps {
@@ -59,14 +64,14 @@ export interface WebhookResult {
 export async function processWebhookEvent(payload: WebhookPayload, deps: WebhookHandlerDeps): Promise<WebhookResult> {
 	const { db, jellyfin, rescanCfg, getItemFn = getItem, now = Date.now } = deps;

-	if (!payload.event || !ACCEPTED_EVENTS.has(payload.event)) {
-		return { accepted: false, reason: `event '${payload.event}' not accepted` };
+	if (!payload.NotificationType || !ACCEPTED_EVENTS.has(payload.NotificationType)) {
+		return { accepted: false, reason: `NotificationType '${payload.NotificationType}' not accepted` };
 	}
-	if (!payload.itemType || !ACCEPTED_TYPES.has(payload.itemType)) {
-		return { accepted: false, reason: `itemType '${payload.itemType}' not accepted` };
+	if (!payload.ItemType || !ACCEPTED_TYPES.has(payload.ItemType)) {
+		return { accepted: false, reason: `ItemType '${payload.ItemType}' not accepted` };
 	}
-	if (!payload.itemId) {
-		return { accepted: false, reason: "missing itemId" };
+	if (!payload.ItemId) {
+		return { accepted: false, reason: "missing ItemId" };
 	}

 	// Debounce: drop bursts within the window, always evict stale entries.
@@ -74,20 +79,20 @@ export async function processWebhookEvent(payload: WebhookPayload, deps: Webhook
 	for (const [id, seen] of dedupe) {
 		if (ts - seen > DEDUPE_WINDOW_MS) dedupe.delete(id);
 	}
-	const last = dedupe.get(payload.itemId);
+	const last = dedupe.get(payload.ItemId);
 	if (last != null && ts - last <= DEDUPE_WINDOW_MS) {
 		return { accepted: false, reason: "deduped" };
 	}
-	dedupe.set(payload.itemId, ts);
+	dedupe.set(payload.ItemId, ts);

-	const fresh = await getItemFn(jellyfin, payload.itemId);
+	const fresh = await getItemFn(jellyfin, payload.ItemId);
 	if (!fresh) {
-		warn(`Webhook: Jellyfin returned no item for ${payload.itemId}`);
+		warn(`Webhook: Jellyfin returned no item for ${payload.ItemId}`);
 		return { accepted: false, reason: "jellyfin returned no item" };
 	}

 	const result = await upsertJellyfinItem(db, fresh, rescanCfg, { source: "webhook" });
-	log(`Webhook: reanalyzed ${payload.itemType} ${payload.itemId} is_noop=${result.isNoop}`);
+	log(`Webhook: ingested ${payload.ItemType} ${payload.ItemId} is_noop=${result.isNoop}`);
 	return { accepted: true, result };
 }

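The PascalCase validation and the dedupe window above can be exercised without MQTT or Jellyfin. A self-contained sketch of just the accept/reject gate (the Jellyfin fetch and DB upsert are omitted; `gate` is an illustrative name, not the repo's):

```typescript
interface Payload {
  NotificationType?: string;
  ItemId?: string;
  ItemType?: string;
}

const ACCEPTED_EVENTS = new Set(["ItemAdded"]);
const ACCEPTED_TYPES = new Set(["Movie", "Episode"]);
const DEDUPE_WINDOW_MS = 5000;
const dedupe = new Map<string, number>();

// Mirrors the accept/reject + dedupe gate from processWebhookEvent.
// `now` is injected (like the handler's now = Date.now dep) so the
// window is testable without real time passing.
function gate(payload: Payload, now: number): { accepted: boolean; reason?: string } {
  if (!payload.NotificationType || !ACCEPTED_EVENTS.has(payload.NotificationType))
    return { accepted: false, reason: "event not accepted" };
  if (!payload.ItemType || !ACCEPTED_TYPES.has(payload.ItemType))
    return { accepted: false, reason: "type not accepted" };
  if (!payload.ItemId) return { accepted: false, reason: "missing ItemId" };
  // Evict stale entries, then check the burst window.
  for (const [id, seen] of dedupe) if (now - seen > DEDUPE_WINDOW_MS) dedupe.delete(id);
  const last = dedupe.get(payload.ItemId);
  if (last != null && now - last <= DEDUPE_WINDOW_MS) return { accepted: false, reason: "deduped" };
  dedupe.set(payload.ItemId, now);
  return { accepted: true };
}
```

A camelCase payload (`event`, `itemId`) now fails the first check, which is exactly the bug the hunk's comment describes.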
@@ -1,334 +0,0 @@
-import { Link, useNavigate, useSearch } from "@tanstack/react-router";
-import { Fragment, useCallback, useEffect, useRef, useState } from "react";
-import { Badge } from "~/shared/components/ui/badge";
-import { Button } from "~/shared/components/ui/button";
-import { FilterTabs } from "~/shared/components/ui/filter-tabs";
-import { api } from "~/shared/lib/api";
-import type { Job, MediaItem } from "~/shared/lib/types";
-
-interface JobEntry {
-	job: Job;
-	item: MediaItem | null;
-}
-interface ExecuteData {
-	jobs: JobEntry[];
-	filter: string;
-	totalCounts: Record<string, number>;
-}
-
-const FILTER_TABS = [
-	{ key: "all", label: "All" },
-	{ key: "pending", label: "Pending" },
-	{ key: "running", label: "Running" },
-	{ key: "done", label: "Done" },
-	{ key: "error", label: "Error" },
-];
-
-function itemName(job: Job, item: MediaItem | null): string {
-	if (!item) return `Item #${job.item_id}`;
-	if (item.type === "Episode" && item.series_name) {
-		return `${item.series_name} S${String(item.season_number ?? 0).padStart(2, "0")}E${String(item.episode_number ?? 0).padStart(2, "0")}`;
-	}
-	return item.name;
-}
-
-function jobTypeLabel(job: Job): string {
-	return job.job_type === "transcode" ? "Audio Transcode" : "Audio Remux";
-}
-
-// Module-level cache for instant tab switching
-const cache = new Map<string, ExecuteData>();
-
-export function ExecutePage() {
-	const { filter } = useSearch({ from: "/execute" });
-	const navigate = useNavigate();
-	const [data, setData] = useState<ExecuteData | null>(cache.get(filter) ?? null);
-	const [loading, setLoading] = useState(!cache.has(filter));
-	const [logs, setLogs] = useState<Map<number, string>>(new Map());
-	const [logVisible, setLogVisible] = useState<Set<number>>(new Set());
-	const [cmdVisible, setCmdVisible] = useState<Set<number>>(new Set());
-	const esRef = useRef<EventSource | null>(null);
-	const reloadTimerRef = useRef<ReturnType<typeof setTimeout> | null>(null);
-
-	const load = useCallback(
-		(f?: string) => {
-			const key = f ?? filter;
-			const cached = cache.get(key);
-			if (cached && key === filter) {
-				setData(cached);
-				setLoading(false);
-			} else if (key === filter) {
-				setLoading(true);
-			}
-			api
-				.get<ExecuteData>(`/api/execute?filter=${key}`)
-				.then((d) => {
-					cache.set(key, d);
-					if (key === filter) {
-						setData(d);
-						setLoading(false);
-					}
-				})
-				.catch(() => {
-					if (key === filter) setLoading(false);
-				});
-		},
-		[filter],
-	);
-
-	useEffect(() => {
-		load();
-	}, [load]);
-
-	// SSE for live job updates
-	useEffect(() => {
-		const es = new EventSource("/api/execute/events");
-		esRef.current = es;
-		es.addEventListener("job_update", (e) => {
-			const d = JSON.parse(e.data) as { id: number; status: string; output?: string };
-
-			// Update job in current list if present
-			setData((prev) => {
-				if (!prev) return prev;
-				const jobIdx = prev.jobs.findIndex((j) => j.job.id === d.id);
-				if (jobIdx === -1) return prev;
-
-				const oldStatus = prev.jobs[jobIdx].job.status;
-				const newStatus = d.status as Job["status"];
-
-				// Live-update totalCounts
-				const newCounts = { ...prev.totalCounts };
-				if (oldStatus !== newStatus) {
-					if (newCounts[oldStatus] != null) newCounts[oldStatus]--;
-					if (newCounts[newStatus] != null) newCounts[newStatus]++;
-					else newCounts[newStatus] = 1;
-				}
-
-				return {
-					...prev,
-					totalCounts: newCounts,
-					jobs: prev.jobs.map((j) => (j.job.id === d.id ? { ...j, job: { ...j.job, status: newStatus } } : j)),
-				};
-			});
-
-			if (d.output !== undefined) {
-				setLogs((prev) => {
-					const m = new Map(prev);
-					m.set(d.id, d.output!);
-					return m;
-				});
-			}
-
-			// Debounced reload on terminal state for accurate list
-			if (d.status === "done" || d.status === "error") {
-				if (reloadTimerRef.current) clearTimeout(reloadTimerRef.current);
-				reloadTimerRef.current = setTimeout(() => {
-					// Invalidate cache and reload current filter
-					cache.clear();
-					load();
-				}, 1000);
-			}
-		});
-		return () => {
-			es.close();
-			if (reloadTimerRef.current) clearTimeout(reloadTimerRef.current);
-		};
-	}, [load]);
-
-	const startAll = async () => {
-		await api.post("/api/execute/start");
-		cache.clear();
-		load();
-	};
-	const clearQueue = async () => {
-		await api.post("/api/execute/clear");
-		cache.clear();
-		load();
-	};
-	const clearCompleted = async () => {
-		await api.post("/api/execute/clear-completed");
-		cache.clear();
-		load();
-	};
-	const runJob = async (id: number) => {
-		await api.post(`/api/execute/job/${id}/run`);
-		cache.clear();
-		load();
-	};
-	const cancelJob = async (id: number) => {
-		await api.post(`/api/execute/job/${id}/cancel`);
-		cache.clear();
-		load();
-	};
-
-	const toggleLog = (id: number) =>
-		setLogVisible((prev) => {
-			const s = new Set(prev);
-			s.has(id) ? s.delete(id) : s.add(id);
-			return s;
-		});
-	const toggleCmd = (id: number) =>
-		setCmdVisible((prev) => {
-			const s = new Set(prev);
-			s.has(id) ? s.delete(id) : s.add(id);
-			return s;
-		});
-
-	const totalCounts = data?.totalCounts ?? { all: 0, pending: 0, running: 0, done: 0, error: 0 };
-	const pending = totalCounts.pending ?? 0;
-	const done = totalCounts.done ?? 0;
-	const errors = totalCounts.error ?? 0;
-	const jobs = data?.jobs ?? [];
-
-	const running = totalCounts.running ?? 0;
-	const allDone = totalCounts.all > 0 && pending === 0 && running === 0;
-
-	return (
-		<div>
-			<h1 className="text-xl font-bold mb-4">Execute Jobs</h1>
-
-			<div className="border border-gray-200 rounded-lg px-4 py-3 mb-6 flex items-center gap-3 flex-wrap">
-				{totalCounts.all === 0 && !loading && <span className="text-sm text-gray-500">No jobs yet.</span>}
-				{totalCounts.all === 0 && loading && <span className="text-sm text-gray-400">Loading...</span>}
-				{allDone && <span className="text-sm font-medium">All jobs completed</span>}
-				{running > 0 && (
-					<span className="text-sm font-medium">
-						{running} job{running !== 1 ? "s" : ""} running
-					</span>
-				)}
-				{pending > 0 && (
-					<>
-						<span className="text-sm font-medium">
-							{pending} job{pending !== 1 ? "s" : ""} pending
-						</span>
-						<Button size="sm" onClick={startAll}>
-							Run all pending
-						</Button>
-						<Button size="sm" variant="secondary" onClick={clearQueue}>
-							Clear queue
-						</Button>
-					</>
-				)}
-				{(done > 0 || errors > 0) && (
-					<Button size="sm" variant="secondary" onClick={clearCompleted}>
-						Clear done/errors
-					</Button>
-				)}
-			</div>
-
-			<FilterTabs
-				tabs={FILTER_TABS}
-				filter={filter}
-				totalCounts={totalCounts}
-				onFilterChange={(key) => navigate({ to: "/execute", search: { filter: key } as never })}
-			/>
-
-			{loading && !data && <div className="text-gray-400 py-8 text-center">Loading…</div>}
-
-			{jobs.length > 0 && (
-				<div className="overflow-x-auto -mx-3 px-3 sm:mx-0 sm:px-0">
-					<table className="w-full border-collapse text-[0.82rem]">
-						<thead>
-							<tr>
-								{["#", "Item", "Type", "Status", "Actions"].map((h) => (
-									<th
-										key={h}
-										className="text-left text-[0.68rem] font-bold uppercase tracking-[0.06em] text-gray-500 py-1 px-2 border-b-2 border-gray-200 whitespace-nowrap"
-									>
-										{h}
-									</th>
-								))}
-							</tr>
-						</thead>
-						<tbody>
-							{jobs.map(({ job, item }: JobEntry) => {
-								const name = itemName(job, item);
-								const jobLog = logs.get(job.id) ?? job.output ?? "";
-								const showLog = logVisible.has(job.id) || job.status === "running" || job.status === "error";
-								const showCmd = cmdVisible.has(job.id);
-
-								return (
-									<Fragment key={job.id}>
-										<tr key={job.id} className="hover:bg-gray-50">
-											<td className="py-1.5 px-2 border-b border-gray-100 font-mono text-xs">{job.id}</td>
-											<td className="py-1.5 px-2 border-b border-gray-100">
-												<div className="truncate max-w-[300px]" title={name}>
-													{item ? (
-														<Link
-															to="/review/audio/$id"
-															params={{ id: String(item.id) }}
-															className="text-inherit no-underline hover:underline"
-														>
-															{name}
-														</Link>
-													) : (
-														name
-													)}
-												</div>
-											</td>
-											<td className="py-1.5 px-2 border-b border-gray-100 whitespace-nowrap">
-												<Badge variant={job.job_type === "transcode" ? "manual" : "noop"}>{jobTypeLabel(job)}</Badge>
-											</td>
-											<td className="py-1.5 px-2 border-b border-gray-100">
-												<Badge variant={job.status}>{job.status}</Badge>
-												{job.exit_code != null && job.exit_code !== 0 && (
-													<Badge variant="error" className="ml-1">
-														exit {job.exit_code}
-													</Badge>
-												)}
-											</td>
-											<td className="py-1.5 px-2 border-b border-gray-100 whitespace-nowrap">
-												<div className="flex gap-1 items-center">
-													{job.status === "pending" && (
-														<>
-															<Button size="sm" onClick={() => runJob(job.id)}>
-																▶ Run
-															</Button>
-															<Button size="sm" variant="secondary" onClick={() => cancelJob(job.id)}>
-																✕
-															</Button>
-														</>
-													)}
-													<Button size="sm" variant="secondary" onClick={() => toggleCmd(job.id)}>
-														Cmd
-													</Button>
-													{(job.status === "done" || job.status === "error") && jobLog && (
-														<Button size="sm" variant="secondary" onClick={() => toggleLog(job.id)}>
-															Log
-														</Button>
-													)}
-												</div>
-											</td>
-										</tr>
-										{showCmd && (
-											<tr key={`cmd-${job.id}`}>
-												<td colSpan={5} className="p-0 border-b border-gray-100">
-													<div className="font-mono text-[0.74rem] bg-gray-50 text-gray-700 px-3.5 py-2.5 rounded max-h-[120px] overflow-y-auto whitespace-pre-wrap break-all">
-														{job.command}
-													</div>
-												</td>
-											</tr>
-										)}
-										{jobLog && showLog && (
-											<tr key={`log-${job.id}`}>
-												<td colSpan={5} className="p-0 border-b border-gray-100">
-													<div className="font-mono text-[0.74rem] bg-[#1a1a1a] text-[#d4d4d4] px-3.5 py-2.5 rounded max-h-[260px] overflow-y-auto whitespace-pre-wrap break-all">
-														{jobLog}
-													</div>
-												</td>
-											</tr>
-										)}
-									</Fragment>
-								);
-							})}
-						</tbody>
-					</table>
-				</div>
-			)}
-
-			{!loading && jobs.length === 0 && totalCounts.all > 0 && (
-				<p className="text-gray-500 text-center py-4">No jobs match this filter.</p>
-			)}
-		</div>
-	);
-}

@@ -1,3 +1,4 @@
+import { Link } from "@tanstack/react-router";
 import { Badge } from "~/shared/components/ui/badge";
 import { api } from "~/shared/lib/api";
 import type { PipelineJobItem } from "~/shared/lib/types";
@@ -14,16 +15,36 @@ export function DoneColumn({ items, onMutate }: DoneColumnProps) {
 		onMutate();
 	};

+	const reopen = async (itemId: number) => {
+		await api.post(`/api/review/${itemId}/reopen`);
+		onMutate();
+	};
+
+	const actions = items.length > 0 ? [{ label: "Clear", onClick: clear }] : undefined;
+
 	return (
-		<ColumnShell
-			title="Done"
-			count={items.length}
-			actions={items.length > 0 ? [{ label: "Clear", onClick: clear }] : undefined}
-		>
+		<ColumnShell title="Done" count={items.length} actions={actions}>
 			{items.map((item) => (
-				<div key={item.id} className="rounded border bg-white p-2">
-					<p className="text-xs font-medium truncate">{item.name}</p>
-					<Badge variant={item.status === "done" ? "done" : "error"}>{item.status}</Badge>
+				<div key={item.id} className="group rounded border bg-white p-2">
+					<Link
+						to="/review/audio/$id"
+						params={{ id: String(item.item_id) }}
+						className="text-xs font-medium truncate block hover:text-blue-600 hover:underline"
+					>
+						{item.name}
+					</Link>
+					<div className="flex items-center gap-1.5 mt-0.5">
+						<Badge variant={item.status === "done" ? "done" : "error"}>{item.status}</Badge>
+						<div className="flex-1" />
+						<button
+							type="button"
+							onClick={() => reopen(item.item_id)}
+							title="Send this item back to the Review column to redecide and re-queue"
+							className="text-[0.68rem] px-1.5 py-0.5 rounded border border-gray-300 bg-white text-gray-700 hover:bg-gray-100 opacity-0 group-hover:opacity-100 transition-opacity shrink-0"
+						>
+							← Back to review
+						</button>
+					</div>
 				</div>
 			))}
 			{items.length === 0 && <p className="text-sm text-gray-400 text-center py-8">No completed items</p>}

@@ -32,22 +32,40 @@ interface PipelineCardProps {
 	// (no onToggleStream) and the primary button un-approves the plan,
 	// sending the item back to the Review column.
 	onUnapprove?: () => void;
+	// Review-column affordance: approve this card AND every card visually
+	// above it in one round-trip. Only set for the top-level review list;
+	// expanded series episodes don't get this (the series' "Approve all"
+	// covers the prior-episodes-in-series case).
+	onApproveUpToHere?: () => void;
 }

+function formatChannels(n: number | null | undefined): string | null {
+	if (n == null) return null;
+	if (n === 1) return "1.0";
+	if (n === 2) return "2.0";
+	if (n === 6) return "5.1";
+	if (n === 7) return "6.1";
+	if (n === 8) return "7.1";
+	return `${n}ch`;
+}
+
 function describeStream(s: PipelineAudioStream): string {
 	const parts: string[] = [];
 	if (s.codec) parts.push(s.codec.toUpperCase());
-	if (s.channels != null) {
-		if (s.channels === 6) parts.push("5.1");
-		else if (s.channels === 8) parts.push("7.1");
-		else if (s.channels === 2) parts.push("stereo");
-		else if (s.channels === 1) parts.push("mono");
-		else parts.push(`${s.channels}ch`);
-	}
+	const ch = formatChannels(s.channels);
+	if (ch) parts.push(ch);
 	return parts.join(" · ");
 }

-export function PipelineCard({ item, jellyfinUrl, onToggleStream, onApprove, onSkip, onUnapprove }: PipelineCardProps) {
+export function PipelineCard({
+	item,
+	jellyfinUrl,
+	onToggleStream,
+	onApprove,
+	onSkip,
+	onUnapprove,
+	onApproveUpToHere,
+}: PipelineCardProps) {
 	const title =
 		item.type === "Episode"
 			? `S${String(item.season_number).padStart(2, "0")}E${String(item.episode_number).padStart(2, "0")} — ${item.name}`
@@ -63,22 +81,30 @@ export function PipelineCard({ item, jellyfinUrl, onToggleStream, onApprove, onS
 	const mediaItemId: number = item.item_id ?? (item as { id: number }).id;

 	return (
-		<div className={`rounded-lg border p-3 ${confidenceColor}`}>
+		<div className={`group rounded-lg border p-3 ${confidenceColor}`}>
 			<div className="flex items-start justify-between gap-2">
 				<div className="min-w-0">
-					{jellyfinLink ? (
-						<a
-							href={jellyfinLink}
-							target="_blank"
-							rel="noopener noreferrer"
+					<div className="flex items-center gap-1 min-w-0">
+						<Link
+							to="/review/audio/$id"
+							params={{ id: String(mediaItemId) }}
 							className="text-sm font-medium truncate block hover:text-blue-600 hover:underline"
 							onClick={(e) => e.stopPropagation()}
 						>
 							{title}
-						</a>
-					) : (
-						<p className="text-sm font-medium truncate">{title}</p>
-					)}
+						</Link>
+						{jellyfinLink && (
+							<a
+								href={jellyfinLink}
+								target="_blank"
+								rel="noopener noreferrer"
+								title="Open in Jellyfin"
+								onClick={(e) => e.stopPropagation()}
+								className="text-xs text-gray-400 hover:text-blue-600 shrink-0"
+							>
+								↗
+							</a>
+						)}
+					</div>
 				</div>
 				<div className="flex items-center gap-1.5 mt-1 flex-wrap">
 					{item.transcode_reasons && item.transcode_reasons.length > 0
 						? item.transcode_reasons.map((r) => (
@@ -94,30 +120,38 @@ export function PipelineCard({ item, jellyfinUrl, onToggleStream, onApprove, onS
 					matches the item's OG (set from radarr/sonarr/jellyfin) is
 					marked "(Original Language)". */}
 			{item.audio_streams && item.audio_streams.length > 0 && (
-				<ul className="mt-2 space-y-0.5">
+				<ul className="mt-2 space-y-1.5">
 					{item.audio_streams.map((s) => {
 						const ogLang = item.original_language ? normalizeLanguageClient(item.original_language) : null;
 						const sLang = s.language ? normalizeLanguageClient(s.language) : null;
 						const isOriginal = !!(ogLang && sLang && ogLang === sLang);
 						const description = describeStream(s);
 						return (
-							<li key={s.id} className="flex items-center gap-1.5 text-xs">
+							<li key={s.id} className="flex items-start gap-1.5 text-xs">
 								<input
 									type="checkbox"
-									className="h-3.5 w-3.5"
+									className="h-3.5 w-3.5 mt-0.5 shrink-0"
 									checked={s.action === "keep"}
 									onChange={(e) => onToggleStream?.(s.id, e.target.checked ? "keep" : "remove")}
 									disabled={!onToggleStream}
 								/>
 								<span className="font-medium">{langName(s.language) || "unknown"}</span>
|
||||
{description && <span className="text-gray-500">{description}</span>}
|
||||
{s.is_default === 1 && <span className="text-[10px] text-gray-400 uppercase">default</span>}
|
||||
{s.title && !isOriginal && (
|
||||
<span className="text-gray-400 truncate" title={s.title}>
|
||||
“{s.title}”
|
||||
</span>
|
||||
)}
|
||||
{isOriginal && <span className="text-green-700 text-[11px]">(Original Language)</span>}
|
||||
<div className="min-w-0 flex-1">
|
||||
<div className="flex items-center gap-1.5 flex-wrap">
|
||||
<span className="font-medium">{langName(s.language) || "unknown"}</span>
|
||||
{isOriginal && <span className="text-green-700 text-[11px]">(Original Language)</span>}
|
||||
</div>
|
||||
{(description || s.title || s.is_default === 1) && (
|
||||
<div className="flex items-baseline gap-1.5 flex-wrap text-gray-500">
|
||||
{description && <span>{description}</span>}
|
||||
{s.is_default === 1 && <span className="text-[10px] text-gray-400 uppercase">default</span>}
|
||||
{s.title && (
|
||||
<span className="text-gray-400 min-w-0 [overflow-wrap:anywhere]" title={s.title}>
|
||||
— “{s.title}”
|
||||
</span>
|
||||
)}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
</li>
|
||||
);
|
||||
})}
|
||||
@@ -127,13 +161,6 @@ export function PipelineCard({ item, jellyfinUrl, onToggleStream, onApprove, onS
|
||||
</div>
|
||||
|
||||
<div className="flex items-center gap-1 mt-2">
|
||||
<Link
|
||||
to="/review/audio/$id"
|
||||
params={{ id: String(mediaItemId) }}
|
||||
className="text-xs px-2 py-1 rounded border border-gray-300 text-gray-700 hover:bg-gray-100 no-underline"
|
||||
>
|
||||
Details
|
||||
</Link>
|
||||
{onSkip && (
|
||||
<button
|
||||
type="button"
|
||||
@@ -144,6 +171,16 @@ export function PipelineCard({ item, jellyfinUrl, onToggleStream, onApprove, onS
|
||||
</button>
|
||||
)}
|
||||
<div className="flex-1" />
|
||||
{onApproveUpToHere && (
|
||||
<button
|
||||
type="button"
|
||||
onClick={onApproveUpToHere}
|
||||
title="Approve every card listed above this one"
|
||||
className="text-xs px-2 py-1 rounded border border-blue-600 text-blue-700 bg-white hover:bg-blue-50 opacity-0 group-hover:opacity-100 transition-opacity"
|
||||
>
|
||||
↑ Approve above
|
||||
</button>
|
||||
)}
|
||||
{onApprove && (
|
||||
<button
|
||||
type="button"
|
||||
@@ -157,7 +194,7 @@ export function PipelineCard({ item, jellyfinUrl, onToggleStream, onApprove, onS
|
||||
<button
|
||||
type="button"
|
||||
onClick={onUnapprove}
|
||||
className="text-xs px-3 py-1 rounded border border-gray-300 bg-white text-gray-700 hover:bg-gray-100"
|
||||
className="text-xs px-3 py-1 rounded border border-gray-300 bg-white text-gray-700 hover:bg-gray-100 opacity-0 group-hover:opacity-100 transition-opacity"
|
||||
>
|
||||
← Back to review
|
||||
</button>
|
||||
|
||||
@@ -1,7 +1,6 @@
import { useCallback, useEffect, useRef, useState } from "react";
import { Button } from "~/shared/components/ui/button";
import { api } from "~/shared/lib/api";
import type { PipelineData } from "~/shared/lib/types";
import type { PipelineData, ReviewGroupsResponse } from "~/shared/lib/types";
import { DoneColumn } from "./DoneColumn";
import { ProcessingColumn } from "./ProcessingColumn";
import { QueueColumn } from "./QueueColumn";
@@ -21,48 +20,55 @@ interface QueueStatus {

export function PipelinePage() {
  const [data, setData] = useState<PipelineData | null>(null);
  const [initialGroups, setInitialGroups] = useState<ReviewGroupsResponse | null>(null);
  const [progress, setProgress] = useState<Progress | null>(null);
  const [queueStatus, setQueueStatus] = useState<QueueStatus | null>(null);
  const [loading, setLoading] = useState(true);

  const load = useCallback(async () => {
  const loadPipeline = useCallback(async () => {
    const pipelineRes = await api.get<PipelineData>("/api/review/pipeline");
    setData(pipelineRes);
    setLoading(false);
  }, []);

  const startQueue = useCallback(async () => {
    await api.post("/api/execute/start");
    load();
  }, [load]);
  const loadReviewGroups = useCallback(async () => {
    const groupsRes = await api.get<ReviewGroupsResponse>("/api/review/groups?offset=0&limit=25");
    setInitialGroups(groupsRes);
  }, []);

  // Full refresh: used on first mount and after user-driven mutations
  // (approve/skip). SSE-driven refreshes during a running job call
  // loadPipeline only, so the Review column's scroll-loaded pages don't get
  // wiped every second by job_update events.
  const loadAll = useCallback(async () => {
    await Promise.all([loadPipeline(), loadReviewGroups()]);
    setLoading(false);
  }, [loadPipeline, loadReviewGroups]);

  useEffect(() => {
    load();
  }, [load]);
    loadAll();
  }, [loadAll]);

  // SSE for live updates. job_update fires on every status change and per-line
  // stdout flush of the running job — without coalescing, the pipeline endpoint
  // (a 500-row review query + counts) would re-run several times per second.
  // stdout flush — coalesce via 1s debounce so the pipeline endpoint doesn't
  // re-run several times per second.
  const reloadTimer = useRef<ReturnType<typeof setTimeout> | null>(null);
  useEffect(() => {
    const scheduleReload = () => {
    const schedulePipelineReload = () => {
      if (reloadTimer.current) return;
      reloadTimer.current = setTimeout(() => {
        reloadTimer.current = null;
        load();
        loadPipeline();
      }, 1000);
    };
    const es = new EventSource("/api/execute/events");
    es.addEventListener("job_update", (e) => {
      // When a job leaves 'running' (done / error / cancelled), drop any
      // stale progress so the bar doesn't linger on the next job's card.
      try {
        const upd = JSON.parse((e as MessageEvent).data) as { id: number; status: string };
        if (upd.status !== "running") setProgress(null);
      } catch {
        /* ignore malformed events */
      }
      scheduleReload();
      schedulePipelineReload();
    });
    es.addEventListener("job_progress", (e) => {
      setProgress(JSON.parse((e as MessageEvent).data));
@@ -74,26 +80,26 @@ export function PipelinePage() {
      es.close();
      if (reloadTimer.current) clearTimeout(reloadTimer.current);
    };
  }, [load]);
  }, [loadPipeline]);

  if (loading || !data) return <div className="p-6 text-gray-500">Loading pipeline...</div>;
  if (loading || !data || !initialGroups) return <div className="p-6 text-gray-500">Loading pipeline...</div>;

  return (
    <div className="flex flex-col -mx-3 sm:-mx-5 -mt-4 -mb-12 h-[calc(100vh-3rem)] overflow-hidden">
      <div className="flex items-center justify-between px-6 py-3 border-b shrink-0">
        <h1 className="text-lg font-semibold">Pipeline</h1>
        <div className="flex items-center gap-4">
          <span className="text-sm text-gray-500">{data.doneCount} files in desired state</span>
          <Button variant="primary" size="sm" onClick={startQueue}>
            Start queue
          </Button>
        </div>
        <span className="text-sm text-gray-500">{data.doneCount} files in desired state</span>
      </div>
      <div className="flex flex-1 gap-4 p-4 overflow-x-auto overflow-y-hidden min-h-0">
        <ReviewColumn items={data.review} total={data.reviewTotal} jellyfinUrl={data.jellyfinUrl} onMutate={load} />
        <QueueColumn items={data.queued} jellyfinUrl={data.jellyfinUrl} onMutate={load} />
        <ProcessingColumn items={data.processing} progress={progress} queueStatus={queueStatus} onMutate={load} />
        <DoneColumn items={data.done} onMutate={load} />
        <ReviewColumn
          initialResponse={initialGroups}
          totalItems={data.reviewItemsTotal}
          jellyfinUrl={data.jellyfinUrl}
          onMutate={loadAll}
        />
        <QueueColumn items={data.queued} jellyfinUrl={data.jellyfinUrl} onMutate={loadAll} />
        <ProcessingColumn items={data.processing} progress={progress} queueStatus={queueStatus} onMutate={loadAll} />
        <DoneColumn items={data.done} onMutate={loadAll} />
      </div>
    </div>
  );

@@ -1,3 +1,4 @@
import { Link } from "@tanstack/react-router";
import { useEffect, useState } from "react";
import { Badge } from "~/shared/components/ui/badge";
import { api } from "~/shared/lib/api";
@@ -22,6 +23,25 @@ export function ProcessingColumn({ items, progress, queueStatus, onMutate }: Pro
    return () => clearInterval(t);
  }, [job]);

  // Local sleep countdown. Server emits the sleep duration once when the
  // pause begins; the client anchors "deadline = receivedAt + seconds*1000"
  // and ticks a 1s timer so the UI shows a live countdown, not a static number.
  const [sleepDeadline, setSleepDeadline] = useState<number | null>(null);
  const [sleepNow, setSleepNow] = useState(() => Date.now());
  useEffect(() => {
    if (queueStatus?.status === "sleeping" && typeof queueStatus.seconds === "number") {
      setSleepDeadline(Date.now() + queueStatus.seconds * 1000);
    } else {
      setSleepDeadline(null);
    }
  }, [queueStatus?.status, queueStatus?.seconds]);
  useEffect(() => {
    if (sleepDeadline == null) return;
    const t = setInterval(() => setSleepNow(Date.now()), 1000);
    return () => clearInterval(t);
  }, [sleepDeadline]);
  const sleepRemaining = sleepDeadline != null ? Math.max(0, Math.ceil((sleepDeadline - sleepNow) / 1000)) : null;

  // Only trust progress if it belongs to the current job — stale events from
  // a previous job would otherwise show wrong numbers until the new job emits.
  const liveProgress = job && progress && progress.id === job.id ? progress : null;
@@ -54,9 +74,9 @@ export function ProcessingColumn({ items, progress, queueStatus, onMutate }: Pro
      actions={job ? [{ label: "Stop", onClick: stop, danger: true }] : undefined}
    >
      {queueStatus && queueStatus.status !== "running" && (
        <div className="mb-2 text-xs text-gray-500 bg-white rounded border p-2">
        <div className="mb-2 text-xs text-gray-500 bg-white rounded border p-2 tabular-nums">
          {queueStatus.status === "paused" && <>Paused until {queueStatus.until}</>}
          {queueStatus.status === "sleeping" && <>Sleeping {queueStatus.seconds}s between jobs</>}
          {queueStatus.status === "sleeping" && <>Next job in {sleepRemaining ?? queueStatus.seconds ?? 0}s</>}
          {queueStatus.status === "idle" && <>Idle</>}
        </div>
      )}
@@ -64,7 +84,13 @@ export function ProcessingColumn({ items, progress, queueStatus, onMutate }: Pro
        {job ? (
          <div className="rounded border bg-white p-3">
            <div className="flex items-start justify-between gap-2">
              <p className="text-sm font-medium truncate flex-1">{job.name}</p>
              <Link
                to="/review/audio/$id"
                params={{ id: String(job.item_id) }}
                className="text-sm font-medium truncate flex-1 hover:text-blue-600 hover:underline"
              >
                {job.name}
              </Link>
              <button
                type="button"
                onClick={stop}

@@ -10,23 +10,30 @@ interface QueueColumnProps {
}

export function QueueColumn({ items, jellyfinUrl, onMutate }: QueueColumnProps) {
  const runAll = async () => {
    await api.post("/api/execute/start");
    onMutate();
  };
  const clear = async () => {
    if (!confirm(`Cancel all ${items.length} pending jobs?`)) return;
    await api.post("/api/execute/clear");
    onMutate();
  };

  const unapprove = async (itemId: number) => {
    await api.post(`/api/review/${itemId}/unapprove`);
    onMutate();
  };

  const actions =
    items.length > 0
      ? [
          { label: "Run all", onClick: runAll, primary: true },
          { label: "Clear", onClick: clear },
        ]
      : undefined;

  return (
    <ColumnShell
      title="Queued"
      count={items.length}
      actions={items.length > 0 ? [{ label: "Clear", onClick: clear }] : undefined}
    >
    <ColumnShell title="Queued" count={items.length} actions={actions}>
      <div className="space-y-2">
        {items.map((item) => (
          <PipelineCard key={item.id} item={item} jellyfinUrl={jellyfinUrl} onUnapprove={() => unapprove(item.item_id)} />

@@ -1,28 +1,57 @@
import { useCallback, useEffect, useRef, useState } from "react";
import { api } from "~/shared/lib/api";
import type { PipelineReviewItem } from "~/shared/lib/types";
import type { ReviewGroup, ReviewGroupsResponse } from "~/shared/lib/types";
import { ColumnShell } from "./ColumnShell";
import { PipelineCard } from "./PipelineCard";
import { SeriesCard } from "./SeriesCard";

const PAGE_SIZE = 25;

interface ReviewColumnProps {
  items: PipelineReviewItem[];
  total: number;
  initialResponse: ReviewGroupsResponse;
  totalItems: number;
  jellyfinUrl: string;
  onMutate: () => void;
}

interface SeriesGroup {
  name: string;
  key: string;
  jellyfinId: string | null;
  episodes: PipelineReviewItem[];
}
export function ReviewColumn({ initialResponse, totalItems, jellyfinUrl, onMutate }: ReviewColumnProps) {
  const [groups, setGroups] = useState<ReviewGroup[]>(initialResponse.groups);
  const [hasMore, setHasMore] = useState(initialResponse.hasMore);
  const [loadingMore, setLoadingMore] = useState(false);
  const sentinelRef = useRef<HTMLDivElement | null>(null);

export function ReviewColumn({ items, total, jellyfinUrl, onMutate }: ReviewColumnProps) {
  const truncated = total > items.length;
  // Reset when the parent refetches page 0 (after approve/skip actions).
  useEffect(() => {
    setGroups(initialResponse.groups);
    setHasMore(initialResponse.hasMore);
  }, [initialResponse]);

  const loadMore = useCallback(async () => {
    if (loadingMore || !hasMore) return;
    setLoadingMore(true);
    try {
      const res = await api.get<ReviewGroupsResponse>(`/api/review/groups?offset=${groups.length}&limit=${PAGE_SIZE}`);
      setGroups((prev) => [...prev, ...res.groups]);
      setHasMore(res.hasMore);
    } finally {
      setLoadingMore(false);
    }
  }, [groups.length, hasMore, loadingMore]);

  useEffect(() => {
    if (!hasMore || !sentinelRef.current) return;
    const observer = new IntersectionObserver(
      (entries) => {
        if (entries[0]?.isIntersecting) loadMore();
      },
      { rootMargin: "200px" },
    );
    observer.observe(sentinelRef.current);
    return () => observer.disconnect();
  }, [hasMore, loadMore]);

  const skipAll = async () => {
    if (!confirm(`Skip all ${total} pending items? They won't be processed unless you unskip them.`)) return;
    if (!confirm(`Skip all ${totalItems} pending items? They won't be processed unless you unskip them.`)) return;
    await api.post("/api/review/skip-all");
    onMutate();
  };
@@ -41,76 +70,68 @@ export function ReviewColumn({ items, total, jellyfinUrl, onMutate }: ReviewColu
    await api.post(`/api/review/${itemId}/skip`);
    onMutate();
  };
  const approveBatch = async (itemIds: number[]) => {
    if (itemIds.length === 0) return;
    await api.post<{ ok: boolean; count: number }>("/api/review/approve-batch", { itemIds });
    onMutate();
  };

  // Group by series (movies are standalone)
  const movies = items.filter((i) => i.type === "Movie");
  const seriesMap = new Map<string, SeriesGroup>();
  // Compute ids per visible group for "Approve above"
  const idsByGroup: number[][] = groups.map((g) =>
    g.kind === "movie" ? [g.item.item_id] : g.seasons.flatMap((s) => s.episodes.map((ep) => ep.item_id)),
  );
  const priorIds = (index: number): number[] => idsByGroup.slice(0, index).flat();

  for (const item of items.filter((i) => i.type === "Episode")) {
    const key = item.series_jellyfin_id ?? item.series_name ?? String(item.item_id);
    if (!seriesMap.has(key)) {
      seriesMap.set(key, { name: item.series_name ?? "", key, jellyfinId: item.series_jellyfin_id, episodes: [] });
    }
    seriesMap.get(key)!.episodes.push(item);
  }

  // Interleave movies and series, sorted by confidence (high first)
  const allItems = [
    ...movies.map((m) => ({ type: "movie" as const, item: m, sortKey: m.confidence === "high" ? 0 : 1 })),
    ...[...seriesMap.values()].map((s) => ({
      type: "series" as const,
      item: s,
      sortKey: s.episodes.every((e) => e.confidence === "high") ? 0 : 1,
    })),
  ].sort((a, b) => a.sortKey - b.sortKey);
  const actions =
    totalItems > 0
      ? [
          { label: "Auto Review", onClick: autoApprove, primary: true },
          { label: "Skip all", onClick: skipAll },
        ]
      : undefined;

  return (
    <ColumnShell
      title="Review"
      count={truncated ? `${items.length} of ${total}` : total}
      actions={
        total > 0
          ? [
              { label: "Auto Review", onClick: autoApprove, primary: true },
              { label: "Skip all", onClick: skipAll },
            ]
          : undefined
      }
    >
    <ColumnShell title="Review" count={totalItems} actions={actions}>
      <div className="space-y-2">
        {allItems.map((entry) => {
          if (entry.type === "movie") {
        {groups.map((group, index) => {
          const prior = index > 0 ? priorIds(index) : null;
          const onApproveUpToHere = prior && prior.length > 0 ? () => approveBatch(prior) : undefined;
          if (group.kind === "movie") {
            return (
              <PipelineCard
                key={entry.item.id}
                item={entry.item}
                key={group.item.id}
                item={group.item}
                jellyfinUrl={jellyfinUrl}
                onToggleStream={async (streamId, action) => {
                  await api.patch(`/api/review/${entry.item.item_id}/stream/${streamId}`, { action });
                  await api.patch(`/api/review/${group.item.item_id}/stream/${streamId}`, { action });
                  onMutate();
                }}
                onApprove={() => approveItem(entry.item.item_id)}
                onSkip={() => skipItem(entry.item.item_id)}
                onApprove={() => approveItem(group.item.item_id)}
                onSkip={() => skipItem(group.item.item_id)}
                onApproveUpToHere={onApproveUpToHere}
              />
            );
          }
          return (
            <SeriesCard
              key={entry.item.key}
              seriesKey={entry.item.key}
              seriesName={entry.item.name}
              key={group.seriesKey}
              seriesKey={group.seriesKey}
              seriesName={group.seriesName}
              jellyfinUrl={jellyfinUrl}
              seriesJellyfinId={entry.item.jellyfinId}
              episodes={entry.item.episodes}
              seriesJellyfinId={group.seriesJellyfinId}
              seasons={group.seasons}
              episodeCount={group.episodeCount}
              originalLanguage={group.originalLanguage}
              onMutate={onMutate}
              onApproveUpToHere={onApproveUpToHere}
            />
          );
        })}
        {allItems.length === 0 && <p className="text-sm text-gray-400 text-center py-8">No items to review</p>}
        {truncated && (
          <p className="text-xs text-gray-400 text-center py-3 border-t mt-2">
            Showing first {items.length} of {total}. Approve some to see the rest.
          </p>
        {groups.length === 0 && <p className="text-sm text-gray-400 text-center py-8">No items to review</p>}
        {hasMore && (
          <div ref={sentinelRef} className="py-4 text-center text-xs text-gray-400">
            {loadingMore ? "Loading more…" : ""}
          </div>
        )}
      </div>
    </ColumnShell>

@@ -9,8 +9,13 @@ interface SeriesCardProps {
  seriesName: string;
  jellyfinUrl: string;
  seriesJellyfinId: string | null;
  episodes: PipelineReviewItem[];
  seasons: Array<{ season: number | null; episodes: PipelineReviewItem[] }>;
  episodeCount: number;
  originalLanguage: string | null;
  onMutate: () => void;
  // Review-column affordance: approve every card visually above this
  // series in one round-trip. See ReviewColumn for the id computation.
  onApproveUpToHere?: () => void;
}

export function SeriesCard({
@@ -18,12 +23,18 @@ export function SeriesCard({
  seriesName,
  jellyfinUrl,
  seriesJellyfinId,
  episodes,
  seasons,
  episodeCount,
  originalLanguage,
  onMutate,
  onApproveUpToHere,
}: SeriesCardProps) {
  const [expanded, setExpanded] = useState(false);

  const seriesLang = episodes[0]?.original_language ?? "";
  const flatEpisodes = seasons.flatMap((s) => s.episodes);
  const highCount = flatEpisodes.filter((e) => e.confidence === "high").length;
  const lowCount = flatEpisodes.filter((e) => e.confidence === "low").length;
  const multipleSeasons = seasons.length > 1;

  const setSeriesLanguage = async (lang: string) => {
    await api.patch(`/api/review/series/${encodeURIComponent(seriesKey)}/language`, { language: lang });
@@ -35,14 +46,17 @@ export function SeriesCard({
    onMutate();
  };

  const highCount = episodes.filter((e) => e.confidence === "high").length;
  const lowCount = episodes.filter((e) => e.confidence === "low").length;
  const approveSeason = async (season: number | null) => {
    if (season == null) return;
    await api.post(`/api/review/season/${encodeURIComponent(seriesKey)}/${season}/approve-all`);
    onMutate();
  };

  const jellyfinLink =
    jellyfinUrl && seriesJellyfinId ? `${jellyfinUrl}/web/index.html#!/details?id=${seriesJellyfinId}` : null;

  return (
    <div className="group rounded-lg border bg-white overflow-hidden">
    <div className="group/series rounded-lg border bg-white overflow-hidden">
      {/* Title row */}
      <div
        className="flex items-center gap-2 px-3 pt-3 pb-1 cursor-pointer hover:bg-gray-50 rounded-t-lg"
@@ -66,13 +80,14 @@ export function SeriesCard({

      {/* Controls row */}
      <div className="flex items-center gap-2 px-3 pb-3 pt-1">
        <span className="text-xs text-gray-500 shrink-0">{episodes.length} eps</span>
        <span className="text-xs text-gray-500 shrink-0">{episodeCount} eps</span>
        {multipleSeasons && <span className="text-xs text-gray-500 shrink-0">· {seasons.length} seasons</span>}
        {highCount > 0 && <span className="text-xs text-green-600 shrink-0">{highCount} ready</span>}
        {lowCount > 0 && <span className="text-xs text-amber-600 shrink-0">{lowCount} review</span>}
        <div className="flex-1" />
        <select
          className="h-6 text-xs border border-gray-300 rounded px-1 bg-white shrink-0"
          value={seriesLang}
          value={originalLanguage ?? ""}
          onChange={(e) => {
            e.stopPropagation();
            setSeriesLanguage(e.target.value);
@@ -85,40 +100,129 @@ export function SeriesCard({
            </option>
          ))}
        </select>
        {onApproveUpToHere && (
          <button
            type="button"
            onClick={(e) => {
              e.stopPropagation();
              onApproveUpToHere();
            }}
            title="Approve every card listed above this one"
            className="text-xs px-2 py-1 rounded border border-blue-600 text-blue-700 bg-white hover:bg-blue-50 cursor-pointer whitespace-nowrap shrink-0 opacity-0 group-hover/series:opacity-100 transition-opacity"
          >
            ↑ Approve above
          </button>
        )}
        <button
          type="button"
          onClick={(e) => {
            e.stopPropagation();
            approveSeries();
          }}
          className="text-xs px-2 py-1 rounded bg-blue-600 text-white hover:bg-blue-700 cursor-pointer whitespace-nowrap shrink-0"
        >
          Approve all
          Approve series
        </button>
      </div>

      {expanded && (
        <div className="border-t px-3 pb-3 space-y-2 pt-2">
        <div className="border-t">
          {multipleSeasons
            ? seasons.map((s) => (
                <SeasonGroup
                  key={s.season ?? "unknown"}
                  season={s.season}
                  episodes={s.episodes}
                  jellyfinUrl={jellyfinUrl}
                  onApproveSeason={() => approveSeason(s.season)}
                  onMutate={onMutate}
                />
              ))
            : flatEpisodes.map((ep) => <EpisodeRow key={ep.id} ep={ep} jellyfinUrl={jellyfinUrl} onMutate={onMutate} />)}
        </div>
      )}
    </div>
  );
}

function SeasonGroup({
  season,
  episodes,
  jellyfinUrl,
  onApproveSeason,
  onMutate,
}: {
  season: number | null;
  episodes: PipelineReviewItem[];
  jellyfinUrl: string;
  onApproveSeason: () => void;
  onMutate: () => void;
}) {
  const [open, setOpen] = useState(false);
  const highCount = episodes.filter((e) => e.confidence === "high").length;
  const lowCount = episodes.filter((e) => e.confidence === "low").length;
  const label = season == null ? "No season" : `Season ${String(season).padStart(2, "0")}`;

  return (
    <div className="border-t first:border-t-0">
      <div className="flex items-center gap-2 px-3 py-2 cursor-pointer hover:bg-gray-50" onClick={() => setOpen(!open)}>
        <span className="text-xs text-gray-400 shrink-0">{open ? "▼" : "▶"}</span>
        <span className="text-xs font-medium shrink-0">{label}</span>
        <span className="text-xs text-gray-500 shrink-0">· {episodes.length} eps</span>
        {highCount > 0 && <span className="text-xs text-green-600 shrink-0">{highCount} ready</span>}
        {lowCount > 0 && <span className="text-xs text-amber-600 shrink-0">{lowCount} review</span>}
        <div className="flex-1" />
        {season != null && (
          <button
            type="button"
            onClick={(e) => {
              e.stopPropagation();
              onApproveSeason();
            }}
            className="text-xs px-2 py-1 rounded border border-blue-600 text-blue-700 bg-white hover:bg-blue-50 cursor-pointer whitespace-nowrap shrink-0"
          >
            Approve season
          </button>
        )}
      </div>
      {open && (
        <div className="px-3 pb-3 space-y-2 pt-2">
          {episodes.map((ep) => (
            <PipelineCard
              key={ep.id}
              item={ep}
              jellyfinUrl={jellyfinUrl}
              onToggleStream={async (streamId, action) => {
                await api.patch(`/api/review/${ep.item_id}/stream/${streamId}`, { action });
                onMutate();
              }}
              onApprove={async () => {
                await api.post(`/api/review/${ep.item_id}/approve`);
                onMutate();
              }}
              onSkip={async () => {
                await api.post(`/api/review/${ep.item_id}/skip`);
                onMutate();
              }}
            />
            <EpisodeRow key={ep.id} ep={ep} jellyfinUrl={jellyfinUrl} onMutate={onMutate} />
          ))}
        </div>
      )}
    </div>
  );
}

function EpisodeRow({
  ep,
  jellyfinUrl,
  onMutate,
}: {
  ep: PipelineReviewItem;
  jellyfinUrl: string;
  onMutate: () => void;
}) {
  return (
    <div className="px-3 py-1">
      <PipelineCard
        item={ep}
        jellyfinUrl={jellyfinUrl}
        onToggleStream={async (streamId, action) => {
          await api.patch(`/api/review/${ep.item_id}/stream/${streamId}`, { action });
          onMutate();
        }}
        onApprove={async () => {
          await api.post(`/api/review/${ep.item_id}/approve`);
          onMutate();
        }}
        onSkip={async () => {
          await api.post(`/api/review/${ep.item_id}/skip`);
          onMutate();
        }}
      />
    </div>
  );
}

@@ -6,7 +6,7 @@ import { Button } from "~/shared/components/ui/button";
import { Select } from "~/shared/components/ui/select";
import { api } from "~/shared/lib/api";
import { LANG_NAMES, langName } from "~/shared/lib/lang";
import type { MediaItem, MediaStream, ReviewPlan, StreamDecision } from "~/shared/lib/types";
import type { Job, MediaItem, MediaStream, ReviewPlan, StreamDecision } from "~/shared/lib/types";

// ─── Types ────────────────────────────────────────────────────────────────────

@@ -16,6 +16,7 @@ interface DetailData {
  plan: ReviewPlan | null;
  decisions: StreamDecision[];
  command: string | null;
  job: Job | null;
}

// ─── Utilities ────────────────────────────────────────────────────────────────
@@ -205,6 +206,114 @@ function TitleInput({ value, onCommit }: { value: string; onCommit: (v: string)
  );
}

// ─── Job section ─────────────────────────────────────────────────────────────

interface JobSectionProps {
  job: Job;
  onMutate: () => void;
}

function JobSection({ job, onMutate }: JobSectionProps) {
  const [showCmd, setShowCmd] = useState(false);
  const [showLog, setShowLog] = useState(job.status === "error");
  const [liveStatus, setLiveStatus] = useState(job.status);
  const [liveOutput, setLiveOutput] = useState(job.output ?? "");
  const [progress, setProgress] = useState<{ seconds: number; total: number } | null>(null);

  useEffect(() => {
    setLiveStatus(job.status);
    setLiveOutput(job.output ?? "");
  }, [job.status, job.output, job.id]);

  useEffect(() => {
    const es = new EventSource("/api/execute/events");
    es.addEventListener("job_update", (e) => {
      const d = JSON.parse((e as MessageEvent).data) as { id: number; status: string; output?: string };
      if (d.id !== job.id) return;
      setLiveStatus(d.status as Job["status"]);
      if (d.output !== undefined) setLiveOutput(d.output);
      if (d.status === "done" || d.status === "error") onMutate();
    });
    es.addEventListener("job_progress", (e) => {
      const d = JSON.parse((e as MessageEvent).data) as { id: number; seconds: number; total: number };
      if (d.id !== job.id) return;
      setProgress({ seconds: d.seconds, total: d.total });
    });
    return () => es.close();
  }, [job.id, onMutate]);

  const runJob = async () => {
    await api.post(`/api/execute/job/${job.id}/run`);
    onMutate();
  };
  const cancelJob = async () => {
    await api.post(`/api/execute/job/${job.id}/cancel`);
    onMutate();
  };
  const stopJob = async () => {
    await api.post("/api/execute/stop");
    onMutate();
  };

  const typeLabel = job.job_type === "transcode" ? "Audio Transcode" : "Audio Remux";
  const exitBadge = job.exit_code != null && job.exit_code !== 0 ? job.exit_code : null;

  return (
    <div className="mt-6 pt-4 border-t border-gray-200">
      <div className="text-gray-400 text-[0.75rem] uppercase tracking-[0.05em] mb-2">Job</div>
      <div className="flex items-center gap-2 flex-wrap mb-3">
        <Badge variant={liveStatus}>{liveStatus}</Badge>
        <Badge variant={job.job_type === "transcode" ? "manual" : "noop"}>{typeLabel}</Badge>
        {exitBadge != null && <Badge variant="error">exit {exitBadge}</Badge>}
        {job.started_at && <span className="text-gray-500 text-[0.72rem]">started {job.started_at}</span>}
        {job.completed_at && <span className="text-gray-500 text-[0.72rem]">completed {job.completed_at}</span>}
        <div className="flex-1" />
        <Button size="sm" variant="secondary" onClick={() => setShowCmd((v) => !v)}>
          Cmd
        </Button>
        {liveOutput && (
          <Button size="sm" variant="secondary" onClick={() => setShowLog((v) => !v)}>
            Log
          </Button>
        )}
        {liveStatus === "pending" && (
          <>
            <Button size="sm" onClick={runJob}>
              ▶ Run
            </Button>
            <Button size="sm" variant="secondary" onClick={cancelJob}>
              ✕ Cancel
            </Button>
          </>
        )}
        {liveStatus === "running" && (
          <Button size="sm" variant="secondary" onClick={stopJob}>
            ✕ Stop
          </Button>
        )}
      </div>
      {liveStatus === "running" && progress && progress.total > 0 && (
        <div className="h-1.5 bg-gray-200 rounded mb-3 overflow-hidden">
          <div
            className="h-full bg-blue-500 transition-[width] duration-500"
            style={{ width: `${Math.min(100, (progress.seconds / progress.total) * 100).toFixed(1)}%` }}
          />
        </div>
      )}
      {showCmd && (
        <div className="font-mono text-[0.74rem] bg-gray-50 text-gray-700 px-3 py-2 rounded max-h-[120px] overflow-y-auto whitespace-pre-wrap break-all mb-2">
          {job.command}
        </div>
      )}
      {showLog && liveOutput && (
        <div className="font-mono text-[0.74rem] bg-[#1a1a1a] text-[#d4d4d4] px-3 py-2 rounded max-h-[260px] overflow-y-auto whitespace-pre-wrap break-all">
          {liveOutput}
        </div>
      )}
    </div>
  );
}

// ─── Detail page ──────────────────────────────────────────────────────────────

export function AudioDetailPage() {
@@ -347,6 +456,9 @@ export function AudioDetailPage() {
        </div>
      )}

      {/* Job */}
      {data.job && <JobSection job={data.job} onMutate={load} />}

      {/* Actions */}
      {plan?.status === "pending" && !plan.is_noop && (
        <div className="flex gap-2 mt-6">

@@ -10,10 +10,42 @@ import { formatThousands } from "~/shared/lib/utils";
interface ScanStatus {
  running: boolean;
  progress: { scanned: number; total: number; errors: number };
  recentItems: { name: string; type: string; scan_status: string; file_path: string }[];
  recentItems: {
    name: string;
    type: string;
    scan_status: string;
    file_path: string;
    last_scanned_at: string | null;
    ingest_source: "scan" | "webhook" | null;
  }[];
  scanLimit: number | null;
}

interface ScanItemsRow {
  id: number;
  jellyfin_id: string;
  name: string;
  type: "Movie" | "Episode";
  series_name: string | null;
  season_number: number | null;
  episode_number: number | null;
  scan_status: string;
  original_language: string | null;
  orig_lang_source: string | null;
  container: string | null;
  file_size: number | null;
  file_path: string;
  last_scanned_at: string | null;
  ingest_source: "scan" | "webhook" | null;
  audio_codecs: string | null;
}

interface ScanItemsResponse {
  rows: ScanItemsRow[];
  total: number;
  hasMore: boolean;
}

interface DashboardStats {
  totalItems: number;
  scanned: number;
@@ -47,6 +79,22 @@ interface LogEntry {
  file?: string;
}

interface RecentIngestRow {
  name: string;
  type: string;
  status: string;
  file: string;
  scannedAt: string | null;
  source: "scan" | "webhook" | null;
}

interface ItemFilters {
  q: string;
  status: "all" | "pending" | "scanned" | "error";
  type: "all" | "movie" | "episode";
  source: "all" | "scan" | "webhook";
}

// Mutable buffer for SSE data — flushed to React state on an interval
interface SseBuf {
  scanned: number;
@@ -65,19 +113,54 @@ function freshBuf(): SseBuf {

const FLUSH_MS = 200;

function statusBadgeVariant(status: string): "pending" | "done" | "error" | "default" {
  if (status === "pending") return "pending";
  if (status === "done" || status === "scanned") return "done";
  if (status === "error") return "error";
  return "default";
}

function formatScannedAt(ts: string | null): string {
  if (!ts) return "—";
  const d = new Date(ts.includes("T") ? ts : `${ts}Z`);
  if (Number.isNaN(d.getTime())) return ts;
  return d.toLocaleString([], { year: "numeric", month: "2-digit", day: "2-digit", hour: "2-digit", minute: "2-digit" });
}

function formatFileSize(bytes: number | null): string {
  if (!bytes || bytes <= 0) return "—";
  if (bytes < 1000) return `${bytes} B`;
  if (bytes < 1000 ** 2) return `${(bytes / 1000).toFixed(1)} kB`;
  if (bytes < 1000 ** 3) return `${(bytes / 1000 ** 2).toFixed(1)} MB`;
  return `${(bytes / 1000 ** 3).toFixed(1)} GB`;
}

function episodeLabel(row: ScanItemsRow): string {
  if (row.type !== "Episode") return "—";
  const season = row.season_number ?? 0;
  const episode = row.episode_number ?? 0;
  return `S${String(season).padStart(2, "0")}E${String(episode).padStart(2, "0")}`;
}

export function ScanPage() {
  const navigate = useNavigate();
  const [status, setStatus] = useState<ScanStatus | null>(null);
  const [stats, setStats] = useState<DashboardStats | null>(null);
  const [configChecked, setConfigChecked] = useState(false);
  const [limit, setLimit] = useState("");
  const [log, setLog] = useState<LogEntry[]>([]);
  const [recentIngest, setRecentIngest] = useState<RecentIngestRow[]>([]);
  const [statusLabel, setStatusLabel] = useState("");
  const [scanComplete, setScanComplete] = useState(false);
  const [currentItem, setCurrentItem] = useState("");
  const [progressScanned, setProgressScanned] = useState(0);
  const [progressTotal, setProgressTotal] = useState(0);
  const [errors, setErrors] = useState(0);
  const [filters, setFilters] = useState<ItemFilters>({ q: "", status: "all", type: "all", source: "all" });
  const [itemsRows, setItemsRows] = useState<ScanItemsRow[]>([]);
  const [itemsOffset, setItemsOffset] = useState(0);
  const [itemsHasMore, setItemsHasMore] = useState(false);
  const [itemsTotal, setItemsTotal] = useState(0);
  const [itemsLoading, setItemsLoading] = useState(false);
  const esRef = useRef<EventSource | null>(null);
  const bufRef = useRef<SseBuf>(freshBuf());
  const timerRef = useRef<ReturnType<typeof setInterval> | null>(null);
@@ -122,7 +205,19 @@ export function ScanPage() {
      setCurrentItem(b.currentItem);
      if (b.newLogs.length > 0) {
        const batch = b.newLogs.splice(0);
        setLog((prev) => [...batch.reverse(), ...prev].slice(0, 100));
        setRecentIngest((prev) =>
          [
            ...batch.map((item) => ({
              name: item.name,
              type: item.type,
              status: item.status,
              file: item.file ?? item.name,
              scannedAt: new Date().toISOString(),
              source: "scan" as const,
            })),
            ...prev,
          ].slice(0, 5),
        );
      }
      b.dirty = false;
    }
@@ -172,13 +267,55 @@ export function ScanPage() {
    setErrors(s.progress.errors);
    setStatusLabel(s.running ? "Scan in progress…" : "Scan idle");
    if (s.scanLimit != null) setLimit(String(s.scanLimit));
    setLog(s.recentItems.map((i) => ({ name: i.name, type: i.type, status: i.scan_status, file: i.file_path })));
    setRecentIngest(
      s.recentItems.map((i) => ({
        name: i.name,
        type: i.type,
        status: i.scan_status,
        file: i.file_path,
        scannedAt: i.last_scanned_at,
        source: i.ingest_source,
      })),
    );
  }, []);

  useEffect(() => {
    load();
  }, [load]);

  const fetchItems = useCallback(
    async (offset: number, append: boolean) => {
      setItemsLoading(true);
      try {
        const qs = new URLSearchParams({
          offset: String(offset),
          limit: "50",
          q: filters.q,
          status: filters.status,
          type: filters.type,
          source: filters.source,
        });
        const res = await api.get<ScanItemsResponse>(`/api/scan/items?${qs.toString()}`);
        setItemsRows((prev) => (append ? [...prev, ...res.rows] : res.rows));
        setItemsOffset(offset + res.rows.length);
        setItemsHasMore(res.hasMore);
        setItemsTotal(res.total);
      } finally {
        setItemsLoading(false);
      }
    },
    [filters],
  );

  useEffect(() => {
    fetchItems(0, false);
  }, [fetchItems]);

  useEffect(() => {
    if (!scanComplete) return;
    fetchItems(0, false);
  }, [scanComplete, fetchItems]);

  const connectSse = useCallback(() => {
    esRef.current?.close();
    const buf = bufRef.current;
@@ -229,7 +366,7 @@ export function ScanPage() {
  }, [status?.running, connectSse, stopFlushing]);

  const startScan = async () => {
    setLog([]);
    setRecentIngest([]);
    setProgressScanned(0);
    setProgressTotal(0);
    setErrors(0);
@@ -261,7 +398,7 @@ export function ScanPage() {
  return (
    <div>
      <div className="flex items-center justify-between mb-4">
        <h1 className="text-xl font-bold m-0">Scan</h1>
        <h1 className="text-xl font-bold m-0">Library</h1>
        <MqttBadge />
      </div>

@@ -284,37 +421,48 @@ export function ScanPage() {
      )}

      <div className="border border-gray-200 rounded-lg px-4 py-3 mb-6">
        <div className="flex items-center flex-wrap gap-2 mb-3">
          <span className="text-sm font-medium">{statusLabel || (running ? "Scan in progress…" : "Scan idle")}</span>
          {scanComplete && (
            <Link to="/pipeline" className="text-blue-600 hover:underline text-sm">
              Review in Pipeline →
            </Link>
          )}
          {running ? (
            <Button variant="secondary" size="sm" onClick={stopScan}>
              Stop
            </Button>
          ) : (
            <div className="flex items-center gap-2">
              <label className="flex items-center gap-1.5 text-xs m-0">
                Limit
                <input
                  type="number"
                  value={limit}
                  onChange={(e) => setLimit(e.target.value)}
                  placeholder="all"
                  min="1"
                  className="border border-gray-300 rounded px-1.5 py-0.5 text-xs w-16"
                />
                items
              </label>
              <Button size="sm" onClick={startScan}>
                Start Scan
              </Button>
        <div className="flex items-start justify-between gap-3 mb-3">
          <div className="space-y-2 min-w-0">
            <div className="flex items-center flex-wrap gap-2">
              <span className="text-sm font-medium">{statusLabel || (running ? "Scan in progress…" : "Scan idle")}</span>
              {scanComplete && (
                <Link to="/pipeline" className="text-blue-600 hover:underline text-sm">
                  Review in Pipeline →
                </Link>
              )}
              {errors > 0 && <Badge variant="error">{errors} error(s)</Badge>}
            </div>
          )}
          {errors > 0 && <Badge variant="error">{errors} error(s)</Badge>}
            {running ? (
              <Button variant="secondary" size="sm" onClick={stopScan}>
                Stop
              </Button>
            ) : (
              <div className="flex items-center gap-2">
                <label className="flex items-center gap-1.5 text-xs m-0">
                  Limit
                  <input
                    type="number"
                    value={limit}
                    onChange={(e) => setLimit(e.target.value)}
                    placeholder="all"
                    min="1"
                    className="border border-gray-300 rounded px-1.5 py-0.5 text-xs w-16"
                  />
                  items
                </label>
                <Button size="sm" onClick={startScan}>
                  Start Scan
                </Button>
              </div>
            )}
          </div>
          <div className="text-right shrink-0">
            <div className="text-sm font-semibold text-gray-700">
              {formatThousands(progressScanned)}
              {progressTotal > 0 ? ` / ${formatThousands(progressTotal)}` : ""}
            </div>
            <div className="text-[0.7rem] text-gray-500">scanned</div>
          </div>
        </div>

        {(running || progressScanned > 0) && (
@@ -325,25 +473,131 @@ export function ScanPage() {
            </div>
          )}
          <div className="flex items-center gap-2 text-gray-500 text-xs">
            <span>
              {progressScanned}
              {progressTotal > 0 ? ` / ${progressTotal}` : ""} scanned
            </span>
            {currentItem && <span className="truncate max-w-xs text-gray-400">{currentItem}</span>}
            {currentItem && <span className="truncate max-w-2xl text-gray-400">{currentItem}</span>}
          </div>
        </>
      )}

        <div className="mt-3">
          <h3 className="font-semibold text-sm mb-2">Recent ingest (5)</h3>
          <table className="w-full border-collapse text-[0.78rem]">
            <thead>
              <tr>
                {["Time", "Source", "Type", "File", "Status"].map((h) => (
                  <th
                    key={h}
                    className="text-left text-[0.66rem] font-bold uppercase tracking-[0.05em] text-gray-500 py-1 px-2 border-b border-gray-200 whitespace-nowrap"
                  >
                    {h}
                  </th>
                ))}
              </tr>
            </thead>
            <tbody>
              {recentIngest.length === 0 && (
                <tr>
                  <td colSpan={5} className="py-2 px-2 text-gray-400">
                    No ingested items yet.
                  </td>
                </tr>
              )}
              {recentIngest.map((item, i) => {
                const fileName = item.file.split("/").pop() ?? item.name;
                return (
                  <tr key={`${item.file}-${i}`} className="hover:bg-gray-50">
                    <td className="py-1.5 px-2 border-b border-gray-100 whitespace-nowrap">{formatScannedAt(item.scannedAt)}</td>
                    <td className="py-1.5 px-2 border-b border-gray-100">
                      <Badge variant="default">{item.source ?? "scan"}</Badge>
                    </td>
                    <td className="py-1.5 px-2 border-b border-gray-100">{item.type}</td>
                    <td className="py-1.5 px-2 border-b border-gray-100 truncate max-w-96" title={item.file}>
                      {fileName}
                    </td>
                    <td className="py-1.5 px-2 border-b border-gray-100">
                      <Badge variant={statusBadgeVariant(item.status)}>{item.status}</Badge>
                    </td>
                  </tr>
                );
              })}
            </tbody>
          </table>
        </div>
      </div>

      {/* Log */}
      <h3 className="font-semibold text-sm mb-2">Recent items</h3>
      <table className="w-full border-collapse text-[0.82rem]">
      <div className="mb-2 flex items-end justify-between gap-3">
        <h3 className="font-semibold text-sm">Library items</h3>
        <span className="text-xs text-gray-500">{formatThousands(itemsTotal)} total</span>
      </div>

      <div className="border border-gray-200 rounded-lg p-3 mb-3 flex flex-wrap items-end gap-2">
        <label className="text-xs text-gray-600 flex flex-col gap-1">
          Search
          <input
            type="text"
            value={filters.q}
            onChange={(e) => setFilters((prev) => ({ ...prev, q: e.target.value }))}
            placeholder="Name or path"
            className="border border-gray-300 rounded px-2 py-1 text-xs w-56"
          />
        </label>
        <label className="text-xs text-gray-600 flex flex-col gap-1">
          Status
          <select
            value={filters.status}
            onChange={(e) => setFilters((prev) => ({ ...prev, status: e.target.value as ItemFilters["status"] }))}
            className="border border-gray-300 rounded px-2 py-1 text-xs"
          >
            <option value="all">All</option>
            <option value="scanned">Scanned</option>
            <option value="pending">Pending</option>
            <option value="error">Error</option>
          </select>
        </label>
        <label className="text-xs text-gray-600 flex flex-col gap-1">
          Type
          <select
            value={filters.type}
            onChange={(e) => setFilters((prev) => ({ ...prev, type: e.target.value as ItemFilters["type"] }))}
            className="border border-gray-300 rounded px-2 py-1 text-xs"
          >
            <option value="all">All</option>
            <option value="movie">Movie</option>
            <option value="episode">Episode</option>
          </select>
        </label>
        <label className="text-xs text-gray-600 flex flex-col gap-1">
          Source
          <select
            value={filters.source}
            onChange={(e) => setFilters((prev) => ({ ...prev, source: e.target.value as ItemFilters["source"] }))}
            className="border border-gray-300 rounded px-2 py-1 text-xs"
          >
            <option value="all">All</option>
            <option value="scan">Scan</option>
            <option value="webhook">Webhook</option>
          </select>
        </label>
      </div>

      <table className="w-full border-collapse text-[0.8rem]">
        <thead>
          <tr>
            {["Type", "File", "Status"].map((h) => (
            {[
              "Scanned",
              "Name",
              "Type",
              "Series / Ep",
              "Language",
              "Audio",
              "Container",
              "Size",
              "Source",
              "Status",
              "Path",
            ].map((h) => (
              <th
                key={h}
                className="text-left text-[0.68rem] font-bold uppercase tracking-[0.06em] text-gray-500 py-1 px-2 border-b-2 border-gray-200 whitespace-nowrap"
                className="text-left text-[0.66rem] font-bold uppercase tracking-[0.05em] text-gray-500 py-1 px-2 border-b border-gray-200 whitespace-nowrap"
              >
                {h}
              </th>
@@ -351,22 +605,55 @@ export function ScanPage() {
          </tr>
        </thead>
        <tbody>
          {log.map((item, i) => {
            const fileName = item.file ? (item.file.split("/").pop() ?? item.name) : item.name;
            return (
              <tr key={i} className="hover:bg-gray-50">
                <td className="py-1.5 px-2 border-b border-gray-100">{item.type}</td>
                <td className="py-1.5 px-2 border-b border-gray-100" title={item.file ?? item.name}>
                  {fileName}
                </td>
                <td className="py-1.5 px-2 border-b border-gray-100">
                  <Badge variant={item.status as "error" | "done" | "pending"}>{item.status}</Badge>
                </td>
              </tr>
            );
          })}
          {itemsRows.length === 0 && !itemsLoading && (
            <tr>
              <td colSpan={11} className="py-3 px-2 text-gray-400">
                No items match the current filters.
              </td>
            </tr>
          )}
          {itemsRows.map((row) => (
            <tr key={row.id} className="hover:bg-gray-50">
              <td className="py-1.5 px-2 border-b border-gray-100 whitespace-nowrap">
                {formatScannedAt(row.last_scanned_at)}
              </td>
              <td className="py-1.5 px-2 border-b border-gray-100">{row.name}</td>
              <td className="py-1.5 px-2 border-b border-gray-100">{row.type}</td>
              <td className="py-1.5 px-2 border-b border-gray-100">
                <div>{row.series_name ?? "—"}</div>
                <div className="text-[0.68rem] text-gray-500">{episodeLabel(row)}</div>
              </td>
              <td className="py-1.5 px-2 border-b border-gray-100">
                <div>{row.original_language ?? "—"}</div>
                <div className="text-[0.68rem] text-gray-500">{row.orig_lang_source ?? "—"}</div>
              </td>
              <td className="py-1.5 px-2 border-b border-gray-100 font-mono text-[0.72rem]">
                {row.audio_codecs ? row.audio_codecs.split(",").join(" · ") : "—"}
              </td>
              <td className="py-1.5 px-2 border-b border-gray-100">{row.container ?? "—"}</td>
              <td className="py-1.5 px-2 border-b border-gray-100 whitespace-nowrap">{formatFileSize(row.file_size)}</td>
              <td className="py-1.5 px-2 border-b border-gray-100">
                <Badge variant="default">{row.ingest_source ?? "scan"}</Badge>
              </td>
              <td className="py-1.5 px-2 border-b border-gray-100">
                <Badge variant={statusBadgeVariant(row.scan_status)}>{row.scan_status}</Badge>
              </td>
              <td className="py-1.5 px-2 border-b border-gray-100 truncate max-w-xs" title={row.file_path}>
                {row.file_path}
              </td>
            </tr>
          ))}
        </tbody>
      </table>

      <div className="mt-3 flex items-center gap-2">
        {itemsHasMore && (
          <Button size="sm" variant="secondary" onClick={() => fetchItems(itemsOffset, true)} disabled={itemsLoading}>
            {itemsLoading ? "Loading…" : "Load more"}
          </Button>
        )}
        {itemsLoading && !itemsHasMore && <span className="text-xs text-gray-500">Loading…</span>}
      </div>
    </div>
  );
}

@@ -58,6 +58,124 @@ function LockedInput({ locked, ...props }: { locked: boolean } & React.InputHTML
// (LockedInput) already signals when a value is env-controlled, the badge
// was duplicate noise.

// ─── Secret input (password-masked with eye-icon reveal) ──────────────────────

function EyeIcon({ open }: { open: boolean }) {
  // GNOME-style eye / crossed-eye glyphs as inline SVG so they inherit
  // currentColor instead of fighting emoji rendering across OSes.
  if (open) {
    return (
      <svg
        xmlns="http://www.w3.org/2000/svg"
        width="16"
        height="16"
        viewBox="0 0 24 24"
        fill="none"
        stroke="currentColor"
        strokeWidth="2"
        strokeLinecap="round"
        strokeLinejoin="round"
        aria-hidden="true"
      >
        <path d="M17.94 17.94A10.07 10.07 0 0 1 12 20c-7 0-11-8-11-8a18.45 18.45 0 0 1 5.06-5.94" />
        <path d="M9.9 4.24A9.12 9.12 0 0 1 12 4c7 0 11 8 11 8a18.5 18.5 0 0 1-2.16 3.19" />
        <path d="M14.12 14.12a3 3 0 1 1-4.24-4.24" />
        <line x1="1" y1="1" x2="23" y2="23" />
      </svg>
    );
  }
  return (
    <svg
      xmlns="http://www.w3.org/2000/svg"
      width="16"
      height="16"
      viewBox="0 0 24 24"
      fill="none"
      stroke="currentColor"
      strokeWidth="2"
      strokeLinecap="round"
      strokeLinejoin="round"
      aria-hidden="true"
    >
      <path d="M1 12s4-8 11-8 11 8 11 8-4 8-11 8-11-8-11-8z" />
      <circle cx="12" cy="12" r="3" />
    </svg>
  );
}

/**
 * Input for API keys / passwords. Shows "***" masked when the server returns
 * a secret value (the raw key never reaches this component by default). Eye
 * icon fetches the real value via /api/settings/reveal and shows it. Users
 * can also type a new value directly — any edit clears the masked state.
 */
function SecretInput({
  configKey,
  locked,
  value,
  onChange,
  placeholder,
  className,
}: {
  configKey: string;
  locked: boolean;
  value: string;
  onChange: (next: string) => void;
  placeholder?: string;
  className?: string;
}) {
  const [revealed, setRevealed] = useState(false);
  const isMasked = value === "***";

  const toggle = async () => {
    if (revealed) {
      setRevealed(false);
      return;
    }
    if (isMasked) {
      try {
        const res = await api.get<{ value: string }>(`/api/settings/reveal?key=${encodeURIComponent(configKey)}`);
        onChange(res.value);
      } catch {
        /* ignore — keep masked */
      }
    }
    setRevealed(true);
  };

  return (
    <div className={`relative ${className ?? ""}`}>
      <Input
        type={revealed ? "text" : "password"}
        value={value}
        disabled={locked}
        onChange={(e) => onChange(e.target.value)}
        placeholder={placeholder}
        className="pr-9"
      />
      {locked ? (
        <span
          className="absolute inset-y-0 right-0 flex items-center pr-2.5 text-[0.9rem] opacity-40 pointer-events-none select-none"
          title="Set via environment variable — edit your .env file to change this value"
        >
          🔒
        </span>
      ) : (
        <button
          type="button"
          onClick={toggle}
          tabIndex={-1}
          className="absolute inset-y-0 right-0 flex items-center px-2.5 text-gray-400 hover:text-gray-700 focus:outline-none focus-visible:text-gray-700"
          title={revealed ? "Hide" : "Reveal"}
          aria-label={revealed ? "Hide secret" : "Reveal secret"}
        >
          <EyeIcon open={revealed} />
        </button>
      )}
    </div>
  );
}

// ─── Section card ──────────────────────────────────────────────────────────────

function SectionCard({
@@ -227,17 +345,18 @@ function ConnSection({
          value={url}
          onChange={(e) => setUrl(e.target.value)}
          placeholder={urlPlaceholder}
          className="mt-0.5 max-w-sm"
          className="mt-0.5"
        />
      </label>
      <label className="block text-sm text-gray-700 mb-1 mt-3">
        API Key
        <LockedInput
        <SecretInput
          configKey={apiKeyProp}
          locked={locked.has(apiKeyProp)}
          value={key}
          onChange={(e) => setKey(e.target.value)}
          onChange={setKey}
          placeholder="your-api-key"
          className="mt-0.5 max-w-xs"
          className="mt-0.5"
        />
      </label>
      <div className="flex items-center gap-2 mt-3">

@@ -65,11 +65,10 @@ function RootLayout() {
        <VersionBadge />
        <div className="flex flex-wrap items-center gap-0.5">
          <NavLink to="/" exact>
            Scan
            Library
          </NavLink>
          <NavLink to="/pipeline">Pipeline</NavLink>
          <NavLink to="/review/subtitles">Subtitles</NavLink>
          <NavLink to="/execute">Jobs</NavLink>
        </div>
        <div className="flex-1" />
        <div className="flex items-center gap-0.5">

@@ -1,10 +0,0 @@
|
||||
import { createFileRoute } from "@tanstack/react-router";
|
||||
import { z } from "zod";
|
||||
import { ExecutePage } from "~/features/execute/ExecutePage";
|
||||
|
||||
export const Route = createFileRoute("/execute")({
|
||||
validateSearch: z.object({
|
||||
filter: z.enum(["all", "pending", "running", "done", "error"]).default("pending"),
|
||||
}),
|
||||
component: ExecutePage,
|
||||
});
|
||||
@@ -160,11 +160,32 @@ export interface PipelineJobItem {
}

export interface PipelineData {
  review: PipelineReviewItem[];
  reviewTotal: number;
  reviewItemsTotal: number;
  queued: PipelineJobItem[];
  processing: PipelineJobItem[];
  done: PipelineJobItem[];
  doneCount: number;
  jellyfinUrl: string;
}

// ─── Review groups (GET /api/review/groups) ──────────────────────────────────

export type ReviewGroup =
  | { kind: "movie"; item: PipelineReviewItem }
  | {
      kind: "series";
      seriesKey: string;
      seriesName: string;
      seriesJellyfinId: string | null;
      episodeCount: number;
      minConfidence: "high" | "low";
      originalLanguage: string | null;
      seasons: Array<{ season: number | null; episodes: PipelineReviewItem[] }>;
    };

export interface ReviewGroupsResponse {
  groups: ReviewGroup[];
  totalGroups: number;
  totalItems: number;
  hasMore: boolean;
}