Commit Graph

74 Commits

23dca8bf0b split scheduling into scan + process windows, move controls to settings page
Some checks failed
Build and Push Docker Image / build (push) Failing after 8s
the old one-window scheduler gated only the job queue. now the scan loop and
the processing queue have independent windows — useful when the container
runs as an always-on service and we only want to hammer jellyfin + ffmpeg
at night.

config keys renamed from schedule_* to scan_schedule_* / process_schedule_*,
plus the existing job_sleep_seconds. scheduler.ts exposes parallel helpers
(isInScanWindow / isInProcessWindow, waitForScanWindow / waitForProcessWindow)
so each caller picks its window without cross-contamination.
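The split can be sketched as two pure window checks. This is a minimal sketch: the "HH:MM" window format, the midnight wrap, and the inclusive end minute are assumptions for illustration, not lifted from scheduler.ts.

```typescript
// Minimal sketch of independent schedule windows. The "HH:MM-HH:MM" shape
// and the inclusive end minute are illustrative assumptions.
type Window = { start: string; end: string };

function toMinutes(hhmm: string): number {
  const [h, m] = hhmm.split(":").map(Number);
  return h * 60 + m;
}

// A window may wrap past midnight (e.g. 22:00-06:00). End minute inclusive.
function inWindow(now: Date, w: Window): boolean {
  const cur = now.getHours() * 60 + now.getMinutes();
  const start = toMinutes(w.start);
  const end = toMinutes(w.end);
  if (start <= end) return cur >= start && cur <= end;
  return cur >= start || cur <= end; // wraps midnight
}

// Each caller picks its own window — no cross-contamination.
const isInScanWindow = (now: Date, scan: Window) => inWindow(now, scan);
const isInProcessWindow = (now: Date, proc: Window) => inWindow(now, proc);
```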

scan.ts checks the scan window between items and emits paused/resumed sse.
execute.ts keeps its per-job pause + sleep-between-jobs but now on the
process window. /api/execute/scheduler moved to /api/settings/schedule.

frontend: ScheduleControls popup deleted from the pipeline header, replaced
with a plain Start queue button. settings page grows a Schedule section with
both windows and the job sleep input.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-13 14:50:25 +02:00
6fcaeca82c write canonical iso3 language metadata, tighten is_noop, store full jellyfin data
Some checks failed
Build and Push Docker Image / build (push) Failing after 16s
ffmpeg now writes -metadata:s:a:i language=<iso3> on every kept audio track so
files end up with canonical 3-letter tags (en → eng, ger → deu, null → und).
analyzer passes stream.profile (not title) to transcodeTarget so lossless
dts-hd ma in mkv correctly targets flac. is_noop also checks og-is-default and
canonical-language so pipeline-would-change-it cases stop showing as done.

normalizeLanguage gains 2→3 mapping, and mapStream no longer normalizes at
ingest so the raw jellyfin tag survives for the canonical check.

per-item scan work runs in a single db.transaction for large sqlite speedups,
extracted into server/services/rescan.ts so execute.ts can reuse it.

on successful job, execute calls jellyfin /Items/{id}/Refresh, waits for
DateLastRefreshed to change, refetches the item, and upserts it through the
same pipeline; plan flips to done iff the fresh streams satisfy is_noop.
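The wait-for-DateLastRefreshed step is a generic poll-until-changed loop. A sketch with an injected getter — the interval and timeout values are illustrative, not from the codebase:

```typescript
// Poll `get` until its value differs from `previous`, or give up.
// Interval/timeout defaults are assumptions for illustration.
async function waitForChange<T>(
  get: () => Promise<T>,
  previous: T,
  { intervalMs = 1000, timeoutMs = 30000 } = {},
): Promise<T | null> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const current = await get();
    if (current !== previous) return current; // e.g. DateLastRefreshed moved
    await new Promise((r) => setTimeout(r, intervalMs));
  }
  return null; // refresh never landed — caller keeps the stale item
}
```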

schema wiped + rewritten to carry jellyfin_raw, external_raw, profile,
bit_depth, date_last_refreshed, runtime_ticks, original_title, last_executed_at
— so future scans aren't required to stay correct. user must drop data/*.db.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-13 13:56:19 +02:00
cc418e5874 fix: jellyfin save now matches the new { ok, saved, testError } response shape
All checks were successful
Build and Push Docker Image / build (push) Successful in 31s
When I switched the settings UI to read result.saved to decide whether the
'✓ Saved & connected' / '⚠ Saved, but connection test failed' / '✗ error'
states should appear, I only updated the Radarr and Sonarr endpoints to
return that shape. Jellyfin still returned bare { ok: true } so the UI
saw saved=undefined and showed '✗ Save failed' even on a perfectly
successful save — making it look like Jellyfin had stopped working.

Bring Jellyfin in line:
- Save the URL+API key (and setup_complete) BEFORE running testConnection
  so the input survives a failed probe (same fix as Radarr/Sonarr).
- Only do the admin-user discovery on test success.
- Return { ok, saved, testError }.
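The shared save flow can be sketched as follows — setConfig and testConnection here are stand-ins for the real service calls, injected so the sketch is self-contained:

```typescript
// Sketch of the unified save flow: persist first, test second.
// setConfig/testConnection are hypothetical stand-ins, not the real APIs.
type SaveResult = { ok: boolean; saved: boolean; testError?: string };

async function saveService(
  url: string,
  apiKey: string,
  setConfig: (url: string, key: string) => void,
  testConnection: (url: string, key: string) => Promise<void>,
): Promise<SaveResult> {
  if (!url || !apiKey) return { ok: false, saved: false, testError: "missing url or api key" };
  setConfig(url, apiKey); // save BEFORE testing so the input survives a failed probe
  try {
    await testConnection(url, apiKey);
    return { ok: true, saved: true };
  } catch (err) {
    return { ok: false, saved: true, testError: String(err) };
  }
}
```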
2026-04-13 12:33:26 +02:00
94a460be9d rename setup → settings throughout; persist arr creds even on test failure
All checks were successful
Build and Push Docker Image / build (push) Successful in 36s
Two cleanups:

1. Rename the page from 'Setup' to 'Settings' all the way down. The H1
   already said Settings; the file/component/api paths were lying.
   - src/features/setup/ → src/features/settings/
   - SetupPage.tsx → SettingsPage.tsx, SetupPage → SettingsPage,
     SetupData → SettingsData, setupCache → settingsCache
   - server/api/setup.ts → server/api/settings.ts
   - /api/setup → /api/settings (only consumer is our frontend)
   - server/index.tsx import + route mount renamed
   - ScanPage's local setupChecked → configChecked

2. Sonarr (and Radarr) save flow: persist the values BEFORE running the
   connection test. The previous code returned early if the test failed,
   silently dropping what the user typed — which explained the user's report
   that Sonarr 'forgets' the input. Now setConfig fires unconditionally
   on a valid (non-empty) URL+key; the test result is returned as
   { ok, saved, testError } so the UI can show 'Saved & connected' on
   success or '⚠ Saved, but connection test failed: …' on failure
   instead of erasing the input.

Note: setup_complete config key kept as-is — it represents 'has the user
configured Jellyfin' which is conceptually setup and not user-visible.
2026-04-13 12:26:30 +02:00
e8f33c6224 consolidate dashboard into scan page; / now renders Scan
All checks were successful
Build and Push Docker Image / build (push) Successful in 45s
Single landing page: stats grid up top, then scan controls + progress,
then recent items log. Drops the 'click → bounce to Scan' indirection.

- ScanPage pulls /api/dashboard for the stats card grid; refetches when
  a scan completes so totals reflect the new state
- Scan page also owns the setup-complete redirect to /settings (was on
  Dashboard) and the empty-library 'click Start Scan' nudge
- / route now renders ScanPage; /scan route deleted
- DashboardPage and its feature dir gone
- Nav: drop 'Dashboard', repoint 'Scan' to /
2026-04-13 12:21:03 +02:00
962b5efc6f settings: drop the section-header env-var lock badges
All checks were successful
Build and Push Docker Image / build (push) Successful in 34s
The per-input LockedInput already shows a 🔒 inside any field that's
controlled by an env var, with a tooltip pointing at the .env file. The
extra '🔒 JELLYFIN_URL' badge in the section title was duplicate signal —
remove it. Drop EnvBadge entirely; section titles go back to plain text
('Jellyfin', 'Radarr (optional)', etc.).
2026-04-13 12:14:00 +02:00
b8525be015 scan: validate arr URLs upfront, cache library once per scan
All checks were successful
Build and Push Docker Image / build (push) Successful in 30s
Two regressions from the radarr/sonarr fix:

1. ERR_INVALID_URL spam — when radarr_enabled='1' but radarr_url is empty
   or not http(s), every per-item fetch threw TypeError. We caught it but
   still ate the cost (and the log noise) on every movie. New isUsable()
   check on each service: enabled-in-config but URL doesn't parse →
   warn ONCE and skip arr lookups for the whole scan.

2. Per-item HTTP storm — for movies not in Radarr's library we used to
   hit /api/v3/movie (the WHOLE library) again per item, then two
   metadata-lookup calls. With 2000 items that's thousands of extra
   round-trips and the scan crawled. Now: pre-load the Radarr/Sonarr
   library once into Map<tmdbId,..>+Map<imdbId,..>+Map<tvdbId,..>,
   per-item lookups are O(1) memory hits, and only the genuinely-missing
   items make a single lookup-endpoint HTTP call.

The startup line now reports the library size:
  External language sources: radarr=enabled (https://..., 1287 movies in library), sonarr=...
so you can immediately see whether the cache loaded.
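Both fixes fit in a few lines. A sketch — the Movie shape is trimmed to the relevant Radarr resource fields, and the guard semantics follow the commit's description:

```typescript
// Sketch of the once-per-scan library preload. tmdbId/imdbId match
// Radarr's movie resource; the Movie shape here is deliberately trimmed.
type Movie = { tmdbId?: number; imdbId?: string };

function indexLibrary(movies: Movie[]) {
  const byTmdb = new Map<number, Movie>();
  const byImdb = new Map<string, Movie>();
  for (const m of movies) {
    if (m.tmdbId) byTmdb.set(m.tmdbId, m);
    if (m.imdbId) byImdb.set(m.imdbId, m);
  }
  return { byTmdb, byImdb }; // per-item lookups become O(1) memory hits
}

// Guard against enabled-but-unusable config: URL must parse as http(s).
function isUsable(enabled: boolean, url: string): boolean {
  if (!enabled) return false;
  try {
    const u = new URL(url);
    return u.protocol === "http:" || u.protocol === "https:";
  } catch {
    return false; // warn once, skip arr lookups for the whole scan
  }
}
```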
2026-04-13 12:06:17 +02:00
1aafcb4972 apply codex code review: fix useEffect refetch loops, dead routes, subtitle job_type leftovers
All checks were successful
Build and Push Docker Image / build (push) Successful in 36s
All ack'd as real bugs:

frontend
- AudioDetailPage / SubtitleDetailPage / PathsPage / ScanPage /
  SubtitleListPage / ExecutePage: load() was a fresh function reference
  every render, so 'useEffect(() => load(), [load])' refetched on every
  render. Wrap each in useCallback with the right deps ([id], [filter],
  or []).
- SetupPage: langsLoaded was useState; setting it inside load() retriggered
  the same effect → infinite loop. Switch to useRef. Also wrap saveJellyfin/
  Radarr/Sonarr in async fns so they return Promise<void> (matches the
  consumer signatures, fixes the latent TS error).
- DashboardPage: redirect target /setup doesn't exist; the route is
  /settings.
- ExecutePage: <>...</> fragment with two <tr> children had keys on the
  rows but not on the fragment → React reconciliation warning. Use
  <Fragment key>. jobTypeLabel + badge variant still branched on the
  removed 'subtitle' job_type — relabel to 'Audio Transcode' / 'Audio
  Remux' and use 'manual'/'noop' variants.

server
- review.ts + scan.ts: parseLanguageList helper catches JSON errors and
  enforces array-of-strings shape with a fallback. A corrupted config
  row would otherwise throw mid-scan.
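A helper like that amounts to a defensive JSON parse; a minimal sketch, with the fallback behavior described above:

```typescript
// Sketch of a defensive config parser: JSON errors and wrong shapes fall
// back to a default instead of throwing mid-scan.
function parseLanguageList(raw: string | null, fallback: string[] = []): string[] {
  if (!raw) return fallback;
  try {
    const parsed = JSON.parse(raw);
    if (Array.isArray(parsed) && parsed.every((x) => typeof x === "string")) {
      return parsed;
    }
    return fallback; // valid JSON, wrong shape
  } catch {
    return fallback; // corrupted config row
  }
}
```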
2026-04-13 12:01:57 +02:00
cafb3852a1 radarr/sonarr: stop silent failures, add metadata lookup fallback, diagnostic logs
All checks were successful
Build and Push Docker Image / build (push) Successful in 25s
The real reason 8 Mile landed as Turkish: Radarr WAS being called, but the
call path had three silent failure modes that all looked identical from
outside.

1. try { … } catch { return null } swallowed every error. No log when
   Radarr was unreachable, when the API key was wrong, when HTTP returned
   404/500, or when JSON parsing failed. A miss and a crash looked the
   same: null, fall back to Jellyfin's dub guess.

2. /api/v3/movie?tmdbId=X only queries Radarr's LIBRARY. If the movie is
   on disk + in Jellyfin but not actively managed in Radarr, the call returns [].
   We then gave up and used the Jellyfin guess.

3. iso6391To6392 fell back to normalizeLanguage(name.slice(0, 3)) for any
   unknown language name — pretending 'Mandarin' → 'man' and 'Flemish' →
   'fle' are valid ISO 639-2 codes.

Fixes:
- Both services: fetchJson helper logs HTTP errors with context and the
  url (api key redacted), plus catches+logs thrown errors.
- Added a metadata-lookup fallback: /api/v3/movie/lookup/tmdb and
  /lookup/imdb for Radarr, /api/v3/series/lookup?term=tvdb:X for Sonarr.
  These hit TMDB/TVDB via the arr service for titles not in its library.
- Expanded NAME_TO_639_2: Mandarin/Cantonese → zho, Flemish → nld,
  Farsi → fas, plus common European langs that were missing.
- Unknown name → return null (log a warning) instead of a made-up 3-char
  code. scan.ts then marks needs_review.
- scan.ts: per-item warn when Radarr/Sonarr miss; per-scan summary line
  showing hits/misses/no-provider-id tallies.

Run a scan — the logs will now tell you whether Radarr was called, what
it answered, and why it fell back if it did.
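The strict name mapping can be sketched like so — the table here is a small excerpt of NAME_TO_639_2, and the key point is the null return for unknown names:

```typescript
// Sketch of the strict name→ISO 639-2 mapping: unknown names return null
// (caller marks needs_review) instead of a fabricated 3-char code.
// This table is a small excerpt for illustration.
const NAME_TO_639_2: Record<string, string> = {
  mandarin: "zho",
  cantonese: "zho",
  flemish: "nld",
  farsi: "fas",
  english: "eng",
};

function languageNameTo6392(name: string): string | null {
  return NAME_TO_639_2[name.toLowerCase().trim()] ?? null;
}
```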
2026-04-13 11:46:26 +02:00
50d3e50280 fix '8 Mile is Turkish': jellyfin guesses never earn high confidence
All checks were successful
Build and Push Docker Image / build (push) Successful in 28s
Two bugs compounded:

1. extractOriginalLanguage() in jellyfin.ts picked the FIRST audio stream's
   language and called it 'original'. Files sourced from non-English regions
   often have a local dub as track 0, so 8 Mile with a Turkish dub first
   got labelled Turkish.

2. scan.ts promoted any single-source answer to confidence='high' — even
   the pure Jellyfin guess, as long as no second source (Radarr/Sonarr)
   contradicted it. Jellyfin's dub-magnet guess should never be green.

Fixes:
- extractOriginalLanguage now prefers the IsDefault audio track and skips
  tracks whose title shouts 'dub' / 'commentary' / 'director'. Still a
  heuristic, but much less wrong. Fallback to the first track when every
  candidate looks like a dub so we have *something* to flag.
- scan.ts: high confidence requires an authoritative source (Radarr/Sonarr)
  with no conflict. A Jellyfin-only answer is always low confidence AND
  gets needs_review=1 so it surfaces in the pipeline for manual override.
- Data migration (idempotent): downgrade existing plans backed only by the
  Jellyfin heuristic to low confidence and mark needs_review=1, so users
  don't have to rescan to benefit.
- New server/services/__tests__/jellyfin.test.ts covers the default-track
  preference and dub-skip behavior.
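The new heuristic fits in a few lines. A sketch — the dub-detection regex is an assumption matching the commit's description of titles that "shout 'dub' / 'commentary' / 'director'":

```typescript
// Sketch of the default-track preference: skip tracks whose title looks
// like a dub/commentary, prefer IsDefault, fall back to track 0.
type AudioStream = { Language?: string; IsDefault?: boolean; Title?: string };

const DUB_RE = /\b(dub|commentary|director)/i;

function extractOriginalLanguage(audio: AudioStream[]): string | null {
  const candidates = audio.filter((s) => !DUB_RE.test(s.Title ?? ""));
  const pool = candidates.length > 0 ? candidates : audio; // everything looks dubbed
  const preferred = pool.find((s) => s.IsDefault) ?? pool[0];
  return preferred?.Language ?? null;
}
```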
2026-04-13 11:39:59 +02:00
e3b241bef3 drop audio list tab, move per-item actions onto pipeline cards
All checks were successful
Build and Push Docker Image / build (push) Successful in 39s
The pipeline tab fully replaces the audio list: same items, better
workflow. What the old list contributed (per-item details + skip/approve)
now lives inline on each pipeline card.

- delete src/routes/review/audio/index.tsx + src/features/review/AudioListPage.tsx
- /review/ now redirects to /pipeline (was /review/audio, which no longer exists)
- AudioDetailPage back link goes to /pipeline
- nav: drop the Audio link
- PipelineCard: three buttons on every card — Details (TanStack Link to
  /review/audio/$id — the detail route stays, it's how you drill in),
  Skip (POST /api/review/:id/skip), Approve (POST /api/review/:id/approve).
  Remove the old 'Approve up to here' button (it was computing against
  frontend ordering we don't want to maintain, and it was broken).
- SeriesCard: drop onApproveUpTo, pass new approve/skip handlers through
  to each expanded episode card
- server: remove now-unused POST /api/review/approve-batch (no callers)
2026-04-13 11:20:57 +02:00
d12dd80209 fix: buildCommand now extracts subtitles to sidecars before stripping them
All checks were successful
Build and Push Docker Image / build (push) Successful in 1m12s
Bug: every approve path (buildCommand used by review approve/approve-all/
series approve-all/season approve-all/retry/detail preview) was building
an ffmpeg command that -map'd only the 'keep' streams and dropped all
subtitles. For a file like Wuthering Heights with 37 embedded subs, the
run would delete every sub into the void — user expected extraction to
sidecar files per the pipeline contract.

buildPipelineCommand already did the right thing (extract every subtitle
with -map 0:s:N -c:s copy 'basename.lang.srt', then remux kept streams)
but it was only reached by tests. buildCommand now delegates to it — one
call site, subtitle extraction always runs, predictExtractedFiles records
the sidecar paths after job success (same logic, same basePath).

Added a regression test: buildCommand on a 2-subtitle file contains
-map 0:s:0, -map 0:s:1 and the expected 'basename.en.srt'/'.de.srt' paths.
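The per-subtitle mappings that extraction emits can be sketched as an argument builder (the helper name is hypothetical; the `-map 0:s:N -c:s copy basename.lang.srt` triple per sub is from the commit):

```typescript
// Sketch of the per-subtitle extraction args: one
// "-map 0:s:N -c:s copy basename.lang.srt" group per embedded subtitle.
function subtitleExtractArgs(basePath: string, langs: string[]): string[] {
  return langs.flatMap((lang, i) => [
    "-map", `0:s:${i}`,       // type-relative subtitle index
    "-c:s", "copy",           // no re-encode, straight extraction
    `${basePath}.${lang}.srt`,
  ]);
}
```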
2026-04-13 11:12:10 +02:00
5fa39aee7c processing card: meaningful progress display
All checks were successful
Build and Push Docker Image / build (push) Successful in 1m42s
Two issues with the old bar:
1. progress state was never cleared between jobs — when a job finished,
   its 100% bar lingered on the next job's card until that job emitted
   its first progress event. Clear progress on any job_update where
   status != 'running', and on the column side ignore progress unless
   progress.id matches the current job.id.
2. labels were misleading: the left/right times were ffmpeg's *input*
   timestamp position (how far into the source it had read), not wall-
   clock elapsed/remaining. For -c copy jobs ripping a 90-min file in
   5 wall-seconds, the user saw '0:45 / 90:00' jump straight to
   '90:00 / 90:00' which looks broken.

New display: 'elapsed M:SS  N%  ~M:SS left'. Elapsed is wall-clock
since the job started (re-renders every second), percent comes from
ffmpeg input progress as before, ETA is derived from elapsed × (100-p)/p
once we have at least 1% to avoid wild guesses.
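The ETA math in isolation (function names are hypothetical):

```typescript
// Sketch of the ETA derivation: wall-clock elapsed scaled by the remaining
// percent. Below 1% return null to avoid wild guesses.
function etaSeconds(elapsedSec: number, percent: number): number | null {
  if (percent < 1) return null;
  return (elapsedSec * (100 - percent)) / percent;
}

// "M:SS" formatting for the 'elapsed M:SS … ~M:SS left' display.
function formatMSS(sec: number): string {
  const m = Math.floor(sec / 60);
  const s = Math.round(sec % 60);
  return `${m}:${String(s).padStart(2, "0")}`;
}
```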
2026-04-13 10:29:49 +02:00
37e30e9ade processing column: per-card stop button alongside the column-header one
All checks were successful
Build and Push Docker Image / build (push) Successful in 37s
2026-04-13 10:26:20 +02:00
2ada728e50 fix approve-up-to: client sends explicit visible plan id list
All checks were successful
Build and Push Docker Image / build (push) Successful in 1m12s
The server's old /approve-up-to/:id re-ran its own SQL ORDER BY against
ALL pending plans (no LIMIT) to decide which rows fell 'before' the target.
The pipeline UI uses a different ordering — interleaving movies with
series cards, sorting by confidence tier without a name tiebreaker, and
collapsing every episode of a series into one card. Visible position
therefore did not map to the server's iteration position, and clicking
'Approve up to here' could approve far more (or different) items than
the user expected.

- replace POST /approve-up-to/:id with POST /approve-batch { planIds: [...] }
  — server only approves the plans the client lists, idempotent: skips
  ids that are no longer pending, were already approved, or are noop
- ReviewColumn now builds visiblePlanIds in actual render order
  (each movie's id, then every episode id of each series in series order)
  and 'approve up to here' on any card sends slice(0, idx+1) of that list
- works the same for both PipelineCard (movie) and SeriesCard (whole series
  through its last episode)
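The render-order id list and the slice can be sketched as follows (the Card shape is hypothetical, reduced to what the ordering needs):

```typescript
// Sketch of visiblePlanIds in actual render order: each movie's id, then
// every episode id of each series card, in series order.
type Card =
  | { kind: "movie"; planId: number }
  | { kind: "series"; episodePlanIds: number[] };

function visiblePlanIds(cards: Card[]): number[] {
  return cards.flatMap((c) => (c.kind === "movie" ? [c.planId] : c.episodePlanIds));
}

// "Approve up to here" on card idx sends every id rendered at or before it.
function approveUpTo(cards: Card[], idx: number): number[] {
  return visiblePlanIds(cards.slice(0, idx + 1));
}
```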
2026-04-13 10:16:58 +02:00
4a378eb833 pipeline: equal-width columns + per-column clear/stop button
All checks were successful
Build and Push Docker Image / build (push) Successful in 39s
Extract a ColumnShell component so all four columns share the same flex-1
basis-0 width (no more 24/16/18/16 rem mix) and the same header layout
(title + count + optional action button on the right).

Per-column actions:
- Review:     'Skip all' → POST /api/review/skip-all (new endpoint, sets all
              pending non-noop plans to skipped in one update)
- Queued:     'Clear'    → POST /api/execute/clear (existing; cancels pending jobs)
- Processing: 'Stop'     → POST /api/execute/stop (new; SIGTERMs the running
              ffmpeg via a tracked Bun.spawn handle, runJob's catch path
              marks the job error and cleans up)
- Done:       'Clear'    → POST /api/execute/clear-completed (existing)

All destructive actions confirm before firing.
2026-04-13 10:08:42 +02:00
ec28e43484 make pipeline responsive at scale: cap review query, debounce sse reload, indexable done count
All checks were successful
Build and Push Docker Image / build (push) Successful in 37s
The pipeline endpoint returned every pending plan (no LIMIT) while the audio
list capped at 500 — that alone was the main lag. SSE compounded it: every
job_update (which fires per line of running ffmpeg output) re-ran the entire
endpoint and re-rendered every card.

- review query: LIMIT 500 + a separate COUNT for reviewTotal; column header
  shows 'X of Y' and a footer 'Showing first X of Y. Approve some to see
  the rest' when truncated
- doneCount: split the OR-form into two indexable counts (is_noop + done&!noop),
  added together — uses idx_review_plans_is_noop and idx_review_plans_status
  instead of full scan
- pipeline page: 1s debounce on SSE-triggered reload so a burst of
  job_update events collapses into one refetch
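A trailing debounce like the one described — a generic sketch, not the page's actual hook:

```typescript
// Trailing debounce: a burst of SSE job_update events collapses into one
// refetch after the burst quiets down for `ms` milliseconds.
function debounce<A extends unknown[]>(fn: (...args: A) => void, ms: number) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: A) => {
    clearTimeout(timer);            // restart the window on every event
    timer = setTimeout(() => fn(...args), ms);
  };
}
```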
2026-04-13 10:00:08 +02:00
9ee0dd445f remove standalone subtitle extract, unify done semantics, fix nav active matching
All checks were successful
Build and Push Docker Image / build (push) Successful in 49s
Subtitle extraction lives only in the pipeline now; a file is 'done' when it
matches the desired end state — no embedded subs AND audio matches the
language config. The separate Extract page was redundant.

- delete src/routes/review/subtitles/extract.tsx + SubtitleExtractPage
- delete /api/subtitles/extract-all + /:id/extract endpoints
- delete buildExtractOnlyCommand + unused buildExtractionOutputs from ffmpeg.ts
- detail page: drop Extract button + extractCommand textarea, replace with
  'will be extracted via pipeline' note when embedded subs present
- pipeline endpoint: doneCount = is_noop OR status='done' (a file in the
  desired state, however it got there); UI label 'N files in desired state'
- nav: drop the now-defunct 'Extract subs' link, default activeOptions.exact
  to false so detail subpages (e.g. /review/audio/123) highlight their
  parent ('Audio') in the menu — was the cause of the broken-feeling menu
2026-04-13 09:41:46 +02:00
cc19d99292 surface all app routes in nav
All checks were successful
Build and Push Docker Image / build (push) Successful in 29s
Nav only exposed a subset; Dashboard, Audio review, Subtitle extract, Jobs,
and Paths were reachable only via URL. Add links for every top-level route:

- left: Dashboard, Scan, Pipeline, Audio, Extract subs, Subtitle mgr, Jobs
- right: Paths, Settings

Split the two subtitle pages explicitly (Extract subs = per-item extraction
queue, Subtitle mgr = language summary + title harmonization) so their
distinct purpose is visible from the nav instead of hidden under one label.
2026-04-13 08:25:09 +02:00
bb4016d05b gitignore scheduled_tasks.lock (scheduled wakeup state)
All checks were successful
Build and Push Docker Image / build (push) Successful in 1m13s
2026-04-13 08:18:08 +02:00
e4c771d39e fix scan page tdz crash: break flush/stopFlushing useCallback cycle
Some checks failed
Build and Push Docker Image / build (push) Has been cancelled
Prod minified bundle crashed with 'can't access lexical declaration 'o'
before initialization' because flush was memoized with stopFlushing in its
deps, and stopFlushing was memoized with flush in its deps — circular.
In dev this still worked (refs paper over TDZ), but Vite's minifier emitted
the declarations in an order that tripped the temporal dead zone.

Extract the interval-clearing into a plain inline helper (clearFlushTimer)
that both flush and stopFlushing call. flush no longer depends on
stopFlushing; the cycle is gone.
2026-04-13 08:17:57 +02:00
3c1c8dd8f0 drop docker/* gitea actions to avoid 2+ min git-clone overhead per run
All checks were successful
Build and Push Docker Image / build (push) Successful in 3m6s
Gitea's act runner re-clones every referenced GitHub action on every build
because it has no action cache. docker/setup-buildx-action alone was taking
~2 minutes to clone before the build even started.

buildx is already bundled in gitea/runner-images:ubuntu-latest, so call
'docker buildx build --push' directly with --cache-from/--cache-to pointing
at a registry buildcache tag. Keeps the layer caching benefit, skips the
action-clone tax entirely.
2026-04-13 08:05:50 +02:00
b04c8acc39 speed up docker build: bun everywhere, buildx layer cache, tighter dockerignore
Some checks failed
Build and Push Docker Image / build (push) Has been cancelled
Root cause of 6+ min builds: Dockerfile stage 1 ran 'npm install' with no
package-lock.json, so every build re-resolved + re-fetched the full npm tree
from scratch on a fresh runner.

- Dockerfile: replace node:22-slim+npm stage with oven/bun:1-slim; both
  stages now 'bun install --frozen-lockfile' against the tracked bun.lock;
  --mount=type=cache for the bun install cache
- workflow: switch to docker/build-push-action with registry buildcache
  (cache-from + cache-to) so layers persist across runs
- dockerignore: add .worktrees, docs, tests, tsbuildinfo so the build context
  ships less
2026-04-13 08:00:19 +02:00
9184c3991c gitignore tsbuildinfo (project references write this)
Some checks failed
Build and Push Docker Image / build (push) Has been cancelled
2026-04-13 07:51:33 +02:00
af410cb616 fix server typecheck: use tsconfig project references, await bun file in spa fallback
Some checks failed
Build and Push Docker Image / build (push) Has been cancelled
- split tsconfig.json into project references (client + server) so bun-types and DOM types don't leak into the other side; server now resolves Bun.* without diagnostics
- client tsconfig adds vite/client types so import.meta.env typechecks
- index.tsx spa fallback: use async/await + c.html(await …) instead of returning a Promise of a Response, which Hono's Handler type rejects
- subtitles normalize-titles: narrow canonical to string|null (Map.get widened to include undefined)
2026-04-13 07:51:10 +02:00
874f04b7a5 wire scheduler into queue, add retry, dev-reset cleanup, biome 2.4 migrate
- execute: actually call isInScheduleWindow/waitForWindow/sleepBetweenJobs in runSequential (they were dead code); emit queue_status SSE events (running/paused/sleeping/idle) so the pipeline's existing QueueStatus listener lights up
- review: POST /:id/retry resets an errored plan to approved, wipes old done/error jobs, rebuilds command from current decisions, queues fresh job
- scan: dev-mode DELETE now also wipes jobs + subtitle_files (previously orphaned after every dev reset)
- biome: migrate config to 2.4 schema, autoformat 68 files (strings + indentation), relax opinionated a11y/hooks-deps/index-key rules that don't fit this codebase
- routeTree.gen.ts regenerated after /nodes removal
2026-04-13 07:41:19 +02:00
f11861658e add bun:test coverage for analyzer + ffmpeg + validate, emit ffmpeg progress sse
- analyzer.test.ts: audio keep rules (OG + configured langs, unknown OG, undetermined lang, iso alias), ordering (OG first, reorder noop), subtitle forced-remove, transcode targets
- ffmpeg.test.ts: shellQuote, sortKeptStreams canonical order, buildCommand tmp+mv, type-relative maps (0:a:N), disposition, buildPipelineCommand sub extraction + transcode bitrate, predictExtractedFiles dedup
- validate.test.ts: parseId bounds + isOneOf narrowing
- execute: parse ffmpeg Duration + time, emit job_progress SSE events throttled at 500ms so ProcessingColumn progress bar fills in (it already listened)
- package: switch test script from placeholder echo to 'bun test'
2026-04-13 07:35:24 +02:00
93ed0ac33c fix analyzer + api boundary + perf + scheduler hardening
- analyzer: rewrite checkAudioOrderChanged to compare actual output order, unify assignTargetOrder with a shared sortKeptStreams util in ffmpeg builder
- review: recompute is_noop via full audio removed/reordered/transcode/subs check on toggle, preserve custom_title across rescan by matching (type,lang,stream_index,title), batch pipeline transcode-reasons query to avoid N+1
- validate: add lib/validate.ts with parseId + isOneOf helpers; replace bare Number(c.req.param('id')) with 400 on invalid ids across review/subtitles
- scan: atomic CAS on scan_running config to prevent concurrent scans
- subtitles: path-traversal guard — only unlink sidecars within the media item's directory; log-and-orphan DB entries pointing outside
- schedule: include end minute in window (<= vs <)
- db: add indexes on review_plans(status,is_noop), stream_decisions(plan_id), media_items(series_jellyfin_id,series_name,type), media_streams(item_id,type), subtitle_files(item_id), jobs(status,item_id)
2026-04-13 07:31:48 +02:00
cdcb1ff706 drop multi-node ssh execution, unify job runner to local + fix job completion atomicity
- remove nodes table, ssh service, nodes api, NodesPage route
- execute.ts: local-only spawn, atomic CAS job claim via UPDATE status
- wrap job done + subtitle_files insert + review_plans status in db transaction
- stream ffmpeg output per line with 500ms throttled flush
- bump version to 2026.04.13
2026-04-13 07:25:19 +02:00
1762f070a9 pipeline UI polish: transcode reasons, scroll fix, series card overflow, rounded corners
All checks were successful
Build and Push Docker Image / build (push) Successful in 51s
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-28 10:35:51 +01:00
9c5a793a47 pipeline UI polish: jellyfin deep-links on titles, hover-to-show approve buttons, series approve-up-to
All checks were successful
Build and Push Docker Image / build (push) Successful in 37s
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-28 01:14:19 +01:00
7cefd9bf04 wire scan completion to pipeline page
All checks were successful
Build and Push Docker Image / build (push) Successful in 8m50s
After a scan completes, show a "Review in Pipeline →" link next to the
status label. Nav already included the Pipeline entry from a prior task.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 01:53:29 +01:00
3881f3a4c2 bump version to 2026.03.27 for unified pipeline release
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 01:53:07 +01:00
8bdfa79215 add pipeline Kanban board: route, layout, review/queue/processing/done columns, schedule controls
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 01:51:47 +01:00
fd72a6d212 add pipeline API: approve-up-to, series language, pipeline summary
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 01:49:14 +01:00
9cffdaac47 fix reanalyze: pass container to analyzeItem, store new pipeline fields 2026-03-27 01:47:40 +01:00
9a19350f7e add job scheduler: sleep between jobs, schedule window, FFmpeg progress parsing
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 01:46:41 +01:00
97e60dbfc5 add buildPipelineCommand: single FFmpeg command for sub extraction, audio cleanup, transcode
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 01:46:26 +01:00
ecb0732185 store confidence, apple_compat, job_type, transcode_codec during scan 2026-03-27 01:45:56 +01:00
b1cf0fca38 unify analyzer: 3-step pipeline with apple compat, transcode decisions, extended is_noop
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 01:44:22 +01:00
c2e5b70b02 add schema migrations for unified pipeline: confidence, apple_compat, job_type, transcode_codec
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 01:42:18 +01:00
c017ca09d4 add apple compatibility service: codec checks, transcode target mapping 2026-03-27 01:41:21 +01:00
6507924e45 add .worktrees/ to .gitignore 2026-03-27 01:39:06 +01:00
3f14b19195 remove green tint from action boxes, simplify execute empty state
All checks were successful
Build and Push Docker Image / build (push) Successful in 1m14s
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-06 21:51:30 +01:00
6363a133dd unify action box across all pages: consistent border/rounded style, green tint for "all good" states
All checks were successful
Build and Push Docker Image / build (push) Successful in 35s
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-06 17:16:57 +01:00
dd82318828 fix nav highlighting by using exact active matching on links
All checks were successful
Build and Push Docker Image / build (push) Successful in 31s
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-06 17:07:08 +01:00
d422b0a79b split subtitles tab into ST Extract (browse/extract items) and ST Manager (language summary, title harmonization)
All checks were successful
Build and Push Docker Image / build (push) Successful in 2m1s
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-06 17:01:50 +01:00
38b0faf55a add job_type column, simplify execute page: remove node/command columns, add type badge, make item title clickable
All checks were successful
Build and Push Docker Image / build (push) Successful in 37s
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 15:01:11 +01:00
2f10037e93 fix subtitle summary 404 by moving /summary route before /:id catch-all, bump version
All checks were successful
Build and Push Docker Image / build (push) Successful in 1m8s
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 13:46:00 +01:00
76d3b1acfb remove path mappings, add subtitle summary endpoint, cache setup page, bump version
All checks were successful
Build and Push Docker Image / build (push) Successful in 1m50s
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-05 12:02:26 +01:00