Turfi Platform Documentation
Official Turfi documentation portal for users, admins, and developers.
Media and Match Intelligence
Media, moments, highlights, and match intelligence systems that turn events into usable outputs.
Turfi does more than store scores; it turns game events into moments, highlights, and media context that can be surfaced across the platform. This document explains the systems that convert raw match data into playable and meaningful outputs. It fits into the architecture between competition records and media distribution. As automated detection, playback, and highlight workflows expand, this file should remain the reference for media and match intelligence behavior.
Shared status legend: [docs/_shared/status-legend.md](./_shared/status-legend.md)
Games, events, and media (truth model)
This stack implements Turfi’s role as a Consensus Engine for Sports Reality (see [Platform Philosophy and Scope](./platform-philosophy-and-scope.md)): evidence and inputs converge into canonical records and derived outputs—there is no assumption of a single monopoly feed.
- Games are the primary match anchor: schedule, score fields, competition and venue linkage, and lifecycle state.
- Media is evidence, not decoration — uploaded video, clips, and highlights are primary inputs for review, AI-assisted extraction, and human verification; multiple media assets may attach to the same game.
- Events may originate from multiple sources (official imports, manual tagging, video-derived proposals, community input). Rows may exist in proposed or review states before promotion to canonical event streams used for stats.
- Canonical attributed actions (goals, cards, substitutions, etc.) feed statistics and timelines once reconciled; canonical stats must be traceable to underlying events and, where applicable, to source media or import metadata—not invented as free-floating numbers.
- Scores on `games` (`home_score`, `away_score`) are the declared result used for scheduling hooks and for standings when a game is official per engine rules. A computed score derived from approved events may differ (see Consensus Scoring below). Do not assume a single automatic source of truth between declared and event-derived totals.
- Lightweight workflows: creation and enrichment paths must tolerate missing non-critical structural data; optional context should not block recording a game or attaching media when integrity rules are met.
Consensus Scoring
A game has two parallel score representations:
- Declared score: stored on the game record (`home_score`, `away_score`). It represents the official or manually entered result.
- Computed score: derived from approved game events. It represents the evidence-based result.
These two values can differ.
Computed score rules
The computed score is derived from events where:

- `event_type = 'goal'`
- `review_status = 'approved'`

…grouped per team within a game.
Consensus status
Each game has a consensus status:
- match: declared score matches computed score
- mismatch: declared score differs from computed score
This is calculated dynamically via the database view `game_score_consensus`.
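As an illustration, the consensus check can be sketched in TypeScript. The field names follow this document; the row shapes are assumptions, and the real logic lives in the `game_score_consensus` database view, not application code.

```typescript
// Sketch of the consensus check performed by the game_score_consensus view.
// Row shapes here are illustrative, not the actual schema.

interface GameEvent {
  team: "home" | "away";
  event_type: string;
  review_status: string;
}

interface Game {
  home_score: number; // declared score
  away_score: number;
}

function computedScore(events: GameEvent[]): { home: number; away: number } {
  // Only approved goal events count toward the computed score.
  const goals = events.filter(
    (e) => e.event_type === "goal" && e.review_status === "approved",
  );
  return {
    home: goals.filter((e) => e.team === "home").length,
    away: goals.filter((e) => e.team === "away").length,
  };
}

function consensusStatus(game: Game, events: GameEvent[]): "match" | "mismatch" {
  const c = computedScore(events);
  return game.home_score === c.home && game.away_score === c.away
    ? "match"
    : "mismatch";
}
```

Note that an unapproved (e.g. proposed) goal contributes nothing to the computed score, which is exactly how a declared/computed mismatch surfaces for review.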
System behavior
- The system does not automatically overwrite declared scores.
- The system does not enforce alignment between declared and computed scores.
- The system surfaces discrepancies for visibility and review.
Purpose
This model enables:
- multi-source validation (manual, AI, imported data)
- progressive accuracy over time
- media-backed verification of game results
Match Intelligence Engine
Status: IMPLEMENTED
The Match Intelligence Engine turns raw match data into something useful. When a goal is scored or a substitution happens, the Competition Engine records it—but the Match Intelligence Engine structures those events into play sequences, statistics, moment candidates, and timelines that power stats panels, highlight detection, and player performance views.
Raw game events are necessary but not sufficient. Coaches, scouts, and fans want to see stats, trends, and key moments—not just a flat list of events. The Match Intelligence Engine exists to derive that higher-level structure so the rest of Turfi can surface meaningful insights instead of raw logs.
This engine owns play sequences and action chains, match state snapshots and derived context, moment candidates (potential highlight opportunities), player and team statistics from events, and timeline aggregation for playback and UI.
Match Intelligence sits between the Competition Engine (which records events) and the Media and Highlight systems (which attach clips to moments). Discovery uses its outputs for ranking and trending; Broadcast uses timelines for playback; the Player Engine consumes stats for career views. It's the analytical layer that makes match data actionable.
Future work includes richer sequence detection, automated moment candidate scoring, real-time stat updates during live games, and integration with video analysis for clip-level intelligence.
Purpose
The Match Intelligence Engine structures match-centric performance and sequence data so matches can be analyzed through events, actions, stats, and derived context.
Architecture Overview
Current implementation exposes match intelligence primarily through read models consumed by lib/api/match.ts and match UI components. The app retrieves game context, unified timeline, game stats, moments, and highlight collections in parallel. The schema also includes deeper action/sequence tables used for intelligence modeling even where full UI workflows are still maturing.
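The parallel retrieval pattern described above can be sketched as follows. The function and fetcher names are hypothetical; the actual implementations live in `lib/api/match.ts`.

```typescript
// Hypothetical sketch of the parallel read-model fetch for a match page.
// Fetcher names are illustrative, not the real lib/api/match.ts API.

type Fetcher<T> = (gameId: string) => Promise<T>;

async function loadMatchDetail(
  gameId: string,
  api: {
    getGameContext: Fetcher<unknown>;
    getUnifiedTimeline: Fetcher<unknown>;
    getGameStats: Fetcher<unknown>;
    getMoments: Fetcher<unknown>;
    getHighlightCollections: Fetcher<unknown>;
  },
) {
  // The read models are independent, so they are fetched in parallel.
  const [context, timeline, stats, moments, highlights] = await Promise.all([
    api.getGameContext(gameId),
    api.getUnifiedTimeline(gameId),
    api.getGameStats(gameId),
    api.getMoments(gameId),
    api.getHighlightCollections(gameId),
  ]);
  return { context, timeline, stats, moments, highlights };
}
```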
Key Database Models
- games
- game_events
- game_actions
- game_action_context
- match_event_context
- player_game_stats
- player_competition_stats
- play_sequences
- play_sequence_actions
- match_state_snapshots
- stat_action_map
- event_types
- action_types
- api_games
- api_match_timeline_unified
- api_player_game_stats
- video_moments
- highlight_collections
Key Workflows
- Match detail retrieval by game id
- Event timeline aggregation for playback context
- Game-stat projection retrieval for comparison views
- Moment and highlight retrieval for event-aware playback and recap context
- Correlation of action/event state with derived match context records
Video Event Normalization
Turfi now treats video-derived events as inputs to review, not as direct match truth. Events may originate from provider event feeds, manual tagging, assisted tagging, or future AI detection, but they must all normalize into canonical game_events.
Current event-source values should be treated as:
- `provider_import`
- `manual_tag`
- `assisted_tag`
- `turfi_ai`
Normalization metadata is expected to include:
- `event_source_type`
- `event_confidence`
- `review_status`
- `verified_by`
- `verified_at`
- `primary_video_id`
Approved events are the only events that drive:
- `player_game_stats`
- match timelines
- `video_moment_generation_queue`
- `video_moments`
- player highlights
This keeps provider feeds, manual tagging, and future detection models aligned with the same canonical event layer instead of producing competing sources of truth.
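A minimal sketch of the normalization metadata, assuming the field list above. The `"proposed"`/`"rejected"` review states and all concrete types are assumptions for illustration; only the approved state's role is documented here.

```typescript
// Illustrative shape for a normalized event proposal. Field names follow the
// metadata list above; the non-approved review states are assumed values.

type EventSourceType =
  | "provider_import"
  | "manual_tag"
  | "assisted_tag"
  | "turfi_ai";

type ReviewStatus = "proposed" | "approved" | "rejected"; // assumed enum

interface NormalizedEventProposal {
  event_source_type: EventSourceType;
  event_confidence: number; // assumed 0..1
  review_status: ReviewStatus;
  verified_by: string | null;
  verified_at: string | null; // ISO timestamp once verified
  primary_video_id: string | null;
}

// Only approved events may drive stats, timelines, and moment generation.
function mayDriveCanonicalOutputs(e: NormalizedEventProposal): boolean {
  return e.review_status === "approved";
}
```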
Interactions with Other Engines
- Competition Engine
- Media Engine
- Broadcast Engine
- Discovery Engine
- Recruitment Engine
Media Engine
Status: IMPLEMENTED
The Media Engine is Turfi's home for video, images, and highlight content. When a coach uploads match footage, when a moment is detected from a goal, or when a player's highlight reel is built, the Media Engine stores the files, links them to the right games and players, and makes them available for playback and discovery.
Sports experiences are visual. Stats and rosters matter, but video and highlights are what bring games to life. The Media Engine exists so that media is stored once, linked to the correct context (players, games, competitions), and reused across match pages, player profiles, and recruitment surfaces without duplicating files or losing that context.
This engine owns media asset records (the canonical reference for each uploaded file), links between media and players, teams, games, and events, moment and highlight structures (video moments, highlight collections, highlight items), storage metadata and processing status, and the graph that connects source media to clips and reels.
The Media Engine consumes moment candidates from Match Intelligence and attaches clips to create playable video moments. Broadcast delivers those to users; Discovery surfaces media in search and feeds; Player Engine exposes player-linked media; Recruitment uses highlights for scouting. The Import System can seed media metadata for bulk ingestion.
Planned evolution includes AI-driven highlight detection, transcoding pipelines, thumbnail generation, Cloudflare R2 migration for storage, and editorial automation that drafts articles when key media moments occur.
Purpose
The Media Engine provides Turfi's shared media foundation for ingesting source files, organizing contextual links, and producing moment/highlight outputs for match, player, and recruitment surfaces.
Turfi video is not treated as passive file storage. The Media Engine treats video as a structured input that can produce event candidates, reusable moments, highlight clips, collections, and player-facing media feeds.
Canonical match truth always remains in game_events.
Video-derived data must never overwrite canonical match data automatically. Provider imports, assisted tagging, or future AI detections can accelerate review, but only approved canonical events may drive stats, timelines, moments, clips, and player feeds.
Architecture Overview
Turfi media architecture is centralized at the data/model layer and exposed contextually in workspace experiences. Media workflows begin with a canonical media asset record, then branch into moments, candidate clips, and highlight groupings that can be reused across match, player, and admin flows without duplicating source files.
Core concept:
video source
↓
event detection
↓
canonical events
↓
moments
↓
clips
↓
highlights
↓
player feeds
Studio Layer
Studio is Turfi's production layer responsible for media ingestion, processing, clip creation, and highlight generation.
Studio capabilities appear contextually across workspaces (Play, Recruit, Compete, Admin) based on user flow, but Studio itself is not a standalone workspace.
Media ingestion and processing are centralized in shared platform architecture even when Studio tools are surfaced inside multiple workspaces.
Studio Editorial Automation (Future Capability)
The Studio content creation layer will eventually support editorial automation features.
These capabilities will allow Turfi to generate draft articles and editorial templates automatically when events occur within the sports data graph.
This automation will leverage data produced by the Competition Engine, Player Engine, and Media Engine.
The feature will initially generate draft articles which editors can finalize and publish.
Event Driven Article Drafts (Planned Feature)
Turfi will support automatic creation of article drafts based on events occurring within the platform's sports graph.
When key events occur in the Competition Engine or Player Engine, the system will generate draft editorial articles inside the Studio content creation environment.
These drafts allow editors to quickly publish match reports, competition coverage, and player stories with minimal manual setup.
The goal of this feature is to accelerate editorial workflows by automatically preparing article templates when meaningful sporting events occur.
Examples of triggering events include:
- Game scheduled
- Game completed
- Competition created
- Competition finals scheduled
- Player milestone events
When triggered, the system will create a draft article that includes pre-filled structured data such as:
- Competition name
- Participating teams
- Match score
- Key players
- Highlight references
- Game metadata
Editors will review and complete the draft article before publication.
This feature is planned for a future phase of the Studio editorial system.
Media Graph Architecture
Turfi models media as a graph rooted at media_assets:
media_assets
├ video_moments
├ moment_candidates
├ highlight_items
└ highlight_collections
- `media_assets` is the root media object; each row represents one uploaded media file.
- Moments, clips, and highlights reference the same source media.
- This avoids duplication and allows clips/highlights to be reused in multiple contexts.
Media Storage Model
The media_assets table stores the canonical media record with three groups of fields:
Content metadata
- `media_type`
- `title`
- `description`
- `media_category`
- `is_featured`
Context relationships
- `player_id`
- `team_id`
- `game_id`
- `competition_id`
- `event_id`
Storage metadata
- `storage_provider`
- `storage_bucket`
- `file_path`
- `mime_type`
- `file_size_bytes`
- `duration_seconds`
- `resolution_width`
- `resolution_height`
- `processing_status`
These storage fields support media processing/transcoding pipelines, AI highlight detection, and future provider portability (for example Cloudflare R2, S3-compatible storage, or other object stores).
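For illustration, the three field groups can be read as one record type. Column names come from the lists above; the TypeScript types (and the nullability of each field) are assumptions.

```typescript
// Hypothetical TypeScript view of the media_assets field groups.
// Types and nullability are assumed, not taken from the real schema.

interface MediaAsset {
  // Content metadata
  media_type: string;
  title: string;
  description: string | null;
  media_category: string | null;
  is_featured: boolean;
  // Context relationships (media may attach to any subset of contexts)
  player_id: string | null;
  team_id: string | null;
  game_id: string | null;
  competition_id: string | null;
  event_id: string | null;
  // Storage metadata
  storage_provider: string; // e.g. a future R2 or S3-compatible provider key
  storage_bucket: string;
  file_path: string;
  mime_type: string;
  file_size_bytes: number;
  duration_seconds: number | null;
  resolution_width: number | null;
  resolution_height: number | null;
  processing_status: string;
}

// Keeping bucket and path separate supports provider portability: the same
// path can resolve against whichever provider/bucket currently hosts the file.
function storageKey(a: Pick<MediaAsset, "storage_bucket" | "file_path">): string {
  return `${a.storage_bucket}/${a.file_path}`;
}
```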
Compatibility Note
The uploaded_by column currently exists in media_assets.
A future migration will introduce uploaded_by_user_id for schema consistency.
Both columns may temporarily coexist during migration to avoid breaking existing application logic.
Key Database Models
- media_assets
- video_moments
- moment_candidates
- highlight_moments
- highlight_collections
- highlight_items
- api_player_media
- api_games
Key Workflows
- Source media ingestion into `media_assets`
- Media processing/transcoding status lifecycle management
- Moment candidate generation and curation
- Video moment creation tied to match timeline contexts
- Highlight item and collection assembly for recap/playback surfaces
- Cross-context media reuse without source-file duplication
Video Source Registry
One game may have multiple source videos rather than one official recording. Turfi's media model therefore extends beyond media_assets and introduces a source registry layer for match footage.
Example source set for one game:
- Veo recording
- XbotGO recording
- player uploaded video
- club broadcast feed
The planned game_videos table registers those source videos before moments and clips are derived from them.
game_videos
Purpose Canonical registry for source match videos linked to a game, provider, owner, and ingestion mode before clips and moments are derived.
Fields
| Field | Purpose |
|---|---|
| `id` | Primary key |
| `game_id` | Linked game |
| `provider_key` | Provider or source system key |
| `source_type` | Provider, upload, broadcast, archive, or future live source class |
| `source_label` | Human-readable source label |
| `source_url` | External or internal source URL |
| `storage_path` | Managed storage path |
| `external_video_ref` | Provider-side reference |
| `duration_seconds` | Source duration in seconds |
| `resolution_width` | Width in pixels |
| `resolution_height` | Height in pixels |
| `fps` | Frames per second |
| `ingestion_mode` | Upload, provider import, archive, or future RTMP ingest |
| `status` | Availability or processing status |
| `created_by` | Creating user |
| `created_at` | Created timestamp |
| `updated_at` | Updated timestamp |
Media assets, playable clips, and derived moments can all trace back to one of these source video rows. This lets Turfi manage many recordings for one game while still generating canonical clips from normalized source footage.
Media Ingestion Pipeline
The implemented Media Engine follows a governed multi-stage ingestion pipeline:
- Video intake: register uploaded or linked source videos in `game_videos`
- Ingestion jobs: automatically create `video_ingestion_jobs` for validation, transcode, poster extraction, event detection, and downstream preparation
- Event detection: write raw provider or AI payloads into `video_event_imports`
- Event normalization: map raw payloads into canonical Turfi event types while preserving lineage through `game_event_sources`
- Player resolution: route unresolved identities into `video_player_resolution_queue`
- Moment generation: enqueue approved canonical events into `video_moment_generation_queue`
- Clip generation: enqueue generated moments into `video_clip_generation_queue`
- Highlight publishing: persist outputs in `highlight_clips`, distribute through `highlight_items`, `highlight_collections`, and `player_highlight_items`
At every stage, canonical match truth remains the approved game_events layer rather than the raw provider or AI input layer.
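The stage ordering above can be sketched as a simple pipeline constant. The stage identifiers here are paraphrased labels, not real `video_ingestion_jobs` values.

```typescript
// Minimal sketch of the ingestion pipeline order. Stage names paraphrase the
// steps above; the real job model lives in video_ingestion_jobs.

const INGESTION_STAGES = [
  "intake",             // register source in game_videos
  "validate",
  "transcode",
  "poster_extract",
  "event_detect",       // raw payloads into video_event_imports
  "normalize",          // canonical events + game_event_sources lineage
  "resolve_players",    // video_player_resolution_queue
  "generate_moments",   // video_moment_generation_queue
  "generate_clips",     // video_clip_generation_queue
  "publish_highlights", // highlight_clips and distribution tables
] as const;

type Stage = (typeof INGESTION_STAGES)[number];

// Returns the next pipeline stage, or null after publishing.
function nextStage(current: Stage): Stage | null {
  const i = INGESTION_STAGES.indexOf(current);
  return i >= 0 && i < INGESTION_STAGES.length - 1 ? INGESTION_STAGES[i + 1] : null;
}
```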
Clip and Highlight Output Layer
The implemented media-intelligence rollout extends the earlier moment/highlight architecture with explicit queue and clip layers.
Core output flow:
approved game_events
↓
video_moment_generation_queue
↓
video_moments
↓
video_clip_generation_queue
↓
highlight_clips
↓
player_highlight_items
↓
v_player_highlights
This matters because Turfi no longer treats highlight playback as a generic attachment problem. Approved events are the trigger point, moments are the reusable timeline objects, clips are the extracted playback assets, and player-highlight items are the player-facing association layer.
Media Data Model and API Surfaces
The Media Engine now relies on a layered table model:
- `game_videos` registers source videos for games
- `video_ingestion_jobs` tracks processing state
- `video_event_imports` stores raw provider or AI payloads before canonical review
- `game_event_sources` preserves lineage from canonical `game_events` back to source evidence
- `video_player_resolution_queue` handles unresolved player identity review
- `video_moment_generation_queue` tracks approved events waiting for moment creation
- `video_clip_generation_queue` tracks moments waiting for clip extraction
- `video_moments` stores reusable timeline-aware moments
- `highlight_clips` stores extracted clip assets
- `highlight_items` links clips into reusable highlight collections
- `highlight_collections` stores grouped highlight reels
- `player_highlight_items` links clips to player-facing feeds
These layers feed the existing API/read-model surfaces:
- `api_match_highlights`
- `api_player_highlight_feed`
- `api_player_highlight_moments`
- `v_player_highlights`
Those APIs remain read models. They do not become the write path for canonical media workflow decisions.
Media Admin Module
Turfi now exposes a dedicated Admin -> Media workspace module as the operational control center for the Media Engine.
Primary tabs:
- Overview
- Videos
- Moments
- Highlights
- Players
- Processing
Tab responsibilities:
- Overview provides quick actions for Broadcast Game, Upload Game Video, and Upload Highlight, plus summary counts for videos, moments, highlights, and unresolved players
- Videos is the registry-style surface for all source videos
- Moments is the approval/review surface for extracted or imported moment candidates
- Highlights manages generated clips and highlight collection placement
- Players resolves unresolved player detections from jersey numbers or partial labels
- Processing contains engineering monitoring tools such as Video Sources, Ingestion Jobs, Event Imports, Moment Generation Queue, and Clip Generation Queue
Reusable Service Architecture
Media follows a platform rule: admin must not own media business logic directly.
Reusable service responsibilities now include:
- `createVideoSource()`
- `startVideoIngestion()`
- `importVideoEvents()`
- `approveMoment()`
- `generateHighlightClip()`
- `resolvePlayerIdentity()`
- `publishHighlight()`
These services are designed to be reused later by other workspaces such as Studio, Club, Player, and Recruiter instead of remaining coupled to the admin UI.
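A hedged sketch of what that shared contract could look like: the service names come from the list above, but every parameter shape and return type here is an assumption, and the helper at the end is hypothetical.

```typescript
// Hypothetical interface grouping the reusable media services. Parameter and
// return shapes are assumptions; only the service names come from the docs.

interface MediaService {
  createVideoSource(gameId: string, input: { providerKey: string; sourceUrl?: string }): Promise<string>;
  startVideoIngestion(videoId: string): Promise<void>;
  importVideoEvents(videoId: string, payload: unknown): Promise<void>;
  approveMoment(momentId: string, reviewerId: string): Promise<void>;
  generateHighlightClip(momentId: string): Promise<string>;
  resolvePlayerIdentity(detectionId: string, playerId: string): Promise<void>;
  publishHighlight(clipId: string, collectionId: string): Promise<void>;
}

// Illustrative admin flow: the admin UI composes shared services instead of
// owning media business logic, so Studio/Club/Player/Recruiter can reuse them.
async function adminApproveAndClip(
  svc: MediaService,
  momentId: string,
  reviewerId: string,
): Promise<string> {
  await svc.approveMoment(momentId, reviewerId);
  return svc.generateHighlightClip(momentId);
}
```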
Admin UI Design Principle
The Media module follows the same UX language as the Data Registry:
- entity-based grids
- filters
- status badges
- row actions
- registry-style navigation
This keeps media operations consistent with the way other Turfi entities are managed, reviewed, and corrected.
High-Level Media Engine Diagram
Video Source
↓
Ingestion Jobs
↓
Event Imports
↓
Canonical Events
↓
Moments
↓
Clips
↓
Highlights
↓
Player Media Feeds
Interactions with Other Engines
- Match Intelligence Engine
- Broadcast Engine
- Discovery Engine
- Recruitment Engine
- Social Engine
Highlight Detection System
Status: IMPLEMENTED (architecture and table model); PLANNED (automated AI detection pipelines)
The Highlight Detection System turns match events into highlight content. When a goal is scored or a key save happens, the system identifies it as a moment worth capturing, attaches clips when media is available, and organizes those moments into reels. It's what makes "watch the highlights" possible instead of forcing users to scrub through full matches.
Matches generate a lot of events. Most viewers care about a subset—goals, assists, key saves, pivotal moments. The Highlight Detection System exists to filter that signal from the noise, structure it for playback, and support both human curation and (eventually) automated detection so highlight reels can be produced at scale.
This system owns moment candidate generation from match events, video clip attachment to candidates, multi-player participation (scorer, assist, buildup, etc.) per moment, highlight collections and highlight items, and the pipeline from event → candidate → clip → reel.
Highlight Detection sits between Match Intelligence (which evaluates events) and Media/Broadcast (which store and deliver clips). Competition Engine provides the event stream; Media Engine stores clips; Broadcast delivers the final reels. Player Engine benefits when moments are linked to players for profile highlights.
Planned evolution includes AI-driven moment detection from video, automated clip extraction, coach and scout tagging tools, and participation moderation so the right players are credited and the right moments surface in discovery.
Purpose
The Highlight Detection System identifies and organizes key match moments so they can be transformed into clips and highlight reels. It bridges raw match events and user-facing highlight content by converting analytical detections into media-ready highlight structures.
Architecture Overview
- Match events produce moment candidates. The Competition Engine records official events; Match Intelligence evaluates them and emits `moment_candidates` representing potential highlight opportunities.
- Approved or review-confirmed events can enqueue `video_moment_generation_queue`, allowing workers to create `video_moments` from canonical event truth.
- Video clips are attached to moments. The Media Engine attaches clips to moments through `video_clip_generation_queue` and persists extracted outputs in `highlight_clips`.
- Multiple players can be associated with the same moment through `moment_participants` (scorer, assist, buildup, defender, goalkeeper, participant roles).
- Player highlight feeds are assembled through `player_highlight_items` and ranked via `v_player_highlights`.
- Highlight collections organize clips into reels. Video moments and clips can still be added to `highlight_collections` through `highlight_items`, enabling player/team/competition-scoped reels.
Key Database Models
- moment_candidates
- video_moment_generation_queue
- video_moments
- video_clip_generation_queue
- highlight_clips
- player_highlight_items
- v_video_moment_generation_jobs
- v_video_clip_generation_jobs
- v_player_highlights
- moment_participants (multi-player association: scorer, assist, buildup, defender, goalkeeper, participant)
- highlight_collections
- highlight_items
- api_match_timeline_unified
Key Workflows
- Official match events recorded during games feed Match Intelligence
- Match Intelligence evaluates event/action context and generates moment candidates
- Approved `game_events` enqueue `video_moment_generation_queue`
- Moment workers read `v_video_moment_generation_jobs` and create `video_moments`
- Clip workers read `v_video_clip_generation_jobs`, extract segments, and create `highlight_clips`
- Player-highlight association logic writes `player_highlight_items`
- Players/coaches associate additional participants with the same moment through the participation system
- Ranked player clips are exposed through `v_player_highlights`
- Video moments are added to highlight collections; the frontend renders collections as playable reels
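The moment-worker step in this flow can be sketched as follows. The job and store shapes are assumptions; only the view and table names (`v_video_moment_generation_jobs`, `video_moments`) come from this document.

```typescript
// Illustrative moment-worker loop: read pending jobs from the
// v_video_moment_generation_jobs view and create video_moments rows.
// The job and store interfaces here are assumed, not the real schema.

interface MomentJob {
  queue_id: string;
  game_event_id: string;
}

interface MomentStore {
  pendingJobs(): Promise<MomentJob[]>;
  createMoment(gameEventId: string): Promise<string>; // returns video_moments id
  markDone(queueId: string, momentId: string): Promise<void>;
}

async function runMomentWorker(store: MomentStore): Promise<number> {
  const jobs = await store.pendingJobs();
  for (const job of jobs) {
    // Only approved canonical events reach this queue, so the worker relies on
    // the queue as its review boundary rather than re-checking review_status.
    const momentId = await store.createMoment(job.game_event_id);
    await store.markDone(job.queue_id, momentId);
  }
  return jobs.length;
}
```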
Interactions with Other Engines
- Match Intelligence Engine: analyzes events and emits candidate highlight opportunities
- Competition Engine: provides official event stream for candidate generation
- Media Engine: stores/attaches clips and builds playable moment media records
- Player Engine: associates moments with player identities and participation claims
- Discovery Engine: uses highlight activity signals for trending players and notable moment surfacing
Match Intelligence Tables
api_match_timeline_unified
Purpose Unified match-event timeline read model.
Primary Relationships Keyed by game_id; combines event, player, and team timeline context.
Important Constraints Must preserve event ordering and timestamp/minute semantics for playback/timeline rendering.
api_player_game_stats
Purpose Match statistics read model for game stat panels.
Primary Relationships Keyed by game_id; exposes home/away values for normalized stat keys.
Important Constraints Current UI expects consistent stat-key labeling and numeric values per row.
Media Engine Tables
media_assets
Purpose Root media entity used by the Studio layer for uploaded media ingestion and shared media lifecycle management.
Primary Relationships Links uploaded media to player/team/game/competition/event context and acts as the source media reference for moments, clips, and highlight collections.
Important Constraints Must preserve stable source identity and storage metadata so downstream moment/highlight pipelines can reuse one canonical media object across multiple contexts without duplication.
video_moments
Purpose Stores/exposes moment-level clips tied to games.
Primary Relationships Keyed by game_id; consumed by match timeline/media playback.
Important Constraints Requires stable ids and timestamp/minute fields to support deterministic playback navigation.
highlight_collections
Purpose Stores/exposes grouped highlight content for a game.
Primary Relationships Keyed by game_id; consumed by match highlight and recap surfaces.
Important Constraints Requires stable ids and playable URL fields for highlight rendering.