Turfi Platform Documentation
Official Turfi documentation portal for users, admins, and developers.
Video Ingestion and Stats Mapping System
Cross-engine video ingestion, provider stats mapping, event normalization, player resolution, and player media workflows.
Turfi needs a governed way to turn source video, provider events, and manual review into approved match truth, generated moments, extracted clips, and reusable player-highlight feeds. This document explains the current Video Intelligence and Highlight System as a cross-engine workflow rather than a standalone subsystem disconnected from the rest of the platform.
It extends the Media Engine, Match Intelligence Engine, Broadcast Engine, Player Engine, Admin Operations, Operations, and Import Engine. As automated capture, event attribution, and AI-assisted analysis mature, this file should remain the reference for how those capabilities normalize into the same canonical Turfi objects.
Shared status legend: [docs/_shared/status-legend.md](./_shared/status-legend.md)
Status: PARTIAL
Turfi supports two ingestion paths that ultimately converge into the same platform truth model.
Path A: External provider video with stats
- Veo
- XbotGO
- broadcast feeds
- provider exports with event metadata
Path B: Video only
- phone uploads
- club uploads
- player highlight uploads
- livestream recordings
Both paths normalize into the same canonical objects:
game_events, player_game_stats, video_moments, highlight_clips, player_highlight_items, media_assets
Provider feeds are never the final authority by themselves. Canonical truth is always approved game_events after normalization, review, and verification. Provider payloads can seed or accelerate the workflow, but player stats, match timelines, highlights, and player profile outputs must always derive from the approved event layer rather than from unreviewed third-party data.
Purpose
The Video Intelligence and Highlight System exists so Turfi can ingest match footage and event metadata from multiple sources, normalize those inputs into the same event model, and produce reusable player/media outputs across the platform.
It solves five platform problems at once:
- registering and processing source footage in a governed media layer
- storing raw provider events before they affect canonical match truth
- resolving detected players through roster-aware attribution logic
- generating reusable moments and clips from approved events
- building player highlight feeds from the same canonical event graph used by stats and timelines
System Overview
The system spans multiple existing engines instead of introducing a separate standalone subsystem:
- Media Engine stores source videos, clips, moments, and highlight outputs
- Match Intelligence Engine normalizes events and drives stat rebuilds
- Broadcast Engine prepares source videos for playback and clip extraction
- Player Engine resolves athletes, exposes player highlights, and governs claim/review workflows
- Admin Operations surfaces jobs, review queues, and mapping failures
- Import Engine handles provider payload ingestion and structured mapping workflows
- Entity Resolution supports player, team, club, and contact cleanup when imported data reveals conflicts
Platform Media Intelligence Architecture
video ingestion
↓
game_videos
↓
video_ingestion_jobs + video_event_imports
↓
game_events + game_event_sources
↓
video_moment_generation_queue
↓
video_moments
↓
video_clip_generation_queue
↓
highlight_clips
↓
player_highlight_items
↓
v_player_highlights
Core Truth Model
Turfi video is not just media storage. It is a governed ingestion path that can generate reviewable event candidates, approved canonical events, reusable moments, clips, highlights, and player-facing feeds.
Core progression:
video source
↓
event detection
↓
canonical events
↓
moments
↓
clips
↓
highlights
↓
player feeds
The canonical truth boundary is always game_events.
Video-derived records must never overwrite canonical match data automatically. Raw provider payloads, manual tags, or AI detections remain inputs to review until they are normalized and approved.
Supported Video Sources
Turfi supports multiple video origins for the same game rather than assuming one official recording.
Supported source classes include:
- external provider recordings such as Veo and XbotGO
- club broadcast feeds
- uploaded full-match files
- player-uploaded highlight footage
- coach or admin uploaded review footage
- archived livestream recordings
One game may have many videos attached simultaneously. For example, a single match may include:
- a Veo auto-capture recording
- an XbotGO sideline recording
- a club-uploaded broadcast export
- a player-uploaded clip package for personal highlights
Video Ownership Model
Videos may be uploaded or registered by multiple entity types:
- player
- team
- club
- league
- association
- admin
Ownership is tracked through:
video_owner_type and video_owner_id
Ownership affects moderation, edit permissions, visibility defaults, and who can claim or manage clips derived from the source video. A club-owned broadcast feed may be visible to club staff and league reviewers, while a player-owned highlight upload may remain private or recruiter-visible until approved.
Ownership does not replace canonical match governance. Even if a provider or player owns the uploaded source video, event truth still depends on approved game_events.
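As a rough sketch, the ownership columns can be modeled as a typed pair. The owner-type values mirror the list above; the record shape and the ownsVideo helper are illustrative assumptions, not the platform's actual API.

```typescript
// Owner-type values mirror the document's list; everything else is illustrative.
type VideoOwnerType = "player" | "team" | "club" | "league" | "association" | "admin";

interface VideoOwnership {
  videoOwnerType: VideoOwnerType; // maps to the video_owner_type column
  videoOwnerId: string;           // maps to the video_owner_id column
}

// Hypothetical helper: does this entity own the video (and so manage its derived clips)?
function ownsVideo(o: VideoOwnership, entityType: VideoOwnerType, entityId: string): boolean {
  return o.videoOwnerType === entityType && o.videoOwnerId === entityId;
}
```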
Video Analysis and Event Attribution
Video analysis in Turfi is a staged processing pipeline rather than a direct provider-to-stats shortcut.
Video analysis sources may include:
- Veo
- XbotGO
- AI detection engines
- manual tagging
Event source values in game_events may include:
provider_import, manual_tag, assisted_tag, turfi_ai
Processing flow:
- Register the source video in game_videos.
- Create video_ingestion_jobs rows for upload validation, transcoding, stream preparation, poster extraction, and provider event import.
- Normalize video metadata such as duration, resolution, fps, and source references.
- If provider stats are present, write raw imported events into video_event_imports.
- Normalize imported or manually tagged event payloads into game_events.
- Link source lineage through game_event_sources.
- Resolve players, teams, and event types.
- Route unresolved cases into video_player_resolution_queue.
- Approved events automatically enqueue video_moment_generation_queue.
- Workers generate video_moments.
- Moments enqueue video_clip_generation_queue.
- Clip workers generate highlight_clips.
- Clips are associated to players through player_highlight_items.
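The registration and job-creation steps at the top of this flow can be sketched as a fan-out that creates one video_ingestion_jobs row per processing stage. The job-type names and the function shape are assumptions for illustration; only the table names come from this document.

```typescript
// Hypothetical job-type values; the document names the stages but not the enum.
type IngestionJobType =
  | "upload_validation"
  | "transcode"
  | "stream_prep"
  | "poster_extract"
  | "provider_event_import";

interface IngestionJob {
  gameVideoId: string;       // game_video_id -> game_videos.id
  jobType: IngestionJobType; // job_type column
  status: "queued" | "running" | "completed" | "failed";
}

// Registering a source video fans out one queued job per stage; the
// provider event import stage only applies when provider stats exist.
function fanOutIngestionJobs(gameVideoId: string, hasProviderStats: boolean): IngestionJob[] {
  const stages: IngestionJobType[] = [
    "upload_validation",
    "transcode",
    "stream_prep",
    "poster_extract",
  ];
  if (hasProviderStats) stages.push("provider_event_import");
  return stages.map((jobType): IngestionJob => ({ gameVideoId, jobType, status: "queued" }));
}
```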
Event Normalization
All incoming events must normalize into the canonical game_events model.
Incoming sources may include:
- provider event feeds
- manual tagging
- assisted tagging
- future AI detection
Normalization enriches each event with metadata such as:
event_source_type, event_confidence, review_status, verified_by, verified_at, primary_video_id
This gives Turfi one event layer that can power match timelines, player stats, highlights, and player media regardless of how the original event was detected.
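A minimal sketch of what a normalized event row might look like, assuming the metadata fields listed above. The confidence threshold and the normalize helper are hypothetical; the document only requires that unreviewed detections stay out of canonical truth.

```typescript
// Source and status values come from the document's lists.
type EventSourceType = "provider_import" | "manual_tag" | "assisted_tag" | "turfi_ai";
type ReviewStatus = "pending" | "needs_review" | "approved" | "rejected";

interface NormalizedGameEvent {
  gameId: string;
  eventType: string;
  eventSourceType: EventSourceType;
  eventConfidence: number;   // assumed 0..1 scale
  reviewStatus: ReviewStatus;
  verifiedBy: string | null;
  verifiedAt: string | null; // ISO timestamp once verified
  primaryVideoId: string | null;
}

// Hypothetical normalizer: nothing is auto-approved, and low-confidence
// detections are routed straight to needs_review. The 0.9 cutoff is invented.
function normalize(raw: {
  gameId: string; type: string; source: EventSourceType; confidence: number; videoId: string;
}): NormalizedGameEvent {
  return {
    gameId: raw.gameId,
    eventType: raw.type,
    eventSourceType: raw.source,
    eventConfidence: raw.confidence,
    reviewStatus: raw.confidence >= 0.9 ? "pending" : "needs_review",
    verifiedBy: null,
    verifiedAt: null,
    primaryVideoId: raw.videoId,
  };
}
```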
Player Identification Logic
Provider feeds often reference players using jersey numbers, short labels, or provider-specific names that do not cleanly map to Turfi identities on first import.
Turfi handles that through a player-attribution cascade before falling back to video_player_resolution_queue.
Example detection:
- Team: Lakeshore U17
- Jersey: 7
Matching cascade:
Layer 1: Game lineup roster
- Match using game_id + jersey_number
Layer 2: Team roster
- Match using team_id + jersey_number
Layer 3: Historical inference
- Analyze historical events for the same team and jersey number
Fallback
- If no confident match exists, insert a task into video_player_resolution_queue
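The three-layer cascade with its queue fallback could be sketched as follows. The lookup maps stand in for real lineup, roster, and historical queries; every name other than the queue table is an assumption.

```typescript
interface Detection { teamId: string; gameId: string; jerseyNumber: number }

type Resolution =
  | { kind: "resolved"; playerId: string; layer: "lineup" | "roster" | "history" }
  | { kind: "queued" }; // falls back to video_player_resolution_queue

// Each map is keyed "<scopeId>:<jersey>" and stands in for a roster query.
function resolvePlayer(
  d: Detection,
  lineup: Map<string, string>,  // game lineup: `${gameId}:${jersey}` -> playerId
  roster: Map<string, string>,  // team roster: `${teamId}:${jersey}` -> playerId
  history: Map<string, string>, // inferred from historical events, same key as roster
): Resolution {
  const lineupHit = lineup.get(`${d.gameId}:${d.jerseyNumber}`);
  if (lineupHit) return { kind: "resolved", playerId: lineupHit, layer: "lineup" };
  const rosterHit = roster.get(`${d.teamId}:${d.jerseyNumber}`);
  if (rosterHit) return { kind: "resolved", playerId: rosterHit, layer: "roster" };
  const historyHit = history.get(`${d.teamId}:${d.jerseyNumber}`);
  if (historyHit) return { kind: "resolved", playerId: historyHit, layer: "history" };
  return { kind: "queued" }; // no confident match: human review takes over
}
```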
Resolution workflow:
- A provider or AI event is normalized into a provisional or pending game_events row.
- Attribution logic tries lineup, roster, then historical inference.
- If confidence is insufficient, the system writes a queue row with the raw player label, team context, and candidate payload.
- Admins or coaches review the queue and assign the correct player.
- The approved player identity is written back to the event graph.
- Player stats, moments, and highlight items rebuild from the approved event layer.
This workflow keeps imported provider labels separate from canonical player truth until a reviewer confirms the mapping.
Event Verification and Review
Imported or tagged events are reviewable before they become canonical truth.
Review statuses support workflows such as:
- pending
- needs_review
- approved
- rejected
Verification surfaces allow reviewers to confirm:
- event type accuracy
- timestamp accuracy
- player identity
- team attribution
- primary source video
- source confidence
Approved events drive:
- player_game_stats
- match timelines
- video_moment_generation_queue
- player highlights
Rejected or unverified provider rows remain traceable through source tables and job logs, but they do not become canonical match truth.
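One way to picture the review lifecycle is as a transition table over the four statuses above. The allowed transitions shown here are an illustrative assumption, not a documented contract; the property worth preserving is that approved events are never silently demoted.

```typescript
type ReviewStatus = "pending" | "needs_review" | "approved" | "rejected";

// Assumed transition table: approval is terminal, and a rejected row can
// only re-enter through another review pass.
const allowed: Record<ReviewStatus, ReviewStatus[]> = {
  pending: ["needs_review", "approved", "rejected"],
  needs_review: ["approved", "rejected"],
  approved: [],               // canonical truth: no silent demotion
  rejected: ["needs_review"], // re-opening requires explicit review
};

function canTransition(from: ReviewStatus, to: ReviewStatus): boolean {
  return allowed[from].includes(to);
}
```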
Automatic Player Highlights
Turfi automatically generates player highlight feeds from the canonical event graph rather than from disconnected standalone clips.
Pipeline:
game_event
↓
video_moment
↓
highlight_clip
↓
player_highlight_items
Clips are automatically ranked and exposed through:
v_player_highlights
This powers:
- player profiles
- recruiter discovery
- highlight reels
- league content feeds
Highlight and Clip Generation
Clips and highlights are derived from normalized source videos and approved events.
Once a source video is available and an event is approved, Turfi can:
- create video_moment_generation_queue jobs
- generate video_moments
- create video_clip_generation_queue jobs
- extract highlight_clips
- attach derived clips back to player, team, competition, and game context
- preserve source lineage through game_event_sources
This is why the system stores both raw source references and canonical event links. A clip may visually originate from a Veo file, but its meaning inside Turfi comes from the approved game_events record that anchors the timeline.
Player Media and Moments
Player media is not limited to direct uploads. It can be derived from system-generated clips, imported provider events, coach-generated reels, and club-managed broadcast content.
Player-facing outputs include:
- system generated clips
- player uploads
- coach uploads
- club uploads
- curated highlight collections
Player profile navigation should continue to expose:
- Overview
- Stats
- Matches
- Highlights
- Media
- Recruitment
Moments may also require post-import claiming. Example:
- Provider imports a goal event.
- The player is unknown or unresolved.
- A player submits a claim for that event.
- A coach or admin verifies the claim.
- The event is updated with the correct player identity.
- Player stats and player media outputs are rebuilt from the approved event set.
This keeps claim workflows aligned with the canonical event layer instead of treating player media as a disconnected upload-only feature.
Worker System
Turfi uses worker-facing views so asynchronous services can poll stable job contracts instead of reading raw queue tables directly.
Worker job views:
- v_video_moment_generation_jobs
- v_video_clip_generation_jobs
Worker responsibilities:
moment worker
- reads v_video_moment_generation_jobs
- creates video_moments from approved events
- marks queue progress and writes generation results
clip worker
- reads v_video_clip_generation_jobs
- extracts playable video segments
- creates highlight_clips
- fans successful clip outputs into player-highlight association logic
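A moment worker built against the job-view contract might look like this minimal sketch. The dependency-injected callbacks stand in for the real view reads and table writes; all function names are hypothetical.

```typescript
// Job shape assumed from the view's described contents.
interface MomentJob { queueId: string; gameEventId: string; gameVideoId: string }
interface VideoMoment { gameEventId: string; gameVideoId: string }

// One polling pass: read jobs, create a moment per job, mark queue progress.
// Returns the number of jobs processed in this pass.
async function runMomentWorker(
  pollJobs: () => Promise<MomentJob[]>,                 // reads v_video_moment_generation_jobs
  createMoment: (j: MomentJob) => Promise<VideoMoment>, // writes video_moments
  markDone: (queueId: string) => Promise<void>,         // records queue progress
): Promise<number> {
  const jobs = await pollJobs();
  for (const job of jobs) {
    await createMoment(job);
    await markDone(job.queueId);
  }
  return jobs.length;
}
```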
Privacy and Visibility Model
Videos, moments, and highlight outputs support visibility values:
- private
- team
- club
- league
- recruiters
- public
Visibility is influenced by:
- video ownership
- review status
- moderation status
- role-based access
Moderation and review queues should govern issues such as:
- copyright disputes
- wrong-player attribution
- offensive content
- spam uploads
This protects both raw source footage and derived player media while still allowing recruiting and public highlight distribution where appropriate.
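A hedged sketch of how the visibility and moderation inputs might combine into a single check. The role names and the rule that a moderation hold hides content from everyone but the owner are assumptions; only the visibility values come from this document.

```typescript
// Visibility values come from the document; roles are illustrative.
type Visibility = "private" | "team" | "club" | "league" | "recruiters" | "public";
type ViewerRole = "owner" | "team_member" | "club_staff" | "league_staff" | "recruiter" | "anonymous";

// Each visibility level lists the roles allowed to view at that level.
const access: Record<Visibility, ViewerRole[]> = {
  private: ["owner"],
  team: ["owner", "team_member"],
  club: ["owner", "team_member", "club_staff"],
  league: ["owner", "team_member", "club_staff", "league_staff"],
  recruiters: ["owner", "team_member", "club_staff", "league_staff", "recruiter"],
  public: ["owner", "team_member", "club_staff", "league_staff", "recruiter", "anonymous"],
};

function canView(v: Visibility, role: ViewerRole, moderationOk: boolean): boolean {
  // Assumed rule: a moderation hold hides content from everyone but the owner.
  if (!moderationOk && role !== "owner") return false;
  return access[v].includes(role);
}
```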
Admin Operations
Turfi now exposes a dedicated Admin -> Media module for operational control of the media pipeline. This is the admin workspace for monitoring source videos, reviewing extracted events, resolving players, and supervising processing queues without mixing that work into unrelated data-governance screens.
Primary tabs:
Overview, Videos, Moments, Highlights, Players, Processing
Tab responsibilities:
- Overview: quick entry actions for Broadcast Game, Upload Game Video, and Upload Highlight, plus high-level media counts
- Videos: registry of all source videos and linked footage
- Moments: review and approval surface for extracted or imported video events
- Highlights: clip publishing and collection management
- Players: unresolved jersey-number or partial-name resolution
- Processing: queue and infrastructure monitoring tools for operators and engineers
Key operational queues include:
- Video Ingestion Jobs: ingestion, transcoding, and import queue health
- Unresolved Player Queue: imported events that still need player assignment
- Provider Mapping Failures: raw provider rows that failed normalization or mapping
- Moment Generation Queue: approved events waiting for moment creation
- Clip Generation Queue: approved moments waiting for segment extraction
- Video Source Registry: all source videos linked to games, owners, providers, and ingestion status
These surfaces complement the Import Engine, Entity Resolution, Admin Data registries, and downstream playback APIs already used for broader platform governance.
Reusable Service Architecture
The admin module is intentionally a thin client over shared media services. Media business logic must live in reusable platform services so the same operations can later be consumed by Studio, Club, Player, and Recruiter workflows.
Representative service responsibilities:
- createVideoSource()
- startVideoIngestion()
- importVideoEvents()
- approveMoment()
- generateHighlightClip()
- resolvePlayerIdentity()
- publishHighlight()
This keeps the operational UI replaceable while preserving one implementation path for media workflow rules.
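The service surface could be expressed as one shared interface so the admin UI and later Studio, Club, Player, and Recruiter clients all call the same implementation. The function names come from the list above; every parameter and return shape here is a guess for illustration.

```typescript
// Assumed parameter/return shapes; only the method names come from the document.
interface MediaService {
  createVideoSource(gameId: string, ownerType: string, ownerId: string, sourceUrl: string): Promise<string>;
  startVideoIngestion(gameVideoId: string): Promise<void>;
  importVideoEvents(gameVideoId: string, payload: unknown): Promise<number>;
  approveMoment(momentId: string, reviewerId: string): Promise<void>;
  generateHighlightClip(momentId: string): Promise<string>;
  resolvePlayerIdentity(queueId: string, playerId: string): Promise<void>;
  publishHighlight(clipId: string, visibility: string): Promise<void>;
}

// Trivial in-memory stub: real implementations would persist to the
// tables described in this document.
const stub: MediaService = {
  async createVideoSource() { return "gv_1"; },
  async startVideoIngestion() {},
  async importVideoEvents() { return 0; },
  async approveMoment() {},
  async generateHighlightClip() { return "clip_1"; },
  async resolvePlayerIdentity() {},
  async publishHighlight() {},
};
```

Keeping one interface means the admin module stays a thin client: swapping the operational UI never forks the media workflow rules.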
Admin UI Design Principle
Media follows the same UX principles as Data Registry:
- entity-based grids
- filters
- status badges
- action menus
- registry-style tab navigation
The goal is consistency. Media rows should be managed with the same interaction patterns administrators already use for clubs, competitions, venues, and other platform entities.
Future Extensions
Future capabilities will extend the same normalization pipeline rather than introducing a second truth model.
Planned future features:
- AI video analysis
- jersey number detection
- automatic roster matching
- trending players
- top plays feeds
- league highlight content
Future AI signals may include:
- event suggestions from video
- player candidate suggestions
- confidence scoring for review queues
- auto-detected clip boundaries
- assisted timeline alignment between provider payloads and source video
Even in those future phases, canonical truth must still remain the approved game_events layer after review.
Database Structure Update
The current database rollout uses a layered media-intelligence model rather than a single monolithic highlights table.
Core media ingestion layer
game_videos
Status: PARTIAL
- Purpose: registers every source video associated with a game, regardless of whether the source came from a provider, upload, broadcast archive, or future live ingest path
- Key fields: game_id, provider_key, source_type, source_label, source_url, storage_path, external_video_ref, duration_seconds, resolution_width, resolution_height, fps, ingestion_mode, status, video_owner_type, video_owner_id
- Foreign key relationships: expected to anchor to games.id; ownership links point at the owning player, team, club, league, organization, or admin actor depending on video_owner_type
- Associated triggers / automation: source registration is expected to fan out ingestion work by creating video_ingestion_jobs; downstream processing uses the video as the canonical source reference for imports, moments, and clips
- Pipeline fit: entry point for the full video-intelligence pipeline
video_ingestion_jobs
Status: PARTIAL
- Purpose: tracks ingestion, transcoding, poster extraction, and event-import jobs for source videos
- Key fields: game_video_id, job_type, status, progress_percent, started_at, completed_at, error_message, metadata_json
- Foreign key relationships: game_video_id -> game_videos.id
- Associated triggers / automation: ingestion workers update job state; this table is operational infrastructure rather than canonical match truth
- Pipeline fit: processing control layer between raw video registration and normalized events
video_event_imports
Status: PARTIAL
- Purpose: stores raw provider events before normalization into canonical Turfi match events
- Key fields: game_video_id, provider_event_type, provider_event_id, timestamp_seconds, provider_payload, mapped_event_type, mapping_status
- Foreign key relationships: game_video_id -> game_videos.id
- Associated triggers / automation: imported rows feed normalization/review workflows; failed mappings remain traceable here instead of mutating canonical match tables directly
- Pipeline fit: raw provider-staging layer
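The staging row and its mapping step can be sketched from the Key fields above. The mapping_status enum values and the applyMapping helper are assumptions; the point illustrated is that failed mappings stay traceable in the staging row rather than touching canonical tables.

```typescript
// Field names follow the table's Key fields list; enum values are assumed.
interface VideoEventImport {
  gameVideoId: string;
  providerEventType: string;      // provider's own vocabulary, e.g. a raw label
  providerEventId: string;
  timestampSeconds: number;
  providerPayload: unknown;
  mappedEventType: string | null; // Turfi canonical event type once mapped
  mappingStatus: "pending" | "mapped" | "failed";
}

// Hypothetical mapper: translate the provider's event type via a dictionary;
// unknown types are marked failed and remain in staging for review.
function applyMapping(row: VideoEventImport, dict: Record<string, string>): VideoEventImport {
  const mapped = dict[row.providerEventType];
  return mapped
    ? { ...row, mappedEventType: mapped, mappingStatus: "mapped" }
    : { ...row, mappedEventType: null, mappingStatus: "failed" };
}
```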
Event intelligence layer
game_events
Status: IMPLEMENTED
- Purpose: canonical match-event table used by Turfi for official event truth
- Key fields: existing event identity fields plus video-intelligence metadata such as event_source_type, event_confidence, review_status, verified_by, verified_at, primary_video_id
- Foreign key relationships: anchored to games, optionally players, and now video lineage through primary_video_id -> game_videos.id
- Associated triggers / automation: approved events are the handoff point into stat rebuilds and moment-generation queueing; historical snapshot triggers still protect readable event identity
- Pipeline fit: canonical approval layer from which all downstream stats, moments, clips, and player highlights derive
game_event_sources
Status: PARTIAL
- Purpose: tracks where canonical events originated and which source video or provider rows support them
- Key fields: game_event_id, game_video_id, source_kind, provider_event_id, confidence, is_primary, created_at
- Foreign key relationships: game_event_id -> game_events.id, game_video_id -> game_videos.id
- Associated triggers / automation: inserted during normalization or review approval to preserve lineage between approved events and their source evidence
- Pipeline fit: event-lineage table connecting raw sources to canonical match truth
Resolution layer
video_player_resolution_queue
Status: PARTIAL
- Purpose: queues imported or detected events that still need a player assignment before approval
- Key fields: game_video_id, game_event_id, provider_player_label, team_id, resolution_status, resolved_player_id, candidate_payload, created_at, resolved_at
- Foreign key relationships: game_video_id -> game_videos.id, game_event_id -> game_events.id, team_id -> teams.id, resolved_player_id -> players.id
- Associated triggers / automation: unresolved attributions are inserted here by normalization logic; admin or coach review resolves them and feeds the approved identity back into game_events
- Pipeline fit: human-governed fallback when automatic player attribution is not confident enough
Moment generation layer
video_moment_generation_queue
Status: PARTIAL
- Purpose: queue of approved events waiting for moment creation
- Key fields: expected queue identity, source-event reference, status/progress fields, and worker timestamps
- Foreign key relationships: expected to anchor to game_events.id and/or game_videos.id
- Associated triggers / automation: approved game_events automatically enqueue rows here; moment workers poll the queue through v_video_moment_generation_jobs
- Pipeline fit: handoff from approved events into derived moment creation
video_moments
Status: IMPLEMENTED
- Purpose: stores playable or timeline-aware moments tied to games and events
- Key fields: current match read paths assume fields such as id, title, minute, event_type, player_name, timestamp, thumbnail_url, video_url
- Foreign key relationships: tied to games; in the intelligence rollout, moments are also derived from approved game_events and source-video context
- Pipeline fit: reusable moment layer bridging approved events and extracted clips
Clip generation layer
video_clip_generation_queue
Status: PARTIAL
- Purpose: queue of approved moments waiting for segment extraction
- Key fields: expected queue identity, source-moment reference, status/progress fields, and worker timestamps
- Foreign key relationships: expected to anchor to video_moments.id and the originating source video
- Associated triggers / automation: new/approved video_moments enqueue rows here; clip workers poll the queue through v_video_clip_generation_jobs
- Pipeline fit: handoff from reusable moments into extracted clip assets
highlight_clips
Status: PARTIAL
- Purpose: stores extracted clip outputs derived from approved moments
- Key fields: expected clip identity, source moment/video references, start/end timestamps, output URLs, thumbnail or poster references, processing status
- Foreign key relationships: expected to anchor to video_moments.id and game_videos.id
- Associated triggers / automation: created by clip workers; successful clip writes feed player_highlight_items
- Pipeline fit: reusable extracted-clip layer used by player, league, and discovery highlight surfaces
Player highlight layer
player_highlight_items
Status: PARTIAL
- Purpose: associates generated clips to player-facing highlight feeds
- Key fields: expected player_id, clip reference, ranking/ordering fields, game/event context, visibility state
- Foreign key relationships: expected to anchor to players.id and highlight_clips.id
- Pipeline fit: final player-facing highlight association layer before read-model ranking
Worker job views
v_video_moment_generation_jobs
Status: PARTIAL
- Purpose: stable worker-facing view for moment generation jobs
- Key fields: queue identity, approved event context, source video context, status and retry-ready worker fields
- Pipeline fit: read contract for moment workers
v_video_clip_generation_jobs
Status: PARTIAL
- Purpose: stable worker-facing view for clip extraction jobs
- Key fields: queue identity, moment context, source video context, extraction timing fields, status and retry-ready worker fields
- Pipeline fit: read contract for clip workers
v_player_highlights
Status: PARTIAL
- Purpose: ranked player-highlight read model built from player_highlight_items
- Key fields: player identity, clip identity, ranking/order metadata, game/event context, media URLs
- Pipeline fit: final read model powering player profiles, recruiter discovery, and feed surfaces