Material: MediaObject, MediaObjectInstance, Track, MaterialInstance

A core functionality of the Limecraft Flow platform is the management of (audiovisual) media clips, which in their most common form represent video material with its associated audio. An extensive data model was implemented to support the various aspects of managing audiovisual media and their technical and user-defined metadata.

This document describes the part of the data model that represents the material managed by our platform.

In the Limecraft Flow platform, a logical Clip consists of several related data model business objects. To start with, a single logical Clip is represented by a MediaObject, which corresponds to a single piece of material ingested into Limecraft Flow. While the most common form of a Clip is a continuous audiovisual piece of media, a clip does not need to represent video footage: it can also be audio only, an image, or in fact any file type that Limecraft Flow handles. A clip does not even need to have a file attached to it at all: it can be a placeholder for files that will be uploaded later or that are managed externally (with only references to the digital assets stored in Limecraft Flow), or it can represent a non-digital asset if needed.

Regardless of the content it represents, the MediaObject is what the Limecraft Flow UI library displays, what platform workflows manipulate, what the UI player plays, what is shared with users for review, and so on.
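
As an illustration, the sketch below models a MediaObject as a TypeScript interface. The type and field names (ClipContentKind, contentKind, externalReference, and so on) are illustrative assumptions, not the actual Limecraft Flow API schema; the sketch only captures the idea that a clip's content kind can vary and that an attached file is optional.

    // Illustrative sketch only; type and field names are assumptions,
    // not the actual Limecraft Flow API schema.
    type ClipContentKind = "video" | "audio" | "image" | "file" | "placeholder";

    interface MediaObject {
      id: string;                    // identifier of the logical clip
      label: string;                 // display name shown in the UI
      contentKind: ClipContentKind;  // "placeholder" covers clips without any
                                     // file attached (yet)
      externalReference?: string;    // optional pointer to an externally managed
                                     // or non-digital asset
    }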

The MediaObject is the parent object of a variety of other object types that manage various aspects of a clip (a sketch of how they relate follows this list):

  • MediaObjectInstances: While the logical clip maps one-to-one to a MediaObject, each such MediaObject can exist in various versions or renditions called MediaObjectInstances, e.g., one for the original native resolution and codecs, and one for a web-compatible MP4 version. MediaObjectInstances describe the technical details of each such version.

  • Annotations: While MediaObjects and their MediaObjectInstances can carry intrinsic technical metadata (e.g., the file container, resolution, image color space, recording date, etc.), other interesting metadata is associated with MediaObjects using Annotations. Such metadata includes user-defined metadata fields and values, user comments and subclips along the clip's timeline (temporal and spatial metadata), and metadata generated by Media Intelligence services, including automated speech-to-text, subtitling and translation.

  • Tracks: Audiovisual essence that consists of multiple isochronous 'tracks' of audio and video is modelled by individual Track objects attached to a MediaObject and some of its MediaObjectInstances.

  • MaterialInstances: Deeper down the object reference chain, digital copies of MediaObjectInstances can be tracked and managed by our platform. To realize this, each (digital) copy of such an instance is represented by a MaterialInstance, which specifies where and how that copy is stored and how it can be accessed.
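
The sketch below (again using hypothetical TypeScript names rather than the actual Limecraft Flow schema) summarizes how these objects relate: the MediaObject owns MediaObjectInstances and Annotations, Tracks are attached both to the MediaObject and to its MediaObjectInstances, and each MediaObjectInstance can reference one or more MaterialInstances describing its tracked copies.

    // Illustrative sketch of the object reference chain; names and fields
    // are assumptions, not the actual Limecraft Flow schema.
    interface Track {
      kind: "video" | "audio";      // type of isochronous essence track
      index: number;                // position of the track within the essence
    }

    interface MaterialInstance {
      storageLocation: string;      // where this digital copy is stored
      accessUrl?: string;           // how the copy can be accessed, if online
    }

    interface MediaObjectInstance {
      rendition: string;            // e.g. "original" or "web-mp4"
      container?: string;           // technical details of this version
      resolution?: string;
      tracks: Track[];              // tracks present in this rendition
      materialInstances: MaterialInstance[]; // tracked copies of this rendition
    }

    interface Annotation {
      start?: number;               // optional timeline range (temporal metadata)
      end?: number;
      payload: unknown;             // user fields, comments, transcripts, ...
    }

    // Core MediaObject fields are omitted here (see the earlier sketch);
    // this only shows the parent/child relationships.
    interface MediaObject {
      instances: MediaObjectInstance[]; // versions/renditions of the clip
      annotations: Annotation[];        // descriptive and generated metadata
      tracks: Track[];                  // tracks of the logical clip itself
    }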

Click the links to learn about each of these data model business objects in more detail.