Are you a media house looking for an end-to-end workflow that ties together every stage of your content pipeline? Read on for some best-practice recommendations.
Workflow orchestration, by definition, coordinates the functions that make up a workflow, however simple or complex it may be. Media organizations today face challenges in the form of:
- Gathering content from a variety of sources, including in-house productions, content aggregators, vendors, syndicated feeds, freelancers, etc.
- Modifying and consolidating the content (editorial, graphics, audio, legal, and S&A teams may all work on different platforms yet must deliver the whole package as one)
- Delivering the content for platforms such as Satellite, OTT platforms, social media, IPTV, etc.
Whether content is created in-facility or aggregated from various sources, there is a choice of workflows to consider. The questions below will help decide which path to choose.
- Was this content produced in-house or sourced from a vendor/syndication?
- Does the content meet the channel’s requirement in terms of quality – both visually and thematically?
- Is the content meant for immediate use or for future consumption?
- Is there value for the content beyond single use, and how much of the content needs to be preserved for posterity? Would only the edit be used, or even the rushes? Would it be used in native format or transcoded?
Overview of a media house workflow
One of the first processes for any media workflow is content creation or, in the case of an aggregator, content gathering. If this is handled systematically (ensuring the correct format, relevant metadata, etc.), most of the complexities caused by human intervention can be minimized. This can also reduce the need for a transcoder in subsequent stages.
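The ingest-time checks described above can be sketched as a simple validation gate. This is a minimal illustration; the accepted containers and required metadata fields are assumptions, and a real facility would substitute its own house rules.

```python
# Sketch of an ingest-time validation gate. Rejecting malformed
# submissions here minimizes downstream transcoding and manual fix-ups.
# ACCEPTED_CONTAINERS and REQUIRED_METADATA are illustrative values.

ACCEPTED_CONTAINERS = {".mxf", ".mov", ".mp4"}   # assumed house-accepted wrappers
REQUIRED_METADATA = {"title", "source", "duration", "frame_rate"}

def validate_submission(filename: str, metadata: dict) -> list[str]:
    """Return a list of problems; an empty list means the item may be ingested."""
    problems = []
    suffix = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    if suffix not in ACCEPTED_CONTAINERS:
        problems.append(f"unsupported container: {suffix or 'none'}")
    missing = REQUIRED_METADATA - metadata.keys()
    if missing:
        problems.append("missing metadata: " + ", ".join(sorted(missing)))
    return problems
```

Run at the point of submission, a gate like this lets a contributor fix a rejected file immediately instead of discovering the problem days later in the pipeline.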
The second stage is storage of the content. High-speed SAN storage can be considered here. Though it is the fastest medium, it is not necessarily the most cost-effective, so a second-tier, cost-effective Nearline storage can be added to the setup for parking content that will not be used immediately.
Once the content starts coming in, it must be standardized into an in-house format; this is where a transcoder can be used effectively. A lower-resolution proxy can also be created at this stage for review and approval, which is particularly useful when dealing with outsourced content.
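As a sketch of this standardization step, the snippet below builds ffmpeg command lines for a full-resolution house master and a low-resolution review proxy. The codec choices and bitrates are assumptions for illustration, not recommendations; each facility defines its own house format.

```python
# Illustrative command construction for normalizing incoming media with
# ffmpeg. Profile names and bitrates are assumed, not prescriptive.

def house_transcode_cmd(src: str, dst: str) -> list[str]:
    """Full-resolution house master, e.g. an MPEG-2 essence in MXF."""
    return ["ffmpeg", "-i", src,
            "-c:v", "mpeg2video", "-b:v", "50M",   # assumed house video profile
            "-c:a", "pcm_s24le",                   # uncompressed 24-bit audio
            dst]

def proxy_transcode_cmd(src: str, dst: str) -> list[str]:
    """Low-resolution H.264 proxy for review and approval."""
    return ["ffmpeg", "-i", src,
            "-vf", "scale=-2:360",                 # 360p, width kept even
            "-c:v", "libx264", "-b:v", "800k",
            "-c:a", "aac",
            dst]
```

Building the commands as argument lists (rather than shell strings) keeps filenames with spaces safe and makes the profiles easy to unit-test before they are wired into an orchestrator.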
Editors either edit, re-edit, and modify the content to suit their demographic and channel guidelines, or use already-edited content to produce promos and shorts. For outsourced programming, content usually reaches the editor only after all the internal approval teams have signed off.
Once the edit is completed and approved, the content is delivered to its destination, such as an OTT platform or a syndication client. Accelerated file-transfer platforms are available to monitor and streamline the distribution pipeline.
After delivery, content that needs archiving is moved to a storage tier that costs far less than the SAN or Nearline storage mentioned above. Archival sometimes includes a secondary copy at a different location as part of a disaster recovery (DR) workflow.
Role of an orchestrator in automating a media house workflow
Each of the modules listed above may work either independently or be integrated to some extent. This is where an orchestration tool steps in to function as the heart and brain of the entire workflow.
The moment content is ready and available, an ingest tool from the orchestration module provides the user with a custom interface (designed uniquely for each team) to push content, locally or remotely, to a temporary storage via either an HTTP interface or an accelerated file-transfer tool.
Once the content is ingested, the Orchestrator runs some checks and, when required, liaises with a Transcode module. Content may be transcoded to a unified house format, and a low-resolution proxy can also be created, which is especially useful for higher-than-HD workflows. The source content is automatically deleted at the end of the conversion, unless it is of much higher quality than the house format and needs to be preserved for future repurposing.
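The keep-or-delete decision for source material can be expressed as a simple policy check. This is a sketch under an assumed rule (compare bitrates against a retention factor); a real policy might also weigh codec, resolution, or contractual obligations.

```python
# Illustrative post-transcode source policy: delete the original unless it
# is of meaningfully higher quality than the house format. The bitrate
# comparison and keep_factor threshold are assumptions.

def should_keep_source(source_bitrate_mbps: float,
                       house_bitrate_mbps: float,
                       keep_factor: float = 1.5) -> bool:
    """Keep the source when its bitrate exceeds the house format by keep_factor."""
    return source_bitrate_mbps >= keep_factor * house_bitrate_mbps
```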
Often, the content must pass a quality check before entering production storage; in this case, the same Orchestrator can talk to a QC tool through an API request. Once the QC tool finishes its analysis, it passes the file on for ingest. Some formats involve a change of colour space, so QC can only be performed after the file is converted to the desired format. If the QC tool rejects the converted file, the orchestrator notifies the stakeholders.
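The QC hand-off described above can be sketched as a small gate function. The QC client, notifier, and ingest step are abstract callables here; a real system would wrap the QC vendor's API behind them. All names are illustrative.

```python
# Hedged sketch of the orchestrator/QC hand-off: run QC, ingest on pass,
# notify stakeholders on reject.

def run_qc_gate(filepath, qc_check, notify, ingest):
    """qc_check(filepath) -> (passed: bool, report: str)."""
    passed, report = qc_check(filepath)
    if passed:
        ingest(filepath)                              # hand on to production ingest
        return True
    notify(f"QC rejected {filepath}: {report}")       # alert stakeholders
    return False
```

Injecting the three collaborators keeps the gate testable without a live QC service, and lets the same logic sit in front of either a pre- or post-conversion check.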
Depending on the timeline of the project the content will be used in, the orchestrator offers a choice of destination storage, be it Production or Nearline. The Orchestrator can also move or copy content from Nearline to Production storage when it is time for editing, and deferred projects can be moved from Production to Nearline without any manual intervention.
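The tier decision can be reduced to a routing rule like the one below. The 14-day horizon is an assumption for illustration; each facility would tune it to its own editing lead times.

```python
# Illustrative storage-tier routing: assets needed soon go to Production
# (SAN), deferred assets to Nearline. The horizon threshold is assumed.

from datetime import date, timedelta

def choose_tier(needed_by: date, today: date, horizon_days: int = 14) -> str:
    """Return 'production' if the asset is needed within the horizon,
    otherwise 'nearline'."""
    if needed_by - today <= timedelta(days=horizon_days):
        return "production"
    return "nearline"
```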
Once the content is in Production storage, editors can browse the necessary content directly from their editing application using a plugin from the Orchestrator and start the edit.
On completion, if an edit needs approval, a review/approval workflow can be created. The relevant stakeholders are notified when content becomes available for review, review decisions are collected by e-mail, and the corresponding actions are triggered automatically based on the responses.
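Mapping review decisions to automated actions can be as simple as a lookup table. The decision labels and actions below are illustrative, not a fixed vocabulary.

```python
# Minimal sketch of decision-driven actions in a review/approval workflow.
# Decision names and resulting actions are assumptions.

def handle_review_decision(asset_id: str, decision: str) -> str:
    actions = {
        "approved": f"promote {asset_id} to delivery queue",
        "rejected": f"return {asset_id} to editor with notes",
        "changes_requested": f"reopen edit session for {asset_id}",
    }
    if decision not in actions:
        raise ValueError(f"unknown review decision: {decision}")
    return actions[decision]
```

Keeping the mapping explicit (and failing loudly on unknown decisions) makes it easy to audit what each reviewer response actually triggers.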
Once the content is finalized and ready to be delivered for transmission, CDN delivery, social media, in-house archival, etc., the subsequent workflow is handled by the Orchestrator through various integrations, either built-in or customized as required.
Workflow Orchestrator – An integrator of modules and teams
When it comes to team complexities, there may be several groups involved: an IT team looking after the infrastructure, a QC team evaluating content fitness, an ingest team handling incoming content, editors viewing and editing the content, producers monitoring the process from ingest to delivery, a library team managing the media catalogue, and S&P teams overseeing the workflow. While these teams may work independently, they are all stakeholders in the same organization and should not function as silos. A Workflow Orchestrator for this scenario creates a hierarchy for each set of users, with permissions assigned according to function. For example, IT needs admin access for creating workflows, viewing the usage matrix, and adding and removing users, whereas editors only need access to view and edit content. Sub-admins can also be created with limited permission to add team members; all of this is possible with the orchestrator.
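The permission hierarchy above can be sketched as a simple role-to-permissions map. The role names and permission strings are assumptions chosen to mirror the examples in the text; a production orchestrator would typically manage this through its own user-management UI.

```python
# Illustrative role-based permission check. Role and permission names
# are assumed for this sketch.

ROLE_PERMISSIONS = {
    "admin":     {"create_workflow", "view_usage", "manage_users", "view", "edit"},
    "sub_admin": {"add_team_members", "view"},   # limited delegation
    "editor":    {"view", "edit"},
    "producer":  {"view", "monitor"},
}

def can(role: str, permission: str) -> bool:
    """True if the given role holds the given permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())
```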
In conclusion, a workflow orchestrator is like the conductor of an orchestra: it brings each instrument in at the right moment. However, behind every piece of technology is a human who needs to use it efficiently, so it must be flexible and easy to use no matter what complexity lies beneath.
It is also important to remember that this article is an overview of multiple scenarios and recommended best practices. Every facility has unique challenges of its own, so it is wise for the various teams in a facility to meet collectively, note down their specific challenges, reach out to workflow architects and systems experts in the industry, and, drawing on that knowledge and experience, plan a workflow that best suits the organization.