TODO: add examples of the GUI:
Details in the GUI
Next and Current line as gif / webm
hover-scrub as animated gif / webm
media-details, such as end-of-file and freeze-frames
Warnings from blueprints
TODO: make new overview
All interaction with Sofie is currently done via a web browser. This makes the system very flexible: you can quickly spin up another session if a machine goes down, have multiple operators working on the same show at the same time, or give a single operator multiple interfaces (e.g. a traditional keyboard and screen plus a touchscreen tablet used as a ShotBox). Utilizing modern web technologies allows all these sessions to stay in sync in real time.
The producer gets an overview of the entire show. This view also lets them control the show, as well as see all relevant timing information and warnings about media or playback. The producer's view consists of a vertical list of Segments, representing large chunks of the show (e.g. a story). Each Segment contains a horizontal list of Parts, each a small part of the Segment: a single clip, a piece to camera or an interview. Every Part contains items such as cameras, media items, graphics objects and audio cues. They are stacked similarly to layers in an NLE: the background at the bottom and foreground items on top.
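The Segment → Part → item hierarchy described above can be sketched as a few TypeScript types. This is a simplified illustration of the structure, not Sofie's actual data model; all names here are assumptions.

```typescript
// Hypothetical, simplified sketch of the rundown hierarchy; names are
// illustrative, not Sofie's real interfaces.
interface Piece {            // an item inside a Part: camera, clip, graphic, audio cue…
  sourceLayer: string;       // stacking layer, NLE-style: background at the bottom
  name: string;
}
interface Part {             // a small unit: a single clip, a piece to camera, an interview
  title: string;
  pieces: Piece[];
}
interface Segment {          // a large chunk of the show, e.g. one story
  title: string;
  parts: Part[];
}

const segment: Segment = {
  title: "Story: Election night",
  parts: [
    {
      title: "Intro",
      pieces: [
        { sourceLayer: "camera", name: "CAM 1" },
        { sourceLayer: "graphics", name: "Lower third: Anchor" },
      ],
    },
  ],
};
```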
We have a few tools that help the producer check whether everything will play out correctly. Sofie can currently check for freeze frames and black frames, verify the media format (resolution, frame rate and interlacing), and let the producer preview the media straight from the UI using something we call hover-scrub. Sofie also allows you to start playback from anywhere, so you don’t need to sit through a whole clip to see its last graphics item.
Sofie also has pages for status monitoring and configuration. The status page shows the state of all connected devices and is especially useful for the support crew. The configuration pages are intended for installation and allow extensive configuration of the studio, layers and devices, as well as upgrading the database after a system update and uploading Blueprints.

For the presenter we offer a special screen containing information about the current and next Part, for example which camera is currently live or how long the current media item will keep playing. We also offer a teleprompter view that displays the script and can automatically scroll to the right Segments and Parts as the show progresses. This auto-scrolling is especially useful during rehearsal, when you might not play through the show in a linear order.
While producing the show, the producer gets what we call the “next” screen on the multiviewer. It always shows the producer which source they are going to switch to next, and it is not limited to the vision mixer: for example, it can also show the next item for playback. In theory this could be extended to any part of the show, such as scripts or camera robotics. Another tool the producer has available is Ad-Libs, from the Latin ad libitum. Global Ad-Libs, such as cameras and remote inputs, are available throughout the entire show. Other Ad-Libs are context-aware and only become available when the producer is in a Part where they can be used. During the whole show Sofie can send metadata, such as indexing and timing information, to external systems.
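The distinction between global and context-aware Ad-Libs can be sketched as a simple filter. This is an illustrative assumption of how availability could be decided, not Sofie's API; the field names are invented for this example.

```typescript
// Illustrative sketch (not Sofie's API): global Ad-Libs are always offered,
// while context-aware ones are filtered by the Part currently on air.
interface AdLib {
  name: string;
  global: boolean;          // e.g. cameras and remote inputs
  validInParts?: string[];  // Part ids where a non-global Ad-Lib may be used
}

function availableAdLibs(adLibs: AdLib[], currentPartId: string): AdLib[] {
  return adLibs.filter(
    (a) => a.global || (a.validInParts ?? []).includes(currentPartId)
  );
}

const adLibs: AdLib[] = [
  { name: "CAM 2", global: true },
  { name: "Replay wipe", global: false, validInParts: ["part-sports"] },
];

// In "part-news" only the global camera Ad-Lib is offered:
availableAdLibs(adLibs, "part-news").map((a) => a.name); // ["CAM 2"]
```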
We currently (August 2019) support the Blackmagic ATEM range, CasparCG and Quantel for playback, HyperDeck for recording, and we can control Pharos lighting systems and Panasonic PTZ cameras. Through the Sisyfos controller application we support Behringer, Midas and MIDI-compatible sound mixers. We also support the Ember+, OSC, HTTP and TCP protocols, which allow us to control many more devices: audio mixers, lighting systems, tally lights, etc.
The spine of a Sofie installation is the server-core: it serves the React-based UI to the user, connects to the Database and processes incoming and outgoing data using Blueprints. The server-core is built using Meteor.
In addition to the server-core, we use gateways to connect to services and devices. Most notably we have the MOS Gateway for connecting to ENPS and the Playout Gateway for resolving the Timeline and controlling playout devices.
The Timeline is an abstraction layer for playout: it ties objects (a camera source, or a VT) to timing information. Commands are generated from this timeline on the fly by comparing the current state with the next state at a given time and calculating which commands are needed to reach the next state.
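The state-diffing idea can be illustrated with a minimal sketch: compare the current and next device state per layer and emit only the commands needed to get from one to the other. All names and the command format here are assumptions for illustration, not the real resolver.

```typescript
// Minimal sketch of timeline state diffing (assumed names, not Sofie's
// actual resolver): each state maps a device layer to its active source.
type State = Record<string, string>; // layer -> active source

function diffToCommands(current: State, next: State): string[] {
  const commands: string[] = [];
  // A layer whose source changes needs a command to reach the next state…
  for (const [layer, source] of Object.entries(next)) {
    if (current[layer] !== source) {
      commands.push(`SET ${layer} -> ${source}`);
    }
  }
  // …and a layer that disappears from the next state needs to be cleared.
  for (const layer of Object.keys(current)) {
    if (!(layer in next)) commands.push(`CLEAR ${layer}`);
  }
  return commands;
}

diffToCommands(
  { pgm: "CAM 1", dsk: "lower-third" },
  { pgm: "VT 1", dsk: "lower-third" }
); // ["SET pgm -> VT 1"]
```

Only the program bus changes, so only one command is emitted; the unchanged downstream keyer generates no traffic to the device.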
The objects on the Timeline are put there by server-core, based on the rundown. But before these objects can be put on the timeline they need to be created; this is one of the tasks of the Blueprints. The Blueprints take a change in the input data (e.g. from ENPS) and process it into objects that can drive the UI as well as be put on the timeline.
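The Blueprint transformation can be sketched as a pure function from ingest data to Parts and timeline objects. This is a hedged illustration of the concept only; the interfaces and field names below are invented and do not match Sofie's Blueprint API.

```typescript
// Hypothetical sketch of the Blueprint idea: turn a change in the ingest
// data (e.g. an ENPS story) into a Part for the UI plus timeline objects
// for playout. All names here are illustrative assumptions.
interface IngestStory {
  slug: string;    // story name from the NRCS
  clipId?: string; // optional clip attached to the story
}
interface TimelineObject {
  layer: string;
  content: string;
}
interface PartResult {
  title: string;
  timelineObjects: TimelineObject[];
}

function storyToPart(story: IngestStory): PartResult {
  const objects: TimelineObject[] = [];
  if (story.clipId) {
    // A story with a clip produces a VT object for the playout timeline.
    objects.push({ layer: "vt", content: story.clipId });
  }
  return { title: story.slug, timelineObjects: objects };
}

storyToPart({ slug: "WEATHER", clipId: "clip-42" });
// → { title: "WEATHER", timelineObjects: [{ layer: "vt", content: "clip-42" }] }
```

Because the function is pure, the same ingest change always yields the same UI and timeline objects, which keeps re-ingests idempotent.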