01. Assimilate Product Suite 9.9 BETA


General

This is a beta release of the Assimilate Product Suite version 9.9. This version requires an updated license. If you want to give this beta a try and have a valid v9.8 license – either a subscription or a permanent license with a support contract that is valid for at least another month – then contact licensing@assimilateinc.com to update your license. Note that this version is not fully backward compatible. Downgrading from this version might not render the same results and might require additional manual steps to get things back to their original state. Always create a backup of your projects before upgrading.

Download the latest Assimilate Product Suite build from here:

Release History

21 Oct '25, build 1192

  • Start open beta v9.9 - see release notes below.

Release Notes

Multi Display Output

This version allows you to use any number of (GPU) displays rather than a single Dual Head output next to the main UI. This opens up many new possibilities for controlling complex LED volumes with minimal latency, outputting multi-camera live compositions, and more.

The displays to use are selected through the existing VideoIO module that is also used for selecting SDI and NDI outputs (and inputs). The former VideoIO panel has been split into two separate panels: the Device List and the Channel Selection and Settings panel. In the Device List panel, you can quickly enable or disable a device with a single click, and you can customize the device name by clicking the Edit button or by clicking the name directly. This helps to maintain a potentially long list of outputs with generic or device-coded names.
Using multiple display outputs from a single or multiple GPUs might require explicit (frame) synchronization, which currently can only be done with an Nvidia RTX Pro Sync solution (formerly known as Quadro Sync). 

Read more about using multiple (GPU) displays, synchronization of displays, color management and HDR output, as well as how to manage systems with multiple GPUs, in this article.

Note that with this version the VideoIO API interface changed, which might cause some custom settings not to be read correctly. Please check the VideoIO settings in the Player - Settings - Monitor menu.

Also note that the traditional Dual Head (in the System Settings) and the new VideoIO GPU outputs are mutually exclusive. As soon as you activate one, the other is deactivated to prevent confusion. Going forward, the Dual Head function will be deprecated.

Multi-Node System Setup

The new Multi-Node Setup links multiple Live FX systems together for synchronized playback, grading and compositing to volumes that cannot be run from a single system – either because the performance of a single system does not allow for it or because the setup needs to be operated by multiple people.

The Multi-Node Setup offers three levels of synchronization:

  • Sync-Player. This only syncs the play position and play mode (play/pause) of the connected systems – independent of the media that is loaded or the grading that is applied. (This is an enhanced version of the existing Sync-Player function.)
  • Sync-Project. All connected systems have access to the same project database(s) – either by using shared storage or by copying the project database(s) to each of the local drives. All systems sync to the master project, timeline, shot selection and playback mode. This should be used when the output is fully predefined. Media / grading / composition changes on the master are not automatically replicated.
  • Sync-Session. Client systems need access to the same media as the master system but automatically sync to the master shot selection. Playback as well as grading and composition changes are automatically replicated to all clients. This mode also offers the option to (temporarily) assign any of the connected clients as moderator system, which then controls the playback mode and can grade the image – which is still replicated to all connected systems. This offers e.g. the option to connect a grading system / laptop to the core projection setup and give each user (role) its own system.

The Multi-Node Setup is available from the startup screen and from the Tools drop-down menu in the Player. In the multi-node panel, you set the role (master/client) and the sync level (Player/Project/Session). The multi-node setup can be set to auto-start when the software starts.

If the system includes an Nvidia RTX Pro Sync card, then Frame-Sync is available. This ensures that playback on the connected systems is frame-synchronized.

Note that the multi-node setup still requires some initial manual configuration. Not every aspect of a client system can be controlled from the master system. You install and set up a client system just like any Live FX system. But once set up and with the Auto-Start option enabled, the client system is automatically operational as part of the multi-node environment.

For all the details on the multi-node setup, go here.

Multi-Node System revisited

Next to the core multi-node operational synchronization functions, this version also adds a new Shared Tray (next to the Project and Gallery trays) to efficiently share grades and compositions between systems. This allows for a workflow where setups are prepared on a system independent from the core projection system, but can be shared and used with a few clicks.

The Shared tray points by default to a local system folder (System Settings panel), but it is meant to point to a shared network drive so multiple systems can access the same tray. Originally, galleries could only contain grades, but in this version there is the option to store the full composition (e.g. a projection node tree) in the gallery. When recreating a composition from the tray, you potentially only have to update the media references. Composition gallery items are distinguishable from grade items by a special icon in the top right corner and the interior of the proxy.

Do note that the new Shared tray replaces a prior – by default hidden – System tray. The location of that tray could be changed and was tied to where render presets were stored. If that path was changed, you might have to relocate your render presets.

Stage Manager

The Stage Manager, and in particular its mapping options, has been significantly improved. The new Stage Manager panel has four tabs:

  • Stage – create or select a stage, with new functions to export/import a stage setup so you can easily copy a setup to another system. The stage export can contain the wall mesh data inline, so you only have a single file to copy. From this tab you also control how new wall meshes are managed (copied/referenced) and have the option to consolidate all meshes that are used.
  • Walls – create / manage LED walls in the setup. Each wall now has an Active property, which makes it easy to include/exclude a wall when creating projection compositions.
  • Mappings – this is where the biggest updates are located. From here you can now map any wall configuration to the available display outputs, which – together with the multi-GPU output – offers full flexibility and less dependence on e.g. Nvidia Mosaic to map complex wall configurations. You select any (part of a) wall and map that to any (part of a) display. You can split up walls based on pixels or tiles. You can create virtual displays to prepare a stage setup on any system, copy the setup over and quickly replace the (virtual) display of a mapping with the online / physical display.
  • Previs – this allows you to load actors (models) into the 3D stage view in the Stage Manager display and create a snapshot image, either of the 3D view or of the camera view. The images are stored in the trays and can be useful when preparing a scene, e.g. for use in a storyboard. (Note that you can also create complex previs animations by keyframing camera movement in the Camera menu or any projection setup.)

More Stage Manager updates

  • Mappings are only displayed when using a Switcher node. All new projection setups that are created use a Switcher node as root. In the Channel Controller panel, select the Stage Mapping option with a display to output the mapping configuration from the Stage Manager.
  • Mapping positions now use the top left as origin, and mappings can include a rotation, so all pixels of a display output can be used.
  • Mapping selection and positioning can be done from the numeric controls, but also directly within the view, which includes snapping and a lasso tool (hold down Ctrl + drag) to select a segment of the active wall. All available keyboard combinations are shown in the Quick-Key help panel, which can be opened with ‘h’.
  • The model view pan/zoom settings in the Stage Manager are now stored, so the model view comes up the same as it was left in the last session. Use the (R)eset button in the lower right corner to switch back to the default view.
  • Added a label display with the wall (and actor) properties that shows the model size (width + height + depth).
  • Note that on first use of this version, all existing mappings are updated to the new format. If you go back a version, the old settings will be active again.

For more details on the new Stage Manager go here.

Unreal Workflow

Various extensions and improvements to the Unreal workflow (tools) - the Unreal Live Link and Live FX Live Link Plug-in for Unreal.

Manage Unreal Projects

A new function to directly start Unreal projects from within the Live Links panel in Live FX. Select an Unreal project from disk, select the corresponding Unreal version, select the Scene/Level in the project to run and hit Start. Unreal can be started in one of three modes: Editor, Game (no interface) or nDisplay mode (if an nDisplay configuration is available – see below). Each of these modes has its own set of startup / command-line parameters, the most important of which can be set directly from the Unreal Live Link panel interface. The project, version and level references are stored so they are directly available on every restart.

The intent is that you can manage your Unreal project fully from within Live FX. Initially you would start the project in Editor mode to create a new Live FX camera + live link to output a texture share (see new functionality of the Live FX Unreal plug-in below). After the Editor / setup session, the Unreal project can be started in Game mode, where you just specify the output resolution but do not need any UE interface. Optionally you can link the project to an Unreal multi-user session.

When the Auto-Close option is enabled in the Live Link panel, any Unreal project started from there is also automatically closed when opening another project or when exiting the Live FX project.

Note that the project start options are currently only available on Windows.
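
The three start modes follow Unreal's regular command-line conventions. The sketch below is only an illustration of what such invocations can look like, not the exact commands Live FX issues; the executable and project paths, level name, resolution and nDisplay node id are placeholders, and the actual arguments may differ per Unreal version.

    # Illustrative sketch only: typical Unreal command lines for the three start modes.
    # All paths, the level name and the nDisplay node id below are placeholders.
    import subprocess

    UE = r"C:\Program Files\Epic Games\UE_5.5\Engine\Binaries\Win64\UnrealEditor.exe"
    PROJECT = r"D:\Unreal\Stage\Stage.uproject"
    LEVEL = "/Game/Maps/MainStage"

    editor = [UE, PROJECT, LEVEL]                              # Editor mode: full UE editor
    game = [UE, PROJECT, LEVEL, "-game",                       # Game mode: no editor interface,
            "-ResX=3840", "-ResY=2160", "-windowed"]           # fixed output resolution
    ndisplay = [UE, PROJECT, "-game", "-messaging",            # nDisplay mode: run as a cluster node
                "-dc_cfg=D:/Unreal/Stage/stage.ndisplay",      # exported nDisplay configuration
                "-dc_node=node_0"]

    subprocess.Popen(game)                                     # start, for example, the Game-mode variant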

Live FX Live Link Unreal Plug-in

A new version of the Live FX Live Link plug-in for Unreal is available for Unreal versions 5.3, 5.4, 5.5 and 5.6. The plug-in update has a new setup panel that is opened from the (new) Live FX toolbar button in the UE editor, which is available after adding the plug-in to a project. Through this new panel you can:

  • do a full setup: create a new Live Link instance and a Live FX Cinema camera in the current level of the project, and create a Live Link preset with the project so that on each subsequent startup the Live Link is automatically instantiated.
  • just create a new Cinema camera. If a Live Link preset is available, it is tied to this camera automatically.
  • convert a Live FX Stage (.lfxs) file into an nDisplay configuration file, which is used to start the Unreal project in nDisplay mode. The Live FX Stage file can be exported from the Stage Manager panel.

Note that the new plug-in is backward compatible and can work with v9.8 of Live FX, although the nDisplay workflow is only available with v9.9. 

For the Texture Share functionality to work with the nDisplay start mode of Unreal, the Unreal Texture Share node was extended with an explicit option for nDisplay.

Navigation mode and Bookmarks

The Unreal Live Link, which passes on camera tracking data to the Live FX Unreal plug-in, already contained an option to set an offset from the (physical) camera position to adjust the actual position in the Unreal scene. To set that offset more intuitively, a Navigation mode was added. When toggling Navigation mode on in the Unreal Live Link, a series of quick-keys and mouse gestures become available to navigate in the Unreal scene, which is visible in the View Port either through a Texture Share or a VideoIO capture node. The available quick keys, for moving forward/back, up/down, pan/tilt, etc., are shown at the top of the screen when in Navigation mode.

Each position in the Unreal scene can now also be quickly stored and recalled by setting a bookmark. Each bookmark can have a custom name and is stored for subsequent sessions.

Generate High Res Images

Through the Generate option on the Projects sub-tab of the Live Link panel, you can have Unreal generate a (high resolution) image of the current scene. This can be a standard 2D image of the current camera pose or an equirectangular image. Next to the type, you also set the format and desired resolution. This function needs the new Live FX Unreal plug-in and presumes that Live FX and Unreal run on the same system. 

The image is generated in the default Snapshot media folder of Live FX and, when properly generated by Unreal, is automatically loaded into the current Live FX session, placed in the Snapshot Tray and attached to the pen.

For more details on the various Live FX - Unreal workflows go here.

Stage Lights

  • Added snapping functionality with the fixture sample box overlays. This function can be toggled on/off in the Config menu or can work ‘inverted’ by holding down the shift key when dragging the fixture overlay to resize or reposition it. Snapping happens based on the location of other fixtures and the edge of the image. When snapping, a guideline is visible to show the element it snaps to.
  • Added a lasso select tool (click+drag) for fixtures. When also holding down Ctrl with the lasso tool, the selection can be extended or reduced.
  • You can now resize fixture groups inside fixture groups, including snapping. This makes managing and positioning high numbers of fixtures much easier.
  • Updated the display of fixture overlays, with a clearer indication of which axes are updated. In case of a multi-select, fixtures from different group levels are shown in grey as they cannot be moved or resized together.
  • The quick-keys for snapping and resizing fixtures are listed in the Quick-Key overview, which can be opened by pressing ‘h’.
  • The default universe for new fixtures is now 0, to prevent conflicts with existing fixtures when creating a new one. Universe 0 is not sent out and is not included when checking for conflicts.
  • Mapping update. When mapping to a group-fixture, the dimmer/color/grade values are now properly applied to the underlying child fixtures.

General / Misc

  • Support for the HAP codec, both encoding and decoding. HAP can be used to ensure realtime playback with (very) high resolution media and can be an alternative for e.g. NotchLC. HAP comes in the following flavors, in order of quality: HAP, HAP Alpha, HAP Q, HAP Q Alpha, HAP R, HAP HDR. Next to the codec flavor, the encoder has an additional quality setting that relates to the encoding/decoding speed and, with that, also the image quality. Finally, there is an option for so-called Snappy (lossless) compression to reduce the file size. The HAP reader is part of the regular QuickTime reader. The HAP writer is available in the Render tab with the Finishing toolset. (A small example of creating HAP test media outside of Live FX is included after this list.)
  • Added a Pause button to the OSC source Live Link to easily suspend all external live-links and cues temporarily.
  • Added a Flair MRMC Camera Animation Import function. This function is available from the Player - Tools dropdown menu. The function can connect to a Flair system on the network, import the currently active camera animation and apply it to the active shot-camera. The function requires the latest version of Flair and that the application has the gRPC protocol enabled. The imported animation keyframes can be applied from any frame position in the active shot and can be spread out over any number of frames - by default it uses one keyframe per frame. The aim of the importer is to have an easy way to sync the camera robot motion with LED wall projected media, which can be played at any (different) framerate. In addition to the import function, an OSC Flair play-state cue was added in the OSC Sources Live Link. This makes sure that, next to the sync of the animation, playback start with Flair is also in sync. Note that this function is (for now) only available on Windows.
  • Added support for OpenTrackIO, both a Tracker (reader) and a Sender (writer) Live Link. OpenTrackIO is a new standard for passing (camera) tracker data (https://ris-pub.smpte.org/ris-osvp-metadata-camdkit/). The OpenTrackIO writer Live Link sends out the data of the active shot camera. Both the Tracker and the Sender have a Unicast and a Multicast option. The latter is the default and recommended transport method – where you just set a source-id to capture the intended source. (A minimal external reader sketch is included after this list.)
  • You can now activate a shot camera without having to create a (dummy) layer. The Live and Projection setup functions will no longer create this layer for new setups.
  • On the Cylindrical to Equirectangular and Cylindrical to Wall nodes you can now set the cylindrical coverage in degrees as well as in pixels.
  • The Nest node now has a button to link the X and Y Scale controls.
  • The layer and plug-in overlays now show normally when the top node is locked. Note though that they are not adjusted for any scaling that might happen upstream.
  • When toggling the Relative mode of a layer in the Canvas menu, you could hold down Ctrl to ensure that the layer keeps its current position while the camera is not in its default position. By holding Ctrl, the Canvas position and rotation were combined / updated with the current camera position and rotation, so the layer keeps its actual position. This behaviour is now the default. Holding down Ctrl now does not combine position and rotation, meaning that the layer will jump if the camera is not at the origin.
  • Added a new [Direct->Wall] ‘projection’ node, which is effectively not a projection node but just behaves like the other projection nodes in that its output is tied to a wall and visible in the Stage Manager. This node just puts its input onto the wall without taking the geometry of the wall or the camera position into account. The node has offset and scaling controls to adjust the size and position of the image on the wall, as well as a background-input to fill the wall completely. The new option is available in the Projection Setup panel.
  • Changed the default loop mode for Fill/Mattes from Once to Loop.
  • The Sync-Mode and Vertical-Sync settings (in the Player Settings menu) are now system settings and no longer project settings. Please explicitly check the settings after installing this version!
  • For a shot without valid alpha, rendering the primary layer would result in a transparent instead of an opaque image. Now a default alpha of 1 is assumed, and the alpha channel options in the Balance menu work more correctly when viewing the alpha in the view port. This change might affect existing compositions. The easiest way to mimic the old behaviour is to set the Alpha in the Balance menu to 0.
  • You can now double-click the proxy images in the fill/matte menu to navigate to the corresponding node of that shot.
  • When a crash occurred during a render, you could lose all render queue updates from just before the render. The system settings (including the render queue) are now always saved before starting a new render.
  • Updated the MoSys tracker Live Link to include an explicit block when the new tracker frame for a new image frame has not arrived yet. MoSys only sends a single tracker frame per image (whereas other trackers send tracker data at a higher rate than the image framerate). To adjust for network timing, you sometimes had to add a small delay on the MoSys tracker to prevent the same tracker data being used for two consecutive frames.
  • Added an explicit option to set a matte in the Live Setup panel: Chroma key, Embedded (in the live capture), VideoIO channel (for use with e.g. Ultimatte), External (separate shot).
  • Use quick key Alt+F2 to rename a construct in the project tree (without having to jump into Edit mode).
  • Added a system setting to always open projects in read-only mode. This is primarily intended for clients in a multi-node setup, but can also be useful to protect the project database when sharing it with a pure review system. When the mode is enabled, a lock icon appears on the startup screen.
  • Added a system-name property in the system settings panel. By default, this is the computer name, but it can be updated to any name. The system-name is e.g. used for the multi-node setup.
  • ACES AP0 as a colorspace for displays was explicitly disabled but is now available again, as some Video IO plug-ins prefer to get their input in ACES.
  • Fixed an issue where, when applying a nest in the node view on a referenced node, the nest could be inserted on the wrong input of a switcher node.
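
As referenced in the HAP item above, HAP media for testing can also be created outside of Live FX with ffmpeg, whose HAP encoder exposes a comparable flavor choice and the optional Snappy compression (the basic HAP / HAP Alpha / HAP Q flavors). A minimal sketch, assuming an ffmpeg build with HAP encoding support is on the PATH; the file names are placeholders:

    # Minimal sketch: encode a clip to HAP Q in a QuickTime container via ffmpeg.
    # Assumes an ffmpeg build with HAP encoding support; file names are placeholders.
    import subprocess

    def encode_hap(src: str, dst: str, flavor: str = "hap_q", snappy: bool = True) -> None:
        cmd = [
            "ffmpeg", "-y",
            "-i", src,
            "-c:v", "hap",                                    # HAP video encoder
            "-format", flavor,                                # hap | hap_alpha | hap_q
            "-compressor", "snappy" if snappy else "none",    # optional Snappy (lossless) compression
            dst,
        ]
        subprocess.run(cmd, check=True)

    encode_hap("test_plate.mov", "test_plate_hapq.mov")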
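
As referenced in the OpenTrackIO item above, the sketch below illustrates what a minimal external reader of multicast OpenTrackIO data could look like. The multicast group, port, the assumption that each datagram carries a bare JSON sample, and the field names used here are illustrative only; consult the OpenTrackIO specification and the Live FX Sender settings for the actual values in your setup.

    # Illustrative sketch: listen for OpenTrackIO samples over UDP multicast and
    # print the first transform. GROUP/PORT and the JSON field names are assumptions.
    import json
    import socket
    import struct

    GROUP = "235.135.1.1"   # assumed multicast group
    PORT = 55555            # assumed port

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    membership = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)

    while True:
        payload, _addr = sock.recvfrom(65535)
        sample = json.loads(payload)                          # sample payload as JSON
        transform = (sample.get("transforms") or [{}])[0]
        print(transform.get("translation"), transform.get("rotation"))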




Details
Last Modified: Tuesday, October 21, 2025
Last Modified By: Peter
Type: BETA