ftrack Connect Pipeline Framework

Welcome to ftrack-connect-pipeline’s documentation!

Important

This is the new DCC Framework for Connect 2 written in Python. If you are migrating from the old ftrack Connectors integrations then please read the dedicated migration guide.

This document is intended to bring a deeper understanding of how the new Framework works under the hood, enabling developers to customise it to their particular needs.

For a general introduction on how to install and use ftrack Connect 2 out of the box, please head over here: https://www.ftrack.com/en/portfolio/connect

The documentation is divided into two main parts: the first describes the new Framework, and the second shows how to customise it to fit your needs.

Introduction

The ftrack pipeline Framework allows you to publish, load and manage assets through a consistent set of APIs and interfaces, streamlining the production process.

This section describes the new Framework, how it is structured and its different components.

Important

We recommend you read through this part before attempting to do any further customisations.

Overview

Prerequisites

To be able to get the most out of this document, we assume you have basic knowledge of:

  • What ftrack (https://ftrack.com) is and what it does.

  • Python, specifically the ftrack Python API.

  • Source code management, specifically Git.

  • DCC applications, i.e. Maya, Nuke.

  • The ftrack Connect 2 package: installation, application launch, publish and load.

Key design elements and tradeoffs

The new Framework is designed to be a starting point for developers who seek to further extend it and use it as a base for a complete studio toolset.

The key design elements are:

  • Modular design; divided into different layered core plugins with well-defined APIs.

  • Schema based definitions; enabling customizable departmentalised processes.

  • Python 3 compatibility; aligning with the latest VFX standards.

  • Host - UI abstraction layer; enabling non-Qt DCC applications or even headless operation.

  • Open source; everyone should be able to contribute to the new Framework.

The new Framework attempts to strike a balance between built-in functionality and configurability, while remaining easy to understand and adopt.

One main design decision was to build a central definition repository containing schemas and plugins. The main reason is that DCC apps usually share a lot of functionality within a medium to large pipeline; keeping this inside each integration plugin would cause a lot of duplicate code that needs to be maintained.

Architecture

[Diagram: _images/architecture.svg]

ftrack

ftrack acts as the database from which the Framework retrieves and stores information. These are the key entities involved with the Framework:

  • Context; The task entity that the DCC work session is bound to, and comes from the task launched within ftrack or Connect.

  • AssetVersion; Created during publish, resolved by assembler during load.

  • Component; Created during publish, loaded and managed as an asset.

Python API

The ftrack Python API is the core dependency of Connect and the Framework, enabling communication with the ftrack workspace.

The API also comes with the ftrack location system, allowing storage and resource identifier (file path) aware implementations.

Connect

Connect is the ftrack desktop GUI application enabling DCC application launch and publishing. Connect locates the Framework plugins, and together with the Application launcher plugin, enables discovery and launch of DCC applications.

The Framework modules are Connect plugins, which means they need to implement a launch hook containing the logic to discover and launch the DCC application.

Application launcher

The Application launcher is a Connect plugin responsible for discovery and launch of DCC applications.

The application launcher reads its JSON configuration files and performs discovery of DCC applications.

The tutorial part of this documentation shows an example on how to modify the configuration.

For more information on how the application launcher works, please refer to the ftrack Application Launcher documentation.

Connect Package

The Connect package is a build of Connect, shipped as an installer, available for each major operating system platform.

Framework

The Connect pipeline Framework is divided into four layers:

Core Pipeline layer

The core pipeline Framework module is the backbone of the pipeline, on which all other modules rely.

It is in the core where all interaction with the underlying host type is performed, except when it comes to the bootstrap of the DCC.

The core depends on the definition module being present.

The module comprises four major components:

Host

To use the Framework, a host must be instantiated with an event manager.

The host:

  • Loads the supplied context (task) from the FTRACK_CONTEXTID environment variable.

  • Discovers and validates each definition against its schema.

  • Serves each Client with data, handles context change.

  • Runs definitions by instantiating an Engine.

  • Manages logging by listening to client notifications.
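
A minimal sketch of how a host might be brought up, assuming the ftrack-connect-pipeline module layout (event.EventManager, host.Host, constants.LOCAL_EVENT_MODE) and that FTRACK_CONTEXTID is set in the environment; treat the exact imports as an approximation of the core API rather than a definitive recipe:

import ftrack_api

from ftrack_connect_pipeline import constants, event, host

# Create an API session; the event hub connection is handled by the event manager.
session = ftrack_api.Session(auto_connect_event_hub=False)

# The event manager decides whether Framework events are handled locally or remotely.
event_manager = event.EventManager(
    session=session, mode=constants.LOCAL_EVENT_MODE
)

# Instantiating the host discovers and validates definitions and starts serving clients.
host.Host(event_manager)

This is also, in essence, how the standalone “python” host type described later in this document is started.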

Client

The Client is the user-facing component that communicates with the host through the host connection, over the ftrack event system.

Clients are categorised by engine type; see below.

Clients read the definition and context from the host, and then command the host to run the augmented definition and its plugins with options collected from the user.

Engine

A Framework Engine is a module within the core that defines a function and requires an associated schema. The currently defined engine types are:

  • Publisher

  • Loader

  • Asset manager

  • Opener

Logs

Clients send notifications to the host, which are stored in an internal SQLite database that is valid for the duration of the session.

Definition layer

The definition pipeline module is where each definition, schema and Framework plugin is stored.

As mentioned previously, the new Framework is designed to make it easy to write custom code that takes care of asset load and publishing. It achieves this by introducing “definitions”: JSON configuration files that specify which Framework plugins (loaders and publishers) to run for a certain ftrack asset type. This module is where you will most likely make customisations in order to tailor the Framework to your studio's needs.

The definitions module is divided into two parts:

resource/
    definition/
    plugins/

definition

This directory contains the definition JSON configuration files for each engine and host type.

We have highlighted some files of importance, leaving out built-in definitions that would not be touched by a potential customisation:

asset_manager/
   ..
loader/
   ..
   maya/
      geometry-maya-loader.json
publisher/
    ..
    maya/
        geometry-maya-publish.json

loader

Loader definitions, used by the Assembler client during load of assets.

maya/

Maya loader definitions.

loader/maya/geometry-maya-loader.json

The Framework definition for loading geometry asset versions into Maya:

{
    "type": "loader",
    "name": "Geometry Loader",
    "asset_type": "geo",
    "host_type": "maya",
    "ui_type": "qt",
    "contexts": [
        {
            "name": "main",
            "stages": [
                {
                    "name": "context",
                    "plugins": [
                        {
                            "name": "context selector",
                            "plugin": "common_passthrough_loader_context",
                            "widget": "common_default_loader_context"
                        }
                    ]
                }
            ]
        }
    ],
    "components": [
        {
            "name": "snapshot",
            "file_formats": [
                ".mb",
                ".ma"
            ],
            "stages": [
                {
                    "name": "collector",
                    "plugins": [
                        {
                            "name": "Collect components from context",
                            "plugin": "common_context_loader_collector"
                        }
                    ]
                },
                {
                    "name": "importer",
                    "plugins": [
                        {
                            "name": "Import paths to Maya",
                            "plugin": "maya_native_loader_importer",
                            "options": {
                                "load_mode": "import",
                                "load_options": {
                                    "preserve_references": true,
                                    "add_namespace": true,
                                    "namespace_option": "file_name"
                                }
                            }
                        }
                    ]
                },
                {
                    "name": "post_importer",
                    "plugins": [
                        {
                            "name": "maya",
                            "plugin": "common_passthrough_loader_post_importer"
                        }
                    ]
                }
            ]
        },
        {
            "name": "model",
            "file_formats": [
                ".mb",
                ".ma"
            ],
            "stages": [
                {
                    "name": "collector",
                    "plugins": [
                        {
                            "name": "Collect components from context",
                            "plugin": "common_context_loader_collector"
                        }
                    ]
                },
                {
                    "name": "importer",
                    "plugins": [
                        {
                            "name": "Import paths to Maya",
                            "plugin": "maya_native_loader_importer",
                            "options": {
                                "load_mode": "reference",
                                "load_options": {
                                    "preserve_references": true,
                                    "add_namespace": true,
                                    "namespace_option": "file_name"
                                }
                            }
                        }
                    ]
                },
                {
                    "name": "post_importer",
                    "plugins": [
                        {
                            "name": "maya",
                            "plugin": "common_passthrough_loader_post_importer"
                        }
                    ]
                }
            ]
        },
        {
            "name": "cache",
            "file_formats": [
                ".abc"
            ],
            "optional": true,
            "selected": false,
            "stages": [
                {
                    "name": "collector",
                    "plugins": [
                        {
                            "name": "Collect components from context",
                            "plugin": "common_context_loader_collector"
                        }
                    ]
                },
                {
                    "name": "importer",
                    "plugins": [
                        {
                            "name": "Import paths to Maya",
                            "plugin": "maya_abc_loader_importer"
                        }
                    ]
                },
                {
                    "name": "post_importer",
                    "plugins": [
                        {
                            "name": "maya",
                            "plugin": "common_passthrough_loader_post_importer"
                        }
                    ]
                }
            ]
        },
        {
            "name": "game",
            "file_formats": [
                ".fbx"
            ],
            "optional": true,
            "selected": false,
            "stages": [
                {
                    "name": "collector",
                    "plugins": [
                        {
                            "name": "Collect components from context",
                            "plugin": "common_context_loader_collector"
                        }
                    ]
                },
                {
                    "name": "importer",
                    "plugins": [
                        {
                            "name": "Import paths to Maya",
                            "plugin": "maya_native_loader_importer",
                            "options": {
                                "load_mode": "import",
                                "load_options": {
                                    "preserve_references": true,
                                    "add_namespace": true,
                                    "namespace_option": "file_name"
                                }
                            }
                        }
                    ]
                },
                {
                    "name": "post_importer",
                    "plugins": [
                        {
                            "name": "maya",
                            "plugin": "common_passthrough_loader_post_importer"
                        }
                    ]
                }
            ]
        }
    ],
    "finalizers": [
        {
            "name": "main",
            "stages": [
                {
                    "name": "pre_finalizer",
                    "visible": false,
                    "plugins": [
                        {
                            "name": "Pre finalizer",
                            "plugin": "common_passthrough_loader_pre_finalizer"
                        }
                    ]
                },
                {
                    "name": "finalizer",
                    "visible": false,
                    "plugins": [
                        {
                            "name": "Finalizer",
                            "plugin": "common_passthrough_loader_finalizer"
                        }
                    ]
                },
                {
                    "name": "post_finalizer",
                    "visible": false,
                    "plugins": [
                        {
                            "name": "Post finalizer",
                            "plugin": "common_passthrough_loader_post_finalizer"
                        }
                    ]
                }
            ]
        }
    ]
}

Attributes:

  • type; Definition type, binds to the host engine names.

  • name; The name of the definition should be kept unique within the pipeline.

  • host_type; The type of host this definition should be available to, basically the name of the DCC application.

  • contexts; Section that defines the plugin to use when selecting the context (task) and the asset version to load.

  • components; Section that defines each loadable component (step) - which definition plugins and options to use to collect and load it into the DCC app. See the plugin and widget directories below.

  • finalizers; Section that defines plugins that should be run after load has finished.

Publisher

Publisher definitions, used by the Publisher client during publish of assets.

The structure of a publish definition is very similar to the loader, with different plugins and options.

Asset Manager

Defines the plugins and options that are used with the Framework asset manager client and engine.

The Assembler dependency resolver options are also defined here, allowing tuning of which asset types are to be resolved for a certain task type.

Schema

JSON configuration files defining the rules that apply to the syntax of definitions (asset manager, loader and publisher). Typically you will not alter these files, but you can add your own definition attributes here, which can then be picked up by the plugins.
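
As an example, if a hypothetical my_custom_option attribute were added to a plugin entry in a definition (and permitted by the schema), the plugin could read it from its options at run time; the run signature below matches the plugin examples shown later in this document:

..
    def run(self, context_data=None, data=None, options=None):
        # 'my_custom_option' is a hypothetical attribute added to the plugin
        # entry in the definition; fall back to a default if it is not set.
        my_custom_option = (options or {}).get('my_custom_option', 'default')
        self.logger.debug(
            'Running with my_custom_option={}'.format(my_custom_option)
        )
..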

plugin

The plugins are where the code referenced within the definitions lives. The plugins for each host type depend on the Framework core, the Qt module and the corresponding DCC plugin.

Plugin structure:

..
maya/
    python/
        loader/
            importers/
                widget/
                    maya_native_loader_importer_options.py
                maya_native_loader_importer.py
                ..
            finalizers/
                maya_merge_abc_loader_finalizer.py
        publisher/
            collectors/
                widget/
                    maya_geometry_publisher_collector_options.py
                maya_geometry_publisher_collector.py
                ..
            validators/
                maya_geometry_publisher_validator.py
                ..
            exporters/
                maya_abc_publisher_exporter.py
                ..
            finalizers/
                publish_result_maya.py
        opener/
            ..
common/
    python/
        asset_manager/
        ..
..

maya/

Plugins for Maya hosts.

maya/python/loader/importers/

Directory that should harbour Python plugins responsible for collecting options and doing the actual loading into the DCC app.

maya/python/loader/importers/widget/maya_native_loader_importer_options.py

This Qt widget plugin defines the UI elements presented to the user, so the user can set the load options. These load options are then read by the loader plugin below. The name of the plugin has to be unique within the Framework but can be shared with the loader plugin:

..
class MayaNativeLoaderImporterPluginWidget(
    plugin.MayaLoaderImporterPluginWidget
):
    plugin_name = 'maya_native_loader_importer'
    widget = MayaNativeLoaderImporterOptionsWidget
..

maya/python/loader/importers/maya_native_loader_importer.py

This is the actual required DCC app plugin that reads the data from disk, as collected by the Framework, and loads it into the current open project.

maya/python/loader/finalizers/maya_merge_abc_loader_finalizer.py

This optional plugin runs after load; here, post-processing of the imported data can be performed as necessary.

maya/python/publisher/

Plugins for exporting data out of the DCC app to disk and creating a version in ftrack with a reviewable and components.

maya/python/publisher/collectors/widget/maya_geometry_publisher_collector_options.py

The Qt plugin that defines the widget associated with the geometry collector; it is usually based on the standard built-in collector that adds selected objects to a list.

Set the auto_fetch_on_init property to True and the fetch function within the collector plugin will be called upon widget instantiation, enabling immediate population of objects based on the selection or other expressions/rules.

One can also define a different function than the default “run” function to be executed when the plugin is run.

maya/python/publisher/collectors/maya_geometry_publisher_collector.py

The plugin that fetches the objects to be published from the loaded DCC app project, in this case Maya geometry. Depending on the type of integration, Pythonic objects can be returned to the next stage, or a path to the object(s) is returned (Houdini, Unreal).
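
A minimal sketch of such a collector, assuming a plugin.MayaPublisherCollectorPlugin base class by analogy with the loader importer classes shown elsewhere in this document (the base class name and the collected_objects option are assumptions, and the register boilerplate is omitted):

import maya.cmds as cmds

from ftrack_connect_pipeline_maya import plugin


class MayaGeometryPublisherCollectorPlugin(plugin.MayaPublisherCollectorPlugin):
    '''Collect Maya geometry to be published.'''

    plugin_name = 'maya_geometry_publisher_collector'

    def fetch(self, context_data=None, data=None, options=None):
        '''Return the current selection, e.g. when auto_fetch_on_init is enabled.'''
        return cmds.ls(selection=True, long=True)

    def run(self, context_data=None, data=None, options=None):
        '''Return the objects chosen in the widget, or fall back to the selection.'''
        collected_objects = (options or {}).get('collected_objects')
        return collected_objects or cmds.ls(selection=True, long=True)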

maya/python/publisher/validators/maya_geometry_publisher_validator.py

(Optional) Validator plugins that can be used to make sure the collected (selected) objects are eligible for publish.
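
A sketch of what such a validator could look like, assuming a plugin.MayaPublisherValidatorPlugin base class (again by analogy with the other plugin bases in this document) and assuming validators report success with a boolean result:

import maya.cmds as cmds

from ftrack_connect_pipeline_maya import plugin


class MayaGeometryPublisherValidatorPlugin(plugin.MayaPublisherValidatorPlugin):
    '''Check that the collected objects contain mesh geometry.'''

    plugin_name = 'maya_geometry_publisher_validator'

    def run(self, context_data=None, data=None, options=None):
        collected_objects = []
        for collector in data or []:
            collected_objects.extend(collector['result'])
        if not collected_objects:
            return False
        # Require at least one mesh shape beneath the collected nodes.
        meshes = cmds.listRelatives(
            collected_objects, allDescendents=True, type='mesh'
        )
        return bool(meshes)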

maya/python/publisher/exporters/maya_abc_publisher_exporter.py

The plugin responsible for exporting the collected (selected) objects to disk, to a temporary path. Upon finalisation, the file is then moved to its correct path, dictated by the API structure plugin associated with the location (if it is a managed location).
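
A sketch of an Alembic exporter along those lines, assuming a plugin.MayaPublisherExporterPlugin base class (an assumption by analogy) and that the exporter returns the list of files it wrote:

import tempfile

import maya.cmds as cmds

from ftrack_connect_pipeline_maya import plugin


class MayaAbcPublisherExporterPlugin(plugin.MayaPublisherExporterPlugin):
    '''Export the collected geometry to a temporary Alembic file.'''

    plugin_name = 'maya_abc_publisher_exporter'

    def run(self, context_data=None, data=None, options=None):
        collected_objects = []
        for collector in data or []:
            collected_objects.extend(collector['result'])

        # Write to a temporary path; the Framework moves the file to its final
        # location, as resolved by the structure plugin, upon finalisation.
        temp_path = tempfile.NamedTemporaryFile(suffix='.abc', delete=False).name

        cmds.loadPlugin('AbcExport', quiet=True)
        roots = ' '.join('-root {}'.format(obj) for obj in collected_objects)
        cmds.AbcExport(j='-frameRange 1 1 {} -file {}'.format(roots, temp_path))

        return [temp_path]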

maya/python/publisher/finalizers/publish_result_maya.py

(Optional) Plugin that can be used to prepare the data for publish after the export stage is done. A post-process plugin can also be implemented that runs after the version has been published, allowing, for example, a trigger that sends out extra notifications or uploads to additional storage.

Schema validation

The host performs validation of the definitions at boot, and when a definition is supplied to be run with an engine.

The validation is important to make sure the syntax and plugin references are correct within the definition.

Search the DCC log for validation errors; for example, the Maya log is located here:

  • Windows; %LOCALAPPDATA%\ftrack\ftrack-connect\log\ftrack_connect_pipeline_maya.log

  • Mac OSX; ~/Library/Application Support/ftrack-connect/log/ftrack_connect_pipeline_maya.log

  • Linux; ~/.local/share/ftrack-connect/log/ftrack_connect_pipeline_maya.log

UI layer

The UI abstraction layer takes care of rendering widgets inside the DCC application, with the ftrack default style applied. The UI sits on top of the pipeline Framework core layer in the stack and is backed by the Qt framework plugin.

Each Client is represented in the UI layer, which in turn is inherited by the DCC layer (Maya and so on).

Structure:

client/
plugin/
ui/
    assembler/
    asset_manager/
    factory/
    log_viewer/
    utility/

Description of main submodules:

  • client; Contains client UI implementations.

  • plugin; Contains the bases for definition plugin widgets.

  • assembler; The assembler/loader widget.

  • asset_manager; The asset manager widget.

  • factory; Contains widgets and logic for generating the publisher and parts of the loader, factorised from the definition.

  • log_viewer; The log viewer widget.

  • utility; Utility widgets such as the entity browser, context selectors and so on.

resource

The resource folder contains associated fonts, images and stylesheets.

DCC integration layer

The plugin for a specific DCC application (Maya, Nuke and so on), identified by the host type. It depends on the core Framework plugins above for bootstrapping and for providing the three main Framework features:

  • Publish files/components.

  • Load files/components.

  • Asset management.

The integration achieves this by:

  • Bootstrapping the DCC app.

  • Launching the pipeline host.

  • Adding menu items to the “ftrack” menu within the DCC app, enabling launch of each client widget (see the sketch below).
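
As an illustration of the last point, appending an extra item to the ftrack menu in Maya could look roughly like the sketch below; the menu name “ftrack” and the callback are assumptions, and the real integrations build their menu in the bootstrap code:

import maya.cmds as cmds


def launch_my_tool(*args):
    '''Placeholder callback, e.g. opening one of the Framework client widgets.'''
    print('Launching my custom tool...')


# Append a custom entry to the (assumed) ftrack menu created at bootstrap.
cmds.menuItem(parent='ftrack', label='My Custom Tool', command=launch_my_tool)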

The DCC module sits on top of the UI layer in the pipeline Framework stack.

Structure:

asset/
client/
host/
plugin/
utils/

Description of main submodules:

  • asset; Contains asset manager logic for handling DCC objects.

  • client; DCC implementation of each client.

  • host; DCC implementation of the host.

  • plugin; Contains DCC implementations of the bases for definition plugin widgets.

  • utils; Contains additional utils and tools related to the DCC application.

resource

The resource folder contains the bootstrap scripts. The launch hook is assumed to set up the DCC, through environment variables or arguments, so it is able to find and load the bootstrap script(s).

How it works

Here we will outline how the new Framework works, within the DCC application.

Important

We are not going to touch upon Connect and the launcher; please see the separate user documentation.

We will go through step by step what happens within the DCC application during launch, load, asset management and publish.

DCC launch

The following flowchart describes the DCC bootstrap process and how it interacts with the Framework:

[Diagram: _images/launch.svg]

Standalone

In standalone mode, outside the DCC application, the launch is identical. Instead of targeting the DCC host type when loading definitions, the “python” host type is used instead.

Asset loading

The following flowchart describes the asset load process and how it interacts with the Framework:

[Diagram: _images/load.svg]

Asset management

Assets inside the DCC are tracked using a specialised ftrack tracking object that is usually hidden from the user or otherwise hard to alter. This object is also referred to as the ftrack node.

The following flowchart describes the asset management process and how it interacts with the Framework:

[Diagram: _images/asset_management.svg]

Asset publish

The following flowchart describes the asset publish process and how it interacts with the Framework:

[Diagram: _images/publish.svg]

History

With the first version of Connect, the need to load and publish assets within DCC applications was addressed, but it was not easy to further customise within an existing studio environment.

We realised our customers were not using the Connect integrations as-is, but rather treating them as a base for inspiration when writing their own integration plugins.

Initial development commenced at ftrack in 2018, with the first public release by the summer of 2022.

Installing

Install the DCC Framework plugins through the plugin manager within Connect 2.

These plugins are prebuilt on Python 3.7 and certified to be used with public DCC application releases for Maya, Nuke and so on.

If the embedded Python version of your DCC application is incompatible with the prebuilt plugins, you will be required to rebuild them and distribute them separately as part of your studio configuration.

Developing

Learn how to develop with the ftrack Connect pipeline Framework: download, build and deploy the integrations in order to tailor them to your studio's needs.

Refer to the API Reference for detailed information on the pipeline core API.

Find examples on how to develop your own customisations within the Tutorial.

Warning

The ftrack Framework is subject to change with the next major version release, when config-driven pipeline environments will be introduced. Appropriate guidelines on how to migrate existing customisations will be provided by ftrack.

Coding environment setup

IDE

Internally at ftrack we use PyCharm as our main development tool. Visual Studio Code would be our second editor of choice, enabling additional free remote debugging against DCCs such as Maya.

Source Code Management

It is possible to edit the code and configurations directly without an SCM, but that will make it very complicated to download and merge in new Framework releases as they are announced.

The recommended way of doing this is to create your own repositories and then sync in changes from ftrack by adding a remote pointing to our GitHub repositories. This process is described in detail within the Tutorial.

Download

The Framework is 100% open source and accessible on GitHub.

Git clone base repositories:

git clone https://github.com/ftrackhq/ftrack-connect-pipeline.git
git clone https://github.com/ftrackhq/ftrack-connect-pipeline-definition.git
git clone https://github.com/ftrackhq/ftrack-connect-pipeline-qt.git

Clone a DCC repository:

git clone https://github.com/ftrackhq/ftrack-connect-pipeline-maya.git

Customise

Here follow some general customisation guidelines for each Framework module.

ftrack-connect-pipeline

Generally, you will never need to touch the core module in order to customise your pipeline; the most common addition would be a custom engine providing new functionality to the Framework. Another case would be providing shared integration utility code that can be used across all DCC applications and definition plugins.

ftrack-connect-pipeline-definition

This module repository is designed to be the place where most customisations happen, within its resource directory.

ftrack-connect-pipeline-qt

This module is the repository where Qt widgets, images and fonts reside, together with the base style of the integrations. Modify this repository in order to augment the look and feel of the integrations, or to supply new widgets to be used across DCC integrations.

ftrack-connect-pipeline-host_type

The module repository where you would make changes to each individual DCC application: the ftrack menu, clients, and the base plugins and widgets that are referenced from within the definitions.

Tutorial

We provide a comprehensive Tutorial on how to customise the Framework in practice, with fully working source code attached.

Build

Create a virtual environment

  1. Download and install/build the latest Python 3.7; see below for reasoning on which Python version to use.

  2. Install virtualenv.

  3. Create a virtual env.

  4. Change folder to ftrack-connect-pipeline

Run:

pip install .

This will set up your virtual environment with the dependencies needed.

Choosing Python base version

The main consideration here is the target set of DCC applications the Framework is supposed to work with; from that set you need to determine the lowest common denominator of built-in Python interpreter versions. As of 2022 this is Python 3.7, but this will change as DCCs move forward according to the VFX Reference Platform.
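
To check which Python version a given DCC embeds, a quick probe in its script editor is usually enough (standard library only, nothing Framework specific):

import platform
import sys

# Report the interpreter version and executable of the running DCC session.
print(platform.python_version(), sys.executable)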

Building the Framework

The process of building a Framework plugin is the same:

  1. Activate the virtual env created above.

  2. Change folder to ftrack-connect-pipeline

Run:

python setup.py build_plugin

This will produce output in the /build subfolder. Repeat this step for each Framework plugin (ftrack-connect-pipeline-definition, ftrack-connect-pipeline-qt and so on).

Building the documentation

Install the doc build dependencies into your virtual env; you will find these in setup.py beneath the setup_requires section.

After that, you should be ready to build the documentation:

python setup.py build_sphinx


Deploy

Deploying the Framework locally

After you have built, copy the plugin directory output in /build for each Framework plugin to the ftrack Connect plugin path (or to a location pointed to by the FTRACK_CONNECT_PLUGIN_PATH environment variable):

  • Windows; %LOCALAPPDATA%\ftrack\ftrack-connect-plugins

  • Mac OSX; ~/Library/Application Support/ftrack-connect-plugins

  • Linux; ~/.local/share/ftrack-connect-plugins

Finalise by restarting Connect and DCC(s) to have the newly built integrations discovered.

Important

There is no need to restart Connect on a rebuild if the version number is the same; in that case only a relaunch of the DCC is required.

Building and deploying Connect centrally

To minimise IT administrative tasks, one could build Connect and launch it from a central location within a new or existing Python environment.

The process is documented in the ftrack Connect documentation; a short summary follows.

Run:

pip install .

or:

python setup.py install

A Connect executable is then compiled, which can be set to run at login time on workstations and be wrapped with a proper launcher having an icon.

Deploying Framework centrally

To have Connect pickup your custom built Framework plugins, build and deploy them to a central network location, for example:

\\filesrv\nas\pipeline\connect

Then on workstations set the environment variable to point at this location:

set|export FTRACK_CONNECT_PLUGIN_PATH=\\filesrv\nas\pipeline\connect

Finally, install and launch Connect; remember to remove any locally installed Framework plugins to prevent conflicts.

Tutorial

Discover how to extend and modify the ftrack Connect pipeline Framework for a tighter integration and more customisation options.

Introduction

In this section, we are getting our hands dirty and showing an example of how the new Framework can be customised to studio-specific needs.

The aim of this exercise is to inspire the reader as to what can be achieved: building a fragment of a VFX studio pipeline that helps artists work the right way, with correct file naming conventions, minimising tedious and error-prone tasks.

Warning

The ftrack Framework is subject to change with the next major version release, when config-driven pipeline environments will be introduced. Appropriate guidelines on how to migrate existing customisations will be provided by ftrack.

Checkout documentation code

The source code present in this tutorial, as fully working examples, can be found in the resource directory.

The tools we are about to build

In this tutorial, we will first show how to apply a custom file structure to all API location-based file operations, across Connect and the DCC integrations.

Next we target Maya and extend the Framework integration with a set of tools:

  • DCC bootstrap extension - have the latest snapshot opened, or a new one generated from template and saved.

  • Custom loader - load a previewable (Quicktime) onto a camera image plane in Maya.

  • Post a message on Slack upon publish, with an asset version identifier and thumbnail attached.

  • Add a custom tool to the ftrack menu.

We also walk you through the process of customising the launch of DCC applications - how to constrain application launch to a certain department (e.g. task type) and how to set environment variables plus add additional arguments.

At the end we go more in depth on how to build, deploy and maintain the customised Framework pipeline within the studio.

Important

In this tutorial, configurations are made within the source code. At present there is no GUI approach to configuring the Framework, and no way to provide separate builds and configurations per project or context in general. This is subject to improvement with the next major release of the Framework and Connect.

Setup

Prerequisites

For this exercise we require the following:

  • A workstation, in our example a PC running Windows 11.

  • A text editor for syntax-highlighted editing of Python code, for example PyCharm or Visual Studio Code.

  • Git command line tools installed, including Git bash.

  • A licensed version of Maya 2022 or later.

  • A valid ftrack trial or subscription.

  • ftrack Connect 2 Package installed, with the new framework.

  • A central storage scenario setup to point to your studio network file share.

Git repositories

The best approach is to create your own set of repositories and then pull from the ftrack Framework repositories as a remote upstream.

Important

You are not forced to create repositories; the simplest approach is to just pull the code and start working on it. Bear in mind that it will be difficult to work on the code internally as a team without proper SCM in place.

We will extend two Framework plugins:

  • ftrack-connect-pipeline-definition

  • ftrack-connect-pipeline-maya

The rest of the plugins we will use are shipped with Connect and installable through the plugin manager.

As a first step, create the repositories within your SCM environment (GitHub, Bitbucket, locally..). We recommend creating them with the same names to minimise confusion.

Next create folders, clone the remote repositories with Git bash and merge from ftrack:

$ mkdir mypipeline
$ cd mypipeline
$ git clone <my SCM service base url>/ftrack-connect-pipeline-definition
$ git remote add upstream https://github.com/ftrackhq/ftrack-connect-pipeline-definition.git
$ git fetch upstream
$ git merge upstream/main
$ git rebase upstream/main

Repeat the steps above for the ftrack-connect-pipeline-maya repository. Throughout this tutorial, the folder mypipeline will refer to this location where you check out and store your local source code repositories.

At any new release done by ftrack, you can simply pull these and then merge into your repository:

$ git pull upstream main

Branching

We are not going into full detail on how to manage your source code; a good general practice is to always develop on stories, e.g. backlog/bigfeature/story, with sub-branches. For more guidelines on Git: https://nvie.com/posts/a-successful-git-branching-model

Testing

During the test phase you will want to test your tools locally before deploying centrally. As a first step, create a virtual environment, then follow the instructions on how to build and deploy locally:

$ <activate virtual environment>
$ cd mypipeline\ftrack-connect-pipeline-definition
$ python setup.py build_plugin
$ rmdir /s "%HOMEPATH%\AppData\Local\ftrack\ftrack-connect-plugins\ftrack-connect-pipeline-definition-<version>"
$ move build\ftrack-connect-pipeline-definition-<version> "%HOMEPATH%\AppData\Local\ftrack\ftrack-connect-plugins"

The same process applies to the Maya DCC plugin and all other Connect framework plugins.

Pipeline deploy

Towards the end of this chapter, we will build the integration plugins and put them centrally on a server for everyone to use. We assume there is a space for the pipeline to reside on a network share:

\\server\share\PIPELINE

This will be the physical location of our custom pipeline, and will be named “PIPELINE” hereafter.

Custom file structure

The tutorial relies on defining a custom folder structure across the studio.

With ftrack, and a storage scenario, comes the ftrack_api.structure.Standard structure plugin which publishes files with the standard ftrack structure:

project/sequence/shot/model/v001/alembic.abc

With this tutorial, we are going to provide our custom studio file structure that puts publishes in a “PUBLISH” folder:

project/sequence/shot/PUBLISH/anim/alembic.abc

We achieve this by defining our own structure plugin, which we apply to the storage scenario location. This API/Connect plugin needs to reside server side:

mypipeline/custom-location-plugin:

hook/plugin_hook.py                 #  Enable structure within Connect
location/custom_location_plugin.py  #  Initialise the location - apply structure
location/structure.py               #  Provides the custom file structure

Structure

Within the structure plugin we define an asset resolver:

mypipeline/custom-location-plugin/location/structure.py

  1# :coding: utf-8
  2# :copyright: Copyright (c) 2022 ftrack
  3
  4import sys
  5import os
  6import re
  7import unicodedata
  8import logging
  9import traceback
 10
 11from collections import OrderedDict
 12
 13import ftrack_api.symbol
 14import ftrack_api.structure.base
 15
 16STUDIO_PUBLISH_FOLDER = "PUBLISH"
 17
 18
 19class Structure(ftrack_api.structure.base.Structure):
 20    '''
 21    Custom structure publishing to "_PUBLISH" folder beneath shot.
 22    '''
 23
 24    def __init__(
 25        self, project_versions_prefix=None, illegal_character_substitute='_'
 26    ):
 27        super(Structure, self).__init__()
 28        self.logger = logging.getLogger(
 29            'com.ftrack.framework-guide.custom-location-plugin.location.Structure'
 30        )
 31        self.project_versions_prefix = project_versions_prefix
 32        self.illegal_character_substitute = illegal_character_substitute
 33        self.resolvers = OrderedDict(
 34            {
 35                'FileComponent': self._resolve_filecomponent,
 36                'SequenceComponent': self._resolve_sequencecomponent,
 37                'ContainerComponent': self._resolve_containercomponent,
 38                'AssetVersion': self._resolve_version,
 39                'Asset': self._resolve_asset,
 40                'Task': self._resolve_task,
 41                'Project': self._resolve_project,
 42                'ContextEntity': self._resolve_context_entity,
 43            }
 44        )
 45
 46    def sanitise_for_filesystem(self, value):
 47        '''Return *value* with illegal filesystem characters replaced.
 48
 49        An illegal character is one that is not typically valid for filesystem
 50        usage, such as non ascii characters, or can be awkward to use in a
 51        filesystem, such as spaces. Replace these characters with
 52        the character specified by *illegal_character_substitute* on
 53        initialisation. If no character was specified as substitute then return
 54        *value* unmodified.
 55
 56        '''
 57        if self.illegal_character_substitute is None:
 58            return value
 59
 60        value = unicodedata.normalize('NFKD', str(value)).encode(
 61            'ascii', 'ignore'
 62        )
 63        value = re.sub(
 64            '[^\w\.-]',
 65            self.illegal_character_substitute,
 66            value.decode('utf-8'),
 67        )
 68        return str(value.strip().lower())
 69
 70    def _resolve_project(self, project, context=None):
 71        '''Return resource identifier for *project*.'''
 72        # Base on project name
 73        return [self.sanitise_for_filesystem(project['name'])]
 74
 75    def _resolve_context_entity(self, entity, context=None):
 76        '''Return resource identifier parts from general *entity*.'''
 77
 78        error_message = (
 79            'Entity {0!r} must be supported (have a link), be committed and have'
 80            ' a parent context.'.format(entity)
 81        )
 82
 83        if entity is ftrack_api.symbol.NOT_SET:
 84            raise ftrack_api.exception.StructureError(error_message)
 85
 86        session = entity.session
 87
 88        if not 'link' in entity:
 89            raise NotImplementedError(
 90                'Cannot generate resource identifier for unsupported '
 91                'entity {0!r}'.format(entity)
 92            )
 93
 94        link = entity['link']
 95
 96        if not link:
 97            raise ftrack_api.exception.StructureError(error_message)
 98
 99        structure_names = [item['name'] for item in link[1:]]
100
101        if 'project' in entity:
102            project = entity['project']
103        elif 'project_id' in entity:
104            project = session.get('Project', entity['project_id'])
105        else:
106            project = session.get('Project', link[0]['id'])
107
108        parts = self._resolve_project(project)
109
110        if structure_names:
111            for part in structure_names:
112                parts.append(self.sanitise_for_filesystem(part))
113        elif self.project_versions_prefix:
114            # Add *project_versions_prefix* if configured and the version is
115            # published directly under the project.
116            parts.append(
117                self.sanitise_for_filesystem(self.project_versions_prefix)
118            )
119
120        return parts
121
122    def _resolve_task(self, task, context=None):
123        '''Build resource identifier for *task*.'''
124        # Resolve parent context
125        parts = self._resolve_context_entity(task['parent'], context=context)
126        # TODO: Customise were task work files go
127        # Base on task name, and use underscore instead of whitespaces
128        parts.append(
129            self.sanitise_for_filesystem(task['name'].replace(' ', '_'))
130        )
131        return parts
132
133    def _resolve_asset(self, asset, context=None):
134        '''Build resource identifier for *asset*.'''
135        # Resolve parent context
136        parts = self._resolve_context_entity(asset['parent'], context=context)
137        # Framework guide customisation - publish to shot/asset "publish" subfolder
138        parts.append(STUDIO_PUBLISH_FOLDER)
139        # Base on its name
140        parts.append(self.sanitise_for_filesystem(asset['name']))
141        return parts
142
143    def _format_version(self, number):
144        '''Return a formatted string representing version *number*.'''
145        return 'v{0:03d}'.format(number)
146
147    def _resolve_version(self, version, component=None, context=None):
148        '''Get resource identifier for *version*.'''
149
150        error_message = 'Version {0!r} must be committed and have a asset with parent context.'.format(
151            version
152        )
153
154        if version is ftrack_api.symbol.NOT_SET and component:
155            version = component.session.get(
156                'AssetVersion', component['version_id']
157            )
158
159        if version is ftrack_api.symbol.NOT_SET or (
160            not component is None and version in component.session.created
161        ):
162            raise ftrack_api.exception.StructureError(error_message)
163
164        # Create version resource identifier from asset and version number
165        version_number = self._format_version(version['version'])
166        parts = self._resolve_asset(version['asset'], context=context)
167        parts.append(self.sanitise_for_filesystem(version_number))
168
169        return parts
170
171    def _resolve_sequencecomponent(self, sequencecomponent, context=None):
172        '''Get resource identifier for *sequencecomponent*.'''
173        # Create sequence expression for the sequence component and add it
174        # to the parts.
175        parts = self._resolve_version(
176            sequencecomponent['version'],
177            component=sequencecomponent,
178            context=context,
179        )
180        sequence_expression = self._get_sequence_expression(sequencecomponent)
181        parts.append(
182            '{0}.{1}{2}'.format(
183                self.sanitise_for_filesystem(sequencecomponent['name']),
184                sequence_expression,
185                self.sanitise_for_filesystem(sequencecomponent['file_type']),
186            )
187        )
188        return parts
189
190    def _resolve_container(self, component, container, context=None):
191        '''Get resource identifier for *container*, based on the *component*
192        supplied.'''
193        container_path = self.get_resource_identifier(
194            container, context=context
195        )
196        if container.entity_type in ('SequenceComponent',):
197            # Strip the sequence component expression from the parent
198            # container and back the correct filename, i.e.
199            # /sequence/component/sequence_component_name.0012.exr.
200            name = '{0}.{1}{2}'.format(
201                container['name'], component['name'], component['file_type']
202            )
203            parts = [
204                os.path.dirname(container_path),
205                self.sanitise_for_filesystem(name),
206            ]
207
208        else:
209            # Container is not a sequence component so add it as a
210            # normal component inside the container.
211            name = component['name'] + component['file_type']
212            parts = [container_path, self.sanitise_for_filesystem(name)]
213        return parts
214
215    def _resolve_filecomponent(self, filecomponent, context=None):
216        '''Get resource identifier for file component.'''
217        container = filecomponent['container']
218        if container:
219            parts = self._resolve_container(
220                filecomponent, container, context=context
221            )
222        else:
223            # File component does not have a container, construct name from
224            # component name and file type.
225            parts = self._resolve_version(
226                filecomponent['version'],
227                component=filecomponent,
228                context=context,
229            )
230            name = filecomponent['name'] + filecomponent['file_type']
231            parts.append(self.sanitise_for_filesystem(name))
232        return parts
233
234    def _resolve_containercomponent(self, containercomponent, context=None):
235        '''Get resource identifier for *containercomponent*.'''
236        # Get resource identifier for container component
237        # Add the name of the container to the resource identifier parts.
238        parts = self._resolve_version(
239            containercomponent['version'],
240            component=containercomponent,
241            context=context,
242        )
243        parts.append(self.sanitise_for_filesystem(containercomponent['name']))
244        return parts
245
246    def get_resource_identifier(self, entity, context=None):
247        '''Return a resource identifier for supplied *entity*.
248
249        *context* can be a mapping that supplies additional information, but
250        is unused in this implementation.
251
252
253        Raise a :py:exc:`ftrack_api.exeption.StructureError` if *entity* is a
254        component not attached to a committed version/asset with a parent
255        context, or if entity is not a proper Context.
256
257        '''
258
259        resolver_fn = self.resolvers.get(
260            entity.entity_type, self._resolve_context_entity
261        )
262
263        parts = resolver_fn(entity, context=context)
264
265        return self.path_separator.join(parts)
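
As a usage sketch, assuming a valid API session, a published file component and that structure.py is importable (as it is from within the location plugin), the structure can be exercised on its own to preview where a file would be placed:

import ftrack_api

import structure

session = ftrack_api.Session()

# Pick any published file component and resolve its relative path
# according to the custom structure defined above.
component = session.query('FileComponent').first()
print(structure.Structure().get_resource_identifier(component))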

Location

The structure is then registered and used with the default location. If it is an unmanaged/server location, a default disk location is used so that publishes are not lost in the system temp directory:

mypipeline/custom-location-plugin/location/custom_location_plugin.py

 1# :coding: utf-8
 2# :copyright: Copyright (c) 2022 ftrack
 3
 4import os
 5import functools
 6import logging
 7
 8import ftrack_api
 9import ftrack_api.accessor.disk
10
11logger = logging.getLogger(
12    'com.ftrack.ftrack-connect-pipeline.tutorial.custom-location-plugin.location.custom_location_plugin'
13)
14
15
16def configure_location(session, event):
17    '''Apply our custom structure to default storage scenario location.'''
18    import structure
19
20    DEFAULT_USER_DISK_PREFIX = os.path.join(
21        os.path.expanduser('~'), 'Documents', 'ftrack_tutorial'
22    )
23
24    location = session.pick_location()
25    if location['name'] in ['ftrack.unmanaged', 'ftrack.server']:
26        location.accessor = ftrack_api.accessor.disk.DiskAccessor(
27            prefix=DEFAULT_USER_DISK_PREFIX
28        )
29    location.structure = structure.Structure()
30    # location.priority = 1
31
32    logger.info(
33        u'Registered custom file structure at location "{0}", path: {1}.'.format(
34            location['name'], DEFAULT_USER_DISK_PREFIX
35        )
36    )
37
38
39def register(api_object, **kw):
40    '''Register location with *session*.'''
41
42    if not isinstance(api_object, ftrack_api.Session):
43        return
44
45    api_object.event_hub.subscribe(
46        'topic=ftrack.api.session.configure-location',
47        functools.partial(configure_location, api_object),
48    )
49
50    api_object.event_hub.subscribe(
51        'topic=ftrack.api.connect.configure-location',
52        functools.partial(configure_location, api_object),
53    )
54
55    logger.info(u'Registered tutorial location plugin.')

Source code

The complete source code for the API location structure plugin can be found here:

resource/custom-location-plugin

Maya open latest scene

As our first task, we implement an automatic scene open in Maya upon launch. It will check if there is a previous snapshot published on the task; if not, it tries to locate a template scene, based on the task, to load and start from.

Prerequisites

  1. A shot created in ftrack, with proper timeline and fps set.

  2. The previous custom location plugin deployed, and configured storage scenario set up (preferable).

  3. A Maya template scene to use when no previously published snapshot file exists, present in the project folder at _TEMPLATES/maya/<task type>_template.mb

Implementation

All DCC tools go into the file custom_commands.py:

mypipeline/ftrack-connect-pipeline-maya/source/ftrack_connect_pipeline_maya/utils/custom_commands.py

 1# :coding: utf-8
 2# :copyright: Copyright (c) 2022 ftrack
 3import logging
 4import os
 5import threading
 6from functools import wraps
 7import shutil
 8
 9from Qt import QtWidgets, QtCompat
10
11import ftrack_api
12
13import maya.cmds as cmds
14def scene_open(session, logger):
15    '''Load latest scene, or generate new from template.'''
16    context_id = os.getenv('FTRACK_CONTEXTID')
17    task = session.query('Task where id={}'.format(context_id)).one()
18    path_snapshot_open = path_snapshot_load = None
19    path_snapshot, message = get_save_path(
20        context_id, session, extension='.mb', temp=True
21    )
22    location = session.pick_location()
23    for version in session.query(
24        'AssetVersion where '
25        'task.id={} order by date descending'.format(
26            task['id'],
27        )
28    ).all():
29        # Find a snapshot
30        component = session.query(
31            'Component where '
32            'name="snapshot" and version.id={}'.format(version['id'])
33        ).first()
34        if component:
35            try:
36                path_snapshot_open = location.get_filesystem_path(component)
37            except ftrack_api.exception.ComponentNotInLocationError as e:
38                cmds.confirmDialog(message=str(e))
39
40    if path_snapshot_open is None:
41        # Copy Maya template scene
42        path_template = os.path.join(
43            location.accessor.prefix,
44            task['project']['name'],
45            '_TEMPLATES',
46            'maya',
47            '{}_template.mb'.format(task['type']['name'].lower()),
48        )
49        if os.path.exists(path_template):
50            logger.info(
51                'Copying Maya template {} to {}'.format(
52                    path_template, path_snapshot
53                )
54            )
55            shutil.copy(path_template, path_snapshot)
56            path_snapshot_load = path_snapshot
57        else:
58            logger.warning(
59                'Maya template not found @ {}!'.format(path_template)
60            )
61    else:
62        # Make a copy in temp
63        logger.info(
64            'Copying most recent snapshot {} to {}'.format(
65                path_snapshot_open, path_snapshot
66            )
67        )
68        shutil.copy(path_snapshot_open, path_snapshot)
69        path_snapshot_load = path_snapshot
70
71    if path_snapshot_load:
72        # Load the scene
73        logger.info('Loading scene: {}'.format(path_snapshot_load))

We are not going into detail about what the scene_open function does, but it tries to locate a previously published snapshot; if none is found, a new one is copied from a template, saved to the temp folder and opened.

Finally, to have this run during Maya startup, we add it to userSetup.py:

mypipeline/ftrack-connect-pipeline-maya/resources/scripts/userSetup.py

1def initialise():
2    ..
3
4    maya_utils.init_maya()
5
6    maya_utils.scene_open(session, logger)

Load camera image plane in Maya

Next, we implement a custom camera loader within Maya that loads a reviewable Quicktime (.mov) as an image plane, to aid animation and lighting framing.

Constrain camera loader

As a preparation, we constrain the camera loader to only be visible on animation and lighting tasks, hiding it during modelling. We do this by modifying the loader definition JSON and adding the discoverable key:

mypipeline/ftrack-connect-pipeline-definition/resource/definitions/loader/maya/camera-maya-loader.json

1{
2    "type": "loader",
3    "name": "Camera Loader",
4    "asset_type": "cam",
5    "host_type": "maya",
6    "ui_type": "qt",
7    "discoverable": ["animation","lighting"]
8
9}

Here we have added the additional discoverable key with the associated task type names.

Render loader

This serves as an example of how to implement your own loader that is not part of the Framework but required in production.

Definition

Reviewable Quicktimes are most likely published with the render asset type, from Nuke Studio or a similar tool. This is why we implement a new render loader definition:

mypipeline/ftrack-connect-pipeline-definition/resource/definitions/loader/maya/render-maya-loader.json

 1{
 2  "type": "loader",
 3  "name": "Render Loader",
 4  "asset_type": "render",
 5  "host_type": "maya",
 6  "ui_type": "qt",
 7  "contexts": [
 8    {
 9      "name": "main",
10      "stages": [
11        {
12          "name": "context",
13          "plugins":[
14            {
15              "name": "context selector",
16              "plugin": "common_passthrough_loader_context",
17              "widget": "common_default_loader_context"
18            }
19          ]
20        }
21      ]
22    }
23  ],
24  "components": [
25    {
26      "name": "movie",
27      "file_formats": [".mov", ".r3d", ".mxf", ".avi"],
28      "stages": [
29        {
30          "name": "collector",
31          "plugins":[
32            {
33              "name": "Collect components from context",
34              "plugin": "common_context_loader_collector"
35            }
36          ]
37        },
38        {
39          "name": "importer",
40          "plugins":[
41            {
42              "name": "Import reviewable to Maya",
43              "plugin": "maya_render_loader_importer",
44              "options": {
45                "camera_name": "persp"
46              }
47            }
48          ]
49        },
50        {
51          "name": "post_importer",
52          "plugins":[
53            {
54              "name": "maya",
55              "plugin": "common_passthrough_loader_post_importer"
56            }
57          ]
58        }
59      ]
60    }
61  ],
62  "finalizers": [
63    {
64      "name": "main",
65      "stages": [
66        {
67          "name": "pre_finalizer",
68          "visible": false,
69          "plugins":[
70            {
71              "name": "Pre finalizer",
72              "plugin": "common_passthrough_loader_pre_finalizer"
73            }
74          ]
75        },
76        {
77          "name": "finalizer",
78          "visible": false,
79          "plugins":[
80            {
81              "name": "Finalizer",
82              "plugin": "common_passthrough_loader_finalizer"
83            }
84          ]
85        },
86        {
87          "name": "post_finalizer",
88          "visible": false,
89          "plugins":[
90            {
91              "name": "Post finalizer",
92              "plugin": "common_passthrough_loader_post_finalizer"
93            }
94          ]
95        }
96      ]
97    }
98  ]
99}

Definition breakdown:

  • name; We follow the Framework naming convention here.

  • asset_type; Change here if Quicktimes are published onto a different custom asset type than render.

  • component name; The name of loadable components on an asset version.

  • component file formats/types; List of file format extensions supported by the loader plugin.

  • importer plugin; Here we reference the new maya_render_loader_importer that we are about to write.

  • importer plugin options; In the options we expose a camera_name attribute, which will be an option that the user can change.

Render importer plugin

Finally we implement a new importer plugin:

mypipeline/ftrack-connect-pipeline-definition/resource/plugins/maya/python/loader/importers/maya_render_loader_importer.py

 1# :coding: utf-8
 2# :copyright: Copyright (c) 2014-2022 ftrack
 3
 4import maya.cmds as cmds
 5
 6from ftrack_connect_pipeline_maya import plugin
 7import ftrack_api
 8
 9
10class MayaRenderLoaderImporterPlugin(plugin.MayaLoaderImporterPlugin):
11    '''Maya Quicktime importer plugin'''
12
13    plugin_name = 'maya_render_loader_importer'
14
15    def run(self, context_data=None, data=None, options=None):
16        '''Import reviewable movie files pointed out by the collected paths supplied in *data*, as camera image planes'''
17
18        results = {}
19
20        camera_name = options.get('camera_name', 'persp')
21        paths_to_import = []
22        for collector in data:
23            paths_to_import.extend(collector['result'])
24
25        for component_path in paths_to_import:
26            self.logger.debug(
27                'Importing path "{}" as image plane to camera "{}"'.format(
28                    component_path, camera_name
29                )
30            )
31            imagePlane = cmds.imagePlane(
32                camera=camera_name, fileName=component_path
33            )
34            cmds.setAttr('{}.type'.format(imagePlane[0]), 2)
35            cmds.setAttr('{}.useFrameExtension'.format(imagePlane[0]), True)
36
37            self.logger.info(
38                'Imported "{}" to {}.'.format(component_path, imagePlane[0])
39            )
40
41            results[component_path] = imagePlane[0]
42
43        return results
44
45
46def register(api_object, **kw):
47    if not isinstance(api_object, ftrack_api.Session):
48        # Exit to avoid registering this plugin again.
49        return
50    plugin = MayaRenderLoaderImporterPlugin(api_object)
51    plugin.register()

Plugin breakdown:

  • plugin_name; The name of the plugin, which has to match the name used within the definition.

  • run function; The function that will be run when loading with the ftrack Assembler.

Custom publisher plugin

Writing a custom publisher is very similar to writing a loader. The big difference is that you will also have to write a publisher collector that collects which objects within the DCC to publish, and decide on a component name and file format extension.

In this tutorial we do not walk through a complete publisher; instead we refer to the extensive set of built-in publishers for inspiration. A minimal collector sketch follows below.
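To give a rough idea of the collector part, here is a minimal, hypothetical sketch of a Maya publisher collector plugin. It assumes the plugin.MayaPublisherCollectorPlugin base class used by the built-in Maya collectors, and a plugin name (maya_selection_publisher_collector) that you would also have to reference from your publisher definition:

# :coding: utf-8
# Hypothetical collector sketch - adapt names and options to your own definition.

import maya.cmds as cmds

from ftrack_connect_pipeline_maya import plugin
import ftrack_api


class MayaSelectionPublisherCollectorPlugin(plugin.MayaPublisherCollectorPlugin):
    '''Collect the current Maya selection for publish.'''

    plugin_name = 'maya_selection_publisher_collector'

    def run(self, context_data=None, data=None, options=None):
        '''Return the long names of the currently selected Maya objects.'''
        collected_objects = cmds.ls(selection=True, long=True) or []
        return collected_objects


def register(api_object, **kw):
    if not isinstance(api_object, ftrack_api.Session):
        # Exit to avoid registering this plugin again.
        return
    plugin_instance = MayaSelectionPublisherCollectorPlugin(api_object)
    plugin_instance.register()

The built-in collectors expose similar choices through plugin options instead of hard-coding the selection, which is usually preferable.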

Post Slack message on publish

Next, we implement a custom finaliser within Maya: a small plugin that posts a Slack message containing the version identifier, the comment and a thumbnail (which can be replaced with the full Quicktime reviewable if desired) with each publish made.

Plugins that run at the end of a publish (or load) are called “finalizers”; they are fed all the data from the previous steps and stages.

Save copy of thumbnail image

As a preparation, we need the thumbnail publisher exporter to save an extra copy of the thumbnail to be used with the Slack post:

mypipeline/ftrack-connect-pipeline-definition/resource/plugins/maya/python/publisher/exporters/maya_thumbnail_publisher_exporter.py

 1import os
 2import shutil
 3
 4..
 5
 6def run(self, context_data=None, data=None, options=None):
 7    ..
 8    # Make a copy of the thumbnail to be used with Slack post
 9    path_slack_thumbnail = os.path.join(os.path.dirname(path), 'slack-{}'.format(os.path.basename(path)))
10    shutil.copy(path, path_slack_thumbnail)
11
12    return [path]

Finaliser

mypipeline/ftrack-connect-pipeline-definition/resource/plugins/common/python/publisher/finalisers/common_slack_post_publisher_finalizer.py

 1# :coding: utf-8
 2# :copyright: Copyright (c) 2014-2022 ftrack
 3import os
 4
 5from slack import WebClient
 6
 7import ftrack_api
 8
 9from ftrack_connect_pipeline import plugin
10
11
12class CommonSlackPublisherFinalizerPlugin(plugin.PublisherPostFinalizerPlugin):
13
14    plugin_name = 'common_slack_publisher_finalizer'
15
16    SLACK_CHANNEL = 'test'
17
18    def run(self, context_data=None, data=None, options=None):
19
20        # Harvest publish data
21        reviewable_path = asset_version_id = None
22        for component_data in data:
23            if component_data['name'] == 'thumbnail':
24                for output in component_data['result']:
25                    if output['name'] == 'exporter':
26                        reviewable_path = output['result'][0]['result'][0]
27            elif component_data['name'] == 'main':
28                for step in component_data['result']:
29                    if step['name'] == 'finalizer':
30                        asset_version_id = step['result'][0]['result'][
31                            'asset_version_id'
32                        ]
33                        component_names = step['result'][0]['result'][
34                            'component_names'
35                        ]
36
37        # Fetch version
38        version = self.session.query(
39            'AssetVersion where id={}'.format(asset_version_id)
40        ).one()
41
42        # Fetch path to thumbnail
43        if reviewable_path:
44            # Assume it is on the form /tmp/tmp7vlg8kv5.jpg.0000.jpg, locate our copy
45            reviewable_path = os.path.join(
46                os.path.dirname(reviewable_path),
47                'slack-{}'.format(os.path.basename(reviewable_path)),
48            )
49
50        client = WebClient("<slack-api-key>")
51
52        ident = '|'.join(
53            [cl['name'] for cl in version['asset']['parent']['link']]
54            + [version['asset']['name'], 'v%03d' % (version['version'])]
55        )
56
57        if reviewable_path:
58            self.logger.info(
59                'Posting Slack message "{}" to channel {}, attaching reviewable "{}"'.format(
60                    ident, self.SLACK_CHANNEL, reviewable_path
61                )
62            )
63            try:
64                response = client.files_upload(
65                    channels=self.SLACK_CHANNEL,
66                    file=reviewable_path,
67                    title=ident,
68                    initial_comment=version['comment'],
69                )
70            finally:
71                os.remove(reviewable_path)  # Not needed anymore
72        else:
73            # Just post a message
74            self.logger.info(
75                'Posting Slack message "{}" to channel {}, without reviewable'.format(
76                    ident, self.SLACK_CHANNEL
77                )
78            )
79            response = client.chat_postMessage(channel=self.SLACK_CHANNEL, text=ident)
80        if response.get('ok') is False:
81            raise Exception(
82                'Slack file upload failed! Details: {}'.format(response)
83            )
84
85        return {}
86
87
88def register(api_object, **kw):
89    if not isinstance(api_object, ftrack_api.Session):
90        # Exit to avoid registering this plugin again.
91        return
92    plugin = CommonSlackPublisherFinalizerPlugin(api_object)
93    plugin.register()

Plugin breakdown:

  • Through the data argument, the finaliser is passed the result of the entire publish process. From this data we harvest the temporary path to the thumbnail and the asset version id.

  • We transform the path so that it points to the thumbnail copy saved by the exporter earlier.

  • A Slack client API session is created (see the sketch after this list for reading the token from the environment instead of hard-coding it).

  • A human-readable asset version identifier is compiled.

  • If a thumbnail was found, it is uploaded to Slack; otherwise a standard chat message is posted.
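As a small refinement, the Slack token can be read from an environment variable instead of being hard-coded in the plugin; SLACK_API_TOKEN below is an illustrative variable name:

import os

from slack import WebClient

# Hypothetical: read the token from the environment instead of hard-coding it.
client = WebClient(os.environ.get('SLACK_API_TOKEN', ''))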

Add Slack finaliser to publishers

Finally, we augment the publisher definitions that we wish to use.

mypipeline/ftrack-connect-pipeline-definition/resource/definitions/publisher/maya/geometry-maya-publish.json

 1{
 2  "type": "publisher",
 3  "name": "Geometry Publisher",
 4  "contexts": [],
 5  "components": [],
 6  "finalizers": [
 7    {
 8      "name": "main",
 9      "stages": [
10        {
11          "name": "pre_finalizer",
12          "visible": false,
13          "plugins":[
14            {
15              "name": "Pre publish to ftrack server",
16              "plugin": "common_passthrough_publisher_pre_finalizer"
17            }
18          ]
19        },
20        {
21          "name": "finalizer",
22          "visible": false,
23          "plugins":[
24            {
25              "name": "Publish to ftrack server",
26              "plugin": "common_passthrough_publisher_finalizer"
27            }
28          ]
29        },
30        {
31          "name": "post_finalizer",
32          "visible": true,
33          "plugins":[
34            {
35              "name": "Post slack message",
36              "plugin": "common_slack_publisher_finalizer"
37            }
38          ]
39        }
40      ]
41    }
42  ]
43}

Repeat this for all publishers that should have the finaliser.

Add Slack library

To be able to use the Slack Python API, we need to add it to our Framework build. We do that by adding the dependency to setup.py:

ftrack-connect-pipeline-definition/setup.py

 1..
 2
 3# Configuration.
 4setup(
 5    name='ftrack-connect-pipeline-definition',
 6    description='Collection of definitions of package and packages.',
 7    long_description=open(README_PATH).read(),
 8    keywords='ftrack',
 9    url='https://bitbucket.org/ftrack/ftrack-connect-pipeline-definition',
10    author='ftrack',
11    author_email='support@ftrack.com',
12    license='Apache License (2.0)',
13    packages=find_packages(SOURCE_PATH),
14    package_dir={'': 'source'},
15    python_requires='<3.10',
16    use_scm_version={
17        'write_to': 'source/ftrack_connect_pipeline_definition/_version.py',
18        'write_to_template': version_template,
19        'version_scheme': 'post-release',
20    },
21    setup_requires=[
22        'sphinx >= 1.8.5, < 4',
23        'sphinx_rtd_theme >= 0.1.6, < 2',
24        'lowdown >= 0.1.0, < 2',
25        'setuptools>=44.0.0',
26        'setuptools_scm',
27        'slackclient'
28    ],
29    install_requires=[
30        'slackclient'
31    ],
32    tests_require=['pytest >= 2.3.5, < 3'],
33    cmdclass={'test': PyTest, 'build_plugin': BuildPlugin},
34    zip_safe=False,
35)

Important

A better approach is to add the dependency to the ftrack-connect-pipeline module where the other pipeline dependencies are defined and built.
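As a sketch of what that could look like, assuming ftrack-connect-pipeline's setup.py follows the same structure as the snippet above:

# ftrack-connect-pipeline/setup.py (sketch)
from setuptools import setup, find_packages

setup(
    name='ftrack-connect-pipeline',
    description='ftrack Connect pipeline core.',
    packages=find_packages('source'),
    package_dir={'': 'source'},
    install_requires=[
        # ...existing pipeline dependencies kept as they are...
        'slackclient',
    ],
)

This way the Slack library is built into the core pipeline dependencies once, instead of into each customised module.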

Add a custom tool to the ftrack menu within Maya

In this section we demonstrate how to add your own studio tool to the Maya plug-in; in this case a tool that updates the status of the launched task to “In progress”. We add its menu item to the ftrack menu in userSetup.py:

mypipeline/ftrack-connect-pipeline-maya/resource/scripts/userSetup.py

 1..
 2
 3def initialise():
 4
 5    ..
 6
 7    maya_utils.init_maya()
 8
 9    cmds.menuItem(
10        parent=ftrack_menu,
11        label='In Progress',
12        command=(functools.partial(maya_utils.set_task_status, 'in progress', session, logger))
13    )
14
15    maya_utils.scene_open(session, logger)

In DCC custom_commands.py, we add the corresponding set_task_status function:

mypipeline/ftrack-connect-pipeline-maya/source/ftrack_connect_pipeline_maya/utils/custom_commands.py

 1import os
 2
 3
 4def set_task_status(status_name, session, logger, unused_arg=None):
 5    '''Change the status of the launched task to *status_name*'''
 6    task = session.query(
 7        'Task where id={}'.format(os.environ['FTRACK_CONTEXTID'])
 8    ).one()
 9    status = session.query('Status where name="{}"'.format(status_name)).one()
10    logger.info(
11        'Changing status of task {} to {}'.format(task['name'], status_name)
12    )
13    task['status'] = status
14    session.commit()
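For quick testing, the function can also be called directly from the Maya script editor. The snippet below is illustrative and assumes Maya was launched from Connect so that FTRACK_CONTEXTID is set; it creates its own throw-away session and logger rather than re-using the ones from userSetup.py:

import logging

import ftrack_api

from ftrack_connect_pipeline_maya.utils import custom_commands as maya_utils

# Throw-away session and logger just for this call.
session = ftrack_api.Session(auto_connect_event_hub=False)
logger = logging.getLogger(__name__)

maya_utils.set_task_status('in progress', session, logger)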

Customising DCC launch

Here we demonstrate how to disable the Maya launcher for the compositing task type. We do this by modifying the Maya launcher hook:

mypipeline/ftrack-connect-pipeline-maya/hook/discover_maya.py

 1..
 2
 3def on_discover_maya_pipeline(session, event):
 4    from ftrack_connect_pipeline_maya import __version__ as integration_version
 5
 6    data = {
 7        'integration': {
 8            "name": 'ftrack-connect-pipeline-maya',
 9            'version': integration_version
10        }
11    }
12
13    # Disable Maya launch on certain task types
14    selection = event['data']['context']['selection']
15    task = session.get('Context', selection[0]['entityId'])
16
17    if task['type']['name'].lower() in ['compositing']:
18        # Do not show the Maya launcher on compositing task launch
19        data['integration']['disable'] = True
20
21..

The implementation is straightforward: when Connect emits a discover event, the Maya hook checks the task type of the selected entity and disables the launcher accordingly.

Within the same hook, you can also augment the environment variables sent to Maya, as sketched below.
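Here is a hedged sketch of what that could look like, continuing the same discover_maya.py hook. It assumes the hook also defines a launch callback (named on_launch_maya_pipeline here and subscribed to the launch topic in the hook's elided register() function) that re-uses the discovery data, and that the application launcher applies an env mapping found under data['integration'], including the .prepend key suffix convention used by the built-in hooks; the MYSTUDIO_ROOT variable is purely illustrative:

import os


def on_launch_maya_pipeline(session, event):
    '''Augment the environment passed to Maya on launch (sketch).'''
    # Start from the discovery data returned by on_discover_maya_pipeline above.
    data = on_discover_maya_pipeline(session, event)

    data['integration'].setdefault('env', {})
    # Hypothetical studio-wide variable made available inside Maya.
    data['integration']['env']['MYSTUDIO_ROOT'] = r'\\server\share\PIPELINE'
    # Prepend an extra folder to Maya's script path.
    data['integration']['env']['MAYA_SCRIPT_PATH.prepend'] = (
        r'\\server\share\PIPELINE\maya\scripts'
    )

    return data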

Changing paths to DCC executables and arguments

In some cases, you might want to change the location of DCC executables or the arguments passed to them. This is done by modifying the config files within the ftrack-application-launcher module.

In this example we are going to change the Windows location of Maya and add an argument:

ftrack-application-launcher/resource/config/maya-pipeline.json

 1{
 2    "priority":100,
 3    "context": ["Task"],
 4    "identifier": "ftrack-connect-launch-maya",
 5    "applicationIdentifier":"maya_{variant}",
 6    "integrations": {
 7        "pipeline":[
 8            "ftrack-connect-pipeline-definition",
 9            "ftrack-connect-pipeline",
10            "ftrack-connect-pipeline-qt",
11            "ftrack-connect-pipeline-maya"
12        ]
13    },
14    "label": "Maya",
15    "icon": "maya",
16    "variant": "{version}",
17    "search_path":{
18        "linux": {
19            "prefix":["/", "usr","autodesk","maya.+"],
20            "expression":["bin","maya$"],
21            "version_expression": "maya(?P<version>\\d{4})"
22        },
23        "windows": {
24            "prefix":["E:\\", "Program Files.*"],
25            "expression":["Autodesk", "Maya.+", "bin", "maya.exe"],
26            "launch_arguments": ["-pythonver","2"]
27        },
28        "darwin": {
29            "prefix":["/", "Applications"],
30            "expression": ["Autodesk", "maya.+", "Maya.app"]
31        }
32    },
33    "console": true
34 }

The changes we have made:

  • Changed the Windows Maya executable base path to E: instead of C:

  • Added the launch arguments “-pythonver 2” to Maya on Windows.

References

Find full documentation on how to create a launcher here: ftrack Application Launcher developer documentation

Deploy the customised pipeline within studio

Before we can start using our custom pipeline, we want to make sure Maya can be launched using our customised framework through Connect.

Create and activate a virtual environment

To be able to build the framework integrations, we need to create a Python 3.7.12 virtual environment:

  1. Download Python 3.7.12 (https://www.python.org/downloads/release/python-3712/)

  2. Open a shell/command prompt and install virtualenv: pip install virtualenv

  3. Create the virtual environment: virtualenv venv_py3712

  4. Activate it (on Windows): venv_py3712\Scripts\activate

Build the integrations

We build each integration using this virtual env:

$ cd mypipeline\ftrack-connect-pipeline-definition
$ python setup.py build_plugin

We repeat this for the ftrack-connect-pipeline-maya repository.

The built plugin will end up in the build/ folder.

Install the integrations on another machine

Before we deploy centrally, we advise testing the integrations on a separate machine and ironing out any bugs with rigorous testing.

Copy the integrations from each build/ folder to the Connect default plug-in search path, overwriting the existing plugins:

  1. Windows: C:\Documents and Settings\<User>\Application Data\Local Settings\ftrack\ftrack-connect-plugins

  2. Linux: ~/.local/share/ftrack-connect-plugins

  3. OSX: ~/Library/Application Support/ftrack-connect-plugins

Also copy the custom folder structure plug-in (the custom-location-plugin folder).

Restart ftrack Connect and verify that all plugins are installed, including ftrack-application-launcher, ftrack-connect-pipeline-qt, ftrack-connect-application-launcher-widget.

Launch Maya on an animation task and verify the publish and load functions, together with the task status tool.

Deploy the integrations studio wide

Here we make sure all on-prem workstations load the same centrally deployed integrations:

  1. Pick a folder on the network (e.g. \\server\share\PIPELINE\connect)

  2. Move all integrations from local test deploy to this folder.

  3. Move the custom location plug-in to a separate folder (e.g. \\server\share\PIPELINE\api)

  4. Setup FTRACK_CONNECT_PLUGIN_PATH to point to these folders (e.g. FTRACK_CONNECT_PLUGIN_PATH=\\server\share\PIPELINE\connect;\\server\share\PIPELINE\api)

  5. Set FTRACK_EVENT_PLUGIN_PATH to point to the custom location structure; this enables our custom folder structure within all API sessions (e.g. FTRACK_EVENT_PLUGIN_PATH=\\server\share\PIPELINE\api).

Important

Make sure that the default builds of the integration plugins you have customised are removed from Connect on workstations across your studio (see the local plug-in folders above); otherwise they will collide with the central deployment.

Standalone

This section describes how to use the pipeline Framework in standalone mode, from within the DCC application or outside.

Python Example

This is an example of how to run the Framework in a Python console without Connect or any DCC running in the background. This way, the Framework is able to discover any definition where the host type is python.

mypipeline/standalone-snippets/python_standalone_publish_snippet.py

 1import os
 2from ftrack_connect_pipeline import host, constants, event
 3import ftrack_api
 4
 5# Set the minimum required Environment variables.
 6os.environ['FTRACK_EVENT_PLUGIN_PATH'] = (
 7    '<path-to-your-repo-folder>/ftrack-connect-pipeline-definition/resource/plugins:'
 8    '<path-to-your-repo-folder>/ftrack-connect-pipeline-definition/resource/definitions:'
 9)
10
11# Create a session and Event Manager
12session = ftrack_api.Session(auto_connect_event_hub=False)
13event_manager = event.EventManager(
14    session=session, mode=constants.LOCAL_EVENT_MODE
15)
16
17# Init host
18host_class = host.Host(event_manager)
19
20# Init Client
21from ftrack_connect_pipeline import client
22
23client_connection = client.Client(event_manager)
24
25# Discover hosts
26client_connection.discover_hosts()
27
28# Print discovered hosts
29# client_connection.host_connections
30
31# Setup a host
32client_connection.change_host(client_connection.host_connections[0])
33
34# Set a context id
35# You can choose to set the context id in the host or in the client,
36# both ways will work.
37host_class.context_id = '<your-context-id>'
38# client_connection.context_id = '<your-context-id>'
39
40# Select the File Publisher definition
41definition = client_connection.host_connection.definitions[
42    'publisher'
43].get_all(name='File Publisher')[0]
44
45# Assign the definition to the client
46client_connection.change_definition(definition)
47
48# Make the desired changes:
49collector_plugins = definition.get_all(category='plugin', type='collector')
50collector_plugins[0].options.path = '<Path-to-the-file-to-publish>'
51collector_plugins[1].options.path = '<Path-to-the-file-to-publish>'
52
53# Execute the definition.
54client_connection.run_definition()
55
56# You could now make more changes and run the definition again to publish a new version.
57# collector_plugins = definition.get_all(category='plugin', type='collector')
58# collector_plugins[0].options.path='<Path-to-another-file-to-publish>'
59# collector_plugins[1].options.path='<Path-to-another-file-to-publish>'
60# client_connection.run_definition()

DCC Maya Example

This is an example of how to run the Framework inside the Maya console. All definitions with host type maya or python will be discovered.

Warning

DCC launch is subject to improvement in future releases of the Framework, making it possible to share the launcher with Connect and enable a consistent, context-aware Framework setup. For now, we highly recommend launching the DCC from Connect to avoid having to manually set up all the environment variables. Please refer to the Discover Framework from Standalone DCC section if you want to set them up manually.

mypipeline/standalone-snippets/maya_standalone_publish_snippet.py

 1import os
 2from ftrack_connect_pipeline import constants, event
 3from ftrack_connect_pipeline_maya import host
 4import ftrack_api
 5
 6# Set the minimum required Environment variables.
 7os.environ['FTRACK_EVENT_PLUGIN_PATH'] = (
 8    '<path-to-your-repo-folder>/ftrack-connect-pipeline-definition/resource/plugins:'
 9    '<path-to-your-repo-folder>/ftrack-connect-pipeline-definition/resource/definitions:'
10)
11
12# Create a session and Event Manager
13session = ftrack_api.Session(auto_connect_event_hub=False)
14event_manager = event.EventManager(
15    session=session, mode=constants.LOCAL_EVENT_MODE
16)
17
18# Init Maya host
19host_class = host.MayaHost(event_manager)
20
21# Init Client
22from ftrack_connect_pipeline import client
23
24client_connection = client.Client(event_manager)
25
26# Discover hosts
27client_connection.discover_hosts()
28
29# Print discovered hosts
30# client_connection.host_connections
31
32# Setup a host
33client_connection.change_host(client_connection.host_connections[0])
34
35# Set a context id
36# You can choose to set the context id in the host or in the client,
37# both ways will work.
38host_class.context_id = '<your-context-id>'
39# client_connection.context_id = '<your-context-id>'
40
41# Select the File Publisher definition
42definition = client_connection.host_connection.definitions[
43    'publisher'
44].get_all(name='Geometry Publisher')[0]
45
46# Assign the definition to the client
47client_connection.change_definition(definition)
48
49# Make the desired changes:
50collector_plugins = definition.get_all(
51    category='plugin', type='collector', name='Geometry Collector'
52)
53collector_plugins[0].options.collected_objects = ['pCube']
54collector_plugins[1].options.collected_objects = ['pCube']
55collector_plugins[2].options.collected_objects = ['pCube']
56
57# Execute the definition.
58client_connection.run_definition()

Discover Framework from Standalone DCC

These are the environment variables that have to be set up for the Framework to be discovered in a DCC application that is not launched from Connect.

Warning

This is a Maya example. Please replace maya and the plugin versions with your desired DCC and plugin versions.

Required Environment Variables:


PYTHONPATH

<your-local-path-to>/ftrack-connect-plugins/ftrack-connect-pipeline-maya-1.0.2/dependencies;
<your-local-path-to>/ftrack-connect-plugins/ftrack-connect-pipeline-maya-1.0.2/resource/scripts;
<your-local-path-to>/ftrack-connect-plugins/ftrack-connect-pipeline-qt-1.0.3/dependencies;
<your-local-path-to>/ftrack-connect-plugins/ftrack-connect-pipeline-1.0.4/dependencies;
<your-local-path-to>/ftrack/ftrack-connect-plugins/ftrack-connect-pipeline-definition-1.0.3/dependencies;
<your-local-path-to>/ftrack/ftrack-connect-plugins/ftrack-application-launcher-1.0.6/dependencies;

FTRACK_EVENT_PLUGIN_PATH

<your-local-path-to>/ftrack-connect-plugins/ftrack-connect-pipeline-definition-1.0.3/resource/plugins/maya/python;
<your-local-path-to>/ftrack-connect-plugins/ftrack-connect-pipeline-definition-1.0.3/resource/plugins/qt/python;
<your-local-path-to>/ftrack-connect-plugins/ftrack-connect-pipeline-definition-1.0.3/resource/plugins/common/python;
<your-local-path-to>/ftrack-connect-plugins/ftrack-connect-pipeline-definition-1.0.3/resource/definitions;

FTRACK_DEFINITION_PLUGIN_PATH

<your-local-path-to>/ftrack-connect-plugins/ftrack-connect-pipeline-definition-1.0.3/resource/plugins

MAYA_SCRIPT_PATH

<your-local-path-to>/ftrack-connect-plugins/ftrack-connect-pipeline-maya-1.0.2/resource/scripts

API Reference

ftrack_connect_pipeline

ftrack_connect_pipeline.asset

class ftrack_connect_pipeline.asset.FtrackObjectManager(event_manager)[source]

Bases: object

FtrackObjectManager class. Maintains the synchronisation between asset_info and the ftrack information of the objects in the scene.

class DccObject(name=None, from_id=None, **kwargs)

Bases: dict

Base DccObject class.

__init__(name=None, from_id=None, **kwargs)

If from_id is provided, find an object in the DCC with the given from_id as asset_info_id. If a name is provided, create a new object in the DCC.

connect_objects(objects)

Link the given objects ftrack attribute to the self name object asset_link attribute in the DCC.

objects List of DCC objects

create(name)

Creates a new dcc_object with the given name.

static dictionary_from_object(object_name)

Static method to be used without initializing the current class. Returns a dictionary with the keys and values of the given object_name if exists.

object_name ftrack object type from the DCC.

from_asset_info_id(asset_info_id)

Checks the DCC to get all the ftrack objects, compares them with the given asset_info_id and returns those that match.

ftrack_plugin_id = None

Plugin id used on some DCC applications

get(k, default=None)

If exists, returns the value of the given k otherwise returns default.

k : Key of the current dictionary.

default : Default value of the given Key.

property name

Return name of the object

property objects_loaded

Returns the attribute objects_loaded of the current self name

setdefault(key, value=None)

Sets a default value for the given key.

update(*args, **kwargs)

Updates the current keys and values with the given ones.

property asset_info

Returns instance of FtrackAssetInfo

property dcc_object

Returns instance of DccObject

property session

Returns instance of ftrack_api.session.Session

property event_manager

Returns instance of EventManager

property is_sync

Returns if the self dcc_object is sync with the self asset_info

property objects_loaded

Returns whether the objects are loaded in the scene or not.

__init__(event_manager)[source]

Initialize FtrackObjectManager with instance of EventManager

connect_objects(objects)[source]

Link the given objects ftrack attribute to the self dcc_object.

objects List of objects

create_new_dcc_object()[source]

Creates a new dcc_object with a unique name.

ftrack_connect_pipeline.asset.asset_info
ftrack_connect_pipeline.asset.asset_info.generate_asset_info_dict_from_args(context_data, data, options, session)[source]

Returns a dictionary constructed from the needed values of the given context_data, data and options

context_data : Context dictionary of the current asset. Should contain the keys asset_type_name, asset_name, asset_id, version_number, version_id, context_id.

data : Data of the current operation or plugin. Should contain the component_path from the asset that we are working on.

options : Options of the current widget or operation; should contain the load_mode that we want to apply (or have applied) to the current asset.

session : Should be an instance of ftrack_api.session.Session to use for communication with the server.

class ftrack_connect_pipeline.asset.asset_info.FtrackAssetInfo(mapping=None, **kwargs)[source]

Bases: dict

Base FtrackAssetInfo class.

__init__(mapping=None, **kwargs)[source]

Initialize the FtrackAssetInfo with the given mapping.

mapping Dictionary with the asset information.

encode_options(asset_info_options)[source]

Encodes the json value from the given asset_info_options to base64.

asset_info_options : Options used to load the asset in the scene.

decode_options(asset_info_options)[source]

Decodes the json value from the given asset_info_options from base64.

asset_info_options : Options used to load the asset in the scene.

get(k, default=None)[source]

If exists, returns the value of the given k otherwise returns default.

k : Key of the current dictionary.

default : Default value of the given Key.

setdefault(k, default=None)[source]

Sets the default value for the given k.

k : Key of the current dictionary.

default : Default value of the given Key.

classmethod from_version_entity(version_entity, component_name)[source]

Returns an FtrackAssetInfo object generated from the given version_entity and the given component_name.

version_entity : ftrack_api.entity.asset_version.AssetVersion

component_name : Component name

ftrack_connect_pipeline.asset.dcc_object
class ftrack_connect_pipeline.asset.dcc_object.DccObject(name=None, from_id=None, **kwargs)[source]

Bases: dict

Base DccObject class.

ftrack_plugin_id = None

Plugin id used on some DCC applications

property name

Return name of the object

property objects_loaded

Returns the attribute objects_loaded of the current self name

__init__(name=None, from_id=None, **kwargs)[source]

If from_id is provided, find an object in the DCC with the given from_id as asset_info_id. If a name is provided, create a new object in the DCC.

get(k, default=None)[source]

If exists, returns the value of the given k otherwise returns default.

k : Key of the current dictionary.

default : Default value of the given Key.

update(*args, **kwargs)[source]

Updates the current keys and values with the given ones.

setdefault(key, value=None)[source]

Sets a default value for the given key.

create(name)[source]

Creates a new dcc_object with the given name.

from_asset_info_id(asset_info_id)[source]

Checks the DCC to get all the ftrack objects, compares them with the given asset_info_id and returns those that match.

static dictionary_from_object(object_name)[source]

Static method to be used without initializing the current class. Returns a dictionary with the keys and values of the given object_name if exists.

object_name ftrack object type from the DCC.

connect_objects(objects)[source]

Link the given objects ftrack attribute to the self name object asset_link attribute in the DCC.

objects List of DCC objects

ftrack_connect_pipeline.client

class ftrack_connect_pipeline.client.HostConnection(event_manager, host_data)[source]

Bases: object

Host Connection Base class. This class is used to communicate between the client and the host.

property context_id

Returns the current context id as fetched from the host

property event_manager

Returns instance of EventManager

property session

Returns instance of ftrack_api.session.Session

property definitions

Returns the current definitions, filtered on discoverable.

property id

Returns the current host id.

property name

Returns the current host name.

property host_types

Returns the list of compatible host for the current definitions.

__init__(event_manager, host_data)[source]

Initialise HostConnection with instance of EventManager and host_data.

host_data : Dictionary containing the host information. provide_host_information()

run(data, engine, callback=None)[source]

Publish an event with the topic PIPELINE_HOST_RUN with the given data and engine.

launch_client(name, source=None)[source]

Send a widget launch event, to be picked up by DCC.

subscribe_host_context_change()[source]

Have host connection subscribe to context change events, to be able to notify client

change_host_context_id(context_id)[source]
class ftrack_connect_pipeline.client.Client(event_manager, multithreading_enabled=True)[source]

Bases: object

Base client class.

ui_types = [None]

Compatible UI for this client.

definition_filters = None

Use only definitions that match the definition_filters

definition_extensions_filter = None

(Open) Only show definitions and components capable of accepting these filename extensions.

property session

Returns instance of ftrack_api.session.Session

property event_manager

Returns instance of EventManager

property connected

Returns True if client is connected to a HOST

property context_id

Returns the current context id from host

property context

Returns the current context

property host_connections

Return the current list of host_connections

property host_connection

Return instance of HostConnection

property schema

Return the current schema.

property definition

Returns the current definition.

property definitions

Returns the definitions list of the current host connection

property engine_type

Return the current engine type

property logs

Return the log items

property multithreading_enabled

Return True if DCC supports multithreading (write operations)

__init__(event_manager, multithreading_enabled=True)[source]

Initialise Client with instance of EventManager

discover_hosts(force_rediscover=False, time_out=3)[source]

Searches for available hosts during the optional time_out and returns a list of discovered HostConnection instances.

Skip this and use the existing singleton host connection if previously detected, unless force_rediscover is True.

filter_host(host_connection)[source]

Return True if the host_connection should be considered

host_connection: ftrack_connect_pipeline.client.HostConnection

change_host(host_connection)[source]

The client (user) has chosen the host connection to use; set it as host_connection.

on_hosts_discovered(host_connections)[source]

Callback for when hosts have been discovered. To be overridden by the qt client.

on_host_changed(host_connection)[source]

Called when the host has been (re-)selected by the user. To be overridden by the qt client.

subscribe_host_context_change()[source]

Have host connection subscribe to context change events, to be able to notify client

on_context_changed(context_id)[source]

Called when the context has been set or changed within the host connection, either from this client or remote (other client or the host). Should be overridden by client.

unsubscribe_host_context_change()[source]

Unsubscribe from client context change events

run_definition(definition=None, engine_type=None)[source]

Calls the run() to run the entire given definition with the given engine_type.

Callback received at _run_callback()

get_schema_from_definition(definition)[source]

Return matching schema for the given definition

change_definition(definition, schema=None)[source]

Assign the given schema and the given definition as the current schema and definition

run_plugin(plugin_data, method, engine_type)[source]

Calls the run() to run one single plugin.

Callback received at _run_callback()

plugin_data : Dictionary with the plugin information.

method : method of the plugin to be run

on_ready(callback, time_out=3)[source]

Calls the given callback method when a host has been discovered, within the optional time_out.

change_engine(engine_type)[source]

Assign the given engine_type as the current engine_type

on_client_notification()[source]

Subscribe to topic PIPELINE_CLIENT_NOTIFICATION to receive client notifications from the host in _notify_client()

ftrack_connect_pipeline.client.asset_manager
class ftrack_connect_pipeline.client.asset_manager.AssetManagerClient(event_manager, multithreading_enabled=True)[source]

Bases: Client

Asset Manager Client Base Class

definition_filters = ['asset_manager']

Use only definitions that match the definition_filters

__init__(event_manager, multithreading_enabled=True)[source]

Initialise AssetManagerClient with instance of EventManager

on_host_changed(host_connection)[source]

Asset manager host has been selected, fetch definition. Return False if no definitions.

resolve_dependencies(context_id, resolve_dependencies_callback, options=None)[source]

Calls run() to run the resolve_dependencies() method to fetch the list of version dependencies for the given context_id.

Callback received resolve_dependencies_callback

context_id : Should be the ID of an existing task.

resolve_dependencies_callback : Callback function that should take the result as argument.

options : The options to supply to the plugin.

discover_assets(plugin=None)[source]

Calls the ftrack_connect_pipeline.client.HostConnection.run() to run the method ftrack_connect_pipeline.host.engine.AssetManagerEngine.discover_assets()

Callback received at _asset_discovered_callback()

plugin : Optional plugin to be run in the method. (Not implemented yet)

load_assets(asset_info_list)[source]

Calls run() to run the load_assets() method to load the assets of the given asset_info_list.

Callback received at _load_assets_callback()

asset_info_list : Should be a list of instances of FtrackAssetInfo

select_assets(asset_info_list)[source]

Calls run() to run the select_assets() method to select the assets of the given asset_info_list.

asset_info_list : Should be a list of instances of FtrackAssetInfo

update_assets(asset_info_list, plugin)[source]

Calls run() to run the update_assets() method to update the assets of the given asset_info_list to the latest version.

Callback received at _update_assets_callback()

asset_info_list : Should be a list of instances of FtrackAssetInfo

plugin : The plugin definition of the plugin to run during the update_assets method

change_version(asset_info, new_version_id)[source]

Calls ftrack_connect_pipeline.client.HostConnection.run() to run the method ftrack_connect_pipeline.host.engine.AssetManagerEngine.change_version() to change the version of the given asset_info.

Callback received at _change_version_callback()

asset_info : Should be instance of ftrack_connect_pipeline.asset.FtrackAssetInfo

new_version_id : Should be an AssetVersion id.

unload_assets(asset_info_list)[source]

Calls run() to run the unload_assets() method to unload the assets of the given asset_info_list.

Callback received at _unload_assets_callback()

asset_info_list : Should be a list of instances of FtrackAssetInfo

remove_assets(asset_info_list)[source]

Calls run() to run the remove_assets() method to remove the assets of the given asset_info_list.

Callback received at _remove_assets_callback()

asset_info_list : Should be a list of instances of FtrackAssetInfo

ftrack_connect_pipeline.client.loader
class ftrack_connect_pipeline.client.loader.LoaderClient(event_manager, multithreading_enabled=True)[source]

Bases: Client

Loader Client Base Class

definition_filters = ['loader']

Use only definitions that match the definition_filters

__init__(event_manager, multithreading_enabled=True)[source]

Initialise LoaderClient with instance of EventManager

ftrack_connect_pipeline.client.opener
class ftrack_connect_pipeline.client.opener.OpenerClient(event_manager)[source]

Bases: Client

Opener Client Base Class

definition_filters = ['opener']

Use only definitions that match the definition_filters

__init__(event_manager)[source]

Initialise OpenerClient with instance of EventManager

ftrack_connect_pipeline.client.publisher
class ftrack_connect_pipeline.client.publisher.PublisherClient(event_manager)[source]

Bases: Client

Publisher Client Base Class

definition_filters = ['publisher']

Use only definitions that match the definition_filters

__init__(event_manager)[source]

Initialise PublisherClient with instance of EventManager

ftrack_connect_pipeline.client.log_viewer
class ftrack_connect_pipeline.client.log_viewer.LogViewerClient(event_manager)[source]

Bases: Client

Log Viewer Client Base Class

definition_filters = ['log_viewer']

Use only definitions that match the definition_filters

__init__(event_manager)[source]

Initialise LogViewerClient with instance of EventManager

ftrack_connect_pipeline.constants

ftrack_connect_pipeline.constants.UI_TYPE = None

Default ui type for ftrack_connect_pipeline

ftrack_connect_pipeline.constants.HOST_TYPE = 'python'

Default host type for ftrack_connect_pipeline

ftrack_connect_pipeline.constants.STEP = 'step'

Step Category.

ftrack_connect_pipeline.constants.STAGE = 'stage'

Stage Category.

ftrack_connect_pipeline.constants.PLUGIN = 'plugin'

Plugin Category.

ftrack_connect_pipeline.constants.CONTEXTS = 'contexts'

Contexts step group.

ftrack_connect_pipeline.constants.FINALIZERS = 'finalizers'

Finalizers step group.

ftrack_connect_pipeline.constants.COMPONENTS = 'components'

Components step group.

ftrack_connect_pipeline.constants.CONTEXT = 'context'

Contexts step type.

ftrack_connect_pipeline.constants.FINALIZER = 'finalizer'

Finalizers step type.

ftrack_connect_pipeline.constants.COMPONENT = 'component'

Components step type.

ftrack_connect_pipeline.constants.COLLECTOR = 'collector'

Collector component stage name.

ftrack_connect_pipeline.constants.VALIDATOR = 'validator'

Validator component stage name.

ftrack_connect_pipeline.constants.EXPORTER = 'exporter'

Output component stage name.

ftrack_connect_pipeline.constants.IMPORTER = 'importer'

Importer component stage name.

ftrack_connect_pipeline.constants.POST_IMPORTER = 'post_importer'

Post_import component stage name.

ftrack_connect_pipeline.constants.OPENER = 'opener'

Opener client and its definition.

ftrack_connect_pipeline.constants.LOADER = 'loader'

Loader client and its definition used with assembler

ftrack_connect_pipeline.constants.PUBLISHER = 'publisher'

Publisher client and its definition.

ftrack_connect_pipeline.constants.PIPELINE_REGISTER_TOPIC = 'ftrack.pipeline.register'

Pipeline register topic event. Published by the Host and used to register the definitions module. Definitions Docs

ftrack_connect_pipeline.constants.PIPELINE_RUN_PLUGIN_TOPIC = 'ftrack.pipeline.run'

Pipeline run plugin topic event. Used to run the plugins. Published in change_version() and _run_plugin(). Subscribed to run the plugins in register()

ftrack_connect_pipeline.constants.PIPELINE_DISCOVER_PLUGIN_TOPIC = 'ftrack.pipeline.discover'

Pipeline discover plugin topic event. Used to discover the plugins. Published in _discover_plugin(), Subscribed to discover the plugins in register()

ftrack_connect_pipeline.constants.PIPELINE_HOST_RUN = 'ftrack.pipeline.host.run'

Pipeline host run plugin topic event. Used by the host connection to communicate between client and host and make the host run the plugins. Published in run(), and subscribed in on_register_definition()

ftrack_connect_pipeline.constants.PIPELINE_CLIENT_NOTIFICATION = 'ftrack.pipeline.client.notification'

Pipeline client notification topic event. Used to communicate the result of the plugin execution from host to the client. Published in _notify_client(), and Subscribed in on_client_notification()

ftrack_connect_pipeline.constants.PIPELINE_CLIENT_PROGRESS_NOTIFICATION = 'ftrack.pipeline.client.progress.notification'

Pipeline client progress notification topic event. Used to communicate the result of the steps execution from host to the client. Published in _notify_progress_client(), and Subscribed in on_client_progress_notification()

ftrack_connect_pipeline.constants.PIPELINE_DISCOVER_HOST = 'ftrack.pipeline.host.discover'

Pipeline Discover host topic event. Used to discover available hosts. Published in _discover_hosts(), and Subscribed in on_register_definition()

ftrack_connect_pipeline.constants.asset
ftrack_connect_pipeline.constants.asset.DCC_OBJECT_NAME = '{}_ftrackdata_{}'

Name of the ftrack object to identify the loaded assets

ftrack_connect_pipeline.constants.asset.ASSET_ID = 'asset_id'

Asset id constant identifier key for ftrack assets connected or used with FtrackAssetInfo and the DCC ftrack plugin.

ftrack_connect_pipeline.constants.asset.ASSET_NAME = 'asset_name'

Asset name constant identifier key for ftrack assets connected or used with FtrackAssetInfo and the DCC ftrack plugin.

ftrack_connect_pipeline.constants.asset.CONTEXT_PATH = 'context_path'

context path constant identifier key for ftrack assets connected or used with FtrackAssetInfo and the DCC ftrack plugin.

ftrack_connect_pipeline.constants.asset.ASSET_TYPE_NAME = 'asset_type_name'

Asset type constant identifier key for ftrack assets connected or used with FtrackAssetInfo and the DCC ftrack plugin.

ftrack_connect_pipeline.constants.asset.VERSION_ID = 'version_id'

Version id constant identifier key for ftrack assets connected or used with FtrackAssetInfo and the DCC ftrack plugin.

ftrack_connect_pipeline.constants.asset.VERSION_NUMBER = 'version_number'

Version number constant identifier key for ftrack assets connected or used with FtrackAssetInfo and the DCC ftrack plugin.

ftrack_connect_pipeline.constants.asset.COMPONENT_PATH = 'component_path'

Component path constant identifier key for ftrack assets connected or used with FtrackAssetInfo and the DCC ftrack plugin.

ftrack_connect_pipeline.constants.asset.COMPONENT_NAME = 'component_name'

Component name constant identifier key for ftrack assets connected or used with FtrackAssetInfo and the DCC ftrack plugin.

ftrack_connect_pipeline.constants.asset.COMPONENT_ID = 'component_id'

Component id constant identifier key for ftrack assets connected or used with FtrackAssetInfo and the DCC ftrack plugin.

ftrack_connect_pipeline.constants.asset.LOAD_MODE = 'load_mode'

Load Mode constant identifier key for ftrack assets connected or used with FtrackAssetInfo and the DCC ftrack plugin.

ftrack_connect_pipeline.constants.asset.ASSET_INFO_OPTIONS = 'asset_info_options'

Asset info options constant identifier key for ftrack assets connected or used with FtrackAssetInfo and the DCC ftrack plugin.

ftrack_connect_pipeline.constants.asset.REFERENCE_OBJECT = 'reference_object'

Reference object constant identifier key for ftrack assets connected or used with FtrackAssetInfo and the DCC ftrack plugin.

ftrack_connect_pipeline.constants.asset.IS_LATEST_VERSION = 'is_latest_version'

Is latest version constant identifier key for ftrack assets connected or used with FtrackAssetInfo and the DCC ftrack plugin.

ftrack_connect_pipeline.constants.asset.ASSET_INFO_ID = 'asset_info_id'

Asset info ID constant identifier key for ftrack assets connected or used with FtrackAssetInfo and the DCC ftrack plugin.

ftrack_connect_pipeline.constants.asset.DEPENDENCY_IDS = 'dependency_ids'

Dependency ids constant identifier key for ftrack assets connected or used with FtrackAssetInfo and the DCC ftrack plugin.

ftrack_connect_pipeline.constants.asset.OBJECTS_LOADED = 'objects_loaded'

Is loaded constant identifier key for ftrack assets connected or used with FtrackAssetInfo and the DCC ftrack plugin.

ftrack_connect_pipeline.constants.asset.VERSION = '1.0'

Identifier version of the asset constants and plugin.

ftrack_connect_pipeline.constants.asset.KEYS = ['asset_id', 'asset_name', 'context_path', 'asset_type_name', 'version_id', 'version_number', 'component_path', 'component_name', 'component_id', 'load_mode', 'asset_info_options', 'reference_object', 'is_latest_version', 'asset_info_id', 'dependency_ids', 'objects_loaded']

List of all the constants keys used for the FtrackAssetInfo

ftrack_connect_pipeline.constants.plugin
ftrack_connect_pipeline.constants.plugin.asset_manager
ftrack_connect_pipeline.constants.plugin.asset_manager.PLUGIN_AM_ACTION_TYPE = 'asset_manager.action'

Asset Manager plugin type for action plugins

ftrack_connect_pipeline.constants.plugin.asset_manager.PLUGIN_AM_DISCOVER_TYPE = 'asset_manager.discover'

Asset Manager plugin type for discover plugins

ftrack_connect_pipeline.constants.plugin.asset_manager.PLUGIN_AM_RESOLVE_TYPE = 'asset_manager.resolver'

Asset Manager plugin type for resolver plugins

ftrack_connect_pipeline.constants.plugin.open
ftrack_connect_pipeline.constants.plugin.open.PLUGIN_OPENER_FINALIZER_TYPE = 'opener.finalizer'

Opener plugin type for finalizer plugins

ftrack_connect_pipeline.constants.plugin.open.PLUGIN_OPENER_POST_FINALIZER_TYPE = 'opener.post_finalizer'

Opener plugin type for post finalizer plugins

ftrack_connect_pipeline.constants.plugin.open.PLUGIN_OPENER_PRE_FINALIZER_TYPE = 'opener.pre_finalizer'

Opener plugin type for pre finalizer plugins

ftrack_connect_pipeline.constants.plugin.open.PLUGIN_OPENER_CONTEXT_TYPE = 'opener.context'

Opener plugin type for context plugins

ftrack_connect_pipeline.constants.plugin.open.PLUGIN_OPENER_COLLECTOR_TYPE = 'opener.collector'

Opener plugin type for collector plugins

ftrack_connect_pipeline.constants.plugin.open.PLUGIN_OPENER_IMPORTER_TYPE = 'opener.importer'

Opener plugin type for importer plugins

ftrack_connect_pipeline.constants.plugin.open.PLUGIN_OPENER_POST_IMPORTER_TYPE = 'opener.post_importer'

Opener plugin type for post import plugins

ftrack_connect_pipeline.constants.plugin.load
ftrack_connect_pipeline.constants.plugin.load.PLUGIN_LOADER_FINALIZER_TYPE = 'loader.finalizer'

Loader plugin type for finalizer plugins

ftrack_connect_pipeline.constants.plugin.load.PLUGIN_LOADER_POST_FINALIZER_TYPE = 'loader.post_finalizer'

Loader plugin type for post finalizer plugins

ftrack_connect_pipeline.constants.plugin.load.PLUGIN_LOADER_PRE_FINALIZER_TYPE = 'loader.pre_finalizer'

Loader plugin type for pre finalizer plugins

ftrack_connect_pipeline.constants.plugin.load.PLUGIN_LOADER_CONTEXT_TYPE = 'loader.context'

Loader plugin type for context plugins

ftrack_connect_pipeline.constants.plugin.load.PLUGIN_LOADER_COLLECTOR_TYPE = 'loader.collector'

Loader plugin type for collector plugins

ftrack_connect_pipeline.constants.plugin.load.PLUGIN_LOADER_IMPORTER_TYPE = 'loader.importer'

Loader plugin type for importer plugins

ftrack_connect_pipeline.constants.plugin.load.PLUGIN_LOADER_POST_IMPORTER_TYPE = 'loader.post_importer'

Loader plugin type for post import plugins

ftrack_connect_pipeline.constants.plugin.publish
ftrack_connect_pipeline.constants.plugin.publish.PLUGIN_PUBLISHER_FINALIZER_TYPE = 'publisher.finalizer'

Publisher plugin type for finalizer plugins

ftrack_connect_pipeline.constants.plugin.publish.PLUGIN_PUBLISHER_POST_FINALIZER_TYPE = 'publisher.post_finalizer'

Publisher plugin type for post finalizer plugins

ftrack_connect_pipeline.constants.plugin.publish.PLUGIN_PUBLISHER_PRE_FINALIZER_TYPE = 'publisher.pre_finalizer'

Publisher plugin type for pre finalizer plugins

ftrack_connect_pipeline.constants.plugin.publish.PLUGIN_PUBLISHER_CONTEXT_TYPE = 'publisher.context'

Publisher plugin type for context plugins

ftrack_connect_pipeline.constants.plugin.publish.PLUGIN_PUBLISHER_COLLECTOR_TYPE = 'publisher.collector'

Publisher plugin type for collector plugins

ftrack_connect_pipeline.constants.plugin.publish.PLUGIN_PUBLISHER_VALIDATOR_TYPE = 'publisher.validator'

Publisher plugin type for validator plugins

ftrack_connect_pipeline.constants.plugin.publish.PLUGIN_PUBLISHER_EXPORTER_TYPE = 'publisher.exporter'

Publisher plugin type for exporter plugins

ftrack_connect_pipeline.constants.event
ftrack_connect_pipeline.constants.event.REMOTE_EVENT_MODE = 1

Run the events of the session in remote mode

ftrack_connect_pipeline.constants.event.LOCAL_EVENT_MODE = 0

Run the events of the session in local mode

ftrack_connect_pipeline.constants.status
ftrack_connect_pipeline.constants.status.UNKNOWN_STATUS = 'UNKNOWN_STATUS'

Unknown status of plugin execution.

ftrack_connect_pipeline.constants.status.SUCCESS_STATUS = 'SUCCESS_STATUS'

Success status of plugin execution.

ftrack_connect_pipeline.constants.status.WARNING_STATUS = 'WARNING_STATUS'

Warning status of plugin execution.

ftrack_connect_pipeline.constants.status.ERROR_STATUS = 'ERROR_STATUS'

Error status of plugin execution.

ftrack_connect_pipeline.constants.status.EXCEPTION_STATUS = 'EXCEPTION_STATUS'

Exception status of plugin execution.

ftrack_connect_pipeline.constants.status.RUNNING_STATUS = 'RUNNING_STATUS'

Running status of plugin execution.

ftrack_connect_pipeline.constants.status.DEFAULT_STATUS = 'PAUSE_STATUS'

Default status of plugin execution.

ftrack_connect_pipeline.constants.status.status_bool_mapping = {'ERROR_STATUS': False, 'EXCEPTION_STATUS': False, 'PAUSE_STATUS': False, 'RUNNING_STATUS': False, 'SUCCESS_STATUS': True, 'UNKNOWN_STATUS': False, 'WARNING_STATUS': False}

Mapping of plugin run statuses to a valid or non-valid result.

ftrack_connect_pipeline.definition

ftrack_connect_pipeline.definition.collect_and_validate(session, current_dir, host_types)[source]

Collects and validates the definitions and the schemas of the given host in the given current_dir.

session : Instance of ftrack_api.session.Session.

current_dir : Directory path to look for the definitions.

host_types : Definition host types to look for.

ftrack_connect_pipeline.definition.collect
ftrack_connect_pipeline.definition.collect.resolve_schemas(data)[source]

Resolves the refs of the schemas in the given data

data : Dictionary of json definitions and schemas generated at collect_definitions()

ftrack_connect_pipeline.definition.collect.filter_definitions_by_host(data, host_types)[source]

Filter the definitions in the given data by the given host

data : Dictionary of json definitions and schemas generated at collect_definitions().

host_types : List of definition host types to filter by.

ftrack_connect_pipeline.definition.collect.collect_definitions(lookup_dir)[source]

Collect all the schemas and definitions from the given lookup_dir

lookup_dir : Directory path to look for the definitions.

ftrack_connect_pipeline.definition.validate
ftrack_connect_pipeline.definition.validate.validate_schema(data, session)[source]

Validates and augments the definitions and the schemas from the given data

data : Dictionary of json definitions and schemas generated at collect_definitions()

ftrack_connect_pipeline.host

ftrack_connect_pipeline.host.provide_host_information(context_id, host_id, definitions, host_name, event)[source]

Returns dictionary with host id, host name, context id and definition from the given host_id, definitions and host_name.

host_id : Host id

definitions : Dictionary with a valid definitions

host_name : Host name

class ftrack_connect_pipeline.host.Host(event_manager)[source]

Bases: object

host_types = ['python']

Compatible Host types for this HOST.

engines = {'asset_manager': <class 'ftrack_connect_pipeline.host.engine.asset_manager.AssetManagerEngine'>, 'loader': <class 'ftrack_connect_pipeline.host.engine.load.LoaderEngine'>, 'opener': <class 'ftrack_connect_pipeline.host.engine.open.OpenerEngine'>, 'publisher': <class 'ftrack_connect_pipeline.host.engine.publish.PublisherEngine'>}

Available engines for this host.

property context_id

Return the default context id set at host launch

property host_id

Returns the current host id.

property host_name

Returns the current host name

property session

Returns instance of ftrack_api.session.Session

__init__(event_manager)[source]

Initialise Host with instance of EventManager

run(event)[source]

Runs the data with the engine type defined in the given event.

Returns result of the engine run.

event : Published from the client host connection at run()

on_register_definition(event)[source]

Callback of register(). Validates the given event and subscribes to the ftrack_api.event.base.Event events with the topics PIPELINE_DISCOVER_HOST and PIPELINE_HOST_RUN.

event : Should contain a validated and complete dictionary of definitions, schemas and packages coming from ftrack_connect_pipeline_definition.resource.definitions.register.register_definitions()

validate(data)[source]

Validates the given data against the corresponding plugin validator. Returns the validated data.

data : Should be a complete dictionary of definitions and schemas coming from ftrack_connect_pipeline_definition.resource.definitions.register.register_definitions()

register()[source]

Publishes the ftrack_api.event.base.Event with the topic PIPELINE_REGISTER_TOPIC with the first host_type in the list host_types and type definition as the data.

Callback of the event points to on_register_definition()

reset()[source]

Empty the variables host_type, host_id and __registry

launch_client(name, source=None)[source]

Send a widget launch event, to be picked up by DCC.
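
A minimal sketch of bootstrapping a standalone (python) Host, assuming valid ftrack credentials in the environment and leaving the event manager mode at its documented default:

import ftrack_api

from ftrack_connect_pipeline import event, host

session = ftrack_api.Session(auto_connect_event_hub=True)

# Mode is left at the default documented in EventManager.__init__.
event_manager = event.EventManager(session)

standalone_host = host.Host(event_manager)
print(standalone_host.host_id)
print(standalone_host.host_types)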

ftrack_connect_pipeline.host.engine
ftrack_connect_pipeline.host.engine.getEngine(baseClass, engineType)[source]

Returns the Class or Subclass of the given baseClass that matches the name of the given engineType

class ftrack_connect_pipeline.host.engine.BaseEngine(event_manager, host_types, host_id, asset_type_name)[source]

Bases: object

Base engine class.

engine_type = 'base'

Engine type for this engine class

class FtrackObjectManager(event_manager)

Bases: object

FtrackObjectManager class. Maintains the synchronization between asset_info and the ftrack information of the objects in the scene.

class DccObject(name=None, from_id=None, **kwargs)

Bases: dict

Base DccObject class.

__init__(name=None, from_id=None, **kwargs)

If from_id is provided, find an object in the DCC with the given from_id as asset_info_id. If a name is provided, create a new object in the DCC.

connect_objects(objects)

Link the given objects ftrack attribute to the self name object asset_link attribute in the DCC.

objects List of DCC objects

create(name)

Creates a new dcc_object with the given name.

static dictionary_from_object(object_name)

Static method to be used without initializing the current class. Returns a dictionary with the keys and values of the given object_name if exists.

object_name ftrack object type from the DCC.

from_asset_info_id(asset_info_id)

Checks the dcc to get all the ftrack objects. Compares them with the given asset_info_id and returns them if matches.

ftrack_plugin_id = None

Plugin id used on some DCC applications

get(k, default=None)

If exists, returns the value of the given k otherwise returns default.

k : Key of the current dictionary.

default : Default value of the given Key.

property name

Return name of the object

property objects_loaded

Returns the attribute objects_loaded of the current self name

setdefault(key, value=None)

Sets a default value for the given key.

update(*args, **kwargs)

Updates the current keys and values with the given ones.

__init__(event_manager)

Initialize FtrackObjectManager with instance of EventManager

property asset_info

Returns instance of FtrackAssetInfo

connect_objects(objects)

Link the given objects ftrack attribute to the self dcc_object.

objects List of objects

create_new_dcc_object()

Creates a new dcc_object with a unique name.

property dcc_object

Returns instance of DccObject

property event_manager

Returns instance of EventManager

property is_sync

Returns whether the dcc_object is in sync with the asset_info.

property objects_loaded

Returns whether the objects are loaded in the scene or not.

property session

Returns instance of ftrack_api.session.Session

class DccObject(name=None, from_id=None, **kwargs)

Bases: dict

Base DccObject class.

__init__(name=None, from_id=None, **kwargs)

If from_id is provided, find an object in the DCC with the given from_id as asset_info_id. If a name is provided, create a new object in the DCC.

connect_objects(objects)

Link the given objects ftrack attribute to the self name object asset_link attribute in the DCC.

objects List of DCC objects

create(name)

Creates a new dcc_object with the given name.

static dictionary_from_object(object_name)

Static method to be used without initializing the current class. Returns a dictionary with the keys and values of the given object_name if exists.

object_name ftrack object type from the DCC.

from_asset_info_id(asset_info_id)

Checks the dcc to get all the ftrack objects. Compares them with the given asset_info_id and returns them if matches.

ftrack_plugin_id = None

Plugin id used on some DCC applications

get(k, default=None)

If exists, returns the value of the given k otherwise returns default.

k : Key of the current dictionary.

default : Default value of the given Key.

property name

Return name of the object

property objects_loaded

Returns the attribute objects_loaded of the current self name

setdefault(key, value=None)

Sets a default value for the given key.

update(*args, **kwargs)

Updates the current keys and values with the given ones.

property ftrack_object_manager

Initializes and returns an instance of FtrackObjectManager

property dcc_object

Returns the dcc_object from the FtrackObjectManager

property asset_info

Returns the asset_info from the FtrackObjectManager

property host_id

Returns the current host id.

property host_types

Return the current host type.

__init__(event_manager, host_types, host_id, asset_type_name)[source]

Initialise BaseEngine with an instance of EventManager, host_types, host_id and asset_type_name.

host_types : List of host types (e.g. python, maya, nuke).

host_id : Host id.

asset_type_name : If the engine is initialized to publish or load, the asset type should be specified.

run_event(plugin_name, plugin_type, host_type, data, options, context_data, method)[source]

Returns an ftrack_api.event.base.Event with the topic PIPELINE_RUN_PLUGIN_TOPIC carrying the given plugin_name, plugin_type, host_type, data, options, context_data and method.

plugin_name : Name of the plugin.

plugin_type : Type of plugin.

host_type : Host type.

data : data to pass to the plugin.

options : options to pass to the plugin

context_data : Result of the context plugin containing the context_id, asset_name… or None.

method : Method of the plugin to be executed.

run(data)[source]

Executes the _run_plugin() with the provided data. Returns the result of the mentioned method.

data : pipeline[‘data’] provided from the client host connection at run()

run_stage(stage_name, plugins, stage_context, stage_options, stage_data, plugins_order=None, step_type=None, step_name=None)[source]

Returns the bool status and the result list of dictionaries of executing all the plugins in the stage. This function executes all the defined plugins for this stage using the _run_plugin()

stage_name : Name of the stage that’s executing.

plugins : List of plugins to execute.

stage_context : Context dictionary with the result of the context plugin containing the context_id, asset_name… or None.

stage_options : Options dictionary to be passed to each plugin.

stage_data : Data list of dictionaries to be passed to each stage.

plugins_order : Order of the plugins to be executed.

step_type : Type of the step.

run_step(step_name, stages, step_context, step_options, step_data, stages_order, step_type)[source]

Returns the bool status and the result list of dictionaries of executing all the stages in the step. This function executes all the defined stages for this step using run_stage() with the given stages_order.

step_name : Name of the step that’s executing.

stages : List of stages to execute.

step_context : Context dictionary with the result of the context plugin containing the context_id, asset_name… or None.

step_options : Options dictionary to be passed to each stage.

step_data : Data list of dictionaries to be passed to each stage.

stages_order : Order of the stages to be executed.

step_type : Type of the step.

run_definition(data)[source]

Runs the whole definition from the provided data. Calls run_step() for each of the context, component and finalizer steps.

data : pipeline[‘data’] provided from the client host connection at run() Should be a valid definition.

ftrack_connect_pipeline.host.engine.asset_manager
class ftrack_connect_pipeline.host.engine.asset_manager.AssetManagerEngine(event_manager, host_types, host_id, asset_type_name=None)[source]

Bases: BaseEngine

Base Asset Manager Engine class.

engine_type = 'asset_manager'

Engine type for this engine class

__init__(event_manager, host_types, host_id, asset_type_name=None)[source]

Initialise AssetManagerEngine with an instance of EventManager, host_types, host_id and asset_type_name.

host_types : List of host types (e.g. python, maya, nuke).

host_id : Host id.

asset_type_name : Default None. If engine is initialized to publish or load, the asset type should be specified.

discover_assets(assets=None, options=None, plugin=None)[source]

Should be overridden by child

(Standalone mode, dev, testing) Discovers 10 random assets from ftrack with component name main. Returns status and ftrack_asset_info_list, which is a list of FtrackAssetInfo.

resolve_dependencies(context_id, options=None, plugin=None)[source]

Returns a list of the asset versions that task identified by context_id is depending upon, with additional options using the given plugin.

context_id : id of the task.

options : Options to resolver.

plugin : Plugin definition, a dictionary with the plugin information.

select_assets(assets, options=None, plugin=None)[source]

Returns status dictionary and results dictionary keyed by the id for executing the select_asset() for all the FtrackAssetInfo in the given assets list.

assets: List of FtrackAssetInfo

select_asset(asset_info, options=None, plugin=None)[source]

(Not implemented for python standalone mode) Returns the status and the result of selecting the given asset_info

asset_info : FtrackAssetInfo

update_assets(assets, options=None, plugin=None)[source]

Returns status dictionary and results dictionary keyed by the id for executing the update_asset() using the criteria of the given plugin for all the FtrackAssetInfo in the given assets list.

assets: List of FtrackAssetInfo

update_asset(asset_info, options=None, plugin=None)[source]

Returns the status and the result of updating the given asset_info using the criteria of the given plugin

asset_info : FtrackAssetInfo

options : Options to update the asset.

plugin : Plugin definition, a dictionary with the plugin information.

load_assets(assets, options=None, plugin=None)[source]

Returns status dictionary and results dictionary keyed by the id for executing the load_asset() for all the FtrackAssetInfo in the given assets list.

assets: List of FtrackAssetInfo

load_asset(asset_info, options=None, plugin=None)[source]

(Not implemented for python standalone mode) Returns the status and the result of loading the given asset_info.

asset_info : FtrackAssetInfo

change_version(asset_info, options, plugin=None)[source]

Returns the status and the result of changing the version of the given asset_info to the new version id passed in the given options

asset_info : FtrackAssetInfo

options : Options should contain the new_version_id key with the id value

plugin : Default None. Plugin definition, a dictionary with the plugin information.
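
A hedged sketch of requesting a version change; the host id and the new version id are illustrative placeholders, and the tuple unpacking follows the documented "status and result" description:

import ftrack_api

from ftrack_connect_pipeline import event
from ftrack_connect_pipeline.host.engine.asset_manager import AssetManagerEngine

session = ftrack_api.Session(auto_connect_event_hub=True)
event_manager = event.EventManager(session)

# host_id is a placeholder; it normally comes from the running Host.
am_engine = AssetManagerEngine(event_manager, ['python'], 'python-0123456789')

# Discover assets (standalone/dev behaviour documented above), then ask for
# a version change; the new version id is a placeholder.
status, asset_infos = am_engine.discover_assets()
if asset_infos:
    options = {'new_version_id': '01234567-89ab-cdef-0123-456789abcdef'}
    status, result = am_engine.change_version(asset_infos[0], options)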

unload_assets(assets, options=None, plugin=None)[source]

Returns status dictionary and results dictionary keyed by the id for executing the unload_asset() for all the FtrackAssetInfo in the given assets list.

assets: List of FtrackAssetInfo

unload_asset(asset_info, options=None, plugin=None)[source]

(Not implemented for python standalone mode) Returns the status and the result of unloading the given asset_info.

asset_info : FtrackAssetInfo

remove_assets(assets, options=None, plugin=None)[source]

Returns status dictionary and results dictionary keyed by the id for executing the remove_asset() for all the FtrackAssetInfo in the given assets list.

assets: List of FtrackAssetInfo

remove_asset(asset_info, options=None, plugin=None)[source]

(Not implemented for python standalone mode) Returns the status and the result of removing the given asset_info.

asset_info : FtrackAssetInfo

run(data)[source]

Override of the base engine run() method. Executes the method defined in the given data's method key or, if not given, executes _run_plugin() with the given data's plugin key. Returns the result of the executed method or plugin.

data : pipeline[‘data’] provided from the client host connection at run()

ftrack_connect_pipeline.host.engine.open
class ftrack_connect_pipeline.host.engine.open.OpenerEngine(event_manager, host_types, host_id, asset_type_name)[source]

Bases: BaseEngine

engine_type = 'opener'

Engine type for this engine class

__init__(event_manager, host_types, host_id, asset_type_name)[source]

Initialise OpenerEngine with an instance of EventManager, host_types, host_id and asset_type_name.

host_types : List of host types (e.g. python, maya, nuke).

host_id : Host id.

asset_type_name : If the engine is initialized to publish or load, the asset type should be specified.

ftrack_connect_pipeline.host.engine.load
class ftrack_connect_pipeline.host.engine.load.LoaderEngine(event_manager, host_types, host_id, asset_type_name)[source]

Bases: BaseEngine

engine_type = 'loader'

Engine type for this engine class

__init__(event_manager, host_types, host_id, asset_type_name)[source]

Initialise LoaderEngine with an instance of EventManager, host_types, host_id and asset_type_name.

host_types : List of host types (e.g. python, maya, nuke).

host_id : Host id.

asset_type_name : If the engine is initialized to publish or load, the asset type should be specified.

ftrack_connect_pipeline.host.engine.publish
class ftrack_connect_pipeline.host.engine.publish.PublisherEngine(event_manager, host_types, host_id, asset_type_name)[source]

Bases: BaseEngine

engine_type = 'publisher'

Engine type for this engine class

__init__(event_manager, host_types, host_id, asset_type_name)[source]

Initialise PublisherEngine with an instance of EventManager, host_types, host_id and asset_type_name.

host_types : List of host types (e.g. python, maya, nuke).

host_id : Host id.

asset_type_name : If the engine is initialized to publish or load, the asset type should be specified.

ftrack_connect_pipeline.host.validation
ftrack_connect_pipeline.host.validation.get_schema(definition_type, schemas)[source]

Returns the schema in the given schemas for the given definition_type

definition_type : Type of the definition. (asset_manager, publisher…)

schemas : List of schemas.

ftrack_connect_pipeline.host.validation.validate_schema(schemas, definition)[source]

Validates the schema of the given definition from the given schemas using the _validate_jsonschema function, which relies on jsonschema.validate.

schemas : List of schemas.

definition : Definition to be validated against the schema.
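
A hedged sketch of validating a publisher definition before use; schemas and publisher_definition are assumed to come from the collection step in ftrack_connect_pipeline.definition:

from ftrack_connect_pipeline.host import validation

# `schemas` and `publisher_definition` are assumed to exist already.
publisher_schema = validation.get_schema('publisher', schemas)

# Expected to raise if the definition does not comply with its schema.
validation.validate_schema(schemas, publisher_definition)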

class ftrack_connect_pipeline.host.validation.PluginDiscoverValidation(session, host_types)[source]

Bases: object

Plugin discover validation base class

__init__(session, host_types)[source]

Initialise PluginDiscoverValidation with instance of ftrack_api.session.Session and host_types.

host_types : List of compatible host types. (maya, python, nuke….)

validate_plugins(definitions, schema_type)[source]

Validates all the definitions in the given definitions, calling validate_context_plugins(), validate_components_plugins() and vaildate_finalizers_plugins().

Returns the invalid definition indices.

definitions : List of definitions (opener, loader, publisher and so on).

vaildate_definition_plugins(steps, definition_name, schema_type)[source]

Validates the plugins in the given steps by running _discover_plugin().

steps : List of dictionaries with steps, stages and plugins.

definition_name : Name of the current definition.

schema_type : Schema type of the current definition.

ftrack_connect_pipeline.log

class ftrack_connect_pipeline.log.ResultEncoder(*, skipkeys=False, ensure_ascii=True, check_circular=True, allow_nan=True, sort_keys=False, indent=None, separators=None, default=None)[source]

Bases: JSONEncoder

JSON encoder for handling non serializable objects in plugin result

default(obj)[source]

Implement this method in a subclass such that it returns a serializable object for o, or calls the base implementation (to raise a TypeError).

For example, to support arbitrary iterators, you could implement default like this:

def default(self, o):
    try:
        iterable = iter(o)
    except TypeError:
        pass
    else:
        return list(iterable)
    # Let the base class default method raise the TypeError
    return JSONEncoder.default(self, o)
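
A short sketch of serialising a plugin result that may contain objects the standard JSON encoder cannot handle, using the ResultEncoder documented above (the result dictionary is a placeholder):

import json

from ftrack_connect_pipeline.log import ResultEncoder

plugin_result = {
    'plugin_name': 'example_collector',  # placeholder values
    'status': 'SUCCESS_STATUS',
    'result': ['/tmp/example_scene.ma'],
}
print(json.dumps(plugin_result, cls=ResultEncoder))
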
class ftrack_connect_pipeline.log.LogDB(host_id, db_name=None, table_name=None)[source]

Bases: object

Log database class

database_expire_grace_s = 604800

__init__(host_id, db_name=None, table_name=None)[source]

Initializes a new persistent local log database with the database name db_name on disk and the table name table_name.

db_name = 'pipeline-{}.db'

table_name = 'LOGMGR'

property connection

get_database_path(host_id)[source]

Get local persistent pipeline database path.

Will create the directory (recursively) if it does not exist.

Raise if the directory can not be created.

add_log_item(log_item)[source]

Stores a LogItem in persistent log database.

get_log_items(host_id)[source]

Returns the LogItem entries stored in the persistent log database for the given host_id.

get_log_items_by_plugin_id(host_id, plugin_id)[source]

Returns the LogItem entries stored in the persistent log database for the given host_id and plugin_id.
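
A hedged sketch of opening the persistent log database for a host and listing the stored entries (the host id is an illustrative placeholder):

from ftrack_connect_pipeline.log import LogDB

log_db = LogDB(host_id='python-0123456789')
for log_item in log_db.get_log_items('python-0123456789'):
    print(log_item.execution_time)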

ftrack_connect_pipeline.log.log_item
class ftrack_connect_pipeline.log.log_item.LogItem(log_result)[source]

Bases: object

Represents a Logging Item Base Class

__init__(log_result)[source]

Initialise LogItem with log_result

log_result: Dictionary with log information.

property execution_time

Return the duration of the log entry.

ftrack_connect_pipeline.plugin

class ftrack_connect_pipeline.plugin.BasePluginValidation(plugin_name, required_output, return_type, return_value)[source]

Bases: object

Plugin Validation base class

__init__(plugin_name, required_output, return_type, return_value)[source]

Initialise PluginValidation with plugin_name, required_output, return_type, return_value.

plugin_name : current plugin name.

required_output : required exporters of the current plugin.

return_type : required return type of the current plugin.

return_value : Expected return value of the current plugin.

validate_required_output(result)[source]

Ensures that result contains all the expected required_output keys defined for the current plugin.

result : exporters value of the plugin execution.

Return tuple (bool,str)

validate_result_type(result)[source]

Ensures that result is instance of the defined return_type of the current plugin.

result : exporters value of the plugin execution.

Return tuple (bool,str)

validate_result_value(result)[source]

Ensures that result is equal to the defined return_value of the current plugin.

result : exporters value of the plugin execution.

Return tuple (bool,str)

validate_user_data(user_data)[source]

Ensures that user_data is an instance of dict and that it contains the message and data keys.

Return tuple (bool,str)

class ftrack_connect_pipeline.plugin.BasePlugin(session)[source]

Bases: object

Base Class to represent a Plugin

plugin_type = None

Type of the plugin

plugin_name = None

Name of the plugin

type = 'base'

Type of the plugin, default base (action, collector…).

category = 'plugin'

Category of the plugin (plugin, plugin.widget…)

host_type = 'python'

Host type of the plugin

return_type = None

Required return type

return_value = None

Required return Value

class FtrackObjectManager(event_manager)

Bases: object

FtrackObjectManager class. Maintains the synchronization between asset_info and the ftrack information of the objects in the scene.

class DccObject(name=None, from_id=None, **kwargs)

Bases: dict

Base DccObject class.

__init__(name=None, from_id=None, **kwargs)

If from_id is provided, find an object in the DCC with the given from_id as asset_info_id. If a name is provided, create a new object in the DCC.

connect_objects(objects)

Link the given objects ftrack attribute to the self name object asset_link attribute in the DCC.

objects List of DCC objects

create(name)

Creates a new dcc_object with the given name.

static dictionary_from_object(object_name)

Static method to be used without initializing the current class. Returns a dictionary with the keys and values of the given object_name if exists.

object_name ftrack object type from the DCC.

from_asset_info_id(asset_info_id)

Checks the dcc to get all the ftrack objects. Compares them with the given asset_info_id and returns them if matches.

ftrack_plugin_id = None

Plugin id used on some DCC applications

get(k, default=None)

If exists, returns the value of the given k otherwise returns default.

k : Key of the current dictionary.

default : Default value of the given Key.

property name

Return name of the object

property objects_loaded

Returns the attribute objects_loaded of the current self name

setdefault(key, value=None)

Sets a default value for the given key.

update(*args, **kwargs)

Updates the current keys and values with the given ones.

__init__(event_manager)

Initialize FtrackObjectManager with instance of EventManager

property asset_info

Returns instance of FtrackAssetInfo

connect_objects(objects)

Link the given objects ftrack attribute to the self dcc_object.

objects List of objects

create_new_dcc_object()

Creates a new dcc_object with a unique name.

property dcc_object

Returns instance of DccObject

property event_manager

Returns instance of EventManager

property is_sync

Returns whether the dcc_object is in sync with the asset_info.

property objects_loaded

Returns whether the objects are loaded in the scene or not.

property session

Returns instance of ftrack_api.session.Session

class DccObject(name=None, from_id=None, **kwargs)

Bases: dict

Base DccObject class.

__init__(name=None, from_id=None, **kwargs)

If from_id is provided, find an object in the DCC with the given from_id as asset_info_id. If a name is provided, create a new object in the DCC.

connect_objects(objects)

Link the given objects ftrack attribute to the self name object asset_link attribute in the DCC.

objects List of DCC objects

create(name)

Creates a new dcc_object with the given name.

static dictionary_from_object(object_name)

Static method to be used without initializing the current class. Returns a dictionary with the keys and values of the given object_name if exists.

object_name ftrack object type from the DCC.

from_asset_info_id(asset_info_id)

Checks the dcc to get all the ftrack objects. Compares them with the given asset_info_id and returns them if matches.

ftrack_plugin_id = None

Plugin id used on some DCC applications

get(k, default=None)

If exists, returns the value of the given k otherwise returns default.

k : Key of the current dictionary.

default : Default value of the given Key.

property name

Return name of the object

property objects_loaded

Returns the attribute objects_loaded of the current self name

setdefault(key, value=None)

Sets a default value for the given key.

update(*args, **kwargs)

Updates the current keys and values with the given ones.

property ftrack_object_manager

Initializes and returns an instance of FtrackObjectManager

property dcc_object

Returns the dcc_object from the FtrackObjectManager

property asset_info

Returns the asset_info from the FtrackObjectManager

property output

Returns a copy of required_output

property discover_topic

Return a formatted PIPELINE_DISCOVER_PLUGIN_TOPIC

property run_topic

Return a formatted PIPELINE_RUN_PLUGIN_TOPIC

property session

Returns instance of ftrack_api.session.Session

property event_manager

Returns instance of EventManager

property raw_data

Returns the current raw data.

property plugin_settings

Returns the current plugin_settings

property method

Returns the current method

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

plugin_id = None

Id of the plugin

register()[source]

Register function of the plugin to register itself.

Note

This function subscribes the plugin to two ftrack_api.event.base.Event topics:

PIPELINE_DISCOVER_PLUGIN_TOPIC: Topic to make the plugin discoverable for the host.

PIPELINE_RUN_PLUGIN_TOPIC: Topic to execute the plugin

run(context_data=None, data=None, options=None)[source]

Runs the current plugin with context_data, data and options.

context_data provides a mapping with the asset_name, context_id, asset_type_name, comment and status_id of the asset that we are working on.

data a list of data coming from previous collector or empty list

options a dictionary of options passed from outside.

Note

Always use self.exporters as a base to return the values; don't override self.exporters as it contains the _required_output

fetch(context_data=None, data=None, options=None)[source]

Runs the current plugin with context_data, data and options.

context_data provides a mapping with the asset_name, context_id, asset_type_name, comment and status_id of the asset that we are working on.

data a list of data coming from previous collector or empty list

options a dictionary of options passed from outside.

Note

This function is meant to be run as an alternative to the default run function, usually to fetch information for the widget or to test the plugin.

ftrack_connect_pipeline.plugin.asset_manager
ftrack_connect_pipeline.plugin.asset_manager.action
class ftrack_connect_pipeline.plugin.asset_manager.action.AssetManagerActionPlugin(session)[source]

Bases: BaseActionPlugin

Class representing an Asset Manager Action Plugin. Inherits from BaseActionPlugin.

return_type

Type of object that should be returned

alias of list

plugin_type = 'asset_manager.action'

Plugin type of the current plugin

__init__(session)[source]

Initialise AssetManagerActionPlugin with instance of ftrack_api.session.Session

ftrack_connect_pipeline.plugin.asset_manager.discover
class ftrack_connect_pipeline.plugin.asset_manager.discover.AssetManagerDiscoverPlugin(session)[source]

Bases: BaseDiscoverPlugin

Class representing an Asset Manager Discover Plugin. Inherits from BaseDiscoverPlugin.

return_type

Type of object that should be returned

alias of list

plugin_type = 'asset_manager.discover'

Plugin type of the current plugin

__init__(session)[source]

Initialise AssetManagerDiscoverPlugin with instance of ftrack_api.session.Session

ftrack_connect_pipeline.plugin.asset_manager.resolve
class ftrack_connect_pipeline.plugin.asset_manager.resolve.AssetManagerResolvePlugin(session)[source]

Bases: BaseActionPlugin

Class representing an Asset Manager Resolve Plugin. Inherits from BaseActionPlugin.

return_type

Type of object that should be returned

alias of dict

plugin_type = 'asset_manager.resolver'

Plugin type of the current plugin

__init__(session)[source]

Initialise AssetManagerResolvePlugin with instance of ftrack_api.session.Session

ftrack_connect_pipeline.plugin.base
ftrack_connect_pipeline.plugin.base.action
class ftrack_connect_pipeline.plugin.base.action.BaseActionPluginValidation(plugin_name, required_output, return_type, return_value)[source]

Bases: BasePluginValidation

Action Plugin Validation class inherits from BasePluginValidation

__init__(plugin_name, required_output, return_type, return_value)[source]

Initialise PluginValidation with plugin_name, required_output, return_type, return_value.

plugin_name : current plugin name.

required_output : required exporters of the current plugin.

return_type : required return type of the current plugin.

return_value : Expected return value of the current plugin.

validate_required_output(result)[source]

Ensures that result contains all the expected required_output values defined for the current plugin.

result : exporters value of the plugin execution.

Return tuple (bool,str)

class ftrack_connect_pipeline.plugin.base.action.BaseActionPlugin(session)[source]

Bases: BasePlugin

Base Action Plugin Class inherits from BasePlugin

return_type

Required return type

alias of list

plugin_type = 'action'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

run(context_data=None, data=None, options=None)[source]

Runs the current plugin with context_data, data and options.

context_data provides a mapping with the asset_name, context_id, asset_type_name, comment and status_id of the asset that we are working on.

data a list of data coming from previous collector or empty list

options a dictionary of options passed from outside.

Note

Always use self.exporters as a base to return the values; don't override self.exporters as it contains the _required_output

ftrack_connect_pipeline.plugin.base.collector
class ftrack_connect_pipeline.plugin.base.collector.BaseCollectorPluginValidation(plugin_name, required_output, return_type, return_value)[source]

Bases: BasePluginValidation

Collector Plugin Validation class inherits from BasePluginValidation

__init__(plugin_name, required_output, return_type, return_value)[source]

Initialise PluginValidation with plugin_name, required_output, return_type, return_value.

plugin_name : current plugin name.

required_output : required exporters of the current plugin.

return_type : required return type of the current plugin.

return_value : Expected return value of the current plugin.

validate_required_output(result)[source]

Ensures that result contains all the expected required_output values defined for the current plugin.

result : exporters value of the plugin execution.

Return tuple (bool,str)

class ftrack_connect_pipeline.plugin.base.collector.BaseCollectorPlugin(session)[source]

Bases: BasePlugin

Base Collector Plugin Class inherits from BasePlugin

return_type

Required return type

alias of list

plugin_type = 'collector'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

run(context_data=None, data=None, options=None)[source]

Runs the current plugin with context_data, data and options.

context_data provides a mapping with the asset_name, context_id, asset_type_name, comment and status_id of the asset that we are working on.

data a list of data coming from previous collector or empty list

options a dictionary of options passed from outside.

Note

Always use self.exporters as a base to return the values; don't override self.exporters as it contains the _required_output
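
An illustrative sketch (not a shipped plugin) of a standalone collector built on these classes; the plugin name, path option and register hook are placeholders, while real plugins live in the ftrack-connect-pipeline-definition repository:

import ftrack_api

from ftrack_connect_pipeline.plugin.base.collector import BaseCollectorPlugin


class ExampleCollectorPlugin(BaseCollectorPlugin):
    plugin_name = 'example_collector'

    def run(self, context_data=None, data=None, options=None):
        options = options or {}
        # Return a list, matching the documented collector return_type;
        # the path is a placeholder for whatever the options provide.
        return [options.get('path', '/tmp/example_scene.ma')]


def register(api_object, **kw):
    # Registration hook discovered by the ftrack Python API plugin
    # mechanism; the isinstance guard mirrors the common pattern.
    if not isinstance(api_object, ftrack_api.Session):
        return
    ExampleCollectorPlugin(api_object).register()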

ftrack_connect_pipeline.plugin.base.context
class ftrack_connect_pipeline.plugin.base.context.BaseContextPluginValidation(plugin_name, required_output, return_type, return_value)[source]

Bases: BasePluginValidation

Context Plugin Validation class inherits from BasePluginValidation

__init__(plugin_name, required_output, return_type, return_value)[source]

Initialise PluginValidation with plugin_name, required_output, return_type, return_value.

plugin_name : current plugin name.

required_output : required exporters of the current plugin.

return_type : required return type of the current plugin.

return_value : Expected return value of the current plugin.

class ftrack_connect_pipeline.plugin.base.context.BaseContextPlugin(session)[source]

Bases: BasePlugin

Base Context Plugin Class inherits from BasePlugin

return_type

Required return type

alias of dict

plugin_type = 'context'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

run(context_data=None, data=None, options=None)[source]

Runs the current plugin with context_data, data and options.

context_data provides a mapping with the asset_name, context_id, asset_type_name, comment and status_id of the asset that we are working on.

data a list of data coming from previous collector or empty list

options a dictionary of options passed from outside.

Note

Always use self.exporters as a base to return the values; don't override self.exporters as it contains the _required_output

ftrack_connect_pipeline.plugin.base.discover
class ftrack_connect_pipeline.plugin.base.discover.BaseDiscoverPluginValidation(plugin_name, required_output, return_type, return_value)[source]

Bases: BasePluginValidation

Discover Plugin Validation class inherits from BasePluginValidation

__init__(plugin_name, required_output, return_type, return_value)[source]

Initialise PluginValidation with plugin_name, required_output, return_type, return_value.

plugin_name : current plugin name.

required_output : required exporters of the current plugin.

return_type : required return type of the current plugin.

return_value : Expected return value of the current plugin.

validate_required_output(result)[source]

Ensures that result contains all the expected required_output values defined for the current plugin.

result : exporters value of the plugin execution.

Return tuple (bool,str)

class ftrack_connect_pipeline.plugin.base.discover.BaseDiscoverPlugin(session)[source]

Bases: BasePlugin

Base Discover Plugin Class inherits from BasePlugin

return_type

Required return type

alias of list

plugin_type = 'discover'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

run(context_data=None, data=None, options=None)[source]

Runs the current plugin with context_data, data and options.

context_data provides a mapping with the asset_name, context_id, asset_type_name, comment and status_id of the asset that we are working on.

data a list of data coming from previous collector or empty list

options a dictionary of options passed from outside.

Note

Always use self.exporters as a base to return the values; don't override self.exporters as it contains the _required_output

ftrack_connect_pipeline.plugin.base.finalizer
class ftrack_connect_pipeline.plugin.base.finalizer.BaseFinalizerPluginValidation(plugin_name, required_output, return_type, return_value)[source]

Bases: BasePluginValidation

Finalizer Plugin Validation class inherits from BasePluginValidation

__init__(plugin_name, required_output, return_type, return_value)[source]

Initialise PluginValidation with plugin_name, required_output, return_type, return_value.

plugin_name : current plugin name.

required_output : required exporters of the current plugin.

return_type : required return type of the current plugin.

return_value : Expected return value of the current plugin.

class ftrack_connect_pipeline.plugin.base.finalizer.BaseFinalizerPlugin(session)[source]

Bases: BasePlugin

Base Finalizer Plugin Class inherits from BasePlugin

return_type

Required return type

alias of dict

plugin_type = 'finalizer'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

run(context_data=None, data=None, options=None)[source]

Runs the current plugin with context_data, data and options.

context_data provides a mapping with the asset_name, context_id, asset_type_name, comment and status_id of the asset that we are working on.

data a list of data coming from previous collector or empty list

options a dictionary of options passed from outside.

Note

Always use self.exporters as a base to return the values; don't override self.exporters as it contains the _required_output

ftrack_connect_pipeline.plugin.base.importer
class ftrack_connect_pipeline.plugin.base.importer.BaseImporterPluginValidation(plugin_name, required_output, return_type, return_value)[source]

Bases: BasePluginValidation

Importer Plugin Validation class inherits from BasePluginValidation

__init__(plugin_name, required_output, return_type, return_value)[source]

Initialise PluginValidation with plugin_name, required_output, return_type, return_value.

plugin_name : current plugin name.

required_output : required exporters of the current plugin.

return_type : required return type of the current plugin.

return_value : Expected return value of the current plugin.

class ftrack_connect_pipeline.plugin.base.importer.BaseImporterPlugin(session)[source]

Bases: BasePlugin

Base Importer Plugin Class inherits from BasePlugin

return_type

Required return type

alias of dict

plugin_type = 'importer'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

run(context_data=None, data=None, options=None)[source]

Runs the current plugin with context_data, data and options.

context_data provides a mapping with the asset_name, context_id, asset_type_name, comment and status_id of the asset that we are working on.

data a list of data coming from previous collector or empty list

options a dictionary of options passed from outside.

Note

Always use self.exporters as a base to return the values; don't override self.exporters as it contains the _required_output

ftrack_connect_pipeline.plugin.base.exporter
class ftrack_connect_pipeline.plugin.base.exporter.BaseExporterPluginValidation(plugin_name, required_output, return_type, return_value)[source]

Bases: BasePluginValidation

Exporter Plugin Validation class inherits from BasePluginValidation

__init__(plugin_name, required_output, return_type, return_value)[source]

Initialise PluginValidation with plugin_name, required_output, return_type, return_value.

plugin_name : current plugin name.

required_output : required exporters of the current plugin.

return_type : required return type of the current plugin.

return_value : Expected return value of the current plugin.

validate_required_output(result)[source]

Ensures that result contains all the expected required_output keys defined for the current plugin.

result : exporters value of the plugin execution.

Return tuple (bool,str)

class ftrack_connect_pipeline.plugin.base.exporter.BaseExporterPlugin(session)[source]

Bases: BasePlugin

Base Exporter Plugin Class inherits from BasePlugin

return_type

Required return type

alias of list

plugin_type = 'exporter'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

run(context_data=None, data=None, options=None)[source]

Runs the current plugin with context_data, data and options.

context_data provides a mapping with the asset_name, context_id, asset_type_name, comment and status_id of the asset that we are working on.

data a list of data coming from previous collector or empty list

options a dictionary of options passed from outside.

Note

Always use self.exporters as a base to return the values; don't override self.exporters as it contains the _required_output

ftrack_connect_pipeline.plugin.base.post_importer
class ftrack_connect_pipeline.plugin.base.post_importer.BasePostImporterPluginValidation(plugin_name, required_output, return_type, return_value)[source]

Bases: BasePluginValidation

Post Import Plugin Validation class inherits from BasePluginValidation

__init__(plugin_name, required_output, return_type, return_value)[source]

Initialise PluginValidation with plugin_name, required_output, return_type, return_value.

plugin_name : current plugin name.

required_output : required exporters of the current plugin.

return_type : required return type of the current plugin.

return_value : Expected return value of the current plugin.

class ftrack_connect_pipeline.plugin.base.post_importer.BasePostImporterPlugin(session)[source]

Bases: BasePlugin

Base Post Import Plugin Class inherits from BasePlugin

return_type

Required return type

alias of dict

plugin_type = 'post_importer'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

run(context_data=None, data=None, options=None)[source]

Runs the current plugin with context_data, data and options.

context_data provides a mapping with the asset_name, context_id, asset_type_name, comment and status_id of the asset that we are working on.

data a list of data coming from previous collector or empty list

options a dictionary of options passed from outside.

Note

Always use self.exporters as a base to return the values; don't override self.exporters as it contains the _required_output

ftrack_connect_pipeline.plugin.base.validator
class ftrack_connect_pipeline.plugin.base.validator.BaseValidatorPluginValidation(plugin_name, required_output, return_type, return_value)[source]

Bases: BasePluginValidation

Validator Plugin Validation class inherits from BasePluginValidation

__init__(plugin_name, required_output, return_type, return_value)[source]

Initialise PluginValidation with plugin_name, required_output, return_type, return_value.

plugin_name : current plugin name.

required_output : required exporters of the current plugin.

return_type : required return type of the current plugin.

return_value : Expected return value of the current plugin.

validate_required_output(result)[source]

Ensures that result contains all the expected required_output values defined for the current plugin.

result : exporters value of the plugin execution.

Return tuple (bool,str)

validate_result_value(result)[source]

Ensures that result is True.

result : exporters value of the plugin execution.

Return tuple (bool,str)

class ftrack_connect_pipeline.plugin.base.validator.BaseValidatorPlugin(session)[source]

Bases: BasePlugin

Base Validator Plugin Class inherits from BasePlugin

return_type

Required return type

alias of bool

plugin_type = 'validator'

Type of the plugin

return_value = True

Required return Value

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

run(context_data=None, data=None, options=None)[source]

Runs the current plugin with context_data, data and options.

context_data provides a mapping with the asset_name, context_id, asset_type_name, comment and status_id of the asset that we are working on.

data a list of data coming from previous collector or empty list

options a dictionary of options passed from outside.

Note

Always use self.exporters as a base to return the values; don't override self.exporters as it contains the _required_output
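
A corresponding illustrative validator sketch: the documented return_type is bool and the required return_value is True, so run() evaluates the collected data and returns a boolean (names below are placeholders):

from ftrack_connect_pipeline.plugin.base.validator import BaseValidatorPlugin


class ExampleNonEmptyValidatorPlugin(BaseValidatorPlugin):
    plugin_name = 'example_non_empty_validator'

    def run(self, context_data=None, data=None, options=None):
        # Valid only if the previous collector produced at least one item;
        # the required return_value for validators is True.
        return bool(data)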

ftrack_connect_pipeline.plugin.open
ftrack_connect_pipeline.plugin.open.collector
class ftrack_connect_pipeline.plugin.open.collector.OpenerCollectorPlugin(session)[source]

Bases: BaseCollectorPlugin

Base Opener Collector Plugin Class inherits from BaseCollectorPlugin

return_type

Required return type

alias of list

plugin_type = 'opener.collector'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

ftrack_connect_pipeline.plugin.open.context
class ftrack_connect_pipeline.plugin.open.context.OpenerContextPlugin(session)[source]

Bases: BaseContextPlugin

Base Opener Context Plugin Class inherits from BaseContextPlugin

return_type

Required return type

alias of dict

plugin_type = 'opener.context'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

ftrack_connect_pipeline.plugin.open.finalizer
class ftrack_connect_pipeline.plugin.open.finalizer.OpenerFinalizerPlugin(session)[source]

Bases: BaseFinalizerPlugin

Base Opener Finalizer Plugin Class inherits from BaseFinalizerPlugin

return_type

Required return type

alias of dict

plugin_type = 'opener.finalizer'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

ftrack_connect_pipeline.plugin.open.importer
class ftrack_connect_pipeline.plugin.open.importer.OpenerImporterPlugin(session)[source]

Bases: BaseImporterPlugin

Base Opener Importer Plugin Class inherits from BaseImporterPlugin

return_type

Required return type

alias of dict

plugin_type = 'opener.importer'

Type of the plugin

open_modes = {}

Available open modes for an asset

dependency_open_mode = ''

Default dependency open mode.

json_data = {}

Extra json data with the current open options

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

get_current_objects()[source]

init_nodes(context_data=None, data=None, options=None)[source]

Alternative plugin method to init all the nodes in the scene without the need to open the assets.

open_asset(context_data=None, data=None, options=None)[source]

Alternative plugin method to only open the asset in the scene

init_and_open(context_data=None, data=None, options=None)[source]

Alternative plugin method to init the nodes and open the assets into the scene.

ftrack_connect_pipeline.plugin.open.post_importer
class ftrack_connect_pipeline.plugin.open.post_importer.OpenerPostImporterPlugin(session)[source]

Bases: BasePostImporterPlugin

Base Opener Post Import Plugin Class inherits from BasePostImporterPlugin

return_type

Required return type

alias of dict

plugin_type = 'opener.post_importer'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

ftrack_connect_pipeline.plugin.load
ftrack_connect_pipeline.plugin.load.collector
class ftrack_connect_pipeline.plugin.load.collector.LoaderCollectorPlugin(session)[source]

Bases: BaseCollectorPlugin

Base Loader Collector Plugin Class inherits from BaseCollectorPlugin

return_type

Required return type

alias of list

plugin_type = 'loader.collector'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

ftrack_connect_pipeline.plugin.load.context
class ftrack_connect_pipeline.plugin.load.context.LoaderContextPlugin(session)[source]

Bases: BaseContextPlugin

Base Loader Context Plugin Class inherits from BaseContextPlugin

return_type

Required return type

alias of dict

plugin_type = 'loader.context'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

ftrack_connect_pipeline.plugin.load.finalizer
class ftrack_connect_pipeline.plugin.load.finalizer.LoaderFinalizerPlugin(session)[source]

Bases: BaseFinalizerPlugin

Base Loader Finalizer Plugin Class inherits from BaseFinalizerPlugin

return_type

Required return type

alias of dict

plugin_type = 'loader.finalizer'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

ftrack_connect_pipeline.plugin.load.importer
class ftrack_connect_pipeline.plugin.load.importer.LoaderImporterPlugin(session)[source]

Bases: BaseImporterPlugin

Base Loader Importer Plugin Class inherits from BaseImporterPlugin

return_type

Required return type

alias of dict

plugin_type = 'loader.importer'

Type of the plugin

load_modes = {}

Available load modes for an asset

dependency_load_mode = ''

Default dependency load mode.

json_data = {}

Extra json data with the current load options

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

get_current_objects()[source]

init_nodes(context_data=None, data=None, options=None)[source]

Alternative plugin method to init all the nodes in the scene without the need to load the assets.

load_asset(context_data=None, data=None, options=None)[source]

Alternative plugin method to only load the asset in the scene

init_and_load(context_data=None, data=None, options=None)[source]

Alternative plugin method to init the nodes and load the assets into the scene.

ftrack_connect_pipeline.plugin.load.post_importer
class ftrack_connect_pipeline.plugin.load.post_importer.LoaderPostImporterPlugin(session)[source]

Bases: BasePostImporterPlugin

Base Loader Post Import Plugin Class inherits from BasePostImporterPlugin

return_type

Required return type

alias of dict

plugin_type = 'loader.post_importer'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

ftrack_connect_pipeline.plugin.publish
ftrack_connect_pipeline.plugin.publish.collector
class ftrack_connect_pipeline.plugin.publish.collector.PublisherCollectorPlugin(session)[source]

Bases: BaseCollectorPlugin

Base Publisher Collector Plugin Class inherits from BaseCollectorPlugin

return_type

Required return type

alias of list

plugin_type = 'publisher.collector'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

ftrack_connect_pipeline.plugin.publish.context
class ftrack_connect_pipeline.plugin.publish.context.PublisherContextPlugin(session)[source]

Bases: BaseContextPlugin

Base Publisher Context Plugin Class inherits from BaseContextPlugin

return_type

Required return type

alias of dict

plugin_type = 'publisher.context'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

ftrack_connect_pipeline.plugin.publish.finalizer
class ftrack_connect_pipeline.plugin.publish.finalizer.PublisherFinalizerPlugin(session)[source]

Bases: BaseFinalizerPlugin

Base Publisher Finalizer Plugin Class inherits from BaseFinalizerPlugin

return_type

Required return type

alias of dict

plugin_type = 'publisher.finalizer'

Type of the plugin

version_dependencies = []

Ftrack dependencies of the current asset version

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

create_component(asset_version_entity, component_name, component_path)[source]

Creates an ftrack component on the given asset_version_entity with the given component_name pointing to the given component_path

asset_version_entity : instance of ftrack_api.entity.asset_version.AssetVersion

component_name : Name of the component to be created.

component_path : Linked path of the component data.

create_thumbnail(asset_version_entity, component_name, component_path)[source]

Creates and uploads an ftrack thumbnail for the given ftrack_api.entity.asset_version.AssetVersion from the given component_path

component_path : path to the thumbnail.

create_reviewable(asset_version_entity, component_name, component_path)[source]

Encodes the ftrack media for the given ftrack_api.entity.asset_version.AssetVersion from the given component_path

component_path : path to the image or video.
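
A hedged sketch of how a custom finalizer could use these helpers; the way the AssetVersion is obtained (a query on a placeholder 'version_id' option) and the file paths are assumptions for illustration only:

from ftrack_connect_pipeline.plugin.publish.finalizer import (
    PublisherFinalizerPlugin,
)


class ExamplePublisherFinalizerPlugin(PublisherFinalizerPlugin):
    plugin_name = 'example_finalizer'

    def run(self, context_data=None, data=None, options=None):
        options = options or {}
        # How the AssetVersion is obtained here is an assumption; the
        # 'version_id' option is a placeholder, not a framework key.
        asset_version_entity = self.session.query(
            'AssetVersion where id is "{}"'.format(options['version_id'])
        ).one()

        self.create_component(
            asset_version_entity, 'main', '/tmp/exported_geometry.abc'
        )
        self.create_thumbnail(
            asset_version_entity, 'thumbnail', '/tmp/preview.jpg'
        )
        self.create_reviewable(
            asset_version_entity, 'reviewable', '/tmp/preview.mov'
        )
        return {}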

ftrack_connect_pipeline.plugin.publish.exporter
class ftrack_connect_pipeline.plugin.publish.exporter.PublisherExporterPlugin(session)[source]

Bases: BaseExporterPlugin

Base Publisher Exporter Plugin Class inherits from BaseExporterPlugin

return_type

Required return type

alias of list

plugin_type = 'publisher.exporter'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

ftrack_connect_pipeline.plugin.publish.validator
class ftrack_connect_pipeline.plugin.publish.validator.PublisherValidatorPlugin(session)[source]

Bases: BaseValidatorPlugin

Base Publisher Validator Plugin Class inherits from BaseValidatorPlugin

return_type

Required return type

alias of bool

plugin_type = 'publisher.validator'

Type of the plugin

__init__(session)[source]

Initialise BasePlugin with instance of ftrack_api.session.Session

ftrack_connect_pipeline.configure_logging

ftrack_connect_pipeline.configure_logging.get_log_directory()[source]

Get log directory.

Will create the directory (recursively) if it does not exist.

Raise if the directory can not be created.

ftrack_connect_pipeline.configure_logging.configure_logging(logger_name, level=None, format=None, extra_modules=None, extra_handlers=None, propagate=True)[source]

Configure logger_name loggers with console and file handlers.

Optionally specify log level (default WARNING)

Optionally set format, default: %(asctime)s - %(name)s - %(levelname)s - %(message)s.

Optional extra_modules to extend the modules to be set to level.
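
A short sketch of configuring logging for a custom module following the documented signature (logger and module names are placeholders):

import logging

from ftrack_connect_pipeline.configure_logging import configure_logging

configure_logging(
    'my_studio_pipeline',              # logger name (placeholder)
    level=logging.DEBUG,               # default is WARNING
    extra_modules=['my_studio_tools'],
)
logging.getLogger('my_studio_pipeline').debug('Logging configured.')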

ftrack_connect_pipeline.event

class ftrack_connect_pipeline.event.EventManager(session, mode=0)[source]

Bases: object

Manages the events handling.

property id

property session

property connected

property mode

__init__(session, mode=0)[source]

publish(event, callback=None, mode=None)[source]

Emit event and provide callback function.

subscribe(topic, callback)[source]
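
A hedged sketch of publishing a custom event and reacting to it through the EventManager, assuming subscribe() takes the topic name as documented above; the topic and payload are placeholders, not framework topics:

import ftrack_api
from ftrack_api.event.base import Event

from ftrack_connect_pipeline.event import EventManager

session = ftrack_api.Session(auto_connect_event_hub=True)
event_manager = EventManager(session)

event_manager.subscribe(
    'my-studio.pipeline.ping',
    lambda event: print('received:', event['data']),
)
event_manager.publish(
    Event(topic='my-studio.pipeline.ping', data={'message': 'hello'}),
    callback=lambda reply: print('reply:', reply),
)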

ftrack_connect_pipeline.exception

exception ftrack_connect_pipeline.exception.PipelineError[source]

Bases: Exception

Base pipeline error.

exception ftrack_connect_pipeline.exception.PluginError[source]

Bases: PipelineError

Exception raised in case of plugin error

exception ftrack_connect_pipeline.exception.ValidatorPluginError[source]

Bases: PluginError

Exception raised in case of validator plugin error

ftrack_connect_pipeline.utils

ftrack_connect_pipeline.utils.str_context(context, with_id=False, force_version_nr=None, delimiter='/')[source]

Utility function to produce a human readable string out of a context.

ftrack_connect_pipeline.utils.str_version(v, with_id=False, force_version_nr=None, delimiter='/')[source]

Utility function to produce a human readable string out of an asset version.

ftrack_connect_pipeline.utils.safe_string(string)[source]

ftrack_connect_pipeline.utils.get_save_path(context_id, session, extension=None, temp=True)[source]

Calculate the local save path (work path), DCC independent.
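
A hedged sketch of these utilities for the first task found in the workspace; the file extension is an illustrative placeholder:

import ftrack_api

from ftrack_connect_pipeline import utils

session = ftrack_api.Session(auto_connect_event_hub=False)
task = session.query('Task').first()

print(utils.str_context(task))
print(utils.get_save_path(task['id'], session, extension='.mb', temp=True))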

Release and migration notes

Find out information about what has changed between versions and any important migration notes to be aware of when switching to a new version.

Release Notes

1.1.0

8 November 2022
  • new

definition: Definition_object module implemented on client.

  • fix

dependencies: Fix markdown error on pipeline

  • changed

context: Rewired the context event flow to support standalone delayed context set

  • changed

doc: Added release notes and API documentation

  • changed

utils: Added shared safe_string util function

  • changed

doc: Fixed AM client docstrings

1.0.1

1 August 2022
  • new

    Initial release

Migrating from old ftrack Connectors

Why a new Framework?

The legacy DCC Connector implementations did not carry any means of configuring the engines, publishers or the plugins (e.g. importers, exporters) used within.

Nor was there any possibility to run the integrations in remote mode or to easily customise the look and feel.

The new DCC Framework addresses this by providing a modular approach, configurable through the pipeline definitions and plugins.

Compatibility

The new Framework is not backward compatible, which means that previously published DCC project files containing tracked assets imported using the legacy integrations will not be recognised.

Glossary

Application launcher

The Connect component responsible for discovery and launch of DCC applications; it relies on the ftrack event system for communication.

ftrack Python api

The supported Python Application Programming Interface for communicating with the ftrack workspace.

Client

The host counterpart interacting with the user, communicates with the host through the ftrack Event system. Clients are launched from the DCC module through an event, or invoked directly in standalone mode. A client can choose to rely on the UI module or run standalone. Example of a client is the Maya publisher panel.

Connect

The ftrack desktop application capable of launching DCC applications, publishing files and managing plugins.

Connect package

The Connect package is Connect built and packaged for a certain target platform, typically Windows, macOS or Linux. It supplies a default Python runtime for running Connect as an executable, compiled using cx_freeze.

DCC

Digital Content Creation tool, e.g. Maya, 3D Studio Max, Unreal, Blender and so on. Each DCC application is defined by a host type and has an associated Framework plugin, for example the Maya plugin.

Definition

A JSON configuration file defining Framework engine behaviour - which plugins and widgets to use. It is validated against a schema. An example of a definition is the Maya geometry publisher. Definitions live within the ftrack-connect-pipeline-definition plugin.

Engine

A core Python module driving a specific behaviour within the Framework, for example publishing or asset management.

Event manager

A module responsible for sending and receiving ftrack events through the ftrack Python API.
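
To illustrate the mechanism an event manager wraps, the sketch below publishes and subscribes to a custom event directly through the ftrack Python API event hub; the topic name is made up for the example and is not used by the Framework:

    import ftrack_api
    import ftrack_api.event.base

    session = ftrack_api.Session(auto_connect_event_hub=True)

    # Hypothetical topic, used only for this example.
    TOPIC = 'acme.pipeline.ping'

    def on_ping(event):
        # The payload travels in event['data'].
        print('Received:', event['data'])

    # Subscribe to the topic...
    session.event_hub.subscribe('topic={}'.format(TOPIC), on_ping)

    # ...publish an event to it...
    session.event_hub.publish(
        ftrack_api.event.base.Event(topic=TOPIC, data={'message': 'hello'})
    )

    # ...and process incoming events for a short while (blocking). A real
    # integration would typically run this in a background thread.
    session.event_hub.wait(duration=2)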

Framework

A Framework is a structure that you can build software on. It serves as a foundation, so you’re not starting entirely from scratch. Frameworks are typically associated with a specific programming language and are suited to different types of tasks. The ftrack pipeline Framework is a set of modules/layers enabling asset publish, load, management and other core functionality within a DCC application or standalone. The core Framework module is called ftrack-connect-pipeline, which this documentation is part of; the source code can be found here: https://github.com/ftrackhq/ftrack-connect-pipeline.git

Host

The central part of the core Framework that discovers and executes definitions through engines, handles the context and much more. The host is designed to be able to operate in remote mode through the ftrack event system.

Host type

The host type is the actual DCC type and is used to identify a DCC module and bind a definition to the DCC application. An example host type value is maya.

JSON

JSON is a lightweight format for storing and transporting data, and stands for JavaScript Object Notation. For more information, see https://www.json.org/

Plugin

A module designed to be discovered by the ftrack Python API. Plugins designed to be discovered by Connect are called Connect plugins and are main components of the Framework. Framework plugins reside within the definition module and are referenced from the definition JSON configurations.

Plugin manager

A Connect widget that allows discovery and installation of Connect plugins, resources:

Python

A programming language that lets you work more quickly and integrate your systems more effectively. Often used in creative industries. Visit the language website at http://www.python.org

Qt

The default UI framework utilised by the Framework, through PySide and the Qt.py Python binding module. The corresponding Framework module containing UI bindings is named ftrack-connect-pipeline-qt, resources:

UI

User Interface of the Framework, built with Qt.

Schema

A JSON configuration defining the strict structure and syntax of a definition for use with an engine.
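
Conceptually, this is the kind of check that can be expressed with the standard jsonschema package; both the schema and the definition below are heavily simplified, made-up examples and do not reflect the Framework's actual schemas:

    import jsonschema

    # Made-up, minimal schema - the real Framework schemas are far richer.
    schema = {
        'type': 'object',
        'required': ['name', 'host_type', 'components'],
        'properties': {
            'name': {'type': 'string'},
            'host_type': {'type': 'string'},
            'components': {'type': 'array'},
        },
    }

    # Made-up definition instance.
    definition = {
        'name': 'Geometry Publisher',
        'host_type': 'maya',
        'components': [],
    }

    # Raises jsonschema.ValidationError if the definition violates the schema.
    jsonschema.validate(instance=definition, schema=schema)
    print('Definition is valid')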

Troubleshooting

Common issues:

Issue: I see no DCC app launchers when running an action in ftrack / Connect.

Solution: Make sure that: 1) Connect is running and all integration dependency plugins are found, either in the default location or where FTRACK_CONNECT_PLUGIN_PATH points, and that there are no duplicates. 2) The DCC application is installed in its default location, or the search location is updated in ftrack-application-launcher/config. 3) The Framework is built with the Python interpreter version supported by the DCC application (either Py 3.7, or Py 2.7 for older versions).

Issue: The ftrack menu is not showing within the DCC.

Solution: Make sure you are running a Python 3 enabled DCC application, or a Python 2 enabled one if you have built the Framework for Python 2.

Issue: My DCC is running a newer, incompatible Python 3 interpreter.

Solution: You will need to rebuild the Framework plugins with that Python version and launch Connect with FTRACK_CONNECT_PLUGIN_PATH pointing to the collected builds folder.

Issue: I am getting a traceback/exception within the DCC that I cannot interpret.

Solution: If the exception happens within the ftrack Framework and not within your custom code, feel free to reach out to support@ftrack.com and describe the issue, together with the logs and any other useful information.

Help and support

Please visit our user Slack channel to get help on integration issues, and receive tips and tricks from other users.

On our forum, you will also find useful information about Connect and Framework releases.
