Python API

API for writing configurators

class unfurl.configurator.Configurator(configurationSpec)

Returns whether this configurator can handle a dry run for the given task. (The configurator should also check task.dryRun during run().)


task (TaskView) –




Return whether or not the configurator can execute the given task.

Does this configurator support the requested action and parameters, given the current state of the target instance?


task (TaskView) –


Should return True or a message describing why the task couldn’t be run.

Return type

(bool or str)
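The "True or an error message" convention can be illustrated with a stand-alone sketch (the function name and the input names are hypothetical; a real configurator would inspect the TaskView instead of a plain dict):

```python
def can_run_sketch(task_inputs, required=("image", "command")):
    # Return True if all required inputs are present,
    # otherwise a message describing why the task can't run.
    missing = [name for name in required if name not in task_inputs]
    if missing:
        return "missing required inputs: " + ", ".join(missing)
    return True
```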

checkDigest(task, changeset)

Generate a compact, deterministic representation of the current configuration. This is saved in the job log and used by checkDigest in subsequent jobs to determine if the configuration changed and the operation needs to be re-run.

The default implementation recalculates the digest of input parameters that were accessed in the previous run.


task (TaskView) –


True if the configuration’s digest has changed, False if it is the same.

Return type


excludeFromDigest = ()

This should perform the operation specified in the ConfigurationSpec on the target instance.


task (TaskView) –


Should yield either a JobRequest, TaskRequest, or a ConfiguratorResult when done.


Generate a compact, deterministic representation of the current configuration. This is saved in the job log and used by checkDigest in subsequent jobs to determine if the configuration changed and the operation needs to be re-run.

The default implementation calculates a SHA1 digest of the values of the inputs that were accessed while that task was run, with the exception of the input parameters listed in excludeFromDigest.
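The digest algorithm described above can be pictured with a small stand-alone sketch (the function and its arguments are hypothetical stand-ins; the real implementation operates on the task's tracked inputs):

```python
import hashlib

def compute_digest(accessed_inputs, exclude=()):
    # Hash the values of the inputs accessed during the run,
    # skipping any keys listed in `exclude` (cf. excludeFromDigest).
    # Sorting the keys makes the digest deterministic.
    sha = hashlib.sha1()
    for key in sorted(accessed_inputs):
        if key in exclude:
            continue
        sha.update(("%s=%s;" % (key, accessed_inputs[key])).encode("utf-8"))
    return sha.hexdigest()
```

Because the keys are sorted, two runs that accessed the same inputs with the same values produce the same digest regardless of access order.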


task (TaskView) –


A dictionary whose keys are strings that start with “digest”

Return type


shortName = None

shortName can be used to customize the “short name” of the configurator as an alternative to using the full name (“module.class”) when setting the implementation on an operation. (Titlecase recommended)


Does this configuration need to be run?

class unfurl.configurator.JobRequest(resources, errors)

Yield this to run a child job.

class unfurl.configurator.TaskRequest(configSpec, target, reason, persist=False, required=None, startState=None)

Yield this to run a child task. (see unfurl.configurator.TaskView.createSubTask())

property name
class unfurl.configurator.TaskView(manifest, configSpec, target, reason=None, dependencies=None)

The interface presented to configurators.

addDependency(expr, expected=None, schema=None, name=None, required=False, wantList=False)
createSubTask(operation, resource=None, inputs=None, persist=False, required=False)

Create a subtask that will be executed if yielded by run()

  • operation (str) – The operation call (like interface.operation)

  • resource (NodeInstance) –



done(success=None, modified=None, status=None, result=None, outputs=None, captureException=None)

run() should call this method and yield its return value before terminating.

>>> yield task.done(True)
  • success (bool) – indicates if this operation completed without an error.

  • modified (bool) – (optional) indicates whether the physical instance was modified by this operation.

  • status (Status) – (optional) should be set if the operation changed the operational status of the target instance. If not specified, the runtime will update the instance status as needed, based on the operation performed and observed changes to the instance (attributes changed).

  • result (dict) – (optional) A dictionary that will be serialized as YAML into the changelog; it can contain any useful data about the operation.

  • outputs (dict) – (optional) Operation outputs, as specified in the topology template.



findConnection(target, relation='tosca.relationships.ConnectsTo')
property inputs

Exposes inputs and task settings as expression variables, so they can be accessed like:

eval: $inputs::param

or in jinja2 templates:

{{ inputs.param }}

query(query, dependency=False, name=None, required=False, wantList=False, resolveExternal=True, strict=True, vars=None, throw=False)

Mark the given value as sensitive. Sensitive values will be encrypted or redacted when output.


A subtype of sensitive appropriate for the value or the value itself if it can’t be converted.

Return type



Notifies Unfurl of new instances or changes to existing instances made while the configurator was running.

Operational status indicates if the instance currently exists or not. This will queue a new child job if needed.

- name:     aNewResource
  template: aNodeTemplate
  parent:   HOST
  attributes:
     anAttribute: aValue
  readyState:
    local: ok
    state: state
- name:     SELF
  attributes:
      anAttribute: aNewValue

resources (list or str) – Either a list or string that is parsed as YAML.


To run the job based on the supplied spec immediately, yield the returned JobRequest.

Return type


property vars

A dictionary of the same variables that are available to expressions when evaluating inputs.

Internal classes supporting the runtime.


An enumeration of node lifecycle states.

configured = 5
configuring = 4
created = 3
creating = 2
deleted = 11
deleting = 10
error = 12
initial = 1
started = 7
starting = 6
stopped = 9
stopping = 8

An enumeration of dependency priorities.

critical = 3
ignore = 0
optional = 1
required = 2

An enumeration of operational statuses.

absent = 5
degraded = 2
error = 3
ok = 1
pending = 4
unknown = 0
class unfurl.result.ChangeRecord(jobId=None, startTime=None, taskId=0, previousId=None, parse=None)

A ChangeRecord represents a job or task in the change log file. It consists of a change ID and named attributes.

A change ID is an identifier with this sequence of 12 characters:

  • “A” serves as a format version identifier

  • 7 alphanumeric characters (0-9, A-Z, and a-z) encoding the date and time the job ran

  • 4 hexadecimal digits encoding the task id
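The 1 + 7 + 4 character layout can be sketched as follows (the base-62 time encoding is an assumption for illustration; only the character layout and alphabet are taken from the description above):

```python
import string

# 0-9, A-Z, a-z as described above (62 symbols).
ALPHABET = string.digits + string.ascii_uppercase + string.ascii_lowercase

def make_change_id(time_units, task_id):
    # "A" version char + 7 time-encoding chars + 4 hex digits for the task id.
    chars = []
    n = time_units
    for _ in range(7):
        n, rem = divmod(n, 62)
        chars.append(ALPHABET[rem])
    return "A" + "".join(reversed(chars)) + format(task_id, "04x")
```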

classmethod formatLog(changeId, attributes)

format: changeid\tkey=value\tkey=value (fields are separated by tabs)




This module defines the core model and implements the runtime operations of the model.

The state of the system is represented as a collection of Instances. Each instance has a status; attributes that describe its state; and a TOSCA template which describes its capabilities, relationships, and available interfaces for configuring and interacting with it.

class unfurl.runtime.Operational

This is an abstract base class for Jobs, Resources, and Configurations. They all have a Status associated with them and all use the same algorithm to compute their status from their dependent resources, tasks, and configurations.

static aggregateStatus(statuses)

Returns: ok, degraded, pending or None

If there are no instances, return None. If any required dependencies are not operational, return pending or error. If any other dependencies are not operational or are degraded, return degraded. Otherwise return ok. (Instances with priority set to “ignore” are ignored.)
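These rules can be sketched in isolation. The (status, required) pair representation is an assumption for this example; the real method inspects Operational objects, and "ignore"-priority instances are assumed to have been filtered out already:

```python
from enum import IntEnum

class Status(IntEnum):
    # Mirrors the Status enumeration documented above.
    unknown = 0
    ok = 1
    degraded = 2
    error = 3
    pending = 4
    absent = 5

def aggregate_status(entries):
    # entries: list of (status, required) pairs.
    if not entries:
        return None
    worst = Status.ok
    for status, required in entries:
        operational = status in (Status.ok, Status.degraded)
        if not operational:
            if required:
                # A required dependency that isn't operational is fatal.
                return Status.error if status == Status.error else Status.pending
            worst = Status.degraded
        elif status == Status.degraded:
            worst = Status.degraded
    return worst
```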


Whether or not this object changed since the given ChangeRecord.

property status

Return the effective status, considering first the local readyState and then the aggregate status’ of its dependencies (see aggregateStatus() and getOperationalDependencies()).

If the localStatus is non-operational, that takes precedence. Otherwise the localStatus is compared with the aggregate status of its dependencies and the worse value is chosen.

class unfurl.runtime.OperationalInstance(status=None, priority=None, manualOveride=None, lastStateChange=None, lastConfigChange=None, state=None)

A concrete implementation of Operational

property localStatus

The localStatus property.

property manualOverideStatus

The manualOverideStatus property.

property priority

The priority property.

property state

The state property.

APIs for controlling Unfurl

Classes for managing the local environment.

Repositories can optionally be organized into projects that have a local configuration.

By convention, the “home” project defines a localhost instance and adds it to its context.

class unfurl.localenv.LocalEnv(manifestPath=None, homePath=None, parent=None, project=None)

This class represents the local environment that an ensemble runs in, including the local project it is part of and the home project.

findGitRepo(repoURL, revision=None)
findOrCreateWorkingDir(repoURL, revision=None, basepath=None)
findPathInRepos(path, importLoader=None)

If the given path is part of the working directory of a git repository, return that repository and a path relative to it.


Walk parents looking for unfurl.yaml


Return a new context that merges the given context with the local context.

getLocalInstance(name, context)

If asdf is installed, build a PATH list from .tool-versions files found in the current project and the home project.

getProject(path, homeProject)
homeProject = None

Evaluate using project home as a base dir.

class unfurl.localenv.Project(path, homeProject=None)

An Unfurl project is a folder that contains at least a local configuration file (unfurl.yaml) and one or more ensemble.yaml files, which may optionally be organized into one or more git repositories.

createWorkingDir(gitUrl, ref=None)
findGitRepo(repoURL, revision=None)
static findPath(testPath)

Walk parents looking for unfurl.yaml

findPathInRepos(path, importLoader=None)

If the given path is part of the working directory of a git repository, return that repository and a path relative to it.

getAsdfPaths(asdfDataDir, toolVersions={})
static normalizePath(path)
property venv

A Job is generated by comparing a list of specs with the last known state of the system. A Job runs tasks, each of which has a configuration spec that is executed on the running system. Each task tracks and records its modifications to the system’s state.

class unfurl.job.ConfigChange(parentJob=None, startTime=None, status=None, previousId=None, **kw)

Represents a configuration change made to the system. It has an operational status and a list of dependencies that contribute to its status. There are two kinds of dependencies:

  1. Live resource attributes that the configuration’s inputs depend on.

  2. Other configurations and resources it relies on to function properly.

class unfurl.job.Job(runner, rootResource, jobOptions, previousId=None)

Runs ConfigTasks and child Jobs.


Checked at runtime right before each task is run:

  • validate inputs

  • check pre-conditions to see if it can be run

  • check whether the task can be run

runTask(task, depth=0)

During each task run:

  • Notification of metadata changes that reflect changes made to resources

  • Notification of adding or removing a dependency on a resource or on properties of a resource

  • Notification of creation or deletion of a resource

  • Requests for a resource with the requested metadata; if it doesn’t exist, a task is run to make it so (e.g. add a DNS entry, install a package).

Returns a task.


Checked at runtime right before each task is run

class unfurl.job.JobOptions(**kw)

Options available to select which tasks are run, e.g. read-only

Does the config apply to the operation? Is it out of date? Is it in an ok state?

unfurl.job.runJob(manifestPath=None, _opts=None)

Loads the given Ensemble and creates and runs a job.

  • manifestPath (str, optional) – If None, it will look for an ensemble in the current working directory.

  • _opts (dict, optional) – the names of the command line options for creating jobs.


The job that just ran.

Return type


This module implements creating and cloning projects and ensembles, as well as Unfurl runtimes.

unfurl.init._createInClonedProject(paths, clonedProject, dest, mono)

Called by clone when cloning an ensemble.

source ensemble → result:

  • project root or ensemble in repo: git clone only

  • local ensemble or template: git clone + new ensemble

unfurl.init.clone(source, dest, includeLocal=False, **options)

Clone the source ensemble to dest. If dest isn’t in a project, create one. source can point to an ensemble_template, a service_template, an existing ensemble, or a folder containing one of those. If it points to a project, its default ensemble will be cloned.

Referenced repositories will be cloned if they are git repositories, or copied if they are regular file folders. If the folders already exist they will be copied to a new folder unless the git repositories have the same HEAD, but the local repository names will remain the same.



dest → result:

  • inside a project: new ensemble

  • new or empty dir: clone or create project (depending on source)

  • another project: error (not yet supported)



Utility classes and functions

Public API:

  • mapValue – returns a copy of the given value resolving any embedded queries or template strings

  • Ref.resolve() – given an expression, returns a ResultList

  • Ref.resolveOne() – given an expression, returns a value, None, or a (regular) list

  • Ref.isRef() – returns True if the given dictionary looks like a Ref

Internal API:

  • evalRef() – given an expression (string or dictionary), returns a list of Result

  • Expr.resolve() – given an expression string, returns a list of Result

  • Results._mapValue – same as mapValue but with lazy evaluation

class unfurl.eval.Ref(exp, vars=None)

A Ref object describes a path to metadata associated with a resource.

resolve(ctx, wantList=True, strict=True)

If wantList=True (default), returns a ResultList of matches. Note that values in the list can be a list or None. If wantList=False, returns resolveOne semantics. If wantList=’result’, returns a Result.

resolveOne(ctx, strict=True)

If there is no match, return None. If there is more than one match, return a list of matches. Otherwise return the match.

Note: If you want to distinguish between None values and no match, or between a single match that is a list and a list of matches, use resolve(), which always returns a (possibly empty) list of matches.
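The return-value convention can be illustrated with a stand-alone sketch, where matches stands in for the list of matches a full resolve() would produce:

```python
def resolve_one_sketch(matches):
    # No match -> None; multiple matches -> the list; single match -> the value.
    if not matches:
        return None
    if len(matches) > 1:
        return matches
    return matches[0]
```

Note how a single matched None is indistinguishable from "no match", which is exactly the ambiguity the docstring above warns about.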

class unfurl.eval.RefContext(currentResource, vars=None, wantList=False, resolveExternal=False, trace=0, strict=True)

The context of the expression being evaluated.

unfurl.eval.evalRef(val, ctx, top=False)

val is assumed to be an expression; evaluate it and return a list of Result.

class unfurl.util.Generate(generator)

Roughly equivalent to “yield from” but works in Python < 3.3


>>> gen = Generate(generator())
>>> while gen():
...     gen.result = yield
exception unfurl.util.UnfurlError(message, saveStack=False, log=False)
exception unfurl.util.UnfurlTaskError(task, message, log=40)
unfurl.util.filterEnv(rules, env=None, addOnly=False)

If ‘env’ is None it will be set to os.environ

If addOnly is False (the default) all variables in env will be included in the returned dict, otherwise only variables added by rules will be included

foo: bar         # add foo=bar
+foo:            # copy foo
+foo: bar        # copy foo, set it to bar if not present
+!foo*:          # copy all except keys matching “foo*”
-!foo:           # remove all except foo
^foo: /bar/bin   # treat foo like a PATH and prepend value: /bar/bin:$foo
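A simplified interpreter for the first two rule forms makes the semantics concrete (a sketch only; the real filterEnv also supports the +!, -!, and ^ forms and glob patterns):

```python
def filter_env_sketch(rules, env, add_only=True):
    # Handles "name: value" (add) and "+name" / "+name: default" (copy) rules.
    result = {} if add_only else dict(env)
    for key, value in rules.items():
        if key.startswith("+"):
            name = key[1:]
            if name in env:
                result[name] = env[name]       # copy the existing value
            elif value is not None:
                result[name] = value           # fall back to the default
        else:
            result[key] = value                # add key=value outright
    return result
```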

class unfurl.util.sensitive_bytes

Transparent wrapper class to mark bytes as sensitive

decode(*args, **kwargs)

Wrapper method to ensure type conversions maintain sensitive context

class unfurl.util.sensitive_dict

Transparent wrapper class to mark a dict as sensitive

class unfurl.util.sensitive_list(iterable=(), /)

Transparent wrapper class to mark a list as sensitive

class unfurl.util.sensitive_str

Transparent wrapper class to mark a str as sensitive

encode(*args, **kwargs)

Wrapper method to ensure type conversions maintain sensitive context