taskgraph.util package

Submodules

taskgraph.util.attributes module

taskgraph.util.attributes.attrmatch(attributes, **kwargs)

Determine whether the given set of task attributes matches. The conditions are given as keyword arguments, where each keyword names an attribute. The keyword value can be a literal, a set, or a callable. A literal must match the attribute exactly. Given a set, the attribute value must be in the set. A callable is called with the attribute value. If an attribute is specified as a keyword argument but not present in the attributes, the result is False.
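
A minimal sketch of the three kinds of conditions, using hypothetical attribute values:

    from taskgraph.util.attributes import attrmatch

    attributes = {"kind": "build", "build_platform": "linux64", "nightly": True}

    attrmatch(attributes, kind="build")                         # literal: True
    attrmatch(attributes, build_platform={"linux64", "win32"})  # set membership: True
    attrmatch(attributes, nightly=lambda value: value is True)  # callable: True
    attrmatch(attributes, signed=True)                          # attribute missing: False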

taskgraph.util.attributes.copy_attributes_from_dependent_job(dep_job)
taskgraph.util.attributes.keymatch(attributes, target)

Determine whether any keys in attributes match target, and return a list of matching values. Exact matches are checked first, then regular-expression matches, and finally the default key.
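
For illustration, a hedged sketch of the lookup order with hypothetical keys and values:

    from taskgraph.util.attributes import keymatch

    mapping = {
        "linux64/opt": 1,
        "linux64.*": 2,
        "default": 3,
    }

    keymatch(mapping, "linux64/opt")    # exact match: [1]
    keymatch(mapping, "linux64/debug")  # regex match: [2]
    keymatch(mapping, "win32/opt")      # default key: [3]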

taskgraph.util.attributes.match_run_on_projects(project, run_on_projects)

Determine whether the given project is included in the run-on-projects parameter, applying expansions for things like “integration” mentioned in the attribute documentation.
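
A small sketch of the expected behaviour, assuming the usual semantics where "all" matches every project:

    from taskgraph.util.attributes import match_run_on_projects

    match_run_on_projects("mozilla-central", ["all"])              # True
    match_run_on_projects("mozilla-central", ["try"])              # False: not listed
    match_run_on_projects("mozilla-central", ["mozilla-central"])  # True: listed explicitly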

taskgraph.util.bbb_validation module

taskgraph.util.bbb_validation.fetch_all_the_things()
taskgraph.util.bbb_validation.valid_bbb_builders

taskgraph.util.cached_tasks module

taskgraph.util.cached_tasks.add_optimization(config, taskdesc, cache_type, cache_name, digest=None, digest_data=None)

Allow the results of this task to be cached. This adds index routes to the task so it can be looked up for future runs, and optimization hints so that cached artifacts can be found. Exactly one of digest and digest_data must be passed.

Parameters:
  • config (TransformConfig) – The configuration for the kind being transformed.
  • taskdesc (dict) – The description of the current task.
  • cache_type (str) – The type of task result being cached.
  • cache_name (str) – The name of the object being cached.
  • digest (bytes or None) – A unique string identifying this version of the artifacts being generated. Typically this will be the hash of inputs to the task.
  • digest_data (list of bytes or None) – A list of bytes representing the inputs of this task. They will be concatenated and hashed to create the digest for this task.
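
A hedged sketch of a call site inside a transform; cache_type, cache_name and the digest data are hypothetical, and config/taskdesc would come from the transform being applied:

    from taskgraph.util.cached_tasks import add_optimization

    def cache_this_task(config, taskdesc):
        # Bytes that uniquely describe the task's inputs (placeholders here).
        digest_data = [b"dockerfile-contents", b"context-hash"]
        add_optimization(
            config,
            taskdesc,
            cache_type="content.v1",    # hypothetical cache type
            cache_name="debian7-base",  # hypothetical cache name
            digest_data=digest_data,
        )
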
taskgraph.util.cached_tasks.cached_index_path(level, trust_domain, cache_type, cache_name, digest=None, digest_data=None)

Get the index path needed to locate the task that would be created by add_optimization().

Parameters:
  • level (int) – The SCM level of the task to look for.
  • trust_domain (str) – The trust domain to look for the task in.
  • cache_type (str) – The type of task result being cached.
  • cache_name (str) – The name of the object being cached.
  • digest (bytes or None) – A unique string identifying this version of the artifacts being generated. Typically this will be the hash of inputs to the task.
  • digest_data (list of bytes or None) – A list of bytes representing the inputs of this task. They will be concatenated and hashed to create the digest for this task.
Return str:

The index path.
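
A sketch of pairing this with find_task_id() to locate a previously cached task; the cache identifiers and digest are hypothetical and must match what add_optimization() used:

    from taskgraph.util.cached_tasks import cached_index_path
    from taskgraph.util.taskcluster import find_task_id

    index_path = cached_index_path(
        level=3,
        trust_domain="gecko",
        cache_type="content.v1",    # hypothetical
        cache_name="debian7-base",  # hypothetical
        digest="0123456789abcdef",  # hypothetical digest
    )
    task_id = find_task_id(index_path)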

taskgraph.util.docker module

taskgraph.util.docker.build_from_context(docker_bin, context_path, prefix, tag=None)

Build a Docker image from a context archive.

Given the path to a docker binary, an image build tar.gz (produced with create_context_tar()), a prefix in that context containing files, and an optional tag for the produced image, build that Docker image.

taskgraph.util.docker.create_context_tar(topsrcdir, context_dir, out_path, prefix)

Create a context tarball.

A directory context_dir containing a Dockerfile will be assembled into a gzipped tar file at out_path. Files inside the archive will be prefixed by directory prefix.

We also scan the source Dockerfile for special syntax that influences context generation.

If a line in the Dockerfile has the form # %include <path>, the relative path specified on that line will be matched against files in the source repository and added to the context under the path topsrcdir/. If an entry is a directory, we add all files under that directory.

Returns the SHA-256 hex digest of the created archive.
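
A sketch of building a context archive; the paths are hypothetical, and the Dockerfile under context_dir may carry # %include lines as described above:

    from taskgraph.util.docker import create_context_tar

    digest = create_context_tar(
        topsrcdir="/builds/worker/checkouts/gecko",  # hypothetical checkout
        context_dir="/builds/worker/checkouts/gecko/taskcluster/docker/image_builder",
        out_path="/tmp/image_builder-context.tar.gz",
        prefix="image_builder",
    )
    # digest is the SHA-256 hex digest of the archive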

taskgraph.util.docker.docker_image(name, by_tag=False)

Resolve in-tree prebuilt docker image to <registry>/<repository>@sha256:<digest>, or <registry>/<repository>:<tag> if by_tag is True.

taskgraph.util.docker.generate_context_hash(topsrcdir, image_path, image_name)

Generate a SHA-256 hash for the context directory used to build an image.

taskgraph.util.docker.parse_volumes

Parse VOLUME entries from a Dockerfile for an image.

taskgraph.util.hash module

taskgraph.util.hash.hash_path

Hash a single file.

Returns the SHA-256 hash in hex form.

taskgraph.util.hash.hash_paths(base_path, patterns)

Given a list of path patterns, return a digest of the contents of all the corresponding files, similarly to git tree objects or mercurial manifests.

Each file is hashed. The list of all hashes and file paths is then itself hashed to produce the result.
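
A brief sketch; the base path and pattern are hypothetical:

    from taskgraph.util.hash import hash_paths

    # The digest changes whenever any file matching the patterns changes.
    digest = hash_paths(
        "/builds/worker/checkouts/gecko",         # hypothetical base_path
        ["taskcluster/docker/image_builder/**"],  # hypothetical pattern
    )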

taskgraph.util.parameterization module

taskgraph.util.parameterization.resolve_task_references(label, task_def, dependencies)

Resolve all instances of {‘task-reference’: ‘..<..>..’} in the given task definition, using the given dependencies.
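
A sketch with a single hypothetical dependency named "build":

    from taskgraph.util.parameterization import resolve_task_references

    task_def = {
        "payload": {
            "env": {
                "PARENT_TASK_URL": {
                    "task-reference": "https://queue.taskcluster.net/v1/task/<build>"
                }
            }
        }
    }
    resolved = resolve_task_references("my-task", task_def, {"build": "abc123TaskId"})
    # resolved["payload"]["env"]["PARENT_TASK_URL"] ends with ".../task/abc123TaskId"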

taskgraph.util.parameterization.resolve_timestamps(now, task_def)

Resolve all instances of {‘relative-datestamp’: ‘..’} in the given task definition.
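
A sketch, assuming now is a datetime such as the one produced by current_json_time(datetime_format=True):

    import datetime

    from taskgraph.util.parameterization import resolve_timestamps

    now = datetime.datetime.utcnow()  # assumption: a datetime is accepted here
    task_def = {"expires": {"relative-datestamp": "1 year"}}
    resolved = resolve_timestamps(now, task_def)
    # resolved["expires"] is a JSON timestamp one year after now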

taskgraph.util.partials module

taskgraph.util.partials.get_balrog_platform_name(platform)

Convert build platform names into Balrog platform names.

taskgraph.util.partials.get_builds(release_history, platform, locale)

Examine cached Balrog release history and return the list of builds we need to generate diffs from.

taskgraph.util.partials.get_partials_artifact_map(release_history, platform, locale)
taskgraph.util.partials.get_partials_artifacts(release_history, platform, locale)
taskgraph.util.partials.get_release_builds(release)
taskgraph.util.partials.get_sorted_releases(product, branch)

Returns a list of release names from Balrog.

Parameters:
  • product – product name, AKA appName
  • branch – branch name, e.g. mozilla-central
Returns:
a sorted list of release names, most recent first.

taskgraph.util.partials.populate_release_history(product, branch, maxbuilds=4, maxsearch=10)

Find relevant releases in Balrog. Not all releases have all platforms and locales, due to the Taskcluster migration.

Parameters:
  • product (str) – capitalized product name, AKA appName, e.g. Firefox
  • branch (str) – branch name, e.g. mozilla-central
  • maxbuilds (int) – maximum number of historical releases to populate
  • maxsearch (int) – traverse at most this many releases, to avoid working through the entire history
Returns:

json object based on data from the balrog api, for example:

results = {
    'platform1': {
        'locale1': {
            'buildid1': mar_url,
            'buildid2': mar_url,
            'buildid3': mar_url,
        },
        'locale2': {
            'target.partial-1.mar': {'buildid1': 'mar_url'},
        },
    },
    'platform2': {
    },
}

taskgraph.util.platforms module

taskgraph.util.platforms.platform_family(build_platform)

Given a build platform, return the platform family (linux, macosx, etc.)
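
For illustration, with hypothetical build platform names (expected results shown as comments):

    from taskgraph.util.platforms import platform_family

    platform_family("linux64")           # 'linux'
    platform_family("macosx64-nightly")  # 'macosx'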

taskgraph.util.push_apk module

Common functions for both push-apk and push-apk-breakpoint.

taskgraph.util.push_apk.check_every_architecture_is_present_in_dependent_tasks(dependent_tasks)
taskgraph.util.push_apk.delete_non_required_fields_transform(_, jobs)
taskgraph.util.push_apk.fill_labels_tranform(_, jobs)
taskgraph.util.push_apk.generate_dependencies(dependent_tasks)
taskgraph.util.push_apk.validate_dependent_tasks_transform(_, jobs)
taskgraph.util.push_apk.validate_jobs_schema_transform_partial(description_schema, transform_type, config, jobs)

taskgraph.util.python_path module

taskgraph.util.python_path.find_object(path)

Find a Python object given a path of the form <modulepath>:<objectpath>. Conceptually equivalent to

def find_object(modulepath, objectpath):
    import <modulepath> as mod
    return mod.<objectpath>
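
For example, to look up a function elsewhere in this package:

    from taskgraph.util.python_path import find_object

    attrmatch = find_object("taskgraph.util.attributes:attrmatch")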

taskgraph.util.schema module

taskgraph.util.schema.Schema(*args, **kwargs)

Operates identically to voluptuous.Schema, but applying some taskgraph-specific checks in the process.

taskgraph.util.schema.check_schema(schema)
taskgraph.util.schema.optionally_keyed_by(*arguments)

Mark a schema value as optionally keyed by any of a number of fields. The schema is the last argument, and the remaining fields are taken to be the field names. For example:

'some-value': optionally_keyed_by(
    'test-platform', 'build-platform',
    Any('a', 'b', 'c'))

The resulting schema will allow nesting of by-test-platform and by-build-platform in either order.

taskgraph.util.schema.resolve_keyed_by(item, field, item_name, **extra_values)

For values which can either accept a literal value, or be keyed by some other attribute of the item, perform that lookup and replacement in-place (modifying item directly). The field is specified using dotted notation to traverse dictionaries.

For example, given item:

job:
    test-platform: linux128
    chunks:
        by-test-platform:
            macosx-10.11/debug: 13
            win.*: 6
            default: 12

a call to resolve_keyed_by(item, ‘job.chunks’, item[‘thing-name’]) would mutate item in-place to:

job:
    chunks: 12

The item_name parameter is used to generate useful error messages.

If extra_values are supplied, they represent additional values available for reference from by-<field>.

Items can be nested as deeply as the schema will allow:

chunks:
    by-test-platform:
        win.*:
            by-project:
                ash: ..
                cedar: ..
        linux: 13
        default: 12
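
A runnable sketch of the first example above (item_name is only used in error messages):

    from taskgraph.util.schema import resolve_keyed_by

    job = {
        "test-platform": "win64/debug",
        "chunks": {
            "by-test-platform": {
                "macosx-10.11/debug": 13,
                "win.*": 6,
                "default": 12,
            }
        },
    }
    resolve_keyed_by(job, "chunks", item_name="my-test")
    # job["chunks"] is now 6, selected by the win.* pattern
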
taskgraph.util.schema.validate_schema(schema, obj, msg_prefix)

Validate that object satisfies schema. If not, generate a useful exception beginning with msg_prefix.

taskgraph.util.scriptworker module

Make scriptworker.cot.verify more user friendly by making scopes dynamic.

Scriptworker uses certain scopes to determine which sets of credentials to use. Certain scopes are restricted by branch in chain of trust verification, and are checked again at the script level. This file provides functions to adjust these scopes automatically by project; this makes pushing to try, forking a project branch, and merge day uplifts more user friendly.

In the future, we may adjust scopes by other settings as well, e.g. different scopes for push-to-candidates rather than push-to-releases, even if both happen on mozilla-beta and mozilla-release.

taskgraph.util.scriptworker.BALROG_SCOPE_ALIAS_TO_PROJECT = [[u'nightly', set([u'mozilla-central'])], [u'beta', set([u'mozilla-beta'])], [u'release', set([u'mozilla-release'])], [u'esr', set([u'mozilla-esr52'])]]

Map balrog scope aliases to sets of projects.

This is a list of list-pairs, for ordering.

taskgraph.util.scriptworker.BALROG_SERVER_SCOPES = {u'esr': u'project:releng:balrog:server:esr', u'default': u'project:releng:balrog:server:dep', u'aurora': u'project:releng:balrog:server:aurora', u'nightly': u'project:releng:balrog:server:nightly', u'beta': u'project:releng:balrog:server:beta', u'release': u'project:releng:balrog:server:release'}

Map the balrog scope aliases to the actual server scopes.

taskgraph.util.scriptworker.BEETMOVER_ACTION_SCOPES = {u'default': u'project:releng:beetmover:action:push-to-staging', u'all-candidates-tasks': u'project:releng:beetmover:action:push-to-candidates', u'all-publish-tasks': u'project:releng:beetmover:action:push-to-releases', u'all-nightly-tasks': u'project:releng:beetmover:action:push-to-nightly'}

Map the beetmover task aliases to the actual action scopes.

taskgraph.util.scriptworker.BEETMOVER_BUCKET_SCOPES = {u'default': u'project:releng:beetmover:bucket:dep', u'all-candidates-tasks': {u'all-release-branches': u'project:releng:beetmover:bucket:release'}, u'all-publish-tasks': {u'all-release-branches': u'project:releng:beetmover:bucket:release'}, u'all-nightly-tasks': {u'all-nightly-branches': u'project:releng:beetmover:bucket:nightly'}}

Map the beetmover scope aliases to the actual bucket scopes.

taskgraph.util.scriptworker.BEETMOVER_RELEASE_TARGET_TASKS = set([u'publish_fennec', u'candidates_fennec'])

The set of all beetmover release target tasks.

Used for both BEETMOVER_SCOPE_ALIAS_TO_TARGET_TASK and get_release_build_number.

taskgraph.util.scriptworker.BEETMOVER_SCOPE_ALIAS_TO_PROJECT = [[u'all-nightly-branches', set([u'mozilla-beta', u'mozilla-release', u'mozilla-central'])], [u'all-release-branches', set([u'mozilla-release', u'mozilla-beta'])]]

Map beetmover scope aliases to sets of projects.

This is a list of list-pairs, for ordering.

taskgraph.util.scriptworker.BEETMOVER_SCOPE_ALIAS_TO_TARGET_TASK = [[u'all-nightly-tasks', set([u'nightly_linux', u'nightly_fennec', u'nightly_macosx', u'mozilla_beta_tasks', u'nightly_win64', u'mozilla_release_tasks', u'nightly_desktop', u'nightly_win32'])], [u'all-candidates-tasks', set([u'candidates_fennec'])], [u'all-publish-tasks', set([u'publish_fennec'])]]

Map beetmover task aliases to sets of target task methods.

This is a list of list-pairs, for ordering.

taskgraph.util.scriptworker.DEVEDITION_SIGNING_CERT_SCOPES = {u'default': u'project:releng:signing:cert:dep-signing', u'beta': u'project:releng:signing:cert:nightly-signing'}

Map the signing scope aliases to the actual cert scopes used for DevEdition.

taskgraph.util.scriptworker.SIGNING_SCOPE_ALIAS_TO_PROJECT = [[u'all-nightly-branches', set([u'mozilla-central'])], [u'all-release-branches', set([u'mozilla-release', u'mozilla-beta'])]]

Map signing scope aliases to sets of projects.

Currently m-c and DevEdition on m-b use nightly signing; Beta on m-b and m-r use release signing. These data structures aren’t set up to handle different scopes on the same repo, so we use a different set of them for DevEdition, and callers are responsible for using the correct one (by calling the appropriate helper below). More context on this in https://bugzilla.mozilla.org/show_bug.cgi?id=1358601.

We will need to add esr support at some point. Eventually we want to add nuance so certain m-b and m-r tasks use dep or nightly signing, and we only release sign when we have a signed-off set of candidate builds. This current approach works for now, though.

This is a list of list-pairs, for ordering.

taskgraph.util.scriptworker.VERSION_PATH = u'/Users/andrewswan/src/mozilla-unified/browser/config/version_display.txt'

Path to the in-tree version_display.txt file used to read the product’s display version.

taskgraph.util.scriptworker.get_release_config(config, force=False)

Get the build number and version for a release task.

Currently only applies to beetmover tasks.

Parameters:config (dict) – the task config that defines the target task method.
Returns:
a dict containing both build_number and version. This can be used to update task.payload.
Return type:dict
taskgraph.util.scriptworker.get_scope_from_project(alias_to_project_map, alias_to_scope_map, config)

Determine the restricted scope from config.params[‘project’].

Parameters:
  • alias_to_project_map (list of lists) – each list pair contains the alias and the set of projects that match. This is ordered.
  • alias_to_scope_map (dict) – the map from alias to scope
  • config (dict) – the task config that defines the project.
Returns:

the scope to use.

Return type:

string
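
A hedged sketch using the balrog maps above; the config stand-in only needs to expose params[‘project’]:

    from taskgraph.util import scriptworker

    class FakeConfig(object):  # minimal stand-in for the real transform config
        params = {"project": "mozilla-beta"}

    scope = scriptworker.get_scope_from_project(
        scriptworker.BALROG_SCOPE_ALIAS_TO_PROJECT,
        scriptworker.BALROG_SERVER_SCOPES,
        FakeConfig(),
    )
    # expected: 'project:releng:balrog:server:beta'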

taskgraph.util.scriptworker.get_scope_from_target_method(alias_to_tasks_map, alias_to_scope_map, config)

Determine the restricted scope from config.params[‘target_tasks_method’].

Parameters:
  • alias_to_tasks_map (list of lists) – each list pair contains the alias and the set of target methods that match. This is ordered.
  • alias_to_scope_map (dict) – the map from alias to scope
  • config (dict) – the task config that defines the target task method.
Returns:

the scope to use.

Return type:

string

taskgraph.util.scriptworker.get_scope_from_target_method_and_project(alias_to_tasks_map, alias_to_project_map, aliases_to_scope_map, config)

Determine the restricted scope from both target_tasks_method and project.

On certain branches, we’ll need differing restricted scopes based on target_tasks_method. However, we can’t key solely on that, since that target_tasks_method might be run on an unprivileged branch. This method checks both.

Parameters:
  • alias_to_tasks_map (list of lists) – each list pair contains the alias and the set of target methods that match. This is ordered.
  • alias_to_project_map (list of lists) – each list pair contains the alias and the set of projects that match. This is ordered.
  • aliases_to_scope_map (dict of dicts) – the nested map from task alias to project alias to scope
  • config (dict) – the task config that defines the target task method and project.
Returns:

the scope to use.

Return type:

string

taskgraph.util.scriptworker.get_signing_cert_scope_per_platform(build_platform, is_nightly, config)

taskgraph.util.seta module

class taskgraph.util.seta.SETA

Bases: object

Interface to the SETA service, which defines low-value tasks that can be optimized out of the taskgraph.

is_low_value_task(label, project, pushlog_id, push_date, bbb_task=False)
minutes_between_pushes(project, cur_push_id, cur_push_date)
query_low_value_tasks(project, bbb=False)

taskgraph.util.signed_artifacts module

Defines artifacts to sign before repackage.

taskgraph.util.signed_artifacts.generate_specifications_of_artifacts_to_sign(build_platform, is_nightly=False, keep_locale_template=True)

taskgraph.util.taskcluster module

taskgraph.util.taskcluster.cancel_task(task_id, use_proxy=False)

Cancels a task given a task_id. In testing mode, just logs that it would have cancelled.

taskgraph.util.taskcluster.find_task_id(index_path, use_proxy=False)
taskgraph.util.taskcluster.get_artifact(task_id, path, use_proxy=False)

Returns the artifact with the given path for the given task id.

If the path ends with “.json” or “.yml”, the content is deserialized as, respectively, json or yaml, and the corresponding python data (usually dict) is returned. For other types of content, a file-like object is returned.
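
A sketch combining this with find_task_id(); the index path and artifact name are examples, not guarantees:

    from taskgraph.util.taskcluster import find_task_id, get_artifact

    task_id = find_task_id("gecko.v2.mozilla-central.latest.taskgraph.decision")
    # A .json artifact is deserialized into Python data (here, a dict).
    label_to_taskid = get_artifact(task_id, "public/label-to-taskid.json")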

taskgraph.util.taskcluster.get_artifact_from_index(index_path, artifact_path, use_proxy=False)
taskgraph.util.taskcluster.get_artifact_url(task_id, path, use_proxy=False)
taskgraph.util.taskcluster.get_index_url(index_path, use_proxy=False, multiple=False)
taskgraph.util.taskcluster.get_purge_cache_url(provisioner_id, worker_type, use_proxy=False)
taskgraph.util.taskcluster.get_session
taskgraph.util.taskcluster.get_task_definition(task_id, use_proxy=False)
taskgraph.util.taskcluster.get_task_url(task_id, use_proxy=False)
taskgraph.util.taskcluster.get_taskcluster_artifact_prefix(task_id, postfix=u'', locale=None)
taskgraph.util.taskcluster.list_artifacts(task_id, use_proxy=False)
taskgraph.util.taskcluster.list_tasks(index_path, use_proxy=False)

Returns a list of task_ids where each task_id is indexed under a path in the index. Results are sorted by expiration date from oldest to newest.

taskgraph.util.taskcluster.purge_cache(provisioner_id, worker_type, cache_name, use_proxy=False)

Requests a cache purge from the purge-caches service.

taskgraph.util.templates module

taskgraph.util.templates.merge(*objects)

Merge the given objects, using the semantics described for merge_to, with objects later in the list taking precedence. From an inheritance perspective, “parents” should be listed before “children”.

Returns the result without modifying any arguments.
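
A short sketch of the precedence rules:

    from taskgraph.util.templates import merge

    parent = {"env": {"LOG": "info"}, "tags": ["base"]}
    child = {"env": {"LOG": "debug"}, "tags": ["extra"]}

    merge(parent, child)
    # {'env': {'LOG': 'debug'}, 'tags': ['base', 'extra']}
    # parent and child are left unmodified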

taskgraph.util.templates.merge_to(source, dest)

Merge dicts and arrays (override scalar values).

Keys from source override keys from dest, and elements from lists in source are appended to lists in dest.

Parameters:
  • source (dict) – to copy from
  • dest (dict) – to copy to (modified in place)

taskgraph.util.time module

exception taskgraph.util.time.InvalidString

Bases: exceptions.Exception

exception taskgraph.util.time.UnknownTimeMeasurement

Bases: exceptions.Exception

taskgraph.util.time.current_json_time(datetime_format=False)
Parameters:datetime_format (boolean) – Set True to get a datetime output
Returns:JSON string representation of the current time.
taskgraph.util.time.days(value)
taskgraph.util.time.hours(value)
taskgraph.util.time.json_time_from_now(input_str, now=None, datetime_format=False)
Parameters:
  • input_str (str) – Input string (see value_of)
  • now (datetime) – Optionally set the definition of now
  • datetime_format (boolean) – Set True to get a datetime output
Returns:

JSON string representation of time in future.

taskgraph.util.time.minutes(value)
taskgraph.util.time.months(value)
taskgraph.util.time.seconds(value)
taskgraph.util.time.value_of(input_str)

Convert a string to a json date in the future.

Parameters: input_str (str) – input string (e.g. 1d, 2d, 6years, 2 seconds)
Returns:
the unit given, in seconds.

taskgraph.util.time.years(value)
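
A brief sketch of these helpers in use (output formats in the comments are approximate):

    from taskgraph.util.time import current_json_time, json_time_from_now, value_of

    current_json_time()           # e.g. '2017-10-04T18:33:57.260732Z'
    json_time_from_now("1 year")  # JSON timestamp one year in the future
    value_of("2 hours")           # the parsed duration for '2 hours'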

taskgraph.util.treeherder module

taskgraph.util.treeherder.join_symbol(group, symbol)

Perform the reverse of split_symbol, combining the given group and symbol. If the group is ‘?’, then it is omitted.

taskgraph.util.treeherder.split_symbol(treeherder_symbol)

Split a symbol expressed as grp(sym) into its two parts. If no group is given, the returned group is ‘?’
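
For illustration:

    from taskgraph.util.treeherder import join_symbol, split_symbol

    split_symbol("M(gl3)")   # ('M', 'gl3')
    split_symbol("B")        # ('?', 'B'): no group given
    join_symbol("M", "gl3")  # 'M(gl3)'
    join_symbol("?", "B")    # 'B': the '?' group is omitted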

taskgraph.util.verify module

class taskgraph.util.verify.VerificationSequence

Bases: object

Container for a sequence of verifications over a TaskGraph. Each verification is represented as a callable taking (task, taskgraph, scratch_pad), called for each task in the taskgraph, and one more time with no task but with the taskgraph and the same scratch_pad that was passed for each task.

add(graph_name)
taskgraph.util.verify.verify_bbb_builders_valid(task, taskgraph, scratch_pad)

This function ensures that any task which is run in buildbot (via buildbot-bridge) is using a recognized buildername.

If you see an unexpected failure with a task due to this check, please ask in the #releng IRC channel.

taskgraph.util.verify.verify_dependency_tiers(task, taskgraph, scratch_pad)
taskgraph.util.verify.verify_docs(filename, identifiers, appearing_as)
taskgraph.util.verify.verify_gecko_v2_routes(task, taskgraph, scratch_pad)

This function ensures that any two tasks have distinct index.v2.routes

taskgraph.util.verify.verify_task_graph_symbol(task, taskgraph, scratch_pad)

This function verifies that the tuple (collection.keys(), machine.platform, groupSymbol, symbol) is unique for a target task graph.

taskgraph.util.workertypes module

taskgraph.util.workertypes.worker_type_implementation(worker_type)

Get the worker implementation and OS for the given workerType, where the OS represents the host system, not the target OS, in the case of cross-compiles.

taskgraph.util.yaml module

taskgraph.util.yaml.load_yaml(path, name, enforce_order=False)

Convenience function to load a YAML file in the given path. This is useful for loading kind configuration files from the kind path. If enforce_order is given, then any top-level keys in the file must be given in order.
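
A sketch, assuming a hypothetical kind directory containing kind.yml:

    from taskgraph.util.yaml import load_yaml

    kind_config = load_yaml("taskcluster/ci/build", "kind.yml")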

Module contents