.. SPDX-License-Identifier: CC-BY-SA-2.0-UK

*******************************************
Understanding the Yocto Project Autobuilder
*******************************************

Execution Flow within the Autobuilder
=====================================
- The "a-full" and "a-quick" targets are the usual entry points into the
- Autobuilder and it makes sense to follow the process through the system
- starting there. This is best visualized from the :yocto_ab:`Autobuilder
- Console view </valkyrie/#/console>`.
- Each item along the top of that view represents some "target build" and
- these targets are all run in parallel. The 'full' build will trigger the
- majority of them, the "quick" build will trigger some subset of them.

The Autobuilder effectively runs whichever configuration is defined for
each of those targets on a separate buildbot worker. To understand the
configuration, you need to look at the entry in the ``config.json`` file
within the :yocto_git:`yocto-autobuilder-helper </yocto-autobuilder-helper>`
repository. The targets are defined in the ``overrides`` section. A quick
example is ``qemux86-64``, which looks like::

   "qemux86-64" : {
       "MACHINE" : "qemux86-64",
       "TEMPLATE" : "arch-qemu",
       "step1" : {
           "extravars" : [
               "IMAGE_FSTYPES:append = ' wic wic.bmap'"
           ]
       }
   },

And to expand that, you need the ``arch-qemu`` entry from
the ``templates`` section, which looks like::

   "arch-qemu" : {
       "BUILDINFO" : true,
       "BUILDHISTORY" : true,
       "step1" : {
           "BBTARGETS" : "core-image-sato core-image-sato-dev core-image-sato-sdk core-image-minimal core-image-minimal-dev core-image-sato:do_populate_sdk",
           "SANITYTARGETS" : "core-image-minimal:do_testimage core-image-sato:do_testimage core-image-sato-sdk:do_testimage core-image-sato:do_testsdk"
       },
       "step2" : {
           "SDKMACHINE" : "x86_64",
           "BBTARGETS" : "core-image-sato:do_populate_sdk core-image-minimal:do_populate_sdk_ext core-image-sato:do_populate_sdk_ext",
           "SANITYTARGETS" : "core-image-sato:do_testsdk core-image-minimal:do_testsdkext core-image-sato:do_testsdkext"
       },
       "step3" : {
           "BUILDHISTORY" : false,
           "EXTRACMDS" : ["${SCRIPTSDIR}/checkvnc; DISPLAY=:1 oe-selftest ${HELPERSTMACHTARGS} -j 15"],
           "ADDLAYER" : ["${BUILDDIR}/../meta-selftest"]
       }
   },

Combining these two entries, you can see that ``qemux86-64`` is a three-step
build where ``bitbake BBTARGETS`` is run, followed by ``bitbake SANITYTARGETS``,
for each step; all for ``MACHINE="qemux86-64"`` but with differing
:term:`SDKMACHINE` settings. In step 1, an extra variable is added to the
``auto.conf`` file to enable wic image generation.

While not every detail of this is covered here, you can see how the
template mechanism allows quite complex configurations to be built up
while keeping duplication and repetition to a minimum.
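
Conceptually, expanding a target entry amounts to merging its template with
the per-target overrides. The following is a minimal, hypothetical Python
sketch of that merge; it is not the Helper's actual implementation, and the
function name is illustrative only (it assumes a ``config.json`` with
``overrides`` and ``templates`` sections as shown above)::

   import json

   def expand_target(config, target):
       """Merge a target's template entry with its own overrides (illustrative only)."""
       entry = dict(config["overrides"][target])
       template = config["templates"].get(entry.pop("TEMPLATE", ""), {})
       merged = dict(template)
       for key, value in entry.items():
           if isinstance(value, dict) and isinstance(merged.get(key), dict):
               merged[key] = {**merged[key], **value}   # e.g. "step1" gains "extravars"
           else:
               merged[key] = value                      # e.g. "MACHINE" is added
       return merged

   with open("config.json") as f:
       config = json.load(f)
   print(json.dumps(expand_target(config, "qemux86-64"), indent=4))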

The different build targets are designed to allow for parallelization,
so different machines are usually built in parallel, while operations using
the same machine and metadata are run sequentially, with the aim of
optimizing build efficiency as much as possible.

The ``config.json`` file is processed by the scripts in the ``scripts``
directory of the Helper repository. The following section details
how this works.

Autobuilder Target Execution Overview
=====================================

For each given target in a build, the Autobuilder executes several
steps. These are configured in ``yocto-autobuilder2/builders.py`` and
roughly consist of the following (a simplified sketch follows the list):

#. *Run clobberdir*

   This cleans out any previous build. Old builds are left around to
   allow easier debugging of failed builds. For additional information,
   see :ref:`test-manual/understand-autobuilder:clobberdir`.

#. *Obtain yocto-autobuilder-helper*

   This step clones the :yocto_git:`yocto-autobuilder-helper </yocto-autobuilder-helper>`
   git repository. This is necessary to avoid the requirement to maintain all
   the release or project-specific code within Buildbot. The branch chosen
   matches the release being built so we can support older releases and
   still make changes in newer ones.

#. *Write layerinfo.json*

   This transfers data about how the build was configured in the Buildbot
   UI to the Helper scripts.

#. *Call scripts/shared-repo-unpack*

   This is a call into the Helper scripts to set up a checkout of all
   the pieces this build might need. It might clone the BitBake
   repository and the OpenEmbedded-Core repository. It may clone the
   Poky repository, as well as additional layers. It will use the data
   from the ``layerinfo.json`` file to help understand the
   configuration. It will also use a local cache of repositories to
   speed up the clone checkouts. For additional information, see
   :ref:`test-manual/understand-autobuilder:Autobuilder Clone Cache`.

   This step has two possible modes of operation. If the build is part
   of a parent build, it's possible that all the repositories needed may
   already be available, ready in a pre-prepared directory. An "a-quick"
   or "a-full" build would prepare this before starting the other
   sub-target builds. This is done for two reasons:

   -  the upstream may change during a build, for example from a forced
      push, and this ensures we have matching content for the whole build

   -  if 15 Workers all tried to pull the same data from the same
      repositories, we could hit resource limits on upstream servers, as
      they could think they were under some kind of network attack

   This pre-prepared directory is shared among the Workers over NFS. If
   the build is an individual build and there is no "shared" directory
   available, it would clone from the cache and the upstreams as
   necessary. This is considered the fallback mode.

#. *Call scripts/run-config*

   This is another call into the Helper scripts where it's expected that
   the main functionality of this target will be executed.
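
The step sequence above is wired together using Buildbot's Python API. The
following is a heavily simplified, hypothetical sketch of what such a builder
definition could look like; it is not the actual ``builders.py``, and the step
names, commands and paths are illustrative only::

   from buildbot.plugins import steps, util

   factory = util.BuildFactory()

   # 1. Clean out any previous build (illustrative path).
   factory.addStep(steps.ShellCommand(
       name="clobberdir",
       command=["clobberdir", "/home/pokybuild/build"]))

   # 2. Obtain the helper scripts for the branch being built.
   factory.addStep(steps.Git(
       name="fetch yocto-autobuilder-helper",
       repourl="https://git.yoctoproject.org/yocto-autobuilder-helper",
       workdir="yocto-autobuilder-helper"))

   # 4. and 5. Hand over to the Helper scripts for the real work
   # (the real steps pass arguments, omitted here).
   factory.addStep(steps.ShellCommand(
       name="shared-repo-unpack",
       command=["yocto-autobuilder-helper/scripts/shared-repo-unpack"]))
   factory.addStep(steps.ShellCommand(
       name="run-config",
       command=["yocto-autobuilder-helper/scripts/run-config"]))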

Autobuilder Technology
======================

The Autobuilder has Yocto Project-specific functionality to allow builds
to operate with increased efficiency and speed.

clobberdir
----------

When deleting files, the Autobuilder uses ``clobberdir``, which is a
special script that moves files to a special location, rather than
deleting them. Files in this location are then deleted by an ``rm`` command,
which is run under ``ionice -c 3``, meaning the deletion only
happens when there is idle IO capacity on the Worker. The Autobuilder
Worker Janitor runs this deletion. See
:ref:`test-manual/understand-autobuilder:Autobuilder Worker Janitor`.
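
A minimal sketch of the two halves of this mechanism is shown below; the
trash location, function names and timing are assumptions for illustration,
not the actual scripts::

   import shutil, subprocess, time, uuid
   from pathlib import Path

   TRASH = Path("/home/pokybuild/trash")   # assumed location for doomed files

   def clobberdir(path):
       """Move a build directory aside instantly; deletion happens later."""
       TRASH.mkdir(parents=True, exist_ok=True)
       shutil.move(str(path), str(TRASH / uuid.uuid4().hex))

   def janitor_delete_pass():
       """Delete trashed directories with idle-priority IO (ionice class 3)."""
       for victim in TRASH.iterdir():
           subprocess.run(["ionice", "-c", "3", "rm", "-rf", str(victim)], check=False)

   while True:                              # simplified janitor loop
       janitor_delete_pass()
       time.sleep(600)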

Autobuilder Clone Cache
-----------------------

Cloning repositories from scratch each time they are required was slow
on the Autobuilder. We therefore have a stash of commonly used
repositories pre-cloned on the Workers. Data is fetched from these
during clones first, then "topped up" with later revisions from any
upstream when necessary. The cache is maintained by the Autobuilder
Worker Janitor. See
:ref:`test-manual/understand-autobuilder:Autobuilder Worker Janitor`.
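
Conceptually, a cached clone amounts to cloning from the local stash and then
topping up from the real upstream. The following hypothetical sketch shows the
idea; the cache path and helper name are assumptions, not the Janitor's actual
code::

   import subprocess
   from pathlib import Path

   MIRROR_BASE = Path("/home/pokybuild/git-cache")   # assumed local cache location

   def cached_clone(name, upstream, destination):
       """Clone from the local cache first, then fetch newer revisions from upstream."""
       mirror = MIRROR_BASE / name
       if mirror.exists():
           subprocess.run(["git", "clone", str(mirror), destination], check=True)
           subprocess.run(["git", "-C", destination, "remote", "set-url", "origin", upstream],
                          check=True)
       else:
           subprocess.run(["git", "clone", upstream, destination], check=True)
       # "Top up" with anything the cache did not yet have.
       subprocess.run(["git", "-C", destination, "fetch", "origin"], check=True)

   cached_clone("poky", "https://git.yoctoproject.org/poky", "poky")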

Autobuilder Worker Janitor
--------------------------

This is a process running on each Worker that performs two basic
operations: background file deletion at IO idle (see
"Run clobberdir" in :ref:`test-manual/understand-autobuilder:Autobuilder Target Execution Overview`)
and maintenance of a cache of cloned repositories to improve the speed
at which the system can check out repositories.

Shared DL_DIR
-------------

The Workers are all connected over NFS, which allows :term:`DL_DIR` to be shared
between them. This reduces network accesses from the system and speeds up
builds. The way the build system uses this directory is designed to be
safe to share over NFS.

Shared SSTATE_DIR
-----------------

The Workers are all connected over NFS, which allows the ``sstate``
directory to be shared between them. This means that once a Worker has built
an artifact, all the others can benefit from it. The way the build system
uses this directory is designed to be safe to share over NFS.

Resulttool
----------

All of the different tests run as part of the build generate output into
``testresults.json`` files. This allows us to determine which tests ran
in a given build and their status. Additional information, such as
failure logs or the time taken to run the tests, may also be included.

Resulttool is part of OpenEmbedded-Core and is used to manipulate these
JSON results files. It has the ability to merge files together, display
reports of the test results and compare different result files.

For details, see :yocto_wiki:`/Resulttool`.
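
As a quick illustration of what these files contain, the following sketch
tallies test statuses in a single ``testresults.json``. It assumes a
simplified layout of one ``result`` dictionary of tests per result set, which
glosses over details of the real format; use ``resulttool`` itself for any
serious processing::

   import json
   from collections import Counter

   # Assumed simplified layout:
   # {"<result-id>": {"configuration": {...},
   #                  "result": {"<test>": {"status": "PASSED", ...}, ...}}}
   with open("testresults.json") as f:
       results = json.load(f)

   totals = Counter()
   for data in results.values():
       for outcome in data.get("result", {}).values():
           totals[outcome.get("status", "UNKNOWN")] += 1

   print(dict(totals))    # e.g. {'PASSED': 1234, 'SKIPPED': 12, 'FAILED': 1}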

run-config Target Execution
===========================

The ``scripts/run-config`` execution is where most of the work within
the Autobuilder happens. It runs through a number of steps; the first
are general setup steps that are run once and include:

#. Set up any :term:`buildtools` tarball if configured.

#. Call ``buildhistory-init`` if :ref:`ref-classes-buildhistory` is configured.

For each step that is configured in ``config.json``, it will perform the
following (a simplified sketch of this loop follows the list):

#. Add any layers that are specified using the
   ``bitbake-layers add-layer`` command (logging as stepXa)

#. Call the ``scripts/setup-config`` script to generate the necessary
   ``auto.conf`` configuration file for the build

#. Run the ``bitbake BBTARGETS`` command (logging as stepXb)

#. Run the ``bitbake SANITYTARGETS`` command (logging as stepXc)

#. Run the ``EXTRACMDS`` command(s), which are run within the BitBake build
   environment (logging as stepXd)

#. Run the ``EXTRAPLAINCMDS`` command(s), which are run outside the
   BitBake build environment (logging as stepXd)

#. Remove any layers added in the first step above using the
   ``bitbake-layers remove-layer`` command (logging as stepXa)
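
The per-step loop can be pictured roughly as below. This is a hypothetical
Python rendering of the sequence, not the actual ``run-config`` code; the
helper function names are made up, and the wrapping needed to run commands
inside the BitBake environment is omitted::

   import subprocess

   def run_logged(cmd, logname):
       """Run a shell command, capturing its output under a stepXy-style log name."""
       with open(logname + ".log", "w") as log:
           subprocess.run(cmd, shell=True, stdout=log, stderr=subprocess.STDOUT, check=True)

   def run_step(number, step):
       # (a) add any extra layers for this step
       for layer in step.get("ADDLAYER", []):
           run_logged("bitbake-layers add-layer " + layer, "step%sa" % number)
       # generate auto.conf for this step (the real script takes arguments, omitted here)
       run_logged("scripts/setup-config", "step%sa" % number)
       # (b) and (c): main build targets, then sanity/test targets
       if step.get("BBTARGETS"):
           run_logged("bitbake " + step["BBTARGETS"], "step%sb" % number)
       if step.get("SANITYTARGETS"):
           run_logged("bitbake " + step["SANITYTARGETS"], "step%sc" % number)
       # (d): extra commands, inside and outside the BitBake environment
       for cmd in step.get("EXTRACMDS", []):
           run_logged(cmd, "step%sd" % number)
       for cmd in step.get("EXTRAPLAINCMDS", []):
           run_logged(cmd, "step%sd" % number)
       # finally, remove the layers added at the start
       for layer in step.get("ADDLAYER", []):
           run_logged("bitbake-layers remove-layer " + layer, "step%sa" % number)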

Once the execution steps above complete, ``run-config`` executes a set
of post-build steps, including:

#. Call ``scripts/publish-artifacts`` to collect any output which is to
   be saved from the build.

#. Call ``scripts/collect-results`` to collect any test results to be
   saved from the build.

#. Call ``scripts/upload-error-reports`` to send any error reports
   generated to the remote server.

#. Clean up the :term:`Build Directory` using
   :ref:`test-manual/understand-autobuilder:clobberdir` if the build was successful,
   else rename it to "build-renamed" for potential future debugging (see
   the sketch below).
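
The final step amounts to something like the following sketch; the trash
location and directory names are assumptions for illustration, not the
Helper's actual code::

   import os, shutil

   TRASH = "/home/pokybuild/trash"                  # assumed, as in the clobberdir sketch

   def finish_build(builddir, success):
       """Either move the build aside for cheap background deletion, or keep it for debugging."""
       if success:
           shutil.move(builddir, os.path.join(TRASH, os.path.basename(builddir)))
       else:
           os.rename(builddir, builddir + "-renamed")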

Deploying Yocto Autobuilder
===========================

The most up-to-date information about how to set up and deploy your own
Autobuilder can be found in the :yocto_git:`README.md </yocto-autobuilder2/tree/README.md>`
in the :yocto_git:`yocto-autobuilder2 </yocto-autobuilder2>` repository.

We hope that people can use the :yocto_git:`yocto-autobuilder2 </yocto-autobuilder2>`
code directly, but it is inevitable that users will end up needing to heavily
customize the :yocto_git:`yocto-autobuilder-helper </yocto-autobuilder-helper>`
repository, particularly the ``config.json`` file, as they will want to define
their own test matrix.

The Autobuilder supports two customization options:

-  variable substitution

-  overlaying configuration files

The standard ``config.json`` minimally attempts to allow substitution of
the paths. The Helper script repository includes a
``local-example.json`` file to show how you could override these from a
separate configuration file. Pass the following into the environment of
the Autobuilder::

   $ ABHELPER_JSON="config.json local-example.json"

As another example, you could also pass the following into the
environment::

   $ ABHELPER_JSON="config.json /some/location/local.json"

One issue users often run into is validation of the ``config.json`` files. A
tip for minimizing issues from invalid JSON files is to use a Git
``pre-commit-hook.sh`` script to verify the JSON file before committing
it. Create a symbolic link as follows::

   $ ln -s ../../scripts/pre-commit-hook.sh .git/hooks/pre-commit
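
If you prefer to check the files outside of Git, a validation pass can be as
small as the following sketch, which simply attempts to parse each file named
on the command line and reports the first syntax error (the script name is an
example)::

   import json, sys

   # Example invocation: python3 check-json.py config.json local-example.json
   for path in sys.argv[1:]:
       try:
           with open(path) as f:
               json.load(f)
           print(path, "OK")
       except (OSError, json.JSONDecodeError) as error:
           print(path, "INVALID:", error)
           sys.exit(1)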