.. SPDX-License-Identifier: CC-BY-SA-2.0-UK

*****************************************
The Yocto Project Test Environment Manual
*****************************************

Welcome
=======

Welcome to the Yocto Project Test Environment Manual! This manual is a
work in progress. The manual contains information about the testing
environment used by the Yocto Project to make sure each major and minor
release works as intended. All the project's testing infrastructure and
processes are publicly visible and available so that the community can
see what testing is being performed, how it's being done and the current
status of the tests and the project at any given time. It is intended
that other organizations can leverage off the process and testing
environment used by the Yocto Project to create their own automated,
production test environment, building upon the foundations from the
project core.

Currently, the Yocto Project Test Environment Manual has no projected
release date. This manual is a work-in-progress and is being initially
loaded with information from the README files and notes from key
engineers:

-  *yocto-autobuilder2:* This
   :yocto_git:`README.md </yocto-autobuilder2/tree/README.md>`
   is the main README which details how to set up the Yocto Project
   Autobuilder. The ``yocto-autobuilder2`` repository represents the
   Yocto Project's console UI plugin to Buildbot and the configuration
   necessary to configure Buildbot to perform the testing the project
   requires.

-  *yocto-autobuilder-helper:* This :yocto_git:`README </yocto-autobuilder-helper/tree/README/>`
   and repository contains Yocto Project Autobuilder Helper scripts and
   configuration. The ``yocto-autobuilder-helper`` repository contains
   the "glue" logic that defines which tests to run and how to run them.
   As a result, it can be used by any Continuous Integration (CI) system
   to run builds, support getting the correct code revisions, configure
   builds and layers, run builds, and collect results. The code is
   independent of any CI system, which means the code can work with
   `Buildbot <https://docs.buildbot.net/0.9.15.post1/>`__,
   Jenkins, or others. This repository has a branch per release of the
   project, defining the tests to run on a per-release basis.

Yocto Project Autobuilder Overview
==================================

The Yocto Project Autobuilder collectively refers to the software,
tools, scripts, and procedures used by the Yocto Project to test
released software across supported hardware in an automated and regular
fashion. Basically, during the development of a Yocto Project release,
the Autobuilder tests if things work. The Autobuilder builds all test
targets and runs all the tests.

The Yocto Project now uses standard upstream
`Buildbot <https://docs.buildbot.net/0.9.15.post1/>`__ (version 9) to
drive its integration and testing. Buildbot Nine has a plug-in interface
that the Yocto Project customizes using code from the
``yocto-autobuilder2`` repository, adding its own console UI plugin. The
resulting UI plug-in allows you to visualize builds in a way suited to
the project's needs.

A ``helper`` layer provides configuration and job management through
scripts found in the ``yocto-autobuilder-helper`` repository. The
``helper`` layer contains the bulk of the build configuration
information and is release-specific, which makes it highly customizable
on a per-project basis. The layer is CI system-agnostic and contains a
number of Helper scripts that can generate build configurations from
simple JSON files.

.. note::

   The project uses Buildbot for historical reasons but also because
   many of the project developers have knowledge of Python. It is
   possible to use the outer layers from another Continuous Integration
   (CI) system such as
   `Jenkins <https://en.wikipedia.org/wiki/Jenkins_(software)>`__
   instead of Buildbot.

The following figure shows the Yocto Project Autobuilder stack with a
topology that includes a controller and a cluster of workers:

.. image:: figures/ab-test-cluster.png
   :align: center
   :width: 70%

Yocto Project Tests --- Types of Testing Overview
=================================================

The Autobuilder tests different elements of the project by using
the following types of tests:

-  *Build Testing:* Tests whether specific configurations build by
   varying :term:`MACHINE`,
   :term:`DISTRO`, other configuration
   options, and the specific target images being built (or world). Used
   to trigger builds of all the different test configurations on the
   Autobuilder. Builds usually cover many different targets for
   different architectures, machines, and distributions, as well as
   different configurations, such as different init systems. The
   Autobuilder tests literally hundreds of configurations and targets.

-  *Sanity Checks During the Build Process:* Tests initiated through
   the :ref:`insane <ref-classes-insane>`
   class. These checks ensure the output of the builds are correct.
   For example, does the ELF architecture in the generated binaries
   match the target system? ARM binaries would not work in a MIPS
   system!

-  *Build Performance Testing:* Tests whether or not commonly used steps
   during builds work efficiently and avoid regressions. Tests to time
   commonly used usage scenarios are run through ``oe-build-perf-test``.
   These tests are run on isolated machines so that the time
   measurements of the tests are accurate and no other processes
   interfere with the timing results. The project currently tests
   performance on two different distributions, Fedora and Ubuntu, to
   ensure there is no single point of failure and that the different
   distros work effectively.

-  *eSDK Testing:* Image tests initiated through the following command::

      $ bitbake image -c testsdkext

   The tests utilize the :ref:`testsdkext <ref-classes-testsdk>` class
   and the ``do_testsdkext`` task.

-  *Feature Testing:* Various scenario-based tests are run through the
   :ref:`OpenEmbedded Self test (oe-selftest) <ref-manual/release-process:Testing and Quality Assurance>`.
   We test oe-selftest on each of the main distributions we support.

-  *Image Testing:* Image tests initiated through the following command::

      $ bitbake image -c testimage

   The tests utilize the :ref:`testimage* <ref-classes-testimage*>`
   classes and the :ref:`ref-tasks-testimage` task.

-  *Layer Testing:* The Autobuilder has the possibility to test whether
   specific layers work with the rest of the system. The layers tested
   may be selected by members of the project. Some key community layers
   are also tested periodically.

-  *Package Testing:* A Package Test (ptest) runs tests against packages
   built by the OpenEmbedded build system on the target machine. See the
   :ref:`Testing Packages With
   ptest <dev-manual/common-tasks:Testing Packages With ptest>` section
   in the Yocto Project Development Tasks Manual and the
   ":yocto_wiki:`Ptest </Ptest>`" Wiki page for more
   information on Ptest.

-  *SDK Testing:* Image tests initiated through the following command::

      $ bitbake image -c testsdk

   The tests utilize the :ref:`testsdk <ref-classes-testsdk>` class and
   the ``do_testsdk`` task.

-  *Unit Testing:* Unit tests on various components of the system run
   through :ref:`bitbake-selftest <ref-manual/release-process:Testing and Quality Assurance>` and
   :ref:`oe-selftest <ref-manual/release-process:Testing and Quality Assurance>`.

-  *Automatic Upgrade Helper:* This target tests whether new versions of
   software are available and whether we can automatically upgrade to
   those new versions. If so, this target emails the maintainers with a
   patch to let them know this is possible.
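
The sanity check described above for ELF architectures can be sketched
in miniature. The following is *not* the ``insane`` class
implementation, only a self-contained illustration of the idea: read the
``e_machine`` field from an ELF header and compare it against the
architecture the build targeted (the helper name and machine table are
assumptions for illustration).

```python
# Illustration only -- not the insane class code. It shows the idea
# behind the ELF architecture sanity check: read the e_machine field
# of an ELF header and verify it matches the build's target.
import struct

# A few e_machine values from the ELF specification.
ELF_MACHINES = {0x03: "x86", 0x08: "mips", 0x28: "arm",
                0x3E: "x86-64", 0xB7: "aarch64"}

def elf_machine(header: bytes) -> str:
    """Return the architecture name encoded in a little-endian ELF header."""
    if header[:4] != b"\x7fELF":
        raise ValueError("not an ELF file")
    # e_machine is a 16-bit field at offset 18 of the ELF header.
    (machine,) = struct.unpack_from("<H", header, 18)
    return ELF_MACHINES.get(machine, "unknown")

# Minimal fake little-endian ELF header for an ARM binary:
# 16-byte e_ident, then e_type (2 = executable) and e_machine (0x28 = ARM).
fake_arm_header = b"\x7fELF" + bytes([2, 1, 1]) + bytes(9) + struct.pack("<HH", 2, 0x28)

assert elf_machine(fake_arm_header) == "arm"
# An ARM binary inside a MIPS image would fail a check like this one.
assert elf_machine(fake_arm_header) != "mips"
```

The real check in the ``insane`` class inspects every generated binary
during packaging, but the principle is the same comparison.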

How Tests Map to Areas of Code
==============================

Tests map into the codebase as follows:

-  *bitbake-selftest:*

   These tests are self-contained and test BitBake as well as its APIs,
   which include the fetchers. The tests are located in
   ``bitbake/lib/*/tests``.

   Some of these tests run the ``bitbake`` command, so ``bitbake/bin``
   must be added to the ``PATH`` before running ``bitbake-selftest``.
   From within the BitBake repository, run the following::

      $ export PATH=$PWD/bin:$PATH

   After that, you can run the selftest script::

      $ bitbake-selftest

   The default output is quiet and just prints a summary of what was
   run. To see more information, there is a verbose option::

      $ bitbake-selftest -v

   To skip tests that access the Internet, use the ``BB_SKIP_NETTESTS``
   variable when running ``bitbake-selftest`` as follows::

      $ BB_SKIP_NETTESTS=yes bitbake-selftest

   Use this option when you wish to skip tests that access the network,
   which are mostly necessary to test the fetcher modules. To specify
   individual test modules to run, append the test module name to the
   ``bitbake-selftest`` command. For example, to specify the tests for
   the ``bb.tests.data`` module, run::

      $ bitbake-selftest bb.tests.data

   You can also specify individual tests by defining the full name and
   module plus the class path of the test, for example::

      $ bitbake-selftest bb.tests.data.TestOverrides.test_one_override

   The tests are based on `Python
   unittest <https://docs.python.org/3/library/unittest.html>`__.

-  *oe-selftest:*

   -  These tests use OE to test the workflows, which include testing
      specific features, behaviors of tasks, and API unit tests.

   -  The tests can take advantage of parallelism through the ``-j``
      option, which can specify a number of threads to spread the tests
      across. Note that all tests from a given class of tests will run
      in the same thread. To parallelize large numbers of tests you can
      split the class into multiple units.

   -  The tests are based on Python unittest.

   -  The code for the tests resides in
      ``meta/lib/oeqa/selftest/cases/``.

   -  To run all the tests, enter the following command::

         $ oe-selftest -a

   -  To run a specific test, use the following command form where
      ``testname`` is the name of the specific test::

         $ oe-selftest -r <testname>

      For example, the following command would run the tinfoil
      getVar API test::

         $ oe-selftest -r tinfoil.TinfoilTests.test_getvar

      It is also possible to run a set of tests. For example, the
      following command will run all of the tinfoil tests::

         $ oe-selftest -r tinfoil

-  *testimage:*

   -  These tests build an image, boot it, and run tests against the
      image's content.

   -  The code for these tests resides in ``meta/lib/oeqa/runtime/cases/``.

   -  You need to set the :term:`IMAGE_CLASSES` variable as follows::

         IMAGE_CLASSES += "testimage"

   -  Run the tests using the following command form::

         $ bitbake image -c testimage

-  *testsdk:*

   -  These tests build an SDK, install it, and then run tests against
      that SDK.

   -  The code for these tests resides in ``meta/lib/oeqa/sdk/cases/``.

   -  Run the test using the following command form::

         $ bitbake image -c testsdk

-  *testsdk_ext:*

   -  These tests build an extended SDK (eSDK), install that eSDK, and
      run tests against the eSDK.

   -  The code for these tests resides in ``meta/lib/oeqa/esdk``.

   -  To run the tests, use the following command form::

         $ bitbake image -c testsdkext

-  *oe-build-perf-test:*

   -  These tests run through commonly used usage scenarios and measure
      the performance times.

   -  The code for these tests resides in ``meta/lib/oeqa/buildperf``.

   -  To run the tests, use the following command form::

         $ oe-build-perf-test <options>

      The command takes a number of options,
      such as where to place the test results. The Autobuilder Helper
      Scripts include the ``build-perf-test-wrapper`` script with
      examples of how to use the oe-build-perf-test from the command
      line.

      Use the ``oe-git-archive`` command to store test results into a
      Git repository.

      Use the ``oe-build-perf-report`` command to generate text reports
      and HTML reports with graphs of the performance data. For
      examples, see
      :yocto_dl:`/releases/yocto/yocto-2.7/testresults/buildperf-centos7/perf-centos7.yoctoproject.org_warrior_20190414204758_0e39202.html`
      and
      :yocto_dl:`/releases/yocto/yocto-2.7/testresults/buildperf-centos7/perf-centos7.yoctoproject.org_warrior_20190414204758_0e39202.txt`.

   -  The tests are contained in ``lib/oeqa/buildperf/test_basic.py``.

Test Examples
=============

This section provides example tests for each of the tests listed in the
:ref:`test-manual/intro:How Tests Map to Areas of Code` section.

For oeqa tests, testcases for each area reside in the main test
directory at ``meta/lib/oeqa/selftest/cases``. For ``bitbake-selftest``,
testcases reside in the ``lib/bb/tests/`` directory.

``bitbake-selftest``
--------------------

A simple test example from ``lib/bb/tests/data.py`` is::

   class DataExpansions(unittest.TestCase):
       def setUp(self):
           self.d = bb.data.init()
           self.d["foo"] = "value_of_foo"
           self.d["bar"] = "value_of_bar"
           self.d["value_of_foo"] = "value_of_'value_of_foo'"

       def test_one_var(self):
           val = self.d.expand("${foo}")
           self.assertEqual(str(val), "value_of_foo")

In this example, a ``DataExpansions`` class of tests is created,
derived from standard Python unittest. The class has a common ``setUp``
function which is shared by all the tests in the class. A simple test is
then added to test that when a variable is expanded, the correct value
is found.

BitBake selftests are straightforward Python unittest. Refer to the
Python unittest documentation for additional information on writing
these tests at: https://docs.python.org/3/library/unittest.html.
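
Because these tests are plain ``unittest``, the same structure can be
exercised without BitBake at all. The following self-contained sketch
mirrors the shared-``setUp`` pattern above; a plain dictionary and
``string.Template`` stand in for ``bb.data.init()`` and the datastore's
``expand()`` (both stand-ins are assumptions for illustration only).

```python
import string
import unittest

class DictDataExpansions(unittest.TestCase):
    """Same shape as the BitBake example, but self-contained."""

    def setUp(self):
        # setUp() runs before each test method in the class.
        self.d = {"foo": "value_of_foo", "bar": "value_of_bar"}

    def expand(self, s):
        # ${var} substitution, loosely mimicking datastore expansion.
        return string.Template(s).substitute(self.d)

    def test_one_var(self):
        self.assertEqual(self.expand("${foo}"), "value_of_foo")
```

Saved to a file, this runs directly with ``python3 -m unittest``; the
real BitBake test differs only in using the actual datastore.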

``oe-selftest``
---------------

These tests are more complex due to the setup required behind the scenes
for full builds. Rather than directly using Python's unittest, the code
wraps most of the standard objects. The tests can be simple, such as
testing a command from within the OE build environment using the
following example::

   class BitbakeLayers(OESelftestTestCase):
       def test_bitbakelayers_showcrossdepends(self):
           result = runCmd('bitbake-layers show-cross-depends')
           self.assertTrue('aspell' in result.output,
                           msg="No dependencies were shown. bitbake-layers show-cross-depends output: %s" % result.output)

This example, taken from ``meta/lib/oeqa/selftest/cases/bblayers.py``,
creates a testcase from the ``OESelftestTestCase`` class, derived
from ``unittest.TestCase``, which runs the ``bitbake-layers`` command
and checks the output to ensure it contains something we know should be
here.

The ``oeqa.utils.commands`` module contains Helpers which can assist
with common tasks, including:

-  *Obtaining the value of a bitbake variable:* Use
   ``oeqa.utils.commands.get_bb_var()`` or use
   ``oeqa.utils.commands.get_bb_vars()`` for more than one variable

-  *Running a bitbake invocation for a build:* Use
   ``oeqa.utils.commands.bitbake()``

-  *Running a command:* Use ``oeqa.utils.commands.runCmd()``

There is also a ``oeqa.utils.commands.runqemu()`` function for launching
the ``runqemu`` command for testing things within a running, virtualized
image.

You can run these tests in parallel. Parallelism works per test class,
so tests within a given test class should always run in the same build,
while tests in different classes or modules may be split into different
builds. There is no data store available for these tests since the tests
launch the ``bitbake`` command and exist outside of its context. As a
result, common bitbake library functions (bb.\*) are also unavailable.
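
As a rough, self-contained sketch of what a command helper like
``runCmd()`` does (this is *not* the ``oeqa`` implementation, only an
illustration built on the standard library; the ``run_cmd`` name and
``Result`` tuple are invented for the example), a test launches a
process outside BitBake's context, captures its output, and asserts on
the result:

```python
import subprocess
from collections import namedtuple

# Minimal stand-in for oeqa.utils.commands.runCmd(); illustration only.
Result = namedtuple("Result", "status output")

def run_cmd(cmd):
    """Run a shell command, returning its exit status and combined output."""
    proc = subprocess.run(cmd, shell=True, text=True,
                          stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    return Result(proc.returncode, proc.stdout)

result = run_cmd("echo hello")
assert result.status == 0
assert "hello" in result.output
```

This is also why no datastore is available to such tests: the command
runs in a separate process, exactly as ``bitbake`` does when launched by
``oe-selftest``.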

``testimage``
-------------

These tests are run once an image is up and running, either on target
hardware or under QEMU. As a result, they are assumed to be running in a
target image environment, as opposed to a host build environment. A
simple example from ``meta/lib/oeqa/runtime/cases/python.py`` contains
the following::

   class PythonTest(OERuntimeTestCase):
       @OETestDepends(['ssh.SSHTest.test_ssh'])
       @OEHasPackage(['python3-core'])
       def test_python3(self):
           cmd = "python3 -c \"import codecs; print(codecs.encode('Uryyb, jbeyq', 'rot13'))\""
           status, output = self.target.run(cmd)
           msg = 'Exit status was not 0. Output: %s' % output
           self.assertEqual(status, 0, msg=msg)

In this example, the ``OERuntimeTestCase`` class wraps
``unittest.TestCase``. Within the test, ``self.target`` represents the
target system, where commands can be run on it using the ``run()``
method.

To ensure certain test or package dependencies are met, you can use the
``OETestDepends`` and ``OEHasPackage`` decorators. For example, the test
in this example would only make sense if ``python3-core`` is installed
in the image.
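
The effect of ``OEHasPackage`` is similar in spirit to a conditional
skip in plain ``unittest``: the test is skipped, not failed, when its
dependency is absent. The following self-contained analogy (this is
plain ``unittest``, not the ``oeqa`` decorator machinery; the class and
test names are invented) checks for a host ``python3`` instead of a
target package:

```python
import shutil
import subprocess
import unittest

class PythonToolTest(unittest.TestCase):
    # Loosely analogous to @OEHasPackage(['python3-core']): skip,
    # rather than fail, when the dependency is not present.
    @unittest.skipUnless(shutil.which("python3"), "python3 not installed")
    def test_python3_runs(self):
        proc = subprocess.run(["python3", "-c", "print(2 + 2)"],
                              capture_output=True, text=True)
        self.assertEqual(proc.returncode, 0)
        self.assertEqual(proc.stdout.strip(), "4")
```

``OETestDepends`` adds ordering on top of this idea: a test only runs
after the tests it depends on (such as the SSH connectivity test) have
passed.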

``testsdk_ext``
---------------

These tests are run against built extensible SDKs (eSDKs). The tests can
assume that the eSDK environment has already been set up. An example from
``meta/lib/oeqa/sdk/cases/devtool.py`` contains the following::

   class DevtoolTest(OESDKExtTestCase):
       @classmethod
       def setUpClass(cls):
           myapp_src = os.path.join(cls.tc.esdk_files_dir, "myapp")
           cls.myapp_dst = os.path.join(cls.tc.sdk_dir, "myapp")
           shutil.copytree(myapp_src, cls.myapp_dst)
           subprocess.check_output(['git', 'init', '.'], cwd=cls.myapp_dst)
           subprocess.check_output(['git', 'add', '.'], cwd=cls.myapp_dst)
           subprocess.check_output(['git', 'commit', '-m', "'test commit'"], cwd=cls.myapp_dst)

       @classmethod
       def tearDownClass(cls):
           shutil.rmtree(cls.myapp_dst)

       def _test_devtool_build(self, directory):
           self._run('devtool add myapp %s' % directory)
           try:
               self._run('devtool build myapp')
           finally:
               self._run('devtool reset myapp')

       def test_devtool_build_make(self):
           self._test_devtool_build(self.myapp_dst)

In this example, the ``devtool``
command is tested to see whether a sample application can be built with
the ``devtool build`` command within the eSDK.

``testsdk``
-----------

These tests are run against built SDKs. The tests can assume that an SDK
has already been extracted and its environment file has been sourced. A
simple example from ``meta/lib/oeqa/sdk/cases/python2.py`` contains the
following::

   class Python3Test(OESDKTestCase):
       def setUp(self):
           if not (self.tc.hasHostPackage("nativesdk-python3-core") or
                   self.tc.hasHostPackage("python3-core-native")):
               raise unittest.SkipTest("No python3 package in the SDK")

       def test_python3(self):
           cmd = "python3 -c \"import codecs; print(codecs.encode('Uryyb, jbeyq', 'rot13'))\""
           output = self._run(cmd)
           self.assertEqual(output, "Hello, world\n")

In this example, if ``nativesdk-python3-core`` has been installed into
the SDK, the code runs the python3 interpreter with a basic command to
check it is working correctly. The test would only run if Python3 is
installed in the SDK.

``oe-build-perf-test``
----------------------

The performance tests usually measure how long operations take and the
resource utilization as that happens. An example from
``meta/lib/oeqa/buildperf/test_basic.py`` contains the following::

   class Test3(BuildPerfTestCase):
       def test3(self):
           """Bitbake parsing (bitbake -p)"""
           # Drop all caches and parse
           self.rm_cache()
           oe.path.remove(os.path.join(self.bb_vars['TMPDIR'], 'cache'), True)
           self.measure_cmd_resources(['bitbake', '-p'], 'parse_1',
                                      'bitbake -p (no caches)')
           # Drop tmp/cache
           oe.path.remove(os.path.join(self.bb_vars['TMPDIR'], 'cache'), True)
           self.measure_cmd_resources(['bitbake', '-p'], 'parse_2',
                                      'bitbake -p (no tmp/cache)')
           # Parse with fully cached data
           self.measure_cmd_resources(['bitbake', '-p'], 'parse_3',
                                      'bitbake -p (cached)')

This example shows how three specific parsing timings are
measured, with and without various caches, to show how BitBake's parsing
performance trends over time.

Considerations When Writing Tests
=================================

When writing good tests, there are several things to keep in mind. Since
resources on the Autobuilder are accessed concurrently by multiple
workers, consider the following:

**Running "cleanall" is not permitted.**

This can delete files from ``DL_DIR`` which would potentially break
other builds running in parallel. If this is required, ``DL_DIR`` must
be set to an isolated directory.

**Running "cleansstate" is not permitted.**

This can delete files from ``SSTATE_DIR`` which would potentially break
other builds running in parallel. If this is required, ``SSTATE_DIR``
must be set to an isolated directory. Alternatively, you can use the
``-f`` option with the ``bitbake`` command to "taint" tasks by changing
the sstate checksums to ensure sstate cache items will not be reused.

**Tests should not change the metadata.**

This is particularly true for oe-selftests since these can run in
parallel and changing metadata leads to changing checksums, which
confuses BitBake while running in parallel. If this is necessary, copy
layers to a temporary location and modify them. Some tests need to
change metadata, such as the devtool tests. To protect the metadata from
changes, set up temporary copies of that data first.
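
For example, a test that genuinely needs ``cleanall`` or ``cleansstate``
could point both variables at throwaway locations in its build
configuration, so the shared downloads and sstate caches are never
touched (the paths below are hypothetical)::

   DL_DIR = "${TOPDIR}/test-downloads"
   SSTATE_DIR = "${TOPDIR}/test-sstate"

Because each build's ``TOPDIR`` is unique, anything deleted under these
directories affects only that build.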