This specification describes DebTest -- a framework with conventions and tools
that allow Debian to distribute test batteries developed by upstream or Debian
developers. DebTest aims to enable developers and users to perform extensive
testing of a deployed Debian system, or of a particular piece of software of
interest, in a uniform fashion.
== Rationale ==
Ideally, software packaged for Debian comes with an exhaustive test suite that
can be used to determine whether this particular software works as expected on
the Debian platform. However, especially for complex software, these test
suites are often resource-hungry (CPU time, memory, disk space, network
bandwidth) and cannot be run at package build time by buildds. Consequently,
test suites are typically utilized manually and only by the respective packager
on a particular machine, before uploading a new version to the archive.

However, Debian is an integrated system and packaged software typically relies
on functionality provided by other Debian packages (e.g. shared libraries)
instead of shipping duplicates with different versions in every package -- for
many good reasons. Unfortunately, there is also a downside to this: Debian
packages often use versions of 3rd-party tools that are different from those
tested by upstream, and moreover, the actual versions of dependencies might
change frequently between subsequent uploads of a dependent package. Currently
a change in a dependency that introduces an incompatibility cannot be detected
reliably even if upstream provides a test suite that would have caught the
breakage. Therefore, integration testing relies heavily on users detecting
incorrect functioning and filing bug reports. Although there are archive-wide
QA efforts (e.g. constantly rebuilding all packages), these tests can only
detect API/ABI breakage or breakage of functionality exercised by build-time
checks -- they are not exhaustive, for the aforementioned reasons.
This is a proposal to, first of all, package upstream test suites in a way that
they can be used to run expensive archive-wide QA tests. However, this is also
a proposal to establish means to test interactions between software from
multiple Debian packages, providing more thorough continued integration and
regression testing for Debian systems.
== Use Cases ==
These use cases cover both upstream test suites -- those written by the authors
of the software itself -- and distribution test suites provided by Debian
developers (see above).
 * Sylvestre maintains a core computational library in Debian.
   A new version (or other modification) of this library promises performance
   advantages. Using DebTest he could not only verify the absence of
   regressions but also obtain a direct performance comparison against the
   previous version across a range of applications.

* Joerg maintains a repository of backports of Debian packages to be
   installed in a stable environment. He wants to ensure that
   backporting the packages has not caused a deviation in their behavior.
 * Linus is an upstream developer. He just loves the fact that he can tell any
   of his Debian-based users to just 'apt-get install' something and send him
   the output of a debtest command, whenever they claim that his software
   doesn't work properly. It pleases him to see his carefully developed test
   suite made conveniently accessible to users.
* Finally, Lucas has access to a powerful computing facility and
likes to run all kinds of tests on all packages in the Debian archive.
== Design ==
=== Core components ===
* Organization of the framework
==== Packaged tests ====
 * Metainformation:
  * duration: ....
  * resources:
  * suites:
 * Debug symbols: ....
  * do not strip symbols from test binary
 * Packages that register tests might provide a virtual package
   'test-<packagename>' to allow easy test discovery and retrieval via
   debtest tools.

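As an illustration, the metainformation above could be declared in an
RFC822-style stanza, as is customary for Debian control files, and read back by
the debtest tools. The stanza format and concrete field values below are
assumptions made for the sake of the sketch; the actual format is left open by
this spec:

```python
# Hypothetical test-metadata stanza: the fields mirror the metainformation
# bullets above (duration, resources, suites); the RFC822-style layout is
# an assumption modeled on Debian control files, not mandated here.
from email.parser import Parser

STANZA = """\
Test-Package: test-mypackage
Duration: 45m
Resources: cpu=4, memory=2048M, disk=10G, network=yes
Suites: unit, integration
"""

def parse_test_metadata(text):
    """Parse one RFC822-style test stanza into a dict (illustrative only)."""
    msg = Parser().parsestr(text)
    meta = dict(msg.items())
    # Split the comma-separated fields for easier programmatic matching.
    meta["Resources"] = dict(
        item.strip().split("=", 1) for item in meta["Resources"].split(",")
    )
    meta["Suites"] = [s.strip() for s in meta["Suites"].split(",")]
    return meta

meta = parse_test_metadata(STANZA)
print(meta["Duration"])          # prints: 45m
print(meta["Resources"]["cpu"])  # prints: 4
```

Reusing the control-file format would let debtest parse test metadata with the
same tooling Debian already uses for package metadata.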
==== debtest tools ====
 * Invocation:
  * single package tests
  * all (with -f to force even if resources are not sufficient)
  * tests of dependent packages (discovered via rdepends,
    "rrecommends" and "rsuggests")
  * given specific resource demands, run just
    the ones matching those
 * Customization/Output:
  * plugins:
   * job resource requirement adjustments
    . manual customization
    . request from dashboard for the system (or the like)
   * executors
    . local execution (monitor resources)
    . submit to cluster/cloud
   * output/reports
    . some structured output
    . interfaces to dashboards
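The invocation mode above that runs only the tests matching given resource
demands could work along these lines. This is a minimal sketch; the field
names, size units, and data layout are illustrative assumptions, not part of
the spec:

```python
# Sketch of resource-based test selection: given the resource demands
# declared by each packaged test and the resources actually available,
# select only the tests that fit (cf. the invocation bullets above).
# All field names and units here are made up for illustration.

SIZE_UNITS = {"M": 1, "G": 1024}  # megabytes per unit

def to_mb(size):
    """Convert a size string like '2048M' or '10G' to megabytes."""
    return int(size[:-1]) * SIZE_UNITS[size[-1]]

def matching_tests(tests, available, force=False):
    """Select tests whose demands fit `available`; -f would set force=True."""
    if force:
        return list(tests)
    return [
        t for t in tests
        if t["cpus"] <= available["cpus"]
        and to_mb(t["memory"]) <= to_mb(available["memory"])
        and to_mb(t["disk"]) <= to_mb(available["disk"])
    ]

tests = [
    {"name": "test-small", "cpus": 1, "memory": "512M", "disk": "1G"},
    {"name": "test-huge", "cpus": 16, "memory": "64G", "disk": "500G"},
]
available = {"cpus": 4, "memory": "8G", "disk": "100G"}

print([t["name"] for t in matching_tests(tests, available)])
# prints: ['test-small']
print(len(matching_tests(tests, available, force=True)))  # prints: 2
```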
==== Maintainer helpers ====
 * assess resources/performance:
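Such a helper could, for example, run the test suite once and report the
observed duration and peak memory, so that the maintainer can fill in the
metainformation fields. A rough Unix-only sketch; the `assess` function and
its report keys are hypothetical:

```python
# Illustrative maintainer helper: run a test command once and measure its
# wall-clock duration and the peak RSS of child processes.
# Unix-only: resource.getrusage(RUSAGE_CHILDREN) reports ru_maxrss in
# kilobytes on Linux.
import resource
import subprocess
import sys
import time

def assess(command):
    """Run `command` (an argv list) and report its resource usage."""
    start = time.monotonic()
    proc = subprocess.run(command)
    duration = time.monotonic() - start
    usage = resource.getrusage(resource.RUSAGE_CHILDREN)
    return {
        "exit": proc.returncode,
        "duration_s": round(duration, 2),
        "peak_rss_kb": usage.ru_maxrss,
    }

# Example: "run" a trivial stand-in for a test suite.
report = assess([sys.executable, "-c", "print('1 test passed')"])
print(report["exit"])  # prints: 0
```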
=== Supplementary infrastructure ===
==== Dashboard server ====
=== Implementation Plan ===