X-Git-Url: https://git.donarmstrong.com/?a=blobdiff_plain;f=sandbox%2Fproposal_regressiontestframwork.moin;h=983a624c5d91550e3f7c12deaec304a560a8f373;hb=HEAD;hp=1040dbc50d118ad20c55349d98a31674279a63e8;hpb=911acafbaeb83d8be4d60b667b2a80436502a8ee;p=neurodebian.git

diff --git a/sandbox/proposal_regressiontestframwork.moin b/sandbox/proposal_regressiontestframwork.moin
index 1040dbc..983a624 100644
--- a/sandbox/proposal_regressiontestframwork.moin
+++ b/sandbox/proposal_regressiontestframwork.moin
@@ -9,39 +9,40 @@
 
 This specification describes DebTest -- a framework with conventions and tools
 that allow Debian to distribute test batteries developed by upstream or Debian
-developers. DebTest will enable an extensive testing of a deployed Debian
-system or a particular software of interest in a uniform fashion.
+developers. DebTest aims to enable developers and users to perform extensive
+testing of a deployed Debian system or of a particular piece of software of
+interest in a uniform fashion.
 
 == Rationale ==
 
 Ideally software packaged for Debian comes with an exhaustive test suite that
-can be used to determine whether this software works as expected on the Debian
-platform. However, especially for complex software, these test suites are often
-resource hungry (CPU time, memory, diskspace, network bandwidth) and cannot be
-ran at package build time by buildds. Consequently, test suites are typically
-utilized manually only by the respective packager on a particular machine, before
-uploading a new version to the archive.
-
-However, Debian is an integrated system and packaged software typically
-relies on functionality provided by other Debian packages (e.g. shared
-libraries) instead of shipping duplicates with different versions in every
-package -- for many good reasons. Unfortunately, there is also a downside to
-this: Debian packages often use versions of 3rd-party tools different from
-those tested by upstream, and moreover, the actual versions of dependencies
-might change frequently between subsequent uploads of a dependent package. Currently
+can be used to determine whether this particular software works as expected on
+the Debian platform. However, especially for complex software, these test
+suites are often resource hungry (CPU time, memory, disk space, network
+bandwidth) and cannot be run at package build time by buildds. Consequently,
+test suites are typically utilized manually and only by the respective packager
+on a particular machine, before uploading a new version to the archive.
+
+However, Debian is an integrated system and packaged software typically relies
+on functionality provided by other Debian packages (e.g. shared libraries)
+instead of shipping duplicates with different versions in every package -- for
+many good reasons. Unfortunately, there is also a downside to this: Debian
+packages often use versions of 3rd-party tools that are different from those
+tested by upstream, and moreover, the actual versions of dependencies might
+change frequently between subsequent uploads of a dependent package. Currently
 a change in a dependency that introduces an incompatibility cannot be detected
-reliably even if upstream provides a test suite that would have caught
-the breakage. Therefore integration testing heavily relies on users to detect
-incorrect functioning and file bug reports. Although there are archive-wide
-QA efforts (e.g. constantly rebuilding all packages) these tests can only
-detect API/ABI breakage or functionality tested during build-time checks --
-they are not exhaustive for the aforementioned reasons.
+reliably even if upstream provides a test suite that would have caught the
+breakage. Therefore integration testing heavily relies on users to detect
+incorrect functioning and file bug reports. Although there are archive-wide QA
+efforts (e.g. constantly rebuilding all packages) these tests can only detect
+API/ABI breakage or functionality tested during build-time checks -- they are
+not exhaustive for the aforementioned reasons.
 
 This is a proposal to, first of all, package upstream test suites in a way that
 they can be used to run expensive archive-wide QA tests. However, this is also
-a proposal to establish means to test interactions between software from multiple
-Debian packages to provide more thorough continued integration and regression testing
-for the Debian systems.
+a proposal to establish means to test interactions between software from
+multiple Debian packages to provide more thorough continuous integration and
+regression testing for Debian systems.
 
 == Use Cases ==
 
@@ -94,6 +95,12 @@ for the Debian systems.
    This includes the test suite of the authors of his favorite software, but
    also all distribution test suites provided by Debian developers (see above).
 
+ * Sylvestre maintains a core computational library in Debian.
+   A new version (or other modification) of this library promises performance
+   advantages. Using DebTest he could not only verify the absence of
+   regressions but also obtain a direct performance comparison
+   against the previous version across a range of applications.
+
  * Joerg maintains a repository of backports of Debian packages to be
    installed in a stable environment. He wants to assure that
   backporting of the packages has not caused a deviation in their
@@ -134,6 +141,25 @@ This specification is applicable to all Debian packages, and Debian as a whole.
 
 == Design ==
 
+A specification should be built with the following considerations:
+
+ * The person implementing it may not be the person writing it. A specification
+   should be clear enough for someone to be able to read it and have a clear
+   path towards implementing it. If it is not straightforward, it needs more detail.
+
+ * Use cases covered in the specification should be practical
+   situations, not contrived issues.
+
+ * Limitations and issues discovered during the creation of a specification
+   should be clearly pointed out so that they can be dealt with explicitly.
+
+ * If you don't know enough to be able to competently write a spec, you should
+   either get help or research the problem further. Avoid spending time making
+   up a solution: rely on your peers' opinions and prior work.
+
+Specific issues related to particular sections are described further below.
+
+
 === Core components ===
 
  * Organization of the framework
@@ -146,30 +172,38 @@ This specification is applicable to all Debian packages, and Debian as a whole.
 
 ==== Packaged tests ====
 
  * Metainformation:
-  - duration: ....
-  - resources:
-  - suites:
+  * duration: ....
+  * resources:
+  * suites:
 
  * Debug symbols: ....
-  - do not strip symbols from test binary
-
-
+  * do not strip symbols from test binary
 
  * Packages that register tests might provide a virtual package 'test-'
   to allow easy test discovery and retrieval via debtest tools.
 
+
 ==== debtest tools ====
 
- * Invocation:
-  - single package tests
-  - all (with -f to force even if resources are not sufficient)
-  - given specific resources demands, just run
+ * Invocation::
+  * single package tests
+  * all (with -f to force even if resources are not sufficient)
+  * tests of dependent packages (discovered via rdepends,
+    "rrecommends" and "rsuggests")
+  * given specific resource demands, just run
    the ones matching those
 
- * Customization/Output:
-  - plugins
-   + output: some structured output
-   + interface to some dashboard
-
+ * Customization/Output::
+   plugins::
+    * job resource requirement adjustments
+      . manual customization
+      . request from the dashboard for the system (or the like)
+    * executors
+      . local execution (monitor resources)
+      . submit to cluster/cloud
+    * output/reports
+      . some structured output
+      . interfaces to dashboards
 
 ==== Maintainer helpers ====
 
@@ -178,23 +212,9 @@ This specification is applicable to all Debian packages, and Debian as a whole.
 
 - assess resources/performance:
 
-A specification should be built with the following considerations:
-
- * The person implementing it may not be the person writing it. Specification should be
- * clear enough for someone to be able to read it and have a clear path
- * towards implementing it. If it is not straightforward, it needs more detail.
+=== Supplementary infrastructure ===
 
- * Use cases covered in the specification should be practical
- * situations, not contrived issues.
-
- * Limitations and issues discovered during the creation of a specification
- * should be clearly pointed out so that they can be dealt with explicitly.
-
- * If you don't know enough to be able to competently write a spec, you should
- * either get help or research the problem further. Avoid spending time making
- * up a solution: base yourself on your peers' opinions and prior work.
-
-Specific issues related to particular sections are described further below.
+==== Dashboard server ====
 
 === Implementation Plan ===
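
The resource-matched invocation sketched in the "debtest tools" section above (run only the tests whose declared demands fit the available resources, with a `-f` switch to force running everything) could look roughly like the following. This is a hypothetical illustration only: the metainformation field names (`duration_s`, `memory_mb`, `disk_mb`) and the `PackagedTest`/`select_tests` names are assumptions of this sketch, not part of the specification.

```python
# Hypothetical sketch of resource-aware test selection for a debtest runner.
# Field and function names are illustrative assumptions, not spec'd interfaces.
from dataclasses import dataclass

@dataclass
class PackagedTest:
    package: str       # e.g. provider of a virtual 'test-' package
    duration_s: int    # declared expected runtime in seconds
    memory_mb: int     # declared peak memory demand
    disk_mb: int       # declared scratch disk demand

def select_tests(tests, max_duration_s, memory_mb, disk_mb, force=False):
    """Return the tests whose declared demands fit the given resources.

    With force=True (cf. the proposed -f switch) all tests are returned
    even if the declared resources are not sufficient.
    """
    if force:
        return list(tests)
    return [t for t in tests
            if t.duration_s <= max_duration_s
            and t.memory_mb <= memory_mb
            and t.disk_mb <= disk_mb]

tests = [
    PackagedTest("test-light", duration_s=60, memory_mb=128, disk_mb=50),
    PackagedTest("test-heavy", duration_s=86400, memory_mb=16384, disk_mb=20000),
]
runnable = select_tests(tests, max_duration_s=3600, memory_mb=2048, disk_mb=1000)
print([t.package for t in runnable])                   # -> ['test-light']
print(len(select_tests(tests, 0, 0, 0, force=True)))   # -> 2 (forced)
```

Declaring demands up front, as the "Metainformation" section proposes, is what makes this kind of filtering possible on buildds or QA hosts without first starting an expensive test.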