[Branch,~linaro-validation/lava-test/trunk] Rev 173: doc updates for usage

Message ID 20120914151622.24059.91758.launchpad@ackee.canonical.com
State Accepted

Commit Message

Andy Doan Sept. 14, 2012, 3:16 p.m. UTC
Merge authors:
  Andy Doan (doanac)
Related merge proposals:
  https://code.launchpad.net/~doanac/lava-test/doc-updates/+merge/124045
  proposed by: Andy Doan (doanac)
------------------------------------------------------------
revno: 173 [merge]
committer: Andy Doan <andy.doan@linaro.org>
branch nick: lt
timestamp: Fri 2012-09-14 10:15:00 -0500
message:
  doc updates for usage
modified:
  doc/usage.rst


--
lp:lava-test
https://code.launchpad.net/~linaro-validation/lava-test/trunk

Patch

=== modified file 'doc/usage.rst'
--- doc/usage.rst	2012-03-31 08:44:50 +0000
+++ doc/usage.rst	2012-09-12 20:08:31 +0000
@@ -60,76 +60,26 @@ 
 By default parse will print the bundle to standard output for inspection. It
 should be redirected to a pager for easier verification.
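+
+For example, piping the output through a pager such as ``less`` (``<id>`` is
+just a placeholder for whichever run you want to inspect)::
+
+    lava-test parse <id> | less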
 
-.. note::
-
-    While the syntax of the bundle created with `lava-test parse` is always
-    correct (or, if the parser does something really, really strange, a
-    detailed error is reported) the actual contents may not be what you
-    intended it to be. Parsers are ultimately fragile as they mostly deal with
-    unstructured or semi-structured free-form text that most test programs seem
-    to produce. The ultimate goal of a developer should be to produce
-    unambiguous, machine readable format. This level of integration would allow
-    to wrap a whole class of tests in one go (such as all xUnit-XML speaking
-    test frameworks).
-
-Usage with the dispatcher
-^^^^^^^^^^^^^^^^^^^^^^^^^
-
-The dispatcher is useful for automating LAVA Test environment setup, describing
-test scenarios (the list of tests to invoke) and finally storing the results in
-the LAVA dashboard.
-
-Typically this mode is based on the following sequence of commands:
-
-#. Install lava-test (from PPA or source) along with the required dependencies
-#. (optional) for out of tree tests install the additional `test definition` package
-#. Install the test or tests that are to be invoked with ``lava-tool install``.
-#. Run, parse and store in one go with ``lava-tool run --output=FILE``.
-
-Here the whole setup is non-interactive and at the end the dispatcher can copy
-the output bundle for additional processing.
-
-Automation considerations
-^^^^^^^^^^^^^^^^^^^^^^^^^
-
 .. _wrapping_existing_test_or_benchmark:
 
 Wrapping existing test or benchmark
 ===================================
 
-LAVA Test can be extended in several different ways. There is no best method,
-each has some pros and cons. In general we welcome any freely redistributable,
-generic tests. Those enrich the LAVA ecosystem and by providing useful
-out-of-the-box features to our users.
-
-Technically all tests are hidden behind a set of abstract interfaces that tell
-LAVA Test what to do in response to operator or dispatcher actions. The primary
-interface is :class:`~lava_test.api.core.ITest` and the three principal
-methods: :meth:`~lava_test.api.core.ITest.install`,
-:meth:`~lava_test.api.core.ITest.run`,
-:meth:`~lava_test.api.core.ITest.parse`.
-
-In practice it is usually much easier to instantiate our pluggable delegate
-test (:class:`lava_test.core.tests.Test`) and define the three delegates that
-know how to install, run and parse. Again for each step we have a base class
-that can be easily customized or even used directly as is.  Those classes are
-:class:`~lava_test.core.installers.TestInstaller`,
-:class:`~lava_test.core.runners.TestRunner` and
-:class:`~lava_test.core.parsers.TestParser`. They all implement well-defined
-interfaces (specified in :mod:`lava_test.api.delegates`) so if you wish to
-customize them you should become familiar with the API requirements first.
-
-Contributing new tests to LAVA
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+There are three ways to add a test:
+
+ * in-tree test
+ * out-of-tree test
+ * declarative test
+
+In-Tree Test
+^^^^^^^^^^^^
 
 The most direct way to add a new test is to contribute patches to LAVA Test
 itself. This method will simply add a new test definition to the collection of
 available tests.
 
 This method is recommended for generic tests that rarely change and are
-suitable for wide variety of hardware and software (assuming basic Linux-like
-system, Android tests are a special case).
-
+suitable for a wide variety of hardware and software.
+
 The advantage is that those tests can be invoked out of the box and will be
 maintained by the LAVA team. The disadvantage is that all changes to those
 tests need to follow Linaro development work flow, get reviewed and finally
@@ -138,7 +88,7 @@ 
 Test definitions are simply a way of telling LAVA-Test how to install a test,
 run it, and interpret the results. Test definitions are in a simplified python
 format, and can be as simple as a few lines. More advanced test definitions can
-be written by deriving from the base classes. 
+be written by deriving from the base classes.
 
 Defining a simple test
 ++++++++++++++++++++++
@@ -283,8 +233,8 @@ 
 common features are found that would make it possible to eliminate or simplify
 cases like this, they should be merged into the Lava-test libraries.
 
-Maintaining out-of-tree tests
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+Out-Of-Tree Test
+^^^^^^^^^^^^^^^^
 
 For some kinds of tests (proprietary, non-generic, in rapid development, fused
 with application code) contributing their definition to upstream LAVA Test
@@ -317,21 +267,76 @@ 
 will discover this entry point, import the relevant module and make the test
 definition available.
 
-Maintaining simple declarative tests
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-    
-By registering pure declarative tests at runtime.
-
-.. todo::
-
-    Describe how to use declarative tests. It would be a nice
-    extension of the tutorial once the user feels comfortable with
-    the initial python-based version.
-
-Writing new tests from scratch 
+Declarative Test
+^^^^^^^^^^^^^^^^
+
+As you can see in the simplest example of an in-tree test, there is very
+little python required of the user. Declarative tests allow you to define
+these bits in a JSON format that LAVA can use. This can be a nice alternative
+to an out-of-tree test. Some examples of declarative tests are:
+
+Pass/Fail
++++++++++
+If you have a test that prints out PASS/FAIL messages, your JSON could be::
+
+    {
+        "format": "Lava-Test Test Definition 1.0",
+        "test_id": "bigLITTLE",
+        "run": {
+            "steps": ["/bin/echo cache-coherency-switching : FAIL"]
+        },
+        "parse": {
+            "pattern": "(?P<test_case_id>.*-*)\\s+:\\s+(?P<result>(PASS|FAIL))",
+            "fixupdict": {
+                "PASS": "pass",
+                "FAIL": "fail"
+            }
+        }
+    }
+
+Stream
+++++++
+This does the equivalent of the in-tree stream test::
+
+    {
+        "format": "LAVA-Test Test Definition Format",
+        "test_id": "stream-json",
+        "install": {
+            "url": "http://www.cs.virginia.edu/stream/FTP/Code/stream.c",
+            "steps": ["cc stream.c -O2 -fopenmp -o stream"]
+        },
+        "run": {
+            "steps": ["./stream"]
+        },
+        "parse": {
+            "pattern": "^(?P<test_case_id>\\w+):\\W+(?P<measurement>\\d+\\.\\d+)",
+            "appendall": {
+                "units": "MB/s",
+                "result": "pass"
+            }
+        }
+    }
+
+Executing a Declarative Test
+++++++++++++++++++++++++++++
+
+The main difference with declarative tests is that you must register them with
+lava-test so that it knows about them. You can do this with::
+
+    lava-test register-test $TEST_DEF_URL
+
+Then you run the standard commands::
+
+    lava-test install <test id>
+    lava-test run <test id>
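+
+For example, if the Pass/Fail definition shown earlier were published at a
+URL (hypothetical), the whole sequence would look like::
+
+    lava-test register-test http://example.com/bigLITTLE.json
+    lava-test install bigLITTLE
+    lava-test run bigLITTLE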
+
+Writing new tests from scratch
 ==============================
 
-.. todo::
+The thing to keep in mind with lava-test is that it is not intended to be a
+test framework itself. The best advice is to build your test however you see
+fit, then write a thin wrapper for lava-test using one of the three methods
+described in the previous section.
 
-    Describe considerations for test writers. Using native test
-    format with human-readable output adapters.
+The most important thing is to make your test output each result on a single
+line. This makes declaring the parse pattern for lava-test much easier.
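+
+For example, a hypothetical test script ``my-test.sh`` (name and contents
+are illustrative only) might print output such as::
+
+    boot-time: PASS
+    network-up: FAIL
+
+which could then be wrapped with a small declarative definition along the
+lines of the examples above::
+
+    {
+        "format": "Lava-Test Test Definition 1.0",
+        "test_id": "my-test",
+        "run": {
+            "steps": ["./my-test.sh"]
+        },
+        "parse": {
+            "pattern": "(?P<test_case_id>[\\w-]+):\\s+(?P<result>(PASS|FAIL))",
+            "fixupdict": {
+                "PASS": "pass",
+                "FAIL": "fail"
+            }
+        }
+    }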