tdc - Linux Traffic Control (tc) unit testing suite

Author: Lucas Bates - lucasb@mojatatu.com

tdc is a Python script to load tc unit tests from a separate JSON file and
execute them inside a network namespace dedicated to the task.


REQUIREMENTS
------------

*  Minimum Python version of 3.4. Earlier 3.X versions may work but are not
   guaranteed.

*  The kernel must have network namespace support if using nsPlugin.

*  The kernel must have veth support available, as a veth pair is created
   prior to running the tests when using nsPlugin.

*  The kernel must have the appropriate infrastructure enabled to run all tdc
   unit tests. See the config file in this directory for the minimum required
   features. As new tests are added, the list of config options will be
   updated.

*  All tc-related features being tested must be built in or available as
   modules.  To check what is required by the current setup, run:
   ./tdc.py -c

   Note:
   In the current release, a tdc run will abort if a setup or teardown
   command fails - including cases where a test cannot be run simply
   because the kernel does not support a specific feature. (This will be
   handled in a future version; the current workaround is to run only the
   test categories that your kernel supports, as shown below.)
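
   For example, you can first list the known categories and then limit a
   run to only those your kernel supports (the category name below is just
   an example):
	./tdc.py -c
	./tdc.py -c actions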


BEFORE YOU RUN
--------------

The path to the tc executable that will be most commonly tested can be defined
in the tdc_config.py file. Find the 'TC' entry in the NAMES dictionary and
define the path.

If you need to test a different tc executable on the fly, you can do so by
using the -p option when running tdc:
	./tdc.py -p /path/to/tc
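
For reference, the relevant part of tdc_config.py looks roughly like the
excerpt below. The values shown are only illustrative defaults, and the
real file contains additional entries.

  # Illustrative excerpt of tdc_config.py - edit 'TC' to point at the
  # tc binary you want to test.
  NAMES = {
            # Path to the tc executable under test
            'TC': '/sbin/tc',
            # Name of a veth device used by the tests
            'DEV1': 'v0p1',
          }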


RUNNING TDC
-----------

To use tdc, root privileges are required.  This is because the
commands being tested must be run as root.  The code that enforces
execution by root uid has been moved into a plugin (see PLUGIN
ARCHITECTURE, below).

Tests that use a network device should have nsPlugin.py listed as a
requirement for that test. nsPlugin executes all commands within a
network namespace and creates a veth pair which may be used in those test
cases. To disable execution within the namespace, pass the -N option
to tdc when starting a test run; the veth pair will still be created
by the plugin.

Running tdc without any arguments will run all tests. Refer to the section
on command line arguments for more information, or run:
	./tdc.py -h

tdc will list the test names as they are being run, and print a summary in
TAP (Test Anything Protocol) format when they are done. If tests fail,
output captured from the failing test will be printed immediately following
the failed test in the TAP output.


OVERVIEW OF TDC EXECUTION
-------------------------

One run of tests is considered a "test suite" (this will be refined in the
future).  A test suite has one or more test cases in it.

A test case has four stages:

  - setup
  - execute
  - verify
  - teardown

The setup and teardown stages can run zero or more commands.  The setup
stage does some setup if the test needs it.  The teardown stage undoes
the setup and returns the system to a "neutral" state so any other test
can be run next.  These two stages require any commands run to return
success, but do not otherwise verify the results.

The execute and verify stages each run one command.  The execute stage
tests the return code against one or more acceptable values.  The
verify stage checks the return code for success, and also compares
the stdout with a regular expression.

Each of the commands in any stage will run in a shell instance.
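
The sketch below shows, in simplified form, how these stages typically map
onto the fields of a JSON test case. The values are illustrative; see the
existing files under the tc-tests directory for the authoritative format.
The "id" value is a placeholder - IDs for new test cases can be generated
with the -i option.

  {
      "id": "0001",
      "name": "Add ingress qdisc",
      "category": ["qdisc", "ingress"],
      "plugins": {
          "requires": "nsPlugin"
      },
      "setup": [
      ],
      "cmdUnderTest": "$TC qdisc add dev $DEV1 ingress",
      "expExitCode": "0",
      "verifyCmd": "$TC qdisc show dev $DEV1",
      "matchPattern": "qdisc ingress ffff:",
      "matchCount": "1",
      "teardown": [
          "$TC qdisc del dev $DEV1 ingress"
      ]
  }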


USER-DEFINED CONSTANTS
----------------------

The tdc_config.py file contains multiple values that can be altered to suit
your needs. Any value in the NAMES dictionary can be altered without affecting
the tests to be run. These values are used in the tc commands that will be
executed as part of the test. More will be added as test cases require.

Example:
	$TC qdisc add dev $DEV1 ingress

The NAMES values are used to substitute into the commands in the test cases.
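
With the illustrative NAMES values shown in the BEFORE YOU RUN section, the
example above would expand to something like:
	/sbin/tc qdisc add dev v0p1 ingress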


COMMAND LINE ARGUMENTS
----------------------

Run tdc.py -h to see the full list of available arguments.

usage: tdc.py [-h] [-p PATH] [-D DIR [DIR ...]] [-f FILE [FILE ...]]
              [-c [CATG [CATG ...]]] [-e ID [ID ...]] [-l] [-s] [-i] [-v] [-N]
              [-d DEVICE] [-P] [-n] [-V]

Linux TC unit tests

optional arguments:
  -h, --help            show this help message and exit
  -p PATH, --path PATH  The full path to the tc executable to use
  -v, --verbose         Show the commands that are being run
  -N, --notap           Suppress tap results for command under test
  -d DEVICE, --device DEVICE
                        Execute test cases that use a physical device, where
                        DEVICE is its name. (If not defined, tests that require
                        a physical device will be skipped)
  -P, --pause           Pause execution just before post-suite stage

selection:
  select which test cases: files plus directories; filtered by categories
  plus testids

  -D DIR [DIR ...], --directory DIR [DIR ...]
                        Collect tests from the specified directory(ies)
                        (default [tc-tests])
  -f FILE [FILE ...], --file FILE [FILE ...]
                        Run tests from the specified file(s)
  -c [CATG [CATG ...]], --category [CATG [CATG ...]]
                        Run tests only from the specified category/ies, or if
                        no category/ies is/are specified, list known
                        categories.
  -e ID [ID ...], --execute ID [ID ...]
                        Execute the specified test cases with specified IDs

action:
  select action to perform on selected test cases

  -l, --list            List all test cases, or those only within the
                        specified category
  -s, --show            Display the selected test cases
  -i, --id              Generate ID numbers for new test cases

netns:
  options for nsPlugin (run commands in net namespace)

  -N, --no-namespace
                        Do not run commands in a network namespace.

valgrind:
  options for valgrindPlugin (run command under test under Valgrind)

  -V, --valgrind        Run commands under valgrind
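
For example, to run only the test cases from a single file and show the
commands as they are being run (the file name below is illustrative):
	./tdc.py -f tc-tests/qdiscs/fifo.json -v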


PLUGIN ARCHITECTURE
-------------------

There is now a plugin architecture, and some of the functionality that
was in the tdc.py script has been moved into the plugins.

The plugins are in the directory plugin-lib.  They are executed from the
directory plugins.  Put symbolic links in plugins pointing to the plugin
files in plugin-lib, and name them according to the order you want them to
run. This is not necessary if a test case being run requires a specific
plugin to work.

Example:

bjb@bee:~/work/tc-testing$ ls -l plugins
total 4
lrwxrwxrwx  1 bjb  bjb    27 Oct  4 16:12 10-rootPlugin.py -> ../plugin-lib/rootPlugin.py
lrwxrwxrwx  1 bjb  bjb    25 Oct 12 17:55 20-nsPlugin.py -> ../plugin-lib/nsPlugin.py
-rwxr-xr-x  1 bjb  bjb     0 Sep 29 15:56 __init__.py
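
Such links can be created with ln -s; for example, the 20-nsPlugin.py entry
above corresponds to something like:
	cd plugins
	ln -s ../plugin-lib/nsPlugin.py 20-nsPlugin.py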

Each plugin is a subclass of TdcPlugin, defined in TdcPlugin.py, and its
class must be called "SubPlugin" so tdc can find it.  Plugins are
distinguished from each other in the Python program by their module name.

This base class supplies "hooks" to run extra functions.  These hooks are as follows:

pre- and post-suite
pre- and post-case
pre- and post-execute stage
adjust-command (runs in all stages and receives the stage name)

The pre-suite hook receives the number of tests and an array of test ids.
This allows you to dump out the list of skipped tests in the event of a
failure during the setup or teardown stage.

The pre-case hook receives the ordinal number and test id of the current test.

The adjust-command hook receives the stage id (see list below) and the
full command to be executed.  This allows for last-minute adjustment
of the command.

The stages are identified by the following strings:

  - pre  (pre-suite)
  - setup
  - command
  - verify
  - teardown
  - post (post-suite)


To write a plugin, you need to inherit from TdcPlugin in
TdcPlugin.py.  To use the plugin, you have to put the
implementation file in plugin-lib, and add a symbolic link to it from
plugins.  It will be detected at run time and invoked at the
appropriate times.  There are a few examples in the plugin-lib
directory:

  - rootPlugin.py:
      implements the enforcement of running as root
  - nsPlugin.py:
      sets up a network namespace and runs all commands in that namespace,
      while also setting up dummy devices to be used in testing.
  - valgrindPlugin.py:
      runs each command in the execute stage under valgrind,
      and checks for leaks.
      This plugin outputs an extra test result for each test in the test
      file: one is the existing result indicating whether the test passed
      or failed, and the other indicates whether the command under test
      leaked memory.
      (This one is a preliminary version; it may not work quite right yet,
      but the overall template is there and it should only need tweaks.)
  - buildebpfPlugin.py:
      builds all programs in $EBPFDIR.
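
As a rough illustration of the structure described above, a minimal plugin
might look like the sketch below. The hook method names and signatures
shown here are assumptions inferred from the hook list above; check
TdcPlugin.py for the authoritative base-class interface before writing a
real plugin.

  # Sketch only - verify hook names and signatures against TdcPlugin.py.
  from TdcPlugin import TdcPlugin

  class SubPlugin(TdcPlugin):
      def __init__(self):
          # 'example/SubPlugin' is an illustrative name for this plugin
          self.sub_class = 'example/SubPlugin'
          super().__init__()

      def pre_suite(self, testcount, testidlist):
          # Assumed to receive the number of tests and the list of test ids
          super().pre_suite(testcount, testidlist)
          print('example plugin: running {} tests'.format(testcount))

      def adjust_command(self, stage, command):
          # Assumed to receive the stage name and the command to be run,
          # and to return the (possibly modified) command
          command = super().adjust_command(stage, command)
          return command

To use it, the implementation file (say, examplePlugin.py) would go in
plugin-lib with a symbolic link to it from plugins, as described above.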


ACKNOWLEDGEMENTS
----------------

Thanks to:

Jamal Hadi Salim, for providing valuable test cases
Keara Leibovitz, who wrote the CLI test driver that I used as a base for the
   first version of the tc testing suite. This work was presented at
   Netdev 1.2 Tokyo in October 2016.
Samir Hussain, for providing help while I dove into Python for the first time
    and being a second eye for this code.