Random Test Inputs – part 2

If you saw my last post, Random inputs in unit testing, you’ll remember that I was advocating the use of random test data in your unit tests.

One of the bits of feedback I received (and it seems to be the main complaint of most people who are against this) is that you need reliable test input so that you can recreate failures.

I don’t think these two concepts are mutually exclusive. Let me explain.

A decent test framework will help you recreate failures

I use testtools as my preferred test framework for Python. One of its benefits is that its so-called Matchers will output useful information for failures, including the data used for input. As a very basic example:

import testtools 
 
class TestMe(testtools.TestCase): 
    def test_failure(self): 
        self.assertThat(5, testtools.matchers.Equals(4))

This produces the output:

$ python -m testtools.run fail.py       
Tests running... 
====================================================================== 
FAIL: fail.TestMe.test_failure 
---------------------------------------------------------------------- 
Traceback (most recent call last): 
  File "fail.py", line 7, in test_failure 
    self.assertThat(5, testtools.matchers.Equals(4)) 
  File "/usr/lib/python2.7/dist-packages/testtools/testcase.py", line 435, in assertThat 
    raise mismatch_error 
testtools.matchers._impl.MismatchError: 4 != 5

You can see that the inputs are clearly listed. If this is not clear enough, you can add more explanatory text that is output on failure. Changing your assertion to:

def test_failure(self):
    self.assertThat(5, testtools.matchers.Equals(4), "Expected 4, got 5")

Results in:

testtools.matchers._impl.MismatchError: 4 != 5: Expected 4, got 5

Thus, if you get a failure from a random input, it’s easy to see what data caused it.
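
As a quick, hypothetical illustration (mine, not from the original post), here’s the same idea with a randomised input; on failure the MismatchError prints both sides of the comparison, so the random value is right there in the log:

import random

import testtools
from testtools.matchers import Equals


def double(x):
    # Pretend this is the code under test.
    return x + x


class TestDouble(testtools.TestCase):
    def test_double_random_input(self):
        value = random.randint(0, 1000)
        # If this assertion fails, the MismatchError shows the expected
        # and actual values, so the random input is recoverable from the log.
        self.assertThat(double(value), Equals(value * 2))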

Making your data recognisable

Once you start randomising a lot of input data, it becomes hard to tell which input is which when a failure involves code that takes many inputs. Take this example, adapted from my days as the MAAS engineering lead:

from itertools import imap, islice, repeat
import testtools
import random
import string


class TestMe(testtools.TestCase):
    random_letters = imap( 
        random.choice, repeat(string.letters + string.digits))

    def make_string(self, size=10):
        return "".join(islice(self.random_letters, size))

    def test_large_comparison(self):
        dict1 = dict(
            person=self.make_string(),
            age=self.make_string(),
            weight=self.make_string()
        )
        expected_dict = dict(person="foo", age="10", weight="200")

        self.assertThat(dict1, testtools.matchers.Equals(expected_dict))

This results in:

====================================================================== 
FAIL: fail.TestMe.test_large_comparison 
---------------------------------------------------------------------- 
Traceback (most recent call last): 
  File "fail.py", line 26, in test_large_comparison 
    self.assertThat(dict1, testtools.matchers.Equals(expected_dict)) 
  File "/usr/lib/python2.7/dist-packages/testtools/testcase.py", line 435, in assertThat 
    raise mismatch_error 
testtools.matchers._impl.MismatchError: !=: 
reference = {'age': '10', 'person': 'foo', 'weight': '200'} 
actual    = {'age': 'HL8qw3xnJO', 'person': 'IuhoaGzQHB', 'weight': 'Rfi3lxsiHf'}

My first question on seeing output like this is “where did that data come from?” because it could have leaked from buggy code and be totally unrelated to your intended input.

This has an easy answer if you make a small change to the test harness:

    def make_string(self, prefix="", size=10): 
        return prefix + "".join(islice(self.random_letters, size)) 
 
    def test_large_comparison(self): 
        dict1 = dict( 
            person=self.make_string("person"), 
            age=self.make_string("age"), 
            weight=self.make_string("weight") 
        ) 
        expected_dict = dict(person="foo", age="10", weight="200") 
 
        self.assertThat(dict1, testtools.matchers.Equals(expected_dict))

Here, the code has been changed so that the make_string() function accepts a fixed prefix for the generated string. This has the following effect on the output:

reference = {'age': '10', 'person': 'foo', 'weight': '200'} 
actual    = {'age': 'ageb8BIwdwZrM', 
 'person': 'personBSUhvYxFTU', 
 'weight': 'weightngdjfm7ef1'}

You can instantly see that your random input was in fact generated in the way you intended (or not, as the case may be), making it easy to identify its source!

I hope this was useful. Leave me feedback if this helped you at all!

Random inputs in unit testing

I recently had an upstream reviewer tell me that I should not randomise my test input because “randomness does not provide a useful input to the test and sets a bad example of practices for writing tests”.

I am going to explain here why this is wrong and why it’s actually good practice to randomise inputs. Let me start by saying that random test failures are not the same thing as spurious test failures. I’ll come back to that later.

Consider this simple code under test; it’s a contrived example, but you will get the idea:

def myfunc(thing):
    """This function just returns what it's given."""
    return "foo"

OK, so let’s consider this a stub implementation, as it has an obvious bug. If we wanted to write a test for it, we might write something like this:

import unittest

class TestMyfunc(unittest.TestCase):
    def test_myfunc_returns_same_input(self):
        returned = myfunc("foo")
        self.assertEqual(returned, "foo")

Here, I am using a fixed input of “foo” as many people like to do in tests as a way of saying “this value is irrelevant”.

The bug should be obvious here: the test passes when it should not, because the code under test returns the same fixed value as the one used in the test. As I say, it’s a fairly contrived example, but it illustrates the point that tests should never assume anything about the code under test.

Here’s a better way of writing the test:

import random
import unittest

class TestMyfunc(unittest.TestCase):
    def test_myfunc_returns_same_input(self):
        expected = random.randint(0, 1000)
        returned = myfunc(expected)
        self.assertEqual(returned, expected)

(A further improvement could be to generate a random string, but I’ll leave that for a future blog entry.)
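
For the curious, a rough sketch of such a helper (my own illustration, not from this post) could look something like this:

import random
import string


def make_random_string(prefix="", size=10):
    # Build a random alphanumeric string; an optional prefix keeps the
    # value recognisable when it shows up in failure output.
    chars = string.ascii_letters + string.digits
    return prefix + "".join(random.choice(chars) for _ in range(size))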

Returning to the test above: we’re generating a random input and asserting that the returned value is the same as the input. This not only avoids the bug above but also demonstrates the test’s intent far better. It will also never fail unless the code under test is buggy, and that brings me back to the point above about random vs spurious test failures.

A random test failure is good: it means you found a bug! A spurious test failure is one that indicates you’re not testing properly. An example is a test that depends on network connectivity to complete; networks are inherently unreliable, so such a test will fail spuriously whenever the network does.

Finally, I can recommend that you look at a tool called Hypothesis, which is a property-based testing utility. My friend Jono explains it in his blog here: https://jml.io/2016/06/evolving-toward-property-based-testing-with-hypothesis.html
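
To give a flavour of it, here’s a minimal sketch of the same test written with Hypothesis (my own illustration, using a fixed version of the myfunc from above); Hypothesis generates the inputs for you and shrinks any failing input to a minimal reproducible example:

import unittest

from hypothesis import given
from hypothesis import strategies as st


def myfunc(thing):
    """The (fixed) function under test: return what it's given."""
    return thing


class TestMyfunc(unittest.TestCase):
    @given(st.text())
    def test_myfunc_returns_same_input(self, expected):
        # Hypothesis calls this test many times with generated strings and,
        # on failure, reports the (shrunk) input that broke it.
        self.assertEqual(myfunc(expected), expected)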

Glen Rock State Forest

The latest Phantom 4 video shot at Glen Rock State Forest in South East Queensland’s Scenic Rim.

Montville via Phantom 4

Enjoy the spectacular scenery of the Sunshine Coast Hinterland.

Super Moon

While I was dropping my mother at the airport tonight, I thought I’d try to get a nice shot of the “super moon”. I wasn’t too disappointed with this result!

It’s hard to convey just how bright the moon is. I processed this shot so the detail stands out, but in real life it’s a glowing white ball.

Supermoon

Another drone video on the Brisbane River

I tried to do some colour grading here. The range is spot on now, but I think I could increase the saturation a bit. Next time.

LXD on Linode servers

I was recently trying to get LXD working on my Linode server, but was getting this error:

$ lxc launch ubuntu:16.04 wordpress       
Creating wordpress
Starting wordpress
error: Error calling 'lxd forkstart wordpress /var/lib/lxd/containers /var/log/lxd/wordpress/lxc.conf': err='exit status 1'
  lxc 20161010203338.247 ERROR lxc_seccomp - seccomp.c:get_new_ctx:224 - Seccomp error -17 (File exists) adding arch: 2
  lxc 20161010203338.247 ERROR lxc_start - start.c:lxc_init:430 - failed loading seccomp policy
  lxc 20161010203338.247 ERROR lxc_start - start.c:__lxc_start:1313 - failed to initialize the container

The fantastic Stéphane Graber helped me to work out that the default Linode kernel doesn’t have the right bits compiled into it, and I should be using an Ubuntu kernel instead.

So I followed the guide at https://www.linode.com/docs/tools-reference/custom-kernels-distros/run-a-distribution-supplied-kernel-with-kvm to switch to a distribution-supplied kernel, and it now works.

Brisbane River Drone Flight

My second drone video! A bit longer, and has a special appearance by my dog at the end.

Droning on!

I went and bought myself a DJI Phantom 4. Holy crap, I’m impressed. 12MP images and 4k video. Not to mention all the other bells and whistles like collision avoidance, auto return to home and when you put it in speed mode, it does 70km/h!

Here’s the first of a couple of videos I put together for it:

Webex using Ubuntu LXD containers

If you read my previous post WebEx in Ubuntu LXC containers you’ll have learned how to get Cisco’s Webex running in an Ubuntu 12.04 LXC container.

I figured it was time to work out how to get it running in the newer LXD containers available in 16.04. Here’s how I did it.

Install LXD

sudo apt-get install lxd
sudo lxd init

When it asks you about networking, use the existing lxcbr0 bridge; do not let it create a new bridge, as by default it will create lxdbr0. We need to stay on lxcbr0 so that networking continues to work in the old containers.

Create the LXD container

lxc init ubuntu:precise webex

This will download a 12.04 image if you don’t already have one; it will take a while depending on your Internet connection.

Cheat by copying the old rootfs

I’m not going to rebuild my rootfs from scratch; the old one is perfectly usable! So, as root, we can copy it from the old LXC area:

cp -rp /var/lib/lxc/webex/rootfs /var/lib/lxd/containers/webex/

Configure the container

The old rootfs came from a privileged container, so we need to make this LXD copy privileged too:

lxc config set webex security.privileged true

To make sound available, you need to pass the sound devices through to the container. Here I am adding all the devices under /dev/snd/ on my own host; note that yours may differ, so edit the commands accordingly:

lxc config device add webex /dev/snd/controlC0 unix-char path=/dev/snd/controlC0
lxc config device add webex /dev/snd/hwC0D0 unix-char path=/dev/snd/hwC0D0
lxc config device add webex /dev/snd/hwC0D3 unix-char path=/dev/snd/hwC0D3
lxc config device add webex /dev/snd/pcmC0D0p unix-char path=/dev/snd/pcmC0D0p
lxc config device add webex /dev/snd/pcmC0D3p unix-char path=/dev/snd/pcmC0D3p
lxc config device add webex /dev/snd/seq unix-char path=/dev/snd/seq
lxc config device add webex /dev/snd/timer unix-char path=/dev/snd/timer

You may remember I was using ssh X forwarding in the old container. We don’t need to do that any more, as we can give the container direct access to the video devices with this config:

lxc config device add webex /dev/dri/card0 unix-char path=/dev/dri/card0
lxc config device add webex /dev/dri/controlD64 unix-char path=/dev/dri/controlD64
lxc config device add webex /dev/dri/renderD128 unix-char path=/dev/dri/renderD128
lxc config device add webex /dev/fb0 unix-char path=/dev/fb0
lxc config device add webex /dev/video0 unix-char path=/dev/video0
lxc config device add webex X11 disk source=/tmp/.X11-unix path=/tmp/.X11-unix

Again, your devices under /dev/dri may differ a little from mine; change accordingly.

Now, start the container and start a bash shell in it:

lxc start webex
lxc exec webex bash

You’ll now have a root prompt in the container. You can test that sound is working by doing something like:

sudo -u ubuntu aplay /usr/share/sounds/alsa/Front_Center.wav

In root’s home, we need to make a script to start Firefox for us. It looks like this:

root@webex:~# cat webex.sh  
#!/bin/bash 
DISPLAY=:0 su -c firefox - ubuntu

Make sure to chmod +x webex.sh

Now, all being well, you can do this to launch Firefox:

lxc exec webex ./webex.sh

Launch Webex as you normally would and verify that it works. If it’s OK, you can remove the old SSH service as it’s not needed any more.

apt-get remove openssh-server

In my next post, I’ll explain how to convert the configuration into a more handy LXD profile that you can use for any container.
