I use Jooby for Java backends when I’m choosing Java for the backend(s). It is well-engineered, the lead (Edgar Espina) is a force for completion, and most important of all I can deploy many services in one deployment (cookie-cutter scaling style), or split into separately scalable services (micro if you like). I engineer automated tests into two steps following compilation: pure JUnit tests (no threads, no sockets, no file I/O), then service tests via RestAssured (one of many types of integration test). The former are ~3ms, and the latter 150ms - 350ms on my older MacBook Air. That’s per test - everything is bounced between tests. If the Jooby-using thing you were making had a UI, I’d do Selenium for UI automation in a following step and employ techniques to keep the average test duration from climbing much more. In this case though, just RestAssured on headless services.

A Modern Dev-Team Peril

The peril is overtesting of the services via fancy test libraries, and teams need to guard against that. How to track it though? Visualizing it is one way:

To do that you need to instrument tests while they run. I looked at tapping into RestAssured, but it was easier to add a JUnit listener for changes of test class/method, plus a before(..) lambda on a test-extended Jooby “app” that the tests target. Jooby makes it easy to add extra “handlers” across all invocations via Java 8’s Consumer.andThen(), so I’ve taken advantage of that.

The pic? Rectangles (test methods and class names) and ovals (endpoint names) are joined by lines that get thicker to indicate overtesting. If each additional bold increment means 200ms added to the overall build, then attack the boldest rectangles and ovals first. No need for a big bang to pull it back, just do some test splitting, or migrate service tests to pure unit tests a few at a time. The tech is GraphViz’s DOT, of course.
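For illustration, here is a hand-written fragment of the kind of DOT the tooling ends up producing (test and endpoint names are invented); the hit count becomes the penwidth, so an endpoint hammered by many service tests is drawn bold:

```dot
digraph L {
  overlap = false

  // a test method (rectangle) and an endpoint (oval)
  getsOneOrder_OrderResourceTest [shape=box label="getsOneOrder()\nOrderResourceTest"]
  _order__num_GET [fontcolor=blue shape=oval label="/order/:num\nGET"]

  // this test hit the endpoint 7 times, so the edge is drawn bold
  getsOneOrder_OrderResourceTest -> _order__num_GET [penwidth=7]
}
```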

I can’t share one from my startup, but I have ones with many hundreds of lines and boxes in one view, where it is super clear where housekeeping coding efforts should focus.

Explicit Goals

  1. A Build that does a number of test steps following compile, that’s approximately the same build that the CI infra would run on commit/push, but is possible to run on the dev workstation before commit/push, and a dev team that does exactly that.
  2. A Build that is utterly consistent in its pass/fail result each time you run it with the same inputs (all tests pass)
  3. A Build that is fast - say the whole backend in 3 mins, with all pertinent test steps.

Code

JUnit 4 Listener

import org.jooby.Request;
import org.jooby.Response;
import org.junit.runner.Description;
import org.junit.runner.notification.RunListener;

public class DotGraphJUnitListener extends RunListener {

    private static Description RUNNING_TEST = Description.EMPTY;

    public static Description getRunningTest() {
        return RUNNING_TEST;
    }

    // From JUnit's RunListener API
    @Override
    public void testStarted(Description description) throws Exception {
        RUNNING_TEST = description;
    }

    public static void instrumentTestPath(Request req, Response rsp) {

        // Only active when Maven is handed -DTestMapDotGraph=true
        String enabled = System.getProperty("TestMapDotGraph", "false");
        if (!Boolean.parseBoolean(enabled)) {
            return;
        }

        // Collapse numeric path segments so /order/123 and /order/456 map to one node
        String pathMethod = req.path().replaceAll("/\\d+", "/:num") + "$$$" + req.method();
        String methodClass = RUNNING_TEST.getMethodName() + "$$$" + RUNNING_TEST.getClassName();
        // DOT node IDs can't contain these characters, so flatten them to underscores
        String pathMethodAsKey = pathMethod.replace("$$$", "_")
                .replace("/", "_")
                .replace("-", "_")
                .replace(":", "_");
        String methodClassAsKey = methodClass.replace("$$$", "_")
                .replace(".", "_");

        System.out.println("JOOBY-PATH-TESTED-BY node " +
                methodClassAsKey + " [shape=box label=\"" +
                methodClass.replace("$$$", "()\\n") + "\"]");

        System.out.println("JOOBY-PATH-TESTED-BY node " +
                pathMethodAsKey + " [fontcolor=blue shape=oval label=\"" +
                pathMethod.replace("$$$", "\\n") + "\"]");

        System.out.println("JOOBY-PATH-TESTED-BY vertex " +
                methodClassAsKey + " -> " + pathMethodAsKey);

    }
}

Clearly, there’s some static hackery going on there. We wouldn’t do such things for prod code, but anything that works is OK for test code that isn’t in the JARs for production. We also need to run tests serially to make accurate maps. Not in any specific order, just once each and one at a time. The code is only activated if Maven is handed -DTestMapDotGraph=true.
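The path normalization and DOT-key flattening can be exercised in isolation, which is worth doing since a path the regex misses silently splits one endpoint into several ovals. A minimal sketch (class, method, and path names are mine, not from the listener above):

```java
// Standalone sketch of the normalization used in the listener.
// "/orders/123/items/9" collapses to "/orders/:num/items/:num" so that
// requests to the same endpoint with different IDs aggregate into one node.
public class PathNormalization {

    // Join the normalized path with the HTTP method, using the same
    // "$$$" separator the listener later swaps for a label line break.
    static String normalizePath(String path, String httpMethod) {
        return path.replaceAll("/\\d+", "/:num") + "$$$" + httpMethod;
    }

    // DOT node IDs can't contain '/', '-' or ':', so flatten them to underscores.
    static String asDotKey(String pathMethod) {
        return pathMethod.replace("$$$", "_")
                .replace("/", "_")
                .replace("-", "_")
                .replace(":", "_");
    }

    public static void main(String[] args) {
        String pm = normalizePath("/orders/123/items/9", "GET");
        System.out.println(pm);            // /orders/:num/items/:num$$$GET
        System.out.println(asDotKey(pm));  // _orders__num_items__num_GET
    }
}
```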

Maven setup:

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>3.0.0-M3</version>
            <configuration>
                <properties>
                    <property>
                        <name>listener</name>
                        <value>yourPackage.DotGraphJUnitListener</value>
                    </property>
                </properties>
            </configuration>
        </plugin>
    </plugins>
</build>

Test-extended Jooby setup

In your JUnit setup before the RestAssured invocation of the test itself:

yourApp.andThen(app -> {
    app.before(DotGraphJUnitListener::instrumentTestPath);
});

Post processing to make SVG

#!/bin/bash

set -e

doit ()
{
  echo "digraph L {" > test_map.dot
  echo "  overlap = false" >> test_map.dot
  echo "" >> test_map.dot
  mvn install -DTestMapDotGraph 2>&1 | grep TESTED-BY | sort | sed 's/JOOBY-PATH-TESTED-BY//' | sed 's/ node //' | sed 's/ vertex //' > test_map_raw.dot

  cat test_map_raw.dot | grep "label=" | sort | uniq -c | awk '{ print $2 " " $3 " " $4 " " $5 "[penwidth=" $1 "]" }' | sed -e 's#\]\[# #' >> test_map.dot
  cat test_map_raw.dot | sed "/label=/d" | sort | uniq -c | awk '{ print $2 " " $3 " " $4 " [penwidth=" $1 "]" }' >> test_map.dot
  rm test_map_raw.dot

  echo "" >> test_map.dot
  echo "}" >> test_map.dot

  echo "turn dot graph into neato SVG for README"
  neato -Tsvg test_map.dot -o jooby_test_map.svg
  git add jooby_test_map.svg
}

echo "brew install graphviz ... if needed."

echo "First off, a build without running tests:"
mvn clean install -DskipTests > /dev/null 2>&1

echo "Run tests and create a DOT graph - assume all passing - for each module that has tests"

dirs=$(find . -type d -name "test-classes" | sed "s#/target/test-classes##")

while read -r line; do
  cd "$line"
  echo "$line"
  doit
  cd - > /dev/null
done <<< "$dirs"

echo "Run tests again but collect for all modules into one big DOT graph"
doit

Final thoughts

This works for a multi-module Maven project. It makes one SVG per module, then one more for all modules that shows RestAssured tests that go outside their module (undesirable). The SVG can be checked in, and READMEs can link to the SVG like so (Markdown):

![test map diagram](jooby_test_map.svg?raw=true&sanitize=true)

You won’t run this as part of the CI build of course, just once a week to bring the diagrams up to date. I was happy to commit the resulting SVGs back to Git so that casual perusers of the READMEs via the GitHub UI could see the maps (and zoom in / pan if they needed).



Published

October 15th, 2019