Unit testing annotation processors in Java

I'm working on a micro-benchmarking framework in Java. To give you some context, this is how it will work:

  1. You have multiple implementations of some algorithm and want to compare which one performs better by measuring the processing time.

  2. Create a brand-new empty class and annotate it with @Benchmark (from the framework).

  3. Add a public void method that simply calls one of the implementations you want to measure, with all the necessary parameters. Annotate the method with @MeasureTime (from the framework). Repeat for each implementation.

  4. Make sure that all methods work with the same input sets, and that they don't produce side effects.

  5. Run the framework (see the sketch below).
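
As an illustration, a minimal benchmark class could look like the sketch below (the SortBenchmark name and the two sorting calls are hypothetical examples, not part of the framework):

import java.util.Arrays;

@Benchmark
public class SortBenchmark {

    // both measured methods work on identical copies of the same input and have no side effects
    private final int[] input = {5, 3, 8, 1, 9, 2, 7, 4, 6, 0};

    @MeasureTime
    public void measureArraysSort() {
        Arrays.sort(input.clone());
    }

    @MeasureTime
    public void measureStreamSort() {
        Arrays.stream(input.clone()).sorted().toArray();
    }
}

Running new DefaultRunner(SortBenchmark.class).run() would then print the average execution time of each annotated method.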

The annotations @Benchmark and @MeasureTime can take parameters (see the example after this list):

  • @Benchmark.iterations: the number of times to repeat all @MeasureTime methods by default
  • @Benchmark.warmUpIterations: the number of warm-up runs, which won't be included in the time measurements, for all @MeasureTime methods by default
  • @MeasureTime.iterations: the number of times to repeat the method, overriding the default
  • @MeasureTime.warmUpIterations: as the name suggests, overriding the default
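
In other words, a parameter set on @MeasureTime wins over the corresponding @Benchmark parameter, which in turn wins over the built-in defaults. As a hypothetical example (class name and numbers chosen only for illustration):

@Benchmark(iterations = 10, warmUpIterations = 2)
public class OverrideExample {

    @MeasureTime                      // inherits the class-level settings:
    public void usesClassDefaults() { // 2 warm-up runs + 10 measured runs
    }

    @MeasureTime(iterations = 3)      // overrides iterations only:
    public void usesOwnIterations() { // 2 warm-up runs + 3 measured runs
    }
}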

I want to test that the framework is using the annotation parameters correctly. That is:

  • the default iterations and warm-up iterations are correct (1 and 0)
  • the parameters of @Benchmark correctly override the defaults
  • the parameters of @MeasureTime correctly override the defaults

Here's a test for the simplest case:

public class DefaultRunnerTest {

    public static int runCount = 0;

    @Benchmark
    public static class RunWithDefaults {
        @MeasureTime
        public void sample() {
            ++runCount;
        }
    }

    @Before
    public void setUp() {
        runCount = 0;
    }

    @Test
    public void testRunWithDefaults() {
        new DefaultRunner(RunWithDefaults.class).run();
        assertEquals(1, runCount);
    }
}

What I don't like is the runCount static variable, which is manipulated from inside the annotated method. Initially I wanted to use mocks (with Mockito), but I couldn't really see how to do that here. I'm wondering if I'm missing something.

For reference, not necessarily for review, here's the complete test class:

public class DefaultRunnerTest {

    public static int runCount = 0;
    public static int runCountOfCustom = 0;

    @Benchmark
    public static class RunWithDefaults {
        @MeasureTime
        public void sample() {
            ++runCount;
        }
    }

    @Before
    public void setUp() {
        runCount = 0;
        runCountOfCustom = 0;
    }

    @Test
    public void testRunWithDefaults() {
        new DefaultRunner(RunWithDefaults.class).run();
        assertEquals(1, runCount);
    }

    @Benchmark(iterations = 5)
    public static class RunWith5Iterations {
        @MeasureTime
        public void sample() {
            ++runCount;
        }
    }

    @Test
    public void testRunWith5Iterations() {
        new DefaultRunner(RunWith5Iterations.class).run();
        assertEquals(5, runCount);
    }

    @Benchmark(warmUpIterations = 3)
    public static class RunWith3WarmUpIterations {
        @MeasureTime
        public void sample() {
            ++runCount;
        }
    }

    @Test
    public void testRunWith3WarmUpIterations() {
        new DefaultRunner(RunWith3WarmUpIterations.class).run();
        assertEquals(3 + 1, runCount);
    }

    @Benchmark(iterations = 5, warmUpIterations = 3)
    public static class RunWith5Iterations3WarmUpIterations {
        @MeasureTime
        public void sample() {
            ++runCount;
        }
    }

    @Test
    public void testRunWith5Iterations3WarmUpIterations() {
        new DefaultRunner(RunWith5Iterations3WarmUpIterations.class).run();
        assertEquals(5 + 3, runCount);
    }

    @Benchmark
    public static class RunWithOverridden5Iterations {
        @MeasureTime
        public void sample() {
            ++runCount;
        }

        @MeasureTime(iterations = 5)
        public void sampleCustom() {
            ++runCountOfCustom;
        }
    }

    @Test
    public void testRunWithOverridden5Iterations() {
        new DefaultRunner(RunWithOverridden5Iterations.class).run();
        assertEquals(1, runCount);
        assertEquals(5, runCountOfCustom);
    }

    @Benchmark
    public static class RunWithOverridden3WarmUpIterations {
        @MeasureTime
        public void sample() {
            ++runCount;
        }

        @MeasureTime(warmUpIterations = 3)
        public void sampleCustom() {
            ++runCountOfCustom;
        }
    }

    @Test
    public void testRunWithOverridden3WarmUpIterations() {
        new DefaultRunner(RunWithOverridden3WarmUpIterations.class).run();
        assertEquals(1, runCount);
        assertEquals(1 + 3, runCountOfCustom);
    }
}

The annotation classes (not necessarily for review):

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

@Retention(RetentionPolicy.RUNTIME)
public @interface Benchmark {

    int iterations() default DefaultRunner.DEFAULT_ITERATIONS;

    int warmUpIterations() default DefaultRunner.DEFAULT_WARM_UP_ITERATIONS;
}

@Retention(RetentionPolicy.RUNTIME)
public @interface MeasureTime {

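    // An empty array means "not set": the runner then falls back to the @Benchmark value
    // (or the built-in default). Arrays are used because annotation members cannot have a null default.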
    int[] iterations() default {};

    int[] warmUpIterations() default {};
}

And the runner (not for review this time, just for your reference):

import java.lang.annotation.Annotation;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.lang.reflect.Modifier;
import java.util.ArrayList;
import java.util.List;

public class DefaultRunner implements Runner {

    public static final int DEFAULT_ITERATIONS = 1;
    public static final int DEFAULT_WARM_UP_ITERATIONS = 0;

    private final Class klass;

    private final int defaultIterations;
    private final int defaultWarmUpIterations;

    private final List<Method> methods = new ArrayList<>();

    public DefaultRunner(Class klass) {
        this.klass = klass;

        Annotation annotation = klass.getAnnotation(Benchmark.class);
        if (annotation != null) {
            Benchmark suite = (Benchmark) annotation;
            defaultIterations = suite.iterations();
            defaultWarmUpIterations = suite.warmUpIterations();
        } else {
            defaultIterations = DEFAULT_ITERATIONS;
            defaultWarmUpIterations = DEFAULT_WARM_UP_ITERATIONS;
        }

        for (Method method : klass.getDeclaredMethods()) {
            Class retClass = method.getReturnType();
            int length = method.getParameterTypes().length;
            int modifiers = method.getModifiers();
            if (retClass == null || length != 0 || Modifier.isStatic(modifiers) || !Modifier.isPublic(modifiers) || Modifier.isInterface(modifiers) || Modifier.isAbstract(modifiers)) {
                continue;
            }
            if (method.getAnnotation(MeasureTime.class) != null) {
                methods.add(method);
            }
        }
    }

    private Object newInstance() throws IllegalAccessException, InstantiationException {
        return klass.newInstance();
    }

    @Override
    public void run() {
        runQuietly();
    }

    private void runQuietly() {
        try {
            runNormally();
        } catch (InvocationTargetException | IllegalAccessException | InstantiationException e) {
            e.printStackTrace();
        }
    }

    private void runNormally() throws InvocationTargetException, IllegalAccessException, InstantiationException {
        Object instance = newInstance();
        for (Method method : methods) {
            Annotation annotation = method.getAnnotation(MeasureTime.class);
            if (annotation != null) {
                MeasureTime measureTime = (MeasureTime) annotation;
                runMeasureTime(instance, method, measureTime);
            }
        }
    }

    private void runMeasureTime(Object instance, Method method, MeasureTime measureTime) throws InvocationTargetException, IllegalAccessException {
        for (int i = 0; i < getWarmUpIterations(measureTime); ++i) {
            method.invoke(instance);
        }
        int iterations = getIterations(measureTime);
        long sumDiffs = 0;
        for (int i = 0; i < iterations; ++i) {
            long start = System.nanoTime();
            method.invoke(instance);
            sumDiffs += System.nanoTime() - start;
        }
        System.out.println(String.format("Average execution time of %s: %s", method.getName(), sumDiffs / iterations));
    }

    private int getWarmUpIterations(MeasureTime measureTime) {
        int[] warmUpIterations = measureTime.warmUpIterations();
        if (warmUpIterations.length > 0) {
            return warmUpIterations[0];
        }
        return defaultWarmUpIterations;
    }

    private int getIterations(MeasureTime measureTime) {
        int[] iterations = measureTime.iterations();
        if (iterations.length > 0) {
            return iterations[0];
        }
        return defaultIterations;
    }
}

The full (very small) project is on GitHub. The DefaultRunnerTest class is ready to run, and so is the RunSortingDemo class, if you want to play with it.

In summary, my main question is about the way I'm testing the behavior of the annotations, but any additional remarks are welcome. The framework itself is still a work in progress; I intend to submit other parts for review in other questions soon.