Writing tests for the AspectJ compiler

The AspectJ project has a harness which reads test specification files and runs tests. The tests are usually simple scenarios like "compile and run" or "compile expecting errors", but may involve multiple files, incremental compilation, classpath or aspectpath entries, etc. This document shows how to write tests that can be run by the harness and suggests some patterns to use in test code. Most people just writing a test case need to know only the information in Simple test definitions and Test source files.

Related documents:

Simple test definitions

Test definitions are specified in XML files. Here is a simple example (say, in a file ajcSample.xml) that compiles Main.java and expects an error on line 10:
<!DOCTYPE suite SYSTEM "../tests/ajcTestSuite.dtd">
<suite>
    <ajc-test dir="new" title="simple error test">
        <compile files="Main.java">
            <message kind="error" line="10"/>
        </compile>
    </ajc-test>
</suite>

Here is an example to compile pack/Aspect.java and pack2/Main.java and run the main class:

    <ajc-test dir="new" title="simple run test">
        <compile files="pack/Aspect.java,pack2/Main.java"/>
        <run class="pack2.Main"/>
    </ajc-test>
The compile and run steps of a given ajc-test share a common sandbox, so, for example, the run step sets its classpath using the classes directory generated by the compile step.

More complex compilations are discussed in Compiler Options below.

Test source files

The dir attribute in the ajc-test element specifies a base test directory relative to the directory of the test specification file. All paths are specified relative to this base test directory. E.g., the last example used dir="new" and presumed the following directory structure:
    {some dir}                  # test specification directory
        {testDefinition}.xml
        new/                    # test base directory
          pack/Aspect.java
          pack2/Main.java
Test cases with only one file in the default package can often share directories (e.g., see the many files in new/), but usually a test case has its own directory.
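For instance, a suite file might hold two single-file tests that share the new directory. This is a hypothetical sketch; the titles, file names, and class names are made up:

```xml
<!DOCTYPE suite SYSTEM "../tests/ajcTestSuite.dtd">
<suite>
    <!-- Both tests resolve their files relative to the shared "new" directory -->
    <ajc-test dir="new" title="first shared-dir test">
        <compile files="FirstTest.java"/>
        <run class="FirstTest"/>
    </ajc-test>
    <ajc-test dir="new" title="second shared-dir test">
        <compile files="SecondTest.java"/>
        <run class="SecondTest"/>
    </ajc-test>
</suite>
```

Each ajc-test still gets its own compile/run sandbox; only the source directory is shared.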

Incremental tests

Incremental tests are more complex because they involve updating source files before recompiling. Here's an example:
    <ajc-test dir="new/incremental1" title="incremental test">
        <compile staging="true" 
             sourceroots="." 
             options="-incremental" />
        <run class="Main"/>
        <inc-compile tag="20">
            <message kind="error" line="15"/>
        </inc-compile>
        <inc-compile tag="30"/>
        <run class="Main"/>
    </ajc-test>
To understand what's happening in this test would require looking at the source directory to see which files are tagged "20" and "30". But before walking through that, there's a variation of incremental building for AJDE. (The AJDE wrapper around the ajc compiler can also be driven by the test harness.)

In AJDE, incremental tests also involve the notion of "fresh builds", i.e., when the test reuses the same compiler and build configuration but rebuilds from scratch. In that case, there is still the question of whether source files should be updated; if not, the tag can have the special value "same". For example, if the last example had two more lines:

        ...
        <inc-compile tag="30"/>
        <run class="Main"/>
        
        <inc-compile fresh="true" tag="same"/>
        <run class="Main"/>
    </ajc-test>
The test would complete by rebuilding the same files from scratch and then running the main class. This option has no effect on the normal (ajc) compiler, and requires passing -ajdeCompiler as an argument to the harness or the compile step.

To recap, the attributes of note for setting up incremental tests are staging, sourceroots, and options (on compile) and tag and fresh (on inc-compile), all shown in the examples above.

Now, to get back to the question of what exactly is happening in an incremental test: compare the tags with the files in the test source directory; the tagged files are the updates for that particular step. (By convention the tags are numeric and in increasing order, but they need not be.) For example, here are some sources for the test above:

    {some dir}
        {testDefinition}.xml
        new/
          incremental1/
            DeleteMe.delete.30.java
            DeleteMe.java
            Main.20.java
            Main.30.java
            Main.java
            NewFile.30.java
Comparing this with the test specification, you can see the harness will run one compile and two re-compiles:
  1. Initially compile Main.java and DeleteMe.java
        <compile staging="true"
                  files="Main.java,DeleteMe.java"/>
    
        {some dir}
            {testDefinition}.xml
            new/
              incremental1/
                ...
                DeleteMe.java
                ...
                Main.java
                ...
      
  2. For incremental tag 20, update Main.java with the contents of Main.20.java and recompile, expecting an error on line 15:
        <inc-compile tag="20">
            <message kind="error" line="15"/>
        </inc-compile>
    
        {some dir}
            {testDefinition}.xml
            new/
              incremental1/
                ...
                Main.20.java
                ...
      
  3. For incremental tag 30, delete DeleteMe.java, add NewFile.java, update Main.java with the contents of Main.30.java and recompile with no error or warning messages:
        <inc-compile tag="30"/>
    
        {some dir}
            {testDefinition}.xml
            new/
              incremental1/
                DeleteMe.delete.30.java
                ...
                Main.30.java
                ...
                NewFile.30.java
      
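To make the tag-30 step concrete, the tagged files might look something like this. The contents are hypothetical; only the file-naming convention comes from the harness:

```java
// Hypothetical contents of NewFile.30.java: the file added at tag 30
// (the harness copies it into the staging directory as NewFile.java).
class NewFile {
    static String greeting() {
        return "hello from the tag-30 build";
    }
}

// Hypothetical contents of Main.30.java: replaces Main.java at tag 30.
// The reference to NewFile compiles only once NewFile.java exists,
// i.e., in the tag-30 step.
class Main {
    public static void main(String[] args) {
        System.out.println(NewFile.greeting());
    }
}
```
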

Verifying test steps

As seen above, two ways to verify that a compile was successful are to run the corresponding class or check the compiler messages. More generally, the harness can verify compile/run test steps by detecting the following things and comparing against expected behavior:

Detect                                            Evaluate
Exceptions                                        Signal failure
Result value                                      Heuristically compare with expected: compiles not expecting errors should return a normal result status, and vice versa
Messages (e.g., compiler warnings and errors)     Compare with expected messages
Directory changes (e.g., .class files created)    Compare with expected changes
Runtime behavior                                  Use Tester in test source code to signal events for comparison with expected events

Messages
In a test definition, a nested message element specifies a condition on the successful completion of the enclosing ajc-test sub-element. In the earlier example, if the harness does not detect an error message on line 10, or if there are unexpected messages, then the compile step is reported as failing:
    <ajc-test dir="new" title="simple error test">
        <compile files="Main.java">
            <message kind="error" line="10"/>
        </compile>
    </ajc-test>
Expected messages can be specified as sub-elements of the three ajc-test elements compile, inc-compile, and run. Messages require a kind (error or warning) and a line. To make specification easier, if an error is specified for a line, the harness accepts any number of errors on that line as expected.

Most messages fall into those categories. However, an IMessage also has a Throwable thrown, a String detail, and a List of ISourceLocation (essentially, "see also", to point to other relevant places in the code). The thrown element is not supported, but you can specify the others:

    <ajc-test dir="new" title="simple error test">
        <compile files="Main.java">
            <message 
                kind="error" 
                line="10" 
                file="Main.java"
                text="This join point should never happen!"
                detail="(Find the declaring code below.)">
                <source line="12" file="Main.java"/>
                <source line="14" file="Main.java"/>
            </message>
        </compile>
    </ajc-test>
This compiler-error test specifies a single error message triggered on line 10 of Main.java, with some text and details and two other source locations that are relevant, on lines 12 and 14 of the same file.

When specifying messages, be sure to provide enough detail for the harness to distinguish the expected messages. For example, if you specify only the line number, it will match a message on that line in any file (if there is more than one). If two or more messages are expected on a line, provide enough information to distinguish them. If you use text or detail attributes, do not make one string a prefix of the other, since it will match either message, and the other message might then go unmatched.
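For example, two warnings expected on the same line might be distinguished by non-overlapping text attributes. This is a hypothetical sketch; the message texts are made up:

```xml
<ajc-test dir="new" title="two warnings on one line">
    <compile files="Main.java">
        <!-- Neither text is a prefix of the other,
             so each expected message matches only one actual message -->
        <message kind="warning" line="8" text="unmatched pointcut"/>
        <message kind="warning" line="8" text="deprecated syntax"/>
    </compile>
</ajc-test>
```
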

The "info" messages are special in that they are normally ignored. To specify expected "info" messages, you have to list all the info messages the compiler will issue, which can vary with the compiler settings. Use the option ^verbose to force the compiler's -verbose option off.

By the same token, if you don't specify any extra source locations, then they will not be checked. If you think it is a bug if they are issued, then you have to specify one of them. (There is currently no way to specify that a message has no extra source locations.)

Changes in an output directory
As with messages, specifying directory changes as a nested element operates as a condition on the successful completion of the enclosing element. The harness will check for files added, removed, updated, or unchanged in a given directory. The directory is specified explicitly or using a token for the shared classes or run directory. For even more brevity, the harness supports a default suffix for the files.

Directory changes have so far been used only to validate changes in the classes directory. The current harness defaults to the classes directory, and when using the classes directory applies .class as the default suffix.

Here's an example specification:

    <ajc-test dir="new/dirchanges-test" title="dir-changes test">
        <compile staging="true"
                   files="Main.java,DeleteMe.java,Unchanged.java"/>
        <inc-compile tag="20">
            <dir-changes updated="Main" 
                         removed="DeleteMe"
                       unchanged="Unchanged"/>
        </inc-compile>
    </ajc-test>
It checks after the tag-20 recompile that Main.class was updated, DeleteMe.class was removed, and Unchanged.class was not changed.

Runtime behavior
Code that tests aspects often falls into the pattern of comparing expected and actual events or signals. For example, to prove that before advice in fact ran before a particular method execution, you might generate and expect signals corresponding to
  1. method-call
  2. before advice execution
  3. method-execution
The Tester utility class provides APIs for signalling actual events, registering expected events, and comparing the two. Typically, events are symbolized and compared as Strings. Here's a small sample test case for the scenario above:
import org.aspectj.testing.Tester;

public class Main implements Runnable {
    public static void main(String[] args) {
        Tester.expectEvent("before advice");
        Tester.expectEvent("execute run");
        new Main().run();     
        Tester.checkAllEvents();
    }
    public void run() {
        Tester.event("execute run");
    }
}

aspect A {
    before () : target(Runnable) && execution(void run()) {
         Tester.event("before advice");
    }
}
If either the advice or the method does not run, the harness will report a failure.

Tester also has methods that operate like JUnit assertions, as idioms to detect differences between expected and actual values, signalling appropriately.

Tester is at ../testing-client/src/org/aspectj/testing/Tester.java and is built into ../lib/tests/testing-client.jar which is included on the classpath by the compile and run steps.

You can write runtime test cases without using Tester; simply throw some exception from the main thread to signal failure.
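A minimal sketch of that Tester-free style; the class and the check it performs are hypothetical:

```java
// A Tester-free runtime test: signal failure by throwing from main.
class Main {
    public static void main(String[] args) {
        if (!invariantHolds()) {
            // An uncaught exception makes the harness report the run step as failing
            throw new RuntimeException("invariant violated");
        }
    }

    // Hypothetical check standing in for whatever the test verifies
    static boolean invariantHolds() {
        return 2 + 2 == 4;
    }
}
```
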

Compiler options

The harness does not support all of the AspectJ 1.1 compiler options. Flags are mainly supported through a comma-delimited list in the options attribute:
    <ajc-test dir="new" title="lint test">
        <compile files="LintTest.java" 
                 options="-Xlint,-emacssym,-source,1.4">
            <message kind="warning" line="22"/>
        </compile>
    </ajc-test>
This should work even for complex single-arg options like -g:none, but will fail for comma-delimited single-arg options like -g:lines,vars because the comma delimiters are ambiguous (yes, a design bug!).

The compile element has the following attributes which handle most of the other compiler arguments:

Paths for these are all relative to the test base directory, and multiple entries are separated with commas. (Use only one entry for xlintfile.)

Here is a cooked example that uses all compiler attributes:

    <ajc-test dir="new" title="attributes test">
        <compile files="Main.java,injar.jar,some-directory" 
               staging="true"
               options="-Xlint,-g:none"
              argfiles="debug.lst,aspects/test.lst"
            aspectpath="jars/requiredAspects.jar"
             xlintfile="ignore-all-but-typenotfound.properties"
             classpath="providedClassesDir,jars/required.jar"/>
        <inc-compile tag="20"/>
    </ajc-test>
Test-only compiler attributes
The following attributes of the compiler entity dictate harness behavior:
Unsupported compiler options
The harness does not support the following AspectJ compiler options: -outjar {file}, -log {file}. (The -d {dir} option is used internally, but specifying it is not supported.)

Background information on the Harness

To make the test specifications as terse as possible, harness components for the inc-compile and run elements use information set up earlier by compile, some of which is only implicit. When a test is run, the harness creates a staging directory for temporary files and a sandbox component for sharing information between test components, particularly classpath entries shared between the compile and run components. The compile and run components share classpath information through the sandbox, adding default libraries.

The harness provides some more advanced behaviors, which you might see specified in the tests. For more information, see the API documentation for the harness (org/aspectj/testing/drivers/package.html).
last updated March 8, 2004