Problems with Defensive Collection Getters and JAXB


A good general practice with collection getters is to never hand out a direct reference to the underlying collection; instead, return either a defensive copy or an unmodifiable wrapper around it. Further, it's often desirable to have an addXXX and removeXXX instead of a collection setter to complete the encapsulation picture.



public class Order {
    private final List<Item> items = new ArrayList<Item>();

    public List<Item> getItems() {
        return Collections.unmodifiableList(items);
    }

    // .. no setter

    public void addItem(Item item) {
        items.add(item);
    }

    public void removeItem(Item item) {
        items.remove(item);
    }
}

This practice runs awry when using this object for marshalling XML with JAXB. What you will notice when it tries to unmarshal an XML message representing this object is that the resulting items list will be empty.

That's really too bad.

What is happening is that JAXB, in the absence of a collection setter, calls the getter in the hope of calling "add" on the reference that it gets back. Unfortunately, since we are returning an unmodifiable list, JAXB silently fails by not adding any items at all and moving on.

The solution is actually pretty simple. We just need to tell JAXB to look at the field instead of the method. You do this with two annotations:

public class Order {
    @XmlElementWrapper(name = "items")
    @XmlElement(name = "item")
    private List<Item> items = new ArrayList<Item>();

    // ...
}
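
With that in place, here is a minimal round-trip sketch. It assumes Order is also annotated with @XmlRootElement(name = "order") and that Item maps a single name element; both are my assumptions, not part of the original example:

import java.io.StringReader;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Unmarshaller;

public class OrderUnmarshalDemo {
    public static void main(String[] args) throws Exception {
        String xml = "<order><items><item><name>widget</name></item></items></order>";

        JAXBContext context = JAXBContext.newInstance(Order.class);
        Unmarshaller unmarshaller = context.createUnmarshaller();

        // With the field-level annotations, JAXB populates the field directly
        // and never trips over the unmodifiable view returned by the getter.
        Order order = (Order) unmarshaller.unmarshal(new StringReader(xml));
        System.out.println(order.getItems().size()); // prints 1
    }
}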

And that's all! Now, you can keep your defensive collection encapsulation and still allow JAXB to do its unmarshalling. Enjoy!


Adding paths to your class loader


So, a class loader is immutable in Java, right? At least, there are no setter methods in the public API (except the assert stuff) and there is no obvious way to specify what class loader you might want to use where.

This came to bug me while creating a maven plugin the other day in which the plugin needed to read a specific classpath resource from the project it was running in. Since maven plugins run in their own class loader, I wasn't going to be able to access project classpath resources.

I might have been able to add the @requiresDependencyResolution metadata annotation to resolve the problem, but we really didn't want to box ourselves into needing an enclosing project to run the plug-in.

The Maven Exec Plugin gave me an idea.

The Maven Exec Plugin is the maven-y way to run command-line Java through a Maven goal. It can run either within the Maven process as a separate thread or as a separate process entirely. Either way, it has the same challenge of propagating the enclosing project's classpath on to a separate and distinct classpath context.

In the in-maven-process case, what they do is create their own classloader and then run the invocation of the main method inside a separate thread, setting that thread's context classloader along the way:



private void executeWithClassLoader(Runnable runnable, ClassLoader classLoader) throws MojoExecutionException {
    IsolatedThreadGroup threadGroup = new IsolatedThreadGroup(runnable.getClass().getName());

    Thread bootstrapThread = new Thread(threadGroup, runnable, runnable.getClass().getName() + ".run()");
    bootstrapThread.setContextClassLoader(classLoader);
    bootstrapThread.start();

    try {
        bootstrapThread.join();
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt(); // good practice if we don't rethrow
        getLog().warn("interrupted while joining against thread " + bootstrapThread, e); // not expected!
    }

    synchronized (threadGroup) {
        if (threadGroup.uncaughtException != null) {
            throw new MojoExecutionException("An exception occurred while executing the Java class. "
                    + threadGroup.uncaughtException.getMessage(),
                    threadGroup.uncaughtException);
        }
    }
}


The Runnable that is passed in is the section of code where you actually need access to the project's classpath. The ClassLoader is created like this:

URL outputDirectory = new File( project.getBuild().getOutputDirectory() ).toURI().toURL();
ClassLoader classLoader = new URLClassLoader( new URL[] { outputDirectory } );
this.executeWithClassLoader(runnable, classLoader);

Of course, there may be other directories or artifacts that you need to add for your runnable to function correctly, but that is the basic idea.
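
Inside the Runnable, the project's resources are then reachable through the thread's context classloader. A minimal sketch (the resource name is hypothetical):

Runnable runnable = new Runnable() {
    public void run() {
        // Resolved against the URLClassLoader built from the project's output
        // directory, not against the plugin's own classloader.
        InputStream in = Thread.currentThread().getContextClassLoader()
                .getResourceAsStream("my-config.properties"); // hypothetical resource
        if (in == null) {
            throw new IllegalStateException("resource not found on project classpath");
        }
        // ... read the resource ...
    }
};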

For more ideas, check out http://svn.codehaus.org/mojo/tags/exec-maven-plugin-1.2/src/main/java/org/codehaus/mojo/exec/ExecJavaMojo.java


EasyMock and varargs


EasyMock is a neat tool for creating mocks at runtime for unit tests. It does some pretty cool things, including exposing its main API as a fluent interface.

For example, when adding behavior to your mocks, you would call the following:


EasyMock.expect(...).andReturn(...).times(3).andReturn(...).times(2)...


Awesome.

There is one feature, though, that is lacking and that can send you for a loop if you are unaware: varargs.

If you need to add behavior to a method that takes varargs, you will need to know how many parameters are going in at test time.

For example, say I have the following method that I want to supply behavior for:


Object myMethod(Object... args)


In EasyMock, there isn't a way to say "this is a varargs method". So, you will need to expand it according to your test case:


EasyMock.expect(mock.myMethod(arg0, arg1, arg2)).andReturn(...)...


The reason is that EasyMock's strategy for method matching is to compare the number of arguments in your recorded behavior with the number of arguments in the actual method invocation. If you just say


EasyMock.expect((String[])EasyMock.anyObject())...


or something like that, it will see that there is only one parameter, and so the recorded expectation will never match the actual method invocation.
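
Here is a compact sketch of the whole thing in practice (the Service interface is made up for illustration):

import static org.junit.Assert.assertEquals;

import org.easymock.EasyMock;
import org.junit.Test;

public class VarargsMockTest {
    // hypothetical collaborator with a varargs method
    interface Service {
        Object myMethod(Object... args);
    }

    @Test
    public void expandVarargsToMatchTheTestCase() {
        Service mock = EasyMock.createMock(Service.class);

        // Expand the varargs to exactly the arguments the test will pass.
        EasyMock.expect(mock.myMethod("a", "b", "c")).andReturn("ok");
        EasyMock.replay(mock);

        assertEquals("ok", mock.myMethod("a", "b", "c"));
        EasyMock.verify(mock);
    }
}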


Cross-referencing plugins in Sonar 2.2


Formerly, we had three Sonar plugins, two of them dependent (in the Maven sense) on the third. The parent plugin held the common code for uploading non-Java files into Sonar for reporting. The other two took care of analyzing xml and css, respectively, and tying violations to those files.

This worked great in Sonar 2.0.1, but when we upgraded to Sonar 2.2, the violation coloring stopped working on the server for these files.

What could be the problem? I walked carefully through the Sonar code and saw that the violations were making it into the database, but that two resources were getting created for each file in the project.

This didn't make a whole lot of sense, since the files and their violations (and other metrics) are housed in the singleton DefaultSonarIndex as a map of files to metrics (the class that actually holds a resource's metrics is called Bucket). What could be causing two records of the same file to make it into the map?

Enter the Resource equals method:


1.  public boolean equals(Object o) {
2.    if (this == o) {
3.      return true;
4.    }
5.    if (o == null || getClass() != o.getClass()) {
6.      return false;
7.    }
8.
9.    Resource resource = (Resource) o;
10.   return key.equals(resource.key);
11.
12. }

This is a pretty standard-looking equals method that doesn't really seem to be the suspect, since I verified in my debugger that the keys are the same. The crazy thing is that it breaks on line 5.

What?? o is definitely not null and they are definitely the same class...oh, right...as of Sonar 2.2, each plug-in loads in its own classloader. The first loading of the class was by the parent plugin to load the source into Sonar and the second loading was by the child plugin to specify violations.
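
You can reproduce the effect in isolation: the same class loaded by two different classloaders yields two distinct Class objects, so getClass() != o.getClass() is true even for "the same" class. A sketch (the jar path and class name are made up):

import java.net.URL;
import java.net.URLClassLoader;

public class TwoLoadersDemo {
    public static void main(String[] args) throws Exception {
        URL jar = new URL("file:/path/to/shared-classes.jar"); // hypothetical jar

        // Parent is null, so each loader defines the class for itself.
        ClassLoader first = new URLClassLoader(new URL[] { jar }, null);
        ClassLoader second = new URLClassLoader(new URL[] { jar }, null);

        Class<?> a = first.loadClass("com.example.Resource");  // hypothetical class
        Class<?> b = second.loadClass("com.example.Resource");

        System.out.println(a == b); // false: same bytes, different classes
    }
}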

So the quick solution was to take the small amount of code in the parent project and distribute it to the others. The cleaner fix would be to refactor it so that only one project is actually referring to the classes (probably the children).

Phew. That was a tricky one.


Integration Tests in Sonar

Integration tests are another important aspect of analyzing a project's overall health that Sonar does not yet support out of the box. To get this functionality, you'll need to build a couple of Sonar plugins (or try using the ones that I built) that will instrument your integration test code, run the integration tests, and collect the integration test results as well as the new coverage data.

It sounds like a tall order, but a lot of the work has already been done for you.

Sonar runs the unit tests in your project automatically by way of Surefire. Sonar has a Surefire plugin which executes the surefire:test goal and then collects the results by reading the TEST-xxx.xml files that it produces.

So, why not do the same with Failsafe, the Maven plugin for running integration tests? It sounds pretty simple, and there are only a couple of catches.

The first catch is that there are no pre-defined metrics in Sonar for integration tests. So, if you don't mind piggybacking on the unit test metrics--meaning that your success, failure, and coverage numbers will be aggregated across both unit and integration tests--then simply copy the SurefireSensor, changing only the part where the metrics are saved so that it updates the existing metrics instead.

The second is a corner case: what if you don't have any unit tests, but you do have integration tests? If the Surefire sensor doesn't find any unit tests to run, then it preemptively sets one of the metrics that is later overwritten by the UnitTestDecorator (read: things explode). The only way around this one (that I've found) is to add another sensor to our Failsafe plugin that, when the Sonar batch process first starts up, creates a dummy, empty Surefire test result for the Surefire sensor to catch. While temporary and dummy files give me an icky feeling inside, it does the trick.
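
The dummy result can be as small as an empty test suite dropped where the Surefire sensor looks for reports. A sketch (the directory, file name, and attribute set here are illustrative, not taken from the actual plugin):

import java.io.File;
import java.io.FileWriter;

// Write a minimal, empty surefire-style result so the sensor has something to read.
File reportsDir = new File("target/surefire-reports"); // illustrative location
reportsDir.mkdirs();
FileWriter writer = new FileWriter(new File(reportsDir, "TEST-Dummy.xml"));
writer.write("<testsuite name=\"Dummy\" tests=\"0\" failures=\"0\" errors=\"0\" skipped=\"0\" time=\"0\"/>");
writer.close();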

One more involved way around both of these is to create your own set of metrics and a decorator to display them. That way, you wouldn't collide with any of Sonar's efforts to run unit tests. I haven't done this yet, and I suspect there is more involved than that, so I didn't touch that route.

Okay, the failsafe plugin takes care of running and reporting on integration test execution. What about code coverage?

My coverage tool of choice is Emma. A while ago, Sonatype wrote an article explaining how to use emma4it to add integration test code coverage to emma. So, we can follow the same pattern, creating a Sonar plugin to execute the appropriate maven commands.

The only tricky part here was telling Sonar when to run each command. As far as I understand it, one cannot specify the Maven lifecycle points at which to run each Maven goal. Instead, Sonar invokes all the goals serially at the point when it is that sensor's turn to execute.

What we really want is emma:instrument and emma4it:instrument-project-artifact to happen together, then the failsafe sensor, and then emma4it:report. Hmm...

The way we solved it was to have three different MavenPluginHandlers and Sensors in our emma4it Sonar plugin. The emma:instrument and emma4it:instrument-project-artifact sensors are declared as dependent on Emma finishing, and emma4it:report is configured to run in the Phase.Name.POST phase. Failsafe, also declared as dependent on code coverage finishing, falls by default in between the two.
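
As a rough sketch, the POST-phase hook looks something like this (the class name is mine and the bodies are stubbed; @Phase and Sensor come from the Sonar batch API):

import org.sonar.api.batch.Phase;
import org.sonar.api.batch.Sensor;
import org.sonar.api.batch.SensorContext;
import org.sonar.api.resources.Project;

// Runs after the normal sensor phase, so the coverage data written by the
// integration tests is already on disk when the report sensor reads it.
@Phase(name = Phase.Name.POST)
public class Emma4itReportSensor implements Sensor { // hypothetical class

    public boolean shouldExecuteOnProject(Project project) {
        return true; // real code would check language and configuration
    }

    public void analyse(Project project, SensorContext context) {
        // read the emma4it report and save the measures here
    }
}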

If there were a closer mapping to post-integration-test, process-classes, etc. in the future, this piece would become a lot cleaner.

I'm going to spend a bit of time cleaning up my code before I upload it, but after that, you are free to code by example. :)


Reporting more than Java code in Sonar (Part I)


Of course, anyone that has done static analysis on their project in the past has found certain bad practices that are out of their tools' reach to spot. Some examples are:
  • Front-end code, like CSS and HTML
  • Configuration files, like a Maven pom.xml or a Spring applicationContext.xml
  • Localization files
While not supported out-of-the-box, Sonar makes reaching and reporting on these areas of your project much easier. Basically, here are the steps to getting Sonar to report on additional languages:
  1. Make Sonar aware of your new language.
  2. Attach quality rules for that language to an existing Quality Profile.
  3. Create/use a tool that will detect the bad practices you are looking for.
  4. Hook that tool together with Sonar either via the tool's Maven build plug-in or by invoking it programmatically within the Sonar plug-in framework.

Make Sonar Aware of Your New Language

First, it is easy to make Sonar aware of an additional language. In our case, we wanted to address configuration found in various xml application files for Spring, Maven, JSF, and the like. The following files are required:
  • XmlPlugin.java - What you are making in the most basic sense is a Sonar plug-in. For each Sonar plug-in, there is a main plugin file like this one. I won't go into this here. Instead, I'd recommend you look at the Sonar Plug-in Documentation.
  • Xml.java - This file is what the rest of Sonar will refer to when it asks what language a file is, etc. Ours looks like this:

    public class Xml extends AbstractLanguage {
        protected static final String[] EXTENSIONS = { "xml", "xhtml" };
        public static final Xml INSTANCE = new Xml();

        public Xml() {
            super("xml", "XML"); // 'key' and 'name', or, in other words, internal and external names
        }

        public String[] getFileSuffixes() {
            return EXTENSIONS.clone();
        }

        // ... some other helper methods
    }


  • XmlFile.java and XmlPackage.java - These two classes represent the xml file and directory metadata. They aren't a way to get at the contents of the file, but rather its name, location, etc. They both extend org.sonar.api.resources.Resource.

  • XmlSourceImporter.java - This file is in charge of looking up all the "xml" files in the project and notifying Sonar about them. Because we also want to include the main pom.xml file, which is outside of the source directory, this class is a little more complicated. However, you can easily pull out the basics from it:

    public class XmlSourceImporter extends AbstractSourceImporter {
        public XmlSourceImporter() {
            super(Xml.INSTANCE);
        }

        public void analyse(Project project, SensorContext context) {
            try {
                doAnalyse(project, context);
            } catch (IOException e) {
                throw new SonarException("Parsing source files ended poorly", e);
            }
        }

        protected XmlFile createResource(File file, List<File> sourceDirs, boolean unitTest) {
            ... create an XmlFile object ...
        }

        /* Depending on needs, one might check what kind of project this is. In this case, though, we want to execute this importer on every project, since we anticipate the existence of xml files in every project. */
        public boolean shouldExecuteOnProject(Project project) {
            return isEnabled(project);
        }

        protected void doAnalyse(Project project, SensorContext context) throws IOException {
            ProjectFileSystem fileSystem = project.getFileSystem();
            File root = fileSystem.getBasedir();
            List<File> sourceDirs = fileSystem.getSourceDirs();
            sourceDirs.add(root);
            List<File> sourceFiles = new ArrayList<File>();
            List<File> xmlFiles = ...magical method call that looks recursively in the sourceDirs for xml files...
            for (File xmlFile : xmlFiles) {
                // if it is in a derived directory, like the build directory, we don't want it
                if (DefaultProjectFileSystem.getRelativePath(xmlFile, fileSystem.getBuildDir()) == null) {
                    sourceFiles.add(xmlFile);
                }
            }
        }
    }


Attach Quality Rules

Now, what is not so obvious is how to get Sonar to report on all languages, Java and otherwise, on one dashboard. In fact, judging from an email exchange I had with the Sonar developers, they don't officially support it yet. But we found a way! And it works great for us.

The key lies in creating a rules repository that contributes to the existing Java quality profile that you already have set up. A rules repository is just another Sonar extension, one that represents an xml rule configuration file and marshals the contents of that file into a RulesProfile object.

Here is an example of what the getProvidedProfiles method might look like:


public List<RulesProfile> getProvidedProfiles() {
    RulesProfile profile = new RulesProfile("My Profile", Java.KEY);
    profile.setDefaultProfile(true);
    profile.setProvided(true);

    List<Rule> rules = getInitialReferential();
    List<ActiveRule> activeRules = new ArrayList<ActiveRule>();
    for (Rule rule : rules) {
        activeRules.add(new ActiveRule(profile, rule, rule.getPriority()));
    }
    profile.setActiveRules(activeRules);

    return Arrays.asList(profile);
}


In the end, it's a little bit of a hack.

The only other thing that is necessary from Sonar's perspective is a way to read the violations file that your analysis tool creates. We modeled ours after PMD's violations file. Here, you can extend AbstractViolationsXmlParser and follow the pattern in the PMD Sonar Plugin.




Creating a JAX-RS Compliant Stub for RestTemplate


I really like RestTemplate, but I have a small beef with its lack of JAX-RS support (go to Arjen Poutsma's comments below for Spring's rationale for not building it in). So, if you are like me and want to use JAX-RS annotations and RestTemplate together, here is what I did.

First, a little bit of background on what JAX-RS looks like. JAX-RS is the result of JSR 311, which includes a set of method and class annotations. These annotations are interpreted by a JAX-RS REST provider like Jersey or CXF. A typical class might be annotated in the following way:


@Path("/myResource")
@Consumes("application/json")
public interface MyEndpoint {
@GET
@Path("/myAction/{resourceId}")
@Produces("application/json")
MyClass getResource(@PathParam("resourceId") String resourceId);
}


Then, with some configuration, CXF and the like will create the client that translates method invocations into the appropriate HTTP request.

So, the goal is to create some sort of proxy or instrumentation piece that will take this interface and translate invocations to it into the correct RESTTemplate method invocation.

I only needed a handful of the features from the JAX-RS spec, which basically included:
  • Support for the GET, POST, PUT, and DELETE HTTP methods.

  • Support for url templates via PathParam, QueryParam, and FormParam.

  • Support for Path annotations at the class and method level.
One possibly important thing for other people that I DIDN'T tackle:
  • Support for one method parameter having no JAX-RS annotation.
(Note: I decided to create a Java InvocationHandler, but there are several ways to go about this, including Spring AOP.)
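
Wiring the handler up is the standard dynamic-proxy dance. A sketch (JaxRsInvocationHandler is my hypothetical handler class, and the base URL is made up):

import java.lang.reflect.Proxy;
import org.springframework.web.client.RestTemplate;

RestTemplate restTemplate = new RestTemplate();

// JaxRsInvocationHandler is the hypothetical InvocationHandler that inspects
// the JAX-RS annotations and delegates to the matching RestTemplate call.
MyEndpoint endpoint = (MyEndpoint) Proxy.newProxyInstance(
        MyEndpoint.class.getClassLoader(),
        new Class<?>[] { MyEndpoint.class },
        new JaxRsInvocationHandler(restTemplate, "http://example.com/api"));

MyClass resource = endpoint.getResource("42"); // becomes a GET to /myResource/myAction/42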

In the case of a JAX-RS client, the HTTP method annotations actually translate into different RestTemplate method calls. Since I wasn't doing location requests, I have a one-to-one mapping from JAX-RS HTTP method annotations to RestTemplate method calls:


if ( httpMethod instanceof POST ) {
return restTemplate.postForObject(url + queryParamExtractor.getExtracted(), formParamExtractor.getExtracted(), method.getReturnType(), pathParamExtractor.getExtracted());
} else if ( httpMethod instanceof GET ) {
return restTemplate.getForObject(url + queryParamExtractor.getExtracted(), method.getReturnType(), pathParamExtractor.getExtracted());
} else if ( httpMethod instanceof DELETE ) {
restTemplate.delete(url + queryParamExtractor.getExtracted(), pathParamExtractor.getExtracted());
} else if ( httpMethod instanceof PUT ) {
restTemplate.put(url + queryParamExtractor.getExtracted(), formParamExtractor.getExtracted(), pathParamExtractor.getExtracted());
}


The more involved part is checking each method parameter and appropriately interpreting the JAX-RS method parameter annotations. I just needed PathParam, QueryParam, and FormParam:


for ( int i = 0; i < allParameterAnnotations.length; i++ ) {
Annotation[] parameterAnnotations = allParameterAnnotations[i];
Object arg = args[i];
for ( Annotation parameterAnnotation : parameterAnnotations ) {
if ( parameterAnnotation instanceof PathParam ) {
pathParamExtractor.extractFrom(((PathParam)parameterAnnotation).value(), arg);
} else if ( parameterAnnotation instanceof QueryParam ) {
queryParamExtractor.extractFrom(((QueryParam)parameterAnnotation).value(), arg);
} else if ( parameterAnnotation instanceof FormParam ) {
formParamExtractor.extractFrom(((FormParam)parameterAnnotation).value(), arg);
}
}
}


Everything else is just helper classes. Pay no attention, for example, to the variables 'pathParamExtractor', etc. They are simply helpers that accumulate the values of those parameters in a way that satisfies the url template.

Hope this gets you started, too.


Some Findbugs detectors

At our company, we've been creating various static analysis rules using PMD, Findbugs, and a home-grown tool all aggregated into a Sonar dashboard. It's pretty cool.

Here are some of the rules that we created and why we chose each particular tool:

Findbugs



Anemic Domain Modeling


Sounds pretty lofty, I know, but I think that we found a decent strategy that catches 80% of these issues.

To detect the practice of Anemic Domain Modeling, it seemed best to look from the point of view of the service provider, where the logic is usually coded. We wanted to look for something like this:

public double getOrderTotal(Order o) {
    double total = 0.0;
    for (OrderItem oi : o.getItems()) {
        total += oi.getQuantity() * oi.getPrice();
    }
    return total;
}

(By the way, it isn't my purpose here to debate why Anemic Domain Modeling might, indeed, be an anti-pattern. Read the article referenced to get started on that.)

What we noticed is that in these simple cases 1) no service state was being referenced and 2) no exit point other than those originating from the method parameter was being invoked. So, our detector checks for these two conditions and, finding both, logs a bug.

The reason for FindBugs here over PMD is that FindBugs, operating on the Java bytecode, has finer-grained access to the data types of each reference. This was the most common reason for choosing FindBugs over PMD when we did.


Unsaved Object Operation Detector


One of the bugs that has bitten me in the past is when I use an API that appears to mutate the calling object, but doesn't. Consider BigInteger, for example:

BigInteger original = BigInteger.ONE;
original.add(BigInteger.ONE);
// what is the value of 'original'?

The value of 'original' is still one because BigIntegers are immutable. In other words, any method call on BigInteger that returns a BigInteger is actually returning a new instance, not the original instance modified. This happens with String and several other classes.
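
The fix is simply to capture the returned instance:

BigInteger original = BigInteger.ONE;
original = original.add(BigInteger.ONE); // reassign: add() returns a new instance
// 'original' is now two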

To check for this, we currently maintain a list of APIs that exhibit this behavior and then detect when a method that returns a value is called and its return value is not used in the calling code, just as above.

FindBugs was used here because, by using the bytecode, it can derive the data type of a reference after it has been declared. PMD can't do this (at least not to our knowledge) because it represents the code as an Abstract Syntax Tree, and it turns out to be fairly difficult to infer such cross-cutting relationships in a hierarchical representation.


PMD




Avoid Run On Method Statements


The decision here was to try and make code more maintainable and modular by encouraging developers to not use run-on statements:

Object var2 = var1.method1().method2().method3().method4(var3).method5();

The code above has at least two problems. First, if method2 returns null, then it will result in a NullPointerException that is hard to debug. Second, unit tests become difficult to write, because several levels of returns now need to be mocked in order to prepare the variable 'var2'. A possible third problem is that the method containing this line of code will have a much higher RPC, making it more prone to faults and more brittle to change. (The reason I think it's only a possible third is that, even if these were separated into five different statements, the method would still have this same problem.)
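
Breaking the chain into intermediate statements, as below, sidesteps the first two problems: a NullPointerException now points at a single step, and each step can be mocked or inspected on its own (the intermediate types are illustrative):

Intermediate1 step1 = var1.method1();      // illustrative types
Intermediate2 step2 = step1.method2();     // an NPE now identifies the exact step
Intermediate3 step3 = step2.method3();
Intermediate4 step4 = step3.method4(var3);
Object var2 = step4.method5();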

In this detector, then, we find statements that make more than three chained invocations and notify the developer.

We chose PMD for this because, in the bytecode world, it became very, very difficult to decide when a method invocation was part of the same chain. Telling the difference between an invocation chain and separate invocations would mean keeping track of a symbol table and program stack, which sounded like a lot of work.

Chains can be easily verified when looking at Java syntax, though, which is what PMD does. What took us 12 hours of trying and re-trying in FindBugs took us only 20 minutes in PMD.


Avoid Methods With Same Name As Class


I stole this one from Java Puzzlers. I didn't even know that this was allowable in Java:

public class MyClass {
    public void MyClass() {
        // body
    }
}

You'll notice that the second line appears to be the signature for a constructor, but it is instead the signature for a method! (Notice the 'void' keyword; that makes it a method.) This would be crazy confusing, I think, if someone actually did it, so we made a rule.

Again, since this was really verifying syntax, PMD was the easiest approach.


Home-grown


I'll talk about our home-grown stuff in another post, because I have a lot of background to give. The basic idea, though, is that the overall health of a project isn't limited to the Java code. Ideally, we'd like to take a stab at HTML, Javascript, CSS, Maven dependencies, and more. There didn't appear to be anything out there that analyzed these types of files, so we created one. More next time.


Findbugs and Source Code Lookup in Eclipse


One of the first roadblocks that I ran into with Findbugs was how to see the Findbugs source code in Eclipse.

This is actually pretty simple, but I hadn't taken the time to do it, yet.

First, download the Findbugs source code:


wget http://sourceforge.net/projects/findbugs/files/findbugs/1.3.9/findbugs-1.3.9-source.zip/download


Second, unzip.

Third, go to the src/java directory, and jar up the sources:


cd findbugs-1.3.9/src/java
jar cvf findbugs-1.3.9-sources.jar .


Fourth (if you are using Maven), copy into .m2/repository/findbugs/findbugs/1.3.9.

Now, when you try to open one of the classes in Eclipse, it will ask you to attach the source code, which you can do by referring to the location of that jar. Woohoo!


Woodstox and the w3c 503 error


This morning, I was testing our static analysis tool and it threw a very strange error:


Could not read source file: Server returned HTTP response code: 503 for URL http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd

It came from the Woodstox parser when it was trying to parse the dtd referenced in the doctype declaration of one of our files.

So, I browsed to said dtd and found this page instead:


IP blocked due to re-requesting files too often



Your IP address has been blocked from accessing our site for 24 hours due to abuse.

The specific type of abuse we observed is: re-requesting the same resource too frequently. Specifically, we received at least 500 requests for the same resource (URI) from your IP address within a ten-minute time interval.

If you are using an application that makes HTTP requests to other sites, please configure it to use an outgoing HTTP cache instead of re-requesting the same files over and over again.

... and so on.

So, this led me to a lot of interesting research that I won't go over here. I am simply going to show one way around the issue that I learned, using the Java XMLStreamReader and Woodstox.

To get an XMLStreamReader, one can do this:


InputStream is = ...;
XMLInputFactory factory = XMLInputFactory.newInstance();
XMLStreamReader reader = factory.createXMLStreamReader(is);


This will create a "ValidatingStreamReader", which is going to request the dtd each time it sees one. Hence the complaint from the w3c that its xhtml1-transitional dtd was being requested too often.

There are two ways that I see to solve this, and I found the first after digging in the API for a few minutes. If I change my code to read this:


InputStream is = ...;
XMLInputFactory factory = XMLInputFactory.newInstance();
factory.setProperty(XMLInputFactory.SUPPORT_DTD, false);
XMLStreamReader reader = factory.createXMLStreamReader(is);


Then it won't request any dtds when parsing the xml file.
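
Another option the StAX API supports (one I didn't end up needing) is to install a custom XMLResolver, so that external entity requests are answered locally instead of over the network; returning an empty stream effectively stubs the dtd out:

import java.io.ByteArrayInputStream;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLResolver;
import javax.xml.stream.XMLStreamException;

XMLInputFactory factory = XMLInputFactory.newInstance();
factory.setXMLResolver(new XMLResolver() {
    public Object resolveEntity(String publicID, String systemID,
            String baseURI, String namespace) throws XMLStreamException {
        // Never go out to w3.org: satisfy every external entity request
        // with an empty document (fine if you don't rely on the entities).
        return new ByteArrayInputStream(new byte[0]);
    }
});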

I found another property while digging through the Woodstox code that I can't figure out how to access. It was in InputConfigFlags and is referenced in ReaderConfig, which is an object fashioned in the Woodstox implementation of XMLStreamReader:


/**
* If true, input factory is allowed cache parsed external DTD subsets,
* potentially speeding up things for which DTDs are needed for: entity
* substitution, attribute defaulting, and of course DTD-based validation.
*/
final static int CFG_CACHE_DTDS = 0x00010000;


This seems like the more appropriate solution. Any ideas on how to access it?


Droid Post


At Google I/O this year, they gave away Droids as free gifts to attendees. So, I am now using AndraBlogger, an app that I downloaded from the Droid App Store, to make posts from a mobile device.

So, you would have seen here an extra four paragraphs, but the app crashed. :) Fortunately, it salvaged the first paragraph. I'm tired now, though. I'll recreate the rest later.


Testing Custom FindBugs Detectors in Eclipse


We are starting to use Sonar where I work, and I've been tasked with finding out how to write custom FindBugs rules.


First of all, it was pretty fun getting down into the byte code; it made me feel like a Java super-villain. Daniel Schneller has a very helpful post on creating your own FindBugs rule. With his help, you too can feel like a Java super-villain.


Anyway, the first thing that I noticed is that there wasn't really an easy way to debug through my detector. =[ Or, at least, I wasn't able to find one.


For my first stab, I decided to try creating a method that would take my detector class and a test class file to run the detector against.



public ProjectStats analyze(Class clazz, Detector detector)

Of course, I failed to remember that FindBugs analyzes class files, and so I would need to provide the location of the target directory. You need to do this with the maven plugin, so I'm not too worried about it:



public ProjectStats analyze(String filePath, Detector detector)

Finally, the bug reporting strategy is passed around as a parameter in most places, so I had to include the BugPattern and BugReporter in my signature:



public ProjectStats analyze(String filePath, Detector detector,
BugPattern bugPattern, BugReporter bugReporter)

Inside, it's a total mess of configuration. Hopefully, there is a way to clean it up:



public ProjectStats analyze(String filePath, Detector detector,
        BugPattern bugPattern, BugReporter bugReporter)
        throws CheckedAnalysisException, IOException, InterruptedException {
    // internal to FindBugs, the code uses the Detector2 interface
    Detector2 det = new DetectorToDetector2Adapter(detector);

    // register the rule's message
    I18N.instance().registerBugPattern(bugPattern);

    // a great deal of code to say
    // 'analyze the files in this directory'
    IClassFactory classFactory = ClassFactory.instance();
    IClassPath classPath = classFactory.createClassPath();
    IAnalysisCache analysisCache = classFactory
            .createAnalysisCache(classPath, bugReporter);
    Global.setAnalysisCacheForCurrentThread(analysisCache);
    FindBugs2.registerBuiltInAnalysisEngines(analysisCache);
    IClassPathBuilder builder = classFactory
            .createClassPathBuilder(bugReporter);
    ICodeBaseLocator locator = classFactory
            .createFilesystemCodeBaseLocator(filePath);
    builder.addCodeBase(locator, true);
    builder.build(classPath, new NoOpFindBugsProgress());
    List<ClassDescriptor> classesToAnalyze = builder.getAppClassList();
    AnalysisCacheToAnalysisContextAdapter analysisContext =
            new AnalysisCacheToAnalysisContextAdapter();
    AnalysisContext.setCurrentAnalysisContext(analysisContext);

    // finally, perform the analysis
    for (ClassDescriptor d : classesToAnalyze) {
        det.visitClass(d);
    }

    // return the results
    return bugReporter.getProjectStats();
}

Since I'm not particularly worried about the reporting, I just created a simple PrintStream bug reporter:



private static class PrintStreamBugReporter implements BugReporter {
    private ProjectStats stats = new ProjectStats();
    private PrintStream os;
    private List<BugReporterObserver> observers =
            new ArrayList<BugReporterObserver>();

    public PrintStreamBugReporter(PrintStream os) {
        this.os = os;
    }

    @Override
    public void addObserver(BugReporterObserver arg0) {
        observers.add(arg0);
    }

    @Override
    public void finish() {
        // nothing to flush
    }

    @Override
    public ProjectStats getProjectStats() {
        return stats;
    }

    @Override
    public BugReporter getRealBugReporter() {
        return this;
    }

    @Override
    public void reportBug(BugInstance arg0) {
        stats.addBug(arg0);
        for (BugReporterObserver observer : observers) {
            observer.reportBug(arg0);
        }
        os.println(arg0.getAbridgedMessage());
    }

    @Override
    public void reportQueuedErrors() {
    }

    @Override
    public void setErrorVerbosity(int arg0) {
    }

    @Override
    public void setPriorityThreshold(int arg0) {
    }

    @Override
    public void logError(String arg0) {
        os.println(arg0);
    }

    @Override
    public void logError(String arg0, Throwable arg1) {
        os.println(arg0);
        arg1.printStackTrace(os);
    }

    @Override
    public void reportMissingClass(ClassNotFoundException arg0) {
        arg0.printStackTrace(os);
    }

    @Override
    public void reportMissingClass(ClassDescriptor arg0) {
        os.println("Class not found: " + arg0);
    }

    @Override
    public void reportSkippedAnalysis(MethodDescriptor arg0) {
        os.println("Skipped Method: " + arg0);
    }

    @Override
    public void observeClass(ClassDescriptor arg0) {
    }
}
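
Putting it all together looks something like this (the detector class and bug pattern are stand-ins for your own):

BugReporter reporter = new PrintStreamBugReporter(System.out);
Detector detector = new MyCustomDetector(reporter); // hypothetical detector
BugPattern pattern = myBugPattern;                  // however you define your pattern

// Point it at compiled classes, set a breakpoint in the detector, and debug away.
ProjectStats stats = analyze("target/classes", detector, pattern, reporter);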

There you go. Now, I can debug my detectors in my IDE.


First Post!


I've decided to start a separate blog dedicated to the more technical posts. There you go.
