SonarQube (previously Sonar) is a quality management platform aimed mainly at Java (although other programming languages are supported to a varying degree). Here are a couple of tips to get it working on legacy projects:
Keep your code clean!
Sometimes it is necessary to impose a certain order on the tasks in a threadpool. Issue 206 of the JavaSpecialists newsletter presents one such case: we have multiple connections from which we read using NIO. We need to ensure that events from a given connection are executed in order, but events from different connections can be freely mixed.
I would like to present a similar but slightly different situation: we have N clients. We would like to execute events from a given client in the order they were submitted, but events from different clients can be mixed freely. Also, from time to time, there are “rollup” tasks which involve more than one client. Such tasks should block the tasks for all involved clients (but not more!). Let’s see a diagram of the situation:

As you can see, tasks from client A and client B are happily processed in parallel until a “rollup” task comes along. At that point no more tasks of type A or B can be processed, but an unrelated task C can still be executed (provided that there are enough threads). The skeleton of such an executor is available in my repository. The centerpiece is the following interface:
public interface OrderedTask extends Runnable {
    boolean isCompatible(OrderedTask that);
}
Using this interface the threadpool decides if two tasks may be run in parallel or not (A and B can be run in parallel if A.isCompatible(B) && B.isCompatible(A)). These methods should be implemented in a fast, non-locking and time-invariant manner.
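For illustration, here is a minimal sketch of what an implementation keyed by client might look like (ClientTask and its clientId field are assumptions made up for this example, not part of the linked repository):

public class ClientTask implements OrderedTask {
    private final String clientId;
    private final Runnable work;

    public ClientTask(String clientId, Runnable work) {
        this.clientId = clientId;
        this.work = work;
    }

    @Override
    public boolean isCompatible(OrderedTask that) {
        // fast, non-locking and time-invariant: just compare the client keys -
        // tasks belonging to different clients may run in parallel
        return !(that instanceof ClientTask)
                || !clientId.equals(((ClientTask) that).clientId);
    }

    @Override
    public void run() {
        work.run();
    }
}

A “rollup” task would simply return false from isCompatible for every task belonging to one of the clients it touches, blocking those tasks (and only those) for the duration of the rollup.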
The algorithm behind this threadpool is as follows:

More information about the implementation:
Have fun with the source code! (maybe some day I’ll find the time to remove all the rough edges).
* somewhat of a misnomer, since there are still locks, only at a lower – CPU not OS – level, but this is the accepted terminology
** benchmarking indicated this to be the most performant solution. It was inspired by the implementation of ThreadPoolExecutor.
Meta: this post is part of the Java Advent Calendar and is licensed under the Creative Commons 3.0 Attribution license. If you like it, please spread the word by sharing, tweeting, FB, G+ and so on! Want to write for the blog? We are looking for contributors to fill all 24 slots and would love to have your contribution! Contact Attila Balazs to contribute!
The Java runtime is a complex beast – and it has to be, since it runs officially on seven platforms and unofficially on many more. Given this, it is normal that there are many knobs and dials to control how things function. The better-known ones are:
Other than these, it is (very) rarely the case that you need to change the defaults. However, thanks to Java being open source you can see the list of options, their default values and a short explanation directly from the source code. Currently there are almost 800 options in there!
Another way to see the options (but one which unfortunately doesn’t display the explanations) is the following command:
java -XX:+UnlockDiagnosticVMOptions -XX:+PrintFlagsFinal -version
These options are well worth studying. Not for tweaking them (since there is a wealth of testing behind the defaults, the extent of which would be very hard to replicate), but rather to understand the different functionalities offered by the JVM (for example why you might not see stacktraces in exceptions).
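As an aside, on Oracle/OpenJDK HotSpot you can also query the effective value of a single flag from inside a running program via the HotSpotDiagnosticMXBean – a hedged sketch (the class name ShowVmOption is just for the example, and the bean is JVM-specific):

import java.lang.management.ManagementFactory;

import com.sun.management.HotSpotDiagnosticMXBean;

public class ShowVmOption {
    public static void main(String[] args) {
        // Oracle/OpenJDK-specific bean - may not exist on other JVMs
        HotSpotDiagnosticMXBean hotspot =
                ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
        // prints the value the JVM is actually running with
        System.out.println("MaxHeapSize = "
                + hotspot.getVMOption("MaxHeapSize").getValue());
    }
}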
Meta: this post is part of the Java Advent Calendar and is licensed under the Creative Commons 3.0 Attribution license. If you like it, please spread the word by sharing, tweeting, FB, G+ and so on! Want to write for the blog? We are looking for contributors to fill all 24 slots and would love to have your contribution! Contact Attila Balazs to contribute!
It is common knowledge that Java optimizes the substring operation for the case where you generate a lot of substrings of the same source string. It does this by using the (value, offset, count) way of storing the information. See an example below:

In the above diagram you can see the strings “Hello” and “World!” derived from “Hello World!” and the way they are represented in the heap: there is one character array containing “Hello World!” and two references to it. This method of storage is advantageous in some cases, for example for a compiler which tokenizes source files. In other instances it may lead you to an OutOfMemoryError (if you are routinely reading long strings and only keeping small parts of them – the above mechanism prevents the GC from collecting the original String buffer). Some even call it a bug. I wouldn’t go so far, but it’s certainly a leaky abstraction, because you were forced to do the following to ensure that a copy was made: new String(str.substring(5, 6)).
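To make the trade-off concrete, here is a hedged sketch of the pattern that used to retain far more memory than expected on pre-7u6 JVMs (the class below is purely illustrative):

public class SubstringRetention {
    public static void main(String[] args) {
        // build a large source string just for the illustration
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 1_000_000; i++) {
            sb.append("some longish line of text\n");
        }
        String huge = sb.toString();

        // Before Java 7u6 this shared huge's backing char[], so keeping only
        // 'tiny' alive still pinned the whole buffer in memory.
        String tiny = huge.substring(0, 16);

        // The classic workaround - forces a copy of just those 16 characters.
        // From 7u6 onwards substring() itself copies, so this is no longer needed.
        String safe = new String(huge.substring(0, 16));

        System.out.println(tiny + " / " + safe);
    }
}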

This all changed in May 2012, with Java 7u6. The pendulum has swung back and now full copies are made by default. What does this mean for you?
If you were using new String(str.substring(...)) to force a copy of the character buffer, you can stop doing so as soon as you update to the latest Java 7 (and you need to do that quite soon, since Java 6 is being EOLed as we speak). Thankfully the development of Java is an open process and such information is at the fingertips of everyone!
A couple more references (since we don’t say pointers in Java :-)) related to Strings:
Hope I didn’t string you along too much and you found this useful! Until next time,
– Attila Balazs
Meta: this post is part of the Java Advent Calendar and is licensed under the Creative Commons 3.0 Attribution license. If you like it, please spread the word by sharing, tweeting, FB, G+ and so on! Want to write for the blog? We are looking for contributors to fill all 24 slots and would love to have your contribution! Contact Attila Balazs to contribute!
There are cases where you would like to start a Java process identical to the current one (or at least one using the same JVM with tweaked parameters). Some concrete cases where this would be useful:
Doing this is relatively simple – and can be done in pure Java – after you find the correct API calls:
List<String> arguments = new ArrayList<>();
// the java executable
arguments.add(String.format("%s%sbin%sjava", System.getProperty("java.home"),
        File.separator, File.separator));
// pre-executable arguments (like -D, -agent, etc.)
arguments.addAll(ManagementFactory.getRuntimeMXBean().getInputArguments());
String classPath = System.getProperty("java.class.path"),
        javaExecutable = System.getProperty("sun.java.command");
if (classPath.equals(javaExecutable)) {
    // was started with -jar
    arguments.add("-jar");
    arguments.add(javaExecutable);
} else {
    arguments.add("-classpath");
    arguments.add(classPath);
    arguments.add(javaExecutable);
}
// we might add additional arguments here which will be received by the
// launched program in its args[] parameter
arguments.add("runme");
// launch it!
new ProcessBuilder().command(arguments).start();
Some explanations about the code:
- The Java executable is called java and is located in bin/java relative to java.home. We use File.separator for the code to be portable.
- getInputArguments() returns the arguments passed to the JVM itself (like -Xmx). It does not include the classpath.
- Comparing java.class.path with sun.java.command lets us detect whether the program was started with the -jar myjar.jar syntax or the MyMainClass syntax and replicate it.

This is it! After that we use ProcessBuilder (which we should always favour over Runtime.exec because it auto-escapes the parts of the command line for us).
A final thought: if you intend to use this method to “daemonize” a process (that is: to ensure that it stays running after its parent process has terminated) you should do two things:
- Use javaw instead of java. This ensures that the process won’t be tied to the console it was started from (however, it will still be tied to the user login session and will terminate when the user logs out – for a more heavy-duty solution look into the Java Service Wrapper).

This is it for today – hope you enjoyed it and found it useful. If you run the code and it doesn’t work as advertised, let me know so that I can update it (I’m especially interested in whether it works with non-Sun/Oracle JVMs). Come back tomorrow for another article!
Meta: this post is part of the Java Advent Calendar and is licensed under the Creative Commons 3.0 Attribution license.
Testing multi-threaded code is hard. The main problem is that you invoke your assertions either too soon (and they fail for no good reason) or too late (in which case the test runs for a long time, frustrating you). A possible solution is to declare an interface like the following:
interface ActivityWatcher {
    void before();
    void after();
    void await(long time, TimeUnit timeUnit) throws InterruptedException, TimeoutException;
}
It is intended to be used as follows:
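A minimal sketch of the intended usage, assuming a counter-based implementation (CountingActivityWatcher and doAsynchronousWork are placeholder names for this illustration):

final ActivityWatcher watcher = new CountingActivityWatcher(); // hypothetical counter-based implementation
ExecutorService executor = Executors.newFixedThreadPool(4);

// production code: bracket every asynchronous task with before()/after()
watcher.before();                 // one more pending unit of work
executor.submit(new Runnable() {
    @Override
    public void run() {
        try {
            doAsynchronousWork(); // placeholder for the real task
        } finally {
            watcher.after();      // always called, even if the task throws
        }
    }
});

// test code: block until the counter drops back to zero (or time out), then assert
watcher.await(5, TimeUnit.SECONDS);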
The net result is that when the counter is zero, all your asynchronous tasks have executed and you can run your assertions. See the example code. A couple more considerations:
One thing the above code doesn’t do is collect exceptions: if the exceptions happen on threads other than the one executing the test runner, they will just die and the test runner will happily report that the tests pass. You can work around this in two ways:
- Add a “collect(Throwable)” method which gets called with the uncaught exceptions and have “await” rethrow them.

Implementing this is left as an exercise to the reader :-).
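One possible shape for the second approach (a sketch only, under the assumption of a thread-safe queue of failures; not the article’s solution):

import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

// Sketch: collect failures from worker threads so that await() can surface them.
class CollectedErrors {
    private final Queue<Throwable> errors = new ConcurrentLinkedQueue<Throwable>();

    // called from the worker threads (e.g. from a catch block or an UncaughtExceptionHandler)
    void collect(Throwable t) {
        errors.add(t);
    }

    // called at the end of await(): fail the waiting test if anything went wrong elsewhere
    void rethrowIfAny() {
        Throwable first = errors.peek();
        if (first != null) {
            throw new AssertionError("asynchronous task failed", first);
        }
    }
}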
JSON is a good alternative when you need a lightweight format to specify structured data. But sometimes (for example when you want the user to specify JSON manually) you would like to relax the formalism required to specify "valid" JSON data. For example the following snippet is not valid as per the spec, although its intent is quite clear:
[{ foo: 'bar' }]
To make this standard compliant we would need to write it as:
[{ "foo": "bar" }]
We shouldn’t run out and blame the standard of course, since it needs to balance many contradictory requirements (unambiguity of the encoded data, ease of understanding, ease of writing parsers, etc.). If you decide that you want to strike the balance differently (make the definition of valid data more relaxed), you can do this easily with the Jackson parser:
JsonParser parser = new JsonFactory()
.createJsonParser("[{ foo: 'bar' }]")
.enable(JsonParser.Feature.ALLOW_UNQUOTED_FIELD_NAMES)
.enable(JsonParser.Feature.ALLOW_SINGLE_QUOTES);
JsonNode root = new ObjectMapper().readTree(parser);
assertEquals("bar", root.get(0).get("foo").asText());
If your tool of choice is gson, it is slightly more complicated but still doable. See the linked source code for a complete example.
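For reference, a hedged sketch of how this might look with gson’s lenient mode (gson 2.x API assumed; JsonReader, JsonParser and friends live in com.google.gson):

JsonReader reader = new JsonReader(new StringReader("[{ foo: 'bar' }]"));
reader.setLenient(true); // accept unquoted field names, single-quoted strings, etc.
JsonElement root = new JsonParser().parse(reader);
assertEquals("bar", root.getAsJsonArray().get(0).getAsJsonObject().get("foo").getAsString());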
JSON is a good tool for semi-structured data, and relaxed parsing can make the programs you write easier to use.
javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException:
PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException:
unable to find valid certification path to requested target
One cause of the problem can be that the server uses an SSL provider which is based on a root certificate that wasn’t included with the particular version of Java you are using (this is especially true for really old versions like Java 1.5). The issue can be solved by updating to the latest version, but it might be that this isn’t an option. Fortunately I found the following article: No more ‘unable to find valid certification path to requested target’
How to use it:
- Compile it: javac InstallCert.java
- Run it: java InstallCert imap.mailprovider.com:993 (993 is the port for IMAPS)
- The result is a file named jssecacerts. You need to copy this to $JAVA_HOME/jre/lib/security/cacerts (back up the existing file first!)

HTH
On the surface it looks simple: just add the dependency and you can run the example code.
However what the jython artifact doesn’t get you are the standard python libraries like re. This means that as soon as you try to do something like the code below, it will error out:
PythonInterpreter interp = new PythonInterpreter();
try {
    interp.exec("import re");
} catch (PyException ex) {
    ex.printStackTrace();
}
The solution? Use the jython-standalone artifact, which includes the standard libraries. Another advantage is that it contains the latest release (2.5.2), while jython lags two minor revisions behind (2.5.0) in Maven Central. A possible downside is the larger size of the jar.
<dependency>
<groupId>org.python</groupId>
<artifactId>jython-standalone</artifactId>
<version>2.5.2</version>
</dependency>
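With the standalone artifact on the classpath, importing from the standard library should now work – a small hedged sketch exercising re:

PythonInterpreter interp = new PythonInterpreter();
interp.exec("import re");
// prints "123" - the re module is found inside the standalone jar
interp.exec("print re.match(r'\\d+', '123abc').group(0)");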
The problem: you have some resources in an Ivy repository (and only there) which you would like to use in a project based on Maven. Possible solutions:
My goal for the solution (as complex as it may be) was:
The solution looks like the following (for the full source check out the code-repo):
Have two Maven profiles: ivy-dependencies activates when the dependencies have already been downloaded and ivy-resolve when they have yet to be downloaded. This is based on checking for the directory into which the dependencies are ultimately copied:
...
<id>ivy-dependencies</id>
<activation>
<activeByDefault>false</activeByDefault>
<file>
<exists>${basedir}/ivy-lib</exists>
</file>
</activation>
...
<id>ivy-resolve</id>
<activation>
<activeByDefault>false</activeByDefault>
<file>
<missing>${basedir}/ivy-lib</missing>
</file>
</activation>
...
Unfortunately there is a small repetition here, since Maven doesn’t seem to expand user-defined properties like ${ivy.target.lib.dir} in the profile activation section. The profiles also serve another role: to avoid the consideration of the dependencies until they are actually resolved.
When the build is first run, it creates the target directory, writes the files needed for an Ivy build there (ivy.xml, ivysettings.xml and build.xml – for this example I’ve used some parts from the corresponding files of the Red5 repo), runs the build and tries to clean up after itself. It also creates a dependencies.txt file containing the chunk of text which needs to be added to the dependencies list. Finally, it bails out (fails), instructing the user to run the command again.
On the second (third, fourth, etc.) run the dependencies will already be present, so the resolution process won’t be run repeatedly. This approach was chosen instead of running the resolution at every build because – even though the resolution process is quite quick – it can take tens of seconds in some more complicated cases and I didn’t want to slow the build down.
Ant, Ivy, the Apache BSF framework, etc. are fetched from the Maven central repository, so they need not be preinstalled for the build to complete successfully.
A couple of words about choosing ${ivy.target.lib.dir}: if you choose it inside your Maven tree (as was done in the example), you will receive warnings from Maven that this might not be supported in the future. Also, be sure to add the directory to the ignore mechanism of your VCS (.gitignore, .hgignore, .cvsignore, svn:ignore, etc.), to avoid accidentally committing the libraries to the VCS.
If you need to add a new (Ivy) dependency to the project, the steps are as follows:
- Remove the ${ivy.target.lib.dir} directory
- Update the part of the pom.xml which writes out the ivy.xml file to include the new dependency
- Update the ivy-dependencies profile to include the new dependency (possibly copying from dependencies.txt)

One drawback of this method is the fact that advanced functionalities of systems based on Maven will not work with these dependencies (for example dependency analysis / graphing plugins, automated downloading of sources / javadocs, etc.). A possible workaround (and a good idea in general) is to use this method for the minimal subset – just the jars which can’t be found in Maven central. All the rest (even if they are actually dependencies of the code fetched from Ivy) should be declared as normal dependencies, to be fetched from the Maven repository.
Finally, I would like to say that this endeavour once again showed me how flexible both Maven and Ivy/Ant can be, and it clarified many corner cases (like how we escape ]] inside CDATA – we split it in two). It can also be further tweaked: for example, adding a clean target to the ivy-resolve profile, so you can remove the directory with mvn clean -P ivy-resolve, or re-jar-ing all the downloaded jars into a single one (for example like this), thus avoiding the need to modify the pom file every time the list of Ivy dependencies changes – then again, signed JARs can’t be re-jarred, so it is not a universal solution either.