Monday, December 27, 2010

This is more of a note to myself, but in case someone else wonders how to select the active Maven profile in IntelliJ IDEA for projects containing several profiles, this is the way to go:

  • Open the "Maven Projects" window by selecting "Window" - "Tool Windows" - "Maven Projects"
  • Expand the "Profiles" node and select the profile you want to use

One of the new features introduced with version 3.0 of the Spring framework is the Spring Expression Language, or "SpEL" for short. This language is tailored to the needs of working with Spring and can be used, for instance, when creating XML- or annotation-based Spring bean definitions.

So I thought it would be nice if it were possible to use SpEL together with Hibernate Validator's @ScriptAssert constraint, which allows validation routines to be expressed using script or expression languages.

Unfortunately this does not work, since currently no SpEL language binding for JSR 223 ("Scripting for the Java™ Platform") exists. As @ScriptAssert's validator uses JSR 223 for expression evaluation, SpEL can't be used with @ScriptAssert, at least for now (there is an issue in Spring's JIRA addressing this problem).

But as shown in previous posts it is very simple to create new constraint annotations for the Bean Validation API. So the idea is to build a new constraint @SpelAssert which resembles HV's @ScriptAssert but works with SpEL instead of the JSR 223 API.

Defining the annotation type is straight-forward:

@Target({ TYPE })
@Retention(RUNTIME)
@Constraint(validatedBy = SpelAssertValidator.class)
@Documented
public @interface SpelAssert {

    String message() default "{de.gmorling.moapa.bvspel.SpelAssert.message}";

    Class<?>[] groups() default {};

    Class<? extends Payload>[] payload() default {};

    String value();
}

Besides the standard attributes message(), groups() and payload() mandated by the BV specification, we define one more attribute, value(), which takes the SpEL expression to evaluate.

Now let's come to the validator:

public class SpelAssertValidator implements ConstraintValidator<SpelAssert, Object> {

    @Inject
    private ExpressionParser parser;

    private Expression expression;

    @Override
    public void initialize(SpelAssert constraintAnnotation) {

        String rawExpression = constraintAnnotation.value();

        if (rawExpression == null) {
            throw new IllegalArgumentException("The expression specified in @"
                + SpelAssert.class.getSimpleName() + " must not be null.");
        }

        expression = parser.parseExpression(rawExpression);
    }

    @Override
    public boolean isValid(Object value, ConstraintValidatorContext context) {

        if (value == null) {
            return true;
        }

        return Boolean.TRUE.equals(expression.getValue(value, Boolean.class));
    }
}

In initialize() we use an ExpressionParser to parse the specified SpEL expression (so parsing happens only once); in isValid() we then evaluate the given object against it.

But wait a minute: where does the ExpressionParser come from? It is never instantiated here.

Right, and that's the cool thing: Spring comes with its own ConstraintValidatorFactory which performs dependency injection on constraint validators. A validator relying on that feature is of course not portable, but as this validator is based on SpEL and Spring anyway, this is not an issue here.

In order for this to work, a parser bean must be part of the Spring application context. We just register a SpelExpressionParser:

<beans
    xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.springframework.org/schema/beans
        http://www.springframework.org/schema/beans/spring-beans-3.0.xsd">

    <bean id="parser"
        class="org.springframework.expression.spel.standard.SpelExpressionParser"/>

    <bean id="validator"
        class="org.springframework.validation.beanvalidation.LocalValidatorFactoryBean"/>
</beans>

This context also shows how to make a BV Validator available for dependency injection by leveraging Spring's LocalValidatorFactoryBean.

Now let's have a look at the @SpelAssert constraint in action. The following shows the canonical example of a class CalendarEvent where the start date shall always be earlier than the end date:

@SpelAssert("startDate < endDate")
public class CalendarEvent {

    private Date startDate;

    private Date endDate;

    // getters, setters etc.
}

Note how SpEL allows dates to be compared using the "<" operator and that no alias for the evaluated bean is required, as all unqualified attribute/method names are resolved against the annotated object.

Finally we should have a test showing that the validator works as expected by validating a valid and an invalid CalendarEvent instance:

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class SpelAssertTest {

    @Inject
    private Validator validator;

    private Date startDate;
    private Date endDate;

    @Before
    public void setUpDates() {

        Calendar start = Calendar.getInstance();
        start.set(2010, Calendar.DECEMBER, 24); // Calendar months are zero-based
        startDate = start.getTime();

        Calendar end = Calendar.getInstance();
        end.set(2010, Calendar.DECEMBER, 26);
        endDate = end.getTime();
    }

    @Test
    public void validEvent() {

        CalendarEvent event = new CalendarEvent();
        event.setStartDate(startDate);
        event.setEndDate(endDate);

        assertTrue(validator.validate(event).isEmpty());
    }

    @Test
    public void invalidEventYieldsConstraintViolation() {

        CalendarEvent event = new CalendarEvent();
        event.setStartDate(endDate);
        event.setEndDate(startDate);

        Set<ConstraintViolation<CalendarEvent>> violations =
            validator.validate(event);

        assertEquals(1, violations.size());

        ConstraintViolation<CalendarEvent> violation =
            violations.iterator().next();
        assertEquals(
            "SpEL expression \"startDate < endDate\" didn't evaluate to true.",
            violation.getMessage());
    }
}

The complete sources for this post can be found at GitHub, so don't hesitate to give it a try or use it in your projects if you like. Any feedback or ideas for improvement are warmly welcome.

Wednesday, November 17, 2010

At my day job we are using the SVN property "svn:keywords" to let SVN replace the string "$Id$" with author name, revision number etc. within each Java source file.

One can add this property automatically when creating new files with the help of SVN's auto props feature. But from time to time someone, e.g. a new developer not knowing about auto props, checks in files without having the "svn:keywords" property set.

So I wondered how to identify such files in the repository. SVN doesn't provide a command answering that question; you can only retrieve all files having a certain property set.

But no problem, some shell magic to the rescue:

comm -23 <(sort <(sed 's/\.\///g' <(find . -name "*.java"))) <(sort <(sed 's/ - Id//g' <(svn propget svn:keywords * -R)))

So what's happening here? The basic idea is to list those files with the svn:keywords property set (svn propget) and compare this to a list with all files (find).

The outputs of both commands are brought into the same format using sed, sorted and then passed as parameters to the comm command, which compares two input files to each other. The -23 switch causes only those lines to be printed which are contained in file1 but not in file2: exactly the names of the files lacking the "svn:keywords" property.
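A minimal, self-contained illustration of the comm -23 step (the file names and contents below are made up for demonstration purposes):

```shell
# all.txt: every Java file found; with_prop.txt: files which already
# have svn:keywords set. comm requires both lists to be sorted.
printf 'A.java\nB.java\nC.java\n' > all.txt
printf 'B.java\n' > with_prop.txt

# -2 suppresses lines unique to with_prop.txt, -3 suppresses lines
# common to both files, so only the files lacking the property remain.
comm -23 all.txt with_prop.txt
```

This prints A.java and C.java, the two files without the property.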

I tested the command successfully on Mac OS X, but I think it should work pretty much the same way on other Unix systems, too.

Sunday, August 15, 2010

While developing an Eclipse plug-in for a small spare-time project of mine I wondered where to find the sources of the Java Development Tools (JDT). They used to be part of the Eclipse RCP/plug-in developers package in previous releases, but this isn't the case anymore with Eclipse 3.6 (Helios).

After some googling I found the reason: as of release 3.6 the RCP distribution (now named "Eclipse for RCP and RAP Developers") no longer contains any sources (except those for the actual platform plug-ins) in order to reduce the size of the download package.

If you want to get the sources for plug-ins such as JDT which are not part of the core platform, you have to retrieve them separately. To do so you can use the new plug-in import wizard, which allows fetching the source projects corresponding to plug-ins of the target platform directly from the Eclipse CVS.

Just open the "Plug-ins" view, right-click on the plug-in you want to retrieve (e.g. org.eclipse.jdt.ui) and select "Import as" > "Project from a Repository ...". Confirm the next dialog by clicking "Finish", and the check-out starts. Afterwards the new project will automatically replace the plug-in JAR as dependency in any dependent projects within your workspace.

Saturday, July 24, 2010

In the last couple of days I spent some time experimenting a little bit with CDI, the standard of the Java EE 6 platform for dependency injection services.

I must say that I really like that spec, as it hits the sweet spot between specifying features that provide a value out of the box (type-safe DI, eventing, interceptor services etc.) and being open enough to allow people to build totally new stuff based on it (using portable extensions).

I've got the feeling we're going to see a lot of exciting things based on CDI in the near future. Actually this reminds me a bit of Java annotations: introduced with Java 5, they were over time used for more and more use cases that hadn't been foreseen in the first place.

Anyway, to do something practical with CDI I built a small portable extension which allows JSR 223 scripting engines to be retrieved using dependency injection.

Just annotate any injection points of type javax.script.ScriptEngine with one of the qualifier annotations @Language, @Extension or @MimeType. The following code extract shows an example of a JavaScript engine (for example the Rhino engine shipping with Java 6) being injected into some managed bean, where it can be used for arbitrary script evaluations:

@RequestScoped
public class MyBean {

    @Inject 
    @Language("javascript") 
    private ScriptEngine jsEngine;

    // ...

    public void foo() throws ScriptException {

        assert 42.0d == (Double)jsEngine.eval("2 * 21");
        // ...
    }
}

The extension can be found in my Maven repository. Just add the following dependency to your POM in order to use it:

<dependency>
    <groupId>de.gmorling.cdi.extensions</groupId>
    <artifactId>scripting-extension</artifactId>
    <version>0.1</version>
</dependency>

In case you want to take a look at the source code (which is just a few lines), you can check it out from GitHub. As always, any ideas for improvement or other feedback are highly appreciated.

Monday, June 28, 2010

I'm very proud to report that today Hibernate Validator 4.1.0 Final has been released. Besides many bug fixes, this release also adds a lot of new features to the code base.

The changes fall into four areas, which I'm going to discuss in detail in the following:

  • New constraint annotations
  • ResourceBundleLocator API
  • API for programmatic constraint creation
  • Constraint annotation processor

New constraint annotations

In addition to the constraints defined in the Bean Validation spec and those custom constraints already part of HV 4.0, the new release ships with the following new constraint annotations:

  • @CreditCardNumber: Validates that a given String represents a valid credit card number using the Luhn algorithm. Useful to detect mis-entered numbers for instance.
  • @NotBlank: Validates that a given String is neither null nor empty nor consists solely of whitespace.
  • @URL: Validates that a given String is a valid URL. Can be restricted to certain protocols if required: @URL(protocol = "http") private String url;
  • @ScriptAssert: Allows scripting or expression languages to be used for the definition of class-level validation routines.

Let's have a closer look at the @ScriptAssert constraint, which I'm particularly excited about as I have implemented it :-).

The intention behind it is to provide a simplified way for expressing validation logic that is based on multiple attributes of a given type. Instead of having to implement dedicated class-level constraints, the @ScriptAssert constraint allows such validation routines to be expressed in an ad hoc manner using a wide range of scripting and expression languages.

In order to use this constraint, an implementation of the Java Scripting API as defined by JSR 223 ("Scripting for the Java™ Platform") must be part of the class path. This is automatically the case when running on Java 6. For older Java versions, the JSR 223 RI can be added to the class path manually.

As example let's consider a class representing calendar events. The start date of such an event shall always be earlier than the end date. Using JavaScript (for which an engine comes with Java 6) this requirement could be expressed as follows:

@ScriptAssert(lang = "javascript", script = "_this.startDate.before(_this.endDate)")
public class CalendarEvent {

    private Date startDate;

    private Date endDate;

    //...
}

So all you have to do is specify a scripting expression returning true or false within the script attribute. The expression must be implemented in the language given in the lang attribute, using the language's name as registered with the JSR 223 ScriptEngineManager. Within the expression the annotated element can be accessed using the alias _this by default.
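If you're unsure which language names are available for the lang attribute on your JVM, you can list the registered JSR 223 engines; which engines show up depends on your Java version and class path (newer JDKs, for instance, no longer bundle a JavaScript engine):

```java
import java.util.List;

import javax.script.ScriptEngineFactory;
import javax.script.ScriptEngineManager;

public class EngineLister {

    public static void main(String[] args) {
        ScriptEngineManager manager = new ScriptEngineManager();

        // Each factory advertises one or more names which can be used
        // as the value of @ScriptAssert's lang attribute
        List<ScriptEngineFactory> factories = manager.getEngineFactories();
        for (ScriptEngineFactory factory : factories) {
            System.out.println(factory.getEngineName() + ": " + factory.getNames());
        }
    }
}
```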

The cool thing is that the @ScriptAssert constraint can be used with any other scripting language for which a JSR 223 binding exists. Let's take JEXL from the Apache Commons project for instance. Using Maven you only have to add the following dependency:

<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-jexl</artifactId>
    <version>2.0.1</version>
</dependency>

With JEXL dates can be compared using the "<" operator. Using a shorter alias for the evaluated object the constraint from above therefore can be rewritten as follows:

@ScriptAssert(lang = "jexl", script = "_.startDate < _.endDate", alias = "_")
public class CalendarEvent {

    private Date startDate;

    private Date endDate;

    //...
}

Very likely one will work with only a single scripting language throughout all @ScriptAssert constraints of an application. So let's leverage the power of constraint composition to create a custom constraint which allows for an even more compact notation by setting the lang and alias attributes to fixed values:

@Target({ TYPE })
@Retention(RUNTIME)
@Constraint(validatedBy = {})
@Documented
@ScriptAssert(lang = "jexl", script = "", alias = "_")
public @interface JexlAssert {

    String message() default "{org.hibernate.validator.constraints.ScriptAssert.message}";

    Class<?>[] groups() default {};

    Class<? extends Payload>[] payload() default {};

    @OverridesAttribute(constraint = ScriptAssert.class, name = "script")
    String value();
}

Note how the script attribute of the composing @ScriptAssert constraint is overridden using the @OverridesAttribute meta-annotation. Using this custom constraint the example finally reads as follows:

@JexlAssert("_.startDate < _.endDate")
public class CalendarEvent {

    private Date startDate;

    private Date endDate;

    //...
}

As shown, the @ScriptAssert constraint allows class-level constraints to be defined in a very compact way.

But there is also a price to pay. As scripting languages are used, compile-time type-safety is lost: if, for instance, the startDate attribute is renamed, the script expression has to be adapted by hand. Evaluation performance should also be considered as validation logic gets more complex.

So I recommend trying it out and choosing whatever fits your needs best.

The ResourceBundleLocator API

The Bean Validation API defines the MessageInterpolator interface, which allows custom strategies for message interpolation and resource bundle loading to be plugged in.

As it turned out, most users only want to customize the latter aspect (e.g. in order to load message bundles from a database) but would like to re-use the interpolation algorithm provided by Hibernate Validator.

Therefore Hibernate Validator 4.1 introduces the interface ResourceBundleLocator which is used by HV's default MessageInterpolator implementation ResourceBundleMessageInterpolator to do the actual resource bundle loading.

The interface defines only one method, in which implementations have to return the bundle for a given locale:

public interface ResourceBundleLocator {

    ResourceBundle getResourceBundle(Locale locale);

}

The ResourceBundleLocator to be used can be set when creating a ValidatorFactory:

ValidatorFactory validatorFactory = Validation
    .byProvider(HibernateValidator.class)
    .configure()
    .messageInterpolator(
        new ResourceBundleMessageInterpolator(
            new MyCustomResourceBundleLocator()))
    .buildValidatorFactory();

The default ResourceBundleLocator implementation used by Hibernate Validator is PlatformResourceBundleLocator, which simply loads bundles using ResourceBundle.getBundle(). Another implementation provided out of the box is AggregateResourceBundleLocator, which allows message texts to be retrieved from multiple bundles by merging them into a single aggregate bundle. Let's look at an example:

HibernateValidatorConfiguration configuration = Validation
    .byProvider(HibernateValidator.class)
    .configure();

ValidatorFactory validatorFactory = configuration
    .messageInterpolator(
        new ResourceBundleMessageInterpolator(
            new AggregateResourceBundleLocator(
                Arrays.asList("foo", "bar"), 
                configuration.getDefaultResourceBundleLocator())))
    .buildValidatorFactory();

Here messages from the bundles "foo" and "bar" can be used in constraints. In case the same key is contained in both bundles, the value from bundle "foo" takes precedence, as it comes first in the list. If a given key can't be found in either of the two bundles, the default locator (which provides access to the ValidationMessages bundle as demanded by the JSR 303 spec) is tried as fallback.
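The precedence rule can be illustrated with a few lines of plain Java. This is just a sketch of the lookup semantics, modeling bundles as maps; it is not AggregateResourceBundleLocator's actual implementation:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class AggregateLookupSketch {

    // Merges several bundles (modeled as maps here) into one aggregate map;
    // putIfAbsent ensures that the first bundle in the list wins for
    // duplicate keys, while later bundles only contribute missing keys.
    public static Map<String, String> merge(List<Map<String, String>> bundles) {
        Map<String, String> aggregate = new HashMap<>();
        for (Map<String, String> bundle : bundles) {
            for (Map.Entry<String, String> entry : bundle.entrySet()) {
                aggregate.putIfAbsent(entry.getKey(), entry.getValue());
            }
        }
        return aggregate;
    }

    public static void main(String[] args) {
        Map<String, String> foo = Map.of("duplicate.key", "from foo");
        Map<String, String> bar = Map.of("duplicate.key", "from bar", "other.key", "from bar");

        Map<String, String> merged = merge(List.of(foo, bar));
        System.out.println(merged.get("duplicate.key")); // from foo
        System.out.println(merged.get("other.key"));     // from bar
    }
}
```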

API for programmatic constraint creation

Using the Bean Validation API, constraints can be declared using annotations and/or XML descriptor files. Hibernate Validator 4.1 introduces a third approach by providing an API for programmatic constraint declaration.

This API can come in handy, for instance, in dynamic scenarios where constraints change at runtime, or in testing scenarios.

As example let's consider two classes, Customer and Order, from a web shop application for which the following constraints shall apply:

  • each customer must have a name
  • each order must have a customer
  • each order must comprise at least one order line

With help of the programmatic constraint API these constraints can be declared as follows:

ConstraintMapping mapping = new ConstraintMapping();

mapping
    .type(Order.class)
        .property("customer", ElementType.FIELD)
            .constraint(NotNullDef.class)
        .property("orderLines", ElementType.FIELD)
            .constraint(SizeDef.class)
                .min(1)
                .message("An order must contain at least one order line")
    .type(Customer.class)
        .property("name", ElementType.FIELD)
            .constraint(NotNullDef.class);  

As the listing shows, the API is designed in a fluent style with the class ConstraintMapping as entry point. Constraints are declared by chaining method calls which specify which constraints should be added to which property of which type.

The API provides constraint definition classes such as NotNullDef etc. for all built-in constraints, which allow their attributes (min(), message() etc.) to be accessed in a type-safe way. For custom constraints you can either provide your own constraint definition class or make use of GenericConstraintDef, which allows attributes to be identified by name.

Having created a constraint mapping, it has to be registered with the validator configuration. As the programmatic API is not part of the Bean Validation spec, we must explicitly specify Hibernate Validator as the BV provider to be used:

Validator validator = Validation
    .byProvider(HibernateValidator.class)
    .configure()
    .addMapping(mapping)
    .buildValidatorFactory()
    .getValidator();

If you now use this validator to validate an Order object which has no customer set, a constraint violation will occur, just as if the "customer" property were annotated with @NotNull.

Constraint annotation processor

The Hibernate Validator annotation processor might become your new favourite tool if you find yourself accidentally doing things like

  • annotating Strings with @Min to specify a minimum length (instead of using @Size)
  • annotating the setter of a JavaBean property (instead of the getter method)
  • annotating static fields/methods with constraint annotations (which is not supported)

Normally you would notice such mistakes only at run-time. The annotation processor saves your valuable time by detecting these and similar errors already at compile-time: it plugs into the build process and raises compilation errors whenever constraint annotations are used incorrectly.

The processor can be used in basically every build environment (plain javac, Apache Ant, Apache Maven) as well as within all common IDEs. Just be sure to use JDK 6 or later, as the processor is based on the "Pluggable Annotation Processing API" defined by JSR 269 which was introduced with Java 6.

The HV reference guide describes in detail how to integrate the processor into the different environments, so I'll spare you the details here. As example the following screenshot shows some compilation errors raised by the processor within the Eclipse IDE (click to enlarge):

Hibernate Validator Annotation Processor in Eclipse IDE

Just for the record it should be noted that the annotation processor already works pretty well but is still under development and is considered an experimental feature as of HV 4.1. If you are facing any problems please report them in JIRA. Some known issues are also discussed in the reference guide.

Summary

The focus for Hibernate Validator 4.0 was to provide a feature-complete, production-ready reference implementation of the Bean Validation spec.

While strictly staying spec-compliant, HV 4.1 goes beyond what is defined in JSR 303 and aims at generating even more user value by providing new constraints, new API functionality as well as an annotation processor for compile-time annotation checking.

In order to try out the new features yourself just download the release from SourceForge. Of course HV 4.1 can also be retrieved from the JBoss Maven repository.

If you are already using HV 4.0.x, the new release generally should work as a drop-in replacement. The only exception is that we had to move the class ResourceBundleMessageInterpolator to the new package messageinterpolation. So if you're referencing this class directly, you'll have to update your imports.

The rationale behind this relocation was to clearly separate those packages which can safely be accessed by HV users from packages intended for internal use only. The public packages are:

  • org.hibernate.validator
  • org.hibernate.validator.cfg
  • org.hibernate.validator.constraints
  • org.hibernate.validator.messageinterpolation
  • org.hibernate.validator.resourceloading

Of course we'd be very happy about any feedback. Questions and comments can be posted in the HV forum, while any issues or feature requests should be reported in JIRA.

Saturday, June 12, 2010

With the help of Oracle's extractValue() function one can retrieve values from XML documents stored in a database using XPath expressions.

Generally this function works as expected but it gave me a hard time when XML namespaces came into play. As I didn't find much related information on the web I thought a short example might be helpful for others facing the same problem.

Let's take the web service from the video rental shop from my recent post Integrating JAX-WS with XmlBeans as example.

For auditing purposes all requests and responses of the web service might be logged in a database table REQUESTS created as follows:

CREATE TABLE REQUESTS (
    ID NUMBER(10,0) PRIMARY KEY,
    REQUEST XMLTYPE,
    RESPONSE XMLTYPE
);

In reality, logging would probably be implemented using a message handler, but for demonstration purposes let's simply insert a sample request of the FindMoviesByDirector() operation using SQL:

INSERT INTO REQUESTS (ID, REQUEST, RESPONSE)
VALUES (
    1,
    '<soapenv:Envelope 
        xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" 
        xmlns:typ="http://www.gunnarmorling.de/moapa/videorental/types">

        <soapenv:Header/>
        <soapenv:Body>
            <typ:FindMoviesByDirectorRequest>
                <Director>Bryan Singer</Director>
            </typ:FindMoviesByDirectorRequest>
        </soapenv:Body>
     </soapenv:Envelope>',
    '...');

Here we have two namespace aliases declared, soapenv for the SOAP message and typ for the actual message content.

The key for accessing values from this document using extractValue() is the pretty poorly documented optional parameter namespace_string, which can be used to declare any namespaces. This has to happen in the form xmlns:alias="URI", multiple namespaces must be separated by a space character.

Knowing that, it's easy to issue a SQL query which retrieves the director name from the request above. Just make sure to qualify the element names with their namespace alias in the XPath expression:

SELECT
    req.id,
    extractValue(
        req.request,
        '/soapenv:Envelope/soapenv:Body/typ:FindMoviesByDirectorRequest/Director',
        'xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:typ="http://www.gunnarmorling.de/moapa/videorental/types"') Director
FROM
    requests req
WHERE
    id = 1
;

ID   DIRECTOR
-----------------
1    Bryan Singer
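For comparison, the same namespace-qualified lookup can be done in plain Java using the JDK's XPath API, where a NamespaceContext plays the role of extractValue()'s namespace_string parameter. The class below is a sketch (names made up) which hard-codes a condensed version of the sample request:

```java
import java.io.StringReader;
import java.util.Iterator;

import javax.xml.XMLConstants;
import javax.xml.namespace.NamespaceContext;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathExpressionException;
import javax.xml.xpath.XPathFactory;

import org.xml.sax.InputSource;

public class NamespaceXPathExample {

    public static final String SAMPLE_REQUEST =
        "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\""
        + " xmlns:typ=\"http://www.gunnarmorling.de/moapa/videorental/types\">"
        + "<soapenv:Header/><soapenv:Body>"
        + "<typ:FindMoviesByDirectorRequest><Director>Bryan Singer</Director>"
        + "</typ:FindMoviesByDirectorRequest></soapenv:Body></soapenv:Envelope>";

    public static String extractDirector(String xml) {
        XPath xpath = XPathFactory.newInstance().newXPath();

        // Bind the prefixes used in the expression to their namespace URIs,
        // just like the namespace_string parameter of extractValue()
        xpath.setNamespaceContext(new NamespaceContext() {
            public String getNamespaceURI(String prefix) {
                if ("soapenv".equals(prefix)) {
                    return "http://schemas.xmlsoap.org/soap/envelope/";
                }
                if ("typ".equals(prefix)) {
                    return "http://www.gunnarmorling.de/moapa/videorental/types";
                }
                return XMLConstants.NULL_NS_URI;
            }
            public String getPrefix(String namespaceURI) { return null; }
            public Iterator<String> getPrefixes(String namespaceURI) { return null; }
        });

        try {
            // The unprefixed Director element is in no namespace,
            // so it needs no qualification in the expression
            return xpath.evaluate(
                "/soapenv:Envelope/soapenv:Body/typ:FindMoviesByDirectorRequest/Director",
                new InputSource(new StringReader(xml)));
        }
        catch (XPathExpressionException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(extractDirector(SAMPLE_REQUEST)); // Bryan Singer
    }
}
```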

Friday, May 14, 2010

Amongst others the Bean Validation API defines two constraint annotations related to time: @Past and @Future. With the help of these constraints one can validate that a given element is a date either in the past or in the future.

As per the BV specification these constraints are allowed for the types java.util.Date and java.util.Calendar. But what if you are working with an alternative date/time library such as the Joda Time API? Does that mean you can't use the @Past/@Future constraints?

Luckily not, as the Bean Validation API defines a mechanism for adding new validators to existing constraints. Basically all you have to do is implement a validator for each type to be supported and register it within a constraint mapping file.

Note that for the remainder of this post I'll focus on the @Past constraint. Doing the same for @Future is left as an exercise for the reader.

Providing a Validator

So let's start with implementing a validator. The Joda Time API provides a whole bunch of types replacing the JDK date and time types. A good introduction to these types can be found in Joda's quickstart guide.

All Joda types representing exact points on the time-line implement the interface ReadableInstant. Providing an @Past validator for that interface will allow the @Past constraint to be used for widely used ReadableInstant implementations such as DateTime or DateMidnight.

Implementing the validator is straight-forward. Obeying the contract defined by @Past the given date is simply compared to a new DateTime instance which represents the current instant in the default time zone:

public class PastValidatorForReadableInstant implements
        ConstraintValidator<Past, ReadableInstant> {

    public void initialize(Past constraintAnnotation) {}

    public boolean isValid(ReadableInstant value,
            ConstraintValidatorContext constraintValidatorContext) {

        if(value == null) {
            return true;
        }

        return value.isBefore(new DateTime());
    }
}

Similar validators could also be written for other Joda types which don't implement ReadableInstant (such as LocalDate) but as this is basically the same, it is out of the scope of this post.

Registering the Validator

Having implemented the validator we need to register it within a constraint mapping file:

<constraint-mappings
    xmlns="http://jboss.org/xml/ns/javax/validation/mapping"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation=
        "http://jboss.org/xml/ns/javax/validation/mapping validation-mapping-1.0.xsd">

    <constraint-definition annotation="javax.validation.constraints.Past">
        <validated-by include-existing-validators="true">
             <value>de.gmorling.moapa.joda_bv_integration.PastValidatorForReadableInstant</value>
        </validated-by>
    </constraint-definition>

</constraint-mappings>

Using the validated-by element we add our new validator to the list of validators for the @Past constraint. By setting include-existing-validators to true, we ensure that the @Past constraint can still be used with the JDK date types.

As demanded by the Bean Validation API we then register the constraint mapping file within the central configuration file validation.xml:

<?xml version="1.0" encoding="UTF-8"?>
<validation-config
    xmlns="http://jboss.org/xml/ns/javax/validation/configuration"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://jboss.org/xml/ns/javax/validation/configuration validation-configuration-1.0.xsd">

    <constraint-mapping>META-INF/validation/custom-constraints.xml</constraint-mapping>

</validation-config>

Trying it out

Now it's time to test how this all works out. To do so, we define an exemplary domain class Customer which has an attribute birthday of the Joda type DateMidnight:

public class Customer {

    private final String name;

    private final DateMidnight birthday;

    public Customer(String name, DateMidnight birthday) {

        this.name = name;
        this.birthday = birthday;
    }

    @NotNull
    public String getName() {
        return name;
    }

    @NotNull
    @Past
    public DateMidnight getBirthday() {
        return birthday;
    }
}

A simple test finally shows that creating a customer with a future birthday causes a constraint violation, while a customer with a birthday in the past doesn't:

public class CustomerTest {

    private static Validator validator;

    @BeforeClass
    public static void setUpValidatorAndDates() {

        ValidatorFactory validatorFactory = Validation.buildDefaultValidatorFactory();
        validator = validatorFactory.getValidator();
    }

    @Test
    public void customerWithFutureBirthdayCausesConstraintViolation() {

        Customer customer = new Customer("Bob", new DateMidnight(2020, 11, 3));
        Set<ConstraintViolation<Customer>> constraintViolations = validator.validate(customer);

        assertEquals(1, constraintViolations.size());
        assertEquals("must be in the past", constraintViolations.iterator().next().getMessage());
    }

    @Test
    public void customerWithPastBirthdayCausesNoConstraintViolation() {

        Customer customer = new Customer("Bob", new DateMidnight(1960, 11, 3));
        Set<ConstraintViolation<Customer>> constraintViolations = validator.validate(customer);

        assertTrue(constraintViolations.isEmpty());
    }

}

The complete source code in form of a Maven project can be found in my Git repository.

So just try it out and don't hesitate to post any feedback. It is also planned to add support for the Joda types in an upcoming version of Hibernate Validator.

Friday, April 9, 2010


Recently I gave a presentation on JSR 303 ("Bean Validation") at the Java User Group in Hamburg, Germany.

I also did some live coding during the talk and mentioned some things only verbally. Nevertheless I thought the slides (which are in German) might be of interest, so I decided to put them online.

You can also find the presentation at slideshare.

The talk went pretty well, there were many questions and people generally seemed quite interested :-) One question related to the automatic validation of constraints at JPA entities I couldn't answer immediately, though.

The question was whether lazy properties are loaded from the database during the validation of class-level constraints (for constraint validation at fields/properties, lazy attributes must not be loaded, which is ensured by checking each attribute's load state using a TraversableResolver).

Actually this can happen, depending on the attributes accessed by the validator implementation for the class-level constraint. I think this is generally alright, as the validator needs to know the values of all attributes relevant for the evaluation of the constraint. If this behavior is not acceptable in certain scenarios, one could examine the attributes' load states manually using javax.persistence.PersistenceUtil and, depending on the result, access only certain attributes.
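To sketch that last idea: the guard pattern inside such a class-level validator could look as follows. This is a minimal, self-contained sketch; the isLoaded helper and the "orders" attribute are hypothetical stand-ins, where a real implementation would call Persistence.getPersistenceUtil().isLoaded(entity, attributeName) from the JPA 2.0 API instead.

```java
import java.util.Set;

public class LoadStateAwareCheck {

    // Hypothetical stand-in for javax.persistence.PersistenceUtil#isLoaded(Object, String);
    // here the load state is modeled as a plain set of attribute names.
    static boolean isLoaded(Set<String> loadedAttributes, String attribute) {
        return loadedAttributes.contains(attribute);
    }

    // The guard pattern for a class-level constraint: only inspect attributes
    // that are already loaded, so validation never triggers a database round-trip.
    static boolean isValid(Customer customer, Set<String> loadedAttributes) {
        if (!isLoaded(loadedAttributes, "orders")) {
            return true; // skip the check instead of loading the lazy attribute
        }
        return customer.orders != null && !customer.orders.isEmpty();
    }

    static class Customer {
        Set<String> orders;
        Customer(Set<String> orders) { this.orders = orders; }
    }
}
```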

Thursday, March 25, 2010


When I started blogging on JSR 303 ("Bean Validation"), there was not too much information available on the web concerning the BV API, its usage and its integration with other technologies.

In the meantime JSR 303 has been approved, BV is part of Java EE 6, and as it gains wider adoption, more and more blog posts and other information related to Bean Validation become available.

That's why I thought it might be a good idea to collect the most interesting pieces and publish those links here every once in a while.

And there you go, here is the first couple of links related to JSR 303:

I plan to post follow-ups whenever I have gathered some interesting links, so stay tuned.

Tuesday, March 16, 2010


Recently I received an IntelliJ IDEA code style settings file for Hibernate Validator, the reference implementation of JSR 303 ("Bean Validation"), to which I'm contributing.

I wanted to import it into IntelliJ in order to format any code changes I make in the project's standard style. But as it turned out, there is no functionality within the IDE for importing code style settings. Instead one has to do the following:

  • Copy the settings XML file to INTELLIJ_SETTINGS_DIR/config/codestyles (where INTELLIJ_SETTINGS_DIR is the folder containing your IntelliJ settings; typically it is situated within your home directory, the name depends on your version of IntelliJ; in my case the complete path is "~/.IdeaIC90/config/codestyles")
  • Start IntelliJ
  • Go to "File" - "Settings" - "Code Style", select "Use global settings" and choose the previously imported style from the drop-down box
  • Optionally apply the style to one project only by clicking on "Copy to Project" and selecting "Use per project settings" afterwards

Tuesday, March 9, 2010


When working with XML-based web services, it is usually a good idea to validate all requests and responses against their associated XML schema in order to ensure the integrity of the incoming/outgoing messages.

Unfortunately JAX-WS, the Java platform's standard API for SOAP-based web services, doesn't specify a standard way to perform schema validation. Therefore most JAX-WS implementations such as Metro or CXF provide a proprietary mechanism for that task.

Using Metro, this is simply done by annotating the endpoint class with @SchemaValidation:

@WebService(endpointInterface = "...")
@SchemaValidation
public class VideoRentalPort implements VideoRentalPortType {
    //...
}

This works as expected, but has one big flaw in my eyes: to enable or disable validation for a given endpoint, the application code must be modified, followed by a re-deployment of the application.

This is not viable in many scenarios. Taking a large enterprise application with a whole lot of services for example, schema validation might be turned off by default for performance reasons. But for the purpose of bug analysis it might be required to temporarily enable validation for single endpoints. Re-deploying the application is not an option in this scenario.

That's why I had a look into Metro's implementation of the validation feature and tried to find a way to make schema validation configurable during runtime.

Custom Metro tubes

Schema validation in Metro is realized in the form of a so-called "tube". Metro tubes are conceptually similar to JAX-WS message handlers, but considerably more powerful. Since Metro 2.0 the tubes to be applied for a given application can be configured by providing custom "tubelines".

So the basic idea is to extend the standard validation tube provided by Metro in a way which makes it runtime-configurable and register this customized tube instead of the original one.

Note: Before going into details, it should be said that as of Metro 2.0 the tube-related APIs are regarded as Metro-internal APIs, so they could be changed in future versions, causing the approach described here not to work anymore. But as we don't write very much code, this shouldn't really scare us.

Managing configuration state

First of all, some sort of data structure is needed which stores whether validation is enabled for a given web service endpoint. For this purpose a simple map is sufficient (as it will be accessed by multiple threads at the same time, a ConcurrentMap is used):

public class ValidationConfigurationHolder {

    private static ConcurrentMap<String, Boolean> configuration = 
        new ConcurrentHashMap<String, Boolean>();

    public static boolean isValidationEnabled(String portName) {

        Boolean theValue = configuration.get(portName);
        return theValue != null ? theValue : Boolean.TRUE;
    }

    public static void setValidationEnabled(String portName, boolean enabled) {
        configuration.put(portName, enabled);
    }
}
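One detail worth noting is the fallback to Boolean.TRUE in isValidationEnabled(): for ports that were never configured explicitly, validation is considered enabled by default. A condensed, self-contained restatement of the holder illustrates the effect:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// Condensed restatement of ValidationConfigurationHolder from above
public class HolderDemo {

    private static final ConcurrentMap<String, Boolean> configuration =
        new ConcurrentHashMap<String, Boolean>();

    public static boolean isValidationEnabled(String portName) {
        Boolean theValue = configuration.get(portName);
        // ports without an explicit entry default to "validation enabled"
        return theValue != null ? theValue : Boolean.TRUE;
    }

    public static void setValidationEnabled(String portName, boolean enabled) {
        configuration.put(portName, enabled);
    }
}
```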

Next we need a way to manipulate this structure at application runtime. Java's standard API for management tasks like this is JMX. The following listing shows a JMX MBean which can later be invoked to enable or disable validation for a given port:

//MBean interface
public interface EndpointConfigurationMBean {

    public String getName();

    public boolean isSchemaValidationEnabled();

    public void setSchemaValidationEnabled(boolean enabled);

}

//MBean implementation
public class EndpointConfiguration implements EndpointConfigurationMBean {

    private String name;

    public EndpointConfiguration() {}

    public EndpointConfiguration(String name) {
        this.name = name;
    }

    public String getName() {
        return name;
    }

    public boolean isSchemaValidationEnabled() {
        return ValidationConfigurationHolder.isValidationEnabled(name);
    }

    public void setSchemaValidationEnabled(boolean enabled) {
        ValidationConfigurationHolder.setValidationEnabled(name, enabled);
    }

}

The customized validation tube

The schema validation support of Metro is basically realized in three classes: AbstractSchemaValidationTube as well as ServerSchemaValidationTube and ClientSchemaValidationTube, which both extend the former. To make schema validation configurable on the server side, we create a sub-class of ServerSchemaValidationTube:

public class ConfigurableServerSchemaValidationTube extends ServerSchemaValidationTube {

    private String name;

    public ConfigurableServerSchemaValidationTube(WSEndpoint<?> endpoint, WSBinding binding,
            SEIModel seiModel, WSDLPort wsdlPort, Tube next) {

        super(endpoint, binding, seiModel, wsdlPort, next);

        name = seiModel.getServiceQName().getLocalPart() + "-" + wsdlPort.getName().getLocalPart();
        ValidationConfigurationHolder.setValidationEnabled(name, true);
        registerMBean();
    }

    private void registerMBean() {

        MBeanServer mbs = ManagementFactory.getPlatformMBeanServer(); 

        try {
            mbs.registerMBean(
                new EndpointConfiguration(name),
                new ObjectName("de.gmorling.moapa.videorental.jmx:type=Endpoints,name=" + name));
        }
        catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    @Override
    protected boolean isNoValidation() {
        return !ValidationConfigurationHolder.isValidationEnabled(name);
    }

    protected ConfigurableServerSchemaValidationTube(ConfigurableServerSchemaValidationTube that, TubeCloner cloner) {
        super(that,cloner);
        this.name = that.name;
    }

    public ConfigurableServerSchemaValidationTube copy(TubeCloner cloner) {
        return new ConfigurableServerSchemaValidationTube(this, cloner);
    }

}

The important thing here is that the method isNoValidation() is overridden. This method is defined in the base class AbstractSchemaValidationTube and is invoked whenever a request/response is processed by this tube. The implementation simply delegates to the ValidationConfigurationHolder shown above.

Within the constructor an instance of the EndpointConfiguration MBean for the given endpoint is registered with the platform MBean server, which later can be used to toggle validation for this endpoint.

Providing a TubeFactory

Metro tubes are instantiated by implementations of the TubeFactory interface. The following implementation allows the Metro runtime to retrieve instances of the ConfigurableServerSchemaValidationTube:

public class ConfigurableValidationTubeFactory implements TubeFactory {

    //...

    public Tube createTube(ServerTubelineAssemblyContext context)
            throws WebServiceException {

        ServerTubeAssemblerContext wrappedContext = context.getWrappedContext();
        WSEndpoint<?> endpoint = wrappedContext.getEndpoint();
        WSBinding binding = endpoint.getBinding();
        Tube next = context.getTubelineHead();
        WSDLPort wsdlModel = wrappedContext.getWsdlModel();
        SEIModel seiModel = wrappedContext.getSEIModel();

        if (binding instanceof SOAPBinding && binding.isFeatureEnabled(SchemaValidationFeature.class) && wsdlModel != null) {
            return new ConfigurableServerSchemaValidationTube(endpoint, binding, seiModel, wsdlModel, next);
        }
        else {
            return next;
        }
    }
}

The code resembles Metro's instantiation logic for the standard validation tube, the only difference being that a ConfigurableServerSchemaValidationTube is created. Note that we also check whether the SchemaValidationFeature is enabled. That way given endpoints can be completely excluded from validation by simply not annotating them with @SchemaValidation.

To register the tube factory with Metro as part of a custom tubeline the configuration file META-INF/metro.xml has to be provided:

<?xml version="1.0" encoding="UTF-8"?>
<metro  xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance'
   xmlns='http://java.sun.com/xml/ns/metro/config'
   version="1.0">
    <tubelines default="#custom-tubeline">
        <tubeline name="custom-tubeline">
            <client-side>
                ...
            </client-side>
            <endpoint-side>
                ...
                <tube-factory className="com.sun.xml.ws.assembler.jaxws.HandlerTubeFactory" />
                <tube-factory className="de.gmorling.moapa.videorental.metro.ConfigurableValidationTubeFactory" />
                <tube-factory className="com.sun.xml.ws.assembler.jaxws.TerminalTubeFactory" />
            </endpoint-side>
        </tubeline>
    </tubelines>
</metro>

As suggested in a post by Metro developer Marek Potociar, it's a good idea to take Metro's standard tube configuration, metro-default.xml, as a template and adapt it as needed. In our case the ConfigurableValidationTubeFactory is registered instead of the standard ValidationTubeFactory.

Configuring Schema Validation using VisualVM

Having registered the tube factory, it's time to give the whole thing a test run. A complete project containing all sources from this post together with an exemplary video rental web service can be found in my Git repo.

When starting the project's main class the service will be fired up and waiting for requests. Any JMX client such as VisualVM can now be used to connect to the running JVM. The following screenshot shows VisualVM connected to the video rental application:

By setting the value of the "SchemaValidationEnabled" property of the MBean named "VideoRentalService-VideoRentalPort", schema validation can now be enabled or disabled for the corresponding endpoint.
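Toggling the flag is not tied to VisualVM; any JMX client can set the attribute through the MBeanServer API, which is ultimately what remote clients invoke. The following self-contained sketch (restating a condensed version of the MBean from above, with the object name assumed from the earlier listing) registers the MBean on the platform MBean server and flips the attribute programmatically:

```java
import java.lang.management.ManagementFactory;
import javax.management.Attribute;
import javax.management.MBeanServer;
import javax.management.ObjectName;

public class JmxToggleDemo {

    // Condensed restatement of the MBean shown earlier, kept self-contained here
    public interface EndpointConfigurationMBean {
        boolean isSchemaValidationEnabled();
        void setSchemaValidationEnabled(boolean enabled);
    }

    public static class EndpointConfiguration implements EndpointConfigurationMBean {
        private volatile boolean enabled = true;
        public boolean isSchemaValidationEnabled() { return enabled; }
        public void setSchemaValidationEnabled(boolean enabled) { this.enabled = enabled; }
    }

    // Registers the MBean and flips the attribute the same way a JMX client
    // such as VisualVM would; returns the value read back afterwards.
    public static boolean toggleAndReadBack() {
        try {
            MBeanServer mbs = ManagementFactory.getPlatformMBeanServer();
            ObjectName name = new ObjectName(
                "de.gmorling.moapa.videorental.jmx:type=Endpoints,name=VideoRentalService-VideoRentalPort");
            mbs.registerMBean(new EndpointConfiguration(), name);

            // what a JMX client does under the hood when editing the attribute
            mbs.setAttribute(name, new Attribute("SchemaValidationEnabled", Boolean.FALSE));
            return (Boolean) mbs.getAttribute(name, "SchemaValidationEnabled");
        }
        catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```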

Update (March 10, 2010): I created a request for enhancement for this feature: Allow schema validation to be enabled/disabled at runtime. So let's see whether this will be part of some future Metro version :-)