Enterprise Java Development
Built on: 2019-08-22 07:13 EST
Copyright © 2019 jim stafford (jim.stafford@jhu.edu)
Abstract
This book contains lab exercises for JHU 605.784.31.
This paper contains setup information required to begin building, deploying, running, and modifying examples. It also provides a basis for building class projects to be turned in.
Table of Contents
Establish a build environment for:
Building and deploying class examples
Developing class projects
Exposure to various JavaEE build environment tools
Many of the command line examples listed in these instructions use bash syntax on Linux. You will need to translate the specific commands into the shell syntax that is appropriate for your environment. The bash shell or Linux is not a requirement for class. Many students have taken this class and used various versions of Windows and Mac systems.
It is never a good idea to install software in a directory with spaces in the fully qualified path name if the software contains jar files that will eventually be part of a classpath. I strongly recommend your working directory, the JDK, Wildfly, and all repository locations comply with this critical guideline or you may suffer classpath issues at some point.
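As a quick sanity check (a sketch, not part of the class build), a bash one-liner can flag a working directory whose fully qualified path contains spaces:

```shell
# Warn if the current working directory's full path contains a space.
case "$(pwd)" in
  *" "*) echo "WARNING: path contains spaces -- pick another directory" ;;
  *)     echo "OK: no spaces in path" ;;
esac
```

Run it from your intended working directory before installing the JDK, Wildfly, or Maven there.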
You will need a JDK 11 installed.
Although most of the source code and maven modules used in class are backwards compatible with JDK 8, the modules and the build system around it have been migrated to JDK 11. There are a few changes between JDK 8 and JDK 11 that will make flipping between the two difficult. Therefore target JDK 11 for use in class.
Removal of JavaEE APIs from JavaSE requiring additional dependencies for the JDK 11 classpath
Command line changes to the JDK that require custom maven-compiler-plugin configuration
Oracle JDK is no longer free for commercial use. A suitable alternative is OpenJDK.
Keep the 32/64-bit choice consistent with what you download later for Eclipse.
Mac Users: AdoptOpenJDK or through a package manager (e.g., brew)
Linux Users: through a package manager (e.g., yum, apt)
$ sudo apt install openjdk-11-jdk
Windows Users: Azul OpenJDK
Depending on how you performed the installation, you may need to add the JDK to your PATH.
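On a bash system the PATH update might look like the following. The JDK install location shown is an assumption -- substitute the actual path from your installation:

```shell
# Point JAVA_HOME at the JDK 11 installation (example path) and
# put its bin/ directory at the front of the PATH.
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"
```

Place these lines in .bashrc or .bash_profile so they apply to every new shell.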
Figure 2.2. Mac Example
$ java -version
openjdk version "11.0.4" 2019-07-16
OpenJDK Runtime Environment AdoptOpenJDK (build 11.0.4+11)
OpenJDK 64-Bit Server VM AdoptOpenJDK (build 11.0.4+11, mixed mode)
$ javac -version
javac 11.0.4
Figure 2.3. Ubuntu Linux Example
$ java -version
openjdk version "11.0.4" 2019-07-16
OpenJDK Runtime Environment (build 11.0.4+11-post-Ubuntu-1ubuntu218.04.3)
OpenJDK 64-Bit Server VM (build 11.0.4+11-post-Ubuntu-1ubuntu218.04.3, mixed mode, sharing)
$ javac -version
javac 11.0.4
Figure 2.4. Windows Example
> java -version
openjdk version "11.0.4" 2019-07-16 LTS
OpenJDK Runtime Environment Zulu11.33+15-CA (build 11.0.4+11-LTS)
OpenJDK 64-Bit Server VM Zulu11.33+15-CA (build 11.0.4+11-LTS, mixed mode)
> javac -version
javac 11.0.4
You will use Git in this class to perform an initial checkout and get updates for source files. Any Git client should be able to perform that function. You can determine if you have a command line Git client already installed using the following simple command.
Figure 3.1. Verifying if Git is Installed
//my Ubuntu system
$ git --version
git version 2.17.1

//my Mac system
$ git --version
git version 2.22.0

//my Windows system
> git --version
git version 2.22.0.windows.1
There are a number of options, and some depend on your platform. Your basic options include the command line or an Eclipse plugin.
Eclipse GUI Users: There is a git plugin for Eclipse available called EGit. It automatically comes with Eclipse these days, so no extra installation is required.
Linux Command-Line Users: Use your package installer. My normal choice is to select git, gitk, and git-gui packages.
//my Ubuntu system
$ sudo apt update
$ sudo apt install git gitk git-gui
$ git --version
git version 2.17.1
Mac Users: Although confusing at times, I find most development tools are available thru the brew package manager.
//my Mac
$ brew update
$ brew install git
$ git --version
git version 2.22.0
Cygwin Command-Line Users: Use the cygwin setup tool to locate the git packages.
Windows Command-Line Users: Use one of the many available packagings for Git
git-scm example
Download and launch the installer
Pick your choice of editors.
Pick how deeply integrated with the Windows shell you want Git and Git bash commands to be. When working on Windows systems, I find the Git bash shell to be a very usable and lightweight alternative to Cygwin.
Choose your checkout style. Don't worry about commit style; you will only be performing read-only checkouts. I chose "Checkout as-is, commit Unix-style line endings" since I use Linux-style editors on Windows. You may want the Windows checkout format.
Choose your terminal window. I would recommend anything over the default Windows console window -- but that is a personal preference.
Click on the "Git Bash" icon after the installation is complete.
The class repository is located on github and can be browsed using the following http URL https://github.com/ejavaguy/ejava-student. With a cloned copy, you can receive file updates during the semester.
CD to a directory you wish to place source code. Make sure the path to this directory contains no spaces.
Clone the class repository using the following URL git://github.com/ejavaguy/ejava-student.git
$ git clone git://github.com/ejavaguy/ejava-student.git
Cloning into 'ejava-student'...
...
Checking out files: 100% (1289/1289), done.
...
$ ls ejava-student/
...
$ cd ejava-student
$ git branch -a    //list all branches -- local and remote
* master
  remotes/origin/HEAD -> origin/master
  remotes/origin/master
Git leaves you with all branches fetched and a local master branch referencing the class' master branch on github. You will be using the master branch for the duration of the semester. Other branches may show up, including my working branches where I am actively working on the next wave of updates. The master branch is usually updated the evening before or the day of class and should always be stable.
Perform a mock update. This is what you will be doing several times this semester to get file updates.
$ git checkout master    #switches to master branch
$ git pull               #downloads changes and attempts merge
Already up-to-date.
There are many modules within the class repository. Some are ready for use, some are still being worked, and some are not for use this semester. The ones ready for your use will be wired into the build and will be compiled during a follow-on section. The list will increase as the semester moves forward. Please ignore these extra modules. Keeping them within the baseline helps me keep related things centralized.
If you ever make changes to the class examples and would like to keep those changes separate from the updates, store them in a new branch at any time using the following git commands.
$ git checkout -b new-branch         #creates new branch from current branch
                                     #and switches to that branch
$ git commit -am "saving my stuff"   #commits all dirty files to new branch
$ git checkout master                #switches back to the master branch
If you simply want to throw away any changes you made, you can discard those changes to tracked files using the following git commands.
$ git reset --hard master
$ git clean -dn    #shows you what it would delete without deleting
$ git clean -df    #deletes files not managed or specifically ignored by git

(Note: git clean has no -r option; -d recurses into untracked directories, -n is a dry run, and -f forces the delete.)
Download Maven 3 http://maven.apache.org/download.html or download thru a package manager
Unzip the contents into a directory with no spaces in its path.
$ ls apache-maven-3.6.1
bin  boot  conf  lib  LICENSE  NOTICE  README.txt
Add an environment variable for MAVEN_HOME and add MAVEN_HOME/bin to your PATH
# my bash systems -- should be done in .bashrc or .bash_profile
export MAVEN_HOME=/opt/apache-maven-3.6.1
export PATH=$MAVEN_HOME/bin:$PATH

# my Windows system -- should be done in Advanced System Settings->Environment Variables
set MAVEN_HOME=c:/apps/apache-maven-3.6.1
set PATH=%MAVEN_HOME%\bin;%PATH%
Verify maven is installed and in the path
//my Ubuntu system
$ mvn -version
Apache Maven 3.6.1 (d66c9c0b3152b2e69ee9bac180bb8fcc8e6af555; 2019-04-04T15:00:29-04:00)
Maven home: /opt/apache-maven-3.6.1
Java version: 11.0.4, vendor: Ubuntu, runtime: /usr/lib/jvm/java-11-openjdk-amd64
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "5.0.0-23-generic", arch: "amd64", family: "unix"

//my Mac system - installed via brew
$ mvn -version
Apache Maven 3.6.1 (d66c9c0b3152b2e69ee9bac180bb8fcc8e6af555; 2019-04-04T15:00:29-04:00)
Maven home: /usr/local/Cellar/maven/3.6.1/libexec
Java version: 1.8.0_202, vendor: AdoptOpenJdk, runtime: /Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk/Contents/Home/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "mac os x", version: "10.14.5", arch: "x86_64", family: "mac"

//my Windows system
> mvn -version
Apache Maven 3.6.1 (d66c9c0b3152b2e69ee9bac180bb8fcc8e6af555; 2019-04-04T15:00:29-04:00)
Maven home: C:\apps\apache-maven-3.6.1\bin\..
Java version: 11.0.4, vendor: Azul Systems, Inc., runtime: C:\apps\zulu-openjdk11
Default locale: en_US, platform encoding: Cp1252
OS name: "windows 10", version: "10.0", arch: "amd64", family: "windows"
Add a skeletal settings.xml file that will be used to provide local overrides for the build. This is the place where you can customize the build for local environment specifics like directory locations, server address, server ports, etc.
Create a .m2 directory below your HOME directory.
Add the following to the .m2/settings.xml file below your HOME directory.
<?xml version="1.0"?>
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0 http://maven.apache.org/xsd/settings-1.0.0.xsd">
<offline>false</offline>
<profiles>
</profiles>
<activeProfiles>
</activeProfiles>
</settings>
You can test whether your settings.xml file is seen by Maven by temporarily making it an invalid XML file and verifying that the next Maven build command fails with a parsing error.
$ mvn clean
[ERROR] Error executing Maven.
[ERROR] 1 problem was encountered while building the effective settings
[FATAL] Non-parseable settings /home/user/.m2/settings.xml: only whitespace content allowed before start tag and not s (position: START_DOCUMENT seen <?xml version="1.0"?>\ns... @2:2) @ /home/user/.m2/settings.xml, line 2, column 2
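A less destructive check (a sketch, assuming Python 3 is available on your machine) is to parse the file directly instead of breaking it on purpose:

```shell
# Parse ~/.m2/settings.xml; prints a confirmation only if the file is
# well-formed XML, otherwise exits with a parse error.
python3 -c 'import sys, xml.etree.ElementTree as ET; ET.parse(sys.argv[1]); print("settings.xml parses OK")' "$HOME/.m2/settings.xml"
```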
Add a default specification for the database profile we will be using for class at the bottom of the .m2/settings.xml file in your HOME directory.
<activeProfiles>
<activeProfile>h2db</activeProfile>
</activeProfiles>
If your operating system HOME directory has spaces in the path (e.g., Windows XP's Documents and Settings), then add a localRepository path specification to the .m2/settings.xml file and have it point to a location that does not have spaces in the path. The path does not have to exist. It will be created during the next build.
<offline>false</offline>
<!-- this overrides the default $HOME/.m2/repository location. -->
<localRepository>c:/jhu/repository</localRepository>
Each week you will be asked to update your cloned copy of the class examples and perform a test build. This will give both of us some comfort that your environment is setup correctly and act as a baseline for debugging your class assignments. Therefore, do the following to test your initial installation and repeat each week.
Change your current directory to the root of the cloned repository and make sure you have a current copy.
$ ls
README.md  async  build  common  coursedocs  ejb  javase  jpa  pom.xml  projects  src
$ git checkout master
$ git pull origin master
From https://github.com/ejavaguy/ejava-student
 * branch            master     -> FETCH_HEAD
Already up-to-date.
Test your configuration using
$ mvn clean install
[INFO] Scanning for projects...
...
If you receive an "OutOfMemoryError: PermGen space" error, you can update the amount of memory allocated to the build by setting MAVEN_OPTS.
You can set this in the current shell using one of the following commands
bash> export MAVEN_OPTS="-Xmx512m"
windows> set MAVEN_OPTS=-Xmx512m
You can optionally set these properties in one of the shell-specific environment scripts.
bash> $HOME/.mavenrc
windows> %HOME%\mavenrc_pre.bat
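A minimal $HOME/.mavenrc for bash might contain just the memory setting (a sketch; adjust the value to your machine):

```shell
# $HOME/.mavenrc -- sourced by the mvn launcher on bash systems.
# Raises the maximum heap available to Maven builds.
MAVEN_OPTS="-Xmx512m"
```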
If you are getting the following error...
[ERROR] Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty -> [Help 2]
and a -list of the default cacerts file without a password produces a type of PKCS12 and 0 entries ...
$ keytool -list -cacerts
Enter keystore password:
Keystore type: PKCS12
Keystore provider: SUN

Your keystore contains 0 entries
and executing the same command with a changeit password produces entries...
$ keytool -list -cacerts
Enter keystore password: changeit
Keystore type: PKCS12
Keystore provider: SUN

Your keystore contains 134 entries
then you have a post JDK9 generated truststore and will need to supply the password each time you execute a Java command. The former default format of truststores was changed from JKS to PKCS12 in JDK9. JKS allows access to public keys without a password. PKCS12 requires a password.
$ mvn clean install -Djavax.net.ssl.trustStorePassword=changeit
One workaround would be to add the trustStore password to the .mavenrc mentioned earlier.
bash> export MAVEN_OPTS="-Djavax.net.ssl.trustStorePassword=changeit"
windows> set MAVEN_OPTS=-Djavax.net.ssl.trustStorePassword=changeit
Another workaround option is to regenerate the truststore as a JKS that does not require a password to obtain public certs. Edit the security/java.security property file.
sudo vi /etc/java-11-openjdk/security/java.security
Change the keystore.type from pkcs12 to jks.
# Default keystore type.
#
#keystore.type=pkcs12
keystore.type=jks
Remove the existing cacerts file and re-generate it.
$ sudo rm /etc/ssl/certs/java/cacerts
$ sudo update-ca-certificates -f
The newly generated cacerts file will be of type JKS and enable access to public certs without a password.
$ keytool -list -cacerts
Enter keystore password:
Keystore type: JKS
Keystore provider: SUN

Your keystore contains 133 entries
There are a few cases where dependencies cannot be hosted in public repositories and must be downloaded and installed manually. Oracle DB Client is one example.
Figure 4.1. Missing Maven Dependency Error/Warning Message
Failure to find com.oracle:ojdbc6:pom:11.2.0.3 in ... was cached in the local repository, resolution will not be reattempted until the update interval of ... has elapsed or updates are forced.
If the message is a warning (i.e., for site/javadoc documentation), it can be ignored. If you want to eliminate the warning, or it is coming up as an error, you can download the artifact directly from the vendor and manually install it in your local repository.
This is only an example. You are *not* required to download the Oracle database driver for class. You can create a dummy file ($ touch dummy.jar) and register it using a dummy groupId, artifactId, and version if you wish.
Download the driver jar from Oracle; accept the license agreement.
Install it manually into your localRepository
$ mvn install:install-file -Dfile=/home/jcstaff/Downloads/ojdbc6.jar \
    -DgroupId=com.oracle -DartifactId=ojdbc6 -Dversion=11.2.0.3 -Dpackaging=jar
[INFO] Scanning for projects...
...
[INFO] --- maven-install-plugin:2.4:install-file (default-cli) @ standalone-pom ---
[INFO] Installing /home/jcstaff/Downloads/ojdbc6.jar to /home/jcstaff/.m2/repository/com/oracle/ojdbc6/11.2.0.3/ojdbc6-11.2.0.3.jar
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
...
The class web site will be pre-built with up-to-date information. However, you may wish to have a local version and can do so by performing the following.
$ mvn clean verify site -DskipTests
$ mvn site:stage    # the output will be in target/staging
If you have more time and wish to generate more detailed reports, execute the following.
$ mvn clean verify site -Preports
$ mvn site:stage    # the output will be in target/staging
If you have less time and are looking only to get up-to-date documents, execute the following from the coursedocs directory.
$ cd coursedocs
$ mvn site:stage    # the output will be in target/site
We will be using the JBoss/Wildfly Application Server this semester. This is a fully-compliant JavaEE 8 application server.
JBoss has a community version (formerly called JBoss AS -- renamed Wildfly ~2012) and a commercial version (JBoss EAP) of their JavaEE application server. Both are open source and built off the same code base. In theory, changes propagate through the community version first in daily changes and short iterations, and the commercial version is a roll-up of a stable version of the community version with the ability to purchase support for that specific version. With commercial-version support you can receive patches for a specific issue prior to upgrading to the latest release. With the community version you pretty much need to keep up with the latest release to get any patches. Of course, with either version you are free to perform your own support and code changes, but you can only get this commercially with the EAP release. There is a newsgroup post and slide show that provides a decent, short description of the two.
JBoss makes the EAP version available for *development* use from jboss.org but lags behind Wildfly (at wildfly.org) for obvious reasons. We will be using the open source/Wildfly version of the server.
JBoss AS/Wildfly version numbers are ahead of JBoss EAP because not every community version becomes a commercial version. JBoss AS 6 was skipped entirely by EAP.
Download Wildfly 17.0.1.Final https://www.wildfly.org/downloads/. The 'Quickstarts' examples are also helpful but class notes, exercises, and guidance may have simplified or alternative approaches to what is contained in the guides.
Install JBoss into a directory that does not have any spaces in its path.
$ unzip ~/Downloads/wildfly-17.0.1.Final.zip
$ ls wildfly-17.0.1.Final/
appclient  bin  copyright.txt  docs  domain  jboss-modules.jar  LICENSE.txt  modules  README.txt  standalone  welcome-content
Test the installation by starting the default configuration installation.
$ ./wildfly-17.0.1.Final/bin/standalone.sh
=========================================================================

  JBoss Bootstrap Environment

  JBOSS_HOME: /opt/wildfly-17.0.1.Final

  JAVA: java

  JAVA_OPTS: -server -Xms64m -Xmx512m -XX:MetaspaceSize=96M -XX:MaxMetaspaceSize=256m -Djava.net.preferIPv4Stack=true -Djboss.modules.system.pkgs=org.jboss.byteman -Djava.awt.headless=true --add-exports=java.base/sun.nio.ch=ALL-UNNAMED --add-exports=jdk.unsupported/sun.misc=ALL-UNNAMED --add-exports=jdk.unsupported/sun.reflect=ALL-UNNAMED

=========================================================================

17:37:03,254 INFO  [org.jboss.modules] (main) JBoss Modules version 1.9.1.Final
17:37:04,270 INFO  [org.jboss.msc] (main) JBoss MSC version 1.4.8.Final
17:37:04,292 INFO  [org.jboss.threads] (main) JBoss Threads version 2.3.3.Final
17:37:04,558 INFO  [org.jboss.as] (MSC service thread 1-1) WFLYSRV0049: WildFly Full 17.0.1.Final (WildFly Core 9.0.2.Final) starting
...
17:37:11,257 INFO  [org.jboss.as] (Controller Boot Thread) WFLYSRV0060: Http management interface listening on http://127.0.0.1:9990/management
17:37:11,259 INFO  [org.jboss.as] (Controller Boot Thread) WFLYSRV0051: Admin console listening on http://127.0.0.1:9990
17:37:11,259 INFO  [org.jboss.as] (Controller Boot Thread) WFLYSRV0025: WildFly Full 17.0.1.Final (WildFly Core 9.0.2.Final) started in 8842ms - Started 314 of 576 services (369 services are lazy, passive or on-demand)
There are .sh version of scripts for *nix platforms and .bat forms of the scripts for Windows platforms. Use the one that is appropriate for your environment.
Verify you can access the server
Main Page: http://localhost:8080
Admin Page: http://localhost:9990/console (This will fail until the admin account is added.)
Shutdown the server using Control-C
Copy over the class example server files from what you cloned and built from github earlier.
$ cd wildfly-17.0.1.Final
wildfly-17.0.1.Final]$ unzip .../ejava-student/servers/ejava-wildfly1701/target/ejava-wildfly1701-5.1.0-SNAPSHOT-server.zip
Archive:  .../ejava-student/servers/ejava-wildfly1701/target/ejava-wildfly1701-5.1.0-SNAPSHOT-server.zip
replace standalone/configuration/application-roles.properties? [y]es, [n]o, [A]ll, [N]one, [r]ename: A
  inflating: standalone/configuration/application-roles.properties
  inflating: standalone/configuration/application-users.properties
  inflating: standalone/configuration/application.keystore
  inflating: standalone/configuration/standalone.xml
  inflating: domain/configuration/application-roles.properties
  inflating: domain/configuration/application-users.properties
Restart the server
There is a bug within the Artemis JMS logging initialization that requires an upgrade of that software to fix. I have seen the error on Windows but not on *nix systems. It is activated when we add the ejava class configuration files, because the class configuration adds JMS -- which is not part of the standard configuration.
18:13:50,363 WARN [org.apache.activemq.artemis.core.server] (ServerService Thread Pool -- 82) AMQ222277: Problem initializing automatic logging configuration reload for file:c:\apps\wildfly-17.0.1.Final\standalone\configuration/logging.properties: java.net.URISyntaxException: Illegal character in opaque part at index 7: file:c:\apps\wildfly-17.0.1.Final\standalone\configuration/logging.properties
Use the batch script to add an admin user to the system. Note the password must have at least one digit and one non-alphanumeric character. If you run the application server on a remote machine or under a different account, please use the jboss.user and jboss.password supplied in build/ejava-build-parent/pom.xml. JBoss/Wildfly will bypass user credentials when the client executes on the same machine as the same user that started the server.
ejava-student]$ egrep 'jboss.user|jboss.password' -R build
build/ejava-build-parent/pom.xml:        <jboss.user>admin</jboss.user>
build/ejava-build-parent/pom.xml:        <jboss.password>password1!</jboss.password>
$ ./bin/add-user.sh

What type of user do you wish to add?
 a) Management User (mgmt-users.properties)
 b) Application User (application-users.properties)
(a):

Enter the details of the new user to add.
Using realm 'ManagementRealm' as discovered from the existing property files.
Username : admin
User 'admin' already exists and is disabled, would you like to...
 a) Update the existing user password and roles
 b) Enable the existing user
 c) Type a new username
(a): a
Password recommendations are listed below. To modify these restrictions edit the add-user.properties configuration file.
 - The password should be different from the username
 - The password should not be one of the following restricted values {root, admin, administrator}
 - The password should contain at least 8 characters, 1 alphabetic character(s), 1 digit(s), 1 non-alphanumeric symbol(s)
Password :
Re-enter Password :
What groups do you want this user to belong to? (Please enter a comma separated list, or leave blank for none)[  ]:
Updated user 'admin' to file '/opt/wildfly-17.0.1.Final/standalone/configuration/mgmt-users.properties'
Updated user 'admin' to file '/opt/wildfly-17.0.1.Final/domain/configuration/mgmt-users.properties'
Updated user 'admin' with groups to file '/opt/wildfly-17.0.1.Final/standalone/configuration/mgmt-groups.properties'
Updated user 'admin' with groups to file '/opt/wildfly-17.0.1.Final/domain/configuration/mgmt-groups.properties'
Is this new user going to be used for one AS process to connect to another AS process?
e.g. for a slave host controller connecting to the master or for a Remoting connection for server to server EJB calls.
yes/no? no
Retry logging into the Admin Application http://localhost:9990/console
Wildfly comes already prepared for remote debugging using a command line option, so there is no longer anything detailed to do for this step (nice!).
Restart the Wildfly server in debug mode. The default port is 8787, but can be easily customized with an extra command line argument.
## from bin/standalone.sh
# Use --debug to activate debug mode with an optional argument to specify the port.
# Usage : standalone.sh --debug
#         standalone.sh --debug 9797
Restart the server and notice the additional listen output. Use control-C to stop the server.
$ ./bin/standalone.sh --debug
=========================================================================
...
=========================================================================
Listening for transport dt_socket at address: 8787
<control-C>
$ ./bin/standalone.sh --debug 8000
=========================================================================
...
=========================================================================
Listening for transport dt_socket at address: 8000
If you already have a process listening on localhost:8080 or any of the other JBoss ports on 127.0.0.1, you can switch addresses by editing the interfaces section of standalone.xml. You can also do this at runtime by adding -Djboss.bind.address.management=... and/or -Djboss.bind.address=... on the command line.
Note that I said "If..." above. You only need to modify the network information if you are running into a conflict on your development platform. Change is not hard, but keeping the default is the simplest way to go.
<interfaces>
<interface name="management">
<loopback-address value="${jboss.bind.address.management:127.0.0.2}"/>
</interface>
<interface name="public">
<loopback-address value="${jboss.bind.address:127.0.0.2}"/>
</interface>
<interface name="unsecure">
<inet-address value="${jboss.bind.address.unsecure:127.0.0.2}"/>
</interface>
</interfaces>
Provide a specification to Maven of where your JBoss server has been installed in the .m2/settings.xml file using the jboss.home property. *If* you have changed the address, you can specify that using the jboss.host property.
<profile>
<id>wildfly17</id>
<properties>
<jboss.host>127.0.0.2</jboss.host>
<jboss.home>/opt/wildfly-17.0.1.Final</jboss.home>
</properties>
</profile>
</profiles>
...
<activeProfiles>
<activeProfile>wildfly17</activeProfile>
<activeProfile>h2db</activeProfile>
</activeProfiles>
</settings>
The application server and application clients used in class require a relational database. Application server vendors generally package a lightweight database with their downloads so that the server can be used immediately for basic scenarios. JBoss comes packaged with the H2 database. This database can run in one of three modes:
Embedded/in-memory
Embedded/file
Server-based
The in-memory and file-based embedded modes require no administrative setup. The database runs within the JVM of its host process. This makes it easy to startup and run tests or demonstrations. The in-memory mode keeps everything in-memory and should be the fastest. The file-based mode stores information to disk and is basically the same thing as the remote server except that it runs local to the client process. In both the file-based and server-based modes -- there can only be one process working on the database file(s) at a time. In server-mode, you can connect your applications and a UI (to manually inspect and modify) the database at the same time.
JBoss and the class examples come setup with the embedded drivers. The application server is initially set to in-memory and the class examples are set to file-based. The class examples are set to file-based mode so that you can test manual DDL schema creation prior to running unit tests while still allowing for zero administration to run tests. The in-memory mode requires that all schema be created by the process hosting the database. JBoss and the class development environment can both be easily modified to use the same database server instance in server-mode and switched back. You will learn how to do that here.
Embedded mode requires less administration overhead in the test environment but is restricted to a single client.
Server mode provides access to database state during application execution from multiple clients -- which is good for debugging.
File-based embedded and server-modes allow you to leverage external plugins that access your database independently to create and populate your database tables separate from your application code.
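The three modes map directly to the form of the H2 JDBC URL. The URLs below are illustrative examples (the file paths are assumptions, not class-mandated values):

```
jdbc:h2:mem:test                           # embedded, in-memory (state lost at JVM exit)
jdbc:h2:./h2db/ejava                       # embedded, file-based (one process at a time)
jdbc:h2:tcp://127.0.0.1:9092/./h2db/ejava  # server-based (multiple clients over TCP)
```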
Obtain a copy of the H2 database jar from one of the following sources
Within the JBoss installation
wildfly-x.x.x.Final/modules/.../com/h2database/h2/main/
Internet Maven Repository ( http://repo2.maven.org/maven2/com/h2database/h2/)
Local Maven Repository: HOME/.m2/repository/com/h2database/h2/ if previously downloaded by a DB example module
Product Web Site ( http://h2database.com/html/download.html)
Start database, web server, and launch browser page
java -jar h2.jar
Start database and web server only
java -jar h2.jar -tcp -web
If the database comes up delayed and slow and your development system (like mine) has a lot of virtual networks defined, override the default behavior of binding to all addresses and specify a specific one.
java -Dh2.bindAddress=127.0.0.1 -jar h2.jar
Connect to URL http://127.0.1.1:8082 from a browser
Use JDBC URL: jdbc:h2:tcp://127.0.0.1:9092/./h2db/ejava
Log in as user "sa" and empty password
This will create database file(s) prefixed with ejava in a folder called h2db relative to where you started the database server.
Do one of the following
add -P\!h2db -Ph2srv to the command line of each build
The bang ("!") character turns off a profile. Unix shells require the bang ("!") character to be escaped. Windows DOS does not.
change the settings.xml activeProfile specification from embedded mode (h2db)
<activeProfile>h2db</activeProfile>
to server mode (h2srv)
<activeProfile>h2srv</activeProfile>
If you look at the root pom.xml (build/ejava-build-parent/pom.xml), the database server profile defines the following:
<profile> <!-- H2 server-based DB -->
<id>h2srv</id>
<properties>
<jdbc.driver>org.h2.Driver</jdbc.driver>
<jdbc.url>jdbc:h2:tcp://${db.host}:9092/./h2db/ejava</jdbc.url>
<jdbc.user>sa</jdbc.user>
<jdbc.password/>
<hibernate.dialect>
org.hibernate.dialect.H2Dialect
</hibernate.dialect>
</properties>
<dependencies>
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
</profile>
Change standalone/configuration/standalone.xml from
<connection-url>jdbc:h2:mem:test;DB_CLOSE_DELAY=-1</connection-url>
...
<security>
<user-name>sa</user-name>
<password>sa</password>
</security>
to
<connection-url>jdbc:h2:tcp://${jboss.bind.address:127.0.0.1}:9092/./h2db/ejava</connection-url>
...
<security>
<user-name>sa</user-name>
<password></password>
</security>
This will use the same database file(s) as before in the folder called h2db relative to where you started the database server.
The most current release of Eclipse IDE as of preparing these instructions is 2019-06. Eclipse IDE seems to have gone from a yearly release cycle to a three-month release cycle. When they followed a yearly release cycle, it was always released in the August/September timeframe, and that was the worst time to switch. The best time to upgrade under the yearly cycle was the January timeframe, when issues had been reported and patches had had time to be released. I am not sure yet how to time the new three-month cycles -- but needless to say "the latest" is not a requirement for this course. If you already have an IDE or a version of Eclipse that you are comfortable with, there should be no reason to upgrade.
There is no course requirement that you use Eclipse IDE or the latest Eclipse IDE. Many students have completed this course using IntelliJ.
Download Eclipse IDE for Enterprise Java Developers (formerly Eclipse IDE for JavaEE Developers) or the latest from Eclipse Packages or Eclipse Downloads
Install Eclipse IDE for JavaEE Developers and Start
For the Mac installation, a .dmg was provided. Install Eclipse to the Applications folder, pin it to the taskbar so that it is easy to locate, and start Eclipse.
For the Linux installation, a tar.gz archive was provided. Extract this to a location on your computer, and start Eclipse.
For the Windows installation, a .zip archive was provided. Extract this to a
location on your computer, pin the eclipse
executable
to the taskbar so that it is easy to locate, and start Eclipse.
Eclipse will default to the JRE when launching most tools/applications. Sometimes this can be an issue and requires a JDK. Add the JDK and make it your default JRE. Make sure it is JDK 11.
Window->Preferences->Java->Installed JREs
press Add..->Standard VM and reference the JDK instance you installed earlier
Make the new JDK VM your default
It is a good idea to use assertions (assert foo!=null : "hey!";) in your code and to enable assertions within your development environment (by adding -ea to the "Default VM Arguments" setting within the JRE Definition).
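A tiny throwaway class (hypothetical -- not part of the course tree) shows what -ea changes: the assert below only fires when assertions are enabled.

```java
public class AssertDemo {
    // Stand-in for some method whose result we want to sanity-check.
    static int returnOne() {
        return 1;
    }

    public static void main(String[] args) {
        int value = returnOne();
        // With -ea this throws AssertionError when the condition is false;
        // without -ea the assert statement is skipped entirely.
        assert value == 1 : "expected 1 but got " + value;
        System.out.println("value=" + value);
    }
}
```

Run with `java -ea AssertDemo` and the assertion is live; change returnOne() to return 0 and the same command fails fast, while plain `java AssertDemo` would silently continue.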
m2e is a plugin installed into Eclipse that configures Eclipse based on the Maven pom.xml configuration. When adjusting your builds, you should always define changes within the Maven pom.xml and rely on m2e to translate that into Eclipse. Any changes added directly to Eclipse will not be seen by the command-line build.
m2e was pre-installed in the download. Nothing more needs to be done to install it. The rest of these instructions are concerned with using it and demonstrating it.
Add the Java Package Explorer to the JavaEE Perspective. I find this easier to work with than the Project Explorer used by default in the JavaEE perspective.
Select Window->Open Perspective->Other->Java
Select Top Level Elements->Working Sets from the down-facing triangle in the top-right corner of the Package Explorer.
Create a New Java Working Set and call it "javase"
Press OK
Import the class examples into Eclipse as a Maven Project
Right click on "javase" in the Package Explorer and select Import...->Maven->Existing Maven Projects
Browse to the project area where you performed the Git checkout earlier and select the javase folder.
Select Next to have the current project(s) added to the selected working set.
There will be an extra panel that appears the first time you import a project with a new maven plugin. Allow Eclipse to setup any that it knows about, discover ones that may be new, or resolve later for those that it cannot find. There will be unsigned content warnings for most of the plugins. Eclipse will want to restart after installing any new plugins.
Build and test the javase5Enhancements application using Run As->Maven Install by right-clicking on any of the project folders.
If you receive the following error
[ERROR] No compiler is provided in this environment. Perhaps you are running on a JRE rather than a JDK?
add the following '-vm' option to your eclipse.ini file -- pointing to the javaw executable in your JDK directory -- then restart Eclipse and retry. Note that -vm and its path must appear on separate lines, before the -vmargs entry.
-vm
C:\apps\Java\jdk1.8.0_181/bin/javaw.exe
--launcher.defaultAction
openFile
-vm
C:\apps\Java\jdk1.8.0_181/bin/javaw.exe
-vmargs
...
Try also Run As->JUnit Test.
You can use command-line Git to clone the remote repository and update your local copy. However, having Git integrated into Eclipse allows the plugin to help Eclipse transition between checked-out branches and react correctly when a branch is updated.
Git is pre-installed in the download. Nothing more needs to be done to install it (yay!).
JBoss maintains a set of Eclipse plugins to help with development and use of their products. There are too many to describe -- let alone understand in total. However, we can make use of a few. The primary one is to optionally run/manage Wildfly within Eclipse versus the command line. Follow these steps if you want to enable a few additional productivity JBoss Tools plugins.
Open the Eclipse Marketplace panel using
Help->Eclipse Marketplace
Type Wildfly into the search field and press Go
Click Install for the JBoss Tools
Complete the installation steps for JBoss Tools. There are many tools in the repository; very few of them are needed for class, and it is not obvious how to use many of them without more investigation. Choose the following suggested minimal set.
Context and Dependency Injection Tools
Hibernate Tools
JBoss AS, Wildfly, & EAP Tools
JBoss JAX-RS Tools
You will receive a warning about the content within the plugin being unsigned.
You will receive a warning after restarting Eclipse about reporting anonymous usage statistics to JBoss. They use those callbacks to determine where and when their tools are being used (for bragging rights?). I disable that.
Define a Server Instance for JBoss
Open the JavaEE Perspective
Select "new server wizard..." in the Servers panel
Select Wildfly 17 under the JBoss Community folder
Select defaults for location, controlled by, and create new runtime.
Set HOME to your wildfly installation directory using the Browse button on the next panel.
Set the JRE to JavaSE-11
Leave the server base directory as "standalone" and configuration to "standalone.xml".
Review options. Note that I generally start my server externally so that standard output does not compete with my other actions within Eclipse. However, there are pros and cons to both methods and I will start you with internally launched here.
Start the server by right-clicking on it and selecting Start. You should see some server log activity in the Eclipse console. An internally-launched server will conflict with one started externally on the same ports, so use one mode at a time. We will look at this closer once we get to server-side development.
Ant is used in class to wrap command lines and encapsulate the building of classpaths for stand-alone applications. Just download and add Ant to your PATH here.
The latest version of Ant is 1.10.6. Older versions of Ant will work as well (e.g., 1.9.x) if you already have it installed.
Linux users - use your package installer or follow the Windows instructions.
Windows and Mac users
Download Apache Ant from http://ant.apache.org/
Unzip the archive to directory without spaces in its path
Set ANT_HOME to the where you unzipped the archive
Add ANT_HOME to your PATH
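For example, in bash (the install location below is an assumption -- use wherever you unzipped the archive):

```shell
# Hypothetical unzip location -- adjust to your system.
export ANT_HOME=$HOME/apps/apache-ant-1.10.6
export PATH=$ANT_HOME/bin:$PATH
echo "ANT_HOME=$ANT_HOME"
```

Add the two export lines to your shell startup file (e.g., ~/.bashrc) so they survive new terminal sessions.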
Verify Ant is in your PATH
> ant -version
Apache Ant(TM) version 1.10.6 compiled on July 10 2018
Copyright © 2019 jim stafford (jim.stafford@jhu.edu)
Built on: 2019-08-22 07:08 EST
Abstract
This document contains an introductory exercise for building a Maven-based JavaSE project that is self-contained. The exercise takes a building-block approach that demystifies some of the Maven build concepts by breaking down many core concepts into files, directories, and individual commands prior to introducing the overall framework and its integration with an IDE.
Table of Contents
Identify the core use cases required to develop a Java Archive (JAR) module
Demonstrate how Maven fits within the development of a JAR module
Demonstrate how Maven integrates with a sample IDE
At the completion of this topic, the student shall be able to:
Create a module with directory structure and files to build a Java Archive (JAR)
Create a Java class for inclusion in the JAR
Create a unit test for the Java class
Automate the build using Maven
Import the Maven module into an IDE for development
Use the IDE to interactively debug the Java class and unit test
Some of the parts of this exercise are marked OPTIONAL! There is no need to physically perform the details of these steps if you are already familiar with the concepts presented. Skim the material and advance to the parts you are not familiar with.
In this chapter you will be introduced to a standard module file structure that contains a class we intend to use in production and a unit test to verify the functionality of the production class. You will be asked to form the directory structure of files and execute the commands required to build and run the unit test.
This chapter is optional!!! It contains many tedious steps that are somewhat shell-specific. The intent is to simply introduce the raw data structure and actions that need to take place and then to later automate all of this through Maven. If you wish to just skim the steps -- please do. Please do not waste time trying to port these bash shell commands to your native shell.
This part requires junit.jar. It should have been downloaded for you when you built the class examples and can be located in $M2_REPO/junit/junit/(version)/, where M2_REPO is $HOME/.m2/repository or the location you have specified in the localRepository element of $HOME/.m2/settings.xml.
Set a few shell variables to represent root directories. For the purposes of the follow-on steps, PROJECT_BASEDIR is the root directory for this exercise. In the example below, the user has chosen a directory of $HOME/proj/784/exercises to be the root directory for all class exercises and named the root directory for this project "ex1". An alternative for CLASS_HOME might be c:/jhu/784. M2_REPO is the path to your Maven repository.
export CLASS_HOME=$HOME/proj/784
export PROJECT_BASEDIR=$CLASS_HOME/exercises/ex1
mkdir -p $PROJECT_BASEDIR
cd $PROJECT_BASEDIR
export M2_REPO=$HOME/.m2/repository
Create project directory structure. In this example, the developer used $HOME/proj/784 for all work in this class.
$PROJECT_BASEDIR
|-- src
|   |-- main
|   |   `-- java
|   |       `-- myorg
|   |           `-- mypackage
|   |               `-- ex1
|   `-- test
|       |-- resources
|       `-- java
|           `-- myorg
|               `-- mypackage
|                   `-- ex1
`-- target
    |-- classes
    |-- test-classes
    `-- test-reports
mkdir -p src/main/java/myorg/mypackage/ex1
mkdir -p src/test/java/myorg/mypackage/ex1
mkdir -p src/test/resources
mkdir -p target/classes
mkdir -p target/test-classes
mkdir -p target/test-reports
Add the following Java implementation class to $PROJECT_BASEDIR/src/main/java/myorg/mypackage/ex1/App.java
package myorg.mypackage.ex1;
public class App {
public int returnOne() {
System.out.println( "Here's One!" );
return 1;
}
public static void main( String[] args ) {
System.out.println( "Hello World!" );
}
}
Add the following Java test class to $PROJECT_BASEDIR/src/test/java/myorg/mypackage/ex1/AppTest.java
package myorg.mypackage.ex1;
import static org.junit.Assert.*;
import org.junit.Test;
/**
* Unit test for simple App.
*/
public class AppTest {
@Test
public void testApp() {
System.out.println("testApp");
App app = new App();
assertTrue("app didn't return 1", app.returnOne() == 1);
}
}
Make sure you put AppTest.java in the src/test tree.
Compile the application and place it in target/ex1.jar. The compiled classes will go in target/classes.
javac src/main/java/myorg/mypackage/ex1/App.java -d target/classes
jar cvf target/ex1.jar -C target/classes .
jar tf target/ex1.jar
$ javac src/main/java/myorg/mypackage/ex1/App.java -d target/classes
$ jar cvf target/ex1.jar -C target/classes .
added manifest
adding: myorg/(in = 0) (out= 0)(stored 0%)
adding: myorg/mypackage/(in = 0) (out= 0)(stored 0%)
adding: myorg/mypackage/ex1/(in = 0) (out= 0)(stored 0%)
adding: myorg/mypackage/ex1/App.class(in = 519) (out= 350)(deflated 32%)
$ jar tf target/ex1.jar
META-INF/
META-INF/MANIFEST.MF
myorg/
myorg/mypackage/
myorg/mypackage/ex1/
myorg/mypackage/ex1/App.class
Compile the JUnit test and place the compiled tests in target/test-classes.
export JUNIT_JARS="$M2_REPO/junit/junit/4.12/junit-4.12.jar:\
$M2_REPO/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar"
javac -classpath "target/ex1.jar:$JUNIT_JARS" \
  src/test/java/myorg/mypackage/ex1/AppTest.java -d target/test-classes
Verify you have your "production" class from src/main compiled into target/classes directory, your unit test class from src/test compiled into target/test-classes directory, and the Java archive with the production class is in target directory.
target
|-- classes
|   `-- myorg
|       `-- mypackage
|           `-- ex1
|               `-- App.class
|-- ex1.jar
|-- test-classes
|   `-- myorg
|       `-- mypackage
|           `-- ex1
|               `-- AppTest.class
`-- test-reports
Run the JUnit test framework.
java -classpath "target/ex1.jar:$JUNIT_JARS:target/test-classes" \
  org.junit.runner.JUnitCore myorg.mypackage.ex1.AppTest
JUnit version 4.12
.testApp
Here's One!

Time: 0.003

OK (1 test)
Add a test that will fail, re-compile the test class, and re-run.
//AppTest.java
@Test
public void testFail() {
System.out.println("testFail");
App app = new App();
assertTrue("app didn't return 0", app.returnOne() == 0);
}
javac -classpath "target/ex1.jar:$JUNIT_JARS" \
  src/test/java/myorg/mypackage/ex1/AppTest.java -d target/test-classes
java -classpath "target/ex1.jar:$JUNIT_JARS:target/test-classes" \
  org.junit.runner.JUnitCore myorg.mypackage.ex1.AppTest
JUnit version 4.12
.testApp
Here's One!
.testFail
Here's One!
E
Time: 0.007
There was 1 failure:
1) testFail(myorg.mypackage.ex1.AppTest)
java.lang.AssertionError: app didn't return 0
        at org.junit.Assert.fail(Assert.java:93)
        at org.junit.Assert.assertTrue(Assert.java:43)
        at myorg.mypackage.ex1.AppTest.testFail(AppTest.java:26)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        ...
        at org.junit.runner.JUnitCore.main(JUnitCore.java:45)

FAILURES!!!
Tests run: 2,  Failures: 1
In this chapter of the exercise you set up, built, and tested a sample project with only command-line commands. You did this to help show what higher-level tools will need to do as well. Even though the command line provides clarity, it doesn't scale, and shell scripts aren't generally portable or optimized (wrong tool for the job). Hopefully, after going through this, you have an understanding of the low-level structure and use cases and are now interested in adding a build environment.
This chapter demonstrates the basics of automating the manual steps in the previous chapter using the Apache Ant build tool. If you just skim through this step, please be sure to take note of how everything gets explicitly defined in Ant. There are not many rules of the road or standard defaults to live by. That will be a big contrast when working with Maven.
All course examples and projects submitted will use Maven. Ant will be used to wrap command lines for Java SE clients executed outside the normal build environment. However, this exercise shows Ant only being used as part of the artifact build and test environment as a stepping stone to understanding some of the basic build and test concepts within Maven.
If you do not have Ant installed on your system, it can be downloaded from http://ant.apache.org/
This chapter is optional!!! It contains many tedious steps to setup a module build using the Ant build tool -- which will not be part of class. It is presented here as an example option to building the module with shell scripts. If you wish to just skim the steps -- please do. Please do not waste time trying to get Ant to build your Java modules for this class.
Create a build.properties file in $PROJECT_BASEDIR. This will be used to define any non-portable property values. Place the most non-portable base variables (e.g., M2_REPO location) towards the top and build lower-level paths from them. This makes the scripts much easier to port to another environment. If you still have your Maven repository in your $HOME directory, you can make use of the built-in ${user.home} property rather than a hard-coded path.
#ex1 build.properties
#M2_REPO=c:/jhu/repository
M2_REPO=${user.home}/.m2/repository
junit.classpath=${M2_REPO}/junit/junit/4.10/junit-4.10.jar
Create a build.xml file in $PROJECT_BASEDIR. Note the following key elements.
project - a required root for build.xml files
name - not significant, but helpful
default - the target to run if none is supplied on command line
basedir - specifies current directory for all tasks
property - defines an immutable name/value
file - imports declarations from a file; in this case build.properties created earlier
name/value - specifies a property within the script
target - defines an entry point into the build.xml script. It hosts one or more tasks.
name - defines name of target, which can be supplied on command line.
echo - a useful Ant task to printout status and debug information. See Ant docs for more information.
<?xml version="1.0" encoding="utf-8" ?>
<!-- ex1 build.xml
-->
<project name="ex1" default="" basedir=".">
<property file="build.properties"/>
<property name="artifactId" value="ex1"/>
<property name="src.dir" value="${basedir}/src"/>
<property name="build.dir" value="${basedir}/target"/>
<target name="echo">
<echo>basedir=${basedir}</echo>
<echo>artifactId=${artifactId}</echo>
<echo>src.dir=${src.dir}</echo>
<echo>build.dir=${build.dir}</echo>
<echo>junit.classpath=${junit.classpath}</echo>
</target>
</project>
Sanity check your build.xml and build.properties file with the echo target.
$ ant echo
Buildfile: /home/jim/proj/784/exercises/ex1/build.xml

echo:
     [echo] basedir=/home/jim/proj/784/exercises/ex1
     [echo] artifactId=ex1
     [echo] src.dir=/home/jim/proj/784/exercises/ex1/src
     [echo] build.dir=/home/jim/proj/784/exercises/ex1/target
     [echo] junit.classpath=/home/jim/.m2/repository/junit/junit/4.10/junit-4.10.jar

BUILD SUCCESSFUL
Total time: 0 seconds
Add the "package" target to compile and archive your /src/main classes. Note the following tasks in this target.
mkdir - creates a directory. See Ant Mkdir docs for more information.
javac - compiles java sources files. See Ant Javac docs for more information. Note that we are making sure we get JavaSE 8 classes compiled.
jar - builds a java archive. See Ant Jar Docs for more information.
<target name="package">
<mkdir dir="${build.dir}/classes"/>
<javac srcdir="${src.dir}/main/java"
destdir="${build.dir}/classes"
debug="true"
source="1.8"
target="1.8"
includeantruntime="false">
<classpath>
</classpath>
</javac>
<jar destfile="${build.dir}/${artifactId}.jar">
<fileset dir="${build.dir}/classes"/>
</jar>
</target>
Execute the "package" target just added. This should compile the production class from src/main into target/classes and build a Java archive with the production class in target/.
$ rm -rf target/; ant package
Buildfile: /home/jim/proj/784/exercises/ex1/build.xml

package:
    [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/classes
    [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/classes
      [jar] Building jar: /home/jim/proj/784/exercises/ex1/target/ex1.jar

BUILD SUCCESSFUL
Total time: 2 seconds
You may get the following error when you execute the javac task. If so, export JAVA_HOME=(path to JDK_HOME) on your system to provide Ant a reference to a JDK instance.
build.xml:26: Unable to find a javac compiler; com.sun.tools.javac.Main is not on the classpath. Perhaps JAVA_HOME does not point to the JDK. It is currently set to ".../jre"
$ find . -type f
./src/main/java/myorg/mypackage/ex1/App.java
./src/test/java/myorg/mypackage/ex1/AppTest.java
./build.properties
./build.xml
./target/classes/myorg/mypackage/ex1/App.class
./target/ex1.jar
Add the "test" target to compile your /src/test classes. Make this the default target for your build.xml file. Note too that it should depend on the successful completion of the "package" target and include the produced archive in its classpath.
<project name="ex1" default="test" basedir=".">
...
<target name="test" depends="package">
<mkdir dir="${build.dir}/test-classes"/>
<javac srcdir="${src.dir}/test/java"
destdir="${build.dir}/test-classes"
debug="true"
source="1.8"
target="1.8"
includeantruntime="false">
<classpath>
<pathelement location="${build.dir}/${artifactId}.jar"/>
<pathelement path="${junit.classpath}"/>
</classpath>
</javac>
</target>
Execute the new "test" target after clearing out the contents of the target directory. Note that the target directory gets automatically re-populated with the results of the "compile" target and augmented with the test class from src/test compiled into target/test-classes.
$ rm -rf target/; ant
Buildfile: /home/jim/proj/784/exercises/ex1/build.xml

package:
    [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/classes
    [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/classes
      [jar] Building jar: /home/jim/proj/784/exercises/ex1/target/ex1.jar

test:
    [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/test-classes
    [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/test-classes

BUILD SUCCESSFUL
Total time: 3 seconds
> find . -type f
./src/main/java/myorg/mypackage/ex1/App.java
./src/test/java/myorg/mypackage/ex1/AppTest.java
./build.properties
./build.xml
./target/classes/myorg/mypackage/ex1/App.class
./target/ex1.jar
./target/test-classes/myorg/mypackage/ex1/AppTest.class
Add the junit task to the test target. The junit task is being configured to run in batch mode and write TXT and XML reports to the target/test-reports directory. See the Ant docs for more details on the junit task. Make special note of the following:
printsummary - produce a short summary to standard out showing the number of tests run and a count of errors, etc.
fork - since Ant runs in a JVM, any time you run a task that requires a custom classpath, it is usually required that it be forked into a separate process (with its own classpath).
batchtest - run all tests found and write results of each test into the test-reports directory.
formatter - write a text and XML report of results
<mkdir dir="${build.dir}/test-reports"/>
<junit printsummary="true" fork="true">
<classpath>
<pathelement path="${junit.classpath}"/>
<pathelement location="${build.dir}/${artifactId}.jar"/>
<pathelement location="${build.dir}/test-classes"/>
</classpath>
<batchtest fork="true" todir="${build.dir}/test-reports">
<fileset dir="${build.dir}/test-classes">
<include name="**/*Test*.class"/>
</fileset>
</batchtest>
<formatter type="plain"/>
<formatter type="xml"/>
</junit>
A few years ago when I sanity checked this exercise I got the common error below. I corrected the issue by downloading a full installation from the Ant website and exporting my ANT_HOME to the root of that installation. (export ANT_HOME=/opt/apache-ant-1.9.4) and adding $ANT_HOME/bin to the PATH (export PATH=$ANT_HOME/bin:$PATH) ANT_HOME is required for Ant to locate the junit task.
BUILD FAILED
/home/jim/proj/784/exercises/ex1/build.xml:57: Problem: failed to create task or type junit
Cause: the class org.apache.tools.ant.taskdefs.optional.junit.JUnitTask was not found.
        This looks like one of Ant's optional components.
Action: Check that the appropriate optional JAR exists in
        -/usr/share/ant/lib
        -/home/jim/.ant/lib
        -a directory added on the command line with the -lib argument

Do not panic, this is a common problem.
The commonest cause is a missing JAR.

This is not a bug; it is a configuration problem
Execute the updated "test" target with the JUnit test.
$ rm -rf target; ant

package:
    [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/classes
    [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/classes
      [jar] Building jar: /home/jim/proj/784/exercises/ex1/target/ex1.jar

test:
    [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/test-classes
    [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/test-classes
    [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/test-reports
    [junit] Running myorg.mypackage.ex1.AppTest
    [junit] Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 15.143 sec
    [junit] Test myorg.mypackage.ex1.AppTest FAILED

BUILD SUCCESSFUL
Total time: 17 seconds
$ find . -type f
./src/main/java/myorg/mypackage/ex1/App.java
./src/test/java/myorg/mypackage/ex1/AppTest.java
./build.properties
./build.xml
./target/classes/myorg/mypackage/ex1/App.class
./target/ex1.jar
./target/test-classes/myorg/mypackage/ex1/AppTest.class
./target/test-reports/TEST-myorg.mypackage.ex1.AppTest.txt
./target/test-reports/TEST-myorg.mypackage.ex1.AppTest.xml
Note the 17 seconds it took to run/complete the test seems excessive. I was able to speed that up to 0.001 sec by commenting out the XML report option (which we will not use in this exercise).
Test output of each test is in the TXT and XML reports.
$ more target/test-reports/TEST-myorg.mypackage.ex1.AppTest.txt
Testsuite: myorg.mypackage.ex1.AppTest
Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 15.246 sec
------------- Standard Output ---------------
testApp
Here's One!
testFail
Here's One!
------------- ---------------- ---------------

Testcase: testApp took 0.007 sec
Testcase: testFail took 0.022 sec
        FAILED
app didn't return 0
junit.framework.AssertionFailedError: app didn't return 0
        at myorg.mypackage.ex1.AppTest.testFail(AppTest.java:26)
Add a clean target to the build.xml file to delete built artifacts. See Ant docs for details on the delete task.
<target name="clean">
<delete dir="${build.dir}"/>
</target>
Re-run and use the new "clean" target you just added.
$ ant clean test
Buildfile: /home/jim/proj/784/exercises/ex1/build.xml

clean:
   [delete] Deleting directory /home/jim/proj/784/exercises/ex1/target

package:
    [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/classes
    [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/classes
      [jar] Building jar: /home/jim/proj/784/exercises/ex1/target/ex1.jar

test:
    [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/test-classes
    [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/test-classes
    [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/test-reports
    [junit] Running myorg.mypackage.ex1.AppTest
    [junit] Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 15.123 sec
    [junit] Test myorg.mypackage.ex1.AppTest FAILED

BUILD SUCCESSFUL
Total time: 17 seconds
Comment out the bogus testFail and rerun.
$ cat src/test/java/myorg/mypackage/ex1/AppTest.java
...
    //@Test
    public void testFail() {
$ ant clean test
Buildfile: /home/jim/proj/784/exercises/ex1/build.xml

clean:
   [delete] Deleting directory /home/jim/proj/784/exercises/ex1/target

package:
    [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/classes
    [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/classes
      [jar] Building jar: /home/jim/proj/784/exercises/ex1/target/ex1.jar

test:
    [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/test-classes
    [javac] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/test-classes
    [mkdir] Created dir: /home/jim/proj/784/exercises/ex1/target/test-reports
    [junit] Running myorg.mypackage.ex1.AppTest
    [junit] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.161 sec

BUILD SUCCESSFUL
Total time: 17 seconds
In this chapter you performed many of the same actions you did in the previous chapter except that you implemented in a portable build tool called Ant. Ant is very powerful and open-ended. You were shown these same steps in Ant as an intermediate towards Maven. Ant, being open-ended, does not follow many conventions. Everything must be explicitely specified. You will find that to be in significant contrast with Maven. Ant is a "build tool" and Maven is a "build system" with conventions/rules.
In this chapter we will refine the use of print and debug statements by using a "logger". By adopting a logger into your production and test code you can avoid print statements to stdout/stderr and re-direct the output to log files, databases, messaging topics, etc. There are several to choose from (Java's built-in logger, the Commons Logging API, the SLF4J API, and Log4J, to name a few). This course uses the SLF4J API and the Log4J implementation.
Change the System.out() calls in App and AppTest from Part A to use the SLF4J logging API. The slf4j-api Javadoc and manual will be helpful in understanding this interface.
package myorg.mypackage.ex1;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class App {
private static Logger logger = LoggerFactory.getLogger(App.class);
public int returnOne() {
//System.out.println( "Here's One!" );
logger.debug( "Here's One!" );
return 1;
}
public static void main( String[] args ) {
//System.out.println( "Hello World!" );
logger.info( "Hello World!" );
}
}
package myorg.mypackage.ex1;
...
import static org.junit.Assert.*;
import org.junit.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class AppTest {
private static Logger logger = LoggerFactory.getLogger(AppTest.class);
...
@Test
public void testApp() {
//System.out.println("testApp");
logger.info("testApp");
App app = new App();
assertTrue("app didn't return 1", app.returnOne() == 1);
}
}
Add a log4j.xml configuration file to the directory structure. Place this file in src/test/resources/log4j.xml. This file is used to control logging output. Refer to the log4j manual for information on how to configure and use log4j. It doesn't matter whether you use the log4j.xml format or the log4j.properties format. Of note, the implementation we are using is based on "Log4j 1" -- which reached its end of life in 2015. It still works, but all energy in this area is now within "Log4j 2".
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration
xmlns:log4j="http://jakarta.apache.org/log4j/"
debug="false">
<appender name="CONSOLE" class="org.apache.log4j.ConsoleAppender">
<param name="Target" value="System.out"/>
<layout class="org.apache.log4j.PatternLayout">
<param name="ConversionPattern"
value="%-5p %d{dd-MM HH:mm:ss,SSS} (%F:%M:%L) -%m%n"/>
</layout>
</appender>
<appender name="logfile" class="org.apache.log4j.RollingFileAppender">
<param name="File" value="target/log4j-out.txt"/>
<param name="Append" value="false"/>
<param name="MaxFileSize" value="100KB"/>
<param name="MaxBackupIndex" value="1"/>
<layout class="org.apache.log4j.PatternLayout">
<param name="ConversionPattern"
value="%-5p %d{dd-MM HH:mm:ss,SSS} [%c] (%F:%M:%L) -%m%n"/>
</layout>
</appender>
<logger name="myorg.mypackage">
<level value="debug"/>
<appender-ref ref="logfile"/>
</logger>
<root>
<priority value="info"/>
<appender-ref ref="CONSOLE"/>
</root>
</log4j:configuration>
The log4j.xml is placed in the JVM classpath, where log4j will locate it by default. However, it should not be packaged with the main classes (ex1.jar). Placing it in our JAR file would pollute the application assembler and deployer's job of specifying the correct configuration file at runtime. Our test classes and resources are not a part of follow-on deployment.
Add the slf4j-api.jar to the compile classpaths and the slf4j-api.jar, slf4j-log4j.jar, and log4j.jar to the runtime classpath used during tests. Also add an additional task to copy the log4j.xml file into target/test-classes so that it is seen by the classloader as a resource. Realize that your classes have no compilation dependencies on log4j. Log4j is only used if it is located at runtime.
# ex1 build.properties
junit.classpath=${M2_REPO}/junit/junit/4.12/junit-4.12.jar:\
${M2_REPO}/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar
slf4j-api.classpath=${M2_REPO}/org/slf4j/slf4j-api/1.7.25/slf4j-api-1.7.25.jar
slf4j-log4j.classpath=${M2_REPO}/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar
log4j.classpath=${M2_REPO}/log4j/log4j/1.2.17/log4j-1.2.17.jar
<target name="echo">
...
<echo>slf4j-api.classpath=${slf4j-api.classpath}</echo>
<echo>slf4j-log4j.classpath=${slf4j-log4j.classpath}</echo>
<echo>log4j.classpath=${log4j.classpath}</echo>
</target>
<javac srcdir="${src.dir}/main/java"
destdir="${build.dir}/classes"
debug="true"
source="1.8"
target="1.8"
includeantruntime="false">
<classpath>
<pathelement path="${slf4j-api.classpath}"/>
</classpath>
</javac>
<javac srcdir="${src.dir}/test/java"
destdir="${build.dir}/test-classes"
debug="true"
source="1.8"
target="1.8"
includeantruntime="false">
<classpath>
<pathelement location="${build.dir}/${artifactId}.jar"/>
<pathelement path="${junit.classpath}"/>
<pathelement path="${slf4j-api.classpath}"/>
</classpath>
</javac>
<copy todir="${build.dir}/test-classes">
<fileset dir="${src.dir}/test/resources"/>
</copy>
<junit printsummary="true" fork="true">
<classpath>
<pathelement path="${junit.classpath}"/>
<pathelement location="${build.dir}/${artifactId}.jar"/>
<pathelement location="${build.dir}/test-classes"/>
<pathelement path="${slf4j-api.classpath}"/>
<pathelement path="${slf4j-log4j.classpath}"/>
<pathelement path="${log4j.classpath}"/>
</classpath>
...
Test the application and inspect the reports. All loggers inherit from the root logger and may only extend its definition; not limit it. Notice that the root logger's "info" priority filter allows log.info() (and higher: warn, error, fatal) messages to be printed to the console. The myorg.mypackage logger's "debug" level filter allows log.debug() messages from the myorg.mypackage.* classes to appear in both the console and logfile. This means that any Java classes outside our package hierarchy will only have INFO or higher priority messages logged.
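Log4j's logger hierarchy follows the same dotted parent/child naming scheme as the JDK's own logging package. As a quick, jar-free illustration of the level-inheritance concept only (the logger names are illustrative; this class is not part of the exercise), the following sketch uses java.util.logging:

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public class LevelDemo {
    public static void main(String[] args) {
        // parent logger plays the role of the root: INFO and higher only
        Logger parent = Logger.getLogger("myorg");
        parent.setLevel(Level.INFO);

        // child logger extends the definition down to FINE (~log4j debug)
        Logger child = Logger.getLogger("myorg.mypackage");
        child.setLevel(Level.FINE);

        // the child publishes debug-level messages...
        System.out.println(child.isLoggable(Level.FINE));   // true
        // ...while a logger outside that hierarchy inherits the parent's INFO
        System.out.println(Logger.getLogger("myorg.other")
                                 .isLoggable(Level.FINE));  // false
    }
}
```

The same rule explains the exercise output: loggers below myorg.mypackage emit debug, everything else is filtered at info.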
$ ant clean test
Buildfile: /home/jcstaff/proj/784/exercises/ex1/build.xml
clean:
   [delete] Deleting directory /home/jcstaff/proj/784/exercises/ex1/target
package:
    [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/classes
    [javac] Compiling 1 source file to /home/jcstaff/proj/784/exercises/ex1/target/classes
      [jar] Building jar: /home/jcstaff/proj/784/exercises/ex1/target/ex1.jar
test:
    [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/test-classes
    [javac] Compiling 1 source file to /home/jcstaff/proj/784/exercises/ex1/target/test-classes
     [copy] Copying 1 file to /home/jcstaff/proj/784/exercises/ex1/target/test-classes
    [mkdir] Created dir: /home/jcstaff/proj/784/exercises/ex1/target/test-reports
    [junit] Running myorg.mypackage.ex1.AppTest
    [junit] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.127 sec
BUILD SUCCESSFUL
Total time: 17 seconds
You won't see the output come to stdout when using Ant, but you can locate all output from the logfile appender in target/log4j-out.txt. This behavior will get a little better under Maven.
$ more target/log4j-out.txt
INFO  13-08 18:24:21,983 [myorg.mypackage.ex1.AppTest] (AppTest.java:testApp:18) -testApp
DEBUG 13-08 18:24:21,986 [myorg.mypackage.ex1.App] (App.java:returnOne:11) -Here's One!
Your project structure should look like the following at this point.
> find . -type f
./src/main/java/myorg/mypackage/ex1/App.java
./src/test/java/myorg/mypackage/ex1/AppTest.java
./src/test/resources/log4j.xml
./build.properties
./build.xml
./target/classes/myorg/mypackage/ex1/App.class
./target/ex1.jar
./target/test-classes/myorg/mypackage/ex1/AppTest.class
./target/test-classes/log4j.xml
./target/test-reports/TEST-myorg.mypackage.ex1.AppTest.txt
./target/test-reports/TEST-myorg.mypackage.ex1.AppTest.xml
./target/log4j-out.txt
Change the logging configuration so that only the App class logs to the logfile. By extending the logger name specification all the way to the class, we further limit which classes this logger applies to.
<logger name="myorg.mypackage.ex1.App">
<level value="debug"/>
<appender-ref ref="logfile"/>
</logger>
After re-running the build you should notice that DEBUG output is included for only the App class -- because of a change we made to the logger configuration, outside the code.
$ more target/log4j-out.txt
DEBUG 26-08 23:07:04,809 [myorg.mypackage.ex1.App] (App.java:returnOne:11) -Here's One!
Repeat after me: "I will never use System.out.println() in this class." Doing so makes it difficult to control and access your deployed components' logs as they move through unit testing, integration testing, and deployment environments.
In this chapter we added some sophistication to the production class and its unit test, which required additional compile-time and runtime resources. Please note that we added only a dependency on slf4j-api to the javac compiler task because log4j is never directly referenced in the source code. We added a runtime dependency on log4j (and the slf4j-log4j adapter) in order to configure a logger implementation during the execution of tests. The distinction between resources required at compile-time, test, and runtime will become even more important when using Maven. Use the explicit classpaths we created here to help you understand Maven dependency scope when we get there.
In this chapter you will automate the build using Maven by defining a simple Maven project definition to go with your project tree. In the previous chapters you worked with a reasonable project tree that could have looked different in a number of ways and been accounted for by different path constructs. However, why be different? The project tree we put together -- accounting for production classes, test classes, resource files, archives, unit tests, test reports, etc. -- follows Maven's standard build tree almost exactly (with the exception of the name of the target/test-reports directory). We will be able to add a Maven project definition without much effort.
The Maven community has a tremendous amount of documentation, examples, and on-line discussions. This course has many examples that are more specific to the work you will be actively performing. Many of these resources are a quick google search away, but know that I go out of my way to make sure you spend as much time as possible on design and JavaEE aspects in class. If you are stuck on Maven -- ask. I know what you are trying to do and can likely point you to an example that is relevant to what you are doing in class. If you are still stuck on Maven issues -- send it to me and I will fix it personally. There is nothing more irritating than fighting with the builds when you want to be spending your time understanding, designing, trying, and mastering the product of what is being built.
Using Maven requires only an initial download and installation. Plugins and dependencies will be downloaded from remote repositories as needed. Connectivity to the internet is required until all dependencies have been satisfied.
Maven will automatically go out and download any missing dependencies and recursively download what they depend upon. If you are running Maven for the first time, this could result in a significant amount of downloading and may encounter an occasional connection failure with repositories. Once a non-SNAPSHOT version is downloaded (e.g., 1.3), Maven will not re-attempt to download it. Maven will, however, go out and check various resources to stay in sync. If you know you already have everything you need, you can run in off-line mode using the "-o" flag on the command line or its equivalent entry within the settings.xml file. This can save you seconds of build time when disconnected from the Internet. If you want to make sure that all dependencies have been resolved, use the mvn dependency:go-offline goal to eagerly resolve all dependencies.
Create a pom.xml file in project basedir. This will be used to define your entire project. Refer to the Maven POM Reference for details about each element.
modelVersion - yes; it's required
groupId - just as it sounds, this value is used to group related artifacts. groupId is a hierarchical value and the individual names are used to form a directory structure in the Maven repository (e.g., artifacts in the myorg.myproject.foo groupId will be located below the HOME/.m2/repository/myorg/myproject/foo directory).
version - Maven has a strong versioning system and versions appended with the word SNAPSHOT are handled differently. Projects with a version ending in -SNAPSHOT are thought to be in constant change, with no official release yet available. Projects with a version lacking the -SNAPSHOT ending are meant to be an official release, with no other variants available with the same tag.
dependency.scope - this is used to define the scope in which the dependency gets applied. It defines the visibility within the project for the dependency and whether it is carried along with the module through transitive dependency analysis. With open-source software, a typical JavaEE application could have 10s to 100s of individual modules it depends upon, and the proper use of transitive dependencies makes this manageable.
scope=compile is the default and is used to describe artifacts that the src/main directory depends upon and will also be visible by classes in src/test. These dependency artifacts will be brought along with the module when transitive dependencies are evaluated.
scope=test is used to define artifacts which src/test depends upon. These will be made available during testing, but will not be visible to classes in src/main and will not be considered a dependency for downstream users of the module. Consult the maven documentation for other scopes, but one other that is commonly used in class is scope=provided.
scope=provided is similar to scope=compile in that the src/main tree can see it, however like scope=test, it is not carried forward. Each downstream module is required to know about the dependency and provide a replacement. This is common when using JavaEE APIs that have been packaged by different vendors used by different module developers.
maven-compiler-plugin - this declaration is only necessary if we need to override the default Java version -- like what we did in our Ant script.
properties.project.build.sourceEncoding - this defines the default handling of file content for all plugins within a module. The default is platform-specific if left unspecified and we avoid an annoying warning by specifying it.
Although the m2e Eclipse plugin reads the pom dependency and creates a classpath within Eclipse, it does not honor the differences between the different scope values. All dependencies are blended together when inside the IDE. The result is that something may compile and run fine within Eclipse and report a missing class when built at the command line. If that happens, check for classes using artifacts that have been brought in as scope=test or for classes incorrectly placed within the src/main tree.
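The pom.xml that follows uses only compile and test scopes. For completeness, a scope=provided dependency would look like the following sketch -- the artifact shown is purely illustrative and is not used by ex1:

```xml
<!-- illustrative only: a JavaEE API jar that src/main compiles against
     but that the application server supplies at runtime -->
<dependency>
    <groupId>javax</groupId>
    <artifactId>javaee-api</artifactId>
    <version>8.0</version>
    <scope>provided</scope>
</dependency>
```

Downstream modules would not inherit this dependency transitively; each deployment environment supplies its own implementation.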
<?xml version="1.0"?>
<project>
<modelVersion>4.0.0</modelVersion>
<groupId>myorg.myproject</groupId>
<artifactId>ex1</artifactId>
<name>My First Simple Project</name>
<version>1.0-SNAPSHOT</version>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.25</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.7.25</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.7.0</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
</project>
Your project tree should look like the following at this point.
> find . -type f
./src/main/java/myorg/mypackage/ex1/App.java
./src/test/java/myorg/mypackage/ex1/AppTest.java
./src/test/resources/log4j.xml
./build.properties
./build.xml
./pom.xml
Note that the pom.xml file is not required to have an assigned schema. However, adding one does allow for XML editing tools to better assist in creating a more detailed POM. Replace the project element from above with the following declarations to assign an XML schema.
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
Run the package "phase" and watch the project compile, assemble, and test. Maven has many well-known phases that correspond to the lifecycle of build steps that go into validating, preparing, building, testing, and deploying the artifacts of a module. You can find out more about Maven phases here. I refer to this page very often.
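For orientation, the most commonly used phases of Maven's fixed default lifecycle run in this order (a subset); invoking a phase such as package also runs every phase before it:

```
validate -> compile -> test -> package -> verify -> install -> deploy
```

This is why a single "mvn package" below compiles both source trees and runs the unit tests on the way to building the JAR.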
$ mvn package
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building JavaSE::Simple Module Exercise Solution 5.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ firstSimpleModuleEx ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/jim/proj/784/exercises/ex1/src/main/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.7.0:compile (default-compile) @ firstSimpleModuleEx ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/classes
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ firstSimpleModuleEx ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-compiler-plugin:3.7.0:testCompile (default-testCompile) @ firstSimpleModuleEx ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 1 source file to /home/jim/proj/784/exercises/ex1/target/test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ firstSimpleModuleEx ---
[INFO] Surefire report directory: /home/jim/proj/784/exercises/ex1/target/surefire-reports

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running myorg.mypackage.ex1.AppTest
INFO  13-08 19:03:19,264 (AppTest.java:testApp:18) -testApp
DEBUG 13-08 19:03:19,267 (App.java:returnOne:11) -Here's One!
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.13 sec

Results :

Tests run: 1, Failures: 0, Errors: 0, Skipped: 0

[INFO]
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ firstSimpleModuleEx ---
[INFO] Building jar: /home/jim/proj/784/exercises/ex1/target/firstSimpleModuleEx-5.0.0-SNAPSHOT.jar
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1.553 s
[INFO] Finished at: 2018-08-13T19:03:19-04:00
[INFO] Final Memory: 19M/309M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "h2db" could not be activated because it does not exist.
You were asked to declare an h2db profile as active within $HOME/.m2/settings.xml during the software installation instructions. Maven will warn you about any profile you request that is not found within the module, to help identify spelling errors. In this case, we simply do not need the profile and have not defined it.
The contents of your development tree should look as follows.
> find . -type f
./build.xml
./build.properties
./pom.xml
./target/surefire-reports/TEST-myorg.mypackage.ex1.AppTest.xml
./target/surefire-reports/myorg.mypackage.ex1.AppTest.txt
./target/log4j-out.txt
./target/maven-archiver/pom.properties
./target/ex1-1.0-SNAPSHOT.jar
./target/test-classes/myorg/mypackage/ex1/AppTest.class
./target/test-classes/log4j.xml
./target/classes/myorg/mypackage/ex1/App.class
./target/maven-status/...
./src/test/resources/log4j.xml
./src/test/java/myorg/mypackage/ex1/AppTest.java
./src/main/java/myorg/mypackage/ex1/App.java
src/main/java classes were built in the target/classes directory by convention by the maven-compiler-plugin that is automatically wired into JAR module builds. We didn't have to configure it because we structured our project using the Maven directory structure and used the default packaging=jar module type (since packaging=jar is the default, it could be left unspecified). Many of the standard features are enacted for modules with the packaging=jar type.
src/test/java classes were built in the target/test-classes directory by convention by the maven-compiler-plugin.
src/test/resources were copied to the target/test-classes directory by convention by the maven-resources-plugin that is automatically wired into JAR module builds.
test cases were run and their reports were placed in target/surefire-reports by convention by the maven-surefire-plugin that is automatically wired into JAR module builds.
The build.xml and build.properties files from our work with Ant are still allowed to exist. We could even delegate from Maven to Ant using the maven-antrun-plugin if we had legacy build.xml scripts we wanted to leverage.
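Such a delegation might look something like the following sketch -- the plugin version and the Ant target name here are assumptions for illustration, not part of the exercise:

```xml
<!-- sketch: have Maven invoke a legacy Ant target during the package phase -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-antrun-plugin</artifactId>
    <version>1.8</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>run</goal>
            </goals>
            <configuration>
                <target>
                    <!-- call the existing build.xml's "package" target -->
                    <ant antfile="build.xml" target="package"/>
                </target>
            </configuration>
        </execution>
    </executions>
</plugin>
```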
For *fun*, let's add a README that could be used to describe something about your project and have it processed as part of the documentation for the module. You do not need to do this for class projects, but walking through this may be helpful in understanding how the course website is created from the source you have on your disk. Maven supports a couple of documentation generation languages, but let's just use HTML to keep this simple. Place the following content in src/site/resources/README.html
mkdir -p src/site/resources
$ cat src/site/resources/README.html
<?xml version="1.0"?>
<html>
<head>
<title>My First Project</title>
</head>
<body>
<section><h1>My First Project</h1>
<p/>
This is my first project. Take a look at
<p/>
<ul>
<li>this ....</li>
<li>that ....</li>
<li>or <a href="./index.html">go home</a></li>
</ul>
</section>
</body>
</html>
The above is enough to provide the page. Now add a link to it from the project menu. Add the following content to src/site/site.xml
$ cat src/site/site.xml
<?xml version="1.0" encoding="UTF-8"?>
<project name="${project.name}">
<body>
<menu name="Content">
<item name="README" href="README.html"/>
</menu>
</body>
</project>
You must also specify a version for the maven-site-plugin and maven-project-info-reports-plugin in the pom.xml. Maven is extremely version-aware.
<plugins>
...
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-site-plugin</artifactId>
<version>3.4</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-project-info-reports-plugin</artifactId>
<version>2.8</version>
</plugin>
</plugins>
> find . -type f
./src/main/java/myorg/mypackage/ex1/App.java
./src/test/java/myorg/mypackage/ex1/AppTest.java
./src/test/resources/log4j.xml
./src/site/resources/README.html
./src/site/site.xml
./build.properties
./build.xml
./pom.xml
Build the site and open target/site/index.html in your browser. You should see a link to the README on the left side.
$ mvn site
[INFO] Scanning for projects...
...
[INFO] BUILD SUCCESS
$ find target/site/ -name *.html
target/site/plugin-management.html
target/site/index.html
target/site/mail-lists.html
target/site/issue-tracking.html
target/site/license.html
target/site/project-info.html
target/site/dependency-info.html
target/site/README.html
target/site/dependencies.html
target/site/team-list.html
target/site/source-repository.html
target/site/integration.html
target/site/distribution-management.html
target/site/project-summary.html
target/site/plugins.html
If you use the posted firstSimpleModuleEx as a starting point for your work you will need to re-enable site generation under the maven-site-plugin. This was turned off since the posted examples do not contain enough information to be posted with the rest of the class examples.
<!-- exclude this module from site generation -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-site-plugin</artifactId>
<version>3.4</version>
<configuration>
<skip>false</skip>
<skipDeploy>false</skipDeploy>
</configuration>
</plugin>
Okay, that was a lot of work to just copy an HTML file. Now let's add javadoc to our project and create a link to it. Add the following contents to the bottom of the pom.xml file.
...
</build>
<reporting>
<plugins>
<plugin>
<artifactId>maven-javadoc-plugin</artifactId>
<groupId>org.apache.maven.plugins</groupId>
<version>3.0.1</version>
<configuration>
<detectLinks>false</detectLinks>
<detectOfflineLinks>true</detectOfflineLinks>
<show>private</show>
<source>1.8</source>
<additionalparam>-Xdoclint:none</additionalparam>
<failOnError>false</failOnError>
<links>
<link>http://download.oracle.com/javase/8/docs/api/</link>
<link>https://javaee.github.io/javaee-spec/javadocs/</link>
</links>
</configuration>
</plugin>
</plugins>
</reporting>
We could create a link to the apidocs/index.html like we did with README.html, but that would be something we'd keep having to update each time we added a new report. Let's instead add a menu reference to the site.xml so links to Javadoc and other reports drop in automatically.
# src/site/site.xml
<?xml version="1.0" encoding="UTF-8"?>
<project name="${project.name}">
<body>
<menu name="Content">
<item name="README" href="README.html"/>
</menu>
<menu ref="reports"/>
</body>
</project>
Re-generate the site documentation with the site phase. Open the target/site/index.html page and you should now see a menu item for "Project Reports" -> "JavaDocs". Our App class should be included in the Javadoc.
The pom.xml file is the main configuration source for 99% of what you develop with Maven. There is an additional $HOME/.m2/settings.xml file where you can specify build site-specific properties. These will be available to all pom.xml files. You want to be careful not to over-populate the settings.xml file (taking advantage of its re-usable specification) since doing so will make your pom.xml files too dependent on a particular build location. Refer to the Settings Descriptor for detailed information on settings.xml. The following provides a step-wise generation of the settings.xml file you put in place during Development Environment Setup. Read through this for reference since you likely already have everything in place you need.
Let's start a settings.xml file to store properties that are specific to our build site. You can find details about each setting at the following URL.
cat $HOME/.m2/settings.xml
<?xml version="1.0"?>
<settings xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/settings-1.0.0.xsd">
</settings>
If your $HOME directory path contains spaces, you will want to provide an override for the localRepository. Provide a custom path that does not contain spaces. This value will default to a "$HOME/.m2/repository" directory.
<!-- this overrides the default $HOME/.m2/repository location. -->
<localRepository>c:/jhu/repository</localRepository>
Add the following specification to either the settings.xml file or the local pom.xml file. If you specify it in the local pom.xml file -- it will only apply to that project. If you specify it in the settings.xml file -- it will be global to all projects in your area. More will be covered on this later. However, it should be noted that this profile is not active unless someone specifically asks for it (-Pdebugger) or the "debugger" system property is set (-Ddebugger=(anything)).
<profiles>
<profile>
<id>debugger</id>
<!-- this should only be activated when performing interactive
debugging -->
<activation>
<property>
<name>debugger</name>
</property>
</activation>
<properties>
<surefire.argLine>-Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000 -Xnoagent -Djava.compiler=NONE</surefire.argLine>
</properties>
</profile>
</profiles>
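The surefire.argLine property does nothing by itself; it has to be referenced from the surefire plugin's configuration in the pom. A minimal sketch of that wiring -- the plugin version and the empty default property are assumptions, not part of the posted class setup:

```xml
<properties>
    <!-- empty default so the JVM gets no extra args when the profile is off -->
    <surefire.argLine></surefire.argLine>
</properties>
...
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.22.0</version>
    <configuration>
        <!-- picks up the JPDA options when the debugger profile is active -->
        <argLine>${surefire.argLine}</argLine>
    </configuration>
</plugin>
```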
Although not needed for this class -- at times you will need access to a dependency that is not available in a Maven repository. COTS libraries are generally not available at ibiblio.org. You must download it and manually install it locally.
This step will go through importing a stand-alone archive into the repository to resolve any dependencies. Start by declaring a dependency before we do the import. Note that a new scope property was added. See the Dependency Mechanism Intro Page for a discussion of scope; in this case it indicates the artifact is needed at compile time but will not be carried along into the runtime classpath.
<dependency>
<groupId>foo</groupId>
<artifactId>bar</artifactId>
<version>1.1</version>
<scope>provided</scope>
</dependency>
Attempt to build the module with the missing dependency. The build should fail, but note that Maven attempted all known external repositories.
> mvn package
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building My First Simple Project 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
Downloading: http://webdev.apl.jhu.edu/~jcs/maven2/foo/bar/1.1/bar-1.1.pom
Downloading: http://webdev.apl.jhu.edu/~jcs/maven2-snapshot/foo/bar/1.1/bar-1.1.pom
Downloading: http://repo1.maven.org/maven2/foo/bar/1.1/bar-1.1.pom
[WARNING] The POM for foo:bar:jar:1.1 is missing, no dependency information available
Downloading: http://webdev.apl.jhu.edu/~jcs/maven2/foo/bar/1.1/bar-1.1.jar
Downloading: http://webdev.apl.jhu.edu/~jcs/maven2-snapshot/foo/bar/1.1/bar-1.1.jar
Downloading: http://repo1.maven.org/maven2/foo/bar/1.1/bar-1.1.jar
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1.437s
[INFO] Finished at: Wed Feb 02 12:20:51 EST 2011
[INFO] Final Memory: 2M/15M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project ex1: Could not resolve dependencies for project myorg.myproject:ex1:jar:1.0-SNAPSHOT: Could not find artifact foo:bar:jar:1.1 in webdev-baseline (http://webdev.apl.jhu.edu/~jcs/maven2) -> [Help 1]
The old Maven 2 error message was much better if a manual install is what you really needed; the newer Maven 3 message does not provide instructions. In this case, manually install a jar file that represents the declaration. Assign it a groupId of foo, an artifactId of bar, and a version of 1.1. Don't forget to add -DgeneratePom=true or you will get a download warning every time you try to build. All we need is a valid .jar file. If you don't have one laying around, just create one with a valid structure.
$ touch bar.jar
$ mvn install:install-file -DgroupId=foo -DartifactId=bar -Dversion=1.1 -Dpackaging=jar -Dfile=bar.jar -DgeneratePom=true
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building My First Simple Project 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-install-plugin:2.4:install-file (default-cli) @ ex1 ---
[INFO] Installing /home/jim/proj/784/exercises/ex1/bar.jar to /home/jim/.m2/repository/foo/bar/1.1/bar-1.1.jar
[INFO] Installing /tmp/mvninstall5322334237902777597.pom to /home/jim/.m2/repository/foo/bar/1.1/bar-1.1.pom
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
After successfully installing the dummy archive, you should be able to locate the JAR and other supporting files in the local repository. Be sure to look where you have directed localRepository in $HOME/.m2/settings.xml
$ find /home/jim/.m2/repository/foo/bar/
/home/jim/.m2/repository/foo/bar/
/home/jim/.m2/repository/foo/bar/1.1
/home/jim/.m2/repository/foo/bar/1.1/bar-1.1.pom.lastUpdated
/home/jim/.m2/repository/foo/bar/1.1/_remote.repositories
/home/jim/.m2/repository/foo/bar/1.1/bar-1.1.jar.lastUpdated
/home/jim/.m2/repository/foo/bar/1.1/bar-1.1.jar
/home/jim/.m2/repository/foo/bar/1.1/bar-1.1.pom
/home/jim/.m2/repository/foo/bar/maven-metadata-local.xml
Notice that Maven always makes sure there is a POM file present -- whether it had to generate it or not.
$ more /home/jim/.m2/repository/foo/bar/1.1/bar-1.1.pom
<?xml version="1.0" encoding="UTF-8"?>
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd" xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<modelVersion>4.0.0</modelVersion>
<groupId>foo</groupId>
<artifactId>bar</artifactId>
<version>1.1</version>
<description>POM was created from install:install-file</description>
</project>
Now try running "mvn package" and it should successfully resolve the fake dependency on the bar.jar.
One last thing... Maven pulls in definitions from many places in the build environment. If you ever want to know the total sum of those sources (the "effective POM"), execute the help:effective-pom goal.
$ mvn help:effective-pom
[INFO] Scanning for projects...
...
<project xmlns...
<modelVersion>4.0.0</modelVersion>
<groupId>myorg.myproject</groupId>
<artifactId>ex1</artifactId>
<version>1.0-SNAPSHOT</version>
<name>My First Simple Project</name>
<properties>
<jboss.home>/opt/wildfly-13.0.0.Final</jboss.home>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>foo</groupId>
<artifactId>bar</artifactId>
<version>1.1</version>
<scope>provided</scope>
</dependency>
...
During this exercise, you were able to establish a project which was understood by Maven. Once Maven-compliant, each plugin can be added to perform different tasks for development. By the time we start adding databases, building EARs, and deploying to application servers, we can use all the plugin help we can get.
In this chapter we will be importing the project into the Eclipse IDE, running a few project goals, and demonstrating a debug session. IDEs provide very useful code navigation and refactoring tools to name only a few features. However, one of the unique tools offered by the IDEs is the ability to step through the code in a debugging session. Please do not end this exercise before becoming comfortable with the ability to use the debugger.
Maven/Eclipse integration was once one of the most volatile aspects of the environment. However, over the years the two have been designed to work well together as long as you keep things Maven-centric.
The Maven/Eclipse integration is a Maven-first approach where the Eclipse project always follows the Maven pom.xml. That is one of the main reasons this exercise started you with a pom.xml file first and progressed later to the IDE. It is wrong (or at least non-portable) to manually adjust the build path of a project within Eclipse. You must adjust the build path by editing the pom.xml and having Eclipse automatically detect and follow that change.
The exercise asked you to name your module "ex1". This portion of the exercise demonstrates actions within Eclipse using a posted solution in your examples tree called "firstSimpleModuleEx". You may/should continue to use your "ex1" solution from the previous chapters, but know that firstSimpleModuleEx (as posted) has a few minor tweaks to its pom.xml to allow it to be distributed with the rest of the class examples as part of the class site.
Select File->Import->Maven->Existing Maven Projects, navigate to the directory with the project you have been working with and select OK.
The project should successfully import. Note that Eclipse has imported the project configuration from the Maven POM and has done at least the following...
Created three build source folders: src/main/java, src/test/java, and src/test/resources. Had there been a src/main/resources, we would have had a fourth build source folder added.
Defined the project for use with Java 8.
Added five Maven dependencies. Notice how the paths to these dependencies are rooted in the localRepository.
Four of these dependencies (slf4j-api, junit, slf4j-log4j12, and log4j) were through direct references. The fifth (hamcrest-core) is through a transitive dependency from junit. You can visualize this transitive dependency by opening the pom.xml and selecting the Dependency Hierarchy tab at the bottom of the window.
You can get a text version of the dependency hierarchy using the Maven dependency:tree goal.
$ mvn dependency:tree
...
[INFO] ------------------------------------------------------------------------
[INFO] Building JavaSE::Simple Module Exercise Solution 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-dependency-plugin:2.8:tree (default-cli) @ firstSimpleModuleEx ---
[INFO] myorg.myproject:firstSimpleModuleEx:jar:5.0.0-SNAPSHOT
[INFO] +- org.slf4j:slf4j-api:jar:1.7.25:provided
[INFO] +- junit:junit:jar:4.12:test
[INFO] |  \- org.hamcrest:hamcrest-core:jar:1.3:test
[INFO] +- org.slf4j:slf4j-log4j12:jar:1.7.25:test
[INFO] \- log4j:log4j:jar:1.2.17:test
Make a small edit to the pom.xml and notice the change in the Maven Dependencies within Eclipse. Switch back and forth between these two settings and watch the value under Maven Dependencies follow the edits (be sure to save the file between edits).
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.16</version>
<scope>test</scope>
</dependency>
Right-click on the pom.xml file or project folder and execute Run As->"Maven install". You can also get back to this window through the Run As option on the toolbar once you have the project selected. This mode runs the JUnit tests you wrote within the context of the full Maven project. All pre-test and post-test setup and teardown you wired into the Maven command line build will be executed.
Note that you can create a separate window for any of the Eclipse tabs. Using dual monitors, I commonly display the IDE on one monitor and the Console output on another when using debug statements.
Rerun the tests as a JUnit test. This mode runs the JUnit tests raw within Eclipse. This is very efficient for making and testing Java code changes, but it will not run any Maven setup or teardown plugins (which are not always required or can sometimes be avoided).
Maven is a very useful and powerful tool. However, there is a point where the information from Maven has been captured by the IDE and we don't need to run full Maven builds (e.g., RunAs: Maven Install). As you saw from the RunAs: JUnit test, we were able to run the unit test exceptionally fast without Maven. I strongly recommend making your unit tests Eclipse/JUnit-friendly so that you can work efficiently in certain areas. That means hard-coding reasonable defaults without relying on the maven-surefire-plugin passing in properties from the outside environment. Allow overrides, but code a usable default into the test.
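As a sketch of that advice -- the property name and default value below are illustrative assumptions, not part of the exercise -- a test can read its configuration with a coded fallback so it runs under both surefire and a raw Eclipse JUnit launch:

```java
// Sketch: a test helper that prefers an externally-supplied property
// (e.g., one passed in by maven-surefire-plugin) but falls back to a
// hard-coded, usable default when run raw from the IDE.
// The property name "jdbc.url" and its default are illustrative.
public class TestConfig {
    static String jdbcUrl() {
        return System.getProperty("jdbc.url", "jdbc:h2:./target/h2db/ejava");
    }

    public static void main(String[] args) {
        System.out.println("using url=" + jdbcUrl());
    }
}
```

Run from Eclipse, the default applies; run from Maven, surefire can override it with -Djdbc.url=... without any change to the test.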
There are two primary ways to use the debugger; separate/remote process and embedded (within Eclipse). The second is much easier to use but is limited by what you can execute within the Eclipse IDE. The first takes a minor amount of setup but can be re-used to debug code running within application servers on your local and remote machines.
Let's start with a remote debugging session by recalling the profile you were asked to add to either your pom.xml or settings.xml. If you have not done so, you can add it to either at this time.
<profiles>
    <profile> <!-- tells surefire to run JUnit tests with remote debug -->
        <id>debugger</id>
        <activation>
            <property>
                <name>debugger</name>
            </property>
        </activation>
        <properties>
            <surefire.argLine>-Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000 -Xnoagent -Djava.compiler=NONE</surefire.argLine>
        </properties>
    </profile>
</profiles>
Add a definition for the "surefire.argLine" within the maven-surefire-plugin declaration. Surefire is already being pulled into the project; this declaration just specifies the extra configuration along with a specific version. Maven will start complaining if you leave off that version.
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.17</version>
    <configuration>
        <argLine>${surefire.argLine}</argLine>
    </configuration>
</plugin>
Uncomment (or re-add) your failure test in AppTest.java.
@Test
public void testFail() {
//System.out.println("testFail");
log.info("testFail");
App app = new App();
assertTrue("app didn't return 0", app.returnOne() == 0);
}
Execute a Run As->Maven test by selecting the project, right-clicking, and choosing the right target. You should see the following error in the console.
Running myorg.mypackage.ex1.AppTest
INFO  28-08 23:52:31,809 (AppTest.java:testApp:17) -testApp
DEBUG 28-08 23:52:31,821 (App.java:returnOne:11) -Here's One!
INFO  28-08 23:52:31,829 (AppTest.java:testFail:25) -testFail
DEBUG 28-08 23:52:31,831 (App.java:returnOne:11) -Here's One!
Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.409 sec <<< FAILURE!
testFail(myorg.mypackage.ex1.AppTest)  Time elapsed: 0.016 sec  <<< FAILURE!
java.lang.AssertionError: app didn't return 0
        at org.junit.Assert.fail(Assert.java:93)
        at org.junit.Assert.assertTrue(Assert.java:43)
        at myorg.mypackage.ex1.AppTest.testFail(AppTest.java:27)
Click on the hyperlink to one of the lines in the project source code in the failure stack trace. Place a breakpoint at that line by double-clicking on the line number. A blue ball should appear to the left of the numbers.
Select Debug As->Maven build..., change the base directory to a re-usable ${project_loc} variable, assign the "test" goal, and activate the "debugger" profile. Click "Debug" when finished. The configuration will be saved automatically.
You should see the Maven build start but pause before executing the first JUnit test. Think of this as the "server-side" of the debugger session.
[INFO] --- maven-surefire-plugin:2.17:test (default-test) @ firstSimpleModuleEx ---
[INFO] Surefire report directory: /home/jcstaff/workspaces/ejava-class/ejava-student/javase/firstSimpleModuleEx/target/surefire-reports

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Listening for transport dt_socket at address: 8000
Start the "client-side" of the debugger session by clicking on the bug pulldown at the top of the window. Select debug configurations, double-click on Remote Java Application, select your project, and notice the localhost:8000 that came up agrees with your server-side configuration. Press "Debug" when you are ready and answer the prompt to change your view.
The IDE should switch to a debugger view and be waiting at the line you set the breakpoint on. From here you can look at the state of any variables (we don't have any) and step into the next call.
Once you are done with the debugging session you can click continue (a green right arrow at top) or detach from the server (a red squiggly line at top).
Terminate the debugger session and return to one of the JavaEE or Java-based views. Select a specific JUnit test, test method, package, or entire application and click Debug As->JUnit test.
Note the IDE again switches to the Debug view and is stopped at the breakpoint. You may step into the call, look at the state of any variable, or terminate the program (red square at top of window).
I walked you through the harder debugging session setup, which is only necessary when the problem is occurring in code running in a remote JVM. If the code can be executed using "Run-As -> JUnit Test", then simply set a breakpoint and use "Debug-As -> JUnit Test".
In this chapter, you were able to integrate your Maven and Eclipse environments. This allows you to leverage the Maven plugins as your core build system and leverage Eclipse for developing the content.
As mentioned, Eclipse will be the primary demonstration environment in class, but you may use other IDEs, like NetBeans, to match personal preferences. It is my belief that anything Eclipse can do -- the other leading IDEs can do as well. There have been many occasions where students very familiar with an alternate IDE have picked up the Maven aspects and handled the integration with their IDE without issue.
You have now completed this exercise. As part of it you were able to create a project tree, build/test the project manually, build/test the project using the Ant build tool, build/test the project using the Maven build system, and integrate the project with an IDE for faster and more detailed development. We have covered a lot, but clearly we have only touched a small amount of what can be done. Luckily one doesn't have to know how to do it all right away in order to be productive.
Copyright © 2019 jim stafford (jim.stafford@jhu.edu)
Built on: 2019-08-22 07:09 EST
Abstract
This document contains an introductory exercise for building a Maven-based project for modules that make use of a database and JPA. It will provide step-by-step instructions and explanations of what is being added and why. Many of these steps will become boilerplate for what we need to do for most JPA Maven modules. You go through this in detail here to gain exposure to what is required and why certain things are in place in the examples. It is also another chance to gain experience building a Maven project from scratch before we start leveraging templates and parent projects to hide much of the details from us.
Table of Contents
Provide one more chance to build a Maven module using step-by-step instructions
Provide step-by-step instructions for setting up a Maven module for JPA
Provide an introduction to JPA EntityManager CRUD methods
At the completion of this topic, the student shall be able to:
Maven/General project setup
Setup a database for use with module development
Create a Maven project for use with a JPA module
Manage module-specific schema within the database
Add custom SQL tuning properties to their module schema definition
Import a Maven/JPA project into IDE
Automatically generate database schema for project
Refactor a single Maven/JPA module into a re-usable parent and individual child modules
JPA
Setup a JPA Persistence Unit
Setup a JPA Unit test
Implement tests of JPA EntityManager methods
This chapter will cover setup required to start the development database in server-mode. The database has at least three (3) modes.
In-memory mode
File-based mode
Server-based mode
The in-memory option manages all tables within the memory footprint of the JVM. This is very fast and requires no server-process setup -- which makes it a good option for automated unit tests. However, since the DB instance is created and destroyed with each JVM execution it makes a bad choice for modules relying on multiple tools to pre-populate the database prior to executing tests.
The file-based option stores all information in the filesystem. It is useful for multi-JVM (sequential) setup and post-mortem analysis. However, only a single client may access the files at one time. I have seen this used effectively when simulating external databases -- where an external setup script populates the database and the JVM running the unit tests just queries the information as it would do in production. We will use this as an alternative to server-based mode since we are using separate plugins to initialize the database. We also want to treat our database schema as a first-class artifact for our application -- and not rely on test capabilities to instantiate the database for each test.
The server-based option requires a separate process to be activated but allows concurrent connections from a database user interface while the JVM is under test. This chapter will focus on the setup required to run the database in server mode.
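The three modes are selected purely by the form of the JDBC URL. The sketch below shows the shape of each URL; the database name "ejava", paths, host, and port follow this exercise, and the helper methods themselves are just an illustration:

```java
// Sketch: the three H2 JDBC URL forms described above.
// Names, paths, host, and port are the ones used in this exercise.
public class H2Urls {
    // in-memory: lives only as long as the JVM -- good for automated unit tests
    static String inMemory(String name) {
        return "jdbc:h2:mem:" + name;
    }
    // file-based: stored in the filesystem -- a single client at a time
    static String fileBased(String path) {
        return "jdbc:h2:" + path;
    }
    // server-based: separate process -- allows concurrent clients (e.g., the DB UI)
    static String serverBased(String host, int port, String path) {
        return "jdbc:h2:tcp://" + host + ":" + port + "/" + path;
    }

    public static void main(String[] args) {
        System.out.println(inMemory("ejava"));
        System.out.println(fileBased("./target/h2db/ejava"));
        System.out.println(serverBased("localhost", 9092, "./h2db/ejava"));
    }
}
```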
Prepare your environment to run the database in server mode for this exercise by following the instructions defined in Development Environment Setup.
Start the database and web server in a directory where you wish to create database files. Your h2.jar file should be located in M2_REPO/com/h2database/h2/*/h2*.jar to name at least one location. Another location is JBOSS_HOME/modules/com/h2database/h2/main/h2*.jar.
cd /tmp
java -jar h2.jar
This should result in a database server process and access to a web-based database UI through the following local URL: http://localhost:8082/login.jsp
Connect to the database server from the web-based UI.
Driver Class: org.h2.Driver
JDBC URL: jdbc:h2:tcp://localhost:9092/./h2db/ejava
User Name: sa
Password:
Look in the directory where you started the server. After connecting with a relative URL using the UI, there should be a new "h2db" directory with one or more files called "ejava*". You want to make sure you use the same URL in the UI and application so you are seeing the same instance of the database.
If you use file-based mode, the connection information would look like the following where "./h2db/ejava" must point to the exact same path your JVM under test uses. This can be a relative or fully-qualified path.
Driver Class: org.h2.Driver
JDBC URL: jdbc:h2:./target/h2db/ejava
User Name: sa
Password:
In this chapter you...
located a copy of the Java archive required to run the server
located a scratch area to run the server
started the server
launched the web-based UI
connected to the server using the web-based UI
This is the only database mode that requires administrative setup. You cannot connect to the database running in-memory but you can connect to the database using file-mode once all other JVMs have released their exclusive lock on the database storage files.
This chapter will put in place a couple of core constructs that will allow Maven and the IDE to recognize the elements of your source tree.
Create a root directory for your project and populate it with a pom.xml file.
<?xml version="1.0"?>
<project
xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>myorg.jpa</groupId>
<artifactId>entityMgrEx</artifactId>
<version>1.0-SNAPSHOT</version>
<name>Entity Manager Exercise</name>
</project>
Define a remote repository to use to download hibernate artifacts
<!-- needed to resolve some hibernate dependencies -->
<repositories>
<repository>
<id>jboss-nexus</id>
<name>JBoss Nexus Repository</name>
<url>https://repository.jboss.org/nexus/content/groups/public-jboss/</url>
</repository>
</repositories>
Add property definitions for versions of dependencies and plugins we will be using.
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<java.source.version>1.8</java.source.version>
<java.target.version>1.8</java.target.version>
<jboss.host>localhost</jboss.host>
<db.host>${jboss.host}</db.host>
<maven-compiler-plugin.version>3.7.0</maven-compiler-plugin.version>
<maven-jar-plugin.version>3.1.0</maven-jar-plugin.version>
<maven-surefire-plugin.version>2.22.0</maven-surefire-plugin.version>
<sql-maven-plugin.version>1.5</sql-maven-plugin.version>
<h2db.version>1.4.197</h2db.version>
<javax.persistence-api.version>2.2</javax.persistence-api.version>
<hibernate-entitymanager.version>5.3.1.Final</hibernate-entitymanager.version>
<junit.version>4.12</junit.version>
<log4j.version>1.2.17</log4j.version>
<slf4j.version>1.7.25</slf4j.version>
<ejava.version>5.0.0-SNAPSHOT</ejava.version>
</properties>
Add a dependencyManagement section to define and configure the dependencies we will be using to work with JPA. These are passive definitions -- meaning they don't actually add any dependencies to your project. They just define the version and possible exclusions for artifacts so all child/leaf projects stay consistent.
<dependencyManagement>
<dependencies>
<dependency>
<groupId>javax.persistence</groupId>
<artifactId>javax.persistence-api</artifactId>
<version>${javax.persistence-api.version}</version>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<version>${hibernate-entitymanager.version}</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
</dependency>
</dependencies>
</dependencyManagement>
Knowing this exercise will always be a single module -- we could do this simpler. However, it is assumed that you will soon take the information you learn here to a real enterprise project and that will have many modules and could benefit from reusing a standard parent configuration. All definitions will be housed in a single module during the conduct of this exercise but the properties, dependencyManagement, and pluginManagement sections we will build below can be lifted and moved to a parent pom in a multi-module project.
Add pluginManagement definitions for certain plugins we will use in this module. Like above -- these are passive definitions that define the configuration for certain plugins when the child/leaf projects choose to use them. Let's start with a simple example and add a few more complex ones later. In this example, we are making sure all uses of the jar plugin are of a specific version.
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>${maven-jar-plugin.version}</version>
</plugin>
</plugins>
</pluginManagement>
</build>
Add the src/main dependencies. This represents what your code depends upon at compile time and runtime.
scope=compile is used when your src/main code depends on the artifact to compile and you wish the transitive dependency mechanism to automatically bring this dependency along with the module.
scope=provided is used when your src/main code depends on the artifact to compile but you do not wish it to be automatically brought along when used with downstream clients. Normally this type of artifact is an API, and the downstream client will be providing their own version of the API packaged with their provider.
<dependencies>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>javax.persistence</groupId>
<artifactId>javax.persistence-api</artifactId>
<scope>provided</scope>
</dependency>
...
</dependencies>
Notice how the definitions above are lacking a version element. The dependency declaration actively brings the dependency into the project and inherits the definition specified by the dependencyManagement section above.
Add the src/test standard dependencies.
scope=test is used for anything that your src/test code depends upon (but not your src/main) or that your unit tests need at runtime to operate the test. For example, a module may declare a scope=test dependency on the h2 database (later) to do some local unit testing and then be ultimately deployed against a postgres server in a downstream client. In this case we are picking JUnit4 as the testing framework, log4j as the logging implementation behind the slf4j API, and hibernate as the JPA implementation for the jpa-api. We are also adding an extra dependency to allow hibernate slf4j logging to be integrated with log4j.
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<scope>test</scope>
</dependency>
Add a testResources definition to the build section to get src/test/resources files filtered when copied into the target tree. We do this so we have a chance to replace the variables in the persistence.xml and hibernate.properties files.
<build>
<!-- filtering will replace URLs, credentials, etc in the
files copied to the target directory and used during testing.
-->
<testResources>
<testResource>
<directory>src/test/resources</directory>
<filtering>true</filtering>
</testResource>
</testResources>
<pluginManagement>
...
</pluginManagement>
</build>
Add a compiler specification to control the source and target Java versions. In this case we are picking up the specific values from property variables set above, which can be overridden in the developer's settings.xml or on the command line using system properties.
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>${maven-compiler-plugin.version}</version>
<configuration>
<source>${java.source.version}</source>
<target>${java.target.version}</target>
</configuration>
</plugin>
...
</plugins>
</pluginManagement>
</build>
Add a definition for the maven-surefire-plugin so we can set properties needed for testing.
<build>
<pluginManagement>
<plugins>
...
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>${maven-surefire-plugin.version}</version>
<configuration>
<argLine>${surefire.argLine}</argLine>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
At this point, we are just allowing the argLine defined elsewhere to be optionally specified (for debugging). We do not yet have a need for system properties, but if we did, the example shows how -Dname=value would be specified within the plugin configuration. The plugin (not pluginManagement) definition in the child pom will include any necessary system properties to be passed to the unit test.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<systemPropertyVariables>
<name1>value1</name1>
<name2>value2</name2>
</systemPropertyVariables>
</configuration>
</plugin>
Add a set of profiles that define H2 and Hibernate as our database and persistence provider. In the example below we are adding two database definitions (that happen to both be the same vendor). One requires the server to be running and the other uses an embedded server and a local filesystem. The embedded version can be easier to test with. The server version can be easier to debug. Activate which one you want with either your settings.xml#activeProfile settings or using a -Pprofile-name argument on the command line. If you already have a settings.xml#activeProfile setting, you can turn it off using -P\!deactivated-profile-name (bash) or -P!deactivated-profile-name (dos) and follow it up with -Pactivated-profile-name.
<profiles>
<profile> <!-- H2 server-based DB -->
<id>h2srv</id>
<properties>
<jdbc.driver>org.h2.Driver</jdbc.driver>
<jdbc.url>jdbc:h2:tcp://${db.host}:9092/./h2db/ejava</jdbc.url>
<jdbc.user>sa</jdbc.user>
<jdbc.password/>
<hibernate.dialect>org.hibernate.dialect.H2Dialect</hibernate.dialect>
</properties>
<dependencies>
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>${h2db.version}</version>
<scope>test</scope>
</dependency>
</dependencies>
</profile>
<profile> <!-- H2 file-based DB -->
<id>h2db</id>
<activation>
<property>
<name>!jdbcdb</name>
</property>
</activation>
<properties>
<jdbc.driver>org.h2.Driver</jdbc.driver>
<jdbc.url>jdbc:h2:${basedir}/target/h2db/ejava</jdbc.url>
<jdbc.user>sa</jdbc.user>
<jdbc.password/>
<hibernate.dialect>org.hibernate.dialect.H2Dialect</hibernate.dialect>
</properties>
<dependencies>
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>${h2db.version}</version>
<scope>test</scope>
</dependency>
</dependencies>
</profile>
</profiles>
Profiles can be used to control which options are enabled at build time to make the module more portable. I also use them to help identify which dependencies are brought in for what reason -- especially for profiles that are configured to always activate.
Perform a test of your pom.xml by issuing a sample build command. All should complete even though there is nothing yet in your source tree.
$ mvn clean test
In this chapter you created a core Maven project definition with the ability to resolve dependencies, build, and potentially test a JPA module. At this point you have an empty but valid Maven project/module. In the following chapters we will add more artifacts to build something useful.
This chapter will take you through steps that will populate your database with a (simple) database schema. A database schema is required by any module that directly interacts with an RDBMS. The JPA provider can automatically generate a database schema, but that is generally restricted to early development and quick prototypes. A module within the data tier will ultimately be responsible for providing a separate artifact to create and/or migrate the schema from version to version. That is typically finalized by humans knowledgeable about particular databases and can be aided by the tool(s) we introduce in this exercise.
Create a set of ddl scripts in src/main/resources/ddl to handle creating the schema, deleting rows in the schema, and dropping tables in the schema. Make sure each script has the word "create", "delete", or "drop" in its file name to match some search strings we'll use later. Have the database generate a value for the primary key. That value should not be allowed to be null.
src
|-- main
|   |-- java
|   `-- resources
|       `-- ddl
|           |-- emauto_create.ddl
|           |-- emauto_delete.ddl
|           `-- emauto_drop.ddl
`-- test
    |-- java
    `-- resources
We could actually skip this step and have the persistence provider create the table for us. That approach is great for quick Java-first prototypes. However, creating the schema outside of the persistence provider is a more realistic scenario for larger developments.
# src/main/resources/ddl/emauto_create.ddl
CREATE TABLE EM_AUTO (
    ID BIGINT generated by default as identity (start with 1) not null,
    MAKE VARCHAR(32),
    MODEL VARCHAR(32),
    COLOR VARCHAR(32),
    MILEAGE INT,
    CONSTRAINT em_autoPK PRIMARY KEY(ID)
)

# src/main/resources/ddl/emauto_delete.ddl
DELETE FROM EM_AUTO;

# src/main/resources/ddl/emauto_drop.ddl
DROP TABLE EM_AUTO if EXISTS;
You can perform a sanity check of the above scripts by pasting them into the DB UI SQL area and executing.
Add the standard database setup and teardown scripts. This allows us to create a legacy database schema and write classes that map to that schema. We will later have the persistence provider create the schema for us when we are in quick prototype mode. First create the reusable portion of the definition in the pluginManagement section. This will define the version, database dependencies, and property information for all to inherit.
<build>
<pluginManagement>
<plugins>
...
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>sql-maven-plugin</artifactId>
<version>${sql-maven-plugin.version}</version>
<dependencies>
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>${h2db.version}</version>
</dependency>
</dependencies>
<configuration>
<username>${jdbc.user}</username>
<password>${jdbc.password}</password>
<driver>${jdbc.driver}</driver>
<url>${jdbc.url}</url>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
Next add the potentially project-specific portion to a build-plugins-plugin section that would normally be in the child module. However, when you add this to the module -- do so within a profile that is wired to always run except when the system property -DskipTests is defined. This is a standard Maven system property that builders use to build the module and bypass both unit and integration testing. By honoring the property here -- our module will only attempt to work with the database if we are not skipping tests. Note the !bang-not character means "the absence of this system property".
<profiles>
...
<profile>
<id>testing</id>
<activation>
<property>
<name>!skipTests</name>
</property>
</activation>
<build>
<plugins>
<plugin>
<!-- runs schema against the DB -->
<groupId>org.codehaus.mojo</groupId>
<artifactId>sql-maven-plugin</artifactId>
<executions>
<!-- place execution elements here -->
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
Configure the sql-maven-plugin executions element to run any drop scripts in the source tree before running tests.
<execution>
<id>drop-db-before-test</id>
<phase>process-test-classes</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<autocommit>true</autocommit>
<fileset>
<basedir>${basedir}/src</basedir>
<includes>
<include>main/resources/ddl/**/*drop*.ddl</include>
</includes>
</fileset>
<onError>continue</onError>
</configuration>
</execution>
Note that we are controlling when the scripts are executed using the phase element. This is naming a well known Maven lifecycle phase for the build.
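For reference, Maven's default build lifecycle (abbreviated here, per the Maven lifecycle reference, to the phases relevant to this exercise) orders the phases so that process-test-classes runs just before test -- which is why the scripts execute after test compilation but before the unit tests:

```
... compile -> process-classes ->
generate-test-sources -> process-test-sources ->
generate-test-resources -> process-test-resources ->
test-compile -> process-test-classes ->
test -> prepare-package -> package -> ... -> install -> deploy
```

Binding an execution to any phase causes it to run whenever the build reaches that phase or a later one.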
Configure the sql-maven-plugin executions element to run any scripts from the source tree to create schema before running tests.
<execution>
<id>create-db-before-test</id>
<phase>process-test-classes</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<autocommit>true</autocommit>
<fileset>
<basedir>${basedir}/src</basedir>
<includes>
<include>main/resources/ddl/**/*create*.ddl</include>
</includes>
</fileset>
<print>true</print>
</configuration>
</execution>
Configure the sql-maven-plugin executions element to run any populate scripts from the source tree to add rows to the database before running tests.
<execution>
<id>populate-db-before-test</id>
<phase>process-test-classes</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<autocommit>true</autocommit>
<fileset>
<basedir>${basedir}/src</basedir>
<includes>
<include>test/resources/ddl/**/*populate*.ddl</include>
</includes>
</fileset>
</configuration>
</execution>
Configure the sql-maven-plugin executions element to run any drop scripts after testing. You may want to comment this out if you want to view database changes in a GUI after the test.
<execution>
<id>drop-db-after-test</id>
<phase>test</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<autocommit>true</autocommit>
<fileset>
<basedir>${basedir}/src</basedir>
<includes>
<include>main/resources/ddl/**/*drop*.ddl</include>
</includes>
</fileset>
</configuration>
</execution>
Build and run the tests. The schema should show up in the DB UI.
$ mvn clean test -P\!h2db -Ph2srv
Remember to turn off (-P!profile-name) the embedded profile (h2db) if active by default and turn on the server profile (h2srv) if you wish to use the server and DB UI while the unit test is running. The DB UI can only inspect the embedded file once all active clients close the file. The backslash is only needed for commands from the bash shell.
In this chapter you added a (trivial) schema definition for your module. This schema definition was used to manage (create, delete, and drop) the schema within the database. Although we will show that schema can be generated automatically by the JPA persistence provider and managed at runtime -- this feature is only feasible for functional unit testing and quick prototypes. Any real application will require a separate database schema artifact finalized by developers to be tuned appropriately for the target database.
In this chapter we are going to add tuning aspects to the schema put in place above. Examples of this include any indexes we believe would enhance the query performance. This example is still quite simple and lacks enough context to determine what would and would not be a helpful index. Simply treat this exercise as a tutorial in putting an index in place when properly identified. Adding the physical files mentioned here could be considered optional if all schema is hand-crafted. You control the contents of each file in a 100% hand-crafted DDL solution. However, for those cases where auto-generated schema files are created, you may want a separate set of files designated for "tuning" the schema that was auto-generated. We will demonstrate using two extra files to create/drop database indexes.
Add a file to add database indexes for your schema
# src/main/resources/ddl/emauto_tuningadd.ddl
CREATE INDEX EM_AUTO_MAKEMODEL ON EM_AUTO(MAKE, MODEL);
Wire this new file into your SQL plugin definition for creating schema. Have it run after your table creates. Add an "orderFile" element to the configuration to tell the plugin to execute the files in a specific order. Otherwise the order is non-deterministic and the tuning may execute before the schema is created.
<execution>
<id>create-db-before-test</id>
<phase>process-test-classes</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<autocommit>true</autocommit>
<orderFile>ascending</orderFile>
<fileset>
<basedir>${basedir}/src</basedir>
<includes>
<include>main/resources/ddl/**/*create*.ddl</include>
<include>main/resources/ddl/**/*tuningadd*.ddl</include>
</includes>
</fileset>
<print>true</print>
</configuration>
</execution>
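The `ascending` setting tells the plugin to execute the fileset sorted by path name, which is why `emauto_create.ddl` runs before `emauto_tuningadd.ddl`. A minimal plain-Java illustration of that ordering (not the plugin's actual code):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class OrderFileSketch {
    // Sorts file paths the way an "ascending" orderFile setting would
    public static List<String> ascending(List<String> files) {
        List<String> sorted = new ArrayList<>(files);
        Collections.sort(sorted);
        return sorted;
    }

    public static void main(String[] args) {
        List<String> files = Arrays.asList(
                "main/resources/ddl/emauto_tuningadd.ddl",
                "main/resources/ddl/emauto_create.ddl");
        // lexical ascending order runs the table create before the index tuning
        System.out.println(ascending(files));
    }
}
```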
Add a file to augment the drop script and remove indexes from your schema
# src/main/resources/ddl/emauto_tuningremove.ddl
DROP INDEX EM_AUTO_MAKEMODEL if exists;
Wire this new file into your SQL plugin definition for dropping schema. Have it run before your table drops. Add an "orderFile" element to the configuration to tell the plugin to execute the files in a specific order.
<configuration>
<autocommit>true</autocommit>
<orderFile>descending</orderFile>
<fileset>
<basedir>${basedir}/src</basedir>
<includes>
<include>main/resources/ddl/**/*tuningremove*.ddl</include>
<include>main/resources/ddl/**/*drop*.ddl</include>
</includes>
</fileset>
<onError>continue</onError>
</configuration>
Build the schema for your module
$ mvn clean process-test-classes
...
[INFO] --- sql-maven-plugin:1.4:execute (drop-db-before-test) @ entityMgrEx ---
[INFO] Executing file: .../src/main/resources/ddl/emauto_drop.ddl
[INFO] Executing file: .../src/main/resources/ddl/emauto_tuningremove.ddl
[INFO] 2 of 2 SQL statements executed successfully
[INFO]
[INFO] --- sql-maven-plugin:1.4:execute (create-db-before-test) @ entityMgrEx ---
[INFO] Executing file: .../src/main/resources/ddl/emauto_create.ddl
[INFO] Executing file: .../entityMgrEx/src/main/resources/ddl/emauto_tuningadd.ddl
[INFO] 2 of 2 SQL statements executed successfully
This chapter will add an entity class, the persistence.xml, and associated property file to define the persistence unit.
The entity class represents one or more tables in the database and each instance of the entity class represents a specific row in those tables.
The persistence.xml defines a JPA persistence unit (along with other related XML files and entity class annotations). An instance of a persistence unit is called a persistence context. Persistence contexts are accessed through an EntityManager.
`-- src
    |-- main
    |   |-- java
    |   |   `-- myorg
    |   |       `-- entitymgrex
    |   |           `-- Auto.java
    |   `-- resources
    |       `-- META-INF
    |           `-- persistence.xml
    `-- test
        |-- java
        `-- resources
            `-- hibernate.properties
Create a (Plain Old Java Object (POJO)) class to represent an automobile. Use class annotations to provide the following:
# src/main/java/myorg/entitymgrex/Auto.java
package myorg.entitymgrex;
import java.io.Serializable;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GenerationType;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.Table;
@Entity @Table(name="EM_AUTO")
public class Auto implements Serializable {
private static final long serialVersionUID = 1L;
@Id @GeneratedValue(strategy=GenerationType.IDENTITY)
private long id;
private String make;
private String model;
private String color;
private int mileage;
public Auto(){}
public Auto(long id) { this.id=id; }
public long getId() { return id; }
//more getter/setters go here
}
@Entity, @Id, and a default constructor are the only requirements on an entity class. The implementation of java.io.Serializable is not a JPA requirement but will be leveraged by a later example within this exercise.
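The later example referred to here clones the entity through Java serialization. That round trip can be sketched with a simplified stand-in class (hypothetical names, not the actual Auto entity):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializationSketch {
    // Simplified stand-in for the Auto entity; no JPA annotations are
    // required for serialization itself
    public static class Car implements Serializable {
        private static final long serialVersionUID = 1L;
        String make;
        int mileage;
        Car(String make, int mileage) { this.make = make; this.mileage = mileage; }
    }

    // Clone an object by writing it to a byte stream and reading it back --
    // the same trick the later merge example uses to simulate a remote copy
    public static Car roundTrip(Car in) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(in);
        }
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            return (Car) ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Car original = new Car("Ford", 30000);
        Car copy = roundTrip(original);
        // the copy is a distinct, detached instance with equal state
        System.out.println(copy != original && copy.make.equals("Ford"));
    }
}
```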
Add the remaining setter/getter methods to the class. If you are using Eclipse to author the class -- right click->Source->Generate Getters and Setters will generate all of this for you. Since we are using generated primary key values, there is no immediate need for a constructor to set the id. If you add one later, remember to also add a default constructor, which the compiler no longer supplies once you manually declare any constructor.
public void setMake(String make) {
this.make = make;
}
public int getMileage() { return mileage; }
public void setMileage(int mileage) {
this.mileage = mileage;
}
public String getModel() { return model; }
public void setModel(String model) {
this.model = model;
}
public String getColor() { return color; }
public void setColor(String color) {
this.color = color;
}
You may also want to add a public toString():String method to conveniently print your Auto objects. Eclipse can generate that on demand as well, and the output is configurable.
@Override
public String toString() {
StringBuilder builder = new StringBuilder();
builder
.append("id=").append(id)
.append(", make=").append(make)
.append(", model=").append(model)
.append(", color=").append(color)
.append(", mileage=").append(mileage);
return builder.toString();
}
Create a META-INF/persistence.xml file to define the persistence unit for our jar file.
persistence-unit name: must match what we place in our JUnit test
provider: specify that this persistence unit is defined for the org.hibernate.jpa.HibernatePersistenceProvider provider.
define provider-specific properties that tell the provider how to obtain a connection to the database as well as some other configuration properties.
The provider-specific properties include somewhat sensitive information, such as user credentials. If we place them in the persistence.xml file within the src/main tree, these properties become part of our deployed artifact. To avoid this, we will define them in a separate hibernate.properties file placed in the src/test tree.
# src/main/resources/META-INF/persistence.xml
<?xml version="1.0" encoding="UTF-8"?>
<persistence xmlns="http://java.sun.com/xml/ns/persistence"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd" version="2.0">
<persistence-unit name="entityMgrEx">
<provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>
<properties>
<!-- defined in src/test/resources/hibernate.properties -->
</properties>
</persistence-unit>
</persistence>
Create a hibernate.properties file in src/test/resources to hold information we need for testing but may not want to be part of the deployed artifact. Leave the volatile values as variables so they can be expanded into the target tree at build time.
the variables will be filled in during the build process using "filtering" and the resources plugin.
the show and format_sql options are only turned on during early development and debug.
the jdbc.batch_size property set to 0 is also used during debug. Setting it to this value will eliminate any batching of SQL commands, allowing errors about the commands to be better reported to the developer.
# src/test/resources/hibernate.properties
hibernate.connection.url=${jdbc.url}
hibernate.connection.driver_class=${jdbc.driver}
hibernate.connection.username=${jdbc.user}
hibernate.connection.password=${jdbc.password}
hibernate.dialect=${hibernate.dialect}
#hibernate.hbm2ddl.auto=create
hibernate.hbm2ddl.import_files=/ddl/emauto-tuningdrop.ddl,/ddl/emauto-tuning.ddl
hibernate.show_sql=true
hibernate.format_sql=true
#hibernate.jdbc.batch_size=0
Make sure your project still builds. Your area should look something like the following after the build.
$ mvn clean install -P\!h2db -Ph2srv
|-- pom.xml
|-- src
|   |-- main
|   |   |-- java
|   |   |   `-- myorg
|   |   |       `-- entitymgrex
|   |   |           `-- Auto.java
|   |   `-- resources
|   |       |-- ddl
|   |       |   |-- emauto_create.ddl
|   |       |   |-- emauto_delete.ddl
|   |       |   |-- emauto_drop.ddl
|   |       |   |-- emauto_tuningadd.ddl
|   |       |   `-- emauto_tuningremove.ddl
|   |       `-- META-INF
|   |           `-- persistence.xml
|   `-- test
|       `-- resources
|           `-- hibernate.properties
`-- target
    |-- classes
    |   |-- ddl
    |   |   |-- emauto_create.ddl
    |   |   |-- emauto_delete.ddl
    |   |   |-- emauto_drop.ddl
    |   |   |-- emauto_tuningadd.ddl
    |   |   `-- emauto_tuningremove.ddl
    |   |-- META-INF
    |   |   `-- persistence.xml
    |   `-- myorg
    |       `-- entitymgrex
    |           `-- Auto.class
    ...
    `-- test-classes
        `-- hibernate.properties
You should also check that your hibernate.properties file was filtered by your build.testResources definition provided earlier. The variable definitions within your src/test/resources source file(s) should be replaced with property values from your environment.
$ more src/test/resources/hibernate.properties target/test-classes/hibernate.properties
::::::::::::::
src/test/resources/hibernate.properties
::::::::::::::
hibernate.dialect=${hibernate.dialect}
hibernate.connection.url=${jdbc.url}
hibernate.connection.driver_class=${jdbc.driver}
hibernate.connection.password=${jdbc.password}
hibernate.connection.username=${jdbc.user}
#hibernate.hbm2ddl.auto=create
hibernate.show_sql=true
hibernate.format_sql=true
#hibernate.jdbc.batch_size=0
::::::::::::::
target/test-classes/hibernate.properties
::::::::::::::
hibernate.dialect=org.hibernate.dialect.H2Dialect
hibernate.connection.url=jdbc:h2:tcp://127.0.0.1:9092/./h2db/ejava
hibernate.connection.driver_class=org.h2.Driver
hibernate.connection.password=
hibernate.connection.username=sa
#hibernate.hbm2ddl.auto=create
#hibernate.hbm2ddl.import_files=/ddl/emauto-tuningdrop.ddl,/ddl/emauto-tuning.ddl
hibernate.show_sql=true
hibernate.format_sql=true
#hibernate.jdbc.batch_size=0
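The substitution Maven performs on these files can be sketched in plain Java. This is a simplification of the visible effect of resource filtering, not Maven's actual implementation:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class FilterSketch {
    private static final Pattern VAR = Pattern.compile("\\$\\{([^}]+)\\}");

    // Replace ${name} tokens with property values; unknown names are left
    // as-is, mirroring what you see in the filtered target tree
    public static String filter(String line, Map<String, String> props) {
        Matcher m = VAR.matcher(line);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            String value = props.getOrDefault(m.group(1), m.group());
            m.appendReplacement(sb, Matcher.quoteReplacement(value));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> props = new HashMap<>();
        props.put("jdbc.url", "jdbc:h2:tcp://127.0.0.1:9092/./h2db/ejava");
        System.out.println(filter("hibernate.connection.url=${jdbc.url}", props));
    }
}
```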
In this chapter you defined a JPA entity class, persistence unit, and properties specific to your database instance. With these tackled, you have the core building blocks in place to begin instantiating the JPA EntityManager in the next chapter. You will notice that the specification of your database properties came from your Maven pom properties instead of hard-coded into individual files. This leveraged the filtering feature of resource files during the build. In the last chapter of this exercise you will be asked to re-factor the Maven database properties into a parent Maven project to be re-used across several projects.
This chapter will create a JUnit test case that will instantiate an EntityManager, perform some basic operations, and log information about the tests.
`-- src
    `-- test
        |-- java
        |   `-- myorg
        |       `-- entitymgrex
        |           `-- EntityMgrTest.java
        `-- resources
            `-- log4j.xml
Create a JUnit test case to hold your test code. The following is an example of a 4.x JUnit test case that uses @Annotations.
# src/test/java/myorg/entitymgrex/EntityMgrTest.java
package myorg.entitymgrex;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;
import javax.persistence.Query;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import static org.junit.Assert.*;
import org.junit.After;
import org.junit.AfterClass;
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.Test;
public class EntityMgrTest {
private static final Logger logger = LoggerFactory.getLogger(EntityMgrTest.class);
@Test
public void testTemplate() {
logger.info("testTemplate");
}
}
Provide a setUpClass() method that runs once, before all tests, to create the entity manager factory. This method must be static.
private static final String PERSISTENCE_UNIT = "entityMgrEx";
private static EntityManagerFactory emf;
@BeforeClass
public static void setUpClass() {
logger.debug("creating entity manager factory");
emf = Persistence.createEntityManagerFactory(PERSISTENCE_UNIT);
}
Provide a setUp() method that will be called before each testMethod is executed. Have this method create an entity manager for the tests to use.
private EntityManager em;
@Before
public void setUp() throws Exception {
logger.debug("creating entity manager");
em = emf.createEntityManager();
//cleanup();
}
Provide a tearDown() method that will be called after each testMethod. Have this flush all remaining items in the persistence context to the database and close the entity manager.
@After
public void tearDown() throws Exception {
if (em!=null) {
logger.debug("tearDown() started, em={}", em);
em.getTransaction().begin();
em.flush();
//logAutos();
em.getTransaction().commit();
em.close();
logger.debug("tearDown() complete, em={}", em);
em=null;
}
}
Provide a tearDownClass() method that will be called after all testMethods have completed. This method must be static and should close the entity manager factory.
@AfterClass
public static void tearDownClass() {
if (emf!=null) {
logger.debug("closing entity manager factory");
emf.close();
emf=null;
}
}
Add in a logAutos() method to query and print all autos in the database. Do this after flushing the entity manager in the tearDown() method so you can see the changes from the previous test. The following example uses the entity manager to create an ad-hoc JPQL statement.
@After
public void tearDown() throws Exception {
...
em.flush();
logAutos();
em.getTransaction().commit();
...
}
public void logAutos() {
Query query = em.createQuery("select a from Auto as a");
for (Object o: query.getResultList()) {
logger.info("EM_AUTO: {}", o);
}
}
You might also want to add a cleanup() to clear out the Auto table between tests. The example below uses the entity manager to create a native SQL statement.
@Before
public void setUp() throws Exception {
...
em = emf.createEntityManager();
cleanup();
}
public void cleanup() {
em.getTransaction().begin();
Query query = em.createNativeQuery("delete from EM_AUTO");
int rows = query.executeUpdate();
em.getTransaction().commit();
logger.info("removed {} rows", rows);
}
Add a log4j.xml file to src/test/resources that has your desired settings. The one below produces less timestamp information at the console and more details in the logfile.
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration
xmlns:log4j="http://jakarta.apache.org/log4j/"
debug="false">
<appender name="CONSOLE" class="org.apache.log4j.ConsoleAppender">
<param name="Target" value="System.out"/>
<layout class="org.apache.log4j.PatternLayout">
<param name="ConversionPattern"
value="(%F:%M:%L) -%m%n"/>
</layout>
</appender>
<appender name="logfile" class="org.apache.log4j.RollingFileAppender">
<param name="File" value="target/log4j-out.txt"/>
<param name="Append" value="false"/>
<param name="MaxFileSize" value="100KB"/>
<param name="MaxBackupIndex" value="1"/>
<layout class="org.apache.log4j.PatternLayout">
<param name="ConversionPattern"
value="%-5p %d{dd-MM HH:mm:ss,SSS} [%c] (%F:%M:%L) -%m%n"/>
</layout>
</appender>
<logger name="myorg">
<level value="debug"/>
<appender-ref ref="logfile"/>
</logger>
<root>
<priority value="fatal"/>
<appender-ref ref="CONSOLE"/>
</root>
</log4j:configuration>
Although it might be a bit entertaining to set the priority of the root logger to debug and see everything the persistence provider has to say, it is quite noisy. Consider changing the root priority to fatal so that the majority of the log statements are yours.
You should be able to build and test your module at this time.
$ mvn clean test
Running myorg.entitymgrex.EntityMgrTest
(EntityMgrTest.java:setUpClass:25) -creating entity manager factory
(EntityMgrTest.java:setUp:31) -creating entity manager
Hibernate: delete from EM_AUTO
(EntityMgrTest.java:cleanup:58) -removed 0 rows
(EntityMgrTest.java:testTemplate:75) -testTemplate
(EntityMgrTest.java:tearDown:39) -tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@3e52a475
Hibernate: select auto0_.id as id0_, auto0_.color as color0_, auto0_.make as make0_,
    auto0_.mileage as mileage0_, auto0_.model as model0_ from EM_AUTO auto0_
(EntityMgrTest.java:tearDown:45) -tearDown() complete, em=org.hibernate.ejb.EntityManagerImpl@3e52a475
(EntityMgrTest.java:tearDownClass:69) -closing entity manager factory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.337 sec

Results :

Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
...
[INFO] BUILD SUCCESS
If your tests fail and no key information is printed, try raising the verbosity for all org.hibernate classes.
# src/test/resources/log4j.xml
<logger name="org.hibernate">
    <level value="trace"/>
    <appender-ref ref="logfile"/>
</logger>
Check that you have the following artifacts in your project tree.
|-- pom.xml
|-- src
| |-- main
| | |-- java
| | | `-- myorg
| | | `-- entitymgrex
| | | `-- Auto.java
| | `-- resources
| | |-- ddl
| | | |-- emauto_create.ddl
| | | |-- emauto_delete.ddl
| | | |-- emauto_drop.ddl
| | | |-- emauto_tuningadd.ddl
| | | `-- emauto_tuningremove.ddl
| | `-- META-INF
| | `-- persistence.xml
| `-- test
| |-- java
| | `-- myorg
| | `-- entitymgrex
| | `-- EntityMgrTest.java
| `-- resources
| |-- hibernate.properties
| `-- log4j.xml
`-- target
|-- antrun
| `-- build-main.xml
|-- classes
| |-- ddl
| | |-- emauto_create.ddl
| | |-- emauto_delete.ddl
| | |-- emauto_drop.ddl
| | |-- emauto_tuningadd.ddl
| | `-- emauto_tuningremove.ddl
| |-- META-INF
| | `-- persistence.xml
| `-- myorg
| `-- entitymgrex
| `-- Auto.class
|-- generated-sources
| `-- annotations
|-- generated-test-sources
| `-- test-annotations
|-- h2db
| `-- ejava.h2.db
|-- log4j-out.txt
|-- maven-status
| `-- maven-compiler-plugin
| |-- compile
| | `-- default-compile
| | |-- createdFiles.lst
| | `-- inputFiles.lst
| `-- testCompile
| `-- default-testCompile
| |-- createdFiles.lst
| `-- inputFiles.lst
|-- surefire-reports
| |-- myorg.entitymgrex.EntityMgrTest.txt
| `-- TEST-myorg.entitymgrex.EntityMgrTest.xml
`-- test-classes
|-- hibernate.properties
|-- log4j.xml
`-- myorg
`-- entitymgrex
`-- EntityMgrTest.class
In this chapter you created an EntityManager, managed the transaction, and closed the EntityManager within the JUnit framework's set of callbacks. JUnit will create a new instance of the class and execute @Before, @Test, @After in that order for each @Test. @BeforeClass is run before the first @Before/@Test and @AfterClass is run after the last @Test/@After; both must be static.
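The callback ordering described above can be replayed in plain Java without a JUnit dependency. The method names below are illustrative stand-ins for the annotated methods, not JUnit API:

```java
import java.util.ArrayList;
import java.util.List;

public class LifecycleSketch {
    static final List<String> calls = new ArrayList<>();

    static void beforeClass() { calls.add("@BeforeClass"); }
    static void afterClass()  { calls.add("@AfterClass"); }
    void before() { calls.add("@Before"); }
    void after()  { calls.add("@After"); }
    void test1()  { calls.add("@Test test1"); }
    void test2()  { calls.add("@Test test2"); }

    // Replays the ordering JUnit 4 applies: @BeforeClass once, then a fresh
    // instance plus @Before/@Test/@After per test method, then @AfterClass once
    public static List<String> run() {
        calls.clear();
        beforeClass();
        LifecycleSketch t1 = new LifecycleSketch(); // new instance per @Test
        t1.before(); t1.test1(); t1.after();
        LifecycleSketch t2 = new LifecycleSketch();
        t2.before(); t2.test2(); t2.after();
        afterClass();
        return calls;
    }

    public static void main(String[] args) {
        System.out.println(run());
    }
}
```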
There are numerous ways to set up the transaction properties. Some make it a practice to roll back the transaction as part of each @After (after an em.flush() of course). There is nothing wrong with managing the transaction within the @Test for certain tests, but make it a point *NOT* to attempt to manage the transaction within code placed in the src/main tree without some careful consideration. JavaEE is a framework and transaction management is a key part of that framework. You would be outside the framework if you attempted to directly manage a transaction in your src/main production code. Much more on that later.
Over the years/versions, Eclipse has progressed from being ignorant of Maven (with all integration coming from the Maven side) to being very much integrated with Maven. In that latter/integrated mode, Eclipse tries hard to do the right thing within Eclipse for what was defined to be done outside of Eclipse. For example, Eclipse will turn Maven dependencies directly into an Eclipse build path. There exist, however, some plugins that Eclipse has yet to learn about, and it will alert you to that fact. Many of these have no role within Eclipse and you simply need to explicitly instruct Eclipse to ignore the plugin. Luckily these cases get fewer each year, and Eclipse will update your pom.xml with the necessary configuration when it is needed.
Import the project into Eclipse using the "Existing Maven Projects" option. You should see an error for the sql-maven-plugin. Ignore the error and continue with the import. We will address the error in the next step.
Add the following profile to your pom.xml. The profile is activated when the m2e.version property is defined -- which is a property m2e (Maven To Eclipse) sets within Eclipse.
<profiles>
...
<!-- tell Eclipse what to do with some of the plugins -->
<profile>
<id>m2e</id>
<activation>
<property>
<name>m2e.version</name>
</property>
</activation>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.eclipse.m2e</groupId>
<artifactId>lifecycle-mapping</artifactId>
<version>1.0.0</version>
<configuration>
<lifecycleMappingMetadata>
<pluginExecutions>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.codehaus.mojo</groupId>
<artifactId>sql-maven-plugin</artifactId>
<versionRange>[1.0.0,)</versionRange>
<goals>
<goal>execute</goal>
</goals>
</pluginExecutionFilter>
<action>
<ignore />
</action>
</pluginExecution>
</pluginExecutions>
</lifecycleMappingMetadata>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
</profile>
...
</profiles>
The red error marks should have cleared from the pom.xml editor. The above profile is activated when the "m2e.version" property is present within Eclipse and tells Eclipse to ignore the sql-maven-plugin.
This chapter will demonstrate various methods to perform create, read, update, and delete (CRUD) operations on the database using the EntityManager, the persistence unit/context, and the entity class.
The following changes are all made to the EntityMgrTest.java JUnit test class. Everything is being done within this file to keep things simple. This test case is playing the role of the business and persistence (Data Access Object (DAO)) logic.
add a testCreate() method to test the functionality of EntityManager.persist(). This will add an object to the database once associated with a transaction.
@Test
public void testCreate() {
logger.info("testCreate");
Auto car = new Auto();
car.setMake("Chrysler");
car.setModel("Gold Duster");
car.setColor("Gold");
car.setMileage(60*1000);
logger.info("creating auto: {}", car);
em.persist(car);
}
-testCreate
-creating auto:myorg.entitymgrex.Auto@140984b, id=0, make=Chrysler, model=Gold Duster, color=Gold, mileage=60000
-tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@3ac93e
-EM_AUTO:myorg.entitymgrex.Auto@140984b, id=1, make=Chrysler, model=Gold Duster, color=Gold, mileage=60000
-removed 1 rows
add a testMultiCreate() to test creating several objects. This should also help verify that unique primary keys are being generated.
@Test
public void testMultiCreate() {
logger.info("testMultiCreate");
for(int i=0; i<5; i++) {
Auto car = new Auto();
car.setMake("Plymouth " + i);
car.setModel("Grand Prix");
car.setColor("Green");
car.setMileage(80*1000);
logger.info("creating auto: {}", car);
em.persist(car);
}
}
-testMultiCreate
-creating auto:myorg.entitymgrex.Auto@c3e9e9, id=0, make=Plymouth 0, model=Grand Prix, color=Green, mileage=80000
-creating auto:myorg.entitymgrex.Auto@31f2a7, id=0, make=Plymouth 1, model=Grand Prix, color=Green, mileage=80000
-creating auto:myorg.entitymgrex.Auto@131c89c, id=0, make=Plymouth 2, model=Grand Prix, color=Green, mileage=80000
-creating auto:myorg.entitymgrex.Auto@1697b67, id=0, make=Plymouth 3, model=Grand Prix, color=Green, mileage=80000
-creating auto:myorg.entitymgrex.Auto@24c4a3, id=0, make=Plymouth 4, model=Grand Prix, color=Green, mileage=80000
-tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@1e9c82e
-EM_AUTO:myorg.entitymgrex.Auto@c3e9e9, id=2, make=Plymouth 0, model=Grand Prix, color=Green, mileage=80000
-EM_AUTO:myorg.entitymgrex.Auto@31f2a7, id=3, make=Plymouth 1, model=Grand Prix, color=Green, mileage=80000
-EM_AUTO:myorg.entitymgrex.Auto@131c89c, id=4, make=Plymouth 2, model=Grand Prix, color=Green, mileage=80000
-EM_AUTO:myorg.entitymgrex.Auto@1697b67, id=5, make=Plymouth 3, model=Grand Prix, color=Green, mileage=80000
-EM_AUTO:myorg.entitymgrex.Auto@24c4a3, id=6, make=Plymouth 4, model=Grand Prix, color=Green, mileage=80000
add a testFind() to test the ability to find an object by its primary key value.
@Test
public void testFind() {
logger.info("testFind");
Auto car = new Auto();
car.setMake("Ford");
car.setModel("Bronco II");
car.setColor("Red");
car.setMileage(0*1000);
logger.info("creating auto: {}", car);
em.persist(car);
//we need to associate the em with a transaction to get a
//primary key generated and assigned to the auto
em.getTransaction().begin();
em.getTransaction().commit();
Auto car2 = em.find(Auto.class, car.getId());
assertNotNull("car not found:" + car.getId(), car2);
logger.info("found car: {}", car2);
}
-testFind
-creating auto:myorg.entitymgrex.Auto@aae86e, id=0, make=Ford, model=Bronco II, color=Red, mileage=0
-found car:myorg.entitymgrex.Auto@aae86e, id=7, make=Ford, model=Bronco II, color=Red, mileage=0
-tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@97d026
-EM_AUTO:myorg.entitymgrex.Auto@aae86e, id=7, make=Ford, model=Bronco II, color=Red, mileage=0
add a testGetReference() to test the ability to get a reference to an object. With such a shallow object, this will act much like find().
@Test
public void testGetReference() {
logger.info("testGetReference");
Auto car = new Auto();
car.setMake("Ford");
car.setModel("Escort");
car.setColor("Red");
car.setMileage(0*1000);
logger.info("creating auto: {}", car);
em.persist(car);
//we need to associate the em with a transaction to get a
//primary key generated and assigned to the auto
em.getTransaction().begin();
em.getTransaction().commit();
Auto car2 = em.getReference(Auto.class, car.getId());
assertNotNull("car not found:" + car.getId(), car2);
logger.info("found car: {}", car2);
}
-testGetReference
-creating auto:myorg.entitymgrex.Auto@608760, id=0, make=Ford, model=Escort, color=Red, mileage=0
-found car:myorg.entitymgrex.Auto@608760, id=8, make=Ford, model=Escort, color=Red, mileage=0
-tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@157ea4a
-EM_AUTO:myorg.entitymgrex.Auto@608760, id=8, make=Ford, model=Escort, color=Red, mileage=0
add a testUpdate() method to test the ability to have the setter() of a managed object update the database.
@Test
public void testUpdate() {
logger.info("testUpdate");
Auto car = new Auto();
car.setMake("Pontiac");
car.setModel("Gran Am");
car.setColor("Red");
car.setMileage(0*1000);
logger.info("creating auto: {}", car);
em.persist(car);
//we need to associate the em with a transaction to get a
//primary key generated and assigned to the auto
em.getTransaction().begin();
em.getTransaction().commit();
for(int mileage=car.getMileage(); mileage<(100*1000); mileage+=20000) {
//here's where the update is done
car.setMileage(mileage);
//commit the update to the database for query
em.getTransaction().begin();
em.getTransaction().commit();
//inspect database for value
int value = getMileage(car.getId());
assertTrue("unexpected mileage:" + value, value == mileage);
logger.info("found mileage: {}", value);
}
}
private int getMileage(long id) {
Query query =
em.createQuery("select a.mileage from Auto as a where a.id=:pk");
query.setParameter("pk", id);
return (Integer)query.getSingleResult();
}
-testUpdate
-creating auto:myorg.entitymgrex.Auto@6a3960, id=0, make=Pontiac, model=Gran Am, color=Red, mileage=0
-found mileage:0
-found mileage:20000
-found mileage:40000
-found mileage:60000
-found mileage:80000
-EM_AUTO:myorg.entitymgrex.Auto@6a3960, id=9, make=Pontiac, model=Gran Am, color=Red, mileage=80000
add a testMerge() method to test the ability to perform updates based on the current values of a detached object. Note that we are using Java serialization to simulate sending a copy of the object to/from a remote process and then performing the merge based on the updated object.
@Test
public void testMerge() throws Exception {
logger.info("testMerge");
Auto car = new Auto();
car.setMake("Chrystler");
car.setModel("Concord");
car.setColor("Red");
car.setMileage(0*1000);
logger.info("creating auto: {}", car);
car = em.merge(car); //using merge to persist new
//we need to associate the em with a transaction to get a
//primary key generated and assigned to the auto
em.getTransaction().begin();
em.getTransaction().commit();
for(int mileage=(10*1000); mileage<(100*1000); mileage+=20000) {
//simulate sending to remote system for update
Auto car2 = updateMileage(car, mileage);
//verify the object is not being managed by the EM
assertFalse("object was managed", em.contains(car2));
assertTrue("object wasn't managed", em.contains(car));
assertTrue("mileage was same",
car.getMileage() != car2.getMileage());
//commit the update to the database for query
em.merge(car2);
assertTrue("car1 not merged:" + car.getMileage(),
car.getMileage() == mileage);
em.getTransaction().begin();
em.getTransaction().commit();
//inspect database for value
int value = getMileage(car.getId());
assertTrue("unexpected mileage:" + value, value == mileage);
logger.info("found mileage:" + value);
}
}
private Auto updateMileage(Auto car, int mileage) throws Exception {
//simulate sending the object to a remote system
ByteArrayOutputStream bos = new ByteArrayOutputStream();
ObjectOutputStream oos = new ObjectOutputStream(bos);
oos.writeObject(car);
oos.close();
//simulate receiving an update to the object from remote system
ByteArrayInputStream bis =
new ByteArrayInputStream(bos.toByteArray());
ObjectInputStream ois = new ObjectInputStream(bis);
Auto car2 = (Auto)ois.readObject();
ois.close();
//here's what they would have changed in remote process
car2.setMileage(mileage);
return car2;
}
-testMerge
-creating auto:myorg.entitymgrex.Auto@147358f, id=0, make=Chrystler, model=Concord, color=Red, mileage=0
-found mileage:10000
-found mileage:30000
-found mileage:50000
-found mileage:70000
-found mileage:90000
-tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@1b4c1d7
-EM_AUTO:myorg.entitymgrex.Auto@147358f, id=10, make=Chrystler, model=Concord, color=Red, mileage=90000
add a testRemove() method to verify that we can delete objects from the database.
@Test
public void testRemove() {
logger.info("testRemove");
Auto car = new Auto();
car.setMake("Jeep");
car.setModel("Cherokee");
car.setColor("Green");
car.setMileage(30*1000);
logger.info("creating auto: {}", car);
em.persist(car);
//we need to associate the em with a transaction to get a
//primary key generated and assigned to the auto
em.getTransaction().begin();
em.getTransaction().commit();
Auto car2 = em.find(Auto.class, car.getId());
assertNotNull("car not found:" + car.getId(), car2);
logger.info("found car: {}", car2);
//now remove the car
logger.info("removing car: {}", car);
em.remove(car);
//we need to associate the em with a transaction to
//physically remove from database
em.getTransaction().begin();
em.getTransaction().commit();
Auto car3 = em.find(Auto.class, car.getId());
assertNull("car found", car3);
}
-testRemove
-creating auto:myorg.entitymgrex.Auto@28305d, id=0, make=Jeep, model=Cherokee, color=Green, mileage=30000
-found car:myorg.entitymgrex.Auto@28305d, id=11, make=Jeep, model=Cherokee, color=Green, mileage=30000
-removing car:myorg.entitymgrex.Auto@28305d, id=11, make=Jeep, model=Cherokee, color=Green, mileage=30000
In this chapter you worked with several of the EntityManager CRUD methods within the context of a JUnit test case and Maven project. This should have given you a decent taste of what can be done with the JPA EntityManager and how to bring this all together within a Maven module. In the next chapter we will tack on a few more useful features to know.
In a previous chapter, you manually created a set of DDL files to create schema, delete rows from the schema in the database, and drop the schema from the database. Since your persistence provider knows how to work with schema, you can optionally have it generate the schema for you rather than writing it by hand. Even if you are working with legacy schema (and won't be changing the database), it is extremely helpful to see the persistence provider's version of the schema so you can quickly spot a mismatch in the mapping rather than waiting for runtime testing. To add schema generation to your projects you can use one of the following: runtime schema generation or compile-time schema generation. Runtime schema generation is fine for examples and small prototypes, but compile-time generation is suitable for more realistic development scenarios.
Runtime schema generation can be added to your project by adding the following property to your persistence-unit or hibernate.properties. Coldstart your database, comment out your SQL plugin, and re-run your tests if you want to verify that the property below creates the database at runtime.
#persistence.xml
<property name="hibernate.hbm2ddl.auto" value="create"/>
#hibernate.properties
hibernate.hbm2ddl.auto=create
A set of files for schema can be generated by adding a standard set of properties to the persistence.xml properties element.
<properties>
<property name="javax.persistence.schema-generation.scripts.action" value="drop-and-create"/>
<property name="javax.persistence.schema-generation.scripts.create-target" value="target/classes/ddl/entityMgrEx-JPAcreate.ddl"/>
<property name="javax.persistence.schema-generation.scripts.drop-target" value="target/classes/ddl/entityMgrEx-JPAdrop.ddl"/>
</properties>
With the above configuration in place, the persistence unit will create two files in the target/classes/ddl directory that represent the JPA provider's view of the mapping.
target/classes/ddl/
|-- emauto_create.ddl
|-- emauto_delete.ddl
|-- emauto_drop.ddl
|-- emauto_tuningadd.ddl
|-- emauto_tuningremove.ddl
|-- entityMgrEx-JPAcreate.ddl <== generated
`-- entityMgrEx-JPAdrop.ddl   <== by the persistence unit
The primary downfall of this approach is that the schema is generated too late in the build for the maven plugin to use it to populate the database, and the persistence unit will carry this drop-and-create behavior all the way into production.
Compile-time schema generation can be moved forward in the build cycle by instantiating the persistence unit twice; once in a small program designed only to generate schema and once for our unit tests. I have wrapped that small program in a Maven plugin which we can install in our pom. It offers some configuration options; however, since I wrote it for use with this course -- it pretty much does what we want without much configuration.
Add the following plugin definition to the pluginManagement section of your pom.xml. This will define the core behavior of the jpa-schemagen-maven-plugin to execute the generate goal. By default it executes during the process-test-classes phase.
<build>
<pluginManagement>
<plugins>
...
<plugin>
<groupId>info.ejava.utils.jpa</groupId>
<artifactId>jpa-schemagen-maven-plugin</artifactId>
<version>${ejava.version}</version>
<executions>
<execution>
<goals>
<goal>generate</goal>
</goals>
</execution>
</executions>
</plugin>
...
</plugins>
</pluginManagement>
</build>
Add the following active declaration to your pom to activate the plugin and fill in the module specifics. We could optionally add it to the database profiles.
...
</pluginManagement>
<plugins>
<plugin>
<artifactId>jpa-schemagen-maven-plugin</artifactId>
<groupId>info.ejava.utils.jpa</groupId>
<configuration>
<persistenceUnit>entityMgrEx</persistenceUnit>
</configuration>
</plugin>
</plugins>
</build>
Build your module and notice the generated DDL files.
$ mvn clean process-test-classes
...
[INFO] --- jpa-schemagen-maven-plugin:5.0.0-SNAPSHOT:generate (default) @ entityMgrEx ---
[INFO] Generating database schema for: entityMgrEx
[INFO] removing existing target file:/Users/jim/proj/784/entityMgrEx/target/classes/ddl/entityMgrEx-drop.ddl
[INFO] removing existing target file:/Users/jim/proj/784/entityMgrEx/target/classes/ddl/entityMgrEx-create.ddl
Aug 14, 2018 10:28:50 PM org.hibernate.jpa.internal.util.LogHelper logPersistenceUnitInformation
INFO: HHH000204: Processing PersistenceUnitInfo [
name: entityMgrEx
...]
Aug 14, 2018 10:28:50 PM org.hibernate.Version logVersion
...
INFO: HHH000476: Executing import script 'org.hibernate.tool.schema.internal.exec.ScriptSourceInputNonExistentImpl@10850d17'
...
target/classes/ddl/
|-- emauto_create.ddl
|-- emauto_delete.ddl
|-- emauto_drop.ddl
|-- emauto_tuningadd.ddl
|-- emauto_tuningremove.ddl
|-- entityMgrEx-JPAcreate.ddl <== generated thru
|-- entityMgrEx-JPAdrop.ddl <===== configuration in persistence.xml
|-- entityMgrEx-create.ddl <== generated thru
`-- entityMgrEx-drop.ddl <===== plugin we just added
(Optionally) update your SQL plugin definition added in the previous chapter to reference the dynamically generated schema in the target tree.
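This optional update can be sketched as follows -- a hedged example modeled on the create execution used earlier in this exercise. The target-tree basedir and include pattern below are assumptions based on the generated file names, not the plugin's required configuration:

```xml
<!-- hypothetical sketch: point the create execution at the
     plugin-generated DDL in the target tree instead of the
     hand-written files under src -->
<execution>
    <id>create-db-before-test</id>
    <phase>process-test-classes</phase>
    <goals>
        <goal>execute</goal>
    </goals>
    <configuration>
        <autocommit>true</autocommit>
        <fileset>
            <basedir>${basedir}/target/classes</basedir>
            <includes>
                <include>ddl/*create*.ddl</include>
            </includes>
        </fileset>
    </configuration>
</execution>
```

Since the plugin runs in process-test-classes, make sure the schema generation execution runs before the SQL executions that consume its output.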
(Optionally) update your persistence.xml to turn off schema generation from within all uses of the persistence unit.
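One hedged way to do this is to comment out the hbm2ddl property so the persistence unit never issues DDL on its own -- a sketch only; your property element may differ:

```xml
<!-- sketch: disable runtime schema generation in persistence.xml;
     rely on the build-time plugin and SQL scripts instead -->
<properties>
    <!-- <property name="hibernate.hbm2ddl.auto" value="create"/> -->
</properties>
```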
Eclipse will again report a plugin error within the pom.xml editor. Add the following definition to the lifecycle-mapping plugin to have the error ignored.
<pluginExecution>
<pluginExecutionFilter>
<groupId>info.ejava.utils.jpa</groupId>
<artifactId>jpa-schemagen-maven-plugin</artifactId>
<versionRange>[5.0.0-SNAPSHOT,)</versionRange>
<goals>
<goal>generate</goal>
</goals>
</pluginExecutionFilter>
<action>
<ignore/>
</action>
</pluginExecution>
In this chapter you configured a Maven project to create a set of file artifacts as a part of the build that represent what the persistence provider believes the database should look like. You can optionally use this directly as a part of your module's database schema population, use it as a starting reference to manually create schema, or *most important* gain insight into how the persistence provider has interpreted your persistence unit definition. This will spare you some ignorant bliss that is usually followed by hours of debugging an incorrect mapping.
Since you will likely have many JPA modules in your enterprise application, let's take a moment to break the current module into a parent and child before you quit. That way you can better visualize which parts are specific to the child module and which are reusable from a common parent.
Create a sibling module called ../jpa-parent
$ mkdir ../jpa-parent
Add the module definition (../jpa-parent/pom.xml)
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>myorg.jpa</groupId>
<artifactId>jpa-parent</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>pom</packaging>
<name>JPA Parent POM</name>
<description>
This parent pom is intended to provide common and re-usable
definitions and constructs across JPA projects.
</description>
</project>
Add the following parent declaration to your existing module. The relativePath is only useful if you find yourself changing the parent pom on a frequent basis. Otherwise the parent module can be found in the localRepository once it has been installed.
<parent>
<groupId>myorg.jpa</groupId>
<artifactId>jpa-parent</artifactId>
<version>1.0-SNAPSHOT</version>
<relativePath>../jpa-parent</relativePath>
</parent>
<groupId>myorg.jpa</groupId>
<artifactId>entityMgrEx-child</artifactId>
<name>Entity Manager Exercise</name>
Verify your project still builds. This will verify your relativePath is correct.
$ mvn clean verify
...
[INFO] BUILD SUCCESS
Move the following constructs from the entityMgrEx module to the jpa-parent module. These represent the *passive* definitions that will not directly impact the child module until the child requests that feature. Your child module should still have the same build and test functionality, except now it should look a little smaller. One could also make a case for moving some of the SQL/DDL script execution definitions to the parent as well -- which would make this module almost trivial in size.
properties
repositories
dependencyManagement
pluginManagement
select profiles
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>myorg.jpa</groupId>
<artifactId>jpa-parent</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>pom</packaging>
<name>JPA Parent POM</name>
<description>
This parent pom is intended to provide common and re-usable
definitions and constructs across JPA projects.
</description>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<java.source.version>1.8</java.source.version>
<java.target.version>1.8</java.target.version>
<jboss.host>localhost</jboss.host>
<db.host>${jboss.host}</db.host>
<maven-compiler-plugin.version>3.7.0</maven-compiler-plugin.version>
<maven-jar-plugin.version>3.1.0</maven-jar-plugin.version>
<maven-surefire-plugin.version>2.22.0</maven-surefire-plugin.version>
<sql-maven-plugin.version>1.5</sql-maven-plugin.version>
<h2db.version>1.4.197</h2db.version>
<javax.persistence-api.version>2.2</javax.persistence-api.version>
<hibernate-entitymanager.version>5.3.1.Final</hibernate-entitymanager.version>
<junit.version>4.12</junit.version>
<log4j.version>1.2.17</log4j.version>
<slf4j.version>1.7.25</slf4j.version>
<ejava.version>5.0.0-SNAPSHOT</ejava.version>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>javax.persistence</groupId>
<artifactId>javax.persistence-api</artifactId>
<version>${javax.persistence-api.version}</version>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<version>${hibernate-entitymanager.version}</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
</dependency>
</dependencies>
</dependencyManagement>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>${maven-compiler-plugin.version}</version>
<configuration>
<source>${java.source.version}</source>
<target>${java.target.version}</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>${maven-jar-plugin.version}</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>${maven-surefire-plugin.version}</version>
<configuration>
<argLine>${surefire.argLine}</argLine>
<systemPropertyVariables>
<property.name>value</property.name>
</systemPropertyVariables>
</configuration>
</plugin>
<plugin>
<groupId>info.ejava.utils.jpa</groupId>
<artifactId>jpa-schemagen-maven-plugin</artifactId>
<version>${ejava.version}</version>
<executions>
<execution>
<goals>
<goal>generate</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>sql-maven-plugin</artifactId>
<version>${sql-maven-plugin.version}</version>
<dependencies>
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>${h2db.version}</version>
</dependency>
</dependencies>
<configuration>
<username>${jdbc.user}</username>
<password>${jdbc.password}</password>
<driver>${jdbc.driver}</driver>
<url>${jdbc.url}</url>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
<profiles>
<profile> <!-- H2 server-based DB -->
<id>h2srv</id>
<properties>
<jdbc.driver>org.h2.Driver</jdbc.driver>
<jdbc.url>jdbc:h2:tcp://${db.host}:9092/./h2db/ejava</jdbc.url>
<jdbc.user>sa</jdbc.user>
<jdbc.password/>
<hibernate.dialect>org.hibernate.dialect.H2Dialect</hibernate.dialect>
</properties>
<dependencies>
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>${h2db.version}</version>
<scope>test</scope>
</dependency>
</dependencies>
</profile>
<profile> <!-- H2 file-based DB -->
<id>h2db</id>
<activation>
<property>
<name>!jdbcdb</name>
</property>
</activation>
<properties>
<jdbc.driver>org.h2.Driver</jdbc.driver>
<jdbc.url>jdbc:h2:${basedir}/target/h2db/ejava</jdbc.url>
<jdbc.user>sa</jdbc.user>
<jdbc.password/>
<hibernate.dialect>org.hibernate.dialect.H2Dialect</hibernate.dialect>
</properties>
<dependencies>
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>${h2db.version}</version>
<scope>test</scope>
</dependency>
</dependencies>
</profile>
<profile>
<id>testing</id>
<activation>
<property>
<name>!skipTests</name>
</property>
</activation>
<build>
<plugins>
<plugin>
<!-- runs schema against the DB -->
<groupId>org.codehaus.mojo</groupId>
<artifactId>sql-maven-plugin</artifactId>
<executions>
<!-- place execution elements here -->
<execution>
<id>drop-db-before-test</id>
<phase>process-test-classes</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<autocommit>true</autocommit>
<orderFile>decending</orderFile>
<fileset>
<basedir>${basedir}/src</basedir>
<includes>
<include>main/resources/ddl/**/*tuningremove*.ddl</include>
<include>main/resources/ddl/**/*drop*.ddl</include>
</includes>
</fileset>
<onError>continue</onError>
</configuration>
</execution>
<execution>
<id>create-db-before-test</id>
<phase>process-test-classes</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<autocommit>true</autocommit>
<orderFile>ascending</orderFile>
<fileset>
<basedir>${basedir}/src</basedir>
<includes>
<include>main/resources/ddl/**/*create*.ddl</include>
<include>main/resources/ddl/**/*tuningadd*.ddl</include>
</includes>
</fileset>
<print>true</print>
</configuration>
</execution>
<execution>
<id>populate-db-before-test</id>
<phase>process-test-classes</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<autocommit>true</autocommit>
<fileset>
<basedir>${basedir}/src</basedir>
<includes>
<include>test/resources/ddl/**/*populate*.ddl</include>
</includes>
</fileset>
</configuration>
</execution>
<!--
<execution>
<id>drop-db-after-test</id>
<phase>test</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<autocommit>true</autocommit>
<fileset>
<basedir>${basedir}/src</basedir>
<includes>
<include>main/resources/ddl/**/*drop*.ddl</include>
</includes>
</fileset>
</configuration>
</execution>
-->
</executions>
</plugin>
</plugins>
</build>
</profile>
<!-- tell Eclipse what to do with some of the plugins -->
<profile>
<id>m2e</id>
<activation>
<property>
<name>m2e.version</name>
</property>
</activation>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.eclipse.m2e</groupId>
<artifactId>lifecycle-mapping</artifactId>
<version>1.0.0</version>
<configuration>
<lifecycleMappingMetadata>
<pluginExecutions>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.codehaus.mojo</groupId>
<artifactId>sql-maven-plugin</artifactId>
<versionRange>[1.0.0,)</versionRange>
<goals>
<goal>execute</goal>
</goals>
</pluginExecutionFilter>
<action>
<ignore />
</action>
</pluginExecution>
</pluginExecutions>
</lifecycleMappingMetadata>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
</profile>
</profiles>
</project>
Leave the following in the child project. This is a collection of *active* project constructs.
plugins
dependencies
module-specific properties
profiles that declare plugins and dependencies
<?xml version="1.0"?>
<project
xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>myorg.jpa</groupId>
<artifactId>jpa-parent</artifactId>
<version>1.0-SNAPSHOT</version>
<relativePath>../jpa-parent</relativePath>
</parent>
<groupId>myorg.jpa</groupId>
<artifactId>entityMgrEx</artifactId>
<version>1.0-SNAPSHOT</version>
<name>Entity Manager Exercise</name>
<build>
<!-- filtering will replace URLs, credentials, etc in the
files copied to the target directory and used during testing.
-->
<testResources>
<testResource>
<directory>src/test/resources</directory>
<filtering>true</filtering>
</testResource>
</testResources>
<plugins>
<plugin>
<artifactId>jpa-schemagen-maven-plugin</artifactId>
<groupId>info.ejava.utils.jpa</groupId>
<configuration>
<persistenceUnit>entityMgrEx</persistenceUnit>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>javax.persistence</groupId>
<artifactId>javax.persistence-api</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<profiles>
<profile>
<id>testing</id>
<activation>
<property>
<name>!skipTests</name>
</property>
</activation>
<build>
<plugins>
<plugin>
<!-- runs schema against the DB -->
<groupId>org.codehaus.mojo</groupId>
<artifactId>sql-maven-plugin</artifactId>
<executions>
<!-- place execution elements here -->
<execution>
<id>drop-db-before-test</id>
<phase>process-test-classes</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<autocommit>true</autocommit>
<orderFile>decending</orderFile>
<fileset>
<basedir>${basedir}/src</basedir>
<includes>
<include>main/resources/ddl/**/*tuningremove*.ddl</include>
<include>main/resources/ddl/**/*drop*.ddl</include>
</includes>
</fileset>
<onError>continue</onError>
</configuration>
</execution>
<execution>
<id>create-db-before-test</id>
<phase>process-test-classes</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<autocommit>true</autocommit>
<orderFile>ascending</orderFile>
<fileset>
<basedir>${basedir}/src</basedir>
<includes>
<include>main/resources/ddl/**/*create*.ddl</include>
<include>main/resources/ddl/**/*tuningadd*.ddl</include>
</includes>
</fileset>
<print>true</print>
</configuration>
</execution>
<execution>
<id>populate-db-before-test</id>
<phase>process-test-classes</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<autocommit>true</autocommit>
<fileset>
<basedir>${basedir}/src</basedir>
<includes>
<include>test/resources/ddl/**/*populate*.ddl</include>
</includes>
</fileset>
</configuration>
</execution>
<!--
<execution>
<id>drop-db-after-test</id>
<phase>test</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<autocommit>true</autocommit>
<fileset>
<basedir>${basedir}/src</basedir>
<includes>
<include>main/resources/ddl/**/*drop*.ddl</include>
</includes>
</fileset>
</configuration>
</execution>
-->
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
</project>
Verify your project still builds. This will verify your relativePath is correct.
$ mvn clean verify
...
[INFO] BUILD SUCCESS
Optionally change your jpa-parent reference to the class examples' base parent project.
<parent>
<groupId>info.ejava.examples.build</groupId>
<artifactId>dependencies</artifactId>
<version>x.x.x-SNAPSHOT</version>
<relativePath>build/dependencies/pom.xml</relativePath>
</parent>
Replace x.x.x-SNAPSHOT with the correct version for class.
It is never a good idea to declare *active* POM constructs in a parent of a multi-module project unless *ALL* child modules serve the same purpose. Strive for parent Maven projects to define standards to follow without inserting unnecessary dependencies or other constructs.
In this chapter you re-factored your module into a reusable jpa-parent module (that could be replaced by the class example) and a child module. The child is then freed of defining version numbers and other constructs that can be shared across several modules -- allowing smaller child POMs to be created through templates that produce modules extending the jpa-parent. Cool. You didn't want to do that again.
You have now finished all chapters of the Maven and JPA/EntityManager exercise that was geared at getting you started developing modules that implement the data tier.
Copyright © 2019 jim stafford (jim.stafford@jhu.edu)
Built on: 2019-08-22 07:09 EST
Abstract
This document contains a series of exercises for mapping Java classes (without relationships) to the database using JPA. It covers many of the core and corner mapping issues and demonstrates various issues that come up.
Table of Contents
To provide hands on experience
Defining JPA entity classes and mapping them to a relational database schema
Using JPA annotations and XML descriptors
Defining primary key mechanisms for entities
Defining alternate entity/table mappings that do not require JPA relationships
At the completion of this exercise, the student will be able to
Create a legal POJO to be used as a JPA entity
Map POJO classes and properties to database tables and columns
Define entity mappings with Java annotations and XML descriptors
Define different primary key strategies
Define multi-table/single-class joins
Define embedded object mappings
Generate a database schema based on their entity definitions
The project created from this archetype has two mechanisms for managing schema: maven plugin-based and EntityManager-based. Know that when you start this exercise the two mechanisms may both be activated and competing (last one wins). One type may be better for some uses while the other may be more helpful in others. Know which is best to use in which circumstance, and which one is active or turned off, when encountering a database setup issue.
The EntityManager mechanism is activated and deactivated through the hibernate.hbm2ddl.auto property in src/test/resources/hibernate.properties. Set it to create to turn it on, or comment it out to turn it off.
# src/test/resources/hibernate.properties
...
#hibernate.hbm2ddl.auto=create
The maven plugin mechanism is activated by the sql-maven-plugin within the local pom.xml. You can either comment out the entire plugin, or comment out or mangle key sections of the plugin definition, in order to turn off this capability.
# pom.xml
<!-- runs schema against the DB
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>sql-maven-plugin</artifactId>
    ...
</plugin>
-->
The EntityManager technique is useful when making quick schema changes to your entities within Eclipse but not as good if you want to preserve data for analysis. The maven plugin technique is useful when you commonly build from maven after making entity changes. It is also useful in generating schema that can be analyzed after the build is complete.
Add the following to your .m2/settings.xml file. This will allow you to resolve the exercise archetype and set a default database for the exercise.
<profiles>
    <profile>
        <id>webdev-repositories</id>
        <repositories>
            <repository>
                <id>webdev</id>
                <name>ejava webdev repository</name>
                <url>http://webdev.jhuep.com/~jcs/maven2</url>
                <releases>
                    <enabled>true</enabled>
                    <updatePolicy>never</updatePolicy>
                </releases>
                <snapshots>
                    <enabled>false</enabled>
                </snapshots>
            </repository>
            <repository>
                <id>webdev-snapshot</id>
                <name>ejava webdev snapshot repository</name>
                <url>http://webdev.jhuep.com/~jcs/maven2-snapshot</url>
                <releases>
                    <enabled>false</enabled>
                </releases>
                <snapshots>
                    <enabled>true</enabled>
                    <updatePolicy>daily</updatePolicy>
                </snapshots>
            </repository>
        </repositories>
    </profile>
</profiles>

<activeProfiles>
    <activeProfile>h2db</activeProfile>
    <!-- <activeProfile>h2srv</activeProfile> -->
</activeProfiles>
Use the info.ejava.examples.jpa:jpa-archetype to set up a new Maven project for this exercise. Activate the webdev-repositories profile (-Pwebdev-repositories) so that you can resolve the archetype off the Internet. The following should be run outside of the class example tree.
$ mvn archetype:generate -B -DarchetypeGroupId=info.ejava.examples.jpa -DarchetypeArtifactId=jpa-archetype -DarchetypeVersion=5.0.0-SNAPSHOT -DgroupId=myorg.entityex -DartifactId=entityEx -Pwebdev-repositories
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Maven Stub Project (No POM) 1
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] >>> maven-archetype-plugin:3.0.1:generate (default-cli) > generate-sources @ standalone-pom >>>
[INFO]
[INFO] <<< maven-archetype-plugin:3.0.1:generate (default-cli) < generate-sources @ standalone-pom <<<
[INFO]
[INFO]
[INFO] --- maven-archetype-plugin:3.0.1:generate (default-cli) @ standalone-pom ---
[INFO] Generating project in Batch mode
[WARNING] Archetype not found in any catalog. Falling back to central repository.
[WARNING] Add a repsoitory with id 'archetype' in your settings.xml if archetype's repository is elsewhere.
[INFO] ----------------------------------------------------------------------------
[INFO] Using following parameters for creating project from Archetype: jpa-archetype:5.0.0-SNAPSHOT
[INFO] ----------------------------------------------------------------------------
[INFO] Parameter: groupId, Value: myorg.entityex
[INFO] Parameter: artifactId, Value: entityEx
[INFO] Parameter: version, Value: 1.0-SNAPSHOT
[INFO] Parameter: package, Value: myorg.entityex
[INFO] Parameter: packageInPathFormat, Value: myorg/entityex
[INFO] Parameter: version, Value: 1.0-SNAPSHOT
[INFO] Parameter: package, Value: myorg.entityex
[INFO] Parameter: groupId, Value: myorg.entityex
[INFO] Parameter: artifactId, Value: entityEx
[INFO] Project created from Archetype in dir: .../proj/784/entityEx
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 20.490 s
[INFO] Finished at: 2018-08-17T19:59:39-04:00
[INFO] Final Memory: 18M/317M
[INFO] ------------------------------------------------------------------------
You should now have an instantiated template for a JPA project
entityEx/
├── pom.xml
└── src
    ├── main
    │   └── java
    │       └── myorg
    │           └── entityex
    │               └── Auto.java
    └── test
        ├── java
        │   └── myorg
        │       └── entityex
        │           └── AutoTest.java
        └── resources
            ├── hibernate.properties
            ├── log4j.xml
            └── META-INF
                └── persistence.xml
Verify the instantiated template builds in your environment
Activate the h2db profile (and deactivate the h2srv profile) to use an embedded file as your database. This option does not require a server but makes it harder to inspect database state in between tests. Remember that the "!" character must be escaped ("\!") for bash shells.
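The "!" escaping mentioned above can also be handled with quoting. This small sketch just echoes the candidate command lines so you can see the literal string the shell would hand to mvn (profile names match this exercise; nothing is actually built):

```shell
# bash history expansion can mangle a bare "!", so escape or quote it.
# Both forms below pass the literal text -P!h2srv through unchanged:
echo mvn clean test -Ph2db -P\!h2srv
echo mvn clean test -Ph2db -P'!h2srv'
```

In non-interactive scripts history expansion is off by default, but escaping keeps the command safe to paste into an interactive shell.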
entityEx> mvn clean test -Ph2db -P\!h2srv
...
-HHH10001005: using driver [org.h2.Driver] at URL [jdbc:h2:/Users/jim/proj/784/entityEx/target/h2db/ejava]
...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
Start your database server
$ java -jar M2_REPO/com/h2database/h2/1.4.197/h2-1.4.197.jar
Activate the h2srv profile (and deactivate the h2db profile) to use a running H2 database server. This option provides more interaction with your database but does require the server to be running.
entityEx> mvn clean test -P\!h2db -Ph2srv
...
-HHH10001005: using driver [org.h2.Driver] at URL [jdbc:h2:tcp://127.0.0.1:9092/./h2db/ejava]
...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
You may now import the instantiated template into Eclipse as an "Existing Maven Project"
This chapter will take you through the steps to register a Java POJO with the JPA persistence unit using both orm.xml mapping-file descriptors and Java class annotations. It will also take you through the steps to define a POJO class that is legal to use as a JPA entity class.
JPA entity classes are required to ...
Be identified as being a JPA entity class
Have a non-private default constructor
Have at least one property defined as the primary key
Create a POJO Java class in the ...mapped Java package
package myorg.entityex.mapped;
import java.util.Date;
public class Animal {
private int id;
private String name;
private Date dob;
private double weight;
public Animal(String name, Date dob, double weight) {
this.name = name;
this.dob = dob;
this.weight = weight;
}
public int getId() { return id; }
public void setId(int id) {
this.id = id;
}
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
public Date getDob() { return dob; }
public void setDob(Date dob) {
this.dob = dob;
}
public double getWeight() { return weight; }
public void setWeight(double weight) {
this.weight = weight;
}
}
Copy the existing AutoTest.java to AnimalTest.java and remove (or ignore) references to the Auto class from AnimalTest.java
Attempt to persist the Animal by adding the following @Test method to the AnimalTest.java JUnit class.
# src/test/java/myorg/entityex/AnimalTest.java
@Test
public void testCreateAnimal() {
logger.info("testCreateAnimal");
Animal animal = new Animal("bessie",
new GregorianCalendar(1960, 1, 1).getTime(), 1400.2);
em.persist(animal);
assertNotNull("animal not found", em.find(Animal.class,animal.getId()));
}
Attempt to build and run your test. Your test should fail with the following error message. This means that although your class is a valid Java POJO, it has not been made known to the persistence unit as a JPA entity.
testCreateAnimal(myorg.entityex.AutoTest): Unknown entity: myorg.entityex.mapped.Animal
...
java.lang.IllegalArgumentException: Unknown entity: myorg.entityex.mapped.Animal
    at org.hibernate.ejb.AbstractEntityManagerImpl.persist(AbstractEntityManagerImpl.java:856)
    at myorg.entityex.AutoTest.testCreateAnimal(AutoTest.java:100)
Add the POJO class to the persistence unit by adding an orm.xml JPA mapping file to your project. Place the file in the src/main/resources/orm directory.
# src/main/resources/orm/Animal-orm.xml
<?xml version="1.0" encoding="UTF-8"?>
<entity-mappings xmlns="http://java.sun.com/xml/ns/persistence/orm"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://java.sun.com/xml/ns/persistence/orm http://java.sun.com/xml/ns/persistence/orm_2_0.xsd" version="2.0">
<entity class="myorg.entityex.mapped.Animal"/>
</entity-mappings>
Register the orm.xml file with the persistence unit by adding a mapping-file element reference.
# src/test/resources/META-INF/persistence.xml
<persistence-unit name="entityEx-test">
<provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>
<mapping-file>orm/Animal-orm.xml</mapping-file>
<class>myorg.entityex.Auto</class>
<properties>
...
Attempt to build and run your test. Your test should fail with the following error message. The specifics of the error message will depend upon whether you are running just the JUnit test or building within Maven since the pom is configured to build database schema from the JPA mappings prior to running the JUnit test.
PersistenceUnit: entityEx-test] Unable to configure EntityManagerFactory: No identifier specified for entity: myorg.entityex.mapped.Animal
Caused by: org.hibernate.AnnotationException: No identifier specified for entity: myorg.entityex.mapped.Animal
Although the class is a valid POJO and we followed the deployment descriptor mechanism for registering it with the persistence unit, it is not a legal entity. The error message indicates it is lacking a primary key field.
Update the orm.xml file and define the "id" column as the primary key property for the entity.
<entity class="myorg.entityex.mapped.Animal">
<attributes>
<id name="id"/>
</attributes>
</entity>
Rebuild your module and it should now persist the POJO as a JPA entity. The SQL should be printed in the debug output.
$ mvn clean test
...
Hibernate: insert into Animal (dob, name, weight, id) values (?, ?, ?, ?)
-tearDown() complete, em=org.hibernate.ejb.EntityManagerImpl@12a80ea3
-closing entity manager factory
-HHH000030: Cleaning up connection pool [jdbc:h2:/home/jcstaff/workspaces/ejava-javaee/git/jpa/jpa-entity/entityEx/target/h2db/ejava]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.94 sec

Results :

Tests run: 2, Failures: 0, Errors: 0, Skipped: 0
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
Update your JUnit test method to look like the following. The unit test now clears the cache of entities and forces the entity manager to instantiate a new instance for the value returned from the find().
@Test
public void testCreateAnimal() {
logger.info("testCreateAnimal");
Animal animal = new Animal("bessie",
new GregorianCalendar(1960, 1, 1).getTime(), 1400.2);
em.persist(animal);
assertNotNull("animal not found", em.find(Animal.class,animal.getId()));
em.flush(); //make sure all writes were issued to DB
em.clear(); //purge the local entity manager entity cache to cause new instance
assertNotNull("animal not found", em.find(Animal.class,animal.getId()));
}
Attempt to rebuild your module. It should fail because the entity class does not have a default constructor. Remember that default constructors are provided for free in POJOs until you add the first constructor. Once you add a custom constructor you are required to add a default constructor to make it a legal entity class.
javax.persistence.PersistenceException: org.hibernate.InstantiationException: No default constructor for entity: myorg.entityex.mapped.Animal
Update the POJO with a default constructor.
public Animal() {} //must have default ctor
public Animal(String name, Date dob, double weight) {
this.name = name;
this.dob = dob;
this.weight = weight;
}
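The reason providers insist on a no-arg constructor is that they instantiate entities reflectively before populating their state from the database row. The sketch below illustrates that pattern in plain Java; the Pojo class here is a hypothetical stand-in for an entity, not part of the exercise source:

```java
import java.lang.reflect.Constructor;

public class ReflectiveInstantiation {
    // Hypothetical POJO standing in for an entity class. Without the
    // no-arg constructor, getDeclaredConstructor() below would throw
    // NoSuchMethodException -- the same root cause Hibernate reports.
    static class Pojo {
        private String name;
        public Pojo() {}                          // required by the provider
        public Pojo(String name) { this.name = name; }
        public void setName(String name) { this.name = name; }
        public String getName() { return name; }
    }

    public static void main(String[] args) throws Exception {
        // providers locate the no-arg constructor reflectively...
        Constructor<Pojo> ctor = Pojo.class.getDeclaredConstructor();
        ctor.setAccessible(true);
        // ...instantiate the entity...
        Pojo p = ctor.newInstance();
        // ...then populate its state from the database row
        p.setName("bessie");
        System.out.println(p.getName());
    }
}
```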
Rebuild the module. It should now pass because you have defined and registered a compliant entity class. The class was
Registered as an entity using an orm.xml deployment descriptor wired into the persistence unit through a mapping-file reference in the persistence.xml
Assigned an identity property to use for a primary key. The Java "id" property existed from the start, but the property had to be identified to JPA.
Provided with a default constructor. If we removed the custom constructor -- we would get this constructor for free.
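Putting the checklist together, the mapped POJO now looks roughly like the sketch below. The field names come from the exercise; the accessor bodies and the package declaration are assumed for illustration:

```java
import java.util.Date;
import java.util.GregorianCalendar;

// Plain POJO registered as an entity purely through orm/Animal-orm.xml --
// note there are no JPA annotations on the class itself.
public class Animal {
    private int id;        // identified to JPA as the primary key via <id name="id"/>
    private String name;
    private Date dob;
    private double weight;

    public Animal() {}     // default ctor required once a custom ctor exists
    public Animal(String name, Date dob, double weight) {
        this.name = name;
        this.dob = dob;
        this.weight = weight;
    }

    public int getId() { return id; }
    public String getName() { return name; }
    public Date getDob() { return dob; }
    public double getWeight() { return weight; }

    public static void main(String[] args) {
        Animal a = new Animal("bessie",
                new GregorianCalendar(1960, 1, 1).getTime(), 1400.2);
        System.out.println(a.getName() + " " + a.getWeight());
    }
}
```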
Copy the POJO class to a new java package and class name (Animal2).
package myorg.entityex.annotated;
import java.util.Date;
public class Animal2 {
private int id;
private String name;
private Date dob;
private double weight;
public Animal2() {} //must have default ctor
...
}
Add a javax.persistence.Entity annotation to the class
import javax.persistence.Entity;
@javax.persistence.Entity
public class Animal2 {
Register the new entity with the persistence.xml using a class element reference
<persistence-unit name="entityEx-test">
<provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>
<mapping-file>orm/Animal-orm.xml</mapping-file>
<class>myorg.entityex.Auto</class>
<class>myorg.entityex.annotated.Animal2</class>
<properties>
Classes annotated with @Entity are automatically located when placed in the same archive as the persistence.xml or correctly referenced by a jarfile element. However, since this exercise has placed the persistence.xml outside of the src/test tree with the @Entity classes, we must define them in the persistence unit. Note how the current structure looks -- the persistence.xml and Animal2 class are in what is considered separate archives.
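For reference, the jar-file element mentioned above looks like the sketch below. The path is relative to the persistence unit root, and the jar name shown is a hypothetical example, not part of this exercise:

```xml
<persistence-unit name="entityEx-test">
    <provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>
    <!-- references another archive containing annotated @Entity classes;
         the jar name is a hypothetical example -->
    <jar-file>lib/entityEx-classes.jar</jar-file>
    ...
</persistence-unit>
```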
$ tree target/*classes
target/classes
├── ddl
├── myorg
│   └── entityex
│       ├── annotated
│       │   └── Animal2.class
│       ├── Auto.class
│       └── mapped
│           └── Animal.class
└── orm
    └── Animal-orm.xml
target/test-classes
├── hibernate.properties
├── log4j.xml
├── META-INF
│   └── persistence.xml
└── myorg
    └── entityex
        └── AutoTest.class
Add a new test method to work with the new class added to the module.
@Test
public void testCreateAnimalAnnotated() {
logger.info("testCreateAnimalAnnotated");
myorg.entityex.annotated.Animal2 animal = new myorg.entityex.annotated.Animal2("bessie",
new GregorianCalendar(1960, 1, 1).getTime(), 1400.2);
em.persist(animal);
assertNotNull("animal not found", em.find(myorg.entityex.annotated.Animal2.class,animal.getId()));
em.flush(); //make sure all writes were issued to DB
em.clear(); //purge the local entity manager entity cache to cause new instance
assertNotNull("animal not found", em.find(myorg.entityex.annotated.Animal2.class,animal.getId()));
If you get a very generic error message like "PersistenceException: No Persistence provider for EntityManager named entityEx-test" within Eclipse when running JUnit -- build from the command line to help see errors produced by Hibernate. If you do not see valuable error messages -- try turning up the verbosity on "org.hibernate" or looking in the target/log4j-out.txt file.
Attempt to build/run your module at this point. You should get a familiar error about Animal2 not having an identifier.
Unable to configure EntityManagerFactory: No identifier specified for entity: myorg.entityex.annotated.Animal2
Since we want to use annotations for the new class, fix the issue by adding a @javax.persistence.Id annotation to the id attribute. This is called FIELD access in JPA. You can alternately use PROPERTY access by moving the annotation to the getId() method.
@javax.persistence.Id
private int id;
Re-run your test. It should succeed this time.
$ mvn clean test
...
[INFO] BUILD SUCCESS
...
If you would like to observe the data in the database, do two things
Turn off the hibernate.hbm2ddl.auto=create option by commenting it out in hibernate.properties
$ more src/test/resources/hibernate.properties
hibernate.dialect=${hibernate.dialect}
hibernate.connection.url=${jdbc.url}
hibernate.connection.driver_class=${jdbc.driver}
hibernate.connection.password=${jdbc.password}
hibernate.connection.username=${jdbc.user}
#hibernate.hbm2ddl.auto=create
hibernate.show_sql=true
hibernate.format_sql=true
#hibernate.jdbc.batch_size=0
Make sure you are running with the h2srv (server) profile and have the server running. The easiest way to run with the server profile within Eclipse is to choose it as one of your active profiles in .m2/settings.xml and have Eclipse re-read your settings.xml
# start server
$ java -jar (localRepository)/com/h2database/h2/1.4.197/h2-1.4.197.jar

# .m2/settings.xml
<activeProfiles>
    <!-- <activeProfile>h2db</activeProfile> -->
    <activeProfile>h2srv</activeProfile>
</activeProfiles>

# or manually from command line
$ mvn clean test -P\!h2db -Ph2srv
Type the following command in the H2 browser UI
SELECT * FROM ANIMAL2;

ID  DOB                    NAME    WEIGHT
0   1960-02-01 00:00:00.0  bessie  1400.2
If you do not see your EM_AUTO, ANIMAL, and ANIMAL2 tables in the UI, verify the URL used by the application and the UI. They must be the same.
In this chapter we will create custom class/database mappings for some class properties
Map a class to a specific table
Map a property to a specific column
Define constraints for properties
Take a look at using getters and setters
Copy your Animal.java class to Cat.java
package myorg.entityex.mapped;
import java.util.Date;
public class Cat {
private int id;
private String name;
private Date dob;
private double weight;
public Cat() {} //must have default ctor
public Cat(String name, Date dob, double weight) {
this.name = name;
this.dob = dob;
this.weight = weight;
}
public int getId() { return id; }
...
Copy your Animal2.java class to Cat2.java
package myorg.entityex.annotated;
import java.util.Date;
@javax.persistence.Entity
public class Cat2 {
private int id;
private String name;
private Date dob;
private double weight;
public Cat2() {} //must have default ctor
public Cat2(String name, Date dob, double weight) {
this.name = name;
this.dob = dob;
this.weight = weight;
}
@javax.persistence.Id
public int getId() { return id; }
...
Name the new Cat entity class in the Animal-orm.xml
# src/main/resources/orm/Animal-orm.xml
<entity class="myorg.entityex.mapped.Animal">
...
<entity class="myorg.entityex.mapped.Cat">
<attributes>
<id name="id"/>
</attributes>
</entity>
Name the new Cat2 entity class in the persistence.xml
# src/test/resources/META-INF/persistence.xml
<mapping-file>orm/Animal-orm.xml</mapping-file>
<class>myorg.entityex.Auto</class>
<class>myorg.entityex.annotated.Animal2</class>
<class>myorg.entityex.annotated.Cat2</class>
Rebuild your module from the command line and observe the schema generated for Cat and Cat2. Notice that the JPA provider used the class name as the default entity name and will attempt to map the entity to a database table by the same name as the entity.
$ more target/classes/ddl/*
...
    create table Cat (
        id integer not null,
        dob timestamp,
        name varchar(255),
        weight double not null,
        primary key (id)
    );
    create table Cat2 (
        id integer not null,
        dob timestamp,
        name varchar(255),
        weight double not null,
        primary key (id)
    );
Add a table element to the orm.xml definition to map Cat to the ENTITYEX_CAT table.
<entity class="myorg.entityex.mapped.Cat">
<table name="ENTITYEX_CAT"/>
<attributes>
Add a @javax.persistence.Table annotation to the Cat2 class to map instances to the ENTITYEX_CAT table.
@javax.persistence.Entity
@javax.persistence.Table(name="ENTITYEX_CAT")
public class Cat2 {
private int id;
Rebuild your module from the command line and observe the schema generated for Cat and Cat2. Notice that we have now mapped two entity classes to the same table using a custom table name.
$ more target/classes/ddl/*
...
    create table ENTITYEX_CAT (
        id integer not null,
        dob timestamp,
        name varchar(255),
        weight double not null,
        primary key (id)
    );
Map the id property for both the Cat and Cat2 to the CAT_ID column. Also have the persistence provider automatically generate a value for the primary key during the persist(). A later exercise will go into generated primary key types in more detail.
@javax.persistence.Id
@javax.persistence.Column(name="CAT_ID")
@javax.persistence.GeneratedValue
private int id;
<entity class="myorg.entityex.mapped.Cat">
<table name="ENTITYEX_CAT"/>
<attributes>
<id name="id">
<column name="CAT_ID"/>
<generated-value/>
</id>
</attributes>
</entity>
Make the name column mandatory (nullable=false) and define the length of the string to be 20 characters. Note that these property assignments are mainly useful for documentation and for generating schema. Many of the column properties are not used at runtime by the provider.
@javax.persistence.Column(nullable=false, length=20)
private String name;
<basic name="name">
<column nullable="false" length="20"/>
</basic>
Have the weight column stored with a precision of 3 digits, with 1 digit (scale) to the right of the decimal place. You will need to change the datatype of the mapped property to BigDecimal to fully leverage this capability.
# src/main/java/myorg/entityex/annotated/Cat2.java
@javax.persistence.Column(precision=3, scale=1) //10.2lbs
private BigDecimal weight;
...
public Cat2(String name, Date dob, BigDecimal weight) {
...
public BigDecimal getWeight() { return weight; }
public void setWeight(BigDecimal weight) {
# src/main/java/myorg/entityex/mapped/Cat.java
private BigDecimal weight;
...
public Cat(String name, Date dob, BigDecimal weight) {
...
public BigDecimal getWeight() { return weight; }
public void setWeight(BigDecimal weight) {
# src/main/resources/orm/Animal-orm.xml
<basic name="weight">
<column precision="3" scale="1"/>
</basic>
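The precision=3/scale=1 specification means at most 3 significant digits total, with 1 digit to the right of the decimal point, so the column can hold values up to 99.9. The scale behavior can be sanity-checked in plain Java:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class PrecisionScaleDemo {
    public static void main(String[] args) {
        // scale=1 keeps one digit right of the decimal point;
        // a decimal(3,1) column can hold values from -99.9 to 99.9
        BigDecimal weight = new BigDecimal("10.25").setScale(1, RoundingMode.HALF_UP);
        System.out.println(weight);            // 10.3 -- fits precision 3, scale 1

        // 1400.2 needs 5 significant digits and would overflow a decimal(3,1) column
        BigDecimal tooBig = new BigDecimal("1400.2");
        System.out.println(tooBig.precision() + "," + tooBig.scale());  // 5,1
    }
}
```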
Rebuild the module from the command line and observe the database schema generated for the ENTITYEX_CAT table.
# target/classes/ddl/entityEx-createJPA.ddl
    create table ENTITYEX_CAT (
        CAT_ID integer generated by default as identity,
        dob date,
        name varchar(20) not null,
        weight decimal(3,1),
        primary key (CAT_ID)
    );
Notice how
All defaults not overwritten are preserved (e.g., column names)
The Cat and Cat2 entities have been mapped to the ENTITYEX_CAT table.
The id property has been mapped to the CAT_ID column
The id property will have a unique value automatically generated and assigned
The id and name properties are required (i.e., "not null")
The dob and weight properties continue to be optional since that is the default and not overridden
The weight property will be stored with 3 digits and one of those digits is to the right of the decimal place.
In the above example, you used FIELD access to the property values. This is the preferred method if your business object attributes provide an accurate representation of what should be stored in the database. FIELD access was chosen by the provider because our annotated class placed the @Id annotation on a Java field and not on a Java getter().
# implies FIELD access
@javax.persistence.Id
@javax.persistence.Column(name="CAT_ID")
@javax.persistence.GeneratedValue
private int id;
...
public int getId() { return id; }
If we had moved the @Id property definitions to the getter(), then the access would have been switched to PROPERTY. That was how JPA 1.0 annotated classes worked: it was always one way or the other.
# implies PROPERTY access
private int id;
...
@javax.persistence.Id
@javax.persistence.Column(name="CAT_ID")
@javax.persistence.GeneratedValue
public int getId() { return id; }
Since it was always one way or the other with JPA 1.0, the specification in the orm.xml file was placed on the root element of the entity
<entity class="myorg.entityex.mapped.Cat"
access="FIELD">
Starting with JPA 2.0, we can also make the specification more explicit (like the XML technique) with the addition of the @Access annotation
@javax.persistence.Access(javax.persistence.AccessType.FIELD)
public class Cat2 {
Although switching between FIELD and PROPERTY access was always a capability in JPA 1.0 -- JPA 2.0 added the ability to choose on a per-property basis. This is done by applying the @Access annotation to the getter() you want to have property access. In this section, we will continue to expose all our properties to the provider through FIELD access, but define a PROPERTY access for the "weight" property.
Update the annotated Cat2 entity to store weight as a double and expose it to the provider as a BigDecimal.
private double weight;
...
@javax.persistence.Column(precision=3, scale=1) //10.2lbs
@javax.persistence.Access(javax.persistence.AccessType.PROPERTY)
public BigDecimal getWeight() {
return new BigDecimal(weight);
}
public void setWeight(BigDecimal weight) {
this.weight = weight==null ? 0 : weight.doubleValue();
}
Add a logger and some log statements to help identify the calls to the getter and setter methods
# src/main/java/myorg/entityex/annotated/Cat2.java
private static final Log logger = LogFactory.getLog(Cat2.class);
...
public BigDecimal getWeight() {
logger.debug("annotated.getWeight()");
return new BigDecimal(weight);
}
public void setWeight(BigDecimal weight) {
logger.debug("annotated.setWeight()");
this.weight = weight==null ? 0 : weight.doubleValue();
}
Add the following test method to your AnimalTest. By persisting the entity, we will force the provider to get properties from the entity. By detaching the entity from the persistence context prior to executing the find, we will force the provider to instantiate a new entity instance and set the properties within the entity.
# src/test/java/myorg/entityex/AnimalTest.java
@Test
public void testCreateCatAnnotated() {
logger.info("testCreateCatAnnotated");
myorg.entityex.annotated.Cat2 cat = new myorg.entityex.annotated.Cat2("fluffy", null, 99.9);
em.persist(cat); //get provider to call getters
em.flush(); em.detach(cat);
cat = em.find(myorg.entityex.annotated.Cat2.class, cat.getId()); //get provider to call setters
}
Run your new test method and observe the calls to getWeight and setWeight printed.
$ mvn clean test -Dtest=myorg.entityex.AnimalTest#testCreateCatAnnotated
...
-testCreateCatAnnotated
-annotated.getWeight() //<----------------
Hibernate:
insert
into
ENTITYEX_CAT
(CAT_ID, dob, name, weight)
values
(null, ?, ?, ?)
-annotated.getWeight() //<----------------
-annotated.getWeight() //<----------------
Hibernate:
select
cat2x0_.CAT_ID as CAT1_2_0_,
cat2x0_.dob as dob2_0_,
cat2x0_.name as name2_0_,
cat2x0_.weight as weight2_0_
from
ENTITYEX_CAT cat2x0_
where
cat2x0_.CAT_ID=?
-annotated.setWeight() //<----------------
Make the same code changes for weight and debugging in the mapped entity class. These changes expose weight to the provider as a different type than it is stored locally. The debug statements will help us track the calls to the getter and setter.
# src/main/java/myorg/entityex/mapped/Cat.java
public class Cat {
private static final Log logger = LogFactory.getLog(Cat.class);
...
private double weight;
...
public Cat(String name, Date dob, double weight) {
...
public BigDecimal getWeight() {
logger.debug("mapped.getWeight()");
return new BigDecimal(weight);
}
public void setWeight(BigDecimal weight) {
logger.debug("mapped.setWeight()");
this.weight = weight==null ? 0 : weight.doubleValue();
}
Add the following test method to your JUnit class.
@Test
public void testCreateCatMapped() {
logger.info("testCreateCatMapped");
myorg.entityex.mapped.Cat cat = new myorg.entityex.mapped.Cat("fluffy", null, 99.9);
em.persist(cat); //get provider to call getters
em.flush(); em.detach(cat);
cat = em.find(myorg.entityex.mapped.Cat.class, cat.getId()); //get provider to call setters
}
Add the access="PROPERTY" to the weight definition within the orm.xml
<basic name="weight" access="PROPERTY">
<column precision="3" scale="1"/>
</basic>
I had to remove the access="FIELD" attribute from the entity element for the provider to honor the per-property specification at the weight level.
Build and run the test for the mapped version of the class. Look for the debug statements coming from the getter and setter
-testCreateCatMapped
-mapped.getWeight() //<----------------
Hibernate:
    insert
    into
        ENTITYEX_CAT
        (CAT_ID, dob, name, weight)
    values
        (null, ?, ?, ?)
-mapped.getWeight() //<----------------
-mapped.getWeight() //<----------------
Hibernate:
    select
        cat0_.CAT_ID as CAT1_2_0_,
        cat0_.dob as dob2_0_,
        cat0_.name as name2_0_,
        cat0_.weight as weight2_0_
    from
        ENTITYEX_CAT cat0_
    where
        cat0_.CAT_ID=?
-mapped.setWeight() //<----------------
In this chapter you performed some core class/database table mappings that allowed you to
Define the name of the table used to store instances of the entity class
Define the column name used to store properties within the entity class
Define property constraints to require a property to exist or continue to be optional
Define a generated value for your primary key
Define maximum column lengths for string properties within your entity class
Define precision and scale for BigDecimal property mappings
You also had a chance to change the provider access from FIELD to PROPERTY either for the entire entity or on a per-property basis.
This chapter will take you through the steps to map an enumerated class to the database. Enums are a specialized cross between a typed collection and an inheritance hierarchy. They are very convenient for expressing well known, type-safe values for a property. Enumerated classes can be mapped to the database by
(Java) name value
ordinal value
With what we know about PROPERTY mapping, we can also leverage some object-based tricks to map more customized value types.
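The two built-in representations can be inspected directly in plain Java. The sketch below uses a local copy of the Sex enum that the upcoming Dog entity will define:

```java
public class EnumMappingDemo {
    // local copy of the enum the Dog entity will declare later in the exercise
    enum Sex { MALE, FEMALE }

    public static void main(String[] args) {
        // ordinal mapping stores the 0-based position within the declaration
        System.out.println(Sex.FEMALE.ordinal());   // 1
        // name mapping stores the Java identifier as a string
        System.out.println(Sex.FEMALE.name());      // FEMALE
        // a provider can map the column value back with values()[n] or valueOf(s)
        System.out.println(Sex.values()[1] == Sex.valueOf("FEMALE"));  // true
    }
}
```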
For simplicity, the exercise will only deal with annotated classes from this point forward. As you should have realized -- anything you can do with an annotation has a parallel construct within the ORM descriptor. It should be a trivial exercise to locate the orm.xml equivalent for the annotation should you need to use the concepts discussed from here on in a mapped entity.
We will first look at mapping an ordinal enum value using a Dog entity. The ordinal is stored in an efficient integer column type but can be more cryptic to read in a raw query result.
Put the following entity class in place in your src/main tree.
package myorg.entityex.annotated;
import javax.persistence.Access;
import javax.persistence.AccessType;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.Table;
@Entity
@Table(name="ENTITYEX_DOG")
@Access(AccessType.FIELD)
public class Dog {
public enum Sex {
MALE, FEMALE
}
@Id @GeneratedValue
private int id;
private Sex gender;
public int getId() { return id; }
public void setId(int id) {
this.id = id;
}
public Sex getGender() { return gender; }
public Dog setGender(Sex gender) {
this.gender = gender;
return this;
}
}
Add the new entity to your persistence unit in src/test
# src/test/resources/META-INF/persistence.xml
<class>myorg.entityex.annotated.Dog</class>
Build the module with the new entity and observe the generated schema for the Dog. The gender property is, by default, mapped by its ordinal value. That means the value stored will be 0, 1, 2, ..., representing the value's position within the defined enum.
    create table ENTITYEX_DOG (
        id integer generated by default as identity,
        gender integer,
        primary key (id)
    );
Optionally add an explicit specification to use ordinal mapping.
@Enumerated(EnumType.ORDINAL)
private Sex gender;
Add the following test method to your AnimalTest.java JUnit test case. It will persist an instance of a Dog, poke into the raw database to verify what is being stored, and obtain a new Dog instance to verify the enum was materialized properly.
@Test
public void testEnums() {
logger.info("testEnums");
Dog dog = new Dog()
.setGender(Dog.Sex.FEMALE);
em.persist(dog);
em.flush();
//check the raw value stored in the database
Object o = em.createNativeQuery("select GENDER from ENTITYEX_DOG where id=?")
.setParameter(1, dog.getId())
.getSingleResult();
logger.debug("col=" + o);
assertEquals("unexpected gender", Dog.Sex.FEMALE.ordinal(), ((Number)o).intValue());
//get a new instance
em.detach(dog);
Dog dog2 = em.find(Dog.class, dog.getId());
assertEquals("unexpected dog gender", dog.getGender(), dog2.getGender());
}
Build module and run the new test method
$ mvn clean test -Dtest=myorg.entityex.AnimalTest#testEnums
...
-col=1
...
[INFO] BUILD SUCCESS
Okay -- the first goal is now complete. You have mapped an enum as an ordinal value to the database. It may also be useful to take a look at the row(s) in the table using the DB browser UI.
For this and the next section, the runtime process will throw an IllegalArgumentException if the database contains a value that is not correctly represented in our enum set of values.
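The failure mode can be reproduced in plain Java: the same lookups a provider performs throw when the stored value has no matching enum constant. The Sex enum below is a local copy for illustration:

```java
public class BadEnumValueDemo {
    enum Sex { MALE, FEMALE }

    public static void main(String[] args) {
        try {
            // simulates a STRING-mapped column holding a name the enum does not define
            Sex.valueOf("UNKNOWN");
        } catch (IllegalArgumentException ex) {
            System.out.println("bad name: " + ex.getMessage());
        }
        try {
            // simulates an ORDINAL-mapped column holding an out-of-range position
            Sex s = Sex.values()[7];
        } catch (ArrayIndexOutOfBoundsException ex) {
            System.out.println("bad ordinal: " + ex.getMessage());
        }
    }
}
```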
We will next look at mapping a name value for the enum. Names are convenient since they are more expressive in a raw database query and are not sensitive to their order within the Enum. However, the string value does take more space to represent and would be less efficient to implement certain comparisons.
Add the following Color Enum to the Dog class. This will define the Enum Color, declare an instance of Color and define its mapping as the STRING name.
public enum Color {
WHITE, BLACK, BROWN, MIX
}
...
@Enumerated(EnumType.STRING)
private Color color;
...
public Color getColor() { return color; }
public Dog setColor(Color color) {
this.color = color;
return this;
}
Rebuild the module and take a look at the generated schema. Color has been modeled as a varchar type with a default maximum size.
    create table ENTITYEX_DOG (
        id integer generated by default as identity,
        color varchar(255),
        gender integer,
        primary key (id)
    );
You can optionally supply a length specification to the enum string to create a tuned column size for the Color type. This length value will get reflected in the new DDL output.
@Column(length=16)
private Color color;
Update the test method with the following updates for the color property.
@Test
public void testEnums() {
logger.info("testEnums");
Dog dog = new Dog()
.setGender(Dog.Sex.FEMALE)
.setColor(Dog.Color.MIX);
em.persist(dog);
em.flush();
//check the raw value stored in the database
Object[] o = (Object[])em.createNativeQuery("select GENDER, COLOR from ENTITYEX_DOG where id=?")
.setParameter(1, dog.getId())
.getSingleResult();
logger.debug("cols=" + Arrays.toString(o));
assertEquals("unexpected gender", Dog.Sex.FEMALE.ordinal(), ((Number)o[0]).intValue());
assertEquals("unexpected color", Dog.Color.MIX.name(), ((String)o[1]));
//get a new instance
em.detach(dog);
Dog dog2 = em.find(Dog.class, dog.getId());
assertEquals("unexpected dog gender", dog.getGender(), dog2.getGender());
assertEquals("unexpected dog color", dog.getColor(), dog2.getColor());
}
Rebuild the module and observe that our new property storage has passed the provided asserts.
mvn clean test -Dtest=myorg.entityex.AnimalTest#testEnums
...
-cols=[1, MIX]
...
[INFO] BUILD SUCCESS
If you look at the row values in the DB UI, you should be able to easily read the color string value.
In the previous sections we mapped enums using the built-in capabilities: ordinal and name. There are times when mapping a column is not as clean as above. Maybe there is a non-contiguous error code, or maybe there is a pretty string with spaces in the name. In this section, we will map a pretty name to an internally stored enum. Officially it will be mapped as a string as far as JPA is concerned, but we will leverage getter/setter methods to convert to what we want to work with internally.
Add the following Breed enum to the Dog entity class. Follow it up with a property of that type. Leave off any JPA mappings in this area. We will use PROPERTY mapping.
public enum Breed {
LABRADOR("Lab"),
SAINT_BERNARD("Saint Bernard");
public final String prettyName;
private Breed(String prettyName) { this.prettyName = prettyName; }
public static Breed getBreed(String prettyName) {
for (Breed breed : values()) {
if (breed.prettyName.equals(prettyName)) {
return breed;
}
}
return null;
}
}
private Breed breed;
Define the getter/setter to be used by JPA to access the breed property. Note that we have defined these methods separate from the getBreed/setBreed methods because those accept and return the Breed enum. These methods form a contract with the provider to return a string. We also need to tell the provider the database column name to use because the default column naming rules will not work in this case.
@Access(AccessType.PROPERTY)
@Column(name="BREED", length=32)
protected String getDBBreed() {
return breed==null ? null : breed.prettyName;
}
protected void setDBBreed(String dbValue) {
breed=Breed.getBreed(dbValue);
}
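Before wiring these into JPA, the pretty-name round trip they rely on can be checked in plain Java using a local copy of the Breed enum:

```java
public class BreedLookupDemo {
    // local copy of the Breed enum from the Dog entity
    enum Breed {
        LABRADOR("Lab"),
        SAINT_BERNARD("Saint Bernard");
        public final String prettyName;
        private Breed(String prettyName) { this.prettyName = prettyName; }
        public static Breed getBreed(String prettyName) {
            for (Breed breed : values()) {
                if (breed.prettyName.equals(prettyName)) {
                    return breed;
                }
            }
            return null;
        }
    }

    public static void main(String[] args) {
        // what setDBBreed() will do when reading the column value back
        System.out.println(Breed.getBreed("Saint Bernard"));  // SAINT_BERNARD
        // unknown column values map to null rather than throwing
        System.out.println(Breed.getBreed("Poodle"));         // null
    }
}
```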
Notice how the methods declared above are non-public. They were defined that way since the public clients of the Dog class want to work with the Breed enum and not strings. Having both public could be confusing and, if clients want strings, they can always get them from the Breed enum itself.
Define the existing breed property as transient since its persistence is being handled through the dBBreed property. Leaving it non-transient would cause an extra database column to be created for the enum.
@Transient
private Breed breed;
Rebuild the module and observe the generated schema for the updated entity. Note the varchar BREED column that has also been restricted to 32 characters according to our @Column specification. Notice that there is only a single column for breed and it comes from the dBBreed property definition.
    create table ENTITYEX_DOG (
        id integer generated by default as identity,
        BREED varchar(32),
        color varchar(16),
        gender integer,
        primary key (id)
    );
Update the test method with the following to assign and test the persistence of breed using an alternate mapping.
@Test
public void testEnums() {
logger.info("testEnums");
Dog dog = new Dog()
.setGender(Dog.Sex.FEMALE)
.setColor(Dog.Color.MIX)
.setBreed(Dog.Breed.SAINT_BERNARD);
em.persist(dog);
em.flush();
//check the raw value stored in the database
Object[] o = (Object[])em.createNativeQuery("select GENDER, COLOR, BREED from ENTITYEX_DOG where id=?")
.setParameter(1, dog.getId())
.getSingleResult();
logger.debug("cols=" + Arrays.toString(o));
assertEquals("unexpected gender", Dog.Sex.FEMALE.ordinal(), ((Number)o[0]).intValue());
assertEquals("unexpected color", Dog.Color.MIX.name(), ((String)o[1]));
assertEquals("unexpected breed", Dog.Breed.SAINT_BERNARD.prettyName, ((String)o[2]));
//get a new instance
em.detach(dog);
Dog dog2 = em.find(Dog.class, dog.getId());
assertEquals("unexpected dog gender", dog.getGender(), dog2.getGender());
assertEquals("unexpected dog color", dog.getColor(), dog2.getColor());
assertEquals("unexpected dog breed", dog.getBreed(), dog2.getBreed());
}
Re-run the enum test and note how text "Saint Bernard" with camel case and spaces was stored in the database and not the ordinal or SAINT_BERNARD name.
mvn clean test -Dtest=myorg.entityex.AnimalTest#testEnums
...
-cols=[1, MIX, Saint Bernard]
...
[INFO] BUILD SUCCESS
You may want to verify the contents of the database using the DB UI
SELECT * FROM ENTITYEX_DOG;

ID  BREED          COLOR  GENDER
1   Saint Bernard  MIX    1
In this chapter we took a detailed look at mapping a special property type -- the enum. You mapped it using two built-in mapping techniques (ordinal and string/name) and a custom way using alternate getters/setters and PROPERTY access. Since the additional getter/setter pair was strictly for the OR mapping, you defined them as non-public accessors to keep the interface from being polluted with OR mapping details. You also declared the enum breed attribute as @Transient so the provider knew to ignore that particular FIELD.
This chapter will take you through mapping temporal types (i.e., dates and times) to columns in the database.
Put the following class with three temporal types in place. Note that although the names of the properties indicate an intent to store DATE, TIME, and TIMESTAMP information, their Java datatype does not offer enough clues to the provider to make the distinction.
package myorg.entityex.annotated;
import java.util.Calendar;
import java.util.Date;
import javax.persistence.*;
@Entity
@Table(name="ENTITYEX_SHARK")
public class Shark {
@Id @GeneratedValue
private int id;
private Calendar aDate;
private Date aTime;
private Date aTimestamp;
public int getId() { return id; }
public Shark setId(int id) {
this.id = id; return this;
}
public Calendar getDate() { return aDate; }
public Shark setDate(Calendar date) {
this.aDate = date; return this;
}
public Date getTime() { return aTime; }
public Shark setTime(Date time) {
this.aTime = time; return this;
}
public Date getTimestamp() { return aTimestamp; }
public Shark setTimestamp(Date timestamp) {
this.aTimestamp = timestamp; return this;
}
@Override
public String toString() {
return new StringBuilder()
.append("aDate=").append(aDate.getTime())
.append(", aTime=").append(aTime)
.append(", aTimestamp=").append(aTimestamp)
.toString();
}
}
Add the new entity class to the persistence unit
<class>myorg.entityex.annotated.Shark</class>
Build the module and note the default schema created. The aDate, aTime, and aTimestamp columns were all created using a timestamp column type.
    create table ENTITYEX_SHARK (
        id integer generated by default as identity,
        aDate timestamp,
        aTime timestamp,
        aTimestamp timestamp,
        primary key (id)
    );
Add the following test method to your existing JUnit test case. This test method just prints the temporal fields of the original object and then of the object coming from the database.
@Test
public void testTemporal() {
log.info("testTemporal");
Shark shark = new Shark()
.setDate(new GregorianCalendar(1776, Calendar.JULY, 4))
.setTime(new Date())
.setTimestamp(new Date());
em.persist(shark);
log.info("initial object=" + shark);
//flush commands to DB and get new instance
em.flush(); em.detach(shark);
Shark shark2 = em.find(Shark.class, shark.getId());
log.info("object from DB=" + shark2);
}
Rebuild the module with the new test method. Observe the dates printed and how they are not quite what we are looking for. The Calendar date looks okay, but the time contains both time and date (because it is declared as a timestamp).
-testTemporal
Hibernate:
    insert into ENTITYEX_SHARK (id, aDate, aTime, aTimestamp) values (null, ?, ?, ?)
-initial object=aDate=Thu Jul 04 00:00:00 EST 1776, aTime=Sun Feb 24 02:21:44 EST 2013, aTimestamp=Sun Feb 24 02:21:44 EST 2013
Hibernate:
    select shark0_.id as id4_0_, shark0_.aDate as aDate4_0_, shark0_.aTime as aTime4_0_, shark0_.aTimestamp as aTimestamp4_0_ from ENTITYEX_SHARK shark0_ where shark0_.id=?
-object from DB=aDate=Thu Jul 04 00:00:00 EST 1776, aTime=2013-02-24 02:21:44.861, aTimestamp=2013-02-24 02:21:44.861
Add Temporal specifications to the three properties.
@Temporal(TemporalType.DATE)
private Calendar aDate;
@Temporal(TemporalType.TIME)
private Date aTime;
@Temporal(TemporalType.TIMESTAMP)
private Date aTimestamp;
Rebuild the module and note the new database schema created. Our three fields now have more distinct values.
create table ENTITYEX_SHARK ( id integer generated by default as identity, aDate date, aTime time, aTimestamp timestamp, primary key (id) );
In looking at the output from the entity pulled from the database, each of the temporals has the desired granularity. Note that the first printed output is of the Date objects before they have been massaged by the database.
-testTemporal Hibernate: insert into ENTITYEX_SHARK (id, aDate, aTime, aTimestamp) values (null, ?, ?, ?) -initial object=aDate=Thu Jul 04 00:00:00 EST 1776, aTime=Sun Feb 24 02:19:47 EST 2013, aTimestamp=Sun Feb 24 02:19:47 EST 2013 Hibernate: select shark0_.id as id4_0_, shark0_.aDate as aDate4_0_, shark0_.aTime as aTime4_0_, shark0_.aTimestamp as aTimestamp4_0_ from ENTITYEX_SHARK shark0_ where shark0_.id=? -object from DB=aDate=Thu Jul 04 00:00:00 EST 1776, aTime=02:19:47, aTimestamp=2013-02-24 02:19:47.112
This chapter will take you through mapping large objects (Clobs and Blobs) to your database.
The normal database types like varchar can be indexed and searched for but usually have a maximum length somewhere between 2,000 and 5,000 characters. If you need to store larger documents or images, databases provide two additional types:
Clob for storing large character/string data
Blob for storing large binary data
The actual database type(s) are not necessarily called Clob and Blob, so JPA provides a layer of abstraction between what the application needs and how it is declared in the database.
JPA has built-in rules to map string/character data to a varchar and we must define a @javax.persistence.Lob metadata property to have it mapped differently. Let's start this exercise by using the default mapping and then add overrides.
Create the following class in your src/main tree.
package myorg.entityex.annotated;
import javax.persistence.*;
@Entity
@Table(name="ENTITYEX_HORSE")
public class Horse {
@Id @GeneratedValue
private int id;
private String name;
private String description;
private char[] history;
private byte[] photo;
public int getId() { return id; }
public void setId(int id) {
this.id = id;
}
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
public String getDescription() { return description; }
public void setDescription(String description) {
this.description = description;
}
public char[] getHistory() { return history; }
public void setHistory(char[] history) {
this.history = history;
}
}
Add the new class to the persistence unit.
<class>myorg.entityex.annotated.Horse</class>
Build the module and observe the database schema that is created. Note that all of our string and char[] properties are being mapped to a varchar.
create table ENTITYEX_HORSE ( id integer generated by default as identity, description varchar(255), history varchar(255), name varchar(255), primary key (id) );
Add the following test method to the existing JUnit test case.
@Test
public void testLob() {
log.info("testLob");
//create our host object with Lob objects
Horse horse = new Horse();
horse.setName("Mr. Ed");
horse.setDescription("There once was a horse of course and his name was Mr. Ed...");
horse.setHistory("Mister Ed is a fictional talking horse residing in Mount Kisco, New York,...".toCharArray());
em.persist(horse);
//flush to DB and get a new instance
em.flush(); em.detach(horse);
Horse horse2 = em.find(Horse.class, horse.getId());
assertEquals("unexpected description", horse.getDescription(), horse2.getDescription());
assertTrue("unexpected history", Arrays.equals(horse.getHistory(), horse2.getHistory()));
}
Using the database server profile, run the tests and observe the data left in the database tables.
$ mvn clean test -Ph2srv -P\!h2db ... SELECT * FROM ENTITYEX_HORSE; ID DESCRIPTION HISTORY NAME 1 There once was a horse of course and his name was Mr. Ed... Mister Ed is a fictional talking horse residing in Mount Kisco, New York,... Mr. Ed
Update the host class and supply @Lob for the description and history properties.
@Lob
private String description;
@Lob
private char[] history;
Rebuild the module and observe the change in database types. The mapping has been changed to a Clob type and the database column for the H2 database looks like it is also "clob".
create table ENTITYEX_HORSE ( id integer generated by default as identity, description clob, history clob, name varchar(255), primary key (id) );
At this point you now have string fields that can be used to store large amounts of data. As a side-exercise, try storing 100K-character strings as Clobs and then switch them back to varchar to see the difference in size constraints. Try setting the @Column.length field to a high value to accommodate the string mapped as a varchar. What is the maximum? That maximum value is not the same on all databases.
The above section worked with character/string data. We may also need to store binary information. JPA will map byte[] properties to a binary database type, with @Lob selecting the large Blob form.
Add the following byte[] property to the host class. Annotate it as a @Lob type to assure we get the right storage type.
@Lob
private byte[] photo;
public byte[] getPhoto() { return photo; }
public void setPhoto(byte[] photo) {
this.photo = photo;
}
Rebuild the module and observe the database schema created. We now have a "blob" type for photo.
create table ENTITYEX_HORSE ( id integer generated by default as identity, description clob, history clob, name varchar(255), photo blob, primary key (id) );
Add a Java Serializable type to the Horse class. JPA can store this type of object in a blob as well.
public class Horse {
public static class Jockey implements Serializable {
private static final long serialVersionUID = 1L;
private String name;
public String getName() { return name; }
public void setName(String name) { this.name = name; }
}
@Lob
private Jockey jockey;
public Jockey getJockey() { return jockey; }
public void setJockey(Jockey jockey) {
this.jockey = jockey;
}
Update the test method with the following to exercise the Blob fields.
...
byte[] picture = new byte[10*1000];
new Random().nextBytes(picture);
horse.setPhoto(picture);
Horse.Jockey jockey = new Horse.Jockey();
jockey.setName("Wilbur Post");
horse.setJockey(jockey);
em.persist(horse);
...
Horse horse2 = em.find(Horse.class, horse.getId());
...
assertTrue("unexpected photo", Arrays.equals(horse.getPhoto(), horse2.getPhoto()));
assertEquals("unexpected jockey", horse.getJockey().getName(), horse2.getJockey().getName());
Rebuild/test the module to generate schema and verify functionality. Notice we now have photo (the byte[] type) and jockey (the Serializable type) mapped to a database blob.
create table ENTITYEX_HORSE ( id integer generated by default as identity, description clob, history clob, jockey blob, name varchar(255), photo blob, primary key (id) );
If you took a look at the results in the H2 DB browser UI using a default query, you should notice that binary information is now stored in the two additional fields. Note that the jockey name is not stored as a simple String. It is stored as a serialized string within a serialized Jockey class within a Blob field of the database. We can only get and set the Jockey and not search for their name using this mapping mechanism.
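What the provider is doing with the Serializable Jockey can be sketched in plain Java: the object graph is run through standard Java serialization and the resulting bytes land in the Blob column. The BlobDemo class and its helper methods below are illustrative stand-ins, not part of JPA or Hibernate:

```java
import java.io.*;

public class BlobDemo {
    // Hypothetical stand-in for the Horse.Jockey class from the exercise
    public static class Jockey implements Serializable {
        private static final long serialVersionUID = 1L;
        public String name;
    }

    // Roughly what a provider does when persisting a Serializable @Lob:
    // serialize the object graph into the bytes stored in the BLOB column
    public static byte[] toBytes(Serializable obj) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        }
        return bos.toByteArray();
    }

    public static Object fromBytes(byte[] bytes) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Jockey jockey = new Jockey();
        jockey.name = "Wilbur Post";
        byte[] blob = toBytes(jockey);                 // bytes that would occupy the column
        Jockey copy = (Jockey) fromBytes(blob);        // round trip back to an object
        System.out.println(copy.name);
    }
}
```

The name survives the round trip, but since it is buried inside an opaque byte stream, the database cannot index or query it, which is exactly the limitation noted above.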
In this chapter we mapped large string/character types to Clobs and binary content to Blobs. One thing we still need to point out is that Clobs and Blobs can come at a performance cost. You likely will want to model Clob and Blob data in lazily loaded child tables (using JPA relationships) to allow quick and efficient access to the traditional column data and then optionally provide the large payloads on demand.
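Besides modeling Lobs in lazily loaded child tables, one lightweight option is to mark the Lob property itself as lazily fetched. This is only a hint; providers are free to ignore lazy fetching of basic attributes. A sketch against the Horse class:

```java
@Lob
@Basic(fetch=FetchType.LAZY)  // hint only: the provider may defer loading the blob
private byte[] photo;
```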
This chapter will take you through the steps to setting up entity classes for three different primary key generation strategies:
IDENTITY
SEQUENCE
TABLE
There is a fourth strategy called AUTO, which is the default. Since the job of most applications is to map to a known database schema, I would not consider it usable outside of quick prototypes like those in the previous sections. When using AUTO (the default), you are saying "I don't care what you do -- just get me a primary key value" when, for production code, you actually do care. For this reason I suggest you always supply a strategy and not depend on the default always doing what you need.
Primary key generation has three fundamental requirements:
The primary key must be a simple key (ie., not compound)
The data type of the primary key must be a numeric
The primary key value of the entity must be "unassigned" when passed to persist. Any "assigned" value will cause the provider to throw an exception since the entity is presumed to already have a primary key and is not in the proper state. Unassigned states are "0" for built-in numeric types (e.g., int, long) and null for object wrapper types (e.g., Integer, Long)
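The "unassigned" rule from the list above can be captured in a tiny plain-Java sketch (the helper class is hypothetical, not a JPA API):

```java
public class PkStateDemo {
    // "Unassigned" means 0 for primitive numeric ids...
    public static boolean isUnassigned(int id) {
        return id == 0;
    }

    // ...and null for wrapper ids. An entity whose id fails this check
    // would cause the provider to throw an exception during persist()
    public static boolean isUnassigned(Integer id) {
        return id == null;
    }
}
```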
Let's start with the easiest case. With IDENTITY, the primary key is generated by the database on a per-table basis. Thus it is possible to have two separate entity instances with the same primary key value when mapped to two different tables.
Put the following class in place in your src/main tree. The class has two properties; an ID and Name. The ID is being automatically generated using the IDENTITY strategy.
package myorg.entityex.annotated;
import javax.persistence.*;
@Entity
@Table(name="ENTITYEX_BUNNY")
@Access(AccessType.FIELD)
public class Bunny {
@Id @GeneratedValue(strategy=GenerationType.IDENTITY)
private int id;
private String name;
public int getId() { return id; }
public void setId(int id) {
this.id = id;
}
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
}
Add the new entity to the orm.xml file (and not the persistence.xml file). We are going to use primary key generation to demonstrate some points about metadata overrides in a moment.
# src/main/resources/orm/Animal-orm.xml
<entity class="myorg.entityex.annotated.Bunny"/>
Build the module and notice the database schema generated for the entity using IDENTITY.
create table ENTITYEX_BUNNY ( id integer generated by default as identity, name varchar(255), primary key (id) );
Add the following test method to the existing AnimalTest. Notice that it creates and persists several entities, checks that each received a unique primary key value, and prints all the assigned values at the end.
@Test
public void testPKGen() {
logger.info("testPKGen");
Bunny bunny = new Bunny();
bunny.setName("fuzzy");
assertTrue("primary key unexpectedly assigned", bunny.getId()==0);
em.persist(bunny);
em.flush();
logger.info("bunny.getId()=" + bunny.getId());
assertFalse("primary key not assigned", bunny.getId()==0);
Set<Integer> ids = new HashSet<Integer>();
ids.add(bunny.getId());
for (String name: new String[]{"peter", "march hare", "pat"}) {
Bunny b = new Bunny();
b.setName(name);
em.persist(b);
em.flush();
assertTrue("id not unique:" + b.getId(), ids.add(b.getId()));
}
logger.debug("ids=" + ids);
}
The em.flush() after each persist() pushes the insert to the database while the transaction is still in progress, allowing us to check the generated value before the commit.
Note too that the entity client code does not really know the strategy used. We will take advantage of that in the next section.
Build and test the primary key generation. Observe the following output and no assert errors.
$ mvn test -Dtest=myorg.entityex.AnimalTest#testPKGen -testPKGen Hibernate: insert into ENTITYEX_BUNNY (id, name) values (null, ?) -bunny.getId()=1 ... -ids=[1, 2, 3, 4]
You have now successfully mapped an entity to the database using the IDENTITY strategy and it was pretty simple. The only downside to using this technique is that it is not portable to all database providers. Most notably, Oracle does not support the IDENTITY strategy.
In this section, we are going to change the primary key generation strategy through the use of the XML descriptor rather than changing the Java annotation. This is an example of how your application could be deployed with one version of the orm.xml for development and another for production when your development and production databases don't support the same features.
SEQUENCES are specialized constructs databases have implemented to efficiently generate primary key values across tables. Using a common sequence means that two entities mapped to two separate database tables will not have the same primary key value (as long as the tables use the same sequence). We can define multiple sequences.
Update the Animal-orm.xml definition for the entity to include a generated-value strategy of SEQUENCE.
<entity class="myorg.entityex.annotated.Bunny">
<attributes>
<id name="id">
<generated-value strategy="SEQUENCE"/>
</id>
</attributes>
</entity>
Rebuild the module and observe that the new schema generated for the entity no longer includes identity in the table definition and has added a sequence with the default name "hibernate_sequence".
create table ENTITYEX_BUNNY ( id integer not null, name varchar(255), primary key (id) ); create sequence hibernate_sequence;
Optionally remove the IDENTITY strategy from the entity class that was overridden by the XML file so that there is less confusion about which technique is being used. This change should not impact any behavior since the definition in the XML file already has priority over the Java annotations.
@GeneratedValue//(strategy=GenerationType.IDENTITY)
Rebuild the module and observe the output from the test method already in place. Remember that -- since the client does not care which strategy is used -- we are able to change the implementation and re-run with a different strategy and not change the client.
$ mvn test -Dtest=myorg.entityex.AnimalTest#testPKGen ... -testPKGen Hibernate: call next value for hibernate_sequence Hibernate: insert into ENTITYEX_BUNNY (name, id) values (?, ?) -bunny.getId()=10 ... -ids=[10, 11, 12, 13]
Notice that Hibernate went to the database and obtained the next value to start with (which was 10 in this case) and was then free to generate one-up numbers from that point within a window of values. Since Hibernate has been told which database-specific dialect it is using and can communicate with the database at runtime, the default sequence values should be well understood. In the next step we will make this more explicit.
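The window behavior can be sketched in plain Java. This is a toy model, not Hibernate's actual optimizer: each trip to the database sequence buys a window of allocationSize values that can then be handed out without further database calls.

```java
// Toy sketch of window-based id allocation: one simulated database round
// trip per allocationSize ids. All names here are illustrative.
public class WindowAllocator {
    private final int allocationSize;
    private long next  = 0;   // next id to hand out
    private long limit = 0;   // exclusive end of the current window
    private long dbSequence;  // stand-in for "call next value for ..."

    public WindowAllocator(long sequenceStart, int allocationSize) {
        this.dbSequence = sequenceStart;
        this.allocationSize = allocationSize;
    }

    private long callNextValue() {        // simulated database round trip
        long value = dbSequence;
        dbSequence += allocationSize;
        return value;
    }

    public long nextId() {
        if (next >= limit) {              // window exhausted: fetch a new one
            next = callNextValue();
            limit = next + allocationSize;
        }
        return next++;
    }

    public static void main(String[] args) {
        // starting at 10 mirrors the bunny.getId()=10 ... ids=[10, 11, 12, 13] output
        WindowAllocator gen = new WindowAllocator(10, 50);
        for (int i = 0; i < 4; i++) {
            System.out.println("id=" + gen.nextId());
        }
    }
}
```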
Update the Animal-orm.xml entity definition to include a sequence-generator and a generator reference from the strategy. The sequence-generator has an internal JPA name (referenced by the generated-value generator attribute) and a database sequence-name (be sure to use under_scores and not dash-es in the database name).
<entity class="myorg.entityex.annotated.Bunny">
<sequence-generator name="animal-sequence-gen"
sequence-name="animal_sequence">
</sequence-generator>
<attributes>
<id name="id">
<generated-value strategy="SEQUENCE"
generator="animal-sequence-gen"/>
</id>
</attributes>
</entity>
create table ENTITYEX_BUNNY ( id integer not null, name varchar(255), primary key (id) ); create sequence animal_sequence;
Re-build the module and re-run the PKGen test method and you should see the new sequence being used.
-testPKGen Hibernate: call next value for animal_sequence Hibernate: insert into ENTITYEX_BUNNY (name, id) values (?, ?) -bunny.getId()=10 ... -ids=[10, 11, 12, 13]
We have successfully mapped the primary key generation to a database sequence and did so through a specification in the deployment descriptor to highlight the fact that the orm.xml file overrides and augments the class annotations. We also defined a custom generator which was used to generate the primary key values. If you were to do that with annotations, the sequence-generator annotation could be placed on any one of the JPA entity classes.
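For reference, a sketch of roughly equivalent class annotations (the exercise intentionally keeps this metadata in Animal-orm.xml instead):

```java
@Entity
@Table(name="ENTITYEX_BUNNY")
@SequenceGenerator(name="animal-sequence-gen", sequenceName="animal_sequence")
public class Bunny {
    @Id
    @GeneratedValue(strategy=GenerationType.SEQUENCE, generator="animal-sequence-gen")
    private int id;
    ...
}
```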
In this section we will be mapping the primary key generation using a database table strategy. In this strategy a table and key name are identified for a sequence that will be used to bootstrap unique value generators. This technique seems the most portable (and possibly the most expensive) since it involves no database-specific behavior.
Change the generation strategy from SEQUENCE to TABLE in the orm.xml descriptor. You can leave the sequence generator definition in place.
<entity class="myorg.entityex.annotated.Bunny">
...
<attributes>
<id name="id">
<generated-value strategy="TABLE"/>
</id>
</attributes>
</entity>
Rebuild the module and note the database schema generated. The table name was defaulted to hibernate_sequences and was given some default column names as well.
create table ENTITYEX_BUNNY ( id integer not null, name varchar(255), primary key (id) ); create table hibernate_sequences ( sequence_name varchar(255), sequence_next_hi_value integer ) ;
In watching the output of the unit test, you should observe the following initialization of the sequence row prior to getting started with the actual work of persisting our entities.
-testPKGen Hibernate: select sequence_next_hi_value from hibernate_sequences where sequence_name = 'ENTITYEX_BUNNY' for update Hibernate: insert into hibernate_sequences (sequence_name, sequence_next_hi_value) values ('ENTITYEX_BUNNY', ?) Hibernate: update hibernate_sequences set sequence_next_hi_value = ? where sequence_next_hi_value = ? and sequence_name = 'ENTITYEX_BUNNY' Hibernate: insert into ENTITYEX_BUNNY (name, id) values (?, ?) -bunny.getId()=1 ... -ids=[1, 2, 3, 4]
Add a table generator definition as shown below to the orm.xml file. Attempt to override many of the defaults.
<entity class="myorg.entityex.annotated.Bunny">
<table-generator name="animal-table-gen"
table="animal_sequences"
initial-value="3"
allocation-size="10"
pk-column-name="key"
pk-column-value="animals"
value-column-name="seq"/>
<attributes>
<id name="id">
<generated-value strategy="TABLE"
generator="animal-table-gen"/>
</id>
</attributes>
Re-build the module and observe how the generated schema matches what was specified in the table generator specification.
create table ENTITYEX_BUNNY ( id integer not null, name varchar(255), primary key (id) ); create table animal_sequences ( key varchar(255), seq integer ) ;
Run the unit test and observe pretty much the same behavior as above, except this time using our customized table definitions. Note that the effect of the initial value and allocation size is not yet noticeable.
-testPKGen Hibernate: select seq from animal_sequences where key = 'animals' for update Hibernate: insert into animal_sequences (key, seq) values ('animals', ?) Hibernate: update animal_sequences set seq = ? where seq = ? and key = 'animals' Hibernate: insert into ENTITYEX_BUNNY (name, id) values (?, ?) -bunny.getId()=1 ... -ids=[1, 2, 3, 4]
You have now successfully mapped your entity class using a TABLE primary key mechanism. This is likely the most portable between databases, but likely also the most expensive because of all the knowledge and management coming from the client side of the connection.
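As with the sequence case, a sketch of roughly equivalent class annotations for the table generator (the exercise keeps this metadata in Animal-orm.xml):

```java
@Entity
@Table(name="ENTITYEX_BUNNY")
@TableGenerator(name="animal-table-gen",
    table="animal_sequences",
    pkColumnName="key", pkColumnValue="animals",
    valueColumnName="seq",
    initialValue=3, allocationSize=10)
public class Bunny {
    @Id
    @GeneratedValue(strategy=GenerationType.TABLE, generator="animal-table-gen")
    private int id;
    ...
}
```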
In this chapter you defined your primary key as being automatically generated by the database and then specified one of three types of implementations (IDENTITY, SEQUENCE, and TABLE) to implement that value generation. The orm.xml file was also used to show how the actual primary key mechanism can be changed without changing the entity or client code. This can be helpful when switching between development and production databases that do not support the same features.
This chapter will take you through mapping a set of two or more natural fields as a compound primary key. Nothing is being generated with natural/compound keys. We are combining multiple fields to represent the object's identity.
A compound primary key is required to...
Be Serializable
Have a default constructor (either built-in or declared)
Supply hashCode() and equals() methods
Provide public access to its primary key properties
Embedded compound primary keys are primary key classes that are integrated into the using class in its aggregate form. The using class makes only a reference to the primary key class and not to any of the primary key class properties.
Create an instance of the following JPA primary key class in your src/main tree.
package myorg.entityex.annotated;
import java.io.Serializable;
import javax.persistence.*;
@Embeddable
public class CowPK implements Serializable { //required to be Serializable
private static final long serialVersionUID = 1L;
private String herd;
private String name;
public CowPK(){} //required default ctor
public CowPK(String herd, String name) {
this.herd = herd;
this.name = name;
};
public String getHerd() { return herd; }
public String getName() { return name; }
@Override
public int hashCode() { //required hashCode method
return herd.hashCode() + name.hashCode();
}
@Override
public boolean equals(Object obj) { //required equals method
try {
return herd.equals(((CowPK)obj).herd) &&
name.equals(((CowPK)obj).name);
} catch (Exception ex) {
return false;
}
}
}
Note the class is...
Annotated as @Embeddable
Implements Serializable
Provides a default constructor
Provides overrides for hashCode and equals
Provides attributes that will be used for the primary key fields
Note the class also...
Optionally implemented a convenience constructor
Optionally removed the setter methods so the primary key fields could not be accidentally changed
Add the following entity class to the src/main tree. This class will declare an @EmbeddedId and no @Id property. It will also use FIELD-level access to properties.
package myorg.entityex.annotated;
import javax.persistence.*;
@Entity
@Table(name="ENITYEX_COW")
public class Cow {
@EmbeddedId
private CowPK pk;
private int weight;
public Cow() {}
public Cow(CowPK cowPK) {
this.pk = cowPK;
}
public CowPK getPk() { return pk; }
public void setPk(CowPK pk) {
this.pk = pk;
}
public int getWeight() { return weight; }
public void setWeight(int weight) {
this.weight = weight;
}
}
Note the entity class makes wholesale use of the embedded primary key class and is not required to have any interaction with the properties of the embedded class.
Add the new entity class to your persistence unit.
<class>myorg.entityex.annotated.Cow</class>
Build the module with the new entity and primary key class. Note the schema that is produced. The herd and name from the primary key class have been integrated into the using class' table.
create table ENITYEX_COW ( herd varchar(255) not null, name varchar(255) not null, weight integer not null, primary key (herd, name) );
Notice in the above database schema that the ID columns are set to their default values. We can set the properties from either the primary key class or within the using entity class. Add the following to your two classes to help define the database columns used by the primary key class.
@Column(name="HERD", length=16)
private String herd;
@EmbeddedId
@AttributeOverrides({
@AttributeOverride(name="name", column=@Column(name="NAME", length=16))
})
private CowPK pk;
Note the technique used for the "herd" is to annotate the property directly within the primary key class. This is useful when the primary key class is dedicated for use by a single entity class or the annotations are common across entity classes. The technique used for "name" is to annotate the primary key property of the entity class using an @AttributeOverride. Note there can only be a single @AttributeOverride per property -- so we show the use of the @AttributeOverrides({}) annotation to hold one or more @AttributeOverride elements.
Rebuild the module and note the generated database schema produced. The herd and name columns have been constrained to 16 characters. The names of the columns are also showing up as CAPITALIZED since that is how we typed them in the @Column.name mapping.
create table ENITYEX_COW ( HERD varchar(16) not null, NAME varchar(16) not null, weight integer not null, primary key (HERD, NAME) );
Add the following test method to your JUnit test case
@Test
public void testEmbeddedId() {
log.info("testEmbedded");
Cow cow = new Cow(new CowPK("Ponderosa", "Bessie"));
cow.setWeight(900);
em.persist(cow);
//flush to DB and get a new instance
em.flush(); em.detach(cow);
Cow cow2 = em.find(Cow.class, new CowPK("Ponderosa", "Bessie"));
assertNotNull("cow not found", cow2);
assertEquals("unexpected herd", cow.getPk().getHerd(), cow2.getPk().getHerd());
assertEquals("unexpected name", cow.getPk().getName(), cow2.getPk().getName());
assertEquals("unexpected weight", cow.getWeight(), cow2.getWeight());
}
Note how an instance of the compound primary key class is passed to the find() method to locate the entity class by primary key.
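The reason this works can be demonstrated in plain Java: two separately constructed key instances must be equal by value. The sketch below mirrors the CowPK class (JPA annotation and default constructor omitted, so it is a slightly simplified illustration):

```java
import java.io.Serializable;

public class CowPKDemo {
    // Mirrors CowPK from the exercise, minus the @Embeddable annotation
    public static class CowPK implements Serializable {
        private static final long serialVersionUID = 1L;
        private final String herd;
        private final String name;

        public CowPK(String herd, String name) {
            this.herd = herd;
            this.name = name;
        }

        @Override
        public int hashCode() {
            return herd.hashCode() + name.hashCode();
        }

        @Override
        public boolean equals(Object obj) {
            try {
                return herd.equals(((CowPK) obj).herd) && name.equals(((CowPK) obj).name);
            } catch (Exception ex) {
                return false;
            }
        }
    }

    public static void main(String[] args) {
        // two independent instances with the same values compare equal --
        // this is what lets em.find() locate the row persisted earlier
        CowPK key1 = new CowPK("Ponderosa", "Bessie");
        CowPK key2 = new CowPK("Ponderosa", "Bessie");
        System.out.println(key1.equals(key2) && key1.hashCode() == key2.hashCode());
    }
}
```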
Rebuild and test your module with the new test method in place. You should notice the following information within the database when complete.
$ mvn clean test -Ph2srv -P\!h2db ... -testEmbedded Hibernate: insert into ENITYEX_COW (weight, HERD, NAME) values (?, ?, ?) Hibernate: select cow0_.HERD as HERD5_0_, cow0_.NAME as NAME5_0_, cow0_.weight as weight5_0_ from ENITYEX_COW cow0_ where cow0_.HERD=? and cow0_.NAME=? ... [INFO] BUILD SUCCESS SELECT * FROM ENITYEX_COW; HERD NAME WEIGHT Ponderosa Bessie 900
In this section you have modeled an embedded class as a compound primary key class that has encapsulated all properties within the primary key class.
In this section we will address the case where the entity class wishes to model one or more of the primary key properties as direct properties of the entity class.
Put the following entity class in place in your src/main tree.
package myorg.entityex.annotated;
import javax.persistence.*;
@Entity
@Table(name="ENITYEX_COW2")
@IdClass(CowPK.class)
@AttributeOverrides({
@AttributeOverride(name="name", column=@Column(name="NAME", length=16))
})
public class Cow2 {
@Id
private String herd;
@Id
private String name;
private int weight;
public Cow2() {}
public Cow2(String herd, String name) {
this.herd = herd;
this.name = name;
}
public String getHerd() { return herd; }
public String getName() { return name; }
public int getWeight() { return weight; }
public void setWeight(int weight) {
this.weight = weight;
}
}
Note this class...
Declares the primary key class as an @IdClass
Identifies two properties as @Id properties and these must match the properties of the primary key class.
This class also happens to...
Provide a custom override of the "name" primary key property using @AttributeOverride at the class level.
Provides read-only access to the primary key values by only supplying getters
Add the new entity class to the persistence unit.
<class>myorg.entityex.annotated.Cow2</class>
Build the module and note the generated database schema is identical to the embedded primary key mapping we created in the previous section.
create table ENITYEX_COW2 ( HERD varchar(16) not null, NAME varchar(16) not null, weight integer not null, primary key (HERD, NAME) );
Add the following test method for the new entity class to your existing JUnit test case.
@Test
public void testIdClass() {
log.info("testIdClass");
Cow2 cow = new Cow2("Ponderosa", "Bessie");
cow.setWeight(900);
em.persist(cow);
//flush to DB and get a new instance
em.flush(); em.detach(cow);
Cow2 cow2 = em.find(Cow2.class, new CowPK("Ponderosa", "Bessie"));
assertNotNull("cow not found", cow2);
assertEquals("unexpected herd", cow.getHerd(), cow2.getHerd());
assertEquals("unexpected name", cow.getName(), cow2.getName());
assertEquals("unexpected weight", cow.getWeight(), cow2.getWeight());
}
Note how the use of the primary key is the same when interfacing with the EntityManager. The only real difference is that the entity deals directly with the properties of the primary key class and not the primary key class itself.
Rebuild the module with the new test method. You should get the following in the database when complete.
$ mvn clean test -Ph2srv -P\!h2db ... -testIdClass Hibernate: insert into ENITYEX_COW2 (weight, HERD, NAME) values (?, ?, ?) Hibernate: select cow2x0_.HERD as HERD6_0_, cow2x0_.NAME as NAME6_0_, cow2x0_.weight as weight6_0_ from ENITYEX_COW2 cow2x0_ where cow2x0_.HERD=? and cow2x0_.NAME=? ... [INFO] BUILD SUCCESS SELECT * FROM ENITYEX_COW2; HERD NAME WEIGHT Ponderosa Bessie 900
In this section you successfully mapped a compound primary key class as an IdClass for an entity where multiple @Id properties were directly exposed by the entity. The IdClass declared properties that were identical to the number and name of @Id properties within the entity class.
In this chapter we worked with primary keys made up of multiple values. There were two styles of usage for the primary key class; embedded and idclass. For the embedded case, the entity worked with the primary key class directly and never the primary key class properties. In the idclass case, the entity class declared properties that matched the idclass (or vice-versa) but never interacted with the primary key class. We showed how both mappings to the database were identical and how to define custom mappings for the targeted database schema.
This chapter will take you through mapping nested classes to the database. In this case we have classes within classes (within classes, etc.) that get mapped to a flat database table. Note there are cases when the embedded class contains the meat of what we want mapped to our database and the wrapping entity may only be created to provide the necessary primary key property.
In this section we will map a simple embedded object within an entity class.
Add the following classes to your src/main tree. In this case the entity class contains an instance of a Name class which contains two properties mapped to the database. The Name class is annotated as @Embeddable and the name property in the entity is annotated as @Embedded.
package myorg.entityex.annotated;
import javax.persistence.Column;
import javax.persistence.Embeddable;
@Embeddable
public class Name {
private String firstName;
private String lastName;
public String getFirstName() { return firstName; }
public Name setFirstName(String firstName) { this.firstName = firstName; return this; }
public String getLastName() { return lastName; }
public Name setLastName(String lastName) { this.lastName = lastName; return this; }
}
package myorg.entityex.annotated;
import javax.persistence.*;
@Entity
@Table(name="ENTITYEX_BEAR")
public class Bear {
@Id
@GeneratedValue(strategy=GenerationType.IDENTITY)
private int id;
@Embedded
private Name name;
public int getId() { return id; }
public void setId(int id) {
this.id = id;
}
public Name getName() { return name; }
public void setName(Name name) {
this.name = name;
}
}
Add the new entity class to the persistence unit.
<class>myorg.entityex.annotated.Bear</class>
Build the module and note the database schema created for the entity class. Note the embedded properties are now mapped at the same level as the columns for the entity properties.
create table ENTITYEX_BEAR ( id integer generated by default as identity, firstName varchar(255), lastName varchar(255), primary key (id) );
Add custom table mappings for the firstName and lastName properties. Define the mapping for firstName from within the embedded class. Define the mapping for lastName from within the entity class.
public class Name {
@Column(name="FIRST_NAME", length=16)
private String firstName;
public class Bear {
...
@AttributeOverrides({
@AttributeOverride(name="lastName", column=@Column(name="LAST_NAME", length=16))
})
@Embedded
private Name name;
Rebuild the module and note the change in definition for the firstName and lastName columns. We were able to control the mapping from either within the embedded or entity class.
create table ENTITYEX_BEAR ( id integer generated by default as identity, FIRST_NAME varchar(16), LAST_NAME varchar(16), primary key (id) );
At this point we have shown how to map a single nested object. Note how similar this was to the @EmbeddedId case we went through during the compound primary key chapter.
The above is an example of a single-level embedded object that has been supported since JPA 1.0. In the next step, add a nested embedded object. Support for multiple levels of nesting was added in JPA 2.0.
Add the following classes.
package myorg.entityex.annotated;
import javax.persistence.Embeddable;
@Embeddable
public class Street {
private int number;
private String name;
public int getNumber() { return number; }
public Street setNumber(int number) { this.number = number; return this; }
public String getName() { return name; }
public Street setName(String name) { this.name = name; return this; }
}
package myorg.entityex.annotated;
import javax.persistence.AttributeOverride;
import javax.persistence.AttributeOverrides;
import javax.persistence.Column;
import javax.persistence.Embeddable;
@Embeddable
public class Address {
private Street street; //a second level of embedded
//@Column(name="CITY", length=16)
private String city;
//@Column(name="STATE", length=16)
private String state;
public Street getStreet() { return street; }
public Address setStreet(Street street) { this.street = street; return this; }
public String getCity() { return city; }
public Address setCity(String city) { this.city = city; return this; }
public String getState() { return state; }
public Address setState(String state) { this.state = state; return this; }
}
public class Bear {
...
@Embedded
private Address address;
public Address getAddress() { return address; }
public void setAddress(Address address) { this.address = address; }
Rebuild the module with the new, multi-level embedded class and note the database schema created. Both levels of the Address were flattened into the entity table.
create table ENTITYEX_BEAR ( id integer generated by default as identity, city varchar(255), state varchar(255), name varchar(255), number integer not null, FIRST_NAME varchar(16), LAST_NAME varchar(16), primary key (id) );
Define custom table mappings for the address.
Leave Street un-customized
public class Street {
private int number;
private String name;
Map Street.number to the STREET_NUMBER column from the Address class.
@Embeddable
public class Address {
@AttributeOverrides({
@AttributeOverride(name="number", column=@Column(name="STREET_NUMBER")),
})
private Street street; //a second level of embedded
Map Street.name to a 16 character STREET_NAME column from the entity class. Note the multiple level syntax here.
@AttributeOverrides({
@AttributeOverride(name="street.name", column=@Column(name="STREET_NAME", length=16)),
})
@Embedded
private Address address;
Rebuild the module and note the generated database schema. Our custom database mappings are in place.
create table ENTITYEX_BEAR ( id integer generated by default as identity, CITY varchar(16), STATE varchar(16), STREET_NAME varchar(16), STREET_NUMBER integer, FIRST_NAME varchar(16), LAST_NAME varchar(16), primary key (id) );
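The flattening and dot-path override behavior above can be sketched in plain Java. The sketch below is illustrative bookkeeping, not provider internals: it walks nested fields with reflection, defaults each column name to the field name, and lets a dot-path map (standing in for @AttributeOverride) rename nested attributes like "street.name".

```java
import java.lang.reflect.Field;
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of how embedded properties flatten into entity columns
// and how "street.name"-style @AttributeOverride paths address nested fields.
public class FlattenDemo {
    static class Street { int number; String name; }
    static class Address { Street street; String city; String state; }

    // overrides: dot-path -> column name (simulating @AttributeOverride)
    static Map<String, String> columnsOf(Class<?> type, String prefix, Map<String, String> overrides) {
        Map<String, String> cols = new LinkedHashMap<>();
        for (Field f : type.getDeclaredFields()) {
            String path = prefix.isEmpty() ? f.getName() : prefix + "." + f.getName();
            if (f.getType().isPrimitive() || f.getType() == String.class) {
                // default column name is the field name unless overridden
                cols.put(path, overrides.getOrDefault(path, f.getName()));
            } else {
                cols.putAll(columnsOf(f.getType(), path, overrides)); // recurse into nested embeddable
            }
        }
        return cols;
    }

    public static void main(String[] args) {
        Map<String, String> overrides = new LinkedHashMap<>();
        overrides.put("street.number", "STREET_NUMBER"); // multi-level overrides,
        overrides.put("street.name", "STREET_NAME");     // as in the entity mapping above
        System.out.println(columnsOf(Address.class, "", overrides));
    }
}
```

All attribute paths end up in one flat column namespace, which is why the overrides are needed once names like "name" would otherwise collide.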
Put the following test method within the existing JUnit test case.
@Test
public void testEmbeddedObject() {
log.info("testEmbeddedObject");
Bear bear = new Bear();
bear.setName(new Name().setFirstName("Yogi").setLastName("Bear"));
bear.setAddress(new Address()
.setCity("Jellystone Park")
.setState("???")
.setStreet(new Street().setNumber(1).setName("Picnic")));
em.persist(bear);
//flush to DB and get a new instance
em.flush(); em.detach(bear);
Bear bear2 = em.find(Bear.class, bear.getId());
assertEquals("unexpected firstName", bear.getName().getFirstName(), bear2.getName().getFirstName());
assertEquals("unexpected lastName", bear.getName().getLastName(), bear2.getName().getLastName());
assertEquals("unexpected street number",
bear.getAddress().getStreet().getNumber(), bear2.getAddress().getStreet().getNumber());
assertEquals("unexpected street name",
bear.getAddress().getStreet().getName(), bear2.getAddress().getStreet().getName());
assertEquals("unexpected city",
bear.getAddress().getCity(), bear2.getAddress().getCity());
assertEquals("unexpected state",
bear.getAddress().getState(), bear2.getAddress().getState());
}
Rebuild the module with the new test method in place.
-testEmbeddedObject Hibernate: insert into ENTITYEX_BEAR (id, CITY, STATE, STREET_NAME, STREET_NUMBER, FIRST_NAME, LAST_NAME) values (null, ?, ?, ?, ?, ?, ?) Hibernate: select bear0_.id as id7_0_, bear0_.CITY as CITY7_0_, bear0_.STATE as STATE7_0_, bear0_.STREET_NAME as STREET4_7_0_, bear0_.STREET_NUMBER as STREET5_7_0_, bear0_.FIRST_NAME as FIRST6_7_0_, bear0_.LAST_NAME as LAST7_7_0_ from ENTITYEX_BEAR bear0_ where bear0_.id=?
The above capability of mapping multi-nested classes was added in JPA 2.0 and allows us to map more complicated structures to the database.
In this chapter we mapped a nested object within the table used to host an enclosing entity class. Note there are times when the entity is created purely for persistence purposes and the embedded class is the real meat we are after. In that case, the entity class is simply providing the primary key property and embedding the rest.
This chapter will take you through mapping a single class to multiple tables. We will reuse the example from the embedded object mapping case because conceptually they are trying to do the same thing, except:
In the embedded object case we had a single table and multiple objects
In this case we have multiple tables and a single object/class
In the embedded object case, our child objects did not have a primary key
In this case our child objects are entities with a primary key that are joined with the primary object's primary key.
Add the following Java class to your src/main tree. We are not done mapping just yet, but let's see what this maps to before making changes.
package myorg.entityex.annotated;
import javax.persistence.*;
@Entity
@Table(name="ENTITYEX_BEAR2")
public class Bear2 {
@Id @GeneratedValue(strategy=GenerationType.IDENTITY)
private int id;
private String firstName;
private String lastName;
private int streetNumber;
private String streetName;
private String city;
private String state;
public int getId() { return id; }
public Bear2 setId(int id) { this.id = id; return this; }
public String getFirstName() { return firstName; }
public Bear2 setFirstName(String firstName) {
this.firstName = firstName; return this;
}
public String getLastName() { return lastName; }
public Bear2 setLastName(String lastName) {
this.lastName = lastName; return this;
}
public int getStreetNumber() { return streetNumber; }
public Bear2 setStreetNumber(int streetNumber) {
this.streetNumber = streetNumber; return this;
}
public String getStreetName() { return streetName; }
public Bear2 setStreetName(String streetName) {
this.streetName = streetName; return this;
}
public String getCity() { return city; }
public Bear2 setCity(String city) {
this.city = city; return this;
}
public String getState() { return state; }
public Bear2 setState(String state) {
this.state = state; return this;
}
}
Add the new entity class to your persistence unit
<class>myorg.entityex.annotated.Bear2</class>
Build the module with the new entity class and observe how the class is mapped to the database. Not surprisingly, it is a simple, flat mapping to a single table by default.
create table ENTITYEX_BEAR2 ( id integer generated by default as identity, city varchar(255), firstName varchar(255), lastName varchar(255), state varchar(255), streetName varchar(255), streetNumber integer not null, primary key (id) );
Define a secondary table to host the name properties
@Entity
@Table(name="ENTITYEX_BEAR2")
@SecondaryTables({
@SecondaryTable(name="ENTITYEX_BEAR2_NAME")
})
public class Bear2 {
Assign the firstName and lastName properties to the secondary table.
@Column(table="ENTITYEX_BEAR2_NAME", name="FIRST_NAME", length=16)
private String firstName;
@Column(table="ENTITYEX_BEAR2_NAME", name="LAST_NAME", length=16)
private String lastName;
Rebuild the module and notice the database schema generated. The firstName and lastName are mapped to the secondary table, the two tables are joined by primary key values, and primary key generation and propagation are taken care of by the provider.
create table ENTITYEX_BEAR2 ( id integer generated by default as identity, city varchar(255), state varchar(255), streetName varchar(255), streetNumber integer not null, primary key (id) ); create table ENTITYEX_BEAR2_NAME ( FIRST_NAME varchar(16), LAST_NAME varchar(16), id integer not null, primary key (id) ); ... alter table ENTITYEX_BEAR2_NAME add constraint FKED0C2F35D7F6CC81 foreign key (id) references ENTITYEX_BEAR2;
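The bookkeeping that @SecondaryTable delegates to the provider can be sketched in plain Java. The sketch below is illustrative only (the maps stand in for the two tables): on persist, one object's state is split across two "tables" sharing the same primary key; on find, the rows are re-joined by that key.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a secondary-table mapping: one class, two "tables"
// joined by a shared primary key value.
public class SecondaryTableDemo {
    static final Map<Integer, Boolean> BEAR2 = new HashMap<>();       // primary table: id only
    static final Map<Integer, String[]> BEAR2_NAME = new HashMap<>(); // secondary: FIRST_NAME, LAST_NAME
    static int nextId = 1;

    static int persist(String firstName, String lastName) {
        int id = nextId++;                                     // identity-generation stand-in
        BEAR2.put(id, Boolean.TRUE);                           // insert into primary table (id)
        BEAR2_NAME.put(id, new String[]{firstName, lastName}); // insert into secondary table (same id)
        return id;
    }

    static String find(int id) {
        if (!BEAR2.containsKey(id)) return null;  // no primary row, no entity
        String[] name = BEAR2_NAME.get(id);       // primary-key join to the secondary table
        return name[0] + " " + name[1];
    }

    public static void main(String[] args) {
        int id = persist("Yogi", "Bear");
        System.out.println(find(id)); // prints "Yogi Bear"
    }
}
```

This is why the generated schema above shows two inserts per entity and a foreign key from the secondary table back to the primary table's id.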
Add a second secondary table for the address properties
@Entity
@Table(name="ENTITYEX_BEAR2")
@SecondaryTables({
@SecondaryTable(name="ENTITYEX_BEAR2_NAME"),
@SecondaryTable(name="ENTITYEX_BEAR2_ADDRESS")
})
public class Bear2 {
Assign the address properties to the new table
@Column(table="ENTITYEX_BEAR2_ADDRESS", name="STREET_NUMBER", length=16)
private int streetNumber;
@Column(table="ENTITYEX_BEAR2_ADDRESS", name="STREET_NAME", length=16)
private String streetName;
@Column(table="ENTITYEX_BEAR2_ADDRESS", name="CITY", length=16)
private String city;
@Column(table="ENTITYEX_BEAR2_ADDRESS", name="STATE", length=16)
private String state;
Rebuild the module and note the database schema generated. We now have a second table with a primary key join to the primary table.
create table ENTITYEX_BEAR2 ( id integer generated by default as identity, primary key (id) ); create table ENTITYEX_BEAR2_ADDRESS ( CITY varchar(16), STATE varchar(16), STREET_NAME varchar(16), STREET_NUMBER integer, id integer not null, primary key (id) ); create table ENTITYEX_BEAR2_NAME ( FIRST_NAME varchar(16), LAST_NAME varchar(16), id integer not null, primary key (id) ); ... alter table ENTITYEX_BEAR2_ADDRESS add constraint FKD1DF32EAD7F6CC81 foreign key (id) references ENTITYEX_BEAR2; alter table ENTITYEX_BEAR2_NAME add constraint FKED0C2F35D7F6CC81 foreign key (id) references ENTITYEX_BEAR2;
Add the following test method to the existing JUnit test case. Note this test method is similar to the embedded object test method except that all properties are directly accessible from the parent entity class.
@Test
public void testMultiTableMapping() {
log.info("testMultiTableMapping");
Bear2 bear = new Bear2()
.setFirstName("Yogi")
.setLastName("Bear")
.setStreetNumber(1)
.setStreetName("Picnic")
.setCity("Jellystone Park")
.setState("???");
em.persist(bear);
//flush to DB and get a new instance
em.flush(); em.detach(bear);
Bear2 bear2 = em.find(Bear2.class, bear.getId());
assertEquals("unexpected firstName", bear.getFirstName(), bear2.getFirstName());
assertEquals("unexpected lastName", bear.getLastName(), bear2.getLastName());
assertEquals("unexpected street number",
bear.getStreetNumber(), bear2.getStreetNumber());
assertEquals("unexpected street name",
bear.getStreetName(), bear2.getStreetName());
assertEquals("unexpected city",
bear.getCity(), bear2.getCity());
assertEquals("unexpected state",
bear.getState(), bear2.getState());
}
Rebuild the module and observe the pass/fail results of the new test as well as the database interaction.
-testMultiTableMapping Hibernate: insert into ENTITYEX_BEAR2 (id) values (null) Hibernate: insert into ENTITYEX_BEAR2_ADDRESS (CITY, STATE, STREET_NAME, STREET_NUMBER, id) values (?, ?, ?, ?, ?) Hibernate: insert into ENTITYEX_BEAR2_NAME (FIRST_NAME, LAST_NAME, id) values (?, ?, ?) Hibernate: select bear2x0_.id as id8_0_, bear2x0_1_.CITY as CITY10_0_, bear2x0_1_.STATE as STATE10_0_, bear2x0_1_.STREET_NAME as STREET3_10_0_, bear2x0_1_.STREET_NUMBER as STREET4_10_0_, bear2x0_2_.FIRST_NAME as FIRST1_9_0_, bear2x0_2_.LAST_NAME as LAST2_9_0_ from ENTITYEX_BEAR2 bear2x0_ left outer join ENTITYEX_BEAR2_ADDRESS bear2x0_1_ on bear2x0_.id=bear2x0_1_.id left outer join ENTITYEX_BEAR2_NAME bear2x0_2_ on bear2x0_.id=bear2x0_2_.id where bear2x0_.id=?
In this chapter we mapped multiple tables, joined through a one-to-one primary key join, into a single class. You will find this very similar to mapping a Java class inheritance hierarchy to multiple tables using the JOIN strategy. However, in that case each of the tables is mapped to a specific class within the hierarchy rather than to a single class as we did here.
Copyright © 2019 jim stafford (jim.stafford@jhu.edu)
Built on: 2019-08-22 07:10 EST
Abstract
This document contains instructions for exercises to map Java class relations to the database using JPA. It covers the core and corner mapping concepts as well as demonstrates issues that can occur with JPA relationship mappings.
Table of Contents
To provide hands on experience
Mapping relationships between JPA entity classes to a relational database
Propagating dependent primary key values derived from a parent relationship
Relating entity classes which use built-in and composite primary key types
Witnessing the functional and potential performance differences between different mapping styles and properties
At the completion of this exercise, the student will be able to
Map OneToOne, OneToMany, ManyToOne, and ManyToMany relationships to the database
Map uni-directional and bi-directional relationships to the database
Derive and propagate a primary key for a dependent entity from a relation to a parent entity
Realize relationships using foreign key joins, primary key joins, and link table joins
Realize relationships to a parent entity using a composite primary/foreign key
Enforce mandatory and allow optional relationships
Reduce interaction with the EntityManager using relationship Cascades
Impact performance through informed decisions on selecting certain relationship styles and properties
This exercise makes use of the AUTO GeneratedValue strategy and was originally authored when that strategy resolved to IDENTITY. As of 2018, using H2, the strategy resolves to SEQUENCE. That means there will be an additional set of calls to "next value" and database rows may be delayed in their insertion. Simply add an em.flush() to force the insert of any new rows wherever there is a confusing mismatch between the instructions and your observed output. This should be another reminder not to leverage the AUTO strategy -- since it can change over time.
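The difference the note describes can be sketched in plain Java. The names below (hibernate_sequence, SOME_TABLE) are illustrative, not provider API: a SEQUENCE-style generator obtains the id from a separate "next value" call, so the INSERT can be deferred until flush; an IDENTITY-style generator must execute the INSERT immediately to learn the id.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch (not provider internals): why SEQUENCE generation can
// delay row insertion until flush() while still assigning ids up front.
public class IdStrategyDemo {
    static int sequence = 0;                                   // stand-in for the database sequence
    static final List<String> statements = new ArrayList<>();  // SQL the "provider" has issued

    static int persistWithSequence() {
        statements.add("call next value for hibernate_sequence");
        return ++sequence;   // id known now; the INSERT is still pending
    }

    static void flush(int id) {
        // em.flush() is what forces the deferred INSERT of the new row
        statements.add("insert into SOME_TABLE (id) values (" + id + ")");
    }

    public static void main(String[] args) {
        int id = persistWithSequence();
        System.out.println("id assigned before any insert: " + id);
        flush(id);
        statements.forEach(System.out::println);
    }
}
```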
The setup for this exercise will use a maven archetype process to build the shell of a JPA module to perform these exercises. The module can be created anywhere, but if you create it within a directory that already has a pom.xml, the archetype will attempt to integrate the new module as a sub-module of the existing parent.
Add the following to your .m2/settings.xml file. This will allow you to resolve the exercise archetype and set a default database for the exercise.
<profiles> <profile> <id>webdev-repositories</id> <repositories> <repository> <id>webdev</id> <name>ejava webdev repository</name> <url>http://webdev.jhuep.com/~jcs/maven2</url> <releases> <enabled>true</enabled> <updatePolicy>never</updatePolicy> </releases> <snapshots> <enabled>false</enabled> </snapshots> </repository> <repository> <id>webdev-snapshot</id> <name>ejava webdev snapshot repository</name> <url>http://webdev.jhuep.com/~jcs/maven2-snapshot</url> <releases> <enabled>false</enabled> </releases> <snapshots> <enabled>true</enabled> <updatePolicy>daily</updatePolicy> </snapshots> </repository> </repositories> </profile> </profiles> <activeProfiles> <activeProfile>h2db</activeProfile> <!-- <activeProfile>h2srv</activeProfile> --> </activeProfiles>
Use the ejava.jpa:jpa-archetype to setup a new Maven project for this exercise. Activate the webdev-repositories profile (-Pwebdev-repositories) so that you can resolve the archetype off the Internet. The following should be run outside of the class example tree.
$ mvn archetype:generate -B -DarchetypeGroupId=info.ejava.examples.jpa -DarchetypeArtifactId=jpa-archetype -DarchetypeVersion=5.0.0-SNAPSHOT -DgroupId=myorg.relex -DartifactId=relationEx -Pwebdev-repositories [INFO] Scanning for projects... ... [INFO] ---------------------------------------------------------------------------- [INFO] Using following parameters for creating project from Archetype: jpa-archetype:5.0.0-SNAPSHOT [INFO] ---------------------------------------------------------------------------- [INFO] Parameter: groupId, Value: myorg.relex [INFO] Parameter: artifactId, Value: relationEx [INFO] Parameter: version, Value: 1.0-SNAPSHOT [INFO] Parameter: package, Value: myorg.relex [INFO] Parameter: packageInPathFormat, Value: myorg/relex [INFO] Parameter: version, Value: 1.0-SNAPSHOT [INFO] Parameter: package, Value: myorg.relex [INFO] Parameter: groupId, Value: myorg.relex [INFO] Parameter: artifactId, Value: relationEx [INFO] Project created from Archetype in dir: /Users/jim/proj/784/relationEx [INFO] ------------------------------------------------------------------------ [INFO] BUILD SUCCESS [INFO] ------------------------------------------------------------------------ [INFO] Total time: 2.287 s [INFO] Finished at: 2018-08-18T12:19:05-04:00 [INFO] Final Memory: 18M/314M [INFO] ------------------------------------------------------------------------
You should now have an instantiated template for a JPA project
relationEx/ ├── pom.xml └── src ├── main │ └── java │ └── myorg │ └── relex │ └── Auto.java └── test ├── java │ └── myorg │ └── relex │ └── AutoTest.java └── resources ├── hibernate.properties ├── log4j.xml └── META-INF └── persistence.xml
Verify the instantiated template builds in your environment
Activate the h2db profile (and deactivate the h2srv profile) to use an embedded file as your database. This option does not require a server but makes it harder to inspect database state in between tests.
relationEx]$ mvn clean test -Ph2db -P\!h2srv ... -HHH000401: using driver [org.h2.Driver] at URL [jdbc:h2:/Users/jim/proj/784/relationEx/target/h2db/ejava] ... [INFO] ------------------------------------------------------------------------ [INFO] BUILD SUCCESS [INFO] ------------------------------------------------------------------------
Start your database server
$ java -jar M2_REPO/com/h2database/h2/1.4.197/h2-1.4.197.jar
Activate the h2srv profile (and deactivate the h2db profile) to use a running H2 database server. This option provides more interaction with your database but does require the server to be running.
relationEx]$ mvn clean test -P\!h2db -Ph2srv ... -HHH000401: using driver [org.h2.Driver] at URL [jdbc:h2:tcp://127.0.0.1:9092/./h2db/ejava] ... [INFO] ------------------------------------------------------------------------ [INFO] BUILD SUCCESS [INFO] ------------------------------------------------------------------------
You may now import the instantiated template into Eclipse as an "Existing Maven Project"
Put the following JUnit test case base class in your src/test tree.
package myorg.relex;
import javax.persistence.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.junit.*;
public class JPATestBase {
private static Logger log = LoggerFactory.getLogger(JPATestBase.class);
private static final String PERSISTENCE_UNIT = "relationEx-test";
private static EntityManagerFactory emf;
protected EntityManager em;
@BeforeClass
public static void setUpClass() {
log.debug("creating entity manager factory");
emf = Persistence.createEntityManagerFactory(PERSISTENCE_UNIT);
}
@Before
public void setUp() throws Exception {
log.debug("creating entity manager");
em = emf.createEntityManager();
cleanup();
em.getTransaction().begin();
}
@After
public void tearDown() throws Exception {
log.debug("tearDown() started, em={}", em);
if (em!=null) {
EntityTransaction tx = em.getTransaction();
if (tx.isActive()) {
if (tx.getRollbackOnly()) { tx.rollback(); }
else { tx.commit(); }
} else {
tx.begin();
tx.commit();
}
em.close();
em=null;
}
log.debug("tearDown() complete, em={}", em);
}
@AfterClass
public static void tearDownClass() {
log.debug("closing entity manager factory");
if (emf!=null) { emf.close(); }
}
public void cleanup() {
em.getTransaction().begin();
//bulk deletes of test rows will go here as entity tables are added
em.getTransaction().commit();
}
protected EntityManager createEm() {
return emf.createEntityManager();
}
}
In this chapter we will work through several ways to relate two entities in a one-to-one relationship. As the name implies, each side of the relationship has no more than one instance of the other. That sounds easy -- and it is if we keep in mind that this is a unique relationship (i.e., no other instance has it) from both sides.
Create a JUnit test class to host tests for the one-to-one mappings.
Put the following JUnit test case in your src/test tree. You can delete the sample test method once we add our first real test. JUnit will fail a test case if it cannot locate a @Test method to run.
package myorg.relex;
import static org.junit.Assert.*;
import javax.persistence.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.junit.*;
public class One2OneTest extends JPATestBase {
private static Logger log = LoggerFactory.getLogger(One2OneTest.class);
@Test
public void testSample() {
log.info("testSample");
}
}
Verify the new JUnit test class builds and executes to completion
relationEx]$ mvn clean test -P\!h2db -Ph2srv ... -HHH000401: using driver [org.h2.Driver] at URL [jdbc:h2:tcp://localhost:9092/./h2db/ejava] ... [INFO] BUILD SUCCESS
The notion of a uni-directional relationship is solely a characterization of what the Java class at either end of the relationship knows about the other. For uni-directional relationships only one class references the other while the other passively participates in the relationship.
In this first case we are going to model the relationship from the owning side of the relationship as a foreign key in the owning entity's table.
Create the following entity class in your src/main tree to represent the passive side of the relationship. I am calling this "passive" (or "ignorant") because it will know nothing of the relationships we will form within this section. This is different than the "inverse" side we will address in the bi-directional case.
package myorg.relex.one2one;
import javax.persistence.*;
/**
* Target of uni-directional relationship
*/
@Entity
@Table(name="RELATIONEX_PERSON")
public class Person {
@Id @GeneratedValue
private int id;
private String name;
public int getId() { return id; }
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
}
Notice there is no reference to the owning Player class within this entity. This fact alone makes the relationship uni-directional.
Create the following entity class in your src/main tree to represent the owning side of the relationship. It is currently incomplete.
package myorg.relex.one2one;
import javax.persistence.*;
/**
* Provides example of one-to-one unidirectional relationship
* using foreign key.
*/
@Entity
@Table(name="RELATIONEX_PLAYER")
public class Player {
public enum Position { DEFENSE, OFFENSE, SPECIAL_TEAMS};
@Id @GeneratedValue
private int id;
@Enumerated(EnumType.STRING)
@Column(length=16)
private Position position;
//@OneToOne
private Person person;
public int getId() { return id; }
public Person getPerson() { return person; }
public void setPerson(Person person) {
this.person = person;
}
public Position getPosition() { return position; }
public void setPosition(Position position) {
this.position = position;
}
}
Add the two entity classes to the persistence unit housed in src/test tree
<persistence-unit name="relationEx-test">
<provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>
...
<class>myorg.relex.one2one.Person</class>
<class>myorg.relex.one2one.Player</class>
...
</persistence-unit>
Attempt to build the module and note the error that results. The error is stating the provider does not know how to map the non-serializable Person class to a column within the Player table.
org.hibernate.MappingException: Could not determine type for: myorg.relex.one2one.Person, at table: RELATIONEX_PLAYER, for columns: [org.hibernate.mapping.Column(person).
If you look back at the Class mapping topic, we were able to map a serialized relationship to a BLOB column. That is what we are accidentally doing here if we leave off the @XxxToXxx relationship specification.
Add a JPA @OneToOne relationship mapping from the Player to Person. Also include definitions to...
Make the Person required for the Player
Specify the Person must be also fetched when obtaining the Player
Specify a foreign key column in the Player table that references the Person table
@OneToOne(optional=false,fetch=FetchType.EAGER)
@JoinColumn(name="PERSON_ID")
private Person person;
Build the module and observe the database schema generated.
create table RELATIONEX_PERSON ( id integer generated by default as identity, name varchar(255), primary key (id) ); create table RELATIONEX_PLAYER ( id integer generated by default as identity, position varchar(16), PERSON_ID integer not null, primary key (id), unique (PERSON_ID) ); alter table RELATIONEX_PLAYER add constraint FK58E275714BE1E366 foreign key (PERSON_ID) references RELATIONEX_PERSON;
The Player table contains a foreign key referencing the Person table. Note the foreign key *value* (PERSON_ID) is not modeled within the Player entity class. Only the *relationship* to the Person has been depicted within the Player. If we want the person ID value, we can ask the person object related to the player.
The foreign key column is required to be supplied ("not null"). This means that all Players must have a Person
The foreign key column is required to be unique. This means that no two Players may reference the same Person through PERSON_ID.
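These two constraints can be sketched as plain-Java bookkeeping. The sketch below is illustrative (the map stands in for the RELATIONEX_PLAYER table): a NOT NULL plus UNIQUE foreign key column is exactly what narrows a many-to-one shaped join down to one-to-one.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch: how "not null" + "unique (PERSON_ID)" enforce the
// required, one-to-one nature of the Player->Person relationship.
public class UniqueFkDemo {
    static final Map<Integer, Integer> playerPersonFk = new HashMap<>(); // player id -> PERSON_ID

    static void insertPlayer(int playerId, Integer personId) {
        if (personId == null)
            throw new IllegalArgumentException("PERSON_ID may not be null");   // optional=false
        if (playerPersonFk.containsValue(personId))
            throw new IllegalArgumentException("unique (PERSON_ID) violated"); // one Player per Person
        playerPersonFk.put(playerId, personId);
    }

    public static void main(String[] args) {
        insertPlayer(1, 100);
        try {
            insertPlayer(2, 100); // a second Player referencing the same Person is rejected
        } catch (IllegalArgumentException ex) {
            System.out.println(ex.getMessage()); // prints "unique (PERSON_ID) violated"
        }
    }
}
```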
Add the following test method to your existing JUnit test case. It is currently incomplete.
@Test
public void testOne2OneUniFK() {
log.info("*** testOne2OneUniFK ***");
Person person = new Person();
person.setName("Johnny Unitas");
Player player = new Player();
player.setPerson(person);
player.setPosition(Player.Position.OFFENSE);
//em.persist(person);
em.persist(player); //provider will propagate person.id to player.FK
//clear the persistence context and get new instances
em.flush(); em.clear();
Player player2 = em.find(Player.class, player.getId());
assertEquals("unexpected position", player.getPosition(), player2.getPosition());
assertEquals("unexpected name", player.getPerson().getName(), player2.getPerson().getName());
}
Attempt to re-build the module and note the error.
./target/surefire-reports/myorg.relex.One2OneTest.txt :::::::::::::: ------------------------------------------------------------------------------- Test set: myorg.relex.One2OneTest ------------------------------------------------------------------------------- Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 2.874 sec <<< FAILURE! testOne2OneUniFK(myorg.relex.One2OneTest) Time elapsed: 0.171 sec <<< ERROR! java.lang.IllegalStateException: org.hibernate.TransientObjectException: object references an unsaved transient instance - save the transient instance before flushing: myorg.relex.one2one.Player.person -> myorg.relex.one2one.Person at org.hibernate.ejb.AbstractEntityManagerImpl.convert(AbstractEntityManagerImpl.java:1358) at org.hibernate.ejb.AbstractEntityManagerImpl.convert(AbstractEntityManagerImpl.java:1289) at org.hibernate.ejb.AbstractEntityManagerImpl.convert(AbstractEntityManagerImpl.java:1295) at org.hibernate.ejb.AbstractEntityManagerImpl.flush(AbstractEntityManagerImpl.java:976) at myorg.relex.One2OneTest.testOne2OneUniFK(One2OneTest.java:29)
The provider is stating that our test case is trying to persist the Player while its reference to the Person points at an unmanaged Person object. We need to add a persist of the Person prior to hitting the call to flush.
Update the test method to persist both the Person and Player prior to the flush.
em.persist(person);
em.persist(player);
We will look at cascades a bit later which may or may not be appropriate to solve this dependent/parent table persistence.
Rebuild and observe the results of the test method. Note the Person and Player being persisted and the PERSON_ID of the Player being set to the generated primary key value of the Person. During the find(), the Person and Player are both obtained through a database join. Since the Person is required for the Player and we requested an EAGER fetch type, a database inner join is performed between the Player and Person tables.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniFK ... -*** testOne2OneUniFK *** Hibernate: insert into RELATIONEX_PERSON (id, name) values (null, ?) Hibernate: insert into RELATIONEX_PLAYER (id, PERSON_ID, position) values (null, ?, ?) Hibernate: select player0_.id as id2_1_, player0_.PERSON_ID as PERSON3_2_1_, player0_.position as position2_1_, person1_.id as id1_0_, person1_.name as name1_0_ from RELATIONEX_PLAYER player0_ inner join RELATIONEX_PERSON person1_ on player0_.PERSON_ID=person1_.id where player0_.id=?
If we make the Person optional, the database query is converted from an inner join to an outer join -- allowing Players without a Person to be returned.
@OneToOne(optional=true,fetch=FetchType.EAGER)
@JoinColumn(name="PERSON_ID")
private Person person;
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniFK ... Hibernate: select player0_.id as id2_1_, player0_.PERSON_ID as PERSON3_2_1_, player0_.position as position2_1_, person1_.id as id1_0_, person1_.name as name1_0_ from RELATIONEX_PLAYER player0_ left outer join RELATIONEX_PERSON person1_ on player0_.PERSON_ID=person1_.id where player0_.id=?
Also note that if we modify the fetch specification to LAZY, the join is removed entirely and replaced with a single select of the Player table during the find() and a follow-up select of the Person table once we reach the player.getPerson().getName() calls.
@OneToOne(optional=false,fetch=FetchType.LAZY)
@JoinColumn(name="PERSON_ID")
private Person person;
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniFK ... Hibernate: select player0_.id as id2_0_, player0_.PERSON_ID as PERSON3_2_0_, player0_.position as position2_0_ from RELATIONEX_PLAYER player0_ where player0_.id=? Hibernate: <<<=== caused by player.getPerson().getName() select person0_.id as id1_0_, person0_.name as name1_0_ from RELATIONEX_PERSON person0_ where person0_.id=?
If we comment out the calls to getPerson().getName(), only a single select of the Player is performed and the Person is never retrieved. That is the performance power of LAZY loading.
//assertEquals("unexpected name", player.getPerson().getName(), player2.getPerson().getName());
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniFK ... Hibernate: select player0_.id as id2_0_, player0_.PERSON_ID as PERSON3_2_0_, player0_.position as position2_0_ from RELATIONEX_PLAYER player0_ where player0_.id=?
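The deferred-select behavior can be sketched with a plain-Java lazy reference. The sketch below is illustrative, not Hibernate's proxy mechanism: the "database load" (the counter-incrementing supplier) only runs on first access, just as the Person select above only runs when the relationship is touched.

```java
import java.util.function.Supplier;

// Illustrative sketch of lazy fetching: the provider hands back a proxy and
// defers the follow-up SELECT until the relationship is actually accessed.
public class LazyDemo {
    static int selectCount = 0; // stand-in for SELECTs issued against RELATIONEX_PERSON

    static class LazyRef<T> {
        private final Supplier<T> loader;
        private T value;
        private boolean loaded;
        LazyRef(Supplier<T> loader) { this.loader = loader; }
        T get() {
            if (!loaded) { value = loader.get(); loaded = true; } // load on first access only
            return value;
        }
    }

    public static void main(String[] args) {
        LazyRef<String> person = new LazyRef<>(() -> { selectCount++; return "Johnny Unitas"; });
        System.out.println("find() done, selects=" + selectCount);    // prints "find() done, selects=0"
        System.out.println(person.get() + ", selects=" + selectCount); // first access triggers the load
        person.get();                                                  // cached; no further "select"
        System.out.println("selects=" + selectCount);                  // prints "selects=1"
    }
}
```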
Add the following code to the test method to perform a query of the two tables using SQL in order to verify the expected mappings and values. Note the code requires an import of java.util.Arrays.
Object[] cols = (Object[]) em.createNativeQuery(
"select person.id person_id, person.name, " +
"player.id player_id, player.person_id player_person_id " +
"from RELATIONEX_PLAYER player " +
"join RELATIONEX_PERSON person on person.id = player.person_id " +
"where player.id = ?1")
.setParameter(1, player.getId())
.getSingleResult();
log.info("row=" + Arrays.toString(cols));
assertEquals("unexpected person_id", person.getId(), ((Number)cols[0]).intValue());
assertEquals("unexpected person_name", person.getName(), (String)cols[1]);
assertEquals("unexpected player_id", player.getId(), ((Number)cols[2]).intValue());
assertEquals("unexpected player_person_id", person.getId(), ((Number)cols[3]).intValue());
Rebuild the module to verify the SQL mapping is what we expected.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniFK ... Hibernate: select person.id person_id, person.name, player.id player_id, player.person_id player_person_id from RELATIONEX_PLAYER player join RELATIONEX_PERSON person on person.id = player.person_id where player.id = ? -row=[1, Johnny Unitas, 1, 1]
Add the following delete logic to the test method to remove the Person object. It is currently incomplete.
//em.remove(player2);
em.remove(player2.getPerson());
em.flush();
assertNull("person not deleted", em.find(Person.class, person.getId()));
assertNull("player not deleted", em.find(Player.class, player.getId()));
Attempt to re-build the module and note the error that occurs. The problem is that we attempted to delete the Person row from the database while a foreign key from the Player was still referencing it.
Hibernate: delete from RELATIONEX_PERSON where id=? /target/surefire-reports/myorg.relex.One2OneTest.txt :::::::::::::: ------------------------------------------------------------------------------- Test set: myorg.relex.One2OneTest ------------------------------------------------------------------------------- Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 3.551 sec <<< FAILURE! testOne2OneUniFK(myorg.relex.One2OneTest) Time elapsed: 1.103 sec <<< ERROR! javax.persistence.PersistenceException: org.hibernate.exception.ConstraintViolationException: Referential integrity constraint violation: "FK58E275714BE1E366: PUBLIC.RELATIONEX_PLAYER FOREIGN KEY(PERSON_ID) REFERENCES PUBLIC.RELATIONEX_PERSON(ID) (1)"; SQL statement: delete from RELATIONEX_PERSON where id=? [23503-168] at org.hibernate.ejb.AbstractEntityManagerImpl.convert(AbstractEntityManagerImpl.java:1361) at org.hibernate.ejb.AbstractEntityManagerImpl.convert(AbstractEntityManagerImpl.java:1289) at org.hibernate.ejb.AbstractEntityManagerImpl.convert(AbstractEntityManagerImpl.java:1295) at org.hibernate.ejb.AbstractEntityManagerImpl.flush(AbstractEntityManagerImpl.java:976) at myorg.relex.One2OneTest.testOne2OneUniFK(One2OneTest.java:37)
Fix the problem by deleting the Player prior to the Person.
em.remove(player2);
em.remove(player2.getPerson());
Rebuild the module and note the success of the test method and the sensible delete order within the database.
Hibernate: delete from RELATIONEX_PLAYER where id=? Hibernate: delete from RELATIONEX_PERSON where id=?
We have finished a pass at the first way to hook up a one-to-one, uni-directional relationship: using a foreign key. Along the way, we also showed the database impact of making the relationship optional and of modifying the fetch type. We also purposely created errors common to persisting and deleting objects with foreign key references.
Next we are going to realize the one-to-one, uni-directional relationship from the dependent to the parent entity using a join table. The implementation of the dependent entity is identical to the FK-join case except for changing the @JoinColumn to a @JoinTable.
Add the following entity class to your src/main tree. The commented-out mapping leaves it incomplete, so the @OneToOne relationship uses a default mapping for now.
package myorg.relex.one2one;
import javax.persistence.*;
/**
* Provides example of one-to-one unidirectional relationship
* using join table.
*/
@Entity
@Table(name="RELATIONEX_MEMBER")
public class Member {
public enum Role { PRIMARY, SECONDARY};
@Id @GeneratedValue
private int id;
@OneToOne(optional=false,fetch=FetchType.EAGER)
/*@JoinTable(name="RELATIONEX_MEMBER_PERSON",
joinColumns={
@JoinColumn(name="MEMBER_ID", referencedColumnName="ID"),
}, inverseJoinColumns={
@JoinColumn(name="PERSON_ID", referencedColumnName="ID"),
}
)*/
private Person person;
@Enumerated(EnumType.STRING)
@Column(length=16)
private Role role;
protected Member() {}
public Member(Person person) {
this.person = person;
}
public int getId() { return id; }
public Person getPerson() { return person; }
public Role getRole() { return role; }
public void setRole(Role role) {
this.role = role;
}
}
Add the entity to the persistence unit
<class>myorg.relex.one2one.Member</class>
Build the module and observe the generated database schema. Notice the default mapping for the relationship is a foreign key join.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_MEMBER ( id integer generated by default as identity, role varchar(16), person_id integer not null, primary key (id), unique (person_id) ); ... alter table RELATIONEX_MEMBER add constraint FK5366652A4BE1E366 foreign key (person_id) references RELATIONEX_PERSON;
Update the mapping to use a join table via the @JoinTable annotation. The name of the join table is required in this case, but leave the rest of the mapping defaulted at this point.
@OneToOne(optional=false,fetch=FetchType.EAGER)
@JoinTable(name="RELATIONEX_MEMBER_PERSON")/*,
joinColumns={
@JoinColumn(name="MEMBER_ID", referencedColumnName="ID"),
}, inverseJoinColumns={
@JoinColumn(name="PERSON_ID", referencedColumnName="ID"),
}
)*/
private Person person;
Re-build the module and observe the generated database schema for our new @JoinTable relationship.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_PERSON ( id integer generated by default as identity, name varchar(255), primary key (id) ); ... create table RELATIONEX_MEMBER ( id integer generated by default as identity, role varchar(16), primary key (id) ); create table RELATIONEX_MEMBER_PERSON ( person_id integer not null, id integer not null, primary key (id), unique (person_id) ); ... alter table RELATIONEX_MEMBER_PERSON add constraint FK3D65E40A13E64581 foreign key (id) references RELATIONEX_MEMBER; alter table RELATIONEX_MEMBER_PERSON add constraint FK3D65E40A4BE1E366 foreign key (person_id) references RELATIONEX_PERSON;
Note...
The provider derived names for the Person.id and Member.id foreign keys in the join table
The "id" column of the join table is the primary key and has a primary key join relationship with the dependent's table.
The "person_id" of the join table is also constrained to be unique since this is a one-to-one relationship. We can only have a single entry in this table referencing the parent entity.
Finish the @JoinTable mapping by making the join table column mapping explicit.
@JoinTable(name="RELATIONEX_MEMBER_PERSON",
joinColumns={
@JoinColumn(name="MEMBER_ID", referencedColumnName="ID"),
}, inverseJoinColumns={
@JoinColumn(name="PERSON_ID", referencedColumnName="ID"),
}
)
private Person person;
The JoinTable.name property was used to name the table
The JoinTable.joinColumns property was used to define the column(s) pointing to this dependent entity
The JoinTable.inverseJoinColumns property was used to define column(s) pointing to the parent entity
Multiple @JoinColumns would have been necessary only when using composite keys
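To illustrate that last point, here is a sketch of how each side of a @JoinTable would list one @JoinColumn per primary key column when the parent uses a composite key. The Passport entity, its COUNTRY/NUMBER key columns, and the table names below are hypothetical -- they are not part of this lab's schema.

```java
// Hypothetical example only: assumes a parent entity (Passport) whose
// composite primary key is (COUNTRY, NUMBER). The inverse side of the
// join table must then supply one @JoinColumn per key column.
@OneToOne(optional=false, fetch=FetchType.EAGER)
@JoinTable(name="RELATIONEX_HOLDER_PASSPORT",
    joinColumns={
        @JoinColumn(name="HOLDER_ID", referencedColumnName="ID")
    }, inverseJoinColumns={
        @JoinColumn(name="PASSPORT_COUNTRY", referencedColumnName="COUNTRY"),
        @JoinColumn(name="PASSPORT_NUMBER", referencedColumnName="NUMBER")
    })
private Passport passport;
```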
Re-build the module and note the generated database schema for the join table. The columns now have the custom names we assigned.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_MEMBER_PERSON ( PERSON_ID integer not null, MEMBER_ID integer not null, primary key (MEMBER_ID), unique (PERSON_ID) );
Add the following test method to your existing one-to-one test case.
@Test
public void testOne2OneUniJoinTable() {
log.info("*** testOne2OneUniJoinTable ***");
Person person = new Person();
person.setName("Joe Smith");
Member member = new Member(person);
member.setRole(Member.Role.SECONDARY);
em.persist(person);
em.persist(member); //provider will propagate person.id into the join table row
//clear the persistence context and get new instances
em.flush(); em.clear();
Member member2 = em.find(Member.class, member.getId());
assertEquals("unexpected role", member.getRole(), member2.getRole());
assertEquals("unexpected name", member.getPerson().getName(), member2.getPerson().getName());
}
Build the module, run the new test method, and observe the database output. Notice the extra insert for the join table
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniJoinTable ... -*** testOne2OneUniJoinTable *** Hibernate: insert into RELATIONEX_PERSON (id, name) values (null, ?) Hibernate: insert into RELATIONEX_MEMBER (id, role) values (null, ?) Hibernate: insert into RELATIONEX_MEMBER_PERSON (PERSON_ID, MEMBER_ID) values (?, ?) Hibernate: select member0_.id as id3_1_, member0_.role as role3_1_, member0_1_.PERSON_ID as PERSON1_4_1_, person1_.id as id1_0_, person1_.name as name1_0_ from RELATIONEX_MEMBER member0_ left outer join RELATIONEX_MEMBER_PERSON member0_1_ on member0_.id=member0_1_.MEMBER_ID inner join RELATIONEX_PERSON person1_ on member0_1_.PERSON_ID=person1_.id where member0_.id=?
If you make the relationship optional, the inner join to the Person changes to a left outer join -- allowing us to locate Members that have no related Person.
@OneToOne(optional=true,fetch=FetchType.EAGER)
@JoinTable(name="RELATIONEX_MEMBER_PERSON",
Hibernate: select member0_.id as id3_1_, member0_.role as role3_1_, member0_1_.PERSON_ID as PERSON1_4_1_, person1_.id as id1_0_, person1_.name as name1_0_ from RELATIONEX_MEMBER member0_ left outer join RELATIONEX_MEMBER_PERSON member0_1_ on member0_.id=member0_1_.MEMBER_ID left outer join RELATIONEX_PERSON person1_ on member0_1_.PERSON_ID=person1_.id where member0_.id=?
If you change from EAGER to LAZY fetch type, the provider then has the option of skipping the two extra tables until the Person is actually needed. Note, however, in the provided output that the provider joined with at least the join table so that it could build a lightweight reference to the Person.
@OneToOne(optional=false,fetch=FetchType.LAZY)
@JoinTable(name="RELATIONEX_MEMBER_PERSON",
Hibernate: select member0_.id as id3_0_, member0_.role as role3_0_, member0_1_.PERSON_ID as PERSON1_4_0_ from RELATIONEX_MEMBER member0_ left outer join RELATIONEX_MEMBER_PERSON member0_1_ on member0_.id=member0_1_.MEMBER_ID where member0_.id=?
Using LAZY fetch mode, the provider is able to postpone getting the parent object until it is actually requested.
Hibernate: select person0_.id as id1_0_, person0_.name as name1_0_ from RELATIONEX_PERSON person0_ where person0_.id=?
Add the following test of the SQL structure to the test method. Here we can assert what we believe the mapping and values should be in the database when forming the one-to-one relationship using the join table.
Object[] cols = (Object[]) em.createNativeQuery(
"select person.id person_id, person.name, " +
"member.id member_id, member.role member_role, " +
"link.member_id link_member, link.person_id link_person " +
"from RELATIONEX_MEMBER member " +
"join RELATIONEX_MEMBER_PERSON link on link.member_id = member.id " +
"join RELATIONEX_PERSON person on link.person_id = person.id " +
"where member.id = ?1")
.setParameter(1, member.getId())
.getSingleResult();
log.info("row=" + Arrays.toString(cols));
assertEquals("unexpected person_id", person.getId(), ((Number)cols[0]).intValue());
assertEquals("unexpected person_name", person.getName(), (String)cols[1]);
assertEquals("unexpected member_id", member.getId(), ((Number)cols[2]).intValue());
assertEquals("unexpected member_role", member.getRole().name(), (String)cols[3]);
assertEquals("unexpected link_member_id", member.getId(), ((Number)cols[4]).intValue());
assertEquals("unexpected link_person_id", person.getId(), ((Number)cols[5]).intValue());
Re-build the module, run the test method of interest, and note the success of our assertions on the schema and the produced values.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniJoinTable ... Hibernate: select person.id person_id, person.name, member.id member_id, member.role member_role, link.member_id link_member, link.person_id link_person from RELATIONEX_MEMBER member join RELATIONEX_MEMBER_PERSON link on link.member_id = member.id join RELATIONEX_PERSON person on link.person_id = person.id where member.id = ? -row=[1, Joe Smith, 1, SECONDARY, 1, 1]
Add the following cleanup to the test method.
em.remove(member2);
em.remove(member2.getPerson());
em.flush();
assertNull("person not deleted", em.find(Person.class, person.getId()));
assertNull("member not deleted", em.find(Member.class, member.getId()));
Re-build, note the successful results of our assertions, and observe the database output. A row is deleted from both the Member and join tables when the Member is deleted. The Person row is deleted when we finally delete the Person.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniJoinTable ... Hibernate: delete from RELATIONEX_MEMBER_PERSON where MEMBER_ID=? Hibernate: delete from RELATIONEX_MEMBER where id=? Hibernate: delete from RELATIONEX_PERSON where id=? ...
We have completed our one-to-one, uni-directional relationship implemented through a join table. It required an extra table, and some more verbose mappings -- but not any structural change to the dependent entity class.
Next we will attempt to remove both the separate foreign key column from the dependent table and the separate join table mapping the dependent and parent tables. We will instead map the dependent to the parent using a join of their primary key values. This means the primary keys of both entities/tables must hold the same value. The parent's primary key can be automatically generated -- but the dependent's primary key value must be based on the parent's value. As you will see, this causes a slight complication in ordering the persist calls for the two entities.
Add the following entity class to your src/main tree to implement a one-to-one, uni-directional, primary key join. In this entity class, we have replaced the @JoinColumn with a @PrimaryKeyJoinColumn specification. This tells the provider not to create a separate foreign key column in the database and to reuse the primary key column to form the relation to the Person.
package myorg.relex.one2one;
import java.util.Date;
import javax.persistence.*;
/**
* Provides example of one-to-one unidirectional relationship
* using a primary key join.
*/
@Entity
@Table(name="RELATIONEX_EMPLOYEE")
public class Employee {
@Id //pk value must be assigned, not generated
private int id;
@OneToOne(optional=false,fetch=FetchType.EAGER)
@PrimaryKeyJoinColumn //informs provider the FK derived from PK
private Person person;
@Temporal(TemporalType.DATE)
private Date hireDate;
protected Employee() {}
public Employee(Person person) {
this.person = person;
if (person != null) { id = person.getId(); }
}
public int getId() { return person.getId(); }
public Person getPerson() { return person; }
public Date getHireDate() { return hireDate; }
public void setHireDate(Date hireDate) {
this.hireDate = hireDate;
}
}
Note...
The dependent entity has an @Id property compatible with the type in the parent entity @Id
The dependent entity @Id is not generated -- it must be assigned
The relationship to the parent entity is defined as being realized through the value in the primary key
The dependent entity class requires the parent be provided in the constructor and provides no setter for the relation. JPA does not require this, but it is an appropriate class design since the person is a required relation, the source of the primary key, and it is illegal to change the value of a primary key in the database.
Add the new entity class to the persistence unit.
<class>myorg.relex.one2one.Employee</class>
Build the module and observe the database schema generated. Notice the Employee table does not have a separate foreign key column and its primary key is assigned the duties of the foreign key.
create table RELATIONEX_EMPLOYEE ( id integer not null, hireDate date, primary key (id) ); create table RELATIONEX_PERSON ( id integer generated by default as identity, name varchar(255), primary key (id) ); alter table RELATIONEX_EMPLOYEE add constraint FK813A593E1907563C foreign key (id) references RELATIONEX_PERSON;
Add the following test method to your existing one-to-one test case. It is incomplete at this point and will cause an error.
@Test
public void testOne2OneUniPKJ() {
log.info("*** testOne2OneUniPKJ ***");
Person person = new Person();
person.setName("Ozzie Newsome");
//em.persist(person);
//em.flush(); //generate the PK for the person
Employee employee = new Employee(person);//set PK/FK -- provider will not auto propagate
employee.setHireDate(new GregorianCalendar(1996, Calendar.JANUARY, 1).getTime());
em.persist(person);
em.persist(employee);
//clear the persistence context and get new instances
em.flush(); em.clear();
Employee employee2 = em.find(Employee.class, employee.getPerson().getId());
log.info("calling person...");
assertEquals("unexpected name", employee.getPerson().getName(), employee2.getPerson().getName());
}
Attempt to build and execute the new test method and observe the results. The problem is that the primary key is never set, and the required foreign key is being realized by that unset primary key.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniPKJ ... -*** testOne2OneUniPKJ *** Hibernate: insert into RELATIONEX_PERSON (id, name) values (null, ?) Hibernate: insert into RELATIONEX_EMPLOYEE (hireDate, id) values (?, ?) -SQL Error: 23506, SQLState: 23506 -Referential integrity constraint violation: "FK813A593E1907563C: PUBLIC.RELATIONEX_EMPLOYEE FOREIGN KEY(ID) REFERENCES PUBLIC.RELATIONEX_PERSON(ID) (0)"; SQL statement: insert into RELATIONEX_EMPLOYEE (hireDate, id) values (?, ?) [23506-168]
Move the persistence of the parent entity so that it is in place prior to being assigned to the dependent entity. That way the dependent entity will be receiving the primary key value in time for it to be persisted.
em.persist(person);
em.flush(); //generate the PK for the person
Employee employee = new Employee(person);//set PK/FK -- provider will not auto propagate
employee.setHireDate(new GregorianCalendar(1996, Calendar.JANUARY, 1).getTime());
//em.persist(person);
em.persist(employee);
Re-build the module and re-run the test method. It should now be able to persist both entities and successfully pull them back.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniPKJ ... -*** testOne2OneUniPKJ *** Hibernate: insert into RELATIONEX_PERSON (id, name) values (null, ?) Hibernate: insert into RELATIONEX_EMPLOYEE (hireDate, id) values (?, ?) ...
Notice that -- in the primary key join case -- the query to the database uses two separate selects rather than a single select with a join as done with the FK-join case. We can tell the fetch mode is EAGER by the fact that the select for the parent table occurs prior to making a call to the parent.
Hibernate: select employee0_.id as id6_0_, employee0_.hireDate as hireDate6_0_ from RELATIONEX_EMPLOYEE employee0_ where employee0_.id=? Hibernate: select person0_.id as id1_0_, person0_.name as name1_0_ from RELATIONEX_PERSON person0_ where person0_.id=? -calling person... ...
If you change the relationship to optional/EAGER, the select changes to a single outer join.
@OneToOne(optional=true,fetch=FetchType.EAGER)
@PrimaryKeyJoinColumn //informs provider the FK derived from PK
private Person person;
Hibernate: select employee0_.id as id6_1_, employee0_.hireDate as hireDate6_1_, person1_.id as id1_0_, person1_.name as name1_0_ from RELATIONEX_EMPLOYEE employee0_ left outer join RELATIONEX_PERSON person1_ on employee0_.id=person1_.id where employee0_.id=? -calling person...
If you change the relationship to required/LAZY you will notice by the location of "calling person..." -- the second select occurs at the point where the parent is being dereferenced and called.
@OneToOne(optional=false,fetch=FetchType.LAZY)
@PrimaryKeyJoinColumn //informs provider the FK derived from PK
private Person person;
Hibernate: select employee0_.id as id6_0_, employee0_.hireDate as hireDate6_0_ from RELATIONEX_EMPLOYEE employee0_ where employee0_.id=? -calling person... Hibernate: select person0_.id as id1_0_, person0_.name as name1_0_ from RELATIONEX_PERSON person0_ where person0_.id=?
One odd thing of note -- if we change the relationship to optional/LAZY, the provider performs the same type of query as when it was required/EAGER.
@OneToOne(optional=true,fetch=FetchType.LAZY)
@PrimaryKeyJoinColumn //informs provider the FK derived from PK
private Person person;
Hibernate: select employee0_.id as id6_0_, employee0_.hireDate as hireDate6_0_ from RELATIONEX_EMPLOYEE employee0_ where employee0_.id=? Hibernate: select person0_.id as id1_0_, person0_.name as name1_0_ from RELATIONEX_PERSON person0_ where person0_.id=? -calling person...
Add the following to your test method to verify the tables, columns, and values we expect at the raw SQL level.
Object[] cols = (Object[]) em.createNativeQuery(
"select person.id person_id, person.name, " +
"employee.id employee_id " +
"from RELATIONEX_EMPLOYEE employee " +
"join RELATIONEX_PERSON person on person.id = employee.id " +
"where employee.id = ?1")
.setParameter(1, employee.getId())
.getSingleResult();
log.info("row=" + Arrays.toString(cols));
assertEquals("unexpected person_id", person.getId(), ((Number)cols[0]).intValue());
assertEquals("unexpected person_name", person.getName(), (String)cols[1]);
assertEquals("unexpected employee_id", employee.getId(), ((Number)cols[2]).intValue());
Rebuild the module and execute the test method to verify the assertions about the raw SQL structure and values.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniPKJ ... Hibernate: select person.id person_id, person.name, employee.id employee_id from RELATIONEX_EMPLOYEE employee join RELATIONEX_PERSON person on person.id = employee.id where employee.id = ? -row=[1, Ozzie Newsome, 1]
Add the following cleanup logic to test the ability to delete the entities and their relationship.
em.remove(employee2);
em.remove(employee2.getPerson());
em.flush();
assertNull("person not deleted", em.find(Person.class, person.getId()));
assertNull("employee not deleted", em.find(Employee.class, employee.getId()));
Re-build the module and verify the ability to delete the dependent and parent entities.
Hibernate: delete from RELATIONEX_EMPLOYEE where id=? Hibernate: delete from RELATIONEX_PERSON where id=?
You have finished modeling a one-to-one, uni-directional relationship using a primary key join. Using this technique saved the dependent entity from needing a separate foreign key column but created the requirement that the parent entity be persisted first. We also saw how changing the optionality and fetch mode could impact the underlying queries to the database. In the next section we will show how a new feature in JPA 2.0 can ease the propagation of the parent primary key to the dependent entity.
JPA 2.0 added a new annotation called @MapsId that can ease the propagation of the parent primary key to the dependent entity. There are several uses of @MapsId. We will first look at its capability to identify the foreign key of a dependent entity as the source of the primary key value. We saw in the FK-join case that the provider automatically propagates FK values to dependent entities, but it does not do so for PK-joins. Rather than saying the PK realizes the FK, @MapsId states that the FK realizes the PK. Let's take a concrete look...
Add the following entity class to your src/main tree. It is incomplete at this point in time.
package myorg.relex.one2one;
import javax.persistence.*;
/**
* This class demonstrates a one-to-one, uni-directional relationship
* where the foreign key is used to define the primary key with the
* use of @MapsId
*/
@Entity
@Table(name="RELATIONEX_COACH")
public class Coach {
public enum Type {HEAD, ASSISTANT };
@Id //provider sets to FK value with help from @MapsId
private int id;
@OneToOne(optional=false, fetch=FetchType.EAGER)
// @MapsId //informs provider the PK is derived from FK
private Person person;
@Enumerated(EnumType.STRING) @Column(length=16)
private Type type;
public Coach() {}
public Coach(Person person) {
this.person = person;
}
public int getId() { return person==null ? 0 : person.getId(); }
public Person getPerson() { return person; }
public Type getType() { return type; }
public void setType(Type type) {
this.type = type;
}
}
Add the entity class to the persistence unit.
<class>myorg.relex.one2one.Coach</class>
Rebuild the module and take a look at the generated database schema for what was initially defined above. Notice how the dependent table has been defined to have both a primary key and a separate foreign key. Let's fix that so a single column serves both purposes, as we did in the PK-join case.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_COACH ( id integer not null, type varchar(16), person_id integer not null, primary key (id), unique (person_id) ); ... alter table RELATIONEX_COACH add constraint FK75C513EA4BE1E366 foreign key (person_id) references RELATIONEX_PERSON;
Update the dependent entity class to inform the provider to derive the primary key value from the assigned foreign key relationship using the @MapsId annotation.
@Id //provider sets to FK value with help from @MapsId
private int id;
@OneToOne(optional=false, fetch=FetchType.EAGER)
@MapsId //informs provider the PK is derived from FK
private Person person;
If you look back over the entire class design you should notice that the class provides no way to ever assign the @Id except through @MapsId.
Rebuild the module and review the generated database schema. Notice how the provider is now using the column named after the foreign key as the primary key and has eliminated the separate primary key.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_COACH ( type varchar(16), person_id integer not null, primary key (person_id), unique (person_id) ); ... alter table RELATIONEX_COACH add constraint FK75C513EA4BE1E366 foreign key (person_id) references RELATIONEX_PERSON;
Add the following test method to your existing one-to-one test case. Notice the test method persists the parent and dependent entities together -- without having to worry about deriving the parent primary key first. That is very convenient.
@Test
public void testOne2OneUniMapsId() {
log.info("*** testOne2OneUniMapsId ***");
Person person = new Person();
person.setName("John Harbaugh");
Coach coach = new Coach(person);
coach.setType(Coach.Type.HEAD);
em.persist(person);
em.persist(coach); //provider auto propagates person.id to coach.FK mapped to coach.PK
//flush commands to database, clear cache, and pull back new instance
em.flush(); em.clear();
Coach coach2 = em.find(Coach.class, coach.getId());
log.info("calling person...");
assertEquals("unexpected name", coach.getPerson().getName(), coach2.getPerson().getName());
}
Re-build the module and run the new test method. Notice the provider issues two separate selects; one each for the dependent and parent entity.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniMapsId ... -*** testOne2OneUniMapsId *** Hibernate: insert into RELATIONEX_PERSON (id, name) values (null, ?) Hibernate: insert into RELATIONEX_COACH (type, person_id) values (?, ?) Hibernate: select coach0_.person_id as person2_5_0_, coach0_.type as type5_0_ from RELATIONEX_COACH coach0_ where coach0_.person_id=? Hibernate: select person0_.id as id1_0_, person0_.name as name1_0_ from RELATIONEX_PERSON person0_ where person0_.id=? -calling person...
Add the following assertions about the SQL structure and values.
Object[] cols = (Object[]) em.createNativeQuery(
"select person.id person_id, person.name, " +
"coach.person_id coach_id " +
"from RELATIONEX_COACH coach " +
"join RELATIONEX_PERSON person on person.id = coach.person_id " +
"where coach.person_id = ?1")
.setParameter(1, coach.getId())
.getSingleResult();
log.info("row=" + Arrays.toString(cols));
assertEquals("unexpected person_id", person.getId(), ((Number)cols[0]).intValue());
assertEquals("unexpected person_name", person.getName(), (String)cols[1]);
assertEquals("unexpected coach_id", coach.getId(), ((Number)cols[2]).intValue());
Rebuild the module, re-run the test method, and observe the results of the new assertions.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniMapsId ... Hibernate: select person.id person_id, person.name, coach.person_id coach_id from RELATIONEX_COACH coach join RELATIONEX_PERSON person on person.id = coach.person_id where coach.person_id = ? -row=[1, John Harbaugh, 1]
Add cleanup logic and assertions of the removal of the two entity rows.
em.remove(coach2);
em.remove(coach2.getPerson());
em.flush();
assertNull("person not deleted", em.find(Person.class, person.getId()));
assertNull("coach not deleted", em.find(Coach.class, coach.getId()));
Re-build the module, re-run the test method, and note the successful deletion of the two entity rows.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniMapsId ... Hibernate: delete from RELATIONEX_COACH where person_id=? Hibernate: delete from RELATIONEX_PERSON where id=?
You have completed implementing a one-to-one, uni-directional relationship using @MapsId to derive the primary key of the dependent entity from the foreign key to the parent entity. This allowed the persist() of the two entities to occur without worrying about sequencing them as separate actions to the database.
This section will cover cases where one wants to map a one-to-one primary key join to a parent entity that uses a composite primary key. The dependent entity may use either an @IdClass/@PrimaryKeyJoin or an @EmbeddedId/@MapsId to realize this relationship and identity.
To get started, put the following parent class in place in your src/main tree.
package myorg.relex.one2one;
import java.util.Date;
import javax.persistence.*;
/**
* This class represents the passive side of a one-to-one
* uni-directional relationship where the parent uses
* a composite primary key that must be represented in
* the dependent entity's relationship.
*/
@Entity
@Table(name="RELATIONEX_SHOWEVENT")
@IdClass(ShowEventPK.class)
public class ShowEvent {
@Id
@Temporal(TemporalType.DATE)
private Date date;
@Id
@Temporal(TemporalType.TIME)
private Date time;
@Column(length=20)
private String name;
public ShowEvent() {}
public ShowEvent(Date date, Date time) {
this.date = date;
this.time = time;
}
public Date getDate() { return date; }
public Date getTime() { return time; }
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
}
The above entity class uses two properties to form its primary key -- thus it requires a composite primary key to represent the PK within JPA.
Put the following composite primary key class in place. It is defined as @Embeddable so that it can be used both as an @IdClass and an @EmbeddedId.
package myorg.relex.one2one;
import java.io.Serializable;
import java.util.Date;
import javax.persistence.Embeddable;
/**
* This class will be used as an IdClass for the ShowEvent
* entity.
*/
@Embeddable
public class ShowEventPK implements Serializable {
private static final long serialVersionUID = 1L;
private Date date;
private Date time;
protected ShowEventPK(){}
public ShowEventPK(Date date, Date time) {
this.date = date;
this.time = time;
}
public Date getDate() { return date; }
public Date getTime() { return time; }
@Override
public int hashCode() { return date.hashCode() + time.hashCode(); }
@Override
public boolean equals(Object obj) {
try {
return date.equals(((ShowEventPK)obj).date) &&
time.equals(((ShowEventPK)obj).time);
} catch (Exception ex) { return false; }
}
}
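The equals/hashCode pair above is load-bearing: the provider compares key instances by value when resolving entity identity, so two key objects built from equal column values must be equal and share a hash code. The following standalone sketch demonstrates that contract; the class names (PKEqualityDemo, EventPK) are hypothetical stand-ins and are independent of the lab's entities.

```java
import java.io.Serializable;
import java.util.Date;

public class PKEqualityDemo {
    // Minimal composite-key class mirroring the value-equality rules
    // a JPA @IdClass/@Embeddable key must follow.
    static class EventPK implements Serializable {
        private final Date date;
        private final Date time;
        EventPK(Date date, Date time) { this.date = date; this.time = time; }
        @Override
        public int hashCode() { return date.hashCode() + time.hashCode(); }
        @Override
        public boolean equals(Object obj) {
            try {
                return date.equals(((EventPK)obj).date) &&
                       time.equals(((EventPK)obj).time);
            } catch (Exception ex) { return false; } // null or wrong type
        }
    }

    public static void main(String[] args) {
        // Two distinct instances built from equal column values
        EventPK pk1 = new EventPK(new Date(0), new Date(1000));
        EventPK pk2 = new EventPK(new Date(0), new Date(1000));
        System.out.println(pk1.equals(pk2));                  // prints "true"
        System.out.println(pk1.hashCode() == pk2.hashCode()); // prints "true"
        System.out.println(pk1.equals(null));                 // prints "false"
    }
}
```

Without value-based equality, a lookup such as em.find(ShowEvent.class, new ShowEventPK(date, time)) could not match the key the provider holds for a managed instance.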
Add the above parent entity to the persistence unit.
<class>myorg.relex.one2one.ShowEvent</class>
Continue on with mapping the dependent entity using an @IdClass and an @EmbeddedId. You will find the @IdClass technique acts much like the @PrimaryKeyJoinColumn case we performed earlier, and the @EmbeddedId technique acts much like the @MapsId case.
This sub-section will map the dependent class to the parent using an @IdClass.
Put the following dependent entity class in your src/main tree. It is incomplete at this point and will generate the default mapping for the relationship to the class using the composite PK. Since we eventually want to derive the primary key for this dependent entity from the parent entity, we also model the same properties as @Id and use an @IdClass to represent the PK within JPA. At this point the composite identity of the dependent entity is independent of the relationship.
package myorg.relex.one2one;
import java.util.Date;
import javax.persistence.*;
/**
* This class provides an example of the owning entity of a
* one-to-one, uni-directional relationship where the dependent's
* primary key is derived from the parent and the parent uses
* a composite primary key.
*/
@Entity
@Table(name="RELATIONEX_SHOWTICKETS")
@IdClass(ShowEventPK.class)
public class ShowTickets {
@Id
@Temporal(TemporalType.DATE)
@Column(name="TICKET_DATE")
private Date date;
@Id
@Temporal(TemporalType.TIME)
@Column(name="TICKET_TIME")
private Date time;
@OneToOne(optional=false, fetch=FetchType.EAGER)
/*
@PrimaryKeyJoinColumns({
@PrimaryKeyJoinColumn(name="TICKET_DATE", referencedColumnName="date"),
@PrimaryKeyJoinColumn(name="TICKET_TIME", referencedColumnName="time"),
})
*/
private ShowEvent show;
@Column(name="TICKETS")
private int ticketsLeft;
public ShowTickets() {}
public ShowTickets(ShowEvent show) {
this.date = show.getDate();
this.time = show.getTime();
this.show = show;
}
public Date getDate() { return show==null ? null : show.getDate(); }
public Date getTime() { return show==null ? null : show.getTime(); }
public ShowEvent getShow() { return show; }
public int getTicketsLeft() { return ticketsLeft; }
public void setTicketsLeft(int ticketsLeft) {
this.ticketsLeft = ticketsLeft;
}
}
Add the dependent entity class to the persistence unit.
<class>myorg.relex.one2one.ShowTickets</class>
Build the module and observe the database schema generated for the entity classes involved. Notice how the dependent table has seemingly duplicate columns. There is a TICKET_DATE/TIME set of columns that represents the dependent entity's composite primary key. There is also a show_date/time set of columns to reference the parent entity -- which also uses a composite primary key. If the referenced entity of a foreign key relationship uses a composite primary key -- then the foreign key is also expressed as a composite set of columns.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
create table RELATIONEX_SHOWEVENT (
    date date not null,
    time time not null,
    name varchar(20),
    primary key (date, time)
);
create table RELATIONEX_SHOWTICKETS (
    TICKET_DATE date not null,
    TICKET_TIME time not null,
    TICKETS integer,
    show_date date not null,
    show_time time not null,
    primary key (TICKET_DATE, TICKET_TIME),
    unique (show_date, show_time)
);
...
alter table RELATIONEX_SHOWTICKETS
    add constraint FK93AB7C9AE3196D0
    foreign key (show_date, show_time)
    references RELATIONEX_SHOWEVENT;
Update the relationship with a default mapping for a @PrimaryKeyJoinColumn.
@OneToOne(optional=false, fetch=FetchType.EAGER)
@PrimaryKeyJoinColumn /*s({
@PrimaryKeyJoinColumn(name="TICKET_DATE", referencedColumnName="date"),
@PrimaryKeyJoinColumn(name="TICKET_TIME", referencedColumnName="time"),
})*/
private ShowEvent show;
Re-build the module and observe how the default mapping of the @PrimaryKeyJoinColumn was realized when using the composite primary key.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
create table RELATIONEX_SHOWTICKETS (
    TICKET_DATE date not null,
    TICKET_TIME time not null,
    TICKETS integer,
    primary key (TICKET_DATE, TICKET_TIME)
);
...
alter table RELATIONEX_SHOWTICKETS
    add constraint FK93AB7C9A1C31D972
    foreign key (TICKET_DATE, TICKET_TIME)
    references RELATIONEX_SHOWEVENT;
In this case, the provider was able to generate default mappings that are exactly what we would have created manually. You could have enabled the following custom mappings to explicitly map primary key column values from the dependent table columns to the parent table columns.
@OneToOne(optional=false, fetch=FetchType.EAGER)
@PrimaryKeyJoinColumns({
@PrimaryKeyJoinColumn(name="TICKET_DATE", referencedColumnName="date"),
@PrimaryKeyJoinColumn(name="TICKET_TIME", referencedColumnName="time"),
})
private ShowEvent show;
Note there can be only a single @PrimaryKeyJoinColumn annotation applied to a property. Multiple @PrimaryKeyJoinColumn mappings must be wrapped within a @PrimaryKeyJoinColumns annotation to work.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
    TICKET_DATE date not null,
    TICKET_TIME time not null,
    TICKETS integer,
    primary key (TICKET_DATE, TICKET_TIME)
);
...
alter table RELATIONEX_SHOWTICKETS
    add constraint FK93AB7C9A1C31D972
    foreign key (TICKET_DATE, TICKET_TIME)
    references RELATIONEX_SHOWEVENT;
Add the following test method to your one-to-one test case. Although the @IdClass/@PrimaryKeyJoinColumn technique is very similar to the @Id/@PrimaryKeyJoinColumn case covered earlier, this approach is simplified by the fact the primary key of the parent is not dynamically generated. The relationship assembly can occur as soon as we derive the natural key values for the parent entity.
@Test
public void testOne2OneUniIdClass() {
log.info("*** testOne2OneUniIdClass ***");
Date showDate = new GregorianCalendar(1975+new Random().nextInt(100),
Calendar.JANUARY, 1).getTime();
Date showTime = new GregorianCalendar(0, 0, 0, 0, 0, 0).getTime();
ShowEvent show = new ShowEvent(showDate, showTime);
show.setName("Rocky Horror");
ShowTickets tickets = new ShowTickets(show); //parent already has natural PK by this point
tickets.setTicketsLeft(300);
em.persist(show);
em.persist(tickets);
//flush commands to database, clear cache, and pull back new instance
em.flush(); em.clear();
ShowTickets tickets2 = em.find(ShowTickets.class, new ShowEventPK(tickets.getDate(), tickets.getTime()));
log.info("calling parent...");
assertEquals("unexpected name", tickets.getShow().getName(), tickets2.getShow().getName());
}
Re-build the module and note the creation of the parent and dependent entities.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniIdClass
...
-*** testOne2OneUniIdClass ***
Hibernate: insert into RELATIONEX_SHOWEVENT (name, date, time) values (?, ?, ?)
Hibernate: insert into RELATIONEX_SHOWTICKETS (TICKETS, TICKET_DATE, TICKET_TIME) values (?, ?, ?)
The provider uses a set of selects to fully assemble our object tree for use.
Hibernate: select showticket0_.TICKET_DATE as TICKET1_8_0_, showticket0_.TICKET_TIME as TICKET2_8_0_, showticket0_.TICKETS as TICKETS8_0_ from RELATIONEX_SHOWTICKETS showticket0_ where showticket0_.TICKET_DATE=? and showticket0_.TICKET_TIME=?
Hibernate: select showevent0_.date as date7_0_, showevent0_.time as time7_0_, showevent0_.name as name7_0_ from RELATIONEX_SHOWEVENT showevent0_ where showevent0_.date=? and showevent0_.time=?
-calling parent...
Add the following to the test method to verify our assertions about the structure of the database tables and their values related to this example.
Object[] cols = (Object[]) em.createNativeQuery(
"select show.date show_date, show.time show_time, " +
"tickets.ticket_date ticket_date, tickets.ticket_time ticket_time, tickets.tickets " +
"from RELATIONEX_SHOWEVENT show " +
"join RELATIONEX_SHOWTICKETS tickets on show.date = tickets.ticket_date and show.time = tickets.ticket_time " +
"where tickets.ticket_date = ?1 and tickets.ticket_time = ?2")
.setParameter(1, tickets.getShow().getDate(), TemporalType.DATE)
.setParameter(2, tickets.getShow().getTime(), TemporalType.TIME)
.getSingleResult();
log.info("row=" + Arrays.toString(cols));
assertEquals("unexpected show_date", tickets2.getShow().getDate(), (Date)cols[0]);
assertEquals("unexpected show_time", tickets2.getShow().getTime(), (Date)cols[1]);
assertEquals("unexpected ticket_date", tickets2.getDate(), (Date)cols[2]);
assertEquals("unexpected ticket_time", tickets2.getTime(), (Date)cols[3]);
assertEquals("unexpected ticketsLeft", tickets2.getTicketsLeft(), ((Number)cols[4]).intValue());
Re-build the module and observe the success of the SQL portion of the test method.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniIdClass
...
Hibernate: select show.date show_date, show.time show_time, tickets.ticket_date ticket_date, tickets.ticket_time ticket_time, tickets.tickets from RELATIONEX_SHOWEVENT show join RELATIONEX_SHOWTICKETS tickets on show.date = tickets.ticket_date and show.time = tickets.ticket_time where tickets.ticket_date = ? and tickets.ticket_time = ?
-row=[2033-01-01, 00:00:00, 2033-01-01, 00:00:00, 300]
Add the following cleanup logic and assertions to verify the rows have been deleted for the dependent and parent entities.
em.remove(tickets2);
em.remove(tickets2.getShow());
em.flush();
assertNull("show not deleted", em.find(ShowEvent.class,
new ShowEventPK(show.getDate(), show.getTime())));
assertNull("tickets not deleted", em.find(ShowTickets.class,
new ShowEventPK(tickets.getDate(), tickets.getTime())));
Re-build the module and observe the successful results of the completed test method.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniIdClass
...
Hibernate: delete from RELATIONEX_SHOWTICKETS where TICKET_DATE=? and TICKET_TIME=?
Hibernate: delete from RELATIONEX_SHOWEVENT where date=? and time=?
You have completed mapping a one-to-one uni-directional relationship that is based on a composite primary in the parent and the composite key mapped in the dependent table as an @IdClass.
In this second example of @MapsId, we will be informing the provider that the primary key for the dependent table is realized by the foreign key and, in this case, is a composite primary key. We must use an @EmbeddedId in order for this to work correctly.
Add the following entity class to your src/main tree. It is not complete at this point, and schema generation will show it having a separate primary key and foreign key.
package myorg.relex.one2one;
import java.util.Date;
import javax.persistence.*;
/**
* This class provides an example of the owning entity of a
* one-to-one, uni-directional relationship where the dependent's
* primary key is derived from the parent, the parent uses
* a composite primary key, and the dependent uses an @EmbeddedId
* and @MapsId.
*/
@Entity
@Table(name="RELATIONEX_BOXOFFICE")
public class BoxOffice {
@EmbeddedId
private ShowEventPK pk; //will be set by provider with help of @MapsId
@OneToOne(optional=false, fetch=FetchType.EAGER)
// @MapsId //provider maps this composite FK to @EmbeddedId PK value
private ShowEvent show;
@Column(name="TICKETS")
private int ticketsLeft;
protected BoxOffice() {}
public BoxOffice(ShowEvent show) {
this.show = show;
}
public Date getDate() { return show==null ? null : show.getDate(); }
public Date getTime() { return show==null ? null : show.getTime(); }
public ShowEvent getShow() { return show; }
public int getTicketsLeft() { return ticketsLeft; }
public void setTicketsLeft(int ticketsLeft) {
this.ticketsLeft = ticketsLeft;
}
}
Add the dependent entity class to the persistence unit.
<class>myorg.relex.one2one.BoxOffice</class>
Build the module and observe the generated schema. Notice the separate use of date/time for the primary key and show_date/time for the foreign key.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
create table RELATIONEX_BOXOFFICE (
    date timestamp not null,
    time timestamp not null,
    TICKETS integer,
    show_date date not null,
    show_time time not null,
    primary key (date, time),
    unique (show_date, show_time)
);
alter table RELATIONEX_BOXOFFICE
    add constraint FK64CED797E3196D0
    foreign key (show_date, show_time)
    references RELATIONEX_SHOWEVENT;
Update the dependent table mapping so that the foreign key is used to realize the primary key for the entity. Notice also the class provides no way to set the @EmbeddedId except through the @MapsId on the foreign key relationship.
@OneToOne(optional=false, fetch=FetchType.EAGER)
@MapsId //provider maps this composite FK to @EmbeddedId PK value
private ShowEvent show;
Re-build the module and observe the generated database schema. Note the primary key has now been mapped to the show_date/time foreign key columns.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
create table RELATIONEX_BOXOFFICE (
    TICKETS integer,
    show_date date,
    show_time time not null,
    primary key (show_date, show_time),
    unique (show_date, show_time)
);
alter table RELATIONEX_BOXOFFICE
    add constraint FK64CED797E3196D0
    foreign key (show_date, show_time)
    references RELATIONEX_SHOWEVENT;
Add the following test method to your one-to-one test case.
@Test
public void testOne2OneUniEmbeddedId() {
log.info("*** testOne2OneUniEmbedded ***");
Date showDate = new GregorianCalendar(1975+new Random().nextInt(100),
Calendar.JANUARY, 1).getTime();
Date showTime = new GregorianCalendar(0, 0, 0, 0, 0, 0).getTime();
ShowEvent show = new ShowEvent(showDate, showTime);
show.setName("Rocky Horror");
BoxOffice boxOffice = new BoxOffice(show);
boxOffice.setTicketsLeft(500);
em.persist(show);
em.persist(boxOffice); //provider auto propagates parent.cid to dependent.FK mapped to dependent.cid
//flush commands to database, clear cache, and pull back new instance
em.flush(); em.clear();
BoxOffice boxOffice2 = em.find(BoxOffice.class, new ShowEventPK(boxOffice.getDate(), boxOffice.getTime()));
log.info("calling parent...");
assertEquals("unexpected name", boxOffice.getShow().getName(), boxOffice2.getShow().getName());
}
Re-build the module and run the test method above.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneUniEmbeddedId
...
-*** testOne2OneUniEmbedded ***
Hibernate:
insert
into
RELATIONEX_SHOWEVENT
(name, date, time)
values
(?, ?, ?)
Hibernate:
insert
into
RELATIONEX_BOXOFFICE
(TICKETS, show_date, show_time)
values
(?, ?, ?)
Hibernate:
select
boxoffice0_.show_date as show2_9_0_,
boxoffice0_.show_time as show3_9_0_,
boxoffice0_.TICKETS as TICKETS9_0_
from
RELATIONEX_BOXOFFICE boxoffice0_
where
boxoffice0_.show_date=?
and boxoffice0_.show_time=?
Hibernate:
select
showevent0_.date as date7_0_,
showevent0_.time as time7_0_,
showevent0_.name as name7_0_
from
RELATIONEX_SHOWEVENT showevent0_
where
showevent0_.date=?
and showevent0_.time=?
-calling parent...
Add the following to verify our assertions about the SQL structure and values underlying the JPA abstraction.
Object[] cols = (Object[]) em.createNativeQuery(
"select show.date show_date, show.time show_time, " +
"tickets.show_date ticket_date, tickets.show_time ticket_time, tickets.tickets " +
"from RELATIONEX_SHOWEVENT show " +
"join RELATIONEX_BOXOFFICE tickets on show.date = tickets.show_date and show.time = tickets.show_time " +
"where tickets.show_date = ?1 and tickets.show_time = ?2")
.setParameter(1, boxOffice.getShow().getDate(), TemporalType.DATE)
.setParameter(2, boxOffice.getShow().getTime(), TemporalType.TIME)
.getSingleResult();
log.info("row=" + Arrays.toString(cols));
assertEquals("unexpected show_date", boxOffice2.getShow().getDate(), (Date)cols[0]);
assertEquals("unexpected show_time", boxOffice2.getShow().getTime(), (Date)cols[1]);
assertEquals("unexpected ticket_date", boxOffice2.getDate(), (Date)cols[2]);
assertEquals("unexpected ticket_time", boxOffice2.getTime(), (Date)cols[3]);
assertEquals("unexpected ticketsLeft", boxOffice2.getTicketsLeft(), ((Number)cols[4]).intValue());
Re-build the module and re-run the test method to verify the underlying SQL structure is as we assume it to be.
Hibernate: select show.date show_date, show.time show_time, tickets.show_date ticket_date, tickets.show_time ticket_time, tickets.tickets from RELATIONEX_SHOWEVENT show join RELATIONEX_BOXOFFICE tickets on show.date = tickets.show_date and show.time = tickets.show_time where tickets.show_date = ? and tickets.show_time = ?
-row=[1994-01-01, 00:00:00, 1994-01-01, 00:00:00, 500]
Add the following removal logic to test that we can remove the two entities and their associated rows.
em.remove(boxOffice2);
em.remove(boxOffice2.getShow());
em.flush();
assertNull("show not deleted", em.find(ShowEvent.class,
new ShowEventPK(show.getDate(), show.getTime())));
assertNull("tickets not deleted", em.find(BoxOffice.class,
new ShowEventPK(boxOffice.getDate(), boxOffice.getTime())));
Observe the output of the deletes. It is consistent with before.
Hibernate: delete from RELATIONEX_BOXOFFICE where show_date=? and show_time=?
Hibernate: delete from RELATIONEX_SHOWEVENT where date=? and time=?
You have now completed the mapping of a one-to-one, uni-directional relationship that is based on a composite primary key and realized through the use of an @EmbeddedId and @MapsId.
In this chapter, we have so far only addressed uni-directional relationships -- where only one side of the relationship was aware of the other at the Java class level. We can also make our relationships bi-directional for easy navigation to/from either side. This requires no change to the database and is only a change at the Java and mapping levels.
In bi-directional relationships, it is important to understand there are two sides to the relationship: the owning side and the inverse side.
The owning side of the relation defines the mapping to the database. This is what we did for the uni-directional sections above.
The inverse side of the relation must refer to its owning side mapping through the "mappedBy" property of the @XxxToXxx annotation.
For OneToOne relationships, the owning side contains the foreign key or defines the join table.
The provider will initialize the state of the inverse side during calls like find() and refresh(), but will not update its value during application changes to the owning side. It is the application programmer's job to keep the two references consistent.
The provider will only trigger persistence changes through changes to the owning side.
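The last two points can be sketched in plain Java. Parent and Child are hypothetical names and the JPA annotations are omitted; the point is only that a setter on one side must also update the back-reference, because the provider will not do it for us:

```java
// Sketch: keeping both references of a bi-directional one-to-one consistent.
// In a real mapping, Child.parent would be the owning side (holds the FK) and
// Parent.child the inverse side (annotated with mappedBy="parent").
class Child {
    Parent parent; // owning side: this reference drives database changes
}

class Parent {
    Child child; // inverse side: ignored by the provider on updates

    void setChild(Child child) {
        this.child = child;
        if (child != null) {
            child.parent = this; // application code must maintain the other side
        }
    }
}

public class BothSides {
    public static void main(String[] args) {
        Parent p = new Parent();
        Child c = new Child();
        p.setChild(c);
        System.out.println(p.child == c && c.parent == p); // prints: true
    }
}
```

Forgetting the back-reference leaves the in-memory object graph inconsistent with the database, which is a common source of subtle bugs in bi-directional mappings.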
Let's start the discussion of one-to-one bi-directional relationships using a set of entities that are pretty much joined at the hip. Their properties have been mapped to separate database tables and Java entity classes, but they will never reference a different instance. For this reason we will assign them a common primary key, join them by that common primary key value, and propagate the primary key to the dependent class using @MapsId.
Add the following inverse side entity class to your src/main tree. It is currently incomplete and will soon cause an error.
package myorg.relex.one2one;
import javax.persistence.*;
/**
* This class provides an example of the inverse side of a
* one-to-one bi-directional relationship.
*/
@Entity
@Table(name="RELATIONEX_APPLICANT")
public class Applicant {
@Id @GeneratedValue
private int id;
@Column(length=32)
private String name;
// @OneToOne(mappedBy="applicant", //identifies property on owning side
// fetch=FetchType.LAZY)
// @Transient
private Application application;
public Applicant(){}
public Applicant(int id) {
this.id = id;
}
public int getId() { return id; }
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
public Application getApplication() { return application; }
public void setApplication(Application application) {
this.application = application;
}
}
Add the following owning side entity class to your src/main tree. It is currently incomplete and will not yet generate the primary key mapping we desire in this case.
package myorg.relex.one2one;
import java.util.Date;
import javax.persistence.*;
/**
* This class provides an example of the owning side
* of a one-to-one, bi-directional relationship.
*/
@Entity
@Table(name="RELATIONEX_APPLICATION")
public class Application {
@Id
private int id;
// @MapsId //foreign key realizes primary key
@OneToOne(//lack of mappedBy identifies this as owning side
optional=false, fetch=FetchType.EAGER)
private Applicant applicant;
@Temporal(TemporalType.DATE)
private Date desiredStartDate;
protected Application() {}
public Application(Applicant applicant) {
this.applicant = applicant;
if (applicant != null) {
applicant.setApplication(this); //must maintain inverse side
}
}
public int getId() { return id; }
public Applicant getApplicant() { return applicant; }
public Date getDesiredStartDate() { return desiredStartDate; }
public void setDesiredStartDate(Date desiredStartDate) {
this.desiredStartDate = desiredStartDate;
}
}
It is important to note that -- in the case of a bi-directional relationship -- the application developer is responsible for setting both sides of the relationship even though JPA is only concerned with the owning side when making changes to the database. We can either make the assignment here ...
applicant.setApplication(this); //must maintain inverse side
... or from the code that invoked the new Application(applicant) constructor in the first place.
Application application = new Application(applicant);
applicant.setApplication(application); //must maintain inverse side
Add the two entity classes to your persistence unit.
<class>myorg.relex.one2one.Applicant</class>
<class>myorg.relex.one2one.Application</class>
Attempt to build the module and note the error from the provider attempting to map the Application entity properties.
$ mvn clean process-test-classes
...
org.hibernate.MappingException: Could not determine type for:
    myorg.relex.one2one.Application, at table: RELATIONEX_APPLICANT,
    for columns: [org.hibernate.mapping.Column(application)]
The error occurs because...
The entity uses FIELD access and an un-annotated field property was found that has no default mapping
The referenced Application entity does not implement Serializable and cannot be stored in a BLOB column within Applicant (nor would we want it to be)
Let's initially get beyond the error by marking the property as @Transient. This will allow the Java attribute to exist in memory but will not give it any mapping to the database. That may be what we ultimately want in some cases, but not here. We are only using @Transient to get back to a stable state while we work through the other mapping issues in front of us.
@Transient
private Application application;
Re-build the module and observe the generated database schema so far. Notice the expected uni-directional behavior has been recreated with our current mapping.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
create table RELATIONEX_APPLICANT (
    id integer generated by default as identity,
    name varchar(32),
    primary key (id)
);
create table RELATIONEX_APPLICATION (
    id integer not null,
    desiredStartDate date,
    applicant_id integer not null,
    primary key (id),
    unique (applicant_id)
);
...
alter table RELATIONEX_APPLICATION
    add constraint FK8B404CA01EF7E92E
    foreign key (applicant_id)
    references RELATIONEX_APPLICANT;
Currently we are seeing...
The inverse/parent Applicant entity table has no reference to the owning/dependent Application since it is @Transient
The owning/dependent Application entity table has a foreign key reference to the inverse Applicant entity table.
The owning/dependent Application entity is realizing the relation through a foreign key join rather than a primary key join.
Fix the mapping from the owning/dependent Application to the inverse/parent Applicant entity by adding @MapsId to the owning side definition.
@MapsId //foreign key realizes primary key
@OneToOne(//lack of mappedBy identifies this as owning side
optional=false, fetch=FetchType.EAGER)
private Applicant applicant;
Re-build the module and observe the generated database schema so far. Notice the former ID primary key column for the owning/dependent Application entity table was removed and its role taken by the APPLICANT_ID foreign key column because of the @MapsId annotation.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
create table RELATIONEX_APPLICANT (
    id integer generated by default as identity,
    name varchar(32),
    primary key (id)
);
create table RELATIONEX_APPLICATION (
    desiredStartDate date,
    applicant_id integer not null,
    primary key (applicant_id),
    unique (applicant_id)
);
...
alter table RELATIONEX_APPLICATION
    add constraint FK8B404CA01EF7E92E
    foreign key (applicant_id)
    references RELATIONEX_APPLICANT;
Since primary keys cannot be optional, only mandatory relationships can be created through primary key joins. The parent cannot be deleted without first removing the dependent entity. The inverse side must be in place for the primary key to be shared. The owning side of a primary key join cannot be the entity generating the primary key.
Attempt to fix the parent entity by replacing the @Transient specification with a @OneToOne relationship mapping. However, doing it exactly this way causes an error in the database mapping, as we will soon see...
@OneToOne(
// mappedBy="applicant", //identifies property on owning side
fetch=FetchType.LAZY)
// @Transient
private Application application;
Re-build the module and observe the generated database schema. Notice that our "inverse" Applicant entity table has inherited an unwanted database column ("application_applicant_id") and foreign key to the "owning" Application entity table. That circular reference is not a bi-directional relationship -- it is two, independent uni-directional relationships.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
create table RELATIONEX_APPLICANT (
    id integer generated by default as identity,
    name varchar(32),
    application_applicant_id integer,
    primary key (id)
);
create table RELATIONEX_APPLICATION (
    desiredStartDate date,
    applicant_id integer not null,
    primary key (applicant_id),
    unique (applicant_id)
);
...
alter table RELATIONEX_APPLICANT
    add constraint FK8C43FE52AB28790B
    foreign key (application_applicant_id)
    references RELATIONEX_APPLICATION;
alter table RELATIONEX_APPLICATION
    add constraint FK8B404CA01EF7E92E
    foreign key (applicant_id)
    references RELATIONEX_APPLICANT;
Fix the mistaken mapping by making the parent entity the inverse side of the relationship using the property "mappedBy".
@OneToOne(
mappedBy="applicant", //identifies property on owning side
fetch=FetchType.LAZY)
private Application application;
Re-build the module and observe the generated database schema. We now have the database schema we need to implement a one-to-one, bi-directional relationship realized through a common, generated primary key value.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
create table RELATIONEX_APPLICANT (
    id integer generated by default as identity,
    name varchar(32),
    primary key (id)
);
create table RELATIONEX_APPLICATION (
    desiredStartDate date,
    applicant_id integer not null,
    primary key (applicant_id),
    unique (applicant_id)
);
...
alter table RELATIONEX_APPLICATION
    add constraint FK8B404CA01EF7E92E
    foreign key (applicant_id)
    references RELATIONEX_APPLICANT;
We now have...
The parent Applicant entity table has a generated PK
The parent/inverse Applicant entity table has no foreign key reference to the dependent/owning Application entity table.
The dependent/owning Application entity table has a foreign key reference to the parent/inverse Applicant entity table. This is used to form the relationship.
The parent/inverse Applicant with the generated primary key can be inserted at any time.
The dependent/owning Application with the non-null foreign key can only be inserted after the parent/inverse Applicant entity
Add the following test method to your existing one-to-one test case.
@Test
public void testOne2OneBiPKJ() {
log.info("*** testOne2OneBiPKJ() ***");
Applicant applicant = new Applicant();
applicant.setName("Jason Garret");
Application application = new Application(applicant);
application.setDesiredStartDate(new GregorianCalendar(2008, Calendar.JANUARY, 1).getTime());
em.persist(applicant); //provider will generate a PK
em.persist(application); //provider will propagate parent.PK to dependent.FK/PK
}
Build the module, run the new test method, and notice the database output shows a good bit of what we expect from our uni-directional experience.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiPKJ
...
-*** testOne2OneBiPKJ() ***
Hibernate: insert into RELATIONEX_APPLICANT (id, name) values (null, ?)
Hibernate: insert into RELATIONEX_APPLICATION (desiredStartDate, applicant_id) values (?, ?)
Add the following to the test method to verify the actions to the database when the entities are being found from the owning/dependent side of the relationship. This should be similar to our uni-directional case since we are using the entity class with the foreign key in the find(). However, our mapping seems to cause some additional database interaction.
em.flush(); em.clear();
log.info("finding dependent...");
Application application2 = em.find(Application.class, application.getId());
log.info("found dependent...");
assertTrue("unexpected startDate",
application.getDesiredStartDate().equals(
application2.getDesiredStartDate()));
log.info("calling parent...");
assertEquals("unexpected name", application.getApplicant().getName(), application2.getApplicant().getName());
Re-build the module, run the new test method, and notice the database output contains three select statements, including an extra select for the owning side after both the owning and inverse sides have been retrieved during the EAGER fetch.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiPKJ
...
-finding dependent...
Hibernate: select applicatio0_.applicant_id as applicant2_11_0_, applicatio0_.desiredStartDate as desiredS1_11_0_ from RELATIONEX_APPLICATION applicatio0_ where applicatio0_.applicant_id=?
Hibernate: select applicant0_.id as id10_0_, applicant0_.name as name10_0_ from RELATIONEX_APPLICANT applicant0_ where applicant0_.id=?
Hibernate: select applicatio0_.applicant_id as applicant2_11_0_, applicatio0_.desiredStartDate as desiredS1_11_0_ from RELATIONEX_APPLICATION applicatio0_ where applicatio0_.applicant_id=?
-found dependent...
-calling parent...
Add the following to the test method to verify the actions to the database when the entities are being found from the inverse/parent side of the relationship. This is something we could not do in the uni-directional case since only one side of the relationship knew about the other.
em.flush(); em.clear();
log.info("finding parent...");
Applicant applicant2 = em.find(Applicant.class, applicant.getId());
log.info("found parent...");
assertEquals("unexpected name", applicant.getName(), applicant2.getName());
log.info("calling dependent...");
assertTrue("unexpected startDate",
applicant.getApplication().getDesiredStartDate().equals(
applicant2.getApplication().getDesiredStartDate()));
Re-build the module, run the test method, and notice the database output shows the inverse/parent being obtained first by primary key and then the owning/dependent entity being obtained through its foreign key/primary key.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiPKJ
...
-finding parent...
Hibernate: select applicant0_.id as id10_0_, applicant0_.name as name10_0_ from RELATIONEX_APPLICANT applicant0_ where applicant0_.id=?
Hibernate: select applicatio0_.applicant_id as applicant2_11_0_, applicatio0_.desiredStartDate as desiredS1_11_0_ from RELATIONEX_APPLICATION applicatio0_ where applicatio0_.applicant_id=?
-found parent...
-calling dependent...
Hardly scientific, but know that in this mapping case and provider software version, we end up saving one query to the database when accessing our primary-key-joined entities from the inverse/parent side of the bi-directional relationship.
Add the following to the test method to verify delete actions.
em.remove(applicant2.getApplication());
em.remove(applicant2);
em.flush();
assertNull("applicant not deleted", em.find(Applicant.class, applicant2.getId()));
assertNull("application not deleted", em.find(Application.class, applicant2.getApplication().getId()));
Re-build the module and notice the successful deletion.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiPKJ
...
Hibernate: delete from RELATIONEX_APPLICATION where applicant_id=?
Hibernate: delete from RELATIONEX_APPLICANT where id=?
The previous case dealt with a 1:1 relationship where the entities were tightly coupled with one another -- sharing the same primary key. In this case we will look at a 0..1 relationship that must provide the flexibility to be optional as well as re-assigned.
Add the following inverse/parent entity class to your src/main tree. It is currently incomplete and has a common error that will cause mapping issues in a short while.
package myorg.relex.one2one;
import javax.persistence.*;
/**
* This class is an example of the inverse/parent side of a one-to-one,
* bi-directional relationship that allows 0..1 and changing related entities.
*/
@Entity(name="RelationAuto")
@Table(name="RELATIONEX_AUTO")
public class Auto {
public enum Type { CAR, TRUCK };
@Id @GeneratedValue
private int id;
@Enumerated(EnumType.STRING)
@Column(length=10)
private Type type;
@OneToOne(
// mappedBy="auto",
optional=true, fetch=FetchType.LAZY)
private Driver driver;
public Auto() {}
public int getId() { return id;}
public Type getType() { return type; }
public void setType(Type type) {
this.type = type;
}
public Driver getDriver() { return driver; }
public void setDriver(Driver driver) {
this.driver = driver;
}
}
The above entity class was assigned an override for the entity name ('@Entity(name="RelationAuto")') so that it would not conflict with the Auto entity provided in the initial template.
Add the following owning/dependent entity class to the src/main tree.
package myorg.relex.one2one;
import javax.persistence.*;
/**
 * This class provides an example of the owning/dependent side of a one-to-one
 * relationship where the inverse/parent represents a 0..1 or changing relation.
 */
@Entity
@Table(name="RELATIONEX_DRIVER")
public class Driver {
    @Id @GeneratedValue
    private int id;
    @Column(length=20)
    private String name;
    @OneToOne(
        optional=false, //we must have the auto for this driver
        fetch=FetchType.EAGER)
    private Auto auto;

    protected Driver() {}
    public Driver(Auto auto) {
        this.auto = auto;
    }
    public int getId() { return id; }
    public Auto getAuto() { return auto; }
    public void setAuto(Auto auto) { //drivers can switch Autos
        this.auto = auto;
    }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}
Add the two entity classes to the persistence unit
<class>myorg.relex.one2one.Auto</class>
<class>myorg.relex.one2one.Driver</class>
Build the module and observe the generated database schema. We have repeated the error from the previous section where two uni-directional relationships were defined instead of a single bi-directional relationship. We should have no foreign key relationship from the inverse/parent entity table (AUTO) to the owning/dependent entity table (DRIVER).
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
create table RELATIONEX_AUTO (
    id integer generated by default as identity,
    type varchar(10),
    driver_id integer, <!!!!==== this should not be here
    primary key (id)
);
...
create table RELATIONEX_DRIVER (
    id integer generated by default as identity,
    name varchar(20),
    auto_id integer not null,
    primary key (id),
    unique (auto_id)
);
...
alter table RELATIONEX_AUTO <!!!!==== this should not be here
    add constraint FK3558203FB3D04E86
    foreign key (driver_id)
    references RELATIONEX_DRIVER;
...
alter table RELATIONEX_DRIVER
    add constraint FK44C072B81E349026
    foreign key (auto_id)
    references RELATIONEX_AUTO;
Correct the relationship specification in the inverse/parent entity class by adding a "mappedBy" property that references the incoming property from the owning/dependent entity.
@OneToOne(
mappedBy="auto",
optional=true, fetch=FetchType.LAZY)
private Driver driver;
Re-build the module and notice the correct schema produced.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
create table RELATIONEX_AUTO (
    id integer generated by default as identity,
    type varchar(10),
    primary key (id)
);
...
create table RELATIONEX_DRIVER (
    id integer generated by default as identity,
    name varchar(20),
    auto_id integer not null,
    primary key (id),
    unique (auto_id)
);
...
alter table RELATIONEX_DRIVER
    add constraint FK44C072B81E349026
    foreign key (auto_id)
    references RELATIONEX_AUTO;
We now have...
An inverse/parent entity table (AUTO) with no reference to the owning/dependent entity table (DRIVER)
The owning/dependent entity table (DRIVER) has a foreign key separate from its primary key to reference the inverse/parent entity class (AUTO)
The foreign key is constrained to be non-null since the Auto entity was defined to be required by the Driver entity relationship.
If we had switched owning/inverse roles between the two entity classes, then a foreign key to the DRIVER in the AUTO would have been nullable.
Add the following as a new test method in your existing one-to-one test case. It currently has a persistence ordering problem that will cause an error in a following step.
@Test
public void testOne2OneBiOwningOptional() {
log.info("*** testOne2OneBiOwningOptional() ***");
Auto auto = new Auto(); //auto is inverse/parent side
auto.setType(Auto.Type.CAR);
Driver driver = new Driver(auto); //driver is owning/dependent side
driver.setName("Danica Patrick");
auto.setDriver(driver); //application must maintain inverse side
em.persist(driver);
em.persist(auto);
}
The owning/dependent entity is being persisted before the inverse/parent entity. This means the inverse/parent will still be transient, without a primary key value, when the owning/dependent entity is persisted -- thus causing an issue persisting the relationship.
Attempt to build the module, execute the new test method, and observe the error.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiOwningOptional
...
-*** testOne2OneBiOwningOptional() ***
Hibernate: insert into RELATIONEX_DRIVER (id, auto_id, name) values (null, ?, ?)
-SQL Error: 23502, SQLState: 23502
-NULL not allowed for column "AUTO_ID"; SQL statement:
insert into RELATIONEX_DRIVER (id, auto_id, name) values (null, ?, ?) [23502-168]
-tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@25824994
-tearDown() complete, em=org.hibernate.ejb.EntityManagerImpl@25824994
-closing entity manager factory
-HHH000030: Cleaning up connection pool [jdbc:h2:tcp://localhost:9092/./h2db/ejava]
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 3.23 sec <<< FAILURE!
Results :
Tests in error:
  testOne2OneBiOwningOptional(myorg.relex.One2OneTest): org.hibernate.exception.ConstraintViolationException: NULL not allowed for column "AUTO_ID"; SQL statement:(..)
Correct the ordering of the persist() requests so the inverse/parent entity is persisted prior to the owning/dependent entity.
em.persist(auto);
em.persist(driver);
Re-build the module, execute the new test method, and observe the successful persist() of both entities.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiOwningOptional
...
-*** testOne2OneBiOwningOptional() ***
Hibernate: insert into RELATIONEX_AUTO (id, type) values (null, ?)
Hibernate: insert into RELATIONEX_DRIVER (id, auto_id, name) values (null, ?, ?)
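Recall that the test method had to call auto.setDriver(driver) itself -- the provider only inspects the owning side, and the inverse side is the application's responsibility. One way to make that harder to forget is a small helper that links both ends in one call. The sketch below uses plain stand-in classes (no JPA annotations, not the lab's actual entities) just to illustrate the in-memory bookkeeping:

```java
// Sketch: keeping both ends of a bi-directional 1:1 relationship in sync.
// Auto/Driver here are simplified stand-ins for the lab's annotated entities.
public class BiLinkDemo {
    static class Auto { Driver driver; }
    static class Driver { Auto auto; }

    /** Link both the owning (Driver.auto) and inverse (Auto.driver) sides. */
    static void link(Driver driver, Auto auto) {
        driver.auto = auto;   // owning side -- what the provider persists
        auto.driver = driver; // inverse side -- in-memory navigation only
    }

    public static void main(String[] args) {
        Auto auto = new Auto();
        Driver driver = new Driver();
        link(driver, auto);
        System.out.println("owning side set: " + (driver.auto == auto));
        System.out.println("inverse side set: " + (auto.driver == driver));
    }
}
```

With a helper like this, the persist logic in the test method only needs one call to establish a consistent object graph before em.persist() is invoked.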
Add the following to access the pair through the owning/dependent side of the relation.
em.flush(); em.clear();
log.info("finding dependent...");
Driver driver2 = em.find(Driver.class, driver.getId());
log.info("found dependent...");
assertEquals("unexpected name", driver.getName(), driver2.getName());
log.info("calling parent...");
assertEquals("unexpected name", driver.getAuto().getType(), driver2.getAuto().getType());
Re-build the module, re-run the test method, and observe the database activity to obtain the entities from the owning/dependent-side. For some reason, the provider makes two selects to fully resolve both the owning and inverse sides of the relationship using EAGER fetch mode from the inverse side. If you look closely at the where clauses...
the first appears to be attempting to locate the specific Driver we were looking for in the find()
the second appears to be populating the required Auto property of the Driver -- not realizing the first query already resolved the entity and, in this case, the Driver instance we just came from has to be the one for this Auto.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiOwningOptional
...
-finding dependent...
Hibernate: select driver0_.id as id13_1_, driver0_.auto_id as auto3_13_1_, driver0_.name as name13_1_, auto1_.id as id12_0_, auto1_.type as type12_0_
    from RELATIONEX_DRIVER driver0_
    inner join RELATIONEX_AUTO auto1_ on driver0_.auto_id=auto1_.id <!!!!==== Auto is a required relation for Driver
    where driver0_.id=? <!!!!==== looking for Driver we asked for in find()
Hibernate: select driver0_.id as id13_1_, driver0_.auto_id as auto3_13_1_, driver0_.name as name13_1_, auto1_.id as id12_0_, auto1_.type as type12_0_
    from RELATIONEX_DRIVER driver0_
    inner join RELATIONEX_AUTO auto1_ on driver0_.auto_id=auto1_.id
    where driver0_.auto_id=? <!==== looking for Auto associated with Driver
-found dependent...
-calling parent...
The above is not an indication of an error. It is an indication that there is always room to analyze the automatically generated queries and create manual optimizations where necessary and appropriate.
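As an illustration of such a manual optimization, one could define a JPQL fetch join that resolves the Driver and its required Auto in a single select, rather than relying on the provider's two eager-fetch queries. This is a sketch only -- the exact SQL generated, and whether it actually beats the default, depends on the provider and version:

```
select d from Driver d join fetch d.auto where d.id=:id
```

A fetch join instructs the provider to populate the d.auto reference from the same result set instead of issuing a follow-up select.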
Add the additional tests to verify access to the entity pair from the inverse/parent side of the relationship.
em.flush(); em.clear();
log.info("finding parent...");
Auto auto2 = em.find(Auto.class, auto.getId());
log.info("found parent...");
assertEquals("unexpected type", auto.getType(), auto2.getType());
log.info("calling dependent...");
assertEquals("unexpected name", auto.getDriver().getName(), auto2.getDriver().getName());
Re-build the module, re-run the test method, and observe the database output from the additional steps.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiOwningOptional
...
-finding parent...
Hibernate: select auto0_.id as id12_0_, auto0_.type as type12_0_ from RELATIONEX_AUTO auto0_ where auto0_.id=?
Hibernate: select driver0_.id as id13_1_, driver0_.auto_id as auto3_13_1_, driver0_.name as name13_1_, auto1_.id as id12_0_, auto1_.type as type12_0_ from RELATIONEX_DRIVER driver0_ inner join RELATIONEX_AUTO auto1_ on driver0_.auto_id=auto1_.id where driver0_.auto_id=?
-found parent...
-calling dependent...
Add the following to test the 0..1, optional aspects of the Driver relative to the Auto. In this case we are re-assigning the Driver from the original Auto to a new Auto (truck), so the original Auto should come back without a Driver during the next pull from the database.
Auto truck = new Auto();
truck.setType(Auto.Type.TRUCK);
em.persist(truck);
driver = em.find(Driver.class, driver.getId()); //get the managed instance
driver.setAuto(truck);
truck.setDriver(driver);
em.flush(); em.clear();
Auto auto3 = em.find(Auto.class, auto.getId());
Driver driver3 = em.find(Driver.class, driver.getId());
Auto truck3 = em.find(Auto.class, truck.getId());
assertNull("driver not removed from auto", auto3.getDriver());
assertEquals("driver not assigned to truck", truck.getId(), driver3.getAuto().getId());
assertEquals("truck not assigned to driver", driver.getId(), truck3.getDriver().getId());
Re-build the module, re-run the test method, and observe the database output from the additional steps. The Driver has been updated to reference a different Auto object. That means the existing Auto object no longer has a Driver in the database.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiOwningOptional
...
Hibernate: insert into RELATIONEX_AUTO (id, type) values (null, ?)
Hibernate: update RELATIONEX_DRIVER set auto_id=?, name=? where id=?
...
Add the final cleanup and sanity checks for this example.
em.remove(truck3.getDriver());
em.remove(truck3);
em.remove(auto3);
em.flush();
assertNull("driver not deleted", em.find(Driver.class, truck3.getDriver().getId()));
assertNull("auto not deleted", em.find(Auto.class, auto.getId()));
assertNull("truck not deleted", em.find(Auto.class, truck.getId()));
Re-build the module, re-run the test method, and notice the successful delete of the objects using the supplied ordering.
You have finished implementing a one-to-one, bi-directional relationship with 0..1 semantics from the inverse to owning side (i.e., owning side optional and/or changeable). In the next section we will quickly run through an example where we make the owning side mandatory and the inverse side optional.
In the previous case, the owning/dependent side (Driver) held the mandatory reference and the inverse/parent side (Auto) held the optional reference. In this quick example we switch the database mapping so the owning side holds the optional reference. We will use a copy of the entity classes from last time, except with the owning/inverse sides switched.
Copy the Auto entity class to Auto2 and change the relationship ownership from inverse to owning. Also assign the entity to a new database table (AUTO2) and be sure to update all references to the Driver class to Driver2.
@Entity(name="RelationAuto2")
@Table(name="RELATIONEX_AUTO2")
public class Auto2 {
...
@OneToOne(
optional=true, fetch=FetchType.LAZY)
private Driver2 driver;
...
public Driver2 getDriver() { return driver; }
public void setDriver(Driver2 driver) {
this.driver = driver;
}
Copy the Driver entity class to Driver2 and change the relationship ownership from owning to inverse. Also assign the entity to a new database table (DRIVER2) and be sure to update all references to the Auto class to Auto2.
@Entity
@Table(name="RELATIONEX_DRIVER2")
public class Driver2 {
...
@OneToOne(mappedBy="driver",//driver is now the inverse side
optional=false, //we must have the auto for this driver
fetch=FetchType.EAGER)
private Auto2 auto;
protected Driver2() {}
public Driver2(Auto2 auto) {
this.auto = auto;
}
...
public Auto2 getAuto() { return auto; }
public void setAuto(Auto2 auto) { //drivers can switch Autos
this.auto = auto;
}
...
Add the two new entities to the persistence unit.
<class>myorg.relex.one2one.Auto2</class>
<class>myorg.relex.one2one.Driver2</class>
Build the module and observe the generated database schema. I have included both versions below. Notice...
The foreign key has switched from the Driver entity table to the Auto entity table.
The foreign key is nullable when defined in the Auto entity table since the Driver is optional in the relationship.
The foreign key to the Auto is not constrained to be unique. That surprises me since this is a one-to-one relationship and no two Autos should reference the same Driver (if they could, we would have a many-to-one). However, later we will see the provider enforcing the uniqueness in code rather than in the database.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
create table RELATIONEX_AUTO (
    id integer generated by default as identity,
    type varchar(10),
    primary key (id)
);
create table RELATIONEX_AUTO2 (
    id integer generated by default as identity,
    type varchar(10),
    driver_id integer,
    primary key (id)
);
...
create table RELATIONEX_DRIVER (
    id integer generated by default as identity,
    name varchar(20),
    auto_id integer not null,
    primary key (id),
    unique (auto_id)
);
create table RELATIONEX_DRIVER2 (
    id integer generated by default as identity,
    name varchar(20),
    primary key (id)
);
...
alter table RELATIONEX_AUTO2
    add constraint FK75ABE7D3B3D04E86
    foreign key (driver_id)
    references RELATIONEX_DRIVER;
...
alter table RELATIONEX_DRIVER
    add constraint FK44C072B81E349026
    foreign key (auto_id)
    references RELATIONEX_AUTO;
Copy the previous test method and change all Auto class references to Auto2 and all Driver references to Driver2. You might find it easier to copy the test method in blocks starting with the creates.
@Test
public void testOne2OneBiInverseOptional() {
log.info("*** testOne2OneBiInverseOptional() ***");
Auto2 auto = new Auto2(); //auto is owning/dependent side
auto.setType(Auto2.Type.CAR);
Driver2 driver = new Driver2(auto); //driver is inverse/parent side
driver.setName("Danica Patrick");
auto.setDriver(driver); //owning side must be set
em.persist(auto);
em.persist(driver);
em.flush();
}
If you build and run the test method for just the persist() portion you should notice the following results.
Auto is inserted without a reference to the Driver
Driver is inserted with no knowledge of Auto
Auto is updated with foreign key to Driver
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiInverseOptional
...
-*** testOne2OneBiInverseOptional() ***
Hibernate: insert into RELATIONEX_AUTO2 (id, driver_id, type) values (null, ?, ?)
Hibernate: insert into RELATIONEX_DRIVER2 (id, name) values (null, ?)
Hibernate: update RELATIONEX_AUTO2 set driver_id=?, type=? where id=?
If we reversed the persist of driver and auto...
em.persist(driver);
em.persist(auto);
em.flush();
...we could avoid the extra update call since the foreign key value for the Driver would be known at the time the Auto was persisted in this second case.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiInverseOptional
...
-*** testOne2OneBiInverseOptional() ***
Hibernate: insert into RELATIONEX_DRIVER2 (id, name) values (null, ?)
Hibernate: insert into RELATIONEX_AUTO2 (id, driver_id, type) values (null, ?, ?)
Add the following to your test method to obtain the entity pair from the inverse/parent side. Be sure to change the types from Driver to Driver2, as well as the logging and any comments dealing with inverse, parent, and dependent.
em.flush(); em.clear();
log.info("finding parent...");
Driver2 driver2 = em.find(Driver2.class, driver.getId());
log.info("found parent...");
assertEquals("unexpected name", driver.getName(), driver2.getName());
log.info("calling dependent...");
assertEquals("unexpected name", driver.getAuto().getType(), driver2.getAuto().getType());
Re-build the module, re-run the test method, and observe the actions taken with the database when accessing the relationship from the inverse/parent side of the relationship. Notice how the joins have been replaced by multiple selects and we have eliminated the redundancy of database calls using this combination.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiInverseOptional
...
-finding parent...
Hibernate: select driver2x0_.id as id15_0_, driver2x0_.name as name15_0_ from RELATIONEX_DRIVER2 driver2x0_ where driver2x0_.id=?
Hibernate: select auto2x0_.id as id14_0_, auto2x0_.driver_id as driver3_14_0_, auto2x0_.type as type14_0_ from RELATIONEX_AUTO2 auto2x0_ where auto2x0_.driver_id=?
-found parent...
-calling dependent...
Add the following to obtain the pair of entities from the owning side. Be sure to change all Auto references to Auto2 and update comments and log statements referencing parent, inverse, and dependent.
em.flush(); em.clear();
log.info("finding dependent...");
Auto2 auto2 = em.find(Auto2.class, auto.getId());
log.info("found dependent...");
assertEquals("unexpected type", auto.getType(), auto2.getType());
log.info("calling parent...");
assertEquals("unexpected name", auto.getDriver().getName(), auto2.getDriver().getName());
Re-build the module, re-run the test method, and observe the actions taken with the database when accessing the entities from the owning side of the relation. The owning/dependent entity is first located by itself due to the LAZY and optional specification of the Auto. The inverse/parent entity is located and then its reference back to the Auto is independently populated with an extra call.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiInverseOptional
...
-finding dependent...
Hibernate: select auto2x0_.id as id14_0_, auto2x0_.driver_id as driver3_14_0_, auto2x0_.type as type14_0_ from RELATIONEX_AUTO2 auto2x0_ where auto2x0_.id=?
-found dependent...
-calling parent...
Hibernate: select driver2x0_.id as id15_0_, driver2x0_.name as name15_0_ from RELATIONEX_DRIVER2 driver2x0_ where driver2x0_.id=?
Hibernate: select auto2x0_.id as id14_0_, auto2x0_.driver_id as driver3_14_0_, auto2x0_.type as type14_0_ from RELATIONEX_AUTO2 auto2x0_ where auto2x0_.driver_id=?
Add the following to the test method to test changing the Driver from one Auto to another and demonstrating the database interactions that occur now that the relationship is on the Auto side and not the Driver side. Note that with the change in database mapping we must manually clear the relationship from the previous Auto before assigning the Driver to a new Auto. That has not been done below and will cause an error.
Auto2 truck = new Auto2();
truck.setType(Auto2.Type.TRUCK);
em.persist(truck);
driver = em.find(Driver2.class, driver.getId()); //get the managed instance
driver.setAuto(truck);
// auto2.setDriver(null); //must remove reference to former driver
truck.setDriver(driver);//prior to assigning to new driver for 1:1
em.flush(); em.clear();
Auto2 auto3 = em.find(Auto2.class, auto.getId());
Driver2 driver3 = em.find(Driver2.class, driver.getId());
Auto2 truck3 = em.find(Auto2.class, truck.getId());
assertNull("driver not removed from auto", auto3.getDriver());
assertEquals("driver not assigned to truck", truck.getId(), driver3.getAuto().getId());
assertEquals("truck not assigned to driver", driver.getId(), truck3.getDriver().getId());
Remember what happened in the previous case. We created a new Auto instance and then updated the Driver.FK to reference that instance. Since the first Auto was no longer referenced, it had no Driver.
Re-build the module, re-run the test method, and observe the difference in database interactions. We again create a new Auto instance and assign it to the Driver. However, with the new database mapping the assignment is within the new Auto. The first Auto instance is still holding onto its reference at this point, causing the provider to fail. This is the uniqueness constraint I was talking about earlier when we reviewed the database schema.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiInverseOptional
...
Hibernate: insert into RELATIONEX_AUTO2 (id, driver_id, type) values (null, ?, ?) <!!!!==== create new Auto instance
Hibernate: update RELATIONEX_AUTO2 set driver_id=?, type=? where id=? <!!!!==== relate new Auto to existing Driver
...
Tests in error:
  testOne2OneBiInverseOptional(myorg.relex.One2OneTest): org.hibernate.HibernateException: More than one row with the given identifier was found: 1, for class: myorg.relex.one2one.Auto2
Update the test method to clear the Driver reference from the first Auto prior to assigning the Driver to the new Auto. This is required because we have a 1:1 relationship and only a single Auto can reference a given Driver. Previously we did not need to do this because truck.setDriver() was updating the owning side. Now it is updating the inverse side -- which is not of interest to the JPA provider.
auto2.setDriver(null); //must remove reference to former driver
truck.setDriver(driver);//prior to assigning to new driver for 1:1
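The clear-then-assign pattern above can be wrapped in a helper so a re-assignment always leaves at most one Auto pointing at the Driver. The sketch below uses plain stand-in classes (no JPA annotations, names hypothetical -- not the lab's Auto2/Driver2 entities) to show the bookkeeping for this mapping, where the Auto side owns the foreign key:

```java
// Sketch: re-assigning a 1:1 relationship where the Auto side owns the
// foreign key. The old owning-side reference must be cleared first,
// otherwise two Auto rows would claim the same Driver.
public class RelinkDemo {
    static class Auto { Driver driver; }
    static class Driver {
        Auto auto;
        /** Move this driver to a new auto, clearing the former owner first. */
        void moveTo(Auto newAuto) {
            if (this.auto != null) {
                this.auto.driver = null; // clear former owning-side reference
            }
            this.auto = newAuto;         // inverse-side bookkeeping
            newAuto.driver = this;       // new owning-side reference
        }
    }
    public static void main(String[] args) {
        Auto auto = new Auto();
        Driver driver = new Driver();
        driver.moveTo(auto);
        Auto truck = new Auto();
        driver.moveTo(truck); // re-assignment clears the old owner
        System.out.println("old auto cleared: " + (auto.driver == null));
        System.out.println("truck owns driver: " + (truck.driver == driver));
    }
}
```

Centralizing the logic this way keeps the in-memory graph consistent with what the provider will write, regardless of which order the test method touches the entities.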
Re-build the module, re-run the updated test method, and observe the new interactions with the database that allow the modification to complete. The new owning/dependent entity instance (truck) is created, the first owning/dependent entity instance (auto) is cleared of its driver, and then the new owning/dependent entity instance (truck) is updated to reference the driver.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiInverseOptional
...
Hibernate: insert into RELATIONEX_AUTO2 (id, driver_id, type) values (null, ?, ?) <!!!!== new Auto (truck) is created
Hibernate: update RELATIONEX_AUTO2 set driver_id=?, type=? where id=? <!!!!== existing Auto (auto) is cleared of reference to Driver
Hibernate: update RELATIONEX_AUTO2 set driver_id=?, type=? where id=? <!!!!== new Auto (truck) updated to reference Driver
We could eliminate one of the database updates by moving the persist() of the truck to after the driver was set.
Auto2 truck = new Auto2();
truck.setType(Auto2.Type.TRUCK);
driver = em.find(Driver2.class, driver.getId()); //get the managed instance
driver.setAuto(truck);
auto2.setDriver(null); //must remove reference to former driver
truck.setDriver(driver);//prior to assigning to new driver for 1:1
em.persist(truck);
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiInverseOptional
...
Hibernate: insert into RELATIONEX_AUTO2 (id, driver_id, type) values (null, ?, ?)
Hibernate: update RELATIONEX_AUTO2 set driver_id=?, type=? where id=?
Add the following cleanup calls and verification tests to the test method.
em.remove(truck3.getDriver());
em.remove(truck3);
em.remove(auto3);
em.flush();
assertNull("driver not deleted", em.find(Driver2.class, truck3.getDriver().getId()));
assertNull("auto not deleted", em.find(Auto2.class, auto.getId()));
assertNull("truck not deleted", em.find(Auto2.class, truck.getId()));
Re-build the module, re-run the test method, and notice the interaction that occurs with the database. The provider allows the Driver to be deleted first -- but first clears the Auto of the foreign key reference and then moves on to the rest of the deletes.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiInverseOptional
...
Hibernate: update RELATIONEX_AUTO2 set driver_id=?, type=? where id=?
Hibernate: delete from RELATIONEX_DRIVER2 where id=?
Hibernate: delete from RELATIONEX_AUTO2 where id=?
Hibernate: delete from RELATIONEX_AUTO2 where id=?
We can get rid of the extra database update if we rearrange the deletes to remove the owning/dependent entities first.
em.remove(truck3);
em.remove(auto3);
em.remove(truck3.getDriver());
em.flush();
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneBiInverseOptional
...
Hibernate: delete from RELATIONEX_AUTO2 where id=?
Hibernate: delete from RELATIONEX_AUTO2 where id=?
Hibernate: delete from RELATIONEX_DRIVER2 where id=?
You have finished mapping a one-to-one, bi-directional relationship that uses a 0..1 relationship for the inverse/parent side. This caused the foreign key to be moved to the optional (Auto) side of the relationship, where it was allowed to be nullable and had to be kept unique (enforced by the provider rather than the database).
In this section we will go through some automated actions your EntityManager provider can perform for your application behind the scenes. These actions automate what you would otherwise need to do with extra EntityManager calls or additional tracking of entity use. The first case is broad in scope and applies to cascading actions that occur during the other CRUD actions. The second case is confined to the removal of orphaned entities.
In this example we will demonstrate how cascades can be setup and automated from the owning side of a relationship.
Put the following entity class in place in your src/main tree. This class will be the passive/ignorant side of a one-to-one, uni-directional relationship.
package myorg.relex.one2one;
import java.util.Date;
import javax.persistence.*;
/**
* This class provides an example of a recipient of cascade actions.
*/
@Entity
@Table(name="RELATIONEX_LICENSE")
public class License {
@Id @GeneratedValue
private int id;
@Temporal(TemporalType.DATE)
private Date renewal;
public int getId() { return id; }
public Date getRenewal() { return renewal; }
public void setRenewal(Date renewal) {
this.renewal = renewal;
}
}
Put the following entity class in place in your src/main tree. This class will be the owning side of a one-to-one, uni-directional relationship. It is currently incomplete and will need to be updated later.
package myorg.relex.one2one;
import java.util.Date;
import javax.persistence.*;
/**
* This class provides an example initiation of cascade actions to a
* related entity.
*/
@Entity
@Table(name="RELATIONEX_LICAPP")
public class LicenseApplication {
@Id @GeneratedValue
private int id;
@Temporal(TemporalType.TIMESTAMP)
private Date updated;
@OneToOne(optional=false, fetch=FetchType.EAGER,
cascade={
// CascadeType.PERSIST,
// CascadeType.DETACH,
// CascadeType.REMOVE,
// CascadeType.REFRESH,
// CascadeType.MERGE
})
private License license;
public LicenseApplication() {}
public LicenseApplication(License license) {
this.license = license;
}
public int getId() { return id; }
public License getLicense() { return license; }
public Date getUpdated() { return updated; }
public void setUpdated(Date updated) {
this.updated = updated;
}
}
Add the two new entities to the persistence unit.
<class>myorg.relex.one2one.License</class>
<class>myorg.relex.one2one.LicenseApplication</class>
Build the module and verify the database schema created for this exercise is as follows. The specific schema has very little to do with implementing the JPA cascades, but it is of general interest to any JPA mapping exercise.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
create table RELATIONEX_LICAPP (
    id integer generated by default as identity,
    updated timestamp,
    license_id integer not null,
    primary key (id),
    unique (license_id)
);
create table RELATIONEX_LICENSE (
    id integer generated by default as identity,
    renewal date,
    primary key (id)
);
...
alter table RELATIONEX_LICAPP
    add constraint FK51E55C6BE67289CE
    foreign key (license_id)
    references RELATIONEX_LICENSE;
Put the following test method in place in your existing one-to-one test case.
@Test
public void testOne2OneCascadeFromOwner() {
log.info("*** testOne2OneCascadeFromOwner ***");
License license = new License();
license.setRenewal(new GregorianCalendar(2012,1,1).getTime());
LicenseApplication licapp = new LicenseApplication(license);
licapp.setUpdated(new Date());
em.persist(licapp);
em.flush();
}
Re-build the module, attempt to run the new unit test, and observe the reported error.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromOwner
...
-*** testOne2OneCascadeFromOwner ***
Hibernate: insert into RELATIONEX_LICAPP (id, license_id, updated) values (null, ?, ?)
-SQL Error: 23502, SQLState: 23502
-NULL not allowed for column "LICENSE_ID"; SQL statement:
insert into RELATIONEX_LICAPP (id, license_id, updated) values (null, ?, ?) [23502-168]
The problem is that the test method only persisted the licapp and not the license that it references. We could fix this by adding a call to em.persist(license) prior to calling em.persist(licapp), but let's solve this with cascades instead.
It is not always appropriate for the dependent entity to create the missing parent, but in this case we are going to rationalize that it is appropriate -- especially when the instance is already there and all we want is to automate the persistence of the overall (small) object tree. Update the relationship in the licapp to cascade persist calls to the related License entity.
@Entity
@Table(name="RELATIONEX_LICAPP")
public class LicenseApplication {
...
@OneToOne(optional=false, fetch=FetchType.EAGER,
cascade={
CascadeType.PERSIST
})
private License license;
Re-build the module, re-run the test method and observe that the error has gone away and the license is now being automatically persisted with the licapp.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromOwner
...
-*** testOne2OneCascadeFromOwner ***
Hibernate: insert into RELATIONEX_LICENSE (id, renewal) values (null, ?)
Hibernate: insert into RELATIONEX_LICAPP (id, license_id, updated) values (null, ?, ?)
Put the following test code in place to demonstrate behavior of cascading a call to detach(). This section of the test looks to detach the existing entities and then instantiate new instances within the local cache.
assertTrue("licapp was not managed???", em.contains(licapp));
assertTrue("license was not managed???", em.contains(license));
em.detach(licapp);
assertFalse("licapp still managed", em.contains(licapp));
assertFalse("license still managed", em.contains(license));
licapp = em.find(LicenseApplication.class, licapp.getId());
license = licapp.getLicense();
Re-build the module, attempt to re-run the unit test, and observe the following error when asserting the detached state.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromOwner
...
$ more `find . -name *.txt | grep reports`
-------------------------------------------------------------------------------
Test set: myorg.relex.One2OneTest
-------------------------------------------------------------------------------
Tests run: 1, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 7.046 sec <<< FAILURE!
testOne2OneCascadeFromOwner(myorg.relex.One2OneTest)  Time elapsed: 0.724 sec  <<< FAILURE!
java.lang.AssertionError: license still managed
    at org.junit.Assert.fail(Assert.java:93)
    at org.junit.Assert.assertTrue(Assert.java:43)
    at org.junit.Assert.assertFalse(Assert.java:68)
    at myorg.relex.One2OneTest.testOne2OneCascadeFromOwner(One2OneTest.java:529)
Fix the immediate error by enabling cascade=DETACH from the licapp to the license entity.
@Entity
@Table(name="RELATIONEX_LICAPP")
public class LicenseApplication {
...
@OneToOne(optional=false, fetch=FetchType.EAGER,
cascade={
CascadeType.PERSIST,
CascadeType.DETACH
})
private License license;
Re-build the module, re-run the test method, and observe how the change allowed the detach() to propagate down to the license entity.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromOwner
...
[INFO] BUILD SUCCESS
Add the following to your test method in order to demonstrate how to cascade refresh() to relationships. The strategy derives a new renewal and modified date/timestamp for the license and licapp entities, updates the database directly through a bulk query, and then synchronizes the entity state with the database using refresh().
Bulk query changes bypass the entity cache and render any cached instances for updated database rows out of sync.
Date newDate = new GregorianCalendar(2014, 1, 1).getTime();
Date newUpdate = new Date(licapp.getUpdated().getTime()+1);
assertEquals("unexpected update count", 1,
em.createQuery("update License lic set lic.renewal=:renewal where lic.id=:id")
.setParameter("renewal", newDate, TemporalType.DATE)
.setParameter("id", license.getId())
.executeUpdate());
assertEquals("unexpected update count", 1,
em.createQuery("update LicenseApplication licapp set licapp.updated=:updated where licapp.id=:id")
.setParameter("updated", newUpdate, TemporalType.TIMESTAMP)
.setParameter("id", licapp.getId())
.executeUpdate());
assertFalse("unexpected updated value prior to refresh",
licapp.getUpdated().getTime() == newUpdate.getTime());
assertFalse("unexpected renewal value prior to refresh",
license.getRenewal().getTime() == newDate.getTime());
log.info("database updated");
em.refresh(licapp);
log.info("entities refreshed");
DateFormat df = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSSZ");
assertTrue(String.format("licapp not refreshed, exp=%s, act=%s", df.format(newUpdate), df.format(licapp.getUpdated())),
licapp.getUpdated().getTime() == newUpdate.getTime());
assertTrue(String.format("license not refreshed, exp=%s, act=%s", df.format(newDate), df.format(license.getRenewal())),
license.getRenewal().getTime() == newDate.getTime());
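Note that the assertions above compare getTime() values rather than calling equals() directly. This is deliberate: after a refresh, a temporal field may be materialized by the provider as a java.sql.Timestamp, whose equals() is asymmetric with java.util.Date. A minimal, self-contained illustration (not part of the lab code):

```java
import java.sql.Timestamp;
import java.util.Date;

public class DateCompareDemo {
    public static void main(String[] args) {
        Date date = new Date(1000L);         // plain java.util.Date
        Timestamp ts = new Timestamp(1000L); // the same instant as a Timestamp

        // Date.equals() compares only milliseconds, so this is true
        System.out.println(date.equals(ts));                 // true
        // Timestamp.equals(Object) returns false for a non-Timestamp argument
        System.out.println(ts.equals(date));                 // false
        // comparing epoch milliseconds side-steps the asymmetry
        System.out.println(ts.getTime() == date.getTime());  // true
    }
}
```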
Re-build the module and attempt to run the additional tests. This will fail because we have only synchronized the licapp with the database state.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromOwner
...
 -database updated
Hibernate: select licenseapp0_.id as id17_0_, licenseapp0_.license_id as license3_17_0_, licenseapp0_.updated as updated17_0_ from RELATIONEX_LICAPP licenseapp0_ where licenseapp0_.id=?
 -entities refreshed
...
Failed tests:
  testOne2OneCascadeFromOwner(myorg.relex.One2OneTest): license not refreshed, exp=2014-02-01 00:00:00.000-0500, act=2012-02-01 00:00:00.000-0500
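Note the expected value in the failure message above: the test constructed the date with new GregorianCalendar(2014, 1, 1), which is February 1st, not January 1st, because Calendar months are zero-based. A quick stand-alone check of that behavior:

```java
import java.util.Calendar;
import java.util.GregorianCalendar;

public class CalendarMonthDemo {
    public static void main(String[] args) {
        // Calendar months are zero-based: month index 1 is February
        Calendar cal = new GregorianCalendar(2014, 1, 1);
        System.out.println(cal.get(Calendar.MONTH) == Calendar.FEBRUARY); // true
        System.out.println(cal.get(Calendar.DAY_OF_MONTH));               // 1
    }
}
```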
Correct the issue by adding cascade=REFRESH to the relationship in the licapp entity.
@Entity
@Table(name="RELATIONEX_LICAPP")
public class LicenseApplication {
...
@OneToOne(optional=false, fetch=FetchType.EAGER,
cascade={
CascadeType.PERSIST,
CascadeType.DETACH,
CascadeType.REFRESH
})
private License license;
Re-build the module, re-run the test method, and observe the successful results for the refresh() being cascaded to the license. Notice how both entities are re-fetched from the database.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromOwner
...
 -database updated
Hibernate: select license0_.id as id16_0_, license0_.renewal as renewal16_0_ from RELATIONEX_LICENSE license0_ where license0_.id=?
Hibernate: select licenseapp0_.id as id17_1_, licenseapp0_.license_id as license3_17_1_, licenseapp0_.updated as updated17_1_, license1_.id as id16_0_, license1_.renewal as renewal16_0_ from RELATIONEX_LICAPP licenseapp0_ inner join RELATIONEX_LICENSE license1_ on licenseapp0_.license_id=license1_.id where licenseapp0_.id=?
 -entities refreshed
...
[INFO] BUILD SUCCESS
Add the following to your test method to demonstrate the ability to cascade merge() calls through relationships. The code updates two detached entities, merges them using the entity manager, and then checks the state of the resultant managed entities.
em.detach(licapp);
newDate = new GregorianCalendar(2016, 1, 1).getTime();
newUpdate = new Date(licapp.getUpdated().getTime()+1);
assertFalse("licapp still managed", em.contains(licapp));
assertFalse("license still managed", em.contains(licapp.getLicense()));
licapp.setUpdated(newUpdate);
licapp.getLicense().setRenewal(newDate);
log.info("merging changes to detached entities");
licapp=em.merge(licapp);
em.flush();
log.info("merging complete");
assertTrue("merged licapp not managed", em.contains(licapp));
assertTrue("merged licapp.license not managed", em.contains(licapp.getLicense()));
assertTrue(String.format("licapp not merged, exp=%s, act=%s", df.format(newUpdate), df.format(licapp.getUpdated())),
licapp.getUpdated().getTime() == newUpdate.getTime());
assertTrue(String.format("license not merged, exp=%s, act=%s", df.format(newDate), df.format(license.getRenewal())),
licapp.getLicense().getRenewal().getTime() == newDate.getTime());
Re-build the module, re-run the test method, and observe the error that occurs. The problem is that only changes to the licapp are considered during the merge; any changes to the license are ignored.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromOwner
...
 -merging changes to detached entities
Hibernate: select licenseapp0_.id as id17_0_, licenseapp0_.license_id as license3_17_0_, licenseapp0_.updated as updated17_0_ from RELATIONEX_LICAPP licenseapp0_ where licenseapp0_.id=?
Hibernate: select license0_.id as id16_0_, license0_.renewal as renewal16_0_ from RELATIONEX_LICENSE license0_ where license0_.id=?
Hibernate: update RELATIONEX_LICAPP set license_id=?, updated=? where id=?
 -merging complete
...
Failed tests:
  testOne2OneCascadeFromOwner(myorg.relex.One2OneTest): license not merged, exp=2016-02-01 00:00:00.000-0500, act=2016-02-01 00:00:00.000-0500
Attempt to fix the issue by adding cascade=MERGE to the relationship defined in licapp.
@Entity
@Table(name="RELATIONEX_LICAPP")
public class LicenseApplication {
...
@OneToOne(optional=false, fetch=FetchType.EAGER,
cascade={
CascadeType.PERSIST,
CascadeType.DETACH,
CascadeType.REFRESH,
CascadeType.MERGE
})
Re-build the module, re-run the test method, and observe how the test now passes. Updates from licapp and license are issued to the database.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromOwner
...
 -merging changes to detached entities
Hibernate: select licenseapp0_.id as id17_1_, licenseapp0_.license_id as license3_17_1_, licenseapp0_.updated as updated17_1_, license1_.id as id16_0_, license1_.renewal as renewal16_0_ from RELATIONEX_LICAPP licenseapp0_ inner join RELATIONEX_LICENSE license1_ on licenseapp0_.license_id=license1_.id where licenseapp0_.id=?
Hibernate: update RELATIONEX_LICENSE set renewal=? where id=?
Hibernate: update RELATIONEX_LICAPP set license_id=?, updated=? where id=?
 -merging complete
Add the following to your test method to demonstrate cascades for the remove() method. In this code we attempt to remove just the licapp but expect both the licapp and related license to be deleted from the database when complete.
em.remove(licapp);
em.flush();
assertNull("licapp not deleted", em.find(LicenseApplication.class, licapp.getId()));
assertNull("licapp.license not deleted", em.find(License.class, licapp.getLicense().getId()));
Re-build the module, attempt to run the updated test method, and observe the following error. The trouble is that only the licapp is deleted from the database; the license is ignored.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromOwner
...
Hibernate: delete from RELATIONEX_LICAPP where id=?
...
Failed tests:
  testOne2OneCascadeFromOwner(myorg.relex.One2OneTest): licapp.license not deleted
Attempt to fix the problem by adding cascade=REMOVE to the relationship defined within licapp.
@Entity
@Table(name="RELATIONEX_LICAPP")
public class LicenseApplication {
...
@OneToOne(optional=false, fetch=FetchType.EAGER,
cascade={
CascadeType.PERSIST,
CascadeType.DETACH,
CascadeType.REMOVE,
CascadeType.REFRESH,
CascadeType.MERGE
})
private License license;
Re-build the module, re-run the test method, and observe how the tests now pass. Both the licapp and license are being deleted during the call to remove().
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromOwner
...
Hibernate: delete from RELATIONEX_LICAPP where id=?
Hibernate: delete from RELATIONEX_LICENSE where id=?
...
[INFO] BUILD SUCCESS
You have finished implementing cascade functionality from the owning side of a one-to-one relationship. As you can see, the cascade capability can save extra calls to the EntityManager when a large object graph must be persisted or otherwise acted upon. However, as you can also guess, not every call should be cascaded at all times. Use this capability wisely, as there are many situations where a cascade can lead to severe issues or at least poorer performance than desired.
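As a point of reference, when a relationship truly warrants propagating every lifecycle operation, the enumerated list built up above can be collapsed to CascadeType.ALL. The following is a sketch for comparison only; the lab intentionally adds each cascade type one at a time so its effect can be observed.

```java
// equivalent to listing PERSIST, MERGE, REMOVE, REFRESH, and DETACH individually
@OneToOne(optional=false, fetch=FetchType.EAGER, cascade=CascadeType.ALL)
private License license;
```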
In this example we will demonstrate how cascades can be setup and automated from the inverse side of a relationship.
Add the following inverse/parent entity to your src/main source tree. It is currently incomplete and will cause errors later when we attempt to cascade entity manager commands from the inverse to owning side.
package myorg.relex.one2one;
import java.util.Date;
import javax.persistence.*;
/**
* This entity class provides an example of cascades being originated from
* the inverse side of a bi-directional relationship.
*/
@Entity
@Table(name="RELATIONEX_TICKET")
public class Ticket {
@Id @GeneratedValue
private int id;
@OneToOne(mappedBy="ticket", fetch=FetchType.EAGER,
cascade={
// CascadeType.PERSIST,
// CascadeType.DETACH,
// CascadeType.REFRESH,
// CascadeType.MERGE,
// CascadeType.REMOVE
})
private Passenger passenger;
@Temporal(TemporalType.DATE)
Date date;
public Ticket(){}
public Ticket(int id) { this.id = id; }
public int getId() { return id; }
public Passenger getPassenger() { return passenger; }
public void setPassenger(Passenger passenger) {
this.passenger = passenger;
}
public Date getDate() { return date; }
public void setDate(Date date) {
this.date = date;
}
}
Add the following owning/dependent entity class to your src/main tree. Although this class will own the foreign key and JPA relation, it is designed to not initiate any cascade calls from the owning side to the inverse side.
package myorg.relex.one2one;
import javax.persistence.*;
/**
* This entity class provides an example of the owning side of a
* bi-directional relation where all cascades are being initiated
* from the inverse side (i.e., not from here).
*/
@Entity
@Table(name="RELATIONEX_PASSENGER")
public class Passenger {
@Id @GeneratedValue
private int id;
@OneToOne(optional=false)
private Ticket ticket;
@Column(length=32, nullable=false)
private String name;
protected Passenger() {}
public Passenger(int id) { this.id = id; }
public Passenger(Ticket ticket, String name) {
this.ticket = ticket;
this.name = name;
}
public int getId() { return id; }
public Ticket getTicket() { return ticket; }
public void setTicket(Ticket ticket) {
this.ticket = ticket;
}
public String getName() { return name;}
public void setName(String name) {
this.name = name;
}
}
Add the two entity classes to the persistence unit.
<class>myorg.relex.one2one.Ticket</class>
<class>myorg.relex.one2one.Passenger</class>
Add the following test method to your existing one-to-one JUnit test case. At this point in time we are testing the ability to cascade the PERSIST call from the inverse/parent side of the relation to the owning/dependent side.
@Test
public void testOne2OneCascadeFromInverse() {
log.info("*** testOne2OneCascadeFromInverse ***");
Ticket ticket = new Ticket();
ticket.setDate(new GregorianCalendar(2013, Calendar.MARCH, 16).getTime());
Passenger passenger = new Passenger(ticket, "Fred"); //set inverse side
ticket.setPassenger(passenger); //set the owning side
em.persist(ticket); //persist from inverse side
em.flush();
assertTrue("ticket not managed", em.contains(ticket));
assertTrue("passenger not managed", em.contains(passenger));
}
Build the module and attempt to run the new unit test. The test will fail at this point because the inverse side has not been configured to cascade the PERSIST call to the owning side.
$ mvn clean test -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromInverse
...
 -*** testOne2OneCascadeFromInverse ***
Hibernate: insert into RELATIONEX_TICKET (id, date) values (null, ?)
...
testOne2OneCascadeFromInverse(myorg.relex.One2OneTest): org.hibernate.TransientObjectException: object references an unsaved transient instance - save the transient instance before flushing: myorg.relex.one2one.Ticket.passenger -> myorg.relex.one2one.Passenger
Fix the inverse/parent side of the relationship by adding a cascade of PERSIST.
@OneToOne(mappedBy="ticket", fetch=FetchType.EAGER,
cascade={
CascadeType.PERSIST,
// CascadeType.DETACH,
// CascadeType.REFRESH,
// CascadeType.MERGE,
// CascadeType.REMOVE
})
private Passenger passenger;
Rebuild the module and re-run the unit test to verify the persist is now being cascaded from the inverse to owning side of the relationship.
$ mvn clean test -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromInverse
...
 -*** testOne2OneCascadeFromInverse ***
Hibernate: insert into RELATIONEX_TICKET (id, date) values (null, ?)
Hibernate: insert into RELATIONEX_PASSENGER (id, name, ticket_id) values (null, ?, ?)
...
[INFO] BUILD SUCCESS
Add the following code snippet to the test method to verify the capability to cascade a DETACH from the inverse to owning side of a relationship.
log.debug("detach both instances from the persistence context");
em.detach(ticket);
assertFalse("ticket managed", em.contains(ticket));
assertFalse("passenger managed", em.contains(passenger));
Rebuild and attempt to re-run the test method. The test method will fail because the inverse side is not configured to cascade the DETACH.
$ mvn clean test -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromInverse
...
 -detach both instances from the persistence context
...
Failed tests:
  testOne2OneCascadeFromInverse(myorg.relex.One2OneTest): passenger managed
Fix the inverse side by adding a cascade of DETACH to the relationship.
@OneToOne(mappedBy="ticket", fetch=FetchType.EAGER,
cascade={
CascadeType.PERSIST,
CascadeType.DETACH,
// CascadeType.REFRESH,
// CascadeType.MERGE,
// CascadeType.REMOVE
})
private Passenger passenger;
Rebuild and re-run the test method with the correction in place. This should now pass.
$ mvn clean test -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromInverse
...
 -detach both instances from the persistence context
...
[INFO] BUILD SUCCESS
Add the following code snippet to your test method to demonstrate the capability to cascade a REFRESH.
log.debug("perform a bulk update to both objects");
ticket = em.find(Ticket.class, ticket.getId());
Date newDate=new GregorianCalendar(2013, Calendar.APRIL, 1).getTime();
String newName = "Frederick";
em.createQuery("update Ticket t set t.date=:date")
.setParameter("date", newDate,TemporalType.DATE)
.executeUpdate();
em.createQuery("update Passenger p set p.name=:name where p.name='Fred'")
.setParameter("name", newName)
.executeUpdate();
assertFalse("unexpected date", newDate.equals(ticket.getDate()));
assertFalse("unexpected name", newName.equals(ticket.getPassenger().getName()));
em.refresh(ticket);
assertTrue("date not refreshed", newDate.equals(ticket.getDate()));
assertTrue("name not refreshed", newName.equals(ticket.getPassenger().getName()));
Rebuild the module and attempt to re-run the test method. It should fail at this point because your inverse side is not configured to cascade the REFRESH. Notice how only the state for the ticket was retrieved from the database during the refresh() call.
$ mvn clean test -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromInverse
...
 -perform a bulk update to both objects
Hibernate: select ticket0_.id as id18_1_, ticket0_.date as date18_1_, passenger1_.id as id19_0_, passenger1_.name as name19_0_, passenger1_.ticket_id as ticket3_19_0_ from RELATIONEX_TICKET ticket0_ left outer join RELATIONEX_PASSENGER passenger1_ on ticket0_.id=passenger1_.ticket_id where ticket0_.id=?
Hibernate: update RELATIONEX_TICKET set date=?
Hibernate: update RELATIONEX_PASSENGER set name=? where name='Fred'
Hibernate: select ticket0_.id as id18_0_, ticket0_.date as date18_0_ from RELATIONEX_TICKET ticket0_ where ticket0_.id=?
...
Failed tests:
  testOne2OneCascadeFromInverse(myorg.relex.One2OneTest): name not refreshed
Fix the issue by configuring the inverse side to cascade the REFRESH so that both entities will be updated with the state of the database when the inverse side is refreshed.
@OneToOne(mappedBy="ticket", fetch=FetchType.EAGER,
cascade={
CascadeType.PERSIST,
CascadeType.DETACH,
CascadeType.REFRESH,
// CascadeType.MERGE,
// CascadeType.REMOVE
})
private Passenger passenger;
Rebuild the module and re-run the test method. It should pass at this point. Notice how both the state for the inverse and owning side is retrieved from the database during the refresh of the inverse side.
$ mvn clean test -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromInverse
...
 -perform a bulk update to both objects
Hibernate: select ticket0_.id as id18_1_, ticket0_.date as date18_1_, passenger1_.id as id19_0_, passenger1_.name as name19_0_, passenger1_.ticket_id as ticket3_19_0_ from RELATIONEX_TICKET ticket0_ left outer join RELATIONEX_PASSENGER passenger1_ on ticket0_.id=passenger1_.ticket_id where ticket0_.id=?
Hibernate: update RELATIONEX_TICKET set date=?
Hibernate: update RELATIONEX_PASSENGER set name=? where name='Fred'
Hibernate: select passenger0_.id as id19_0_, passenger0_.name as name19_0_, passenger0_.ticket_id as ticket3_19_0_ from RELATIONEX_PASSENGER passenger0_ where passenger0_.id=?
Hibernate: select ticket0_.id as id18_1_, ticket0_.date as date18_1_, passenger1_.id as id19_0_, passenger1_.name as name19_0_, passenger1_.ticket_id as ticket3_19_0_ from RELATIONEX_TICKET ticket0_ left outer join RELATIONEX_PASSENGER passenger1_ on ticket0_.id=passenger1_.ticket_id where ticket0_.id=?
Hibernate: select passenger0_.id as id19_1_, passenger0_.name as name19_1_, passenger0_.ticket_id as ticket3_19_1_, ticket1_.id as id18_0_, ticket1_.date as date18_0_ from RELATIONEX_PASSENGER passenger0_ inner join RELATIONEX_TICKET ticket1_ on passenger0_.ticket_id=ticket1_.id where passenger0_.ticket_id=?
...
[INFO] BUILD SUCCESS
Add the following code snippet to your test method to demonstrate the ability to cascade a MERGE from the inverse side.
log.debug("merging changes from inverse side");
Ticket ticket2 = new Ticket(ticket.getId());
ticket2.setDate(new GregorianCalendar(2014, Calendar.APRIL, 1).getTime());
Passenger passenger2 = new Passenger(passenger.getId());
passenger2.setName("Rick");
ticket2.setPassenger(passenger2);
passenger2.setTicket(ticket2);
assertFalse("unexpected date", ticket2.getDate().equals(ticket.getDate()));
assertFalse("unexpected name", ticket2.getPassenger().getName().equals(ticket.getPassenger().getName()));
ticket=em.merge(ticket2);
em.flush();
assertTrue("date not merged", ticket2.getDate().equals(ticket.getDate()));
assertTrue("name not merged", ticket2.getPassenger().getName().equals(ticket.getPassenger().getName()));
Rebuild the module and attempt to run the updated test method. This will fail because the inverse side is not yet configured to cascade the MERGE. Notice how only the inverse side is being updated and not the owning entity.
$ mvn clean test -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromInverse
...
 -merging changes from inverse side
Hibernate: select passenger0_.id as id19_1_, passenger0_.name as name19_1_, passenger0_.ticket_id as ticket3_19_1_, ticket1_.id as id18_0_, ticket1_.date as date18_0_ from RELATIONEX_PASSENGER passenger0_ inner join RELATIONEX_TICKET ticket1_ on passenger0_.ticket_id=ticket1_.id where passenger0_.ticket_id=?
 -tearDown() started, em=org.hibernate.ejb.EntityManagerImpl@5dfadfd6
Hibernate: update RELATIONEX_TICKET set date=? where id=?
...
Failed tests:
  testOne2OneCascadeFromInverse(myorg.relex.One2OneTest): name not merged
Fix the issue by configuring the inverse side to cascade the MERGE.
@OneToOne(mappedBy="ticket", fetch=FetchType.EAGER,
cascade={
CascadeType.PERSIST,
CascadeType.DETACH,
CascadeType.REFRESH,
CascadeType.MERGE,
// CascadeType.REMOVE
})
private Passenger passenger;
Rebuild the module and re-run the test method. The test method should now pass because the MERGE is being cascaded from the inverse to the owning side of the relation. Notice how both sides are being updated.
$ mvn clean test -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromInverse
...
Hibernate: update RELATIONEX_PASSENGER set name=?, ticket_id=? where id=?
Hibernate: update RELATIONEX_TICKET set date=? where id=?
...
[INFO] BUILD SUCCESS
Add the following code snippet to your test method to demonstrate cascading the remove() from the inverse side.
log.debug("delete the entities from the inverse side");
assertNotNull("ticket not found", em.find(Ticket.class, ticket.getId()));
assertNotNull("passenger not found", em.find(Passenger.class, ticket.getPassenger().getId()));
em.remove(ticket);
em.flush();
assertNull("ticket not removed", em.find(Ticket.class, ticket.getId()));
assertNull("passenger not removed", em.find(Passenger.class, ticket.getPassenger().getId()));
Rebuild the module and attempt to run the updated test method. This will fail because the inverse side is not configured to cascade the REMOVE. Notice how only the inverse side is deleted during the call to remove(), triggering a referential integrity violation from the foreign key still held by the passenger table.
$ mvn clean test -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromInverse
...
 -delete the entities from the inverse side
Hibernate: delete from RELATIONEX_TICKET where id=?
 -SQL Error: 23503, SQLState: 23503
 -Referential integrity constraint violation: "FKD08633EA1BCEB406: PUBLIC.RELATIONEX_PASSENGER FOREIGN KEY(TICKET_ID) REFERENCES PUBLIC.RELATIONEX_TICKET(ID) (1)"; SQL statement:
delete from RELATIONEX_TICKET where id=? [23503-168]
Fix the issue by configuring the inverse side to cascade the REMOVE.
@OneToOne(mappedBy="ticket", fetch=FetchType.EAGER,
cascade={
CascadeType.PERSIST,
CascadeType.DETACH,
CascadeType.REFRESH,
CascadeType.MERGE,
CascadeType.REMOVE
})
private Passenger passenger;
Rebuild the module and re-run the test method. The test method should pass this time because the REMOVE is now configured to be cascaded from the inverse to the owning side. Notice how both sides of the relationship are deleted from the database, starting with the owning/dependent side to prevent a foreign key constraint violation.
$ mvn clean test -Dtest=myorg.relex.One2OneTest#testOne2OneCascadeFromInverse
...
 -delete the entities from the inverse side
Hibernate: delete from RELATIONEX_PASSENGER where id=?
Hibernate: delete from RELATIONEX_TICKET where id=?
...
[INFO] BUILD SUCCESS
You have now finished demonstrating how entity manager actions can be automatically cascaded from the inverse/parent side of a relationship to the owning/dependent side of a bi-directional relationship. This is not always an appropriate or desired behavior but it is notable that cascade direction and relationship ownership are independent of one another. You can configure cascades to be originated from the owning or inverse side of a relationship.
Orphan removal can be handy when the target of a OneToXxx relationship becomes unreferenced by its referencing entity and the target entity exists only to support that referencing entity. This is different from cascade=REMOVE: here the referencing entity is not being deleted; it is simply de-referencing its parent. The basic rules are as follows.
Referencing entity removes (nulls in the case of OneToOne) its relationship to the target entity
The provider then applies em.remove() to the orphaned target entity
The orphaned target entity must not be reassigned to a new referencing entity. It exists for the sole use of the original referencing entity.
It is not necessary to declare cascade=REMOVE for this relationship
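The distinction between the two mechanisms can be summarized in annotation form. The following sketch is for comparison only, using the LicenseApplication-to-License relation from earlier and the orphan-removal relation built in the steps below:

```java
// cascade=REMOVE: the target row is deleted only when the referencing
// entity itself is passed to em.remove()
@OneToOne(cascade=CascadeType.REMOVE)
private License license;

// orphanRemoval=true: the target row is also deleted when the reference
// is simply nulled out by the still-existing referencing entity
@OneToOne(orphanRemoval=true)
private Residence residence;
```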
Put the following entity class in your src/main tree. This entity class represents a parent entity that exists solely for the use of a dependent entity that will be related to it in a follow-on step.
package myorg.relex.one2one;
import javax.persistence.*;
/**
* This entity class provides an example of an entity that
* will get deleted when no longer referenced by its dependent
* entity in a one-to-one relation. This is called orphan removal.
*/
@Entity
@Table(name="RELATIONEX_RESIDENCE")
public class Residence {
@Id @GeneratedValue
private int id;
@Column(length=16, nullable=false)
private String city;
@Column(length=2, nullable=false)
private String state;
protected Residence() {}
public Residence(int id) { this.id = id; }
public Residence(String city, String state) {
this.city = city;
this.state = state;
}
public int getId() { return id; }
public String getCity() { return city; }
public void setCity(String city) {
this.city = city;
}
public String getState() { return state;}
public void setState(String state) {
this.state = state;
}
}
Put the following entity class in your src/main tree. This entity provides an example of a dependent entity with a parent that only exists to support this instance. If this instance ceases to reference the parent -- the parent will be removed by the provider.
package myorg.relex.one2one;
import javax.persistence.*;
/**
* This entity class provides an example of a dependent with a relationship to a parent entity that
* should only exist to support this entity. When this entity ceases to reference the parent, it
* will become "orphaned" and subject to orphanRemoval by the provider.
*/
@Entity
@Table(name="RELATIONEX_ATTENDEE")
public class Attendee {
@Id @GeneratedValue
private int id;
//orphanRemoval will take care of dereference and DELETE from dependent Attendee
@OneToOne(cascade=CascadeType.PERSIST, orphanRemoval=true)
private Residence residence;
private String name;
public int getId() { return id; }
public Residence getResidence() { return residence; }
public void setResidence(Residence residence) {
this.residence = residence;
}
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
}
Add the two entity classes to the persistence unit using the persistence.xml.
<class>myorg.relex.one2one.Residence</class>
<class>myorg.relex.one2one.Attendee</class>
Add the following test method to your existing one-to-one JUnit test case. At this point in time, the test just creates an instance of the two related entities. Consistent with the concept that the parent is only there to support the dependent, we maintain no reference to the parent except through the dependent entity.
@Test
public void testOrphanRemoval() {
log.info("*** testOrphanRemoval ***");
log.debug("start by verifying the state of the database");
int startCount = em.createQuery("select count(r) from Residence r", Number.class)
.getSingleResult().intValue();
log.debug("create a new attendee and residence");
Attendee attendee = new Attendee();
attendee.setName("jones");
attendee.setResidence(new Residence("Columbia", "MD"));
em.persist(attendee);
em.flush();
log.debug("verify we have a new residence in the database");
assertEquals("unexpected number of residences", startCount+1,
em.createQuery("select count(r) from Residence r", Number.class)
.getSingleResult().intValue());
log.debug("verify we can find our new instance");
int originalId=attendee.getResidence().getId();
assertNotNull("could not find residence", em.find(Residence.class, originalId));
}
Build the module and run the new test method. Notice how the two entities are inserted into the database and the foreign key column for the dependent entity table is set to reference the parent.
$ mvn clean test -Dtest=myorg.relex.One2OneTest#testOrphanRemoval
...
 -start by verifying the state of the database
Hibernate: select count(residence0_.id) as col_0_0_ from RELATIONEX_RESIDENCE residence0_ limit ?
 -create a new attendee and residence
Hibernate: insert into RELATIONEX_RESIDENCE (id, city, state) values (null, ?, ?)
Hibernate: insert into RELATIONEX_ATTENDEE (id, name, residence_id) values (null, ?, ?)
 -verify we have a new residence in the database
Hibernate: select count(residence0_.id) as col_0_0_ from RELATIONEX_RESIDENCE residence0_ limit ?
 -verify we can find our new instance
Add the following to your test method to orphan your original parent.
log.debug("have attendee change residence");
//ISSUE: https://hibernate.atlassian.net/browse/HHH-6484
//MORE: https://hibernate.atlassian.net/browse/HHH-5559
// attendee.setResidence(null);
// em.flush();
attendee.setResidence(new Residence("Baltimore", "MD"));
em.flush();
log.debug("verify we have the same number of residences");
assertEquals("unexpected number of residences", startCount+1,
em.createQuery("select count(r) from Residence r", Number.class)
.getSingleResult().intValue());
log.debug("verify the new instance replaced the original instance");
assertNull("found original residence", em.find(Residence.class, originalId));
assertNotNull("could not find new residence", em.find(Residence.class, attendee.getResidence().getId()));
Rebuild the module and re-run the test method to exercise the additional test snippet. There is some debate about what should happen here. Ideally the older, orphaned parent should get deleted when the new parent takes its place, as documented in the trouble tickets HHH-6484 and HHH-5559. However, that does not turn out to be the case.
$ mvn clean test -Dtest=myorg.relex.One2OneTest#testOrphanRemoval
...
 -have attendee change residence
Hibernate: insert into RELATIONEX_RESIDENCE (id, city, state) values (null, ?, ?)
Hibernate: update RELATIONEX_ATTENDEE set name=?, residence_id=? where id=?
 -verify we have the same number of residences
Hibernate: select count(residence0_.id) as col_0_0_ from RELATIONEX_RESIDENCE residence0_ limit ?
...
Failed tests:
  testOrphanRemoval(myorg.relex.One2OneTest): unexpected number of residences expected:<1> but was:<2>
Uncomment the lines that set the parent reference to null and call em.flush() on the session. This work-around produces the desired behavior at the cost of the extra steps/calls.
attendee.setResidence(null);
em.flush();
attendee.setResidence(new Residence("Baltimore", "MD"));
Rebuild the module and re-run the test method with the above lines uncommented. This should cause the older, orphaned parent to be deleted. The assignment to the new parent is an independent step.
$ mvn clean test -Dtest=myorg.relex.One2OneTest#testOrphanRemoval
...
 -have attendee change residence
Hibernate: update RELATIONEX_ATTENDEE set name=?, residence_id=? where id=?
Hibernate: delete from RELATIONEX_RESIDENCE where id=?
Hibernate: insert into RELATIONEX_RESIDENCE (id, city, state) values (null, ?, ?)
Hibernate: update RELATIONEX_ATTENDEE set name=?, residence_id=? where id=?
 -verify we have the same number of residences
Hibernate: select count(residence0_.id) as col_0_0_ from RELATIONEX_RESIDENCE residence0_ limit ?
 -verify the new instance replaced the original instance
Hibernate: select residence0_.id as id18_0_, residence0_.city as city18_0_, residence0_.state as state18_0_ from RELATIONEX_RESIDENCE residence0_ where residence0_.id=?
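The two-step work-around can be captured in a small helper so callers do not forget the intermediate flush. The following is a hypothetical convenience method, not part of the lab code:

```java
// hypothetical helper encapsulating the orphanRemoval work-around
public static void changeResidence(EntityManager em, Attendee attendee, Residence newResidence) {
    attendee.setResidence(null);         // orphan the current parent
    em.flush();                          // force the provider to issue the DELETE
    attendee.setResidence(newResidence); // assign the new parent as an independent step
}
```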
Add the following statements to verify a simple nulling of the parent. Given the work-around above, it should be no surprise that this works. Note that the explicit flush() is unnecessary here because the provider flushes automatically ahead of the follow-on query.
log.debug("remove reference to the current residence");
attendee.setResidence(null);
//em.flush(); -- note flush is done during follow-on query
log.debug("verify all residences created during this test have been deleted");
assertEquals("unexpected number of residences", startCount,
em.createQuery("select count(r) from Residence r", Number.class)
.getSingleResult().intValue());
Rebuild the module and re-run the test method with the final exercise of the orphan management capability. Notice how the query made the manual call to em.flush() unnecessary.
$ mvn clean test -Dtest=myorg.relex.One2OneTest#testOrphanRemoval ... -remove reference to the current residence -verify all residences created during this test have been deleted Hibernate: update RELATIONEX_ATTENDEE set name=?, residence_id=? where id=? Hibernate: delete from RELATIONEX_RESIDENCE where id=? Hibernate: select count(residence0_.id) as col_0_0_ from RELATIONEX_RESIDENCE residence0_ limit ? ... [INFO] BUILD SUCCESS
You have finished working with orphanRemoval and found that the provider will automatically delete the parent entity when the reference from the dependent is nulled out and the session is flushed to the database. You also saw that a simple replacement did not trigger the removal, a gap noted in a few trouble tickets, which forces your business logic to take extra steps to have the orphan removed.
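The semantics above can be summarized with a toy unit-of-work model. This is an illustration only, with invented class names, and is in no way how Hibernate is actually implemented; it simply encodes the observed behavior: an orphan is deleted only when the nulled reference is flushed, while a direct replacement leaves the old row behind.

```java
import java.util.*;

// Toy model of the orphanRemoval semantics described above -- illustration only,
// not provider code. The "database" set stands in for persisted Residence rows.
class OrphanRemovalSketch {
    static class Residence {
        final String city;
        Residence(String city) { this.city = city; }
    }
    static class Attendee {
        Residence residence; // the orphanRemoval=true reference
    }

    final Set<Residence> database = new HashSet<>(); // persisted rows
    private Residence lastFlushed;                   // reference seen at last flush

    void persist(Attendee a) {
        database.add(a.residence);
        lastFlushed = a.residence;
    }

    // flush(): if the tracked reference was nulled, the old row is orphaned and
    // deleted. A direct replacement (old reference overwritten without an
    // intervening null flush) leaves the old row in place, matching the failed
    // assertion in the first test run above.
    void flush(Attendee a) {
        if (a.residence == null && lastFlushed != null) {
            database.remove(lastFlushed); // orphan deleted
        }
        if (a.residence != null) {
            database.add(a.residence);    // new parent inserted
        }
        lastFlushed = a.residence;
    }
}
```

With this model, assigning a replacement and flushing leaves two rows, while null-then-flush followed by the new assignment leaves one, mirroring the two test runs.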
In this chapter we mapped two entity classes through a one-to-one relationship using multiple techniques and under different situations.
Uni-directional and Bi-directional Relationships
Simple and Composite Primary/Foreign Key
Primary Key, Foreign Key and Link Table Joins
Cascading actions across relationships
Cascade and Orphan Removal
Create a JUnit test class to host tests for the one-to-many mappings.
Put the following JUnit test case base class in your src/test tree. You can delete the sample test method once we add our first real test. JUnit will fail a test case if it cannot locate a @Test to run.
package myorg.relex;
import static org.junit.Assert.*;
import javax.persistence.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.junit.*;
public class One2ManyTest extends JPATestBase {
private static Logger log = LoggerFactory.getLogger(One2ManyTest.class);
@Test
public void testSample() {
log.info("testSample");
}
}
Verify the new JUnit test class builds and executes to completion
relationEx]$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyTest ... -HHH000401: using driver [org.h2.Driver] at URL [jdbc:h2:tcp://localhost:9092/./h2db/ejava] ... [INFO] BUILD SUCCESS
Mapping a one-to-many, uni-directional relationship requires some help since the owning side is a single entity and we must model relationships to many entities. To do this we can either use a join table (which forms a one-to-one, uni-directional relationship with the many side) or insert a foreign key into the table representing the many side. We will take a look at the join table approach first.
In this section we will demonstrate how to form a one-to-many uni-directional relationship using a join table. The join table is necessary since the many side is unaware of the relationship and "many" cannot be represented by a single row in the owning table.
Place the following entity class in your src/main tree. This class provides an example of the many side of a one-to-many, uni-directional relation. Because it is on the inverse side, this entity class has no reference to the owning/parent entity class.
package myorg.relex.one2many;
import javax.persistence.*;
/**
* This class provides an example of an entity class on the many side of a one-to-many,
* uni-directional relationship that will be referenced through a JoinTable.
*/
@Entity
@Table(name="RELATIONEX_RIDER")
public class Rider {
@Id @GeneratedValue
private int id;
@Column(length=32)
private String name;
public Rider() {}
public Rider(int id) {
this.id = id;
}
public int getId() { return id; }
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
}
Place the following entity class in your src/main tree. This class provides an example of the one side of a one-to-many, uni-directional relation. The implementation is incomplete at the moment and relies on defaults to complete the relation.
package myorg.relex.one2many;
import java.util.ArrayList;
import java.util.List;
import javax.persistence.*;
/**
* This entity class provides an example of the one side of a one-to-many, uni-directional relation
* that is realized through a JoinTable.
*/
@Entity
@Table(name="RELATIONEX_BUS")
public class Bus {
@Id
private int number;
@OneToMany
// @JoinTable(
// name="RELATIONEX_BUS_RIDER",
// joinColumns={@JoinColumn(name="BUS_NO")},
// inverseJoinColumns={@JoinColumn(name="RIDER_ID")}
// )
private List<Rider> passengers;
protected Bus() {}
public Bus(int number) {
this.number = number;
}
public int getNumber() { return number; }
public List<Rider> getPassengers() {
if (passengers==null) { passengers = new ArrayList<Rider>(); }
return passengers;
}
public void setPassengers(List<Rider> passengers) {
this.passengers = passengers;
}
}
Add the two entity classes to the persistence unit.
<class>myorg.relex.one2many.Rider</class>
<class>myorg.relex.one2many.Bus</class>
Build the module and look at the default database schema generated for the two entities and relationship
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_BUS ( number integer not null, primary key (number) ); create table RELATIONEX_BUS_RELATIONEX_RIDER ( RELATIONEX_BUS_number integer not null, passengers_id integer not null, unique (passengers_id) ); ... create table RELATIONEX_RIDER ( id integer generated by default as identity, name varchar(32), primary key (id) ); ... alter table RELATIONEX_BUS_RELATIONEX_RIDER add constraint FK3F295C59773EE4ED foreign key (RELATIONEX_BUS_number) references RELATIONEX_BUS; alter table RELATIONEX_BUS_RELATIONEX_RIDER add constraint FK3F295C5994D90F30 foreign key (passengers_id) references RELATIONEX_RIDER;
Note that...
The default mapping for a @OneToMany is a @JoinTable
A default join table name is created using a combination of the one and many table names
A foreign key to the one table is based on the table name and primary key column name
A foreign key to the many table is based on the property name and primary key column name
The foreign key to the many side is unique since the child can only be related to one parent. There can only be a single row in this table linking the child to a parent.
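The naming defaults called out above follow a simple string pattern. The helper class and method names below are invented for illustration; the real rules are applied internally by the provider, but the results match the generated schema shown above.

```java
// Illustration of the JPA default naming rules observed in the generated schema
// above. Helper names are invented for this sketch.
class JoinTableDefaults {
    // default join table name: <owner table>_<target table>
    static String joinTable(String ownerTable, String targetTable) {
        return ownerTable + "_" + targetTable;
    }
    // default join column (FK to the one/owner side): <owner table>_<owner PK column>
    static String joinColumn(String ownerTable, String ownerPkColumn) {
        return ownerTable + "_" + ownerPkColumn;
    }
    // default inverse join column (FK to the many side): <collection property>_<target PK column>
    static String inverseJoinColumn(String property, String targetPkColumn) {
        return property + "_" + targetPkColumn;
    }
}
```

Applying these rules to Bus/Rider yields exactly the RELATIONEX_BUS_RELATIONEX_RIDER table with RELATIONEX_BUS_number and passengers_id columns seen in the DDL.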
Fill in the relationship mapping with non-default values.
Explicitly use a @JoinTable mapping
Name the join table RELATIONEX_BUS_RIDER
Name the foreign key to the Bus "BUS_NO"
Name the foreign key to the Rider "RIDER_ID"
@OneToMany
@JoinTable(
name="RELATIONEX_BUS_RIDER",
joinColumns={@JoinColumn(name="BUS_NO")},
inverseJoinColumns={@JoinColumn(name="RIDER_ID")}
)
private List<Rider> passengers;
Rebuild the module and re-generate the database schema for the entities and relationship. Nothing will functionally change with the relationship except that we will be more in control of the database table and column names chosen in case we have to map to a legacy database.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_BUS ( number integer not null, primary key (number) ); create table RELATIONEX_BUS_RIDER ( BUS_NO integer not null, RIDER_ID integer not null, unique (RIDER_ID) ); ... create table RELATIONEX_RIDER ( id integer generated by default as identity, name varchar(32), primary key (id) ); ... alter table RELATIONEX_BUS_RIDER add constraint FKE78EAB2B869BB455 foreign key (BUS_NO) references RELATIONEX_BUS; alter table RELATIONEX_BUS_RIDER add constraint FKE78EAB2B3961502F foreign key (RIDER_ID) references RELATIONEX_RIDER;
Add the following test method to your existing one-to-many JUnit test case to demonstrate the capabilities of a one-to-many, uni-directional relationship mapped using a join table.
@Test
public void testOneToManyUniJoinTable() {
log.info("*** testOneToManyUniJoinTable ***");
Bus bus = new Bus(302);
em.persist(bus);
List<Rider> riders = new ArrayList<Rider>();
for (int i=0; i<2; i++) {
Rider rider = new Rider();
rider.setName("rider" + i);
em.persist(rider);
riders.add(rider);
}
log.debug("relating entities");
bus.getPassengers().addAll(riders);
em.flush(); em.clear();
log.debug("verify we have expected objects");
Bus bus2 = em.find(Bus.class, bus.getNumber());
assertNotNull("bus not found", bus2);
for (Rider r: bus.getPassengers()) {
assertNotNull("rider not found", em.find(Rider.class, r.getId()));
}
log.debug("verify they are related");
assertEquals("unexpected number of riders", bus.getPassengers().size(), bus2.getPassengers().size());
}
Rebuild the module and run the new test method. All entity instances will be created and then the relationships added.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyTest#testOneToManyUniJoinTable ... -relating entities Hibernate: insert into RELATIONEX_BUS_RIDER (BUS_NO, RIDER_ID) values (?, ?) Hibernate: insert into RELATIONEX_BUS_RIDER (BUS_NO, RIDER_ID) values (?, ?)
Notice that since we are not requiring an EAGER fetch, the parent object can be retrieved without a join of the JoinTable and child/many table.
-verify we have expected objects Hibernate: select bus0_.number as number23_0_ from RELATIONEX_BUS bus0_ where bus0_.number=?
Notice that once we needed the collection, a join of the JoinTable and child table was performed to provide the result.
log.debug("verify they are related");
assertEquals("unexpected number of riders", bus.getPassengers().size(), bus2.getPassengers().size());
-verify they are related Hibernate: select passengers0_.BUS_NO as BUS1_23_1_, passengers0_.RIDER_ID as RIDER2_1_, rider1_.id as id22_0_, rider1_.name as name22_0_ from RELATIONEX_BUS_RIDER passengers0_ inner join RELATIONEX_RIDER rider1_ on passengers0_.RIDER_ID=rider1_.id where passengers0_.BUS_NO=?
Add the following code snippet to the test method to attempt to remove one of the child objects. There is an error with this code because it attempts to remove the child object without first removing the relationship to the parent.
log.debug("remove one of the child objects");
em.remove(bus2.getPassengers().get(0)); //ouch!!! this will violate a FK-constraint
em.flush();
Rebuild the module and attempt to re-run the test method. Note the foreign key constraint violation caused by the current implementation of the test method.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyTest#testOneToManyUniJoinTable ... -remove one of the child objects Hibernate: delete from RELATIONEX_RIDER where id=? -SQL Error: 23503, SQLState: 23503 -Referential integrity constraint violation: "FKE78EAB2B3961502F: PUBLIC.RELATIONEX_BUS_RIDER FOREIGN KEY(RIDER_ID) REFERENCES PUBLIC.RELATIONEX_RIDER(ID) (1)"; SQL statement: delete from RELATIONEX_RIDER where id=? [23503-168]
Update the test method to remove the relationship prior to removing the child object. This should satisfy all constraints.
log.debug("remove one of the child objects");
Rider rider = bus2.getPassengers().get(0);
log.debug("removing the relationship");
assertTrue("ride not found in relation", bus2.getPassengers().remove(rider));
em.flush();
log.debug("removing the object");
em.remove(rider);
em.flush();
Rebuild the module and re-run the test method. The provider now knows to delete the relationship prior to deleting the child object. However, it is of some interest that the provider chose to delete all relationship rows and re-insert only the remaining ones rather than remove the specific one.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyTest#testOneToManyUniJoinTable ... -remove one of the child objects -removing the relationship Hibernate: delete from RELATIONEX_BUS_RIDER where BUS_NO=? Hibernate: insert into RELATIONEX_BUS_RIDER (BUS_NO, RIDER_ID) values (?, ?) -removing the object Hibernate: delete from RELATIONEX_RIDER where id=? ... [INFO] BUILD SUCCESS
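The delete-all-then-reinsert pattern in the SQL above is typical of how Hibernate updates an unindexed List (a "bag"): without an order column it does not map individual elements to individual join-table rows, so it recreates the rows for the parent. A toy sketch of that update strategy follows (invented names, not Hibernate code):

```java
import java.util.*;

// Toy sketch of the bag update strategy seen in the SQL above: rather than
// issuing a targeted delete, the provider clears every join-table row for the
// parent and re-inserts rows for the elements that remain. Invented names.
class BagJoinTableSketch {
    // stand-in for the join table: parent id -> child ids
    final Map<Integer, List<Integer>> joinTable = new HashMap<>();

    void flushCollection(int parentId, List<Integer> currentChildIds) {
        joinTable.remove(parentId);                                // delete from ... where BUS_NO=?
        joinTable.put(parentId, new ArrayList<>(currentChildIds)); // insert remaining rows
    }
}
```

The end state is the same as a targeted delete would produce; the difference is only in the SQL issued.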
You have finished a brief section on implementing a one-to-many, uni-directional relationship using a join table. You should have noticed that the join table is the default mapping for such a construct and in the following section we will eliminate the join table and insert a foreign key into the child table.
In this section we will demonstrate how to form a one-to-many uni-directional relationship without a join table. In this case the @JoinColumn in the owning/one side actually maps the relationship to a foreign key in the dependent entity table.
Place the following entity class in your src/main tree. It provides an example of the many side of a one-to-many, uni-directional relationship that we will map using a foreign key in the table for this entity class.
package myorg.relex.one2many;
import javax.persistence.*;
/**
* This class provides an example of the many side of a one-to-many, uni-directional relationship
* mapped using a foreign key in the child entity table. Note that all mapping will be from the one/owning
* side and no reference to the foreign key exists within this class.
*/
@Entity
@Table(name="RELATIONEX_STOP")
public class Stop {
@Id @GeneratedValue
private int id;
@Column(length=16)
private String name;
public int getId() { return id; }
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
}
Place the following entity class in your src/main tree. It provides an example of the one side of a one-to-many, uni-directional relationship that will define a foreign key mapping in the remote child table. The @JoinColumn referenced in the @OneToMany relationship is referring to a column in the child's table and not the parent's table.
package myorg.relex.one2many;
import java.util.ArrayList;
import java.util.List;
import javax.persistence.*;
/**
* This class provides an example of the one side of a one-to-many, uni-directional relationship
* mapped using a foreign key inserted into the child/many table. The @JoinColumn is referencing
* the child table and not this entity's table.
*/
@Entity
@Table(name="RELATIONEX_ROUTE")
public class Route {
@Id
private int number;
@OneToMany
@JoinColumn
private List<Stop> stops;
protected Route() {}
public Route(int number) {
this.number = number;
}
public int getNumber() { return number; }
public List<Stop> getStops() {
if (stops == null) { stops = new ArrayList<Stop>(); }
return stops;
}
public void setStops(List<Stop> stops) {
this.stops = stops;
}
}
Add the two new entity classes to the persistence unit.
<class>myorg.relex.one2many.Stop</class>
<class>myorg.relex.one2many.Route</class>
Build the module and inspect the default database schema generated for the two entities and their one-to-many, uni-directional relationship.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_ROUTE ( number integer not null, primary key (number) ); ... create table RELATIONEX_STOP ( id integer generated by default as identity, name varchar(16), stops_number integer, primary key (id) ); ... alter table RELATIONEX_STOP add constraint FK35604A92586DC195 foreign key (stops_number) references RELATIONEX_ROUTE;
Notice that...
A foreign key was added to the child/many table to reference the parent/one table
There is no property within the child/many entity to model this foreign key
Add the following test method to your existing one-to-many JUnit test case.
@Test
public void testOneToManyUniFK() {
log.info("*** testOneToManyUniFK ***");
Route route = new Route(302);
em.persist(route);
List<Stop> stops = new ArrayList<Stop>();
for (int i=0; i<2; i++) {
Stop stop = new Stop();
stop.setName("stop" + i);
em.persist(stop);
stops.add(stop);
}
log.debug("relating entities");
route.getStops().addAll(stops);
em.flush(); em.clear();
log.debug("verify we have expected objects");
Route route2 = em.find(Route.class, route.getNumber());
assertNotNull("route not found", route2);
for (Stop s: route.getStops()) {
assertNotNull("stop not found", em.find(Stop.class, s.getId()));
}
log.debug("verify they are related");
assertEquals("unexpected number of stops", route.getStops().size(), route2.getStops().size());
}
Rebuild the module and run the new test method to demonstrate creating the entities and relating them through the one-to-many, uni-directional foreign key in the child table. In this case the provider simply updates a foreign key column in the child table rather than inserting a row in a separate join table.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyTest#testOneToManyUniFK ... -relating entities Hibernate: update RELATIONEX_STOP set stops_number=? where id=? Hibernate: update RELATIONEX_STOP set stops_number=? where id=?
Notice how there is no longer a join required when obtaining a list of all children
-verify we have expected objects ... -verify they are related Hibernate: select stops0_.stops_number as stops3_25_1_, stops0_.id as id1_, stops0_.id as id24_0_, stops0_.name as name24_0_ from RELATIONEX_STOP stops0_ where stops0_.stops_number=?
Add the following snippet to verify we can delete a child entity. Note that we have more flexibility in this @JoinColumn case than we did with the previous @JoinTable case. Since there is no separate foreign key referencing the child row and the foreign key is part of the child row -- a delete of the child will automatically remove the relationship.
log.debug("remove one of the child objects");
log.debug("removing the object and relationship");
em.remove(route2.getStops().get(0));
em.flush();
Rebuild the module and re-run the unit test to verify we can delete one of the child objects. Note that we only delete a single row from the child table. In the previous case the provider deleted all instances referencing the parent from the join table and re-added them.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyTest#testOneToManyUniFK ... -remove one of the child objects -removing the object and relationship Hibernate: delete from RELATIONEX_STOP where id=? ... [INFO] BUILD SUCCESS
You have finished implementing a one-to-many, uni-directional relationship using a @JoinColumn from the child table. This case was designed to solve the common situation in Java and databases where the database is mapped using a simple foreign key in the child table and Java is mapped using only a collection in the parent/one entity. The child/member Java entity has no direct knowledge of which parent entities it will be associated with.
In this section we will demonstrate how to map a collection of a simple data type to a database using a parent/dependent set of tables and a foreign key. This case is similar to the one-to-many, uni-directional relationship with a @JoinColumn where all mapping is being done from the owning/one side. However, in this case, the child/many side is not an entity and does not have a primary key.
Place the following entity class in your src/main tree. The class models a collection of alias strings we will want mapped to a separate dependent/child table. However, the class is currently incomplete and will initially define a mapping we do not want.
package myorg.relex.one2many;
import java.util.HashSet;
import java.util.Set;
import javax.persistence.*;
/**
* This class provides an example of the owning side of a collection of base data types.
* In this case we want a unique set of strings (aliases) mapped to this entity using
* a separate dependent table and a foreign key relationship.
*/
@Entity
@Table(name="RELATIONEX_SUSPECT")
public class Suspect {
@Id @GeneratedValue
private int id;
@Column(length=32)
private String name;
// @ElementCollection
// @CollectionTable(
// name="RELATIONEX_SUSPECT_ALIASES",
// joinColumns=@JoinColumn(name="SUSPECT_ID"),
// uniqueConstraints=@UniqueConstraint(columnNames={"SUSPECT_ID", "ALIAS"}))
// @Column(name="ALIAS", length=32)
@Lob
private Set<String> aliases;
public int getId() { return id; }
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
public Set<String> getAliases() {
if (aliases==null) { aliases = new HashSet<String>(); }
return aliases;
}
public void setAliases(Set<String> aliases) {
this.aliases = aliases;
}
}
Add the new entity class to the persistence unit
<class>myorg.relex.one2many.Suspect</class>
Build the module and look at the database schema generated. Notice how the set of strings was mapped to a single blob column within the parent table; the entire set will be serialized into that column when saved.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_SUSPECT ( id integer generated by default as identity, aliases blob, name varchar(32), primary key (id) );
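Conceptually, the blob mapping stores the whole collection as one Java-serialized byte stream, roughly as sketched below (an illustration of the concept, not the provider's actual code). This is why the individual alias values are opaque to SQL queries.

```java
import java.io.*;
import java.util.*;

// Sketch of what the default serialized (@Lob) mapping stores: the whole Set is
// converted to one opaque byte[] for the blob column. Illustration only.
class LobAliasSketch {
    static byte[] toBlob(Set<String> aliases) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
                oos.writeObject(new HashSet<>(aliases)); // HashSet is Serializable
            }
            return bos.toByteArray(); // value written to the single blob column
        } catch (IOException ex) {
            throw new UncheckedIOException(ex);
        }
    }

    @SuppressWarnings("unchecked")
    static Set<String> fromBlob(byte[] column) {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(column))) {
            return (Set<String>) ois.readObject(); // whole set round-trips as one value
        } catch (IOException | ClassNotFoundException ex) {
            throw new IllegalStateException(ex);
        }
    }
}
```

The round-trip works, but no query can match on an individual alias, which is what the @ElementCollection mapping in the next step fixes.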
Fix the mapping such that the set of aliases is kept in a separate dependent/child table with a value column holding each alias and a foreign key back to the parent entity. Replace the @Lob annotation with @ElementCollection, leaving the @CollectionTable customization commented out for now.
@ElementCollection
// @CollectionTable(
// name="RELATIONEX_SUSPECT_ALIASES",
// joinColumns=@JoinColumn(name="SUSPECT_ID"),
// uniqueConstraints=@UniqueConstraint(columnNames={"SUSPECT_ID", "ALIAS"}))
// @Column(name="ALIAS", length=32)
private Set<String> aliases;
Rebuild the module and observe the new database schema generated.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_SUSPECT ( id integer generated by default as identity, name varchar(32), primary key (id) ); ... create table Suspect_aliases ( Suspect_id integer not null, aliases varchar(255) ); ... alter table Suspect_aliases add constraint FK9AC56596DE29C9CF foreign key (Suspect_id) references RELATIONEX_SUSPECT;
Notice that...
A new table was created to house the collection values
The new table has no primary key
The new table has a foreign key that references the parent table
Update the table mapping to control the table and column properties of the child table.
@ElementCollection
@CollectionTable(
name="RELATIONEX_SUSPECT_ALIASES",
joinColumns=@JoinColumn(name="SUSPECT_ID"),
uniqueConstraints=@UniqueConstraint(columnNames={"SUSPECT_ID", "ALIAS"}))
@Column(name="ALIAS", length=32)
private Set<String> aliases;
Rebuild the module and notice the change in database schema generated.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_SUSPECT ( id integer generated by default as identity, name varchar(32), primary key (id) ); create table RELATIONEX_SUSPECT_ALIASES ( SUSPECT_ID integer not null, ALIAS varchar(32), unique (SUSPECT_ID, ALIAS) ); ... alter table RELATIONEX_SUSPECT_ALIASES add constraint FK3FD160E6DE29C9CF foreign key (SUSPECT_ID) references RELATIONEX_SUSPECT;
Notice that...
@CollectionTable.name was used to name the dependent/child table
@CollectionTable.joinColumns was used to name the foreign key to the parent table
@CollectionTable.uniqueConstraints was used to restrict the columns in the table to unique value combinations
@Column.name was used to name the value column used to hold the member of collection
@Column.length was used to control the size of the value column
Add a new test method to your existing one-to-many JUnit test case to demonstrate mapping of simple value collections to parent/dependent tables using a foreign key relationship.
@Test
public void testOneToManyUniElementCollection() {
log.info("*** testOneToManyUniElementCollection ***");
Suspect suspect = new Suspect();
suspect.setName("william");
em.persist(suspect);
suspect.getAliases().add("bill");
suspect.getAliases().add("billy");
em.flush(); em.clear();
log.debug("verify we have expected objects");
Suspect suspect2 = em.find(Suspect.class, suspect.getId());
assertNotNull("suspect not found", suspect2);
for (String a: suspect.getAliases()) {
assertNotNull("alias not found", suspect2.getAliases().contains(a));
}
}
Rebuild the module and run the new test method to demonstrate the construction of the object graph and the mapping to parent/dependent tables.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyTest#testOneToManyUniElementCollection ... -*** testOneToManyUniElementCollection *** Hibernate: insert into RELATIONEX_SUSPECT (id, name) values (null, ?) Hibernate: insert into RELATIONEX_SUSPECT_ALIASES (SUSPECT_ID, ALIAS) values (?, ?) Hibernate: insert into RELATIONEX_SUSPECT_ALIASES (SUSPECT_ID, ALIAS) values (?, ?)
-verify we have expected objects Hibernate: select suspect0_.id as id26_0_, suspect0_.name as name26_0_ from RELATIONEX_SUSPECT suspect0_ where suspect0_.id=? Hibernate: select aliases0_.SUSPECT_ID as SUSPECT1_26_0_, aliases0_.ALIAS as ALIAS0_ from RELATIONEX_SUSPECT_ALIASES aliases0_ where aliases0_.SUSPECT_ID=?
Add the following code snippet to demonstrate we can remove a row from the dependent/child table by removing the value from the collection.
log.debug("remove one of the child objects");
String alias = suspect2.getAliases().iterator().next();
assertTrue("alias not found", suspect2.getAliases().remove(alias));
em.flush();
Rebuild the module and re-run the test method to demonstrate the removal of an element from the collection. Note how the provider is removing a row from the database table.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyTest#testOneToManyUniElementCollection ... -remove one of the child objects Hibernate: delete from RELATIONEX_SUSPECT_ALIASES where SUSPECT_ID=? and ALIAS=? ... [INFO] BUILD SUCCESS
Our example was able to take advantage of the fact that an alias should be unique for a suspect. If that were not the case, we would have had to remove the uniqueness constraint for SUSPECT_ID+ALIAS and risk performing a full table scan unless we either a) manually defined a value index on ALIAS outside of schema generation or b) used a vendor-specific annotation to define the necessary non-unique index. Since database schema generally goes through manual manipulation prior to operational deployment and the generation here is just an aid, this should not be much of an issue.
You have finished mapping a collection of simple types to a set of parent and dependent tables linked through a foreign key. The value within the collection was mapped directly to a column within the dependent table. By mapping the collection to a separate table instead of storing a serialized block into a single column -- we gain the ability to use values in the collection within future database queries.
In the previous section we demonstrated how we could map a collection of simple data values to a set of parent/dependent tables and link them through a foreign key. In this section we are going to add a small complexity where the simple value is now an embeddable type and will map to multiple columns in the dependent entity table.
Put the following class into your src/main tree. This class provides an example of an embeddable non-entity class that will be mapped using an @ElementCollection mapping.
package myorg.relex.one2many;
import java.util.Date;
import javax.persistence.*;
/**
* This class is an example of a non-entity class that will be mapped to a dependent table
* and form the many side of an @ElementCollection.
*/
@Embeddable
public class Produce {
public static enum Color { RED, GREEN, YELLOW }
@Column(length=16)
private String name;
@Enumerated(EnumType.STRING)
@Column(length=10)
private Color color;
@Temporal(TemporalType.DATE)
private Date expires;
public Produce() {}
public Produce(String name, Color color, Date expires) {
this.name = name;
this.color = color;
this.expires = expires;
}
public String getName() { return name; }
public Color getColor() { return color; }
public Date getExpires() { return expires; }
}
Put the following entity class in place in your src/main tree. This class provides an example of using an @ElementCollection to persist a list of embeddable, non-entity class instances within a dependent/child table linked to this entity table using a foreign key. The class defines some of the @CollectionTable mappings but leaves some of the default values to what is defined within the @Embeddable class.
package myorg.relex.one2many;
import java.util.ArrayList;
import java.util.List;
import javax.persistence.*;
/**
* This entity class provides an example of mapping a collection of non-entity/embeddable class instances
* to a dependent/child table and relating the child table to this entity table using a foreign key.
*/
@Entity
@Table(name="RELATIONEX_BASKET")
public class Basket {
@Id @GeneratedValue
private int id;
@ElementCollection
@CollectionTable(
name="RELATIONEX_BASKET_PRODUCE",
joinColumns=@JoinColumn(name="BASKET_ID"))
// @AttributeOverrides({
// @AttributeOverride(name="name", column=@Column(name="ITEM_NAME")),
// @AttributeOverride(name="color", column=@Column(name="ITEM_COLOR"))
// })
private List<Produce> contents;
@Column(length=16)
private String name;
public int getId() { return id; }
public List<Produce> getContents() {
if (contents == null) { contents = new ArrayList<Produce>(); }
return contents;
}
public void setContents(List<Produce> contents) {
this.contents = contents;
}
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
}
Add the two classes to the persistence unit.
<class>myorg.relex.one2many.Basket</class>
<class>myorg.relex.one2many.Produce</class>
Build the module and observe the generated schema produced. Notice how the @Column mappings from the @Embeddable class show up in the resultant schema.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_BASKET ( id integer generated by default as identity, name varchar(16), primary key (id) ); create table RELATIONEX_BASKET_PRODUCE ( BASKET_ID integer not null, color varchar(10), expires date, name varchar(16) ); ... alter table RELATIONEX_BASKET_PRODUCE add constraint FKA97DDDD776CDC1E5 foreign key (BASKET_ID) references RELATIONEX_BASKET;
Assume we wish to control the schema from the parent class. Update the parent class to control the column names for the produce name and color but not the expiration date. You can do this using an @AttributeOverride for each property you wish to change. However, multiple @AttributeOverride instances must be wrapped within an array inside a single @AttributeOverrides instance.
@ElementCollection
@CollectionTable(
name="RELATIONEX_BASKET_PRODUCE",
joinColumns=@JoinColumn(name="BASKET_ID"))
@AttributeOverrides({
@AttributeOverride(name="name", column=@Column(name="ITEM_NAME")),
@AttributeOverride(name="color", column=@Column(name="ITEM_COLOR"))
})
private List<Produce> contents;
Rebuild the module and observe the updated database schema. Notice how the column names specified in the parent's definition of the mapping were used in place of the default names. The override also eliminated the @Column length from the embeddable, falling back to varchar(255). We could fix that in the @Column definition of the @AttributeOverride as well.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_BASKET ( id integer generated by default as identity, name varchar(16), primary key (id) ); create table RELATIONEX_BASKET_PRODUCE ( BASKET_ID integer not null, ITEM_COLOR varchar(255), expires date, ITEM_NAME varchar(255) ); ... alter table RELATIONEX_BASKET_PRODUCE add constraint FKA97DDDD776CDC1E5 foreign key (BASKET_ID) references RELATIONEX_BASKET;
Add the following test method to your existing one-to-many JUnit test case.
@Test
public void testOneToManyUniEmbeddableElementCollection() {
log.info("*** testOneToManyUniEmbeddableElementCollection ***");
Basket basket = new Basket();
basket.setName("assorted fruit");
em.persist(basket);
// note the 3600000L literal: use long arithmetic so the millisecond offsets do not overflow int
basket.getContents().add(new Produce("apple", Produce.Color.RED, new Date(System.currentTimeMillis()+(3600000L*24*30))));
basket.getContents().add(new Produce("banana", Produce.Color.YELLOW, new Date(System.currentTimeMillis()+(3600000L*24*14))));
em.flush(); em.clear();
log.debug("verify we have expected objects");
Basket basket2 = em.find(Basket.class, basket.getId());
assertNotNull("basket not found", basket2);
assertEquals("unexpected contents", basket.getContents().size(), basket2.getContents().size());
}
Build the module and run the new test method to demonstrate the building of the object graph and the mapping to the database.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyTest#testOneToManyUniEmbeddableElementCollection ... -*** testOneToManyUniEmbeddableElementCollection *** Hibernate: insert into RELATIONEX_BASKET (id, name) values (null, ?) Hibernate: insert into RELATIONEX_BASKET_PRODUCE (BASKET_ID, ITEM_COLOR, expires, ITEM_NAME) values (?, ?, ?, ?) Hibernate: insert into RELATIONEX_BASKET_PRODUCE (BASKET_ID, ITEM_COLOR, expires, ITEM_NAME) values (?, ?, ?, ?)
-verify we have expected objects Hibernate: select basket0_.id as id27_0_, basket0_.name as name27_0_ from RELATIONEX_BASKET basket0_ where basket0_.id=? Hibernate: select contents0_.BASKET_ID as BASKET1_27_0_, contents0_.ITEM_COLOR as ITEM2_0_, contents0_.expires as expires0_, contents0_.ITEM_NAME as ITEM4_0_ from RELATIONEX_BASKET_PRODUCE contents0_ where contents0_.BASKET_ID=?
Add the following code snippet to verify we can delete one of the embeddable instances from the collection and the database.
log.debug("remove one of the child objects");
Produce produce = basket2.getContents().get(0);
assertTrue("produce not found", basket2.getContents().remove(produce));
em.flush();
Rebuild the module and re-run the test method to observe how the provider will be deleting our embeddable instance. Notice, in this case, the provider is deleting all related instances and re-adding the remaining instance(s) that are still associated to the parent. That is likely due to the fact that the instance has no primary key and no field combinations were labelled as unique. The provider has no way to tell one instance from another -- so it must delete and re-add.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyTest#testOneToManyUniEmbeddableElementCollection ... -remove one of the child objects Hibernate: delete from RELATIONEX_BASKET_PRODUCE where BASKET_ID=? Hibernate: insert into RELATIONEX_BASKET_PRODUCE (BASKET_ID, ITEM_COLOR, expires, ITEM_NAME) values (?, ?, ?, ?) ... [INFO] BUILD SUCCESS
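As an aside, one common way to reduce the delete-and-reinsert behavior is to give each collection row an identity by persisting the list order with an @OrderColumn. This is not part of the exercise, but a sketch of how the earlier Basket mapping might be augmented (the POSITION column name is a hypothetical choice):

```
@ElementCollection
@CollectionTable(
    name="RELATIONEX_BASKET_PRODUCE",
    joinColumns=@JoinColumn(name="BASKET_ID"))
@OrderColumn(name="POSITION") //hypothetical index column; (BASKET_ID, POSITION) identifies a row
@AttributeOverrides({
    @AttributeOverride(name="name", column=@Column(name="ITEM_NAME")),
    @AttributeOverride(name="color", column=@Column(name="ITEM_COLOR"))
})
private List<Produce> contents;
```

With the index persisted, the provider can typically remove a single element with a targeted delete (plus index updates for any trailing rows) rather than deleting and re-inserting the entire collection.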
You have finished mapping a collection of embeddable, non-entity class instances to a dependent table related to the parent table using a foreign key. You were also able to control the table and column properties through @Column definitions directly applied to the embeddable class and override them in the parent class using @AttributeOverride(s).
In this chapter we will assume what was covered in the one-to-one CASCADE section still applies and focus here on orphanRemoval.
Orphan removal is specific to one-to-one and one-to-many relationships. For a one-to-many relationship, this capability allows members of a collection to be removed once they stop being members of that collection. Thus the child member is only there to support the parent entity.
Put the following child entity class in your src/main tree. This class provides an example of the many side of a one-to-many relationship; thus it has no reference to the one side.
package myorg.relex.one2many;
import javax.persistence.*;
/**
* This class is an example of the many side of a one-to-many, uni-directional relationship
* which uses orphanRemoval of target entities on the many side. This entity exists for the
* sole use of the one side of the relation.
*/
@Entity
@Table(name="RELATIONEX_TODO")
public class Todo {
@Id @GeneratedValue
private int id;
@Column(length=32)
private String title;
public Todo() {}
public Todo(String title) {
this.title = title;
}
public int getId() { return id; }
public String getTitle() { return title; }
public void setTitle(String title) {
this.title = title;
}
}
Place the following parent entity in your src/main tree. This class provides an example of the one/owning side of a one-to-many, uni-directional relationship. The relationship is realized through a foreign key from the child entity table to the parent. The parent has enabled cascade.PERSIST to allow for easy storage of the child entities, but it is currently incomplete when it comes to orphan removal. We will fix that in a moment.
package myorg.relex.one2many;
import java.util.ArrayList;
import java.util.List;
import javax.persistence.*;
/**
 * This class provides an example owning entity in a one-to-many, uni-directional relationship
 * where the members of the collection are subject to orphanRemoval when they are removed from the
 * collection.
 */
@Entity
@Table(name="RELATIONEX_TODOLIST")
public class TodoList {
    @Id @GeneratedValue
    private int id;
    @OneToMany(cascade={CascadeType.PERSIST}
        //    ,orphanRemoval=true
        )
    @JoinColumn
    private List<Todo> todos;
    public int getId() { return id; }
    public List<Todo> getTodos() {
        if (todos==null) {
            todos = new ArrayList<Todo>();
        }
        return todos;
    }
    public void setTodos(List<Todo> todos) {
        this.todos = todos;
    }
}
Add the new entity classes to your persistence unit.
<class>myorg.relex.one2many.Todo</class>
<class>myorg.relex.one2many.TodoList</class>
Add the following test method to your JUnit test case. This portion of the test will simply create a parent and a first child entity.
@Test
public void testOneToManyUniOrphanRemoval() {
log.info("*** testOneToManyUniOrphanRemoval ***");
//check how many child entities exist to start with
int startCount = em.createQuery("select count(t) from Todo t", Number.class).getSingleResult().intValue();
log.debug("create new TODO list with first entry");
TodoList list = new TodoList();
list.getTodos().add(new Todo("get up"));
em.persist(list);
em.flush();
}
Build the module and run the new test method. Note the creation of the parent, child, and the relating of the child to the parent.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyTest#testOneToManyUniOrphanRemoval ... -*** testOneToManyUniOrphanRemoval *** Hibernate: select count(todo0_.id) as col_0_0_ from RELATIONEX_TODO todo0_ limit ? -create new TODO list with first entry Hibernate: insert into RELATIONEX_TODOLIST (id) values (null) Hibernate: insert into RELATIONEX_TODO (id, title) values (null, ?) Hibernate: update RELATIONEX_TODO set todos_id=? where id=? ... [INFO] BUILD SUCCESS
Add the following lines to the unit test to verify a child instance was created.
log.debug("verifying we have new child entity");
assertEquals("new child not found", startCount +1,
em.createQuery("select count(t) from Todo t", Number.class).getSingleResult().intValue());
Rebuild the module and re-run the test method. Notice we get the expected additional child entity in the table.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyTest#testOneToManyUniOrphanRemoval ... -verifying we have new child entity Hibernate: select count(todo0_.id) as col_0_0_ from RELATIONEX_TODO todo0_ limit ? ... [INFO] BUILD SUCCESS
Add the following lines to your test method. This will remove the child entity from the parent collection and test whether it was subjected to an orphan removal.
log.debug("removing child from list");
list.getTodos().clear();
em.flush();
assertEquals("orphaned child not deleted", startCount,
em.createQuery("select count(t) from Todo t", Number.class).getSingleResult().intValue());
Rebuild the module and re-run the test method. Notice how the test currently fails. This is because we never enabled orphanRemoval on the relationship.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyTest#testOneToManyUniOrphanRemoval ... -removing child from list Hibernate: update RELATIONEX_TODO set todos_id=null where todos_id=? Hibernate: select count(todo0_.id) as col_0_0_ from RELATIONEX_TODO todo0_ limit ? ... Failed tests: testOneToManyUniOrphanRemoval(myorg.relex.One2ManyTest): orphaned child not deleted expected:<0> but was:<1> ... [INFO] BUILD FAILURE
Enable orphanRemoval on the relationship on the parent side.
@OneToMany(cascade={CascadeType.PERSIST}
,orphanRemoval=true
)
@JoinColumn
private List<Todo> todos;
Rebuild the module and re-run the test method. Notice how the child entity is removed shortly after it was removed from the parent collection.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyTest#testOneToManyUniOrphanRemoval ... -removing child from list Hibernate: update RELATIONEX_TODO set todos_id=null where todos_id=? Hibernate: delete from RELATIONEX_TODO where id=? Hibernate: select count(todo0_.id) as col_0_0_ from RELATIONEX_TODO todo0_ limit ? ... [INFO] BUILD SUCCESS
You have finished taking a look at orphanRemoval within the context of a one-to-many, uni-directional relationship. With this capability, entities removed from a parent collection are automatically deleted from the database.
In this chapter you worked with several types of one-to-many, uni-directional relationships. In this type of relationship -- the many/dependent/child entity knows nothing of the relationship and it must be defined from the one/parent side. You formed that mapping using a join table as well as inserting a foreign key into the child table (without the child entity knowing about it). You also created mappings to many/child/dependent non-entity classes like Strings and JPA @Embeddable classes. This allows you to create entity classes with collections of non-entity POJOs and have them mapped to database tables that can be used for searches.
We will pause our look at relationship types and take a detour into collection types in the next chapter. We want to make sure we understand many of the options available and the requirements associated with those options before we get too far into more relationships involving a "many" side.
In this chapter we will take a closer look at the collections used within a relationship and how we can better map them to the business need. We will primarily look at collection ordering and access.
Create a JUnit test class to host tests for the collection mappings.
Put the following JUnit test class in your src/test tree. You can delete the sample test method once we add our first real test. JUnit will fail a test case if it cannot locate a @Test to run.
package myorg.relex;
import static org.junit.Assert.*;
import javax.persistence.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.junit.*;
public class CollectionTest extends JPATestBase {
private static Logger log = LoggerFactory.getLogger(CollectionTest.class);
@Test
public void testSample() {
log.info("testSample");
}
}
Verify the new JUnit test class builds and executes to completion
relationEx]$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest ... -HHH000401: using driver [org.h2.Driver] at URL [jdbc:h2:tcp://localhost:9092/./h2db/ejava] ... [INFO] BUILD SUCCESS
This section will focus on how Java and JPA determine the identity of an entity and when one instance equals another. To demonstrate the concepts, please put the following artifacts in place.
Place the following mapped superclass in your src/main tree. Mapped superclasses are POJO base classes for entities that are not themselves entities. The reason we did not make this class an entity is that it is abstract and will never exist within our example tree without a subclass representing the entity. Each instance of the mapped superclass will be assigned an instanceId, a database primary key (when persisted), and a business Id (name).
package myorg.relex.collection;
import java.util.Date;
import java.util.concurrent.atomic.AtomicInteger;
import javax.persistence.Column;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.MappedSuperclass;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;
import javax.persistence.Transient;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* This class is used as a common base implementation by several implementations
* of hashCode/equals.
*/
@MappedSuperclass
public abstract class Ship {
@Transient
protected final Logger log = LoggerFactory.getLogger(getClass());
private static AtomicInteger instanceId = new AtomicInteger();
@Transient
private int oid = instanceId.getAndAdd(1);
@Id
@GeneratedValue
protected int id;
@Column(length = 16)
protected String name; //businessId
@Temporal(TemporalType.TIMESTAMP)
protected Date created;
public int getId() { return id; }
public Ship setId(int id) {
this.id = id;
return this;
}
public String getName() { return name; }
public Ship setName(String name) {
this.name = name;
return this;
}
public Date getCreated() { return created; }
public Ship setCreated(Date created) {
this.created = created;
return this;
}
public abstract int peekHashCode();
protected int objectHashCode() {
return super.hashCode();
}
@Override
public int hashCode() {
return logHashCode(peekHashCode());
}
public int logHashCode(int hashCode) {
log.info(toString() +
".hashCode=" + hashCode);
return hashCode;
}
public boolean logEquals(Object obj, boolean equals) {
log.info(new StringBuilder()
.append(toString())
.append(".equals(id=")
.append(obj==null?null : ((Ship)obj).id + ",oid=" + ((Ship)obj).oid)
.append(")=")
.append(equals));
return equals;
}
public String toString() {
return getClass().getSimpleName() + "(id=" + id + ",oid=" + oid + ")";
}
}
Place the following entity class in your src/main tree. This class will be used to represent the parent/one end of a one-to-many relationship. It is currently incomplete. We will add more to it later.
package myorg.relex.collection;
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import javax.persistence.*;
/**
* This class provides an example one/parent entity with a relationship to many child/dependent
* objects -- with the members in each collection based on a different hashCode/equals method.
*/
@Entity
@Table(name="RELATIONEX_FLEET")
public class Fleet {
@Id @GeneratedValue
private int id;
@Column(length=16)
private String name;
public int getId() { return id; }
public void setId(int id) {
this.id = id;
}
public String getName() { return name;}
public void setName(String name) {
this.name = name;
}
}
Add only the entity class to the persistence unit. Do not add the mapped superclass.
<class>myorg.relex.collection.Fleet</class>
In this section we will demonstrate how the default java.lang.Object hashCode and equals methods behave within Java collections and impact JPA code. This technique works when a single instance represents a real object. If two objects are of the same class but different instances -- then they will have different hashCode identities and equals will return false even if every Java attribute they host has an equivalent value.
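The behavior described above is plain Java, independent of JPA. A minimal, self-contained sketch (the Point class here is hypothetical and not part of the exercise):

```java
import java.util.HashSet;
import java.util.Set;

public class DefaultIdentityDemo {
    //hypothetical class relying on the default java.lang.Object hashCode/equals
    static class Point {
        int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    public static void main(String[] args) {
        Point p1 = new Point(1, 2);
        Point p2 = new Point(1, 2); //same values, different instance

        //identity-based equality: an instance is equal only to itself
        System.out.println(p1.equals(p1)); //true
        System.out.println(p1.equals(p2)); //false

        //a Set treats the two equivalent-valued instances as distinct members
        Set<Point> points = new HashSet<>();
        points.add(p1);
        points.add(p2);
        System.out.println(points.size()); //2
    }
}
```

Two instances with identical field values still occupy two slots in the HashSet because identity-based hashCode/equals treats them as unrelated objects.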
Add the following entity class to your src/main tree. This class represents an entity that implements hashCode and equals using the default java.lang.Object hashCode/equals implementation, except that it prints some debug output when these methods are called. Notice that it extends the mapped superclass you added earlier.
package myorg.relex.collection;
import javax.persistence.*;
/**
 * This class provides an example of an entity that implements hashCode/equals
* using the default java.lang.Object implementation. Note this implementation is instance-specific.
* No other instance will report the same value even if they represent the same row in the DB.
*/
@Entity
@Table(name="RELATIONEX_SHIP")
public class ShipByDefault extends Ship {
@Override
public int peekHashCode() {
return super.objectHashCode();
}
@Override
public boolean equals(Object obj) {
try {
if (this == obj) { return logEquals(obj, true); }
boolean equals = super.equals(obj);
return logEquals(obj, equals);
} catch (Exception ex) {
return logEquals(obj, false);
}
}
}
Add the entity class to your persistence unit.
<class>myorg.relex.collection.ShipByDefault</class>
Add the following test method and initial code to your collections JUnit test case. This test provides a simple demonstration of how two instances with the same values will report they are different when using the default java.lang.Object implementations of hashCode and equals.
@Test
public void testByDefault() {
log.info("*** testByDefault ***");
Ship ship1 = new ShipByDefault();
Ship ship2 = new ShipByDefault();
assertFalse("unexpected hashCode", ship1.hashCode() == ship2.hashCode());
assertFalse("unexpected equality", ship1.equals(ship2));
}
Build the module, run the new JUnit test case, and observe the results. Notice how the two instances have the same databaseId (unassigned at this point) but a different instanceId, significantly different hashCodes and an equals that does not match.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testByDefault ... -*** testByDefault *** -ShipByDefault(id=0,oid=4).hashCode=1215713589 -ShipByDefault(id=0,oid=5).hashCode=1100908089 -ShipByDefault(id=0,oid=4).equals(id=0,oid=5)=false ... [INFO] BUILD SUCCESS
Add the following lines of code to the existing test method to persist the entity and attempt to retrieve it while still in the cache.
log.debug("persisting entity");
em.persist(ship1);
em.flush();
Ship ship3 = em.find(ShipByDefault.class, ship1.getId());
assertTrue("unexpected hashCode", ship1.hashCode() == ship3.hashCode());
assertTrue("unexpected inequality", ship1.equals(ship3));
Rebuild the module, re-run the test method and observe the equality that occurs. The two variable instances have the same hashCode and are equal because they reference the same entity instance.
-persisting entity Hibernate: insert into RELATIONEX_SHIP (id, created, name) values (null, ?, ?) -ShipByDefault(id=1,oid=4).hashCode=1341189399 -ShipByDefault(id=1,oid=4).hashCode=1341189399 -ShipByDefault(id=1,oid=4).equals(id=1,oid=4)=true ... [INFO] BUILD SUCCESS
Add the following lines of code to your existing test method to show how the equality of the instances depends on whether the cache is still in place.
log.debug("getting new instance of entity");
em.clear();
Ship ship4 = em.find(ShipByDefault.class, ship1.getId());
assertFalse("unexpected hashCode", ship1.hashCode() == ship4.hashCode());
assertFalse("unexpected equality", ship1.equals(ship4));
Rebuild the module, re-run the test method, and observe that the instances are now unequal because they are different instances. We can be sure they are different instances -- even though they both represent the same database Id -- by the value printed for the oid.
-getting new instance of entity Hibernate: select shipbydefa0_.id as id29_0_, shipbydefa0_.created as created29_0_, shipbydefa0_.name as name29_0_ from RELATIONEX_SHIP shipbydefa0_ where shipbydefa0_.id=? -ShipByDefault(id=1,oid=4).hashCode=368668382 -ShipByDefault(id=1,oid=6).hashCode=346534810 -ShipByDefault(id=1,oid=4).equals(id=1,oid=6)=false ... [INFO] BUILD SUCCESS
You have finished demonstrating how entities using the default java.lang.Object implementation of hashCode and equals identify themselves as equal only if they are referencing the same instance. This works as long as the instance is available to be referenced but would not work in cases where we want the identity to span instances that might share the same properties. In the next section we will look at factoring in database Id into the hashCode and equality implementations.
In this section we will demonstrate an attempt at modeling hashCode and equals through the database-assigned primary key. After all -- this value is meant to be our Id for the entity.
Add the following entity class to your src/main tree. This class will base its hashCode and equals solely on the assigned (or unassigned) primary key.
package myorg.relex.collection;
import javax.persistence.*;
/**
 * This class provides an example of an entity that implements hashCode/equals
* using its database assigned primary key. Note the PK is not assigned until the
* entity is inserted into the database -- so there will be a period of time prior
* to persist() when all instances of this class report the same hashCode/equals.
*/
@Entity
@Table(name="RELATIONEX_SHIP")
public class ShipByPK extends Ship {
@Override
public int peekHashCode() {
return id;
}
@Override
public boolean equals(Object obj) {
try {
if (this == obj) { return logEquals(obj, true); }
boolean equals = id==((ShipByPK)obj).id;
return logEquals(obj, equals);
} catch (Exception ex) {
return logEquals(obj, false);
}
}
}
Add the entity class to your persistence unit.
<class>myorg.relex.collection.ShipByPK</class>
Add the following test method to your existing unit test. This test will demonstrate how we can get two instances to logically represent the same thing.
@Test
public void testByPK() {
log.info("*** testByPK ***");
Ship ship1 = new ShipByPK();
Ship ship2 = new ShipByPK();
assertTrue("unexpected hashCode", ship1.hashCode() == ship2.hashCode());
assertTrue("unexpected equality", ship1.equals(ship2));
}
Rebuild the module and run the new test method. Notice how two object instances with the same database primary key value can easily report the same hashCode and report they are equal.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testByPK ... -*** testByPK *** -ShipByPK(id=0,oid=4).hashCode=0 -ShipByPK(id=0,oid=5).hashCode=0 -ShipByPK(id=0,oid=4).equals(id=0,oid=5)=true ... [INFO] BUILD SUCCESS
Add the following lines of code to your existing test method. This code will demonstrate how an earlier unmanaged instance and a newly found managed instance will report they are the same.
log.debug("persisting entity");
em.persist(ship1);
em.flush();
em.clear();
log.debug("getting new instance of entity");
Ship ship4 = em.find(ShipByPK.class, ship1.getId());
assertTrue("unexpected hashCode", ship1.hashCode() == ship4.hashCode());
assertTrue("unexpected equality", ship1.equals(ship4));
Rebuild the module and re-run the test method. Notice how the common primary key value causes the two instances to be equal.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testByPK
...
-getting new instance of entity
Hibernate:
select
shipbypk0_.id as id29_0_,
shipbypk0_.created as created29_0_,
shipbypk0_.name as name29_0_
from
RELATIONEX_SHIP shipbypk0_
where
shipbypk0_.id=?
-ShipByPK(id=1,oid=4).hashCode=1
-ShipByPK(id=1,oid=6).hashCode=1
-ShipByPK(id=1,oid=4).equals(id=1,oid=6)=true
...
[INFO] BUILD SUCCESS
Add the following lines of code to your existing test method. This will demonstrate that even though the two instances report they are equal, the provider still treats them as being distinct and not interchangeable.
log.debug("check if entity manager considers them the same");
assertFalse("em contained first entity", em.contains(ship1));
assertTrue("em did not contain second entity", em.contains(ship4));
Rebuild the module and re-run the test method. Note how the entity manager is able to tell the two instances apart and is not making any calls to hashCode or equals to determine if they are contained in the persistence context. This is helpful because we don't get confused by which instance is actually currently managed.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testByPK ... -check if entity manager considers them the same ... [INFO] BUILD SUCCESS
Up to now, we have been showing all good things about the databaseId approach. Add the following code to your existing test method to demonstrate an issue with the technique. In the following code we attempt to create two separate, logical instances and add them to a Set. Since elements of sets are unique and the implementation of the class is based on a currently uninitialized primary key -- only the first entry is added to the set.
Set<Ship> ships = new HashSet<Ship>();
Ship ship5 = new ShipByPK().setName("one");
Ship ship6 = new ShipByPK().setName("two");
log.debug("add first ship to the set");
assertTrue("first entity not accepted into set", ships.add(ship5));
log.debug("add second ship to the set");
assertFalse("second entity accepted into set", ships.add(ship6));
assertEquals("unexpected set.size", 1, ships.size());
log.debug("ships=" + ships);
Rebuild the module, re-run the test method, and note the final contents of the Set only contains the first entity. Since the first entity reported it equaled the second entity -- the second entity was not added to the set. This can be an issue if we want to model a relationship as a unique Set prior to the entities being persisted to the database.
-add first ship to the set -ShipByPK(id=0,oid=7).hashCode=0 -add second ship to the set -ShipByPK(id=0,oid=8).hashCode=0 -ShipByPK(id=0,oid=8).equals(id=0,oid=7)=true -ships=[ShipByPK(id=0,oid=7)] ... [INFO] BUILD SUCCESS
You have finished demonstrating a potential option for deriving hashCode and equals that would make two separate instances logically representing the same thing report as equal. However, this solution -- as demonstrated -- has issues. It only works for persisted entities that already have their database identity assigned. This can be a serious issue for entity classes with a @GeneratedValue for a primary key and parents that house those entities within Sets. In the next section we will look at a potential hybrid solution.
In this section we will demonstrate an option for deriving hashCode and equals that will report two instances for the same logical object and attempt to compensate for addition to a set prior to being assigned a primary key.
Add the following class to your src/main tree. This class will default to the java.lang.Object approach prior to being given a primary key -- and then switch to the primary key from that point forward. It sounds good -- but will also have some issues we will demonstrate.
The Internet is not short on discussion of hashCode/equals and whether its value and result can be changed during the lifetime of an object. The java.lang.Object.hashCode javadoc states that "...the hashCode method must consistently return the same integer, provided no information used in equals comparisons on the object is modified". The first part of that phrase makes the following solution suspect. The last part of the phrase at least makes it a legal option.
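The concern raised by the first part of that phrase can be demonstrated in plain Java, without JPA. A minimal sketch, assuming a hypothetical Item class whose hashCode depends on a mutable id field (analogous to a @GeneratedValue primary key being assigned at persist):

```java
import java.util.HashSet;
import java.util.Set;

public class ShiftingHashDemo {
    //hypothetical class whose hashCode/equals depend on a mutable id field
    static class Item {
        int id; //0 until "assigned", like a @GeneratedValue primary key
        Item(int id) { this.id = id; }
        @Override public int hashCode() { return id; }
        @Override public boolean equals(Object o) {
            return o instanceof Item && ((Item)o).id == id;
        }
    }

    public static void main(String[] args) {
        Set<Item> items = new HashSet<>();
        Item item = new Item(0);
        items.add(item);   //stored in the bucket for hashCode()==0

        item.id = 42;      //simulate the provider assigning a primary key

        //the lookup now hashes to a different bucket -- the entry is "lost"
        System.out.println(items.contains(item)); //false
        System.out.println(items.size());         //still 1
    }
}
```

The item was stored in the bucket for its old hash value; once the field changes, lookups hash to a different bucket and the still-present entry can no longer be found.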
package myorg.relex.collection;
import javax.persistence.*;
/**
 * This class provides an example of an entity that implements hashCode/equals
* using its database assigned primary key if it exists and defaults to the
* java.lang.Object definition if not yet assigned. Note that this technique causes
* a change in hashCode/equals after the persist() takes place -- invalidating anything
* previously cached for the identity.
*/
@Entity
@Table(name="RELATIONEX_SHIP")
public class ShipBySwitch extends Ship {
@Override
public int peekHashCode() {
return id==0 ? super.objectHashCode() : id;
}
@Override
public boolean equals(Object obj) {
try {
if (this == obj) { return logEquals(obj, true); }
boolean equals = (id==0) ? super.equals(obj) :
id==((ShipBySwitch)obj).id;
return logEquals(obj, equals);
} catch (Exception ex) {
return logEquals(obj, false);
}
}
}
Add the entity class to your persistence unit.
<class>myorg.relex.collection.ShipBySwitch</class>
Add the following test method to your existing JUnit test case. It will be used to demonstrate the benefits and issues of having an object switch hashCode values and equals results.
@Test
public void testBySwitch() {
log.info("*** testBySwitch ***");
Ship ship1 = new ShipBySwitch().setName("one");
Ship ship2 = new ShipBySwitch().setName("two");
assertFalse("unexpected hashCode", ship1.hashCode() == ship2.hashCode());
assertFalse("unexpected equality", ship1.equals(ship2));
}
Rebuild the module and run the new test method. Notice how the two instances are immediately determined to be different during the pre-persist state by the fact they are two different instances. If we wanted them to be the same -- we could have switched to comparing object properties.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testBySwitch ... -*** testBySwitch *** -ShipBySwitch(id=0,oid=4).hashCode=734740843 -ShipBySwitch(id=0,oid=5).hashCode=744458212 -ShipBySwitch(id=0,oid=4).equals(id=0,oid=5)=false ... [INFO] BUILD SUCCESS
Add the following lines of code to your existing method. This will demonstrate how the two instances can be added to a set -- unlike before.
Set<Ship> ships = new HashSet<Ship>();
log.debug("add first ship to the set");
assertTrue("first entity not accepted into set", ships.add(ship1));
log.debug("add second ship to the set");
assertTrue("second entity not accepted into set", ships.add(ship2));
assertEquals("unexpected set.size", 2, ships.size());
log.debug("ships=" + ships);
Rebuild the module and re-run the test method. Note this time around we end up with two instances in the set.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testBySwitch ... -add first ship to the set -ShipBySwitch(id=0,oid=4).hashCode=434276434 -add second ship to the set -ShipBySwitch(id=0,oid=5).hashCode=1226345699 -ships=[ShipBySwitch(id=0,oid=4), ShipBySwitch(id=0,oid=5)] ... [INFO] BUILD SUCCESS
Add the following lines of code to your existing test method. This should demonstrate how the object shifts from using the instanceId to the databaseId once it has been assigned.
em.persist(ship1);
em.flush();
em.clear();
log.debug("getting new instance of entity");
Ship ship4 = em.find(ShipBySwitch.class, ship1.getId());
assertTrue("unexpected hashCode", ship1.hashCode() == ship4.hashCode());
assertTrue("unexpected equality", ship1.equals(ship4));
Rebuild the module and re-run the updated test method. Notice how the previously managed instance from the persist() and the newly managed instance from the find() report they represent the same object.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testBySwitch ... -getting new instance of entity Hibernate: select shipbyswit0_.id as id29_0_, shipbyswit0_.created as created29_0_, shipbyswit0_.name as name29_0_ from RELATIONEX_SHIP shipbyswit0_ where shipbyswit0_.id=? -ShipBySwitch(id=1,oid=4).hashCode=1 -ShipBySwitch(id=1,oid=6).hashCode=1 -ShipBySwitch(id=1,oid=4).equals(id=1,oid=6)=true ... [INFO] BUILD SUCCESS
Add the following lines of code to the existing unit test. This will attempt to find the entity that we know exists in the set.
log.debug("set=" + ships);
log.debug("checking set for entity");
assertFalse("set found changed entity after persist", ships.contains(ship1));
Rebuild the module and re-run the test method. Notice the entity can no longer be found in the set. This is because the hashCode has changed from when it was originally inserted into the set. This can't be good. Let's stop here with this solution.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testBySwitch
...
-set=[ShipBySwitch(id=1,oid=4), ShipBySwitch(id=0,oid=5)]
-checking set for entity
-ShipBySwitch(id=1,oid=4).hashCode=1
...
[INFO] BUILD SUCCESS
You have finished trying out a technique where we shift the hashCode/equals implementation based on a change in state. The javadoc for java.lang.Object states it is legal for an instance to do this, but the technique obviously does not work when the hashCode is stored separately from the entity -- as in a HashSet. This technique represents the last automatic calculation of hashCode/equals we will try -- and they all had some type of deficiency. We will next look at using business identity within the entity properties to derive the hashCode and equals.
In this section we will look at one last identity mechanism. It is based on a "business identity". In this approach we model our entity with enough properties that they may be able to uniquely identify the entity without a database Id. The designer could use these field(s) as the primary key or just make them the implementation basis for hashCode and equals. In the following example we will use both a database-assigned primary key and separate identifying business properties.
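The essence of a business identity can be sketched in plain Java before wiring it into an entity. This minimal example (the Tag class is hypothetical and not part of the exercise) derives hashCode/equals from a value property so that equality spans instances:

```java
import java.util.HashSet;
import java.util.Objects;
import java.util.Set;

public class ValueIdentityDemo {
    //hypothetical class with equality derived from a business property
    static class Tag {
        final String name;
        Tag(String name) { this.name = name; }
        @Override public int hashCode() { return Objects.hashCode(name); }
        @Override public boolean equals(Object o) {
            return o instanceof Tag && Objects.equals(((Tag)o).name, name);
        }
    }

    public static void main(String[] args) {
        Tag t1 = new Tag("red");
        Tag t2 = new Tag("red"); //separate instance, same business value
        System.out.println(t1.equals(t2)); //true -- equality spans instances

        Set<Tag> tags = new HashSet<>();
        tags.add(t1);
        tags.add(t2); //rejected as a duplicate by value
        System.out.println(tags.size()); //1
    }
}
```

Unlike the instance- and primary-key-based approaches, this works before and after persist -- provided the chosen properties really are unique and stable for the life of the object.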
Add the following entity class to your src/main tree. This entity class derives its hashCode and equals using a name and createTime property. It may be possible that two entities have the same name -- but no two entities should be created with the same name in the same millisecond.
package myorg.relex.collection;
import javax.persistence.*;
/**
* This class provides an example of an entity that implements hashCode/equals
* using its business identity. Note that it is not always easy to derive a business Id
* for an entity class.
*/
@Entity
@Table(name="RELATIONEX_SHIP")
public class ShipByBusinessId extends Ship {
@Override
public int peekHashCode() {
return (name==null ? 0 : name.hashCode()) +
(created==null ? 0 : (int)created.getTime());
}
@Override
public boolean equals(Object obj) {
try {
if (this == obj) { return logEquals(obj, true); }
boolean equals = name.equals(((ShipByBusinessId)obj).name) &&
created.getTime() == (((ShipByBusinessId)obj).created.getTime());
return logEquals(obj, equals);
} catch (Exception ex) {
return logEquals(obj, false);
}
}
@Override
public String toString() {
return super.toString() +
", name=" + name +
", created=" + (created==null ? 0 : created.getTime());
}
}
Add the new entity class to your persistence unit.
<class>myorg.relex.collection.ShipByBusinessId</class>
Add the following test method to your existing JUnit test case. The first part of the test simply verifies we can determine the two instances are different. Note that since we are factoring the createTime into the businessId, a delay is inserted to make sure we get at least one millisecond difference in createTime. Depending on how our entities are created, this may not be necessary.
@Test
public void testByBusinessId() {
log.info("*** testByBusinessId ***");
Ship ship1 = new ShipByBusinessId().setName("one").setCreated(new Date());
try { Thread.sleep(1);} catch (InterruptedException e) {}
Ship ship2 = new ShipByBusinessId().setName("two").setCreated(new Date());
assertFalse("unexpected hashCode", ship1.hashCode() == ship2.hashCode());
assertFalse("unexpected equality", ship1.equals(ship2));
}
Build the model and run the new test method. Note that we are factoring in both name and createTime into the hashCode/equals and ignoring the databaseId and instanceId.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testByBusinessId
...
-*** testByBusinessId ***
-ShipByBusinessId(id=0,oid=4), name=one, created=1364958763502.hashCode=-840726444
-ShipByBusinessId(id=0,oid=5), name=two, created=1364958763503.hashCode=-840721349
-ShipByBusinessId(id=0,oid=4), name=one, created=1364958763502.equals(id=0,oid=5)=false
...
[INFO] BUILD SUCCESS
Add the following to your test method. Since we have unique identities at this point, both instances will be placed into the set. We will also notice later that since the hashCode/equals does not change -- the set will be usable after the calls to persist() complete.
Set<Ship> ships = new HashSet<Ship>();
log.debug("add first ship to the set");
assertTrue("first entity not accepted into set", ships.add(ship1));
log.debug("add second ship to the set");
assertTrue("second entity not accepted into set", ships.add(ship2));
assertEquals("unexpected set.size", 2, ships.size());
log.debug("ships=" + ships);
Rebuild the module and re-run the test method. Not surprisingly, we get both instances inserted into the set.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testByBusinessId
...
-add first ship to the set
-ShipByBusinessId(id=0,oid=4), name=one, created=1364959025047.hashCode=-840464899
-add second ship to the set
-ShipByBusinessId(id=0,oid=5), name=two, created=1364959025048.hashCode=-840459804
-ships=[ShipByBusinessId(id=0,oid=4), name=one, created=1364959025047, ShipByBusinessId(id=0,oid=5), name=two, created=1364959025048]
...
[INFO] BUILD SUCCESS
Add the following lines to your test method. In this section we will be determining whether a managed and unmanaged instance will have the same Id. Note the extra effort to zero out the databaseId for the unmanaged instance.
em.persist(ship1);
em.flush();
em.clear();
log.debug("getting new instance of entity");
Ship ship4 = em.find(ShipByBusinessId.class, ship1.getId());
ship1.setId(0); //making sure that databaseId not used in hashCode/equals
assertTrue("unexpected hashCode", ship1.hashCode() == ship4.hashCode());
assertTrue("unexpected equality", ship1.equals(ship4));
Rebuild the module and re-run the test method. Note how the instances match even when one does not have the databaseId to work with.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testByBusinessId
...
-getting new instance of entity
Hibernate: select shipbybusi0_.id as id29_0_, shipbybusi0_.created as created29_0_, shipbybusi0_.name as name29_0_ from RELATIONEX_SHIP shipbybusi0_ where shipbybusi0_.id=?
-ShipByBusinessId(id=0,oid=4), name=one, created=1364959720863.hashCode=-839769083
-ShipByBusinessId(id=1,oid=6), name=one, created=1364959720863.hashCode=-839769083
-ShipByBusinessId(id=0,oid=4), name=one, created=1364959720863.equals(id=1,oid=6)=true
...
[INFO] BUILD SUCCESS
Add the following lines to your test method. This is where the hashCode/equals basis switch failed us last time.
log.debug("set=" + ships);
log.debug("checking set for entity");
assertTrue("entity not found after persist", ships.contains(ship1));
Rebuild your module and re-run the test method. This time around you should notice the entity can be found within the set.
-set=[ShipByBusinessId(id=0,oid=5), name=two, created=1364959903295, ShipByBusinessId(id=0,oid=4), name=one, created=1364959903294]
-checking set for entity
-ShipByBusinessId(id=0,oid=4), name=one, created=1364959903294.hashCode=-839586652
...
[INFO] BUILD SUCCESS
You have finished working through the business identity solution for calculating hashCode and equals. This technique has the benefit of being stable from the time the instance was created in memory through its persistence to the database. It was a bit harder to implement because we needed to find enough stable entity properties persisted to the database to derive the values from. If these values are actually unique, we could have considered using them for the primary key, but that would have added database complexity/expense.
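As a side note, since JDK 7 the java.util.Objects utility class can shorten a business-identity implementation considerably. The following is a minimal, hypothetical sketch (the BusinessId class name is illustrative, not part of the lab code) deriving hashCode/equals from the same name/created pair:

```java
import java.util.Date;
import java.util.Objects;

public class BusinessId {
    private final String name;
    private final Date created;

    public BusinessId(String name, Date created) {
        this.name = name;
        this.created = created;
    }

    @Override
    public int hashCode() {
        // combines the business properties; null-safe without explicit guards
        return Objects.hash(name, created);
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj) { return true; }
        if (!(obj instanceof BusinessId)) { return false; }
        BusinessId rhs = (BusinessId) obj;
        return Objects.equals(name, rhs.name) && Objects.equals(created, rhs.created);
    }
}
```

Objects.hash() and Objects.equals() handle the null checks that the hand-written versions had to spell out.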
In this section we will demonstrate the use of ordering a collection mapped with JPA. The previous sections mapped the many aspects of the collection but did not represent any specific ordering within the collection.
Put the following class in your src/main tree. We will use this entity as something we wish to sort within our application. It is currently incomplete and does not sort without some external help.
package myorg.relex.collection;
import java.util.Comparator;
import javax.persistence.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* This class represents an example entity that has an order in its parent's list.
*/
@Entity
@Table(name="RELATIONEX_SEGMENT")
public class Segment {// implements Comparable<Segment>{
private static final Logger log = LoggerFactory.getLogger(Segment.class);
@Id @GeneratedValue
private int id;
private int number; //a one-up sequence used to order a route
@Column(name="TO", length=16)
private String to;
@Column(name="FM", length=16)
private String from;
public int getId() { return id; }
public int getNumber() { return number; }
public Segment setNumber(int number) {
this.number = number;
return this;
}
public String getTo() { return to; }
public Segment setTo(String to) {
this.to = to;
return this;
}
public String getFrom() { return from; }
public Segment setFrom(String from) {
this.from = from;
return this;
}
/*
@Override
public int compareTo(Segment rhs) {
if (this == rhs) { return 0; }
int result = number - rhs.number;
log.debug(getClass().getSimpleName() + toString() +
".compareTo" + rhs.toString() +
"=" + result
);
return result;
}
*/
@Override
public String toString() {
return "(id=" + id + ",number=" + number + ")";
}
}
Put the following entity class in your src/main tree. This class will be the owning side of a one-to-many relationship of what should be ordered children entities. It is currently incomplete and we will modify it in the following steps.
package myorg.relex.collection;
import java.util.Collection;
import java.util.Collections;
import java.util.Comparator;
import java.util.LinkedList;
import java.util.List;
import javax.persistence.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* This entity class provides an example of an ordered list of child entities ordered by a business property
* in the child entity.
*/
@Entity
@Table(name="RELATIONEX_PATH")
public class Path {
private static final Logger log = LoggerFactory.getLogger(Path.class);
@Id @GeneratedValue
private int id;
@OneToMany(cascade=CascadeType.ALL, fetch=FetchType.EAGER)
@JoinColumn
// @OrderBy("number ASC")
private List<Segment> segments;
@Column(length=16)
private String name;
public int getId() { return id; }
public List<Segment> getSegments() {
if (segments==null) { segments = new LinkedList<Segment>(); }
return segments;
}
//private class SegmentComparator implements Comparator<Segment>
public Path addSegment(Segment segment) {
getSegments().add(segment);
/*
Collections.sort(segments, new Comparator<Segment>() {
@Override
public int compare(Segment lhs, Segment rhs) {
if (lhs == rhs || lhs==null && rhs == null) { return 0; }
if (lhs != null && rhs == null) { return 1; }
if (lhs == null && rhs != null) { return -1; }
int result = lhs.getNumber() - rhs.getNumber();
log.debug(lhs.getClass().getSimpleName() + lhs.toString() +
".compareTo" + rhs.toString() +
"=" + result
);
return result;
}});
*/
// Collections.sort(segments);
return this;
}
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
}
Add the two entity classes to the persistence unit.
<class>myorg.relex.collection.Path</class>
<class>myorg.relex.collection.Segment</class>
Add the following test method to your existing JUnit test case. This test will verify we can add several elements to a list within the parent entity and have the list maintain the entry order.
@Test
public void testOrderBy() {
log.info("*** testOrderBy ***");
Segment s1 = new Segment().setNumber(1).setFrom("A").setTo("B");
Segment s2 = new Segment().setNumber(2).setFrom("B").setTo("C");
Segment s3 = new Segment().setNumber(3).setFrom("C").setTo("D");
Path path = new Path();
path.addSegment(s2).addSegment(s3).addSegment(s1);
log.debug("path.segments=" + path.getSegments());
Iterator<Segment> itr = path.getSegments().iterator();
assertEquals(2, itr.next().getNumber());
assertEquals(3, itr.next().getNumber());
assertEquals(1, itr.next().getNumber());
}
Build the module and run the new unit test. Notice the list maintained the order we entered the elements even though it may not be the way we wish to have them ordered later.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testOrderBy
...
-*** testOrderBy ***
-path.segments=[(id=0,number=2), (id=0,number=3), (id=0,number=1)]
...
[INFO] BUILD SUCCESS
Add the following to the test method. This will store the entities and get a fresh instance from the database.
log.debug("getting new path instance from database");
em.persist(path);
em.flush(); em.clear();
Path path2 = em.find(Path.class, path.getId());
itr = path2.getSegments().iterator();
log.debug("path2.segments=" + path2.getSegments());
Re-build the module and re-run the test method. Notice how the provider queried for the child entities without any regard for order. They come out in a random order (even though it appears somewhat predictable here).
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testOrderBy
...
-path.segments=[(id=0,number=2), (id=0,number=3), (id=0,number=1)]
-getting new path instance from database
...
Hibernate: select path0_.id as id30_1_, path0_.name as name30_1_, segments1_.segments_id as segments5_30_3_, segments1_.id as id3_, segments1_.id as id31_0_, segments1_.FM as FM31_0_, segments1_.number as number31_0_, segments1_.TO as TO31_0_ from RELATIONEX_PATH path0_ left outer join RELATIONEX_SEGMENT segments1_ on path0_.id=segments1_.segments_id where path0_.id=?
-path2.segments=[(id=1,number=2), (id=2,number=3), (id=3,number=1)]
...
[INFO] BUILD SUCCESS
Let's add a requirement that the Segments be ordered by a business property: number. We can make this happen by adding the following metadata to the list within the parent entity. Add an @OrderBy("number ASC") to the collection. Note that we can order in ASCending or DESCending order. The default is ASC.
@OneToMany(cascade=CascadeType.ALL, fetch=FetchType.EAGER)
@JoinColumn
@OrderBy("number ASC")
private List<Segment> segments;
Add the following assertions that verify the returned collection has a list of elements ordered by the specified business property.
log.debug("path2.segments=" + path2.getSegments());
...
assertEquals(1, itr.next().getNumber());
assertEquals(2, itr.next().getNumber());
assertEquals(3, itr.next().getNumber());
Rebuild the module and re-run the unit test. Notice the provider has added an "order by" to the SQL query and our list comes back in a stable, predictable order by number.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testOrderBy
...
Hibernate: select path0_.id as id30_1_, path0_.name as name30_1_, segments1_.segments_id as segments5_30_3_, segments1_.id as id3_, segments1_.id as id31_0_, segments1_.FM as FM31_0_, segments1_.number as number31_0_, segments1_.TO as TO31_0_ from RELATIONEX_PATH path0_ left outer join RELATIONEX_SEGMENT segments1_ on path0_.id=segments1_.segments_id where path0_.id=? order by segments1_.number asc
-path2.segments=[(id=3,number=1), (id=1,number=2), (id=2,number=3)]
...
[INFO] BUILD SUCCESS
That's great that we can get the list ordered the way we want when pulled from the database. However, what if we wanted things ordered without a round-trip to the database? Java Lists are meant to be sorted, so let's add a few extra steps to this section.
Change the behavior of adding an element to the collection. Add a sort and a Comparator so adding a new entry causes the list to be re-sorted according to the business property.
public Path addSegment(Segment segment) {
    getSegments().add(segment);
    Collections.sort(segments, new Comparator<Segment>() {
        @Override
        public int compare(Segment lhs, Segment rhs) {
            if (lhs == rhs || lhs==null && rhs == null) { return 0; }
            if (lhs != null && rhs == null) { return 1; }
            if (lhs == null && rhs != null) { return -1; }
            int result = lhs.getNumber() - rhs.getNumber();
            log.debug(lhs.getClass().getSimpleName() + lhs.toString() +
                ".compareTo" + rhs.toString() +
                "=" + result
                );
            return result;
        }});
    return this;
}
Change the order of the assert statements in the first section of the unit test. We should now have a list that is sorted by business property before being stored in the database.
log.debug("path.segments=" + path.getSegments());
Iterator<Segment> itr = path.getSegments().iterator();
assertEquals(1, itr.next().getNumber());
assertEquals(2, itr.next().getNumber());
assertEquals(3, itr.next().getNumber());
Rebuild the module, re-run the unit test, and notice how the elements of the list are ordered prior to being pulled back. We didn't save the provider or database any work -- but we did make our abstraction of the list more consistent.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testOrderBy
...
-*** testOrderBy ***
-Segment(id=0,number=3).compareTo(id=0,number=2)=1
-Segment(id=0,number=3).compareTo(id=0,number=2)=1
-Segment(id=0,number=1).compareTo(id=0,number=3)=-2
-Segment(id=0,number=1).compareTo(id=0,number=3)=-2
-Segment(id=0,number=1).compareTo(id=0,number=2)=-1
-path.segments=[(id=0,number=1), (id=0,number=2), (id=0,number=3)]
...
[INFO] BUILD SUCCESS
Add the following to your child entity class. We are going to make the entities able to order themselves. This works fine if there is a standard, comparable property (or set of properties) and it relieves the parent classes of having to define sort information.
public class Segment implements Comparable<Segment>{
...
...
@Override
public int compareTo(Segment rhs) {
if (this == rhs) { return 0; }
int result = number - rhs.number;
log.debug(getClass().getSimpleName() + toString() +
".compareTo" + rhs.toString() +
"=" + result
);
return result;
}
Change the implementation of the parent to the following. All we did was move the burden of the compare from the container to the elements within the container.
public Path addSegment(Segment segment) {
getSegments().add(segment);
Collections.sort(segments);
return this;
}
Rebuild the module and re-run the test method. Observe the ordering is still preserved with the new approach.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testOrderBy
...
-*** testOrderBy ***
-Segment(id=0,number=3).compareTo(id=0,number=2)=1
-Segment(id=0,number=3).compareTo(id=0,number=2)=1
-Segment(id=0,number=1).compareTo(id=0,number=3)=-2
-Segment(id=0,number=1).compareTo(id=0,number=3)=-2
-Segment(id=0,number=1).compareTo(id=0,number=2)=-1
-path.segments=[(id=0,number=1), (id=0,number=2), (id=0,number=3)]
You have finished looking at ordered collections. You saw how a list could be ordered by the provider when queried by the database. You also saw how implementing the Java Comparator or Comparable interface could provide sorting outside of the scope of the database query.
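As a side note, on JDK 8 and later the anonymous Comparator used earlier can be reduced to a key-extractor lambda. A minimal, self-contained sketch (the Segment stub here is illustrative, not the lab's entity class):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class SortByNumber {
    static class Segment {
        final int number;
        Segment(int number) { this.number = number; }
        int getNumber() { return number; }
    }

    public static void main(String[] args) {
        List<Segment> segments = new ArrayList<>();
        segments.add(new Segment(2));
        segments.add(new Segment(3));
        segments.add(new Segment(1));
        // sort by the business property without hand-writing a compare() body
        segments.sort(Comparator.comparingInt(Segment::getNumber));
        for (Segment s : segments) {
            System.out.println(s.getNumber()); // prints 1, 2, 3
        }
    }
}
```

Comparator.comparingInt also avoids the integer-overflow risk of the subtraction-based compare when the property values can span the full int range.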
In this section we will demonstrate the use of different collection interfaces that can be used with JPA.
In this section we will demonstrate the mapping of a collection using a java.util.Map interface. This technique is useful for un-ordered collections with elements that are normally accessed by an entity property from the parent/one side of the relationship. Note that access to any entity by any property is always available through the EntityManager and JPAQL.
Add the following class to your src/main tree. This entity class will be referenced via a Map from its parent in a one-to-many, uni-directional relationship.
package myorg.relex.collection;
import javax.persistence.*;
/**
* This class is an example of an entity that will be referenced from the parent in its relationship
* through a Map which uses a value unique to that parent.
*/
@Entity
@Table(name="RELATIONEX_POSITION")
public class Position {
@Id @GeneratedValue
private int id;
@Column(length=12, nullable=false)
private String position; //this is not unique within this table
@Column(length=32, nullable=false, unique=true)
private String player; //this is unique within the table
protected Position() {}
public Position(String position, String player) {
this.position = position;
this.player = player;
}
public int getId() { return id; }
public String getPosition() { return position; }
public void setPosition(String position) { this.position = position; }
public String getPlayer() { return player; }
public void setPlayer(String player) {
this.player = player;
}
@Override
public int hashCode() {
return (position==null?0:position.hashCode()) + (player==null?0:player.hashCode());
}
@Override
public boolean equals(Object obj) {
try {
if (this == obj) { return true; }
Position rhs = (Position) obj;
if (position==null || player==null) { return false; }
return position.equals(rhs.position) && player.equals(rhs.player);
} catch (Exception ex) { return false; }
}
}
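One caution when summing two ternary expressions in a hashCode: the `+` operator binds tighter than `?:` in Java, so each null guard must be parenthesized or one operand's hash is silently dropped. A standalone demonstration (illustrative strings only):

```java
public class TernaryPrecedence {
    public static void main(String[] args) {
        String position = "1st", player = "who";
        // without parentheses this parses as
        // position==null ? 0 : (((position.hashCode() + player) == null) ? 0 : player.hashCode())
        // (int + String is string concatenation, which is never null)
        int unparenthesized = position==null?0:position.hashCode() + player==null?0:player.hashCode();
        // with parentheses each operand is independently guarded and both hashes contribute
        int parenthesized = (position==null?0:position.hashCode()) + (player==null?0:player.hashCode());
        System.out.println(unparenthesized == player.hashCode());                     // true: position's hash is lost
        System.out.println(parenthesized == position.hashCode() + player.hashCode()); // true
    }
}
```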
Put the following class in your src/main tree. This entity class implements the one-to-many, uni-directional relationship as a Map. Since the parent uses a Map keyed by an entity property, there can only be a relationship to children from a common parent where the children have different property values. Since the property being used is not unique within the child table, not all children will be allowed to be associated with this parent entity at the same time.
package myorg.relex.collection;
import java.util.HashMap;
import java.util.Map;
import javax.persistence.*;
/**
* This class provides an example of a parent that uses a Map to reference child members.
*/
@Entity
@Table(name="RELATIONEX_LINEUP")
public class Lineup {
@Id @GeneratedValue
private int id;
@OneToMany
@MapKey(name="position")
@JoinColumn(name="LINEUP_ID")
private Map<String, Position> positions;
@Column(length=10)
private String team;
public int getId() { return id; }
public Map<String, Position> getPositions() {
if (positions==null) { positions = new HashMap<String, Position>(); }
return positions;
}
public Lineup addPosition(Position position) {
if (position==null) { return this; }
getPositions().put(position.getPosition(), position);
return this;
}
public String getTeam() { return team; }
public void setTeam(String team) {
this.team = team;
}
}
Add the entity classes to the persistence unit.
<class>myorg.relex.collection.Position</class>
<class>myorg.relex.collection.Lineup</class>
Add the following test method to your existing JUnit test case.
@Test
public void testMap() {
log.info("*** testMap ***");
Position players[] = new Position[] {
new Position("1st", "who"),
new Position("2nd", "what"),
new Position("3rd", "idontknow"),
new Position("1st", "whom"),
new Position("1st", "whoever")
};
log.debug("persisting players");
for (Position p: players) {
em.persist(p);
}
Lineup lineup = new Lineup();
lineup.setTeam("today");
lineup.addPosition(players[0]);
lineup.addPosition(players[1]);
lineup.addPosition(players[2]);
log.debug("persisting lineup");
em.persist(lineup);
}
Build the module and run the new unit test method. Nothing significant at this point to observe other than to possibly notice the foreign key assignments with some of the child table rows.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testMap
...
-*** testMap ***
-persisting players
...
[INFO] BUILD SUCCESS
Add the following lines to the test method. This will verify the expected entities from the child table are associated with the parent and accessible through the map using a property of the child entity.
log.debug("getting new lineup instance");
em.flush(); em.clear();
Lineup lineup2 = em.find(Lineup.class, lineup.getId());
assertEquals("unexpected size", lineup.getPositions().size(), lineup2.getPositions().size());
for (int i=0; i<lineup.getPositions().size(); i++) {
assertNotNull(players[i].getPlayer() + " not found", lineup2.getPositions().get(players[i].getPosition()));
}
Rebuild the module and re-run the test method. Nothing unique occurs with the database, but observe the asserts within the test case should be passing.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testMap
...
-getting new lineup instance
...
[INFO] BUILD SUCCESS
Add the following lines to your unit test. This will attempt to replace an entry in the Map with a new entry. This should remove the association with the former entity and form a relationship with the new entity since they share the same entity property.
log.debug("adding new player for position");
lineup2.addPosition(players[3]);
assertEquals("number of positions changed", lineup.getPositions().size(), lineup2.getPositions().size());
em.flush();
Rebuild the module and re-run the test method. Notice the foreign key being set to null for the former entity and the foreign key being set for the new entity being added to the Map.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testMap
...
-adding new player for position
Hibernate: update RELATIONEX_POSITION set LINEUP_ID=null where LINEUP_ID=? and id=?
Hibernate: update RELATIONEX_POSITION set LINEUP_ID=? where id=?
...
[INFO] BUILD SUCCESS
Add the following lines to your test method. They will verify the foreign key state of each row in the child table using the known state of the unit test.
log.debug("checking positions");
@SuppressWarnings("unchecked")
List<Object[]> rows = em.createNativeQuery("select ID, LINEUP_ID from RELATIONEX_POSITION").getResultList();
for (Object[] val : rows) {
    int id = (Integer)val[0];
    Integer lineupId = (Integer)val[1];
    if (id==players[1].getId() || id==players[2].getId() || id==players[3].getId()) {
        assertNotNull("unexpected lineupId", lineupId);
    } else {
        assertNull("lineupId was assigned for " + id, lineupId);
    }
}
Rebuild the module and re-run the unit test. The final asserts should pass.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.CollectionTest#testMap
...
-checking positions
Hibernate: select ID, LINEUP_ID from RELATIONEX_POSITION
...
[INFO] BUILD SUCCESS
The following lists the final state of the child table at the completion of the unit test.
SELECT * FROM RELATIONEX_POSITION;
ID  PLAYER     POSITION  LINEUP_ID
1   who        1st       null
2   what       2nd       1
3   idontknow  3rd       1
4   whom       1st       1
5   whoever    1st       null
You have completed a brief look at collection types used by JPA. In this exercise you used a Map which permitted a relationship with child entities that had unique values for the @MapKey. Foreign keys are created when we add a child entity to the Map and removed when we remove or overwrite entries in the Map. Other types of collections supported by JPA include Set, List, and Collection.
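The foreign-key swap we observed mirrors ordinary java.util.Map semantics: put() with an existing key replaces the prior value. A plain-Java sketch of that behavior (illustrative data only, no JPA involved):

```java
import java.util.HashMap;
import java.util.Map;

public class MapReplace {
    public static void main(String[] args) {
        Map<String, String> positions = new HashMap<>();
        positions.put("1st", "who");
        // put() with an existing key replaces the prior value -- analogous to the
        // provider nulling the old row's foreign key and assigning the new one
        String replaced = positions.put("1st", "whom");
        System.out.println(replaced);             // who
        System.out.println(positions.get("1st")); // whom
        System.out.println(positions.size());     // 1
    }
}
```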
In this chapter we took a detour from relationships and took a deeper look at topics specifically related to identity, collection membership, collection ordering, and collection types. During this chapter we examined the significance of hashCode/equals, when overriding them matters, and when it is a non-issue. We showed how to keep our collections ordered at all times when desired. We saw how we could access child objects through a Map interface. In the following chapters we will get back into the grind of going through the other relationship types.
Now that we are done with the tour of one-to-many relationships and collections themselves, we can turn our attention to the many side and the aspects associated with implementing a many-to-one relationship. Our first stab at this will stick to the uni-directional case. Many-to-one, uni-directional relationships are especially appropriate when the many side is huge and is best obtained through a query rather than through a collection in the parent entity. However, much of what we learn here will apply to the one-to-many/many-to-one bi-directional case.
Create a JUnit test class to host tests for the many-to-one mappings.
Put the following JUnit test class in your src/test tree. You can delete the sample test method once we add our first real test. JUnit will fail a test case if it cannot locate a @Test to run.
package myorg.relex;
import static org.junit.Assert.*;
import javax.persistence.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.junit.*;
public class Many2OneTest extends JPATestBase {
private static Logger log = LoggerFactory.getLogger(Many2OneTest.class);
@Test
public void testSample() {
log.info("testSample");
}
}
Verify the new JUnit test class builds and executes to completion.
relationEx]$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2OneTest
...
-HHH000401: using driver [org.h2.Driver] at URL [jdbc:h2:tcp://localhost:9092/./h2db/ejava]
...
[INFO] BUILD SUCCESS
Many-to-one relationships map easily to the database because that is exactly how relationships are commonly formed in the database -- a foreign key from the many/child table to the one/parent table. We will start simple and then get into some interesting cases.
In this section you will create a many-to-one, uni-directional relationship where the child entity references the parent entity using a separate foreign key.
Put the following class in your src/main tree. It provides an example of the inverse side of a many-to-one, uni-directional relationship -- which means it will have no reference to the relationship.
package myorg.relex.many2one;
import javax.persistence.*;
/**
* This class provides an example one/parent entity in a many-to-one, uni-directional relationship.
* For that reason -- this class will not have any reference to the many entities that may possibly
* reference it. These many/child objects must be obtained through the entity manager using a find or query.
*/
@Entity
@Table(name="RELATIONEX_STATE")
public class State {
@Id @Column(length=2)
private String id;
@Column(length=20, nullable=false)
private String name;
protected State() {}
public State(String id, String name) {
this.id = id;
this.name = name;
}
public String getId() { return id; }
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
}
Put the following class in your src/main tree. This entity class provides an example of the owning side of a many-to-one, uni-directional relationship materialized through a foreign key column in the entity class table. It currently relies on defaults that we will experiment with and change.
package myorg.relex.many2one;
import javax.persistence.*;
/**
* This class provides an example of the owning side of a many-to-one, uni-directional relationship
* that is realized through a foreign key from the child to the parent entity.
*/
@Entity
@Table(name="RELATIONEX_STATERES")
public class StateResident {
@Id @GeneratedValue
private int id;
@ManyToOne(
// optional=false,
// fetch=FetchType.EAGER
)
// @JoinColumn(
// name="STATE_ID",
// nullable=false
// )
private State state;
@Column(length=32)
private String name;
protected StateResident() {}
public StateResident(State state) {
this.state = state;
}
public int getId() { return id; }
public State getState() { return state; }
public void setState(State state) {
this.state = state;
}
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
}
Add the two entity classes to your persistence unit.
<class>myorg.relex.many2one.State</class>
<class>myorg.relex.many2one.StateResident</class>
Generate the database schema for the two entity classes and the relationship. Notice the provider automatically generated a foreign key column and named it after the entity table the column references.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
create table RELATIONEX_STATE (
    id varchar(12) not null,
    name varchar(20) not null,
    primary key (id)
);
create table RELATIONEX_STATERES (
    id integer generated by default as identity,
    name varchar(32),
    state_id varchar(12),    <!------- GENERATED FK Column
    primary key (id)
);
...
alter table RELATIONEX_STATERES
    add constraint FK88A9D0FF4006DFB7
    foreign key (state_id)
    references RELATIONEX_STATE;
Change the child entity to make the use of a foreign key column explicit -- with a specific name and nullable constraint.
@ManyToOne
@JoinColumn(
name="STATE_ID",
nullable=false
)
private State state;
Regenerate the database schema and note the changes made to the schema using the more explicit declaration of the foreign key column. It now
uses a specified column name
is defined to be non-nullable
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
create table RELATIONEX_STATERES (
    id integer generated by default as identity,
    name varchar(32),
    STATE_ID varchar(12) not null,
    primary key (id)
);
...
alter table RELATIONEX_STATERES
    add constraint FK88A9D0FF4006DFB7
    foreign key (STATE_ID)
    references RELATIONEX_STATE;
...
Note there are two ways to get the provider to recognize whether the foreign key column is required. We demonstrated the table/column-centric approach above. The following uses a relationship-centric approach. You can optionally change your relationship definition to the following to show that the foreign key column is defined to be non-null in both cases.
@ManyToOne(
optional=false)
@JoinColumn(
name="STATE_ID"//,
// nullable=false
)
private State state;
create table RELATIONEX_STATERES (
    id integer generated by default as identity,
    name varchar(32),
    STATE_ID varchar(12) not null,
    primary key (id)
);
Create the following test method in your JUnit test case. This test method will create an instance of both the one and many side, relate them, and persist them. The flush() is only there so we can control/see the specific database commands and is not a required call within the body of the transaction. Note we arranged the order of the persists and left off any cascade definitions to make a point; this will initially cause an error.
@Test
public void testManyToOneUniFK() {
log.info("*** testManyToOneUniFK ***");
State state = new State("MD", "Maryland");
StateResident res = new StateResident(state);
res.setName("joe");
log.debug("persisting child");
em.persist(res);
log.debug("persisting parent");
em.persist(state);
em.flush();
}
Build the module and run the new test method. Notice this produces a foreign key constraint error because the child is being persisted to the database prior to the parent being managed.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2OneTest#testManyToOneUniFK ... -*** testManyToOneUniFK *** -persisting child Hibernate: select state_.id, state_.name as name28_ from RELATIONEX_STATE state_ where state_.id=? Hibernate: insert into RELATIONEX_STATERES (id, name, STATE_ID) values (null, ?, ?) ... Tests in error: testManyToOneUniFK(myorg.relex.Many2OneTest): org.hibernate.exception.ConstraintViolationException: NULL not allowed for column "STATE_ID"; SQL statement:(..) ... [INFO] BUILD FAILURE
We could attempt to fix this with a cascade.PERSIST from the child to the parent, but that just seems wrong. The parent should exist prior to assigning children, and there may be additional business rules involved in creating the parent that are outside the scope of creating the child and the child/parent relationship.
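For contrast, a cascade-based version of the mapping -- the approach just advised against -- would look something like the following sketch. The cascade attribute is standard JPA, but treat this as illustrative only, not as a change to make in the exercise.

```java
// Illustrative sketch only -- NOT the mapping used in this exercise.
// With PERSIST cascaded from child to parent, persisting the child would
// also persist its (still transient) parent State.
@ManyToOne(
    optional=false,
    cascade=CascadeType.PERSIST)
@JoinColumn(name="STATE_ID")
private State state;
```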
Re-order the creates to better simulate the parent being created prior to adding the child entities.
log.debug("persisting parent");
em.persist(state);
log.debug("persisting child");
em.persist(res);
em.flush();
Rebuild the module and re-run the test method. With the re-ordered persist() calls the provider is able to successfully store our parent, child, and relationship.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2OneTest#testManyToOneUniFK ... -persisting parent -persisting child Hibernate: insert into RELATIONEX_STATE (name, id) values (?, ?) Hibernate: insert into RELATIONEX_STATERES (id, name, STATE_ID) values (null, ?, ?) ... [INFO] BUILD SUCCESS
Add the following lines to your test method. This will verify we can obtain the child and its associated parent.
log.debug("getting new instances");
em.clear();
StateResident res2 = em.find(StateResident.class, res.getId());
log.debug("checking child");
assertEquals("unexpected child data", res.getName(), res2.getName());
log.debug("checking parent");
assertEquals("unexpected parent data", state.getName(), res2.getState().getName());
Rebuild the module and re-run the test method with the extra calls to retrieve a fresh instance. Notice, under these conditions, the provider chose to use an EAGER fetch during the find() and there was no LAZY load during the calls to the child and parent.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2OneTest#testManyToOneUniFK ... -getting new instances Hibernate: select stateresid0_.id as id29_1_, stateresid0_.name as name29_1_, stateresid0_.STATE_ID as STATE3_29_1_, state1_.id as id28_0_, state1_.name as name28_0_ from RELATIONEX_STATERES stateresid0_ inner join <!=== EAGER fetch of parent RELATIONEX_STATE state1_ on stateresid0_.STATE_ID=state1_.id where stateresid0_.id=? -checking child -checking parent ... [INFO] BUILD SUCCESS
Optionally change the fetch type for the parent to a LAZY fetch. Although the provider is permitted to ignore requests for LAZY fetch (but must honor requests for EAGER), we do get a shallower query during the find() for the child and then a second query for the parent when it is accessed.
@ManyToOne(
    optional=false,
    fetch=FetchType.LAZY)
@JoinColumn(name="STATE_ID")
private State state;
-getting new instances Hibernate: select stateresid0_.id as id29_0_, stateresid0_.name as name29_0_, stateresid0_.STATE_ID as STATE3_29_0_ from RELATIONEX_STATERES stateresid0_ where stateresid0_.id=? -checking child -checking parent Hibernate: select state0_.id as id28_0_, state0_.name as name28_0_ from RELATIONEX_STATE state0_ where state0_.id=? ... [INFO] BUILD SUCCESS
Add another child object for the same parent.
log.debug("add more residents");
StateResident resB = new StateResident(res2.getState());
em.persist(resB);
em.flush();
Rebuild and re-run the test method. Observe the provider issues a database insert for only the new child entity with its foreign key already set to the existing parent.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2OneTest#testManyToOneUniFK ... -add more residents Hibernate: insert into RELATIONEX_STATERES (id, name, STATE_ID) values (null, ?, ?) ... [INFO] BUILD SUCCESS
Add the following lines to your unit test. This will clear the cache of all instances and query for new instances. Note that we need to make use of the query since this is a uni-directional relationship and the parent has no knowledge of the relationship.
log.debug("getting new instances of residences");
em.clear();
List<StateResident> residents = em.createQuery(
    "select r from StateResident r " +
    "where r.state.id=:stateId", StateResident.class)
    .setParameter("stateId", res.getState().getId())
    .getResultList();
assertEquals("unexpected number of residents", 2, residents.size());
Rebuild the module and re-run the test method. Notice since we are using a LAZY fetch, only the child entities are returned from the database.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2OneTest#testManyToOneUniFK ... -getting new instances of residences Hibernate: select stateresid0_.id as id29_, stateresid0_.name as name29_, stateresid0_.STATE_ID as STATE3_29_ from RELATIONEX_STATERES stateresid0_ where stateresid0_.STATE_ID=? ... [INFO] BUILD SUCCESS
If you change the relationship back to an EAGER fetch, you will notice the provider still initially queries for the child entities and then issues a second query for parents matching specific values.
@ManyToOne(
    optional=false,
    fetch=FetchType.EAGER)
@JoinColumn(name="STATE_ID")
private State state;
-getting new instances of residences Hibernate: select stateresid0_.id as id29_, stateresid0_.name as name29_, stateresid0_.STATE_ID as STATE3_29_ from RELATIONEX_STATERES stateresid0_ where stateresid0_.STATE_ID=? Hibernate: select state0_.id as id28_0_, state0_.name as name28_0_ from RELATIONEX_STATE state0_ where state0_.id=? ... [INFO] BUILD SUCCESS
Add the following lines to your test method. This will verify that all managed children (the many) reference a common parent (the one) instance. We are implementing the check in terms of a change to one parent and attempting to observe that change in another child.
log.debug("changing state/data of common parent");
residents.get(0).getState().setName("Home State");
assertEquals("unexpected difference in parent data",
residents.get(0).getState().getName(),
residents.get(1).getState().getName());
Rebuild the module and re-run the test method. Observe our assertion passes and we have a single update to the database.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2OneTest#testManyToOneUniFK ... -changing state/data of common parent Hibernate: update RELATIONEX_STATE set name=? where id=? ... [INFO] BUILD SUCCESS
You have finished going through a many-to-one, uni-directional relationship mapped using a single foreign key column in the child entity table. The parent maintains no collection or reference to the children. All references are maintained by the child entity, and children are obtained through a find() or a query. This is quite appropriate when the many side can be quite large and is best accessed through queries, where paging or other filtering can be leveraged to reduce the total count returned.
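A hedged sketch of the paging pattern mentioned above, assuming the StateResident entity and EntityManager (em) from this exercise; the page size, ordering, and stateId value are illustrative choices:

```java
// Page through the residents of a state instead of loading them all at once.
// setFirstResult()/setMaxResults() are the standard JPA paging controls.
List<StateResident> page = em.createQuery(
        "select r from StateResident r " +
        "where r.state.id=:stateId " +
        "order by r.id", StateResident.class)
    .setParameter("stateId", "MD")
    .setFirstResult(0)    // offset of the first row in this page
    .setMaxResults(10)    // maximum rows per page
    .getResultList();
```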
In the next section we will complicate the situation slightly by using a parent entity with a compound primary key -- which ripples over to the foreign key used by the child.
In this section we will introduce a new complication to the foreign key join by making the primary key of the parent entity a compound primary key (i.e., multiple columns). This will cause the related child entity to map more than one column as its foreign key to the parent.
In this example we will use an embeddable primary key class as an @EmbeddedId in the parent entity.
Place the following embeddable primary key class in your src/main tree. Notice that it models two primary key values; number and street. It currently defines no specific mappings for the two properties, but we will experiment with changes later.
package myorg.relex.many2one;
import java.io.Serializable;
import javax.persistence.*;
/**
* This class provides an example compound primary key value that will be used in a many-to-one,
* uni-directional relationship.
*/
@Embeddable
public class HousePK implements Serializable {
private static final long serialVersionUID = 5213787609029123676L;
// @Column(name="NO")
private int number;
// @Column(name="STR", length=50)
private String street;
public HousePK() {}
public HousePK(int number, String street) {
this.number = number;
this.street = street;
}
public int getNumber() { return number; }
public void setNumber(int number) {
this.number = number;
}
public String getStreet() { return street; }
public void setStreet(String street) {
this.street = street;
}
@Override
public int hashCode() {
return number + (street==null?0:street.hashCode());
}
@Override
public boolean equals(Object obj) {
try {
if (this==obj) { return true; }
HousePK rhs = (HousePK)obj;
if (street==null && rhs.street != null) { return false; }
return number==rhs.number && street.equals(rhs.street);
} catch (Exception ex) { return false; }
}
}
Note, as a primary key class, it ...
implements Serializable
implements a default constructor
overrides the default hashCode() based on the properties
overrides the default equals() based on the properties
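The points above can be sketched with a minimal stand-in class (not the exercise's HousePK): the provider -- like any Set or Map -- must treat two key instances built from the same values as the same key, which is exactly what value-based equals()/hashCode() provide.

```java
import java.util.HashSet;
import java.util.Set;

/**
 * Minimal sketch of why value-based equals()/hashCode() matter for a
 * primary key class: two key instances built from the same values must
 * behave as the same key.
 */
public class PkEqualityDemo {
    static final class Pk {
        final int number;
        final String street;
        Pk(int number, String street) { this.number = number; this.street = street; }
        @Override
        public int hashCode() {
            // parentheses matter here; without them "number + street" would
            // be String concatenation rather than an int sum
            return number + (street == null ? 0 : street.hashCode());
        }
        @Override
        public boolean equals(Object obj) {
            if (this == obj) { return true; }
            if (!(obj instanceof Pk)) { return false; }
            Pk rhs = (Pk) obj;
            return number == rhs.number &&
                   (street == null ? rhs.street == null : street.equals(rhs.street));
        }
    }

    public static void main(String[] args) {
        Set<Pk> keys = new HashSet<>();
        keys.add(new Pk(1600, "PA Ave"));
        keys.add(new Pk(1600, "PA Ave")); // value-equal duplicate is not added
        System.out.println(keys.size());  // prints 1
    }
}
```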
Put the following parent entity class in your src/main tree. This class provides an example of the parent/one/inverse side of a many-to-one, uni-directional relationship. In that role, it has no reference to the child entity. However, since this is a compound primary/foreign key example, it defines its primary key in terms of the primary key class you added above. There were two choices in implementing the compound primary key: @IdClass or @EmbeddedId. With the @IdClass approach -- the entity would have modeled the number and street properties as properties of this class. With the @EmbeddedId approach, the entity models the properties as an opaque set encapsulated by the primary key class.
package myorg.relex.many2one;
import javax.persistence.*;
/**
* This class provides an example of a parent/inverse side of a many-to-one, uni-directional relationship where
* the parent and foreign key must use a compound value.
*/
@Entity
@Table(name="RELATIONEX_HOUSE")
public class House {
@EmbeddedId
// @AttributeOverrides({
// @AttributeOverride(name="street", column=@Column(name="STREET_NAME", length=20)),
// })
private HousePK id;
@Column(length=16, nullable=false)
private String name;
protected House() {}
public House(HousePK id, String name) {
this.id = id;
this.name = name;
}
public HousePK getId() { return id; }
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
}
Add the following child entity class to your src/main tree. This class is the owning side of the many-to-one relation and therefore has a reference to the one/parent side and a mapping of the relationship to the database. We currently have the class implemented to accept all defaults and will soon be making changes after we see what default schema is applied.
package myorg.relex.many2one;
import javax.persistence.*;
/**
* This class provides an example of the owning/child side of a many-to-one, uni-directional relationship
* where the parent uses a (embedded) compound primary key.
*/
@Entity
@Table(name="RELATIONEX_OCCUPANT")
public class Occupant {
@Id @GeneratedValue
private int id;
@ManyToOne(optional=false)
// @JoinColumns({
// @JoinColumn(name="RES_NUM", referencedColumnName="NUMBER"),
// @JoinColumn(name="RES_STR", referencedColumnName="STREET_NAME")
// })
private House residence;
@Column(length=16, nullable=false)
private String name;
protected Occupant(){}
public Occupant(String name, House residence) {
this.name = name;
this.residence = residence;
}
public int getId() { return id; }
public House getResidence() { return residence; }
public void setResidence(House residence) {
this.residence = residence;
}
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
}
Add the new entity classes to the persistence unit. Do *not* list the primary key class.
<class>myorg.relex.many2one.House</class>
<class>myorg.relex.many2one.Occupant</class>
Generate schema for the model and observe the schema generated for the new entities. Notice how the default mapping properties of the primary key class were used within the parent entity table, and how the child table's foreign key column names were derived from the parent reference variable name and parent column name (i.e., (parentVarName)_(PARENT_COLUMNNAME)).
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_HOUSE ( number integer not null, street varchar(255) not null, name varchar(16) not null, primary key (number, street) ); ... create table RELATIONEX_OCCUPANT ( id integer generated by default as identity, name varchar(16) not null, residence_number integer not null, residence_street varchar(255) not null, primary key (id) ); ... alter table RELATIONEX_OCCUPANT add constraint FK6957B84D35A694BB foreign key (residence_number, residence_street) references RELATIONEX_HOUSE;
Make the following annotation change to the primary key class and observe how that impacts the schema generated.
@Column(name="NO")
private int number;
@Column(name="STR", length=50)
private String street;
The parent entity table inherited the column definitions from the primary key class. Since the parent column names changed, the default child entity table foreign key column names changed to match.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_HOUSE ( NO integer not null, STR varchar(50) not null, name varchar(16) not null, primary key (NO, STR) ); ... create table RELATIONEX_OCCUPANT ( id integer generated by default as identity, name varchar(16) not null, residence_NO integer not null, residence_STR varchar(50) not null, primary key (id) ); ... alter table RELATIONEX_OCCUPANT add constraint FK6957B84D81D611CF foreign key (residence_NO, residence_STR) references RELATIONEX_HOUSE;
Create a mapping override in the parent entity so one of the columns is defined by an entity override and the remaining property is still defined by the primary key class. You do not need to make any changes to the primary key class here.
public class House {
@EmbeddedId
@AttributeOverrides({
@AttributeOverride(name="street", column=@Column(name="STREET_NAME", length=20)),
})
private HousePK id;
Notice how the parent entity table primary key column definition is a blend of specifications from the primary key class and overrides from the parent entity class.
create table RELATIONEX_HOUSE ( NO integer not null, STREET_NAME varchar(20) not null, name varchar(16) not null, primary key (NO, STREET_NAME) ); ... create table RELATIONEX_OCCUPANT ( id integer generated by default as identity, name varchar(16) not null, residence_NO integer not null, residence_STREET_NAME varchar(20) not null, primary key (id) ); ... alter table RELATIONEX_OCCUPANT add constraint FK6957B84D8AD189E5 foreign key (residence_NO, residence_STREET_NAME)
Notice how the child entity table foreign key columns, by default, change to match the parent primary key column names.
Add column specifications for the child entity table. Notice that we define a @JoinColumn for each foreign key mapping we wish to override, and we must wrap multiple @JoinColumn annotations within a @JoinColumns array.
Note too that the referencedColumnName must match what the parent entity table is currently using. That means if you change values or overrides of the parent primary key columns *AND* you are overriding the default foreign key column properties -- the two must be kept in sync.
public class Occupant {
...
@ManyToOne(optional=false)
@JoinColumns({
@JoinColumn(name="RES_NUM", referencedColumnName="NO"),
@JoinColumn(name="RES_STR", referencedColumnName="STREET_NAME")
})
private House residence;
create table RELATIONEX_HOUSE ( NO integer not null, STREET_NAME varchar(20) not null, name varchar(16) not null, primary key (NO, STREET_NAME) ); ... create table RELATIONEX_OCCUPANT ( id integer generated by default as identity, name varchar(16) not null, RES_NUM integer not null, RES_STR varchar(20) not null, primary key (id) ); ... alter table RELATIONEX_OCCUPANT add constraint FK6957B84D739A6436 foreign key (RES_NUM, RES_STR) references RELATIONEX_HOUSE;
Add the following test method to your JUnit test case. This test will create an instance of both the parent and child, relate them, and persist them. Learning from the lessons of the previous section, we make sure the parent exists prior to persisting the child.
@Test
public void testManyToOneUniCompoundFK() {
log.info("*** testManyToOneUniCompoundFK ***");
House house = new House(new HousePK(1600,"PA Ave"),"White House");
Occupant occupant = new Occupant("bo", house);
log.debug("persisting parent");
em.persist(house);
log.debug("persisting child");
em.persist(occupant);
em.flush();
}
Build the module and run the new test method. Notice the extra columns used to persist the primary key of the parent and the foreign key of the child.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2OneTest#testManyToOneUniCompoundFK ... -*** testManyToOneUniCompoundFK *** -persisting parent -persisting child Hibernate: insert into RELATIONEX_HOUSE (name, NO, STREET_NAME) values (?, ?, ?) Hibernate: insert into RELATIONEX_OCCUPANT (id, name, RES_NUM, RES_STR) values (null, ?, ?, ?) ... [INFO] BUILD SUCCESS
Add the following lines to your test method to verify we can obtain a reference to the child and parent through a find().
log.debug("getting new instances");
em.clear();
Occupant occupant2 = em.find(Occupant.class, occupant.getId());
log.debug("checking child");
assertEquals("unexpected child data", occupant.getName(), occupant2.getName());
log.debug("checking parent");
assertEquals("unexpected parent data", house.getName(), occupant2.getResidence().getName());
Rebuild the module and re-run the test method. As in the previous section, the find, by default, performs an EAGER fetch on the parent. This time, however, the join uses both primary/foreign columns.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2OneTest#testManyToOneUniCompoundFK ... -getting new instances Hibernate: select occupant0_.id as id31_1_, occupant0_.name as name31_1_, occupant0_.RES_NUM as RES3_31_1_, occupant0_.RES_STR as RES4_31_1_, house1_.NO as NO30_0_, house1_.STREET_NAME as STREET2_30_0_, house1_.name as name30_0_ from RELATIONEX_OCCUPANT occupant0_ inner join RELATIONEX_HOUSE house1_ on occupant0_.RES_NUM=house1_.NO and occupant0_.RES_STR=house1_.STREET_NAME where occupant0_.id=? -checking child -checking parent ... [INFO] BUILD SUCCESS
Add the following lines to your test method. This will test adding a second child for a common parent.
log.debug("add more child entities");
Occupant occupantB = new Occupant("miss beazily", occupant2.getResidence());
em.persist(occupantB);
em.flush();
Rebuild the module and re-run the test method. Notice how only the new child needs to be persisted -- along with the foreign key columns to reference the parent.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2OneTest#testManyToOneUniCompoundFK ... -add more child entities Hibernate: insert into RELATIONEX_OCCUPANT (id, name, RES_NUM, RES_STR) values (null, ?, ?, ?) ... [INFO] BUILD SUCCESS
Add the following lines to your test method. This will query for child entities related to our parent. Note this is the only way (other than find()) that we can locate the child entities since no reference to the child entities is maintained in a many-to-one, uni-directional relationship.
log.debug("getting new instances of children");
em.clear();
List<Occupant> occupants = em.createQuery(
"select o from Occupant o " +
"where o.residence.id=:houseId",
Occupant.class)
.setParameter("houseId", occupant.getResidence().getId())
.getResultList();
assertEquals("unexpected number of children", 2, occupants.size());
Take extra note of what we did with the query and query parameter above. We did *not* pass in the individual primary key properties and do a property-by-property match. That would have been tedious and error prone if the primary key was ever refactored. What we did instead was to pass in the opaque primary key object and let the provider, with its mapping knowledge of the primary key class and the parent/child entity classes, form the details of the where clause.
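For contrast, the property-by-property alternative the text calls tedious would look something like the following sketch (assuming the Occupant/House entities above, with JPQL navigating into the embedded id); the literal number and street values are illustrative:

```java
// The tedious alternative: match each primary key property explicitly.
// Refactoring HousePK would break every query written this way.
List<Occupant> occupants = em.createQuery(
        "select o from Occupant o " +
        "where o.residence.id.number=:num " +
        "and o.residence.id.street=:street", Occupant.class)
    .setParameter("num", 1600)
    .setParameter("street", "PA Ave")
    .getResultList();
```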
Rebuild the module and re-run the test method. Notice how the provider broke down the opaque primary key object we passed as a parameter and created the individual column tests in the where.
-getting new instances of children Hibernate: select occupant0_.id as id31_, occupant0_.name as name31_, occupant0_.RES_NUM as RES3_31_, occupant0_.RES_STR as RES4_31_ from RELATIONEX_OCCUPANT occupant0_ where occupant0_.RES_NUM=? <!== provider created breakdown from PK comparison and occupant0_.RES_STR=? to individual column comparisons Hibernate: select house0_.NO as NO30_0_, house0_.STREET_NAME as STREET2_30_0_, house0_.name as name30_0_ from RELATIONEX_HOUSE house0_ where house0_.NO=? and house0_.STREET_NAME=? ... [INFO] BUILD SUCCESS
You have finished taking a look at the impact of a parent entity using a compound primary key when there is a relationship from a child entity. The circumstances around this situation are not all that unlike the one-to-one case, but it is still worth covering specifically here.
In the next section we will shift the emphasis of the compound primary key from the parent to the child where the child identity is derived from a property of the parent.
In this section we are going to revisit a topic that was addressed during the one-to-one relationship coverage. We are going to take a look at a child class that uses a compound primary key with one of the properties of the primary key derived from the foreign key to the parent entity. This means that one of the columns of the child table will be both a primary key value and a foreign key value at the same time.
Add the following parent entity class to your src/main tree. This parent entity class will have its primary key automatically assigned by the container and the foreign key of the children entities will have to reference that column.
package myorg.relex.many2one;
import javax.persistence.*;
/**
* This class is an example of a parent in a many-to-one, uni-directional relation where the
* primary key of the child is derived from the primary key of the parent.
*/
@Entity
@Table(name="RELATIONEX_ITEMTYPE")
public class ItemType {
@Id @GeneratedValue
private int id;
@Column(length=20, nullable=false)
private String name;
protected ItemType() {}
public ItemType(String name) {
this.name = name;
}
public int getId() { return id; }
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
@Override
public String toString() {
return id +":" + name;
}
}
Since the child entities will have two properties -- one of them derived from the parent entity -- we need to model a compound primary key class for the child entity. Place the following primary key class in your src/main tree. It models two properties:
typeId - represents the primary key value of the ItemType entity
number - represents a unique value per ItemType.id given to an Item
package myorg.relex.many2one;
import java.io.Serializable;
import javax.persistence.*;
/**
* This class provides an example primary key class for a child entity that
* derives one of its primary key values from its parent entity in a many-to-one
* relationship.
*/
@SuppressWarnings("serial")
@Embeddable
public class ItemPK implements Serializable {
// @Column(name="TYPE_ID_PK")
private int typeId; //unique value from parent ItemType.id
// @Column(name="NUMBER_PK")
private int number; //unique value assigned to instances of Item
public int getTypeId() { return typeId; }
public ItemPK setTypeId(int typeId) {
this.typeId = typeId;
return this;
}
public int getNumber() { return number; }
public ItemPK setNumber(int number) {
this.number = number;
return this;
}
@Override
public int hashCode() {
return typeId + number;
}
@Override
public boolean equals(Object obj) {
try {
if (this == obj) { return true; }
ItemPK rhs = (ItemPK) obj;
return typeId==rhs.typeId && number==rhs.number;
} catch (Exception ex) { return false; }
}
@Override
public String toString() {
return "(typeId=" + typeId + ",number=" + number + ")";
}
}
Place the following child entity class in your src/main tree. This entity class uses an embedded primary key and a many-to-one relation to the parent entity. As stated, one of the properties (typeId) of the primary key will need to be the same value as the foreign key to the parent entity (itemType.id). The implementation below currently models that as two separate columns, which we will need to correct.
package myorg.relex.many2one;
import java.util.Date;
import javax.persistence.*;
/**
* This class provides an example of a child entity that derives its primary key from
* the parent/one side of a many-to-one relation.
*/
@Entity
@Table(name="RELATIONEX_ITEM")
public class Item {
@EmbeddedId
private ItemPK id;
@ManyToOne(optional=false)
// @MapsId("typeId") //refers to the ItemPK.typeId property
// @JoinColumn(name="TYPE_ID")
private ItemType itemType;
@Temporal(TemporalType.TIMESTAMP)
private Date created;
protected Item() {}
public Item(ItemType itemType, int number) {
this.itemType = itemType;
//typeId in PK auto-mapped to itemType FK
this.id = new ItemPK().setNumber(number);
}
public ItemPK getId() { return id; }
public ItemType getItemType() { return itemType; }
public Date getCreated() { return created; }
public void setCreated(Date created) {
this.created = created;
}
@Override
public String toString() {
return (itemType==null?null:itemType) + "pk=" + id;
}
}
Pay special attention to the fact that the child entity never sets the typeId property of the primary key. This property will be ignored by the provider and the value from the foreign key used instead once we get things mapped correctly.
Add the two entity classes to your persistence unit. Do *not* list the primary key class.
<class>myorg.relex.many2one.Item</class>
<class>myorg.relex.many2one.ItemType</class>
Generate database schema for the new entity classes and their relationship. Notice how there is a typeId column from the compound primary key class property and an itemType_id column from the foreign key. Our current mapping treats the two uses of the same value as two separate columns in the same table. You can tell typeId is part of the primary key from the "primary key (number, typeId)" declaration. You can tell itemType_id is a foreign key from the "foreign key (itemType_id)" constraint declaration.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_ITEM ( number integer not null, typeId integer not null, created timestamp, itemType_id integer not null, primary key (number, typeId) ); create table RELATIONEX_ITEMTYPE ( id integer generated by default as identity, name varchar(20) not null, primary key (id) ); ... alter table RELATIONEX_ITEM add constraint FK355BBDA3C6C591FD foreign key (itemType_id) references RELATIONEX_ITEMTYPE;
Add the @MapsId("typeId") annotation to the @ManyToOne relation. This signals the provider to map the column for the compound primary key "typeId" column to the same column used to map the foreign key for the itemType.
@ManyToOne(optional=false)
@MapsId("typeId") //refers to the ItemPK.typeId property
private ItemType itemType;
Regenerate the schema and notice the two columns have been collapsed into the single foreign key column. itemType_id still exists and still has a foreign key constraint, and it is now also listed in the primary key declaration.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_ITEM ( number integer not null, created timestamp, itemType_id integer, primary key (number, itemType_id) ); create table RELATIONEX_ITEMTYPE ( id integer generated by default as identity, name varchar(20) not null, primary key (id) ); ... alter table RELATIONEX_ITEM add constraint FK355BBDA3C6C591FD foreign key (itemType_id) references RELATIONEX_ITEMTYPE;
Make a small cosmetic change by defining the column name for the foreign key column.
@ManyToOne(optional=false)
@MapsId("typeId") //refers to the ItemPK.typeId property
@JoinColumn(name="TYPE_ID")
private ItemType itemType;
Notice we are renaming the foreign key that the primary key is mapped to and not the primary key.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_ITEM ( number integer not null, TYPE_ID integer, created timestamp, primary key (number, TYPE_ID) ); create table RELATIONEX_ITEMTYPE ( id integer generated by default as identity, name varchar(20) not null, primary key (id) ); ... alter table RELATIONEX_ITEM add constraint FK355BBDA349D11870 foreign key (TYPE_ID) references RELATIONEX_ITEMTYPE;
Further test the assertion above about the primary key being mapped to the foreign key by trying to define the primary key columns for Item. Add the following to the ItemPK class.
public class ItemPK implements Serializable {
@Column(name="TYPE_ID_PK")
private int typeId; //unique value from parent ItemType.id
@Column(name="NUMBER_PK")
private int number; //unique value assigned to instances of Item
Notice only the non-FK annotation column definition was picked up from the primary key class. The other property is defined by the foreign key column definition in the entity class.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_ITEM ( NUMBER_PK integer not null, TYPE_ID integer, created timestamp, primary key (NUMBER_PK, TYPE_ID) ); create table RELATIONEX_ITEMTYPE ( id integer generated by default as identity, name varchar(20) not null, primary key (id) ); ... alter table RELATIONEX_ITEM add constraint FK355BBDA349D11870 foreign key (TYPE_ID) references RELATIONEX_ITEMTYPE;
Add the following test method to your unit test. As in the previous sections, this code will attempt to persist the parent, assign the managed parent to the child, and then persist the child. The flush()es were added simply to control when the database output was issued and printed; they are not a requirement of the scenario.
@Test
public void testManyToOneUniMapsIdEmbedded() {
log.info("*** testManyToOneUniMapsIdEmbedded ***");
ItemType type = new ItemType("snowblower");
log.debug("persisting parent:" + type);
em.persist(type);
em.flush();
log.debug("persisted parent:" + type);
Item item = new Item(type,1);
item.setCreated(new Date());
log.debug("persisting child:" + item);
em.persist(item);
em.flush();
log.debug("persisted child:" + item);
//check PK assigned
ItemPK pk = new ItemPK().setTypeId(type.getId()).setNumber(1);
assertTrue(String.format("expected PK %s not match actual %s", pk,
item.getId()), pk.equals(item.getId()));
}
Build the module and run the new unit test. Notice first how the parent primary key was unassigned until after the persist. After the persist, the output changed from 0 to a non-0 value. The same thing happened for the child entity. The property in the child's primary key that is mapped to the foreign key was left unassigned by the child entity class. The provider updated the primary key values during the persist and caused the primary key property to be equal to the foreign key value.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2OneTest#testManyToOneUniMapsIdEmbedded ... -persisting parent:0:snowblower Hibernate: insert into RELATIONEX_ITEMTYPE (id, name) values (null, ?) -persisted parent:1:snowblower -persisting child:1:snowblowerpk=(typeId=0,number=1) Hibernate: ^ insert +- unassigned into RELATIONEX_ITEM (created, NUMBER_PK, TYPE_ID) values +- assigned by provider (?, ?, ?) v -persisted child:1:snowblowerpk=(typeId=1,number=1) ... [INFO] BUILD SUCCESS
Place the following lines in your test method. This will cause new instances to be pulled from the database and checked against their expected values.
log.debug("getting new instances");
em.clear();
Item item2 = em.find(Item.class, pk);
log.debug("checking child");
assertNotNull("child not found by primary key:" + pk, item2);
assertTrue("unexpected child data", item.getCreated().equals(item2.getCreated()));
log.debug("checking parent");
assertEquals("unexpected parent data", type.getName(), item2.getItemType().getName());
Rebuild the module and re-run the test method. Notice the provider continues to perform an EAGER fetch of the parent when using find() to locate the child. The provider is able to locate the child entity using a primary key instance initialized with the parent's ID and a unique value assigned to the child number.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2OneTest#testManyToOneUniMapsIdEmbedded ... -getting new instances Hibernate: select item0_.NUMBER_PK as NUMBER1_32_1_, item0_.TYPE_ID as TYPE2_32_1_, item0_.created as created32_1_, itemtype1_.id as id33_0_, itemtype1_.name as name33_0_ from RELATIONEX_ITEM item0_ inner join RELATIONEX_ITEMTYPE itemtype1_ on item0_.TYPE_ID=itemtype1_.id where item0_.NUMBER_PK=? and item0_.TYPE_ID=? -checking child -checking parent ... [INFO] BUILD SUCCESS
Add the following lines to the test method. This will add an additional child related to the same parent entity.
Item itemB = new Item(item2.getItemType(),2);
log.debug("add more child entities:" + itemB);
itemB.setCreated(new Date());
em.persist(itemB);
em.flush();
log.debug("new child entities added:" + itemB);
Rebuild the module and re-run the test method. Notice, as before, only the foreign key and unique value are assigned to the new child entity prior to calling persist(). The primary key property used to identify the parent entity is unassigned going into the persist and then updated by the provider once the persist completes.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2OneTest#testManyToOneUniMapsIdEmbedded ... -add more child entities:1:snowblowerpk=(typeId=0,number=2) Hibernate: ^ insert +- unassigned into RELATIONEX_ITEM (created, NUMBER_PK, TYPE_ID) values +- assigned by provider (?, ?, ?) v -new child entities added:1:snowblowerpk=(typeId=1,number=2) ... [INFO] BUILD SUCCESS
Add the following lines to the test method. These will get new instances of all the children associated with a common parent. Notice we are using the foreign key in the where clause and not the primary key value. It does not matter which value we use since they are both mapped to the same column.
log.debug("getting new instances of children");
em.clear();
List<Item> items = em.createQuery(
"select i from Item i " +
"where i.itemType.id=:typeId",
Item.class)
.setParameter("typeId", item.getItemType().getId())
.getResultList();
assertEquals("unexpected number of children", 2, items.size());
Rebuild the module and re-run the test method. Notice the column used during the where clause to get the child rows is the column we mapped for the foreign key.
-getting new instances of children Hibernate: select item0_.NUMBER_PK as NUMBER1_32_, item0_.TYPE_ID as TYPE2_32_, item0_.created as created32_ from RELATIONEX_ITEM item0_ where item0_.TYPE_ID=? Hibernate: select itemtype0_.id as id33_0_, itemtype0_.name as name33_0_ from RELATIONEX_ITEMTYPE itemtype0_ where itemtype0_.id=? ... [INFO] BUILD SUCCESS
If you change the query to use the primary key instead of the foreign key, you end up with the same SQL because (again) @MapsId caused these two properties to be mapped to the same column -- the foreign key column.
List<Item> items = em.createQuery(
"select i from Item i " +
//"where i.itemType.id=:typeId",
"where i.id.typeId=:typeId",
Item.class)
Hibernate: select item0_.NUMBER_PK as NUMBER1_32_, item0_.TYPE_ID as TYPE2_32_, item0_.created as created32_ from RELATIONEX_ITEM item0_ where item0_.TYPE_ID=?
You have completed going through an example many-to-one, uni-directional relationship where the child identity is partially derived from the identity of the parent. By default, the provider would have mapped these as two separate columns. By using @MapsId on the relationship, the property within the primary key class identified by @MapsId has its column mapped to the same column used to reference the parent entity.
This example used the @EmbeddedId. Another option would have been the use of an @IdClass.
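A hedged sketch of what the @IdClass alternative could look like is shown below. The Item2 and ItemPK2 names are made up for illustration; with an @IdClass, the primary key class is a plain serializable POJO whose field names match the entity's @Id properties, and the field corresponding to the @Id relationship is typed as the parent's primary key type.

```java
import java.io.Serializable;
import javax.persistence.*;

// Hypothetical @IdClass variant of the example above (each class
// would normally live in its own source file).
// ItemPK2 is a plain POJO -- no JPA annotations required.
class ItemPK2 implements Serializable {
    private int number;   // matches Item2.number
    private int itemType; // matches the Item2.itemType relationship, typed as the parent PK
    @Override
    public boolean equals(Object o) {
        if (!(o instanceof ItemPK2)) { return false; }
        ItemPK2 rhs = (ItemPK2) o;
        return number == rhs.number && itemType == rhs.itemType;
    }
    @Override
    public int hashCode() { return 31 * number + itemType; }
}

@Entity
@IdClass(ItemPK2.class)
class Item2 {
    @Id
    @Column(name = "NUMBER_PK")
    private int number;
    @Id @ManyToOne
    @JoinColumn(name = "TYPE_ID")
    private ItemType itemType; // parent entity from the example
}
```

The generated schema should match the @EmbeddedId version; the difference is purely in how the compound identity is expressed in Java.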
In this chapter we took a look at variants of the many-to-one relationship using just the uni-directional case. You implemented the relationship using a simple foreign key from the child entity table as well as through a join-table. You formed relationships with simple types and embeddable types that were owned by the parent entity. You enabled orphanRemoval on first-class child entities to have them deleted as well when removed from the collection.
In this chapter we are going to combine the aspects of the one-to-many and many-to-one to form a bi-directional relationship. The "bi-directional" aspects are solely at the Java class level and do not change anything about the database. Foreign keys and join tables will look just as they did in the uni-directional case. However, in this case, we will be able to easily navigate from parent to child and from child to parent through the use of an object reference in either direction.
As with the one-to-one, bi-directional relationships we looked at in an earlier chapter, bi-directional relationships have an owning side and an inverse side. The owning side provides the mapping information and is the side of the relationship that drives the provider actions. The inverse side simply references the owning side (via the "mappedBy" attribute). The inverse side will get initialized by the provider when obtaining object trees from the database. However, the provider will not update or pay attention to the current state of the inverse side when it comes to persisting the state of the relation.
JPA does have some rules we need to follow when converting from uni-directional to bi-directional relationships. JPA requires the many side of a one-to-many, bi-directional relationship to be the owning side of that relationship. There is no choice to be made along those lines. That means the many side will always be the owning side and the one side will always be the inverse side.
Create a JUnit test class to host tests for the one-to-many mappings.
Put the following JUnit test case in your src/test tree. You can delete the sample test method once we add our first real test. JUnit will fail a test case if it cannot locate a @Test method to run.
package myorg.relex;
import static org.junit.Assert.*;
import javax.persistence.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.junit.*;
public class One2ManyBiTest extends JPATestBase {
private static Logger log = LoggerFactory.getLogger(One2ManyBiTest.class);
@Test
public void testSample() {
log.info("testSample");
}
}
Verify the new JUnit test class builds and executes to completion
relationEx]$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest ... -HHH000401: using driver [org.h2.Driver] at URL [jdbc:h2:tcp://localhost:9092/./h2db/ejava] ... [INFO] BUILD SUCCESS
In this section we will demonstrate the use of a simple foreign key mapping from the owning/dependent entity table to the inverse/parent entity table.
Put the following class in your src/main tree. This class provides an example of the one/parent side of a one-to-many, bi-directional relationship. It is currently incomplete and we will fix it shortly. The biggest issue is the lack of a "mappedBy" attribute in the @OneToMany mapping. That attribute is required to form the bi-directional relationship.
package myorg.relex.one2manybi;
import java.util.ArrayList;
import java.util.List;
import javax.persistence.*;
/**
* This class provides an example of the one/parent side of a one-to-many, bi-directional relationship
* that will be realized through a foreign key from the many/child side of the relationship. Being the
* one side of the one-to-many relationship, this class must implement the inverse side.
*/
@Entity
@Table(name="RELATIONEX_BORROWER")
public class Borrower {
@Id @GeneratedValue
private int id;
@OneToMany(
// mappedBy="borrower"
// , cascade={CascadeType.PERSIST, CascadeType.DETACH, CascadeType.REMOVE}
// , orphanRemoval=true
// , fetch=FetchType.EAGER
)
private List<Loan> loans;
@Column(length=12)
private String name;
public int getId() { return id; }
public List<Loan> getLoans() {
if (loans == null) {
loans = new ArrayList<Loan>();
}
return loans;
}
public void setLoans(List<Loan> loans) {
this.loans = loans;
}
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
}
Put the following class in your src/main tree. This class provides an example of the many/child side of a many-to-one, bi-directional relationship. Thus, this class will define the mapping to the database and does so using a simple foreign key.
package myorg.relex.one2manybi;
import java.util.Date;
import javax.persistence.*;
/**
* This class provides an example of the many/child side of a many-to-one, bi-directional relationship.
* Being the many side of the many-to-one relationship, this class must implement the owning side.
*/
@Entity
@Table(name="RELATIONEX_LOAN")
public class Loan {
@Id @GeneratedValue
private int id;
@ManyToOne(fetch=FetchType.EAGER, optional=false)
// @JoinColumn(name="BORROWER_ID")
private Borrower borrower;
@Temporal(TemporalType.DATE)
@Column(nullable=false)
private Date checkout;
@Temporal(TemporalType.DATE)
private Date checkin;
public Loan() {}
public Loan(Borrower borrower) {
this.borrower=borrower;
this.checkout=new Date();
}
public int getId() { return id; }
public boolean isOut() { return checkin==null; }
public Borrower getBorrower() { return borrower; }
public void setBorrower(Borrower borrower) {
this.borrower = borrower;
}
public Date getCheckout() { return checkout; }
public void setCheckout(Date checkout) {
this.checkout = checkout;
}
public Date getCheckin() { return checkin; }
public void setCheckin(Date checkin) {
this.checkin = checkin;
}
}
Add the new entity classes to the persistence unit.
<class>myorg.relex.one2manybi.Borrower</class>
<class>myorg.relex.one2manybi.Loan</class>
Generate schema for the module. Note the dual one-way relationships defined rather than a single bi-directional one. The foreign key from the child entity table to the parent entity table is correct. However, the link table from the parent entity table is not correct. This was added because of the lack of the "mappedBy" attribute earlier.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_BORROWER ( id integer generated by default as identity, name varchar(12), primary key (id) ); create table RELATIONEX_BORROWER_RELATIONEX_LOAN ( <!== WRONG!!!! RELATIONEX_BORROWER_id integer not null, loans_id integer not null, unique (loans_id) ); ... create table RELATIONEX_LOAN ( id integer generated by default as identity, checkin date, checkout date not null, borrower_id integer not null, <!== CORRECT primary key (id) ); ... alter table RELATIONEX_BORROWER_RELATIONEX_LOAN <!== WRONG!!!! add constraint FKC555B9339909D56E foreign key (RELATIONEX_BORROWER_id) references RELATIONEX_BORROWER; alter table RELATIONEX_BORROWER_RELATIONEX_LOAN <!== WRONG!!!! add constraint FKC555B933458DDBCB foreign key (loans_id) references RELATIONEX_LOAN; alter table RELATIONEX_LOAN <!== CORRECT add constraint FK355D0780BC290DFE foreign key (borrower_id) references RELATIONEX_BORROWER;
Correct the mapping by adding "mappedBy" to the one/parent side of the relation.
public class Borrower {
...
@OneToMany(
mappedBy="borrower"
)
private List<Loan> loans;
Also make the foreign key mapping from the many/child side to the one/parent side more obvious by adding a @JoinColumn declaration.
public class Loan {
...
@ManyToOne(fetch=FetchType.EAGER, optional=false)
@JoinColumn(name="BORROWER_ID")
private Borrower borrower;
Regenerate schema for the module. Notice how we now only have the single foreign key to the parent entity table in the child entity table.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_BORROWER ( id integer generated by default as identity, name varchar(12), primary key (id) ); ... create table RELATIONEX_LOAN ( id integer generated by default as identity, checkin date, checkout date not null, BORROWER_ID integer not null, primary key (id) ); ... alter table RELATIONEX_LOAN add constraint FK355D0780BC290DFE foreign key (BORROWER_ID) references RELATIONEX_BORROWER;
Add the following test method to your JUnit test case. The initial version simply persists the object tree with a parent and single child. Notice how the parent is set on the child (the owning side) and the child is set on the parent (the inverse side).
@Test
public void testOneToManyBiFK() {
log.info("*** testOneToManyBiFK ***");
log.debug("persisting borrower");
Borrower borrower = new Borrower();
borrower.setName("fred");
em.persist(borrower);
em.flush();
log.debug("persisting loan");
Loan loan = new Loan(borrower);
borrower.getLoans().add(loan);
em.persist(borrower); //cascade.PERSIST
em.flush();
Notice how we are attempting to persist the child -- by associating it with the parent and then calling em.persist() again on the parent. This is legal. Calling persist on an already managed entity causes nothing to happen to the already managed entity but it will execute all cascades.
If you build the module and run the test method you will notice a problem. The child is never saved to the database. We will fix this shortly.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiFK ... -*** testOneToManyBiFK *** -persisting borrower Hibernate: insert into RELATIONEX_BORROWER (id, name) values (null, ?) -persisting loan ... [INFO] BUILD SUCCESS
Add the following lines to your test method to help detect the error with the persist above.
log.debug("getting new instances from parent side");
em.detach(borrower);
Borrower borrower2 = em.find(Borrower.class, borrower.getId());
log.debug("checking parent");
assertNotNull("borrower not found", borrower2);
log.debug("checking parent collection");
assertEquals("no loans found", 1, borrower2.getLoans().size());
log.debug("checking child");
assertEquals("unexpected child id", loan.getId(), borrower2.getLoans().get(0).getId());
Rebuild the module and re-run the test method. Notice in this output the provider first retrieves the parent during the find and then LAZY loads the child. The test fails because no child was found.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiFK ... -getting new instances from parent side Hibernate: select borrower0_.id as id36_0_, borrower0_.name as name36_0_ from RELATIONEX_BORROWER borrower0_ where borrower0_.id=? -checking parent -checking parent collection Hibernate: select loans0_.BORROWER_ID as BORROWER4_36_1_, loans0_.id as id1_, loans0_.id as id37_0_, loans0_.BORROWER_ID as BORROWER4_37_0_, loans0_.checkin as checkin37_0_, loans0_.checkout as checkout37_0_ from RELATIONEX_LOAN loans0_ where loans0_.BORROWER_ID=? ... Failed tests: testOneToManyBiFK(myorg.relex.One2ManyBiTest): no loans found expected:<1> but was:<0> ... [INFO] BUILD FAILURE
Fix the persist issue above by adding cascade=PERSIST from the parent to the child. Add CascadeType.DETACH to cover the detach() call from the parent in the test method and CascadeType.REMOVE in case we wish to delete the object tree from the parent.
public class Borrower {
@Id @GeneratedValue
private int id;
@OneToMany(
mappedBy="borrower"
, cascade={CascadeType.PERSIST, CascadeType.DETACH, CascadeType.REMOVE}
)
private List<Loan> loans;
Rebuild the module and re-run the test method. Notice how setting the cascade=PERSIST causes the second call of persist() on the parent entity to have the child persisted to the database.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiFK ... -persisting borrower Hibernate: insert into RELATIONEX_BORROWER (id, name) values (null, ?) -persisting loan Hibernate: insert into RELATIONEX_LOAN (id, BORROWER_ID, checkin, checkout) values (null, ?, ?, ?)
The child collection is still LAZY loaded and the children will not be loaded until the collection is accessed. This, obviously, is efficient for when the children are not commonly accessed.
-getting new instances from parent side Hibernate: select borrower0_.id as id36_0_, borrower0_.name as name36_0_ from RELATIONEX_BORROWER borrower0_ where borrower0_.id=? -checking parent
Once the test method accesses the child collection, the provider must query the database to obtain the children in the collection.
-checking parent collection Hibernate: select loans0_.BORROWER_ID as BORROWER4_36_1_, loans0_.id as id1_, loans0_.id as id37_0_, loans0_.BORROWER_ID as BORROWER4_37_0_, loans0_.checkin as checkin37_0_, loans0_.checkout as checkout37_0_ from RELATIONEX_LOAN loans0_ where loans0_.BORROWER_ID=? -checking child ... [INFO] BUILD SUCCESS
Change the fetch mode of the parent to EAGER to see how this impacts our queries.
public class Borrower {
...
@OneToMany(
mappedBy="borrower"
, cascade={CascadeType.PERSIST, CascadeType.DETACH, CascadeType.REMOVE}
, fetch=FetchType.EAGER
)
private List<Loan> loans;
Rebuild the module and re-run the test method. Notice how the two queries have been replaced with a single query (with a join) for both the parent and child tables. This obviously is more efficient if *all* children are always accessed as a part of accessing the parent.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiFK ... -getting new instances from parent side Hibernate: select borrower0_.id as id36_1_, borrower0_.name as name36_1_, loans1_.BORROWER_ID as BORROWER4_36_3_, loans1_.id as id3_, loans1_.id as id37_0_, loans1_.BORROWER_ID as BORROWER4_37_0_, loans1_.checkin as checkin37_0_, loans1_.checkout as checkout37_0_ from RELATIONEX_BORROWER borrower0_ left outer join RELATIONEX_LOAN loans1_ on borrower0_.id=loans1_.BORROWER_ID where borrower0_.id=? -checking parent -checking parent collection -checking child ... [INFO] BUILD SUCCESS
Add the following lines to your test method to add an additional child to the collection. Notice how both sides of the relation are being set by the application. The provider only insists the owning/many side be set, but consistency within the application requires the inverse to be set as well. Both the inverse and owning side are initialized by the provider -- as demonstrated by the previous block of asserts.
log.debug("adding new child");
Loan loanB = new Loan(borrower2);
borrower2.getLoans().add(loanB);
em.persist(borrower2);
em.flush();
Rebuild the module and re-run the test method. Notice how a persist of the managed parent with one managed child and one un-managed child causes only the un-managed child to be persisted to the database (because we have cascade=PERSIST set on the parent).
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiFK ... -adding new child Hibernate: insert into RELATIONEX_LOAN (id, BORROWER_ID, checkin, checkout) values (null, ?, ?, ?) ... [INFO] BUILD SUCCESS
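A common way to avoid forgetting one of the two assignments is to centralize them in a helper method on one of the entities. The following is a plain-Java sketch of that pattern (JPA annotations omitted; the borrow() helper is made up for illustration and is not part of the lab's classes):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of a helper that keeps both sides of the bi-directional
// relationship consistent. Annotations omitted for brevity.
class Borrower {
    private final List<Loan> loans = new ArrayList<Loan>();
    public List<Loan> getLoans() { return loans; }
    public Loan borrow(Loan loan) {
        loans.add(loan);        // inverse side -- application consistency
        loan.setBorrower(this); // owning side -- what the provider persists
        return loan;
    }
}

class Loan {
    private Borrower borrower;
    public Borrower getBorrower() { return borrower; }
    public void setBorrower(Borrower borrower) { this.borrower = borrower; }
}
```

With a helper like this, callers only invoke borrower.borrow(loan) and cannot accidentally leave one of the two references unset.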
Add the following lines to your test method. They demonstrate how, because of the bi-directional relationship, we can access the object graph from the child side as well as the parent.
log.debug("getting new instances from child side");
em.detach(borrower2);
Loan loan2 = em.find(Loan.class, loan.getId());
log.debug("checking child");
assertNotNull("child not found", loan2);
assertNotNull("parent not found", loan2.getBorrower());
log.debug("checking parent");
assertEquals("unexpected number of children", 2, loan2.getBorrower().getLoans().size());
Rebuild the module and re-run the test method. Notice how the first child, parent, and all its children were queried for during the first find() and prior to any accesses to the object tree. This is because of EAGER fetches defined on both sides.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiFK ... -getting new instances from child side Hibernate: select loan0_.id as id37_1_, loan0_.BORROWER_ID as BORROWER4_37_1_, loan0_.checkin as checkin37_1_, loan0_.checkout as checkout37_1_, borrower1_.id as id36_0_, borrower1_.name as name36_0_ from RELATIONEX_LOAN loan0_ inner join RELATIONEX_BORROWER borrower1_ on loan0_.BORROWER_ID=borrower1_.id where loan0_.id=? Hibernate: select loans0_.BORROWER_ID as BORROWER4_36_1_, loans0_.id as id1_, loans0_.id as id37_0_, loans0_.BORROWER_ID as BORROWER4_37_0_, loans0_.checkin as checkin37_0_, loans0_.checkout as checkout37_0_ from RELATIONEX_LOAN loans0_ where loans0_.BORROWER_ID=? -checking child -checking parent ... [INFO] BUILD SUCCESS
Change the child's fetch mode for the parent to LAZY.
public class Loan {
...
@ManyToOne(fetch=FetchType.LAZY, optional=false)
@JoinColumn(name="BORROWER_ID")
private Borrower borrower;
Rebuild the module and re-run the test method. Notice how only the initial child is loaded during the find() and then the parent is loaded (with its children) once accessed.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiFK ... -getting new instances from child side Hibernate: select loan0_.id as id37_0_, loan0_.BORROWER_ID as BORROWER4_37_0_, loan0_.checkin as checkin37_0_, loan0_.checkout as checkout37_0_ from RELATIONEX_LOAN loan0_ where loan0_.id=? -checking child -checking parent Hibernate: select borrower0_.id as id36_1_, borrower0_.name as name36_1_, loans1_.BORROWER_ID as BORROWER4_36_3_, loans1_.id as id3_, loans1_.id as id37_0_, loans1_.BORROWER_ID as BORROWER4_37_0_, loans1_.checkin as checkin37_0_, loans1_.checkout as checkout37_0_ from RELATIONEX_BORROWER borrower0_ left outer join RELATIONEX_LOAN loans1_ on borrower0_.id=loans1_.BORROWER_ID where borrower0_.id=? ... [INFO] BUILD SUCCESS
Feel free to experiment with a few more combinations of LAZY and EAGER to make sure you understand the implications of choosing one over the other.
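Also note that FetchType.EAGER on the mapping applies to every retrieval of the entity. If only certain call sites need the children loaded eagerly, a JPQL fetch join can request the join per-query while the mapping stays LAZY. A sketch, assuming the Borrower/Loan entities from this section:

```java
// Per-query eager load of a LAZY collection using a JPQL fetch join
Borrower borrower = em.createQuery(
        "select b from Borrower b " +
        "join fetch b.loans " +
        "where b.id=:id", Borrower.class)
    .setParameter("id", borrowerId)
    .getSingleResult();
```

Note that a plain (inner) fetch join returns no row for a parent with an empty collection; "left join fetch" avoids that.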
Add the following lines to your test method. This code orphans one of the children by removing it from the parent collection. We would like to see the orphaned child deleted by the provider, but we have to fix our mapping specification first.
log.debug("orphaning one of the children");
int startCount = em.createQuery("select count(l) from Loan l", Number.class).getSingleResult().intValue();
Borrower borrower3 = loan2.getBorrower();
borrower3.getLoans().remove(loan2);
em.flush();
assertEquals("orphaned child not deleted", startCount-1,
em.createQuery("select count(l) from Loan l", Number.class).getSingleResult().intValue());
Rebuild the module and re-run the test method. Notice how nothing changed in the database and our test failed. The fact that the child was removed from the inverse side of the relation meant nothing given the way our relationship is currently mapped.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiFK ... -orphaning one of the children Hibernate: select count(loan0_.id) as col_0_0_ from RELATIONEX_LOAN loan0_ limit ? Hibernate: select count(loan0_.id) as col_0_0_ from RELATIONEX_LOAN loan0_ limit ? ... [INFO] BUILD FAILURE
Enable orphanRemoval on the parent collection.
public class Borrower {
@Id @GeneratedValue
private int id;
@OneToMany(
mappedBy="borrower"
, cascade={CascadeType.PERSIST, CascadeType.DETACH, CascadeType.REMOVE}
, orphanRemoval=true
, fetch=FetchType.EAGER
)
private List<Loan> loans;
Rebuild the module and re-run the test method. Notice how the orphaned child is now deleted when removed from the collection.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiFK ... -orphaning one of the children Hibernate: select count(loan0_.id) as col_0_0_ from RELATIONEX_LOAN loan0_ limit ? Hibernate: delete from RELATIONEX_LOAN where id=? Hibernate: select count(loan0_.id) as col_0_0_ from RELATIONEX_LOAN loan0_ limit ? ... [INFO] BUILD SUCCESS
Add the final lines to the test method. This will attempt to delete the entire object graph by removing just the parent. This will work because we added CascadeType.REMOVE earlier.
log.debug("deleting parent");
em.remove(borrower3);
em.flush();
assertEquals("orphaned child not deleted", startCount-2,
em.createQuery("select count(l) from Loan l", Number.class).getSingleResult().intValue());
Rebuild the module and re-run the test method. Notice how each child gets deleted from the database by ID and then the parent is removed.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiFK ... -deleting parent Hibernate: delete from RELATIONEX_LOAN where id=? Hibernate: delete from RELATIONEX_BORROWER where id=? Hibernate: select count(loan0_.id) as col_0_0_ from RELATIONEX_LOAN loan0_ limit ? ... [INFO] BUILD SUCCESS
You have finished going through a one-to-many/many-to-one, bi-directional relationship that is realized through a foreign key column in the child entity table. We also added fetch, cascade, and orphanRemoval features to show some built-in provider functionality that can save some code when working with large object graphs.
In this section we will demonstrate mapping a one-to-many relationship using a join table and a bi-directional relationship. From the database perspective, this will look identical to the one-to-many, uni-directional case. However, from the JPA perspective the relationship is being owned (i.e., defined) by the child/many side. In the uni-directional case there was no property in the child/many entity class that represented the relationship. Now there is.
Put the following class in your src/main tree. This entity class provides an example of the one/inverse side of a one-to-many, bi-directional relationship mapped using a join-table. Or at least it will be. The current version below has a few errors we need to correct.
package myorg.relex.one2manybi;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import javax.persistence.*;
/**
* This class provides an example of the one/inverse side of a one-to-many, bi-directional
* relationship realized through a join-table mapped from the owning/many side.
*/
@Entity
@Table(name="RELATIONEX_PURCHASE")
public class Purchase {
@Id @GeneratedValue
private int id;
@OneToMany(
// mappedBy="purchase",
// cascade={CascadeType.PERSIST, CascadeType.DETACH},
// orphanRemoval=true
)
private List<SaleItem> items;
@Temporal(TemporalType.TIMESTAMP)
@Column(nullable=false, updatable=false)
private Date date;
protected Purchase() {}
public Purchase(Date date) {
this.date = date;
}
public int getId() { return id; }
public Date getDate() { return date; }
public List<SaleItem> getItems() {
if (items == null) {
items = new ArrayList<SaleItem>();
}
return items;
}
public Purchase addItem(SaleItem item) {
getItems().add(item);
return this;
}
}
Place the following class in your src/main tree. This class provides an example of the many/owning side of a many-to-one relationship mapped using a join table. It is currently incomplete and we will work to expose the issues and correct them in the following steps.
package myorg.relex.one2manybi;
import java.math.BigDecimal;
import javax.persistence.*;
/**
* This class provides an example of the many/owning side of a many-to-one, bi-directional
* relationship that is realized using a join-table.
*/
@Entity
@Table(name="RELATIONEX_SALEITEM")
public class SaleItem {
@Id @GeneratedValue
private int id;
@ManyToOne//(optional=false, fetch=FetchType.EAGER)
// @JoinTable(
// name="RELATIONEX_SALEITEM_PURCHASE",
// joinColumns=@JoinColumn(name="SALEITEM_ID"),
// inverseJoinColumns=@JoinColumn(name="PURCHASE_ID")
// )
private Purchase purchase;
@Column(length=16)
private String name;
@Column(precision=5, scale=2)
private BigDecimal price;
protected SaleItem() {}
public SaleItem(Purchase purchase) {
this.purchase = purchase;
}
public int getId() { return id; }
public Purchase getPurchase() { return purchase; }
public void setPurchase(Purchase purchase) {
this.purchase = purchase;
}
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
public double getPrice() { return price==null? 0 : price.doubleValue(); }
public void setPrice(double price) {
this.price = new BigDecimal(price);
}
}
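One caution about the setPrice() implementation above: new BigDecimal(double) captures the exact binary value of the double, which rarely equals the decimal value you typed and may not round cleanly into the decimal(5,2) column. BigDecimal.valueOf() (or the String constructor) plus an explicit scale is safer. A small standalone illustration (the toPrice() helper name is ours, not part of the lab code):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class PriceDemo {
    // Convert a double to a value that fits a decimal(5,2) column cleanly
    public static BigDecimal toPrice(double price) {
        // valueOf() goes through Double.toString(), yielding the expected
        // decimal digits; setScale() matches the column's scale=2
        return BigDecimal.valueOf(price).setScale(2, RoundingMode.HALF_UP);
    }
    public static void main(String[] args) {
        // The double constructor exposes the binary approximation of 0.1
        System.out.println(new BigDecimal(0.1)); // 0.1000000000000000055511...
        System.out.println(toPrice(0.1));        // 0.10
    }
}
```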
Add the new entity classes to the persistence unit.
<class>myorg.relex.one2manybi.Purchase</class>
<class>myorg.relex.one2manybi.SaleItem</class>
Generate schema for the new entity classes and their relationship. Notice how we don't yet have a bi-directional relationship. We have two uni-directional relationships. The relationship owned by the one side has formed a join table and the relationship owned by the many side has formed a foreign key.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_PURCHASE ( id integer generated by default as identity, date timestamp not null, primary key (id) ); create table RELATIONEX_PURCHASE_RELATIONEX_SALEITEM ( <!== WRONG, missing @OneToMany.mappedBy RELATIONEX_PURCHASE_id integer not null, items_id integer not null, unique (items_id) ); ... create table RELATIONEX_SALEITEM ( id integer generated by default as identity, name varchar(16), price decimal(5,2), purchase_id integer, <!== WRONG, missing @JoinTable primary key (id) ); ... alter table RELATIONEX_PURCHASE_RELATIONEX_SALEITEM add constraint FK8157C4BCB4DABD0E foreign key (RELATIONEX_PURCHASE_id) references RELATIONEX_PURCHASE; alter table RELATIONEX_PURCHASE_RELATIONEX_SALEITEM add constraint FK8157C4BC3F0D578 foreign key (items_id) references RELATIONEX_SALEITEM; alter table RELATIONEX_SALEITEM add constraint FKAD87326AD7F9F59E foreign key (purchase_id) references RELATIONEX_PURCHASE;
Correct the bi-directional relationship by adding mappedBy to the @OneToMany mapping in the parent.
public class Purchase {
@OneToMany(
mappedBy="purchase"
)
private List<SaleItem> items;
Regenerate schema for the entity classes and their relationship. Notice how the join table implementing the relationship owned by the parent/one side has been removed. However, what remains is a foreign key join owned by the child/many side.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_PURCHASE ( id integer generated by default as identity, date timestamp not null, primary key (id) ); ... create table RELATIONEX_SALEITEM ( id integer generated by default as identity, name varchar(16), price decimal(5,2), purchase_id integer, <!=== WRONG, missing @JoinTable primary key (id) ); ... alter table RELATIONEX_SALEITEM add constraint FKAD87326AD7F9F59E foreign key (purchase_id) references RELATIONEX_PURCHASE;
Attempt to correct the mapping (remember -- we wanted this example to use a join table) by adding a @JoinTable mapping on the child/many side. We will start by allowing the provider to generate default table names.
public class SaleItem {
...
@ManyToOne
@JoinTable
private Purchase purchase;
Regenerate schema for the entity classes and their relationship. Notice by the error produced that the link table name must be provided when defined from the child/many side. There is no default for this case.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... Unable to configure EntityManagerFactory: JoinTable.name() on a @ToOne association has to be explicit: myorg.relex.one2manybi.SaleItem.purchase
Add a table name for the join table.
public class SaleItem {
...
@ManyToOne//(optional=false, fetch=FetchType.EAGER)
@JoinTable(
name="RELATIONEX_SALEITEM_PURCHASE"
)
private Purchase purchase;
Regenerate schema for the entity classes and their relationship. Notice how we have regained our link table (from when it used to be generated from the parent side), specified a name for it, and have default names for the foreign keys to the parent and child tables. Notice too that since this is a many-to-one relationship, the reference to the child is the primary key for the link table -- which means the child can only be mapped once by the join table. The same was true when the child table contained a foreign key column.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
create table RELATIONEX_PURCHASE (
    id integer generated by default as identity,
    date timestamp not null,
    primary key (id)
);
...
create table RELATIONEX_SALEITEM (
    id integer generated by default as identity,
    name varchar(16),
    price decimal(5,2),
    primary key (id)
);
create table RELATIONEX_SALEITEM_PURCHASE (
    purchase_id integer, <!=== many references to same parent legal (many-to-one)
    id integer not null,
    primary key (id) <!=== reference to child is unique
);
...
alter table RELATIONEX_SALEITEM_PURCHASE
    add constraint FKB4CE0B36BDB37099
    foreign key (id)
    references RELATIONEX_SALEITEM;
alter table RELATIONEX_SALEITEM_PURCHASE
    add constraint FKB4CE0B36D7F9F59E
    foreign key (purchase_id)
    references RELATIONEX_PURCHASE;
Make a few final tweaks to the database mapping. Let's provide explicit names for the foreign key columns within the join table.
public class SaleItem {
@Id @GeneratedValue
private int id;
@ManyToOne(optional=false, fetch=FetchType.EAGER)
@JoinTable(
name="RELATIONEX_SALEITEM_PURCHASE",
joinColumns=@JoinColumn(name="SALEITEM_ID"),
inverseJoinColumns=@JoinColumn(name="PURCHASE_ID")
)
private Purchase purchase;
Regenerate schema for the entity classes and their relationship. Notice this time that the foreign key columns have explicitly assigned names and, with @ManyToOne.optional=false, the column referencing the parent table is now defined as not null.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
create table RELATIONEX_PURCHASE (
    id integer generated by default as identity,
    date timestamp not null,
    primary key (id)
);
...
create table RELATIONEX_SALEITEM (
    id integer generated by default as identity,
    name varchar(16),
    price decimal(5,2),
    primary key (id)
);
create table RELATIONEX_SALEITEM_PURCHASE (
    PURCHASE_ID integer not null,
    SALEITEM_ID integer not null,
    primary key (SALEITEM_ID)
);
...
alter table RELATIONEX_SALEITEM_PURCHASE
    add constraint FKB4CE0B36D7F9F59E
    foreign key (PURCHASE_ID)
    references RELATIONEX_PURCHASE;
alter table RELATIONEX_SALEITEM_PURCHASE
    add constraint FKB4CE0B36371BCF1E
    foreign key (SALEITEM_ID)
    references RELATIONEX_SALEITEM;
Add the following test method to your existing JUnit test case. This method will create instances of the parent and child entities and relate them.
@Test
public void testOneToManyBiJoinTable() {
log.info("*** testOneToManyBiJoinTable ***");
log.debug("persisting parent");
Purchase purchase = new Purchase(new Date());
em.persist(purchase);
em.flush();
log.debug("persisting child");
SaleItem item = new SaleItem(purchase);
item.setPrice(10.02);
purchase.addItem(item);
em.persist(purchase); //cascade.PERSIST
em.flush();
}
Build the module and run the test method. Notice how only the parent class got persisted. This is because we did not enable any cascades from the parent to the child entity.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiJoinTable
...
-*** testOneToManyBiJoinTable ***
-persisting parent
Hibernate: insert into RELATIONEX_PURCHASE (id, date) values (null, ?)
-persisting child
...
[INFO] BUILD SUCCESS
Make the error more obvious by adding the following lines to the test method. Among other things, this section of code will check to see if the child entity exists in the database.
log.debug("getting new instances");
em.detach(purchase);
Purchase purchase2 = em.find(Purchase.class, purchase.getId());
assertNotNull("parent not found", purchase2);
log.debug("checking parent");
assertTrue("unexpected date", purchase.getDate().equals(purchase2.getDate()));
log.debug("checking child");
assertEquals("unexpected number of children", 1, purchase2.getItems().size());
assertEquals("unexpected price", item.getPrice(), purchase2.getItems().get(0).getPrice(), .01);
log.debug("verify got new instances");
assertFalse("same parent instance returned", purchase == purchase2);
assertFalse("same child instance returned", item == purchase2.getItems().get(0));
Rebuild the module and re-run the test method. Notice the initial find() simply does a LAZY load of the parent table. Once the test method accesses the child collection -- the related child entities are loaded along with the join table and the parent table. The join table is queried to locate the children and the parent table is queried because of the EAGER fetch specified in the child mapping.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiJoinTable
...
-getting new instances
Hibernate: select purchase0_.id as id38_0_, purchase0_.date as date38_0_ from RELATIONEX_PURCHASE purchase0_ where purchase0_.id=?
-checking parent
-checking child
Hibernate: select
    items0_.PURCHASE_ID as PURCHASE1_38_2_, items0_.SALEITEM_ID as SALEITEM2_2_,
    saleitem1_.id as id39_0_, saleitem1_.name as name39_0_, saleitem1_.price as price39_0_,
    saleitem1_1_.PURCHASE_ID as PURCHASE1_40_0_,
    purchase2_.id as id38_1_, purchase2_.date as date38_1_
from RELATIONEX_SALEITEM_PURCHASE items0_
inner join RELATIONEX_SALEITEM saleitem1_ on items0_.SALEITEM_ID=saleitem1_.id
left outer join RELATIONEX_SALEITEM_PURCHASE saleitem1_1_ on saleitem1_.id=saleitem1_1_.SALEITEM_ID
inner join RELATIONEX_PURCHASE purchase2_ on saleitem1_1_.PURCHASE_ID=purchase2_.id
where items0_.PURCHASE_ID=?
...
[INFO] BUILD FAILURE <!== We expected this -- caused by no cascade=PERSIST
Correct the cascade specification by allowing entity manager persist() commands to cascade to related children.
public class Purchase {
@OneToMany(
mappedBy="purchase",
cascade={CascadeType.PERSIST}
)
private List<SaleItem> items;
Rebuild the module and re-run the test method. Notice how we now persist the child and a row in the join table to form the relationship back to the parent.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiJoinTable
...
-*** testOneToManyBiJoinTable ***
-persisting parent
Hibernate: insert into RELATIONEX_PURCHASE (id, date) values (null, ?)
-persisting child
Hibernate: insert into RELATIONEX_SALEITEM (id, name, price) values (null, ?, ?)
Hibernate: insert into RELATIONEX_SALEITEM_PURCHASE (PURCHASE_ID, SALEITEM_ID) values (?, ?)
The next block of code was able to locate the parent, relationship, and child entities. This is the same query as before except this one returned a child entity.
-getting new instances
Hibernate: select purchase0_.id as id38_0_, purchase0_.date as date38_0_ from RELATIONEX_PURCHASE purchase0_ where purchase0_.id=?
-checking parent
-checking child
Hibernate: select
    items0_.PURCHASE_ID as PURCHASE1_38_2_, items0_.SALEITEM_ID as SALEITEM2_2_,
    saleitem1_.id as id39_0_, saleitem1_.name as name39_0_, saleitem1_.price as price39_0_,
    saleitem1_1_.PURCHASE_ID as PURCHASE1_40_0_,
    purchase2_.id as id38_1_, purchase2_.date as date38_1_
from RELATIONEX_SALEITEM_PURCHASE items0_
inner join RELATIONEX_SALEITEM saleitem1_ on items0_.SALEITEM_ID=saleitem1_.id
left outer join RELATIONEX_SALEITEM_PURCHASE saleitem1_1_ on saleitem1_.id=saleitem1_1_.SALEITEM_ID
inner join RELATIONEX_PURCHASE purchase2_ on saleitem1_1_.PURCHASE_ID=purchase2_.id
where items0_.PURCHASE_ID=?
The test fails, however, because we received the same instance of the child that was related to the original parent. This is because our detach() call was not cascaded to the child.
-verify got new instances
...
Failed tests:
  testOneToManyBiJoinTable(myorg.relex.One2ManyBiTest): same child instance returned
...
[INFO] BUILD FAILURE
Add cascade=DETACH to the parent side. This will cause any detach() call on the parent to also detach() the child entities.
public class Purchase {
@Id @GeneratedValue
private int id;
@OneToMany(
mappedBy="purchase",
cascade={CascadeType.PERSIST, CascadeType.DETACH}
)
private List<SaleItem> items;
Rebuild the module and re-run the test method. Notice we now get a new instance for both the parent and child because of the detach() call on the parent and its cascade to the child.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiJoinTable
...
-verify got new instances
...
[INFO] BUILD SUCCESS
Add the following lines to your test method. This will add a new child entity to the parent.
log.debug("adding new child");
SaleItem itemB = new SaleItem(purchase2);
purchase2.addItem(itemB);
em.persist(purchase2);
em.flush();
Rebuild the module and re-run the test method. Notice this looks much like the first child that was persisted. A row in the child table is added -- followed by a row in the join table.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiJoinTable
...
-adding new child
Hibernate: insert into RELATIONEX_SALEITEM (id, name, price) values (null, ?, ?)
Hibernate: insert into RELATIONEX_SALEITEM_PURCHASE (PURCHASE_ID, SALEITEM_ID) values (?, ?)
...
[INFO] BUILD SUCCESS
Add the following lines to the test method. This will obtain access to the object graph starting from a reference to the child.
log.debug("getting new instances from child side");
em.detach(purchase2);
SaleItem item2 = em.find(SaleItem.class, item.getId());
log.debug("checking child");
assertNotNull("child not found", item2);
assertNotNull("parent not found", item2.getPurchase());
log.debug("checking parent");
assertEquals("unexpected number of children", 2, item2.getPurchase().getItems().size());
Rebuild the module and re-run the test method. Notice how the find() is implementing an EAGER fetch of the relation and parent in addition to the state of the child.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiJoinTable
...
-getting new instances from child side
Hibernate: select
    saleitem0_.id as id39_1_, saleitem0_.name as name39_1_, saleitem0_.price as price39_1_,
    saleitem0_1_.PURCHASE_ID as PURCHASE1_40_1_,
    purchase1_.id as id38_0_, purchase1_.date as date38_0_
from <!==== query for child
    RELATIONEX_SALEITEM saleitem0_
left outer join <!==== EAGER fetch of relation
    RELATIONEX_SALEITEM_PURCHASE saleitem0_1_ on saleitem0_.id=saleitem0_1_.SALEITEM_ID
inner join <!==== EAGER fetch of parent
    RELATIONEX_PURCHASE purchase1_ on saleitem0_1_.PURCHASE_ID=purchase1_.id
where saleitem0_.id=?
However -- even though the first child, its relation, and its parent were eagerly fetched, the remaining children for the parent must be fetched once we inspect the state of the parent.
-checking child
-checking parent
Hibernate: select
    items0_.PURCHASE_ID as PURCHASE1_38_2_, items0_.SALEITEM_ID as SALEITEM2_2_,
    saleitem1_.id as id39_0_, saleitem1_.name as name39_0_, saleitem1_.price as price39_0_,
    saleitem1_1_.PURCHASE_ID as PURCHASE1_40_0_,
    purchase2_.id as id38_1_, purchase2_.date as date38_1_
from RELATIONEX_SALEITEM_PURCHASE items0_
inner join RELATIONEX_SALEITEM saleitem1_ on items0_.SALEITEM_ID=saleitem1_.id
left outer join RELATIONEX_SALEITEM_PURCHASE saleitem1_1_ on saleitem1_.id=saleitem1_1_.SALEITEM_ID
inner join RELATIONEX_PURCHASE purchase2_ on saleitem1_1_.PURCHASE_ID=purchase2_.id
where items0_.PURCHASE_ID=?
...
[INFO] BUILD SUCCESS
Change the fetch mode from EAGER to LAZY on the child side.
public class SaleItem {
...
@ManyToOne(optional=false, fetch=FetchType.LAZY)
@JoinTable(
name="RELATIONEX_SALEITEM_PURCHASE",
joinColumns=@JoinColumn(name="SALEITEM_ID"),
inverseJoinColumns=@JoinColumn(name="PURCHASE_ID")
)
private Purchase purchase;
Rebuild the module and re-run the test method. Notice in this case the parent is not part of the initial query caused by the find().
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiJoinTable
...
-getting new instances from child side
Hibernate: select
    saleitem0_.id as id39_0_, saleitem0_.name as name39_0_, saleitem0_.price as price39_0_,
    saleitem0_1_.PURCHASE_ID as PURCHASE1_40_0_
from RELATIONEX_SALEITEM saleitem0_
left outer join RELATIONEX_SALEITEM_PURCHASE saleitem0_1_ on saleitem0_.id=saleitem0_1_.SALEITEM_ID
where saleitem0_.id=?
But notice how the LAZY fetch from the child changed the behavior of fetching the parent. It did an initial LAZY fetch of the parent and then followed up with a query for the state of the children.
-checking child
-checking parent
Hibernate: select purchase0_.id as id38_0_, purchase0_.date as date38_0_ from RELATIONEX_PURCHASE purchase0_ where purchase0_.id=?
Hibernate: select
    items0_.PURCHASE_ID as PURCHASE1_38_1_, items0_.SALEITEM_ID as SALEITEM2_1_,
    saleitem1_.id as id39_0_, saleitem1_.name as name39_0_, saleitem1_.price as price39_0_,
    saleitem1_1_.PURCHASE_ID as PURCHASE1_40_0_
from RELATIONEX_SALEITEM_PURCHASE items0_
inner join RELATIONEX_SALEITEM saleitem1_ on items0_.SALEITEM_ID=saleitem1_.id
left outer join RELATIONEX_SALEITEM_PURCHASE saleitem1_1_ on saleitem1_.id=saleitem1_1_.SALEITEM_ID
where items0_.PURCHASE_ID=?
...
[INFO] BUILD SUCCESS
Add the following lines to your test method. This will provide a test of orphan processing, where we look to the provider to delete the child when the child is no longer referenced by the parent.
log.debug("orphaning one of the children");
int startCount = em.createQuery("select count(s) from SaleItem s", Number.class).getSingleResult().intValue();
Purchase purchase3 = item2.getPurchase();
purchase3.getItems().remove(item2);
em.flush();
assertEquals("orphaned child not deleted", startCount-1,
em.createQuery("select count(s) from SaleItem s", Number.class).getSingleResult().intValue());
Rebuild the module and re-run the test method. Notice that only the count(*) selects show up in the SQL when the commands execute, and the test fails because the orphaned child is not removed. There is a reason for this -- and we will fix it next.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiJoinTable
...
-orphaning one of the children
Hibernate: select count(saleitem0_.id) as col_0_0_ from RELATIONEX_SALEITEM saleitem0_ left outer join RELATIONEX_SALEITEM_PURCHASE saleitem0_1_ on saleitem0_.id=saleitem0_1_.SALEITEM_ID limit ?
Hibernate: select count(saleitem0_.id) as col_0_0_ from RELATIONEX_SALEITEM saleitem0_ left outer join RELATIONEX_SALEITEM_PURCHASE saleitem0_1_ on saleitem0_.id=saleitem0_1_.SALEITEM_ID limit ?
...
Failed tests:
  testOneToManyBiJoinTable(myorg.relex.One2ManyBiTest): orphaned child not deleted expected:<1> but was:<2>
...
[INFO] BUILD FAILURE
Hold on here!?!? We admit that we didn't tell the provider to remove the orphaned child, but didn't the code remove the relationship? No! It did not. The child was removed from the parent's collection, but that is the inverse side. With the way we currently have it mapped, the relationship can only be removed by actions on the child -- and the only ways to do that with a required (optional=false) parent are to manually remove the child or to set orphanRemoval, as we will do next.
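Keeping both sides of a bi-directional relationship consistent in memory is the application's responsibility -- the provider only inspects the owning side. A common convention is to give the parent helper methods that update both sides in one call (the test already relies on an addItem() helper; removeItem() below is a hypothetical companion). This is a minimal plain-Java sketch with the JPA annotations and other fields omitted:

```java
import java.util.ArrayList;
import java.util.List;

// Plain-Java sketch of helper methods that keep both sides of a
// bi-directional relationship in sync (annotations omitted).
class Purchase {
    private final List<SaleItem> items = new ArrayList<>();

    public List<SaleItem> getItems() { return items; }

    // update the inverse-side collection AND the owning-side reference
    public void addItem(SaleItem item) {
        items.add(item);
        item.setPurchase(this);
    }

    // hypothetical helper -- note that with optional=false the orphaned
    // child must still be deleted (manually or via orphanRemoval)
    public void removeItem(SaleItem item) {
        items.remove(item);
        item.setPurchase(null);
    }
}

class SaleItem {
    private Purchase purchase;

    public Purchase getPurchase() { return purchase; }
    public void setPurchase(Purchase purchase) { this.purchase = purchase; }
}
```

With helpers like these it is much harder to forget to update the owning side -- the mistake that makes collection changes invisible to the provider.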
Fix the mapping by enabling orphanRemoval from the parent to the child.
public class Purchase {
...
@OneToMany(
mappedBy="purchase",
cascade={CascadeType.PERSIST, CascadeType.DETACH},
orphanRemoval=true)
private List<SaleItem> items;
Rebuild the module and re-run the test method. Notice how the child is now removed from the database when it is removed from the parent (and the transaction is committed/flushed). Notice also that the row in the relationship table is removed as well when the child is removed.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiJoinTable
...
-orphaning one of the children
Hibernate: select count(saleitem0_.id) as col_0_0_ from RELATIONEX_SALEITEM saleitem0_ left outer join RELATIONEX_SALEITEM_PURCHASE saleitem0_1_ on saleitem0_.id=saleitem0_1_.SALEITEM_ID limit ?
Hibernate: delete from RELATIONEX_SALEITEM_PURCHASE where SALEITEM_ID=?
Hibernate: delete from RELATIONEX_SALEITEM where id=?
Hibernate: select count(saleitem0_.id) as col_0_0_ from RELATIONEX_SALEITEM saleitem0_ left outer join RELATIONEX_SALEITEM_PURCHASE saleitem0_1_ on saleitem0_.id=saleitem0_1_.SALEITEM_ID limit ?
...
[INFO] BUILD SUCCESS
Add the following lines to the test method. These will remove the parent and test to see if removing the parent also removed the remaining child.
log.debug("deleting parent");
em.remove(purchase3);
em.flush();
assertEquals("orphaned child not deleted", startCount-2,
em.createQuery("select count(s) from SaleItem s", Number.class).getSingleResult().intValue());
Rebuild the module and re-run the test method. Notice how the child and the relation were deleted even though there was no cascade=REMOVE on the parent-to-child relationship. That is because cascade=REMOVE is not necessary with orphanRemoval; they serve the same purpose when the parent is being deleted.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiJoinTable
...
-deleting parent
Hibernate: delete from RELATIONEX_SALEITEM_PURCHASE where SALEITEM_ID=?
Hibernate: delete from RELATIONEX_SALEITEM where id=?
Hibernate: delete from RELATIONEX_PURCHASE where id=?
Hibernate: select count(saleitem0_.id) as col_0_0_ from RELATIONEX_SALEITEM saleitem0_ left outer join RELATIONEX_SALEITEM_PURCHASE saleitem0_1_ on saleitem0_.id=saleitem0_1_.SALEITEM_ID limit ?
...
[INFO] BUILD SUCCESS
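As a side note -- if orphan removal was not wanted, the parent-triggered delete alone could instead be expressed with cascade=REMOVE. The following is only a sketch of that alternative (the lab's working solution keeps orphanRemoval=true); it reuses the Purchase/SaleItem names for illustration:

```java
import java.util.List;
import javax.persistence.*;

@Entity
public class Purchase {
    @Id @GeneratedValue
    private int id;

    // CascadeType.REMOVE deletes the children when em.remove(parent) is
    // called, but -- unlike orphanRemoval=true -- it does NOT delete a
    // child that is merely removed from the parent collection.
    @OneToMany(
        mappedBy="purchase",
        cascade={CascadeType.PERSIST, CascadeType.DETACH, CascadeType.REMOVE})
    private List<SaleItem> items;
}
```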
You have finished looking at one-to-many/many-to-one, bi-directional relationships mapped using a join table. This was functionally no different at the Java class level than the foreign key case and very similar to the one-to-many, uni-directional join table case. However, this mapping leveraged a relationship from the child that formed the mapping to the database and could be used to easily access the parent.
In this section we will demonstrate a one-to-many, bi-directional relationship where the primary key of the owning/dependent entity is derived from the one side.
Place the following class in your src/main tree. It provides an example of the one/parent/inverse side of a one-to-many, bi-directional relationship. We are going to skip making any errors with the entity and move straight to a reasonable solution. The key aspects to remember about one-to-many, bi-directional relationships are
The many/child side is required to be the owning side and the one/parent side is the inverse
The one/parent side declares it is the inverse side by adding the @OneToMany.mappedBy attribute
Without the parent declaring the mappedBy attribute, you end up with a dual uni-directional relationship and chaos
package myorg.relex.one2manybi;
import java.util.Date;
import java.util.HashSet;
import java.util.Set;
import javax.persistence.*;
/**
* This class is an example of the one/inverse side of a one-to-many, bi-directional
* relationship mapped using a compound foreign key that is partially derived from the
* parent primary key.
*/
@Entity
@Table(name="RELATIONEX_CAR")
public class Car {
@Id @GeneratedValue
private int id;
@OneToMany(
mappedBy="car",
cascade={CascadeType.PERSIST, CascadeType.DETACH},
orphanRemoval=true,
fetch=FetchType.LAZY)
private Set<Tire> tires;
@Column(length=16)
private String model;
@Temporal(TemporalType.DATE)
private Date year;
public int getId() { return id; }
public Set<Tire> getTires() {
if (tires==null) {
tires=new HashSet<Tire>();
}
return tires;
}
public String getModel() { return model; }
public void setModel(String model) {
this.model = model;
}
public Date getYear() { return year; }
public void setYear(Date year) {
this.year = year;
}
@Override
public int hashCode() {
return (model==null?0:model.hashCode()) + (year==null?0:year.hashCode());
}
@Override
public boolean equals(Object obj) {
try {
if (this==obj) { return true; }
Car rhs = (Car)obj;
return id==0 ? super.equals(obj) : id==rhs.id;
} catch (Exception ex) { return false; }
}
}
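To make the warning about a missing mappedBy concrete, here is a sketch (hypothetical Parent/Child names, not part of the lab code) of what the provider sees when neither side declares itself the inverse:

```java
import java.util.List;
import javax.persistence.*;

// WITHOUT mappedBy the provider treats these as TWO independent
// uni-directional relationships: a join table generated for
// Parent.children plus a separate foreign key for Child.parent.
@Entity
public class Parent {
    @Id @GeneratedValue
    private int id;

    @OneToMany           // no mappedBy => provider-generated join table
    private List<Child> children;
}

@Entity
public class Child {
    @Id @GeneratedValue
    private int id;

    @ManyToOne           // separate foreign key column in the child table
    private Parent parent;
}
```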
Put the following Enum in place in your src/main tree. This will be used by the example to help define the primary key of the child entity.
package myorg.relex.one2manybi;
public enum TirePosition {
LEFT_FRONT,
RIGHT_FRONT,
LEFT_REAR,
RIGHT_REAR
}
Put the following class in your src/main tree. This provides an example of the many/child/owning side of a many-to-one, bi-directional relationship that is mapped using a foreign key, where that foreign key is used to partially derive the child's compound primary key. The child, in this case, uses an @IdClass to model the compound primary key. That means the primary key values will be exposed in the entity class as regular @Id values. Note, however, the foreign key is mapped as a relationship and not as a simple Java value. We model the relationship in the entity class and the foreign key value in the @IdClass -- but the names must match.
package myorg.relex.one2manybi;
import javax.persistence.*;
/**
* This class provides an example of the many/owning side of a many-to-one, bi-directional
* relationship mapped using a foreign key and that foreign key is used to derive the
* primary key of this class.
*/
@Entity
@Table(name="RELATIONEX_TIRE")
@IdClass(TirePK.class)
public class Tire {
@Id
@ManyToOne
@JoinColumn(name="CAR_ID", nullable=false)
private Car car;
@Id @Enumerated(EnumType.STRING)
@Column(length=16)
private TirePosition position;
private int miles;
protected Tire() {}
public Tire(Car car, TirePosition position) {
this.car = car;
this.position = position;
}
public TirePosition getPosition() { return position; }
public Car getCar() { return car; }
public int getMiles() { return miles; }
public void setMiles(int miles) {
this.miles = miles;
}
@Override
public int hashCode() {
return position.hashCode();
}
@Override
public boolean equals(Object obj) {
try {
if (this==obj) { return true; }
Tire rhs = (Tire)obj;
return car.equals(rhs.car) && position==rhs.position;
} catch (Exception ex) { return false; }
}
}
Put the following class in place. This class is a primary key class that will be used as an @IdClass. That means
The properties must be modeled with the same property names as the entity class
The properties must be modeled with the same property access (FIELD or PROPERTY) as the entity class
The class must implement Serializable
The class must provide an implementation for hashCode() and equals()
package myorg.relex.one2manybi;
import java.io.Serializable;
/**
* This class provides an example of an IdClass used by a child entity in a
* many-to-one, bi-directional relationship where half of its primary key is
* derived from the parent's id.
*/
public class TirePK implements Serializable {
private static final long serialVersionUID = -6028270454708159105L;
private int car; //shared primary key value from parent and child, name matches child rel
private TirePosition position; //child primary key value unique within parent
protected TirePK() {}
public TirePK(int carId, TirePosition position) {
this.car=carId;
this.position=position;
}
public int getCarId() { return car; }
public TirePosition getPosition() { return position; }
@Override
public int hashCode() {
return car + (position==null?0:position.hashCode());
}
@Override
public boolean equals(Object obj) {
try {
if (this==obj) { return true; }
TirePK rhs = (TirePK)obj;
return car==rhs.car && position==rhs.position;
} catch (Exception ex) { return false; }
}
}
Add the entity classes to the persistence unit. Do not list the enum or primary key class here.
<class>myorg.relex.one2manybi.Car</class>
<class>myorg.relex.one2manybi.Tire</class>
Generate database schema for the entity classes and their relationship. Notice the foreign key is in the child entity table and is also being used as the primary key for the child entity table.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl
...
create table RELATIONEX_CAR (
    id integer generated by default as identity,
    model varchar(16),
    year date,
    primary key (id)
);
...
create table RELATIONEX_TIRE (
    CAR_ID integer not null,
    position varchar(16) not null,
    miles integer not null,
    primary key (CAR_ID, position) <!== Foreign key is also part of primary key
);
...
alter table RELATIONEX_TIRE
    add constraint FK356095F89CA49F36
    foreign key (CAR_ID)
    references RELATIONEX_CAR;
Add the following test method to your JUnit test case. This test method is similar to the previous sections. It creates an instance of the parent and child and relates the two.
@Test
public void testOneToManyBiDerivedIdClass() {
log.info("*** testOneToManyBiDerivedClass ***");
log.debug("persisting parent");
Car car = new Car();
car.setModel("DeLorean");
car.setYear(new GregorianCalendar(1983, 0, 1).getTime());
em.persist(car);
em.flush();
log.debug("persisting child");
Tire tire = new Tire(car, TirePosition.RIGHT_FRONT);
tire.setMiles(2000);
car.getTires().add(tire);
em.persist(car); //cascade.PERSIST
em.flush();
}
Build the module and run the test method. Notice that when the child is created -- the values for the parentId (CAR_ID) and other primary key value (position) are stored with the child. The parentId (CAR_ID) is serving as the foreign key and part of the primary key.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiDerivedIdClass
...
-creating entity manager
-*** testOneToManyBiDerivedClass ***
-persisting parent
Hibernate: insert into RELATIONEX_CAR (id, model, year) values (null, ?, ?)
-persisting child
Hibernate: insert into RELATIONEX_TIRE (miles, CAR_ID, position) values (?, ?, ?)
...
[INFO] BUILD SUCCESS
Both the parent and child were successfully inserted into the database during the repeated calls to persist() on the parent because we enabled cascade=PERSIST in the parent relationship mapping.
Add the following lines to your test method. This section will verify the parent and child exist and can be used to demonstrate the impact of a LAZY or EAGER fetch.
log.debug("getting new instances");
em.detach(car);
Car car2 = em.find(Car.class, car.getId());
assertNotNull("parent not found", car2);
log.debug("checking parent");
assertTrue("unexpected date", car.getYear().equals(car2.getYear()));
log.debug("checking child");
assertEquals("unexpected number of children", 1, car2.getTires().size());
assertEquals("unexpected child state", tire.getMiles(), car2.getTires().iterator().next().getMiles());
log.debug("verify got new instances");
assertFalse("same parent instance returned", car == car2);
assertFalse("same child instance returned", tire == car2.getTires().iterator().next());
Rebuild the module and re-run the test method. Notice the parent can be located by primary key through the find() and a LAZY fetch is performed when navigating to the child. Notice when the child is accessed -- the query is issued for members of the child table that match the foreign key and not each child individually.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiDerivedIdClass
...
-getting new instances
Hibernate: select car0_.id as id41_0_, car0_.model as model41_0_, car0_.year as year41_0_ from RELATIONEX_CAR car0_ where car0_.id=?
-checking parent
-checking child
Hibernate: select
    tires0_.CAR_ID as CAR1_41_1_, tires0_.CAR_ID as CAR1_1_, tires0_.position as position1_,
    tires0_.CAR_ID as CAR1_42_0_, tires0_.position as position42_0_, tires0_.miles as miles42_0_
from RELATIONEX_TIRE tires0_
where tires0_.CAR_ID=?
-verify got new instances
...
[INFO] BUILD SUCCESS
Add the following lines to your test method to add a second child to the relationship.
log.debug("adding new child");
Tire tireB = new Tire(car2, TirePosition.LEFT_FRONT);
car2.getTires().add(tireB);
em.persist(car2);
em.flush();
Rebuild the module and re-run the test method. Notice the insert of the child and the creation of the relationship were done by a single insert into the child table (with the foreign key assigned). The child is persisted during the call to persist() on the already managed parent because of the cascade=PERSIST defined on the parent relationship mapping.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiDerivedIdClass
...
-adding new child
Hibernate: insert into RELATIONEX_TIRE (miles, CAR_ID, position) values (?, ?, ?)
...
[INFO] BUILD SUCCESS
Add the following lines to your test method to verify we can gain access to the object tree starting from the child. This shows the power of the bi-directional relationship.
log.debug("getting new instances from child side");
em.detach(car2);
Tire tire2 = em.find(Tire.class, new TirePK(car.getId(), tire.getPosition()));
log.debug("checking child");
assertNotNull("child not found", tire2);
assertNotNull("parent not found", tire2.getCar());
log.debug("checking parent");
assertEquals("unexpected number of children", 2, tire2.getCar().getTires().size());
Rebuild the module and re-run the test method. Notice that when we issue find() on the child -- both columns of the compound primary key are used in the where clause.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiDerivedIdClass
...
-getting new instances from child side
Hibernate: select
    tire0_.CAR_ID as CAR1_42_1_, tire0_.position as position42_1_, tire0_.miles as miles42_1_,
    car1_.id as id41_0_, car1_.model as model41_0_, car1_.year as year41_0_
from RELATIONEX_TIRE tire0_
inner join RELATIONEX_CAR car1_ on tire0_.CAR_ID=car1_.id
where tire0_.CAR_ID=? and tire0_.position=?
-checking child
-checking parent
Hibernate: select
    tires0_.CAR_ID as CAR1_41_1_, tires0_.CAR_ID as CAR1_1_, tires0_.position as position1_,
    tires0_.CAR_ID as CAR1_42_0_, tires0_.position as position42_0_, tires0_.miles as miles42_0_
from RELATIONEX_TIRE tires0_
where tires0_.CAR_ID=?
...
[INFO] BUILD SUCCESS
Add the following lines to your test method to verify orphan removal when the child is removed from the parent collection.
log.debug("orphaning one of the children");
int startCount = em.createQuery("select count(t) from Tire t", Number.class).getSingleResult().intValue();
Car car3 = tire2.getCar();
car3.getTires().remove(tire2);
em.flush();
assertEquals("orphaned child not deleted", startCount-1,
em.createQuery("select count(t) from Tire t", Number.class).getSingleResult().intValue());
Rebuild the module and re-run the test method. Notice the child is successfully deleted when it is orphaned by the removal from the parent collection. This works because we have added orphanRemoval=true to the parent relationship mapping.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiDerivedIdClass
...
-orphaning one of the children
Hibernate: select count((tire0_.CAR_ID, tire0_.position)) as col_0_0_ from RELATIONEX_TIRE tire0_ limit ?
Hibernate: delete from RELATIONEX_TIRE where CAR_ID=? and position=?
Hibernate: select count((tire0_.CAR_ID, tire0_.position)) as col_0_0_ from RELATIONEX_TIRE tire0_ limit ?
...
[INFO] BUILD SUCCESS
Add the following lines to your test method to test cascade delete.
log.debug("deleting parent");
em.remove(car3);
em.flush();
assertEquals("orphaned child not deleted", startCount-2,
em.createQuery("select count(t) from Tire t", Number.class).getSingleResult().intValue());
Rebuild the module and re-run the test method. Notice how the parent and the remaining child both get deleted by the single remove() of the parent. This is because we supplied orphanRemoval=true on the parent relationship mapping, which also removes the children when the parent is deleted.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.One2ManyBiTest#testOneToManyBiDerivedIdClass
...
-deleting parent
Hibernate: delete from RELATIONEX_TIRE where CAR_ID=? and position=?
Hibernate: delete from RELATIONEX_CAR where id=?
Hibernate: select count((tire0_.CAR_ID, tire0_.position)) as col_0_0_ from RELATIONEX_TIRE tire0_ limit ?
...
[INFO] BUILD SUCCESS
You have finished going through the derived compound primary key case with an @IdClass for a one-to-many/many-to-one, bi-directional relationship mapped using a foreign key. The primary example here was to derive the primary key from the parent for use in the child's identity. We annotated the @ManyToOne with @Id to show the foreign key mapping for the parent property was part of the child's primary key.
In this chapter we took a look at mapping bi-directional relationships that combined one-to-many and many-to-one. We mapped them with foreign keys and join tables. We also included a case where the child derived its primary key from the parent. Much of what we covered here overlaps with what was provided in the one-to-many and many-to-one, uni-directional chapters. However, in the bi-directional variant, it is easy to navigate from either side of the relationship to the other.
By the time you hit this chapter, you should have had the opportunity to see many relationship combinations and now we will cover the last official one: many-to-many. The many-to-many relationship is not a common relationship because it relates entities without any properties associated with the relationship itself. Commonly a relationship that starts off as a many-to-many will turn into a many-to-one and one-to-many pair to allow for properties to be assigned to the relationship. In UML data modeling the entity in the middle is referred to as an Association Class.
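The refactoring described above can be sketched with plain classes (JPA annotations omitted for brevity). The Club, Member, and Membership names and the joined/role properties are all hypothetical, not from the lab source -- the point is that the association class gives the relationship a place to hold its own properties.

```java
import java.time.LocalDate;
import java.util.ArrayList;
import java.util.List;

// Hypothetical association-class refactoring of a many-to-many:
// instead of relating Club and Member directly, each side relates to a
// Membership that owns the relationship's properties.
class Member {
    final String name;
    Member(String name) { this.name = name; }
}

class Club {
    final String name;
    final List<Membership> memberships = new ArrayList<>();
    Club(String name) { this.name = name; }
}

// the association class -- carries properties (joined, role) that belong
// to the relationship itself rather than to either related entity
class Membership {
    final Club club;
    final Member member;
    final LocalDate joined;
    final String role;
    Membership(Club club, Member member, LocalDate joined, String role) {
        this.club = club;
        this.member = member;
        this.joined = joined;
        this.role = role;
        club.memberships.add(this); // register with the owning collection
    }
}
```

In JPA terms, Club would map a @OneToMany to Membership and Membership would map a @ManyToOne back to each side, replacing the single @ManyToMany.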
Create a JUnit test class to host tests for the many-to-many mappings.
Put the following JUnit test class in your src/test tree. You can delete the sample test method once we add our first real test. JUnit will fail a test case if it cannot locate a @Test method to run.
package myorg.relex;
import static org.junit.Assert.*;
import javax.persistence.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.junit.*;
public class Many2ManyTest extends JPATestBase {
private static Logger log = LoggerFactory.getLogger(Many2ManyTest.class);
@Test
public void testSample() {
log.info("testSample");
}
}
Verify the new JUnit test class builds and executes to completion
relationEx]$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest ... -HHH000401: using driver [org.h2.Driver] at URL [jdbc:h2:tcp://localhost:9092/./h2db/ejava] ... [INFO] BUILD SUCCESS
In this section we are going to form a many-to-many, uni-directional relationship. As always with uni-directional relationships, that means we will be able to navigate the relationships from only the owning side. The only way to access the owning side from the inverse side is through a JPA query.
Place the following class in your src/main tree. This class provides an example of the inverse side of a many-to-many, uni-directional relationship so there will be no reference to the relationship within this class. This class, however, has implemented a hashCode() and equals() method so that instances can be correctly identified within multiple collections.
package myorg.relex.many2many;
import javax.persistence.*;
/**
* This class provides an example of the inverse side of a many-to-many, uni-directional relationship
*/
@Entity
@Table(name="RELATIONEX_INDIVIDUAL")
public class Individual {
@Id @GeneratedValue
private int id;
@Column(length=32, nullable=false)
private String name;
protected Individual() {}
public Individual(String name) {
this.name = name;
}
public int getId() { return id; }
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
@Override
public int hashCode() {
return name==null? 0 : name.hashCode();
}
@Override
public boolean equals(Object obj) {
try {
if (this == obj) return true;
Individual rhs = (Individual) obj;
if (name==null) { return rhs.name==null; }
return name.equals(rhs.name);
} catch (Exception ex) { return false; }
}
}
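The reason for the name-based equals()/hashCode() can be demonstrated with a plain HashSet, outside of JPA entirely. The Person class below is an illustrative stand-in for Individual (the name is hypothetical): because equality is based on name, a second instance with the same name is recognized as the same set member, which is what the relationship collections on the owning side rely on.

```java
import java.util.HashSet;
import java.util.Set;

// Illustrative stand-in for Individual -- equality based on name,
// mirroring the hashCode()/equals() implemented above.
class Person {
    private final String name;
    Person(String name) { this.name = name; }
    @Override
    public int hashCode() { return name == null ? 0 : name.hashCode(); }
    @Override
    public boolean equals(Object obj) {
        try {
            if (this == obj) { return true; }
            Person rhs = (Person) obj;
            if (name == null) { return rhs.name == null; }
            return name.equals(rhs.name);
        } catch (Exception ex) { return false; }
    }
}

public class SetMembershipDemo {
    public static void main(String[] args) {
        Set<Person> members = new HashSet<>();
        members.add(new Person("manny"));
        // a different instance with the same name is located in the set
        System.out.println(members.contains(new Person("manny"))); // true
        // ...and rejected as a duplicate when added a second time
        System.out.println(members.add(new Person("manny")));      // false
        System.out.println(members.size());                        // 1
    }
}
```

Without the overrides, the default identity-based equality would treat the two instances as different members.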
Add the following class to your src/main tree. This class provides an example of the owning side of a many-to-many, uni-directional relationship. Therefore it defines the mapping from the entities to the database. The current implementation of the class relies on default mapping. We will make that more explicit shortly.
package myorg.relex.many2many;
import java.util.HashSet;
import java.util.Set;
import javax.persistence.*;
/**
* This class provides an example of the owning side of a many-to-many, uni-directional relationship.
*/
@Entity
@Table(name="RELATIONEX_GROUP")
public class Group {
@Id @GeneratedValue
private int id;
@ManyToMany
// @JoinTable(name="RELATIONEX_GROUP_MEMBER",
// joinColumns=@JoinColumn(name="GROUP_ID"),
// inverseJoinColumns=@JoinColumn(name="MEMBER_ID"))
Set<Individual> members;
@Column(length=32, nullable=false)
private String name;
protected Group() {}
public Group(String name) {
this.name = name;
}
public int getId() { return id; }
public Set<Individual> getMembers() {
if (members == null) {
members = new HashSet<Individual>();
}
return members;
}
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
}
Add the new entity classes to your persistence unit.
<class>myorg.relex.many2many.Group</class>
<class>myorg.relex.many2many.Individual</class>
Generate schema for the new entities and their relationship. Notice how the provider chose to use a join table mapping with foreign keys back to the individual entity class tables.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_GROUP ( id integer generated by default as identity, name varchar(32) not null, primary key (id) ); create table RELATIONEX_GROUP_RELATIONEX_INDIVIDUAL ( RELATIONEX_GROUP_id integer not null, members_id integer not null, primary key (RELATIONEX_GROUP_id, members_id) ); ... create table RELATIONEX_INDIVIDUAL ( id integer generated by default as identity, name varchar(32) not null, primary key (id) ); ... alter table RELATIONEX_GROUP_RELATIONEX_INDIVIDUAL add constraint FK4DBB1EB9B0D6112E foreign key (members_id) references RELATIONEX_INDIVIDUAL; alter table RELATIONEX_GROUP_RELATIONEX_INDIVIDUAL add constraint FK4DBB1EB9CE0046D6 foreign key (RELATIONEX_GROUP_id) references RELATIONEX_GROUP;
Make the database mapping more explicit by defining the name for the join table and its individual columns.
public class Group {
...
@ManyToMany
@JoinTable(name="RELATIONEX_GROUP_MEMBER",
joinColumns=@JoinColumn(name="GROUP_ID"),
inverseJoinColumns=@JoinColumn(name="MEMBER_ID"))
Set<Individual> members;
Regenerate database schema for the two entities and their relation. Note that this time, the name of the join table and its columns follow what was specified in the mapping.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_GROUP ( id integer generated by default as identity, name varchar(32) not null, primary key (id) ); create table RELATIONEX_GROUP_MEMBER ( GROUP_ID integer not null, MEMBER_ID integer not null, primary key (GROUP_ID, MEMBER_ID) ); ... create table RELATIONEX_INDIVIDUAL ( id integer generated by default as identity, name varchar(32) not null, primary key (id) ); ... alter table RELATIONEX_GROUP_MEMBER add constraint FK2ADA1F0A50B9540D foreign key (MEMBER_ID) references RELATIONEX_INDIVIDUAL; alter table RELATIONEX_GROUP_MEMBER add constraint FK2ADA1F0AD00E5846 foreign key (GROUP_ID) references RELATIONEX_GROUP;
Note too -- the many-to-many mapping prevents a single entity from being related multiple times to another entity. This is enforced by the two foreign keys of the join table being used as a compound primary key for that table.
Add the following test method to the existing JUnit test case. This method will create an owning and inverse instance and relate the two.
@Test
public void testManyToManyUni() {
log.info("*** testManyToManyUni ***");
log.debug("persisting owner");
Group group = new Group("board");
em.persist(group);
em.flush();
log.debug("persisting inverse");
Individual individual = new Individual("manny");
em.persist(individual);
em.flush();
log.debug("relating parent to inverse");
group.getMembers().add(individual);
em.flush();
}
Build the module and run the test method. Notice how the owning side and inverse side are persisted and then related through the join table.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyUni ... -*** testManyToManyUni *** -persisting owner Hibernate: insert into RELATIONEX_GROUP (id, name) values (null, ?) -persisting inverse Hibernate: insert into RELATIONEX_INDIVIDUAL (id, name) values (null, ?) -relating parent to inverse Hibernate: insert into RELATIONEX_GROUP_MEMBER (GROUP_ID, MEMBER_ID) values (?, ?) ... [INFO] BUILD SUCCESS
Add the following lines to your test method. This will get new instances from the database and verify that what was done in the previous block was performed correctly.
log.debug("getting new instances");
em.clear();
Group group2 = em.find(Group.class, group.getId());
log.debug("checking owner");
assertEquals("unexpected group.name", group.getName(), group2.getName());
log.debug("checking inverse");
assertEquals("unexpected size", 1, group2.getMembers().size());
assertEquals("unexpected member.name", individual.getName(), group2.getMembers().iterator().next().getName());
Rebuild the module and re-run the test method. Notice how the owning entity is first LAZY loaded during the find() and then the join table and inverse entity table is queried for once we navigate the collection.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyUni ... -getting new instances Hibernate: select group0_.id as id43_0_, group0_.name as name43_0_ from RELATIONEX_GROUP group0_ where group0_.id=? -checking owner -checking inverse Hibernate: select members0_.GROUP_ID as GROUP1_43_1_, members0_.MEMBER_ID as MEMBER2_1_, individual1_.id as id44_0_, individual1_.name as name44_0_ from RELATIONEX_GROUP_MEMBER members0_ inner join RELATIONEX_INDIVIDUAL individual1_ on members0_.MEMBER_ID=individual1_.id where members0_.GROUP_ID=? ... [INFO] BUILD SUCCESS
Note that the LAZY load was because of our (default) fetch specification and does not have anything directly to do with the many-to-many relationship formed.
Add the following lines to your test method. This will add two (2) additional members to the original owning entity.
log.debug("adding inverse members");
Individual individualB = new Individual("moe");
Individual individualC = new Individual("jack");
group2.getMembers().add(individualB);
group2.getMembers().add(individualC);
em.persist(individualB);
em.persist(individualC);
em.flush();
Rebuild the module and re-run the test method. Notice how the persist and the addition to the owning entity collection caused an insert into both the entity and join table for both entities added.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyUni ... -adding inverse members Hibernate: insert into RELATIONEX_INDIVIDUAL (id, name) values (null, ?) Hibernate: insert into RELATIONEX_INDIVIDUAL (id, name) values (null, ?) Hibernate: insert into RELATIONEX_GROUP_MEMBER (GROUP_ID, MEMBER_ID) values (?, ?) Hibernate: insert into RELATIONEX_GROUP_MEMBER (GROUP_ID, MEMBER_ID) values (?, ?) ... [INFO] BUILD SUCCESS
Add the following lines to your test method to add a second owning entity and add a few of the existing inverse entities to the new entity. This is where the many-to-many is unique. Many-to-many allows a single entity to be related to multiple parents.
log.debug("adding owning members");
Group groupB = new Group("night shift");
groupB.getMembers().add(individualB);
groupB.getMembers().add(individualC);
em.persist(groupB);
em.flush();
Rebuild the module and re-run the test method. Notice how there was an insert for the new owning entity, followed only by inserts into the join table to form the relationships.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyUni ... -adding owning members Hibernate: insert into RELATIONEX_GROUP (id, name) values (null, ?) Hibernate: insert into RELATIONEX_GROUP_MEMBER (GROUP_ID, MEMBER_ID) values (?, ?) Hibernate: insert into RELATIONEX_GROUP_MEMBER (GROUP_ID, MEMBER_ID) values (?, ?) ... [INFO] BUILD SUCCESS
Add the following lines to the test method to verify the number of collections each inverse entity is a member of.
log.debug("checking relations");
assertEquals("unexpected relations for member 1", 1, em.createQuery(
"select count(g) from Group g where :individual member of g.members", Number.class)
.setParameter("individual", individual)
.getSingleResult().intValue());
assertEquals("unexpected relations for member 2", 2, em.createQuery(
"select count(g) from Group g where :individual member of g.members", Number.class)
.setParameter("individual", individualB)
.getSingleResult().intValue());
assertEquals("unexpected relations for member 3", 2, em.createQuery(
"select count(g) from Group g where :individual member of g.members", Number.class)
.setParameter("individual", individualC)
.getSingleResult().intValue());
Rebuild the module and re-run the test method. Observe the selects performed to determine which groups each individual is associated with and that the assertions on those results pass.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyUni ... Hibernate: select count(group0_.id) as col_0_0_ from RELATIONEX_GROUP group0_ where ? in ( select individual2_.id from RELATIONEX_GROUP_MEMBER members1_, RELATIONEX_INDIVIDUAL individual2_ where group0_.id=members1_.GROUP_ID and members1_.MEMBER_ID=individual2_.id ) limit ? (x3) ... [INFO] BUILD SUCCESS
Add the following lines to the test method. This will remove the relationship between one of the inverse entities and both owning entities. At the conclusion we verify the entity relation we removed does not exist.
log.debug("removing relations");
assertTrue(group2.getMembers().remove(individualB));
assertTrue(groupB.getMembers().remove(individualB));
log.debug("verifying relation removal");
assertEquals("unexpected relations for member 1", 0, em.createQuery(
"select count(g) from Group g, IN (g.members) m where m = :individual", Number.class)
.setParameter("individual", individualB)
.getSingleResult().intValue());
Rebuild the module and re-run the unit test. Notice the two join table rows being deleted to officially remove the entity from the two collections. Notice also that with the different query syntax, the provider changed the subquery to an INNER JOIN.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyUni ... -removing relations -verifying relation removal Hibernate: delete from RELATIONEX_GROUP_MEMBER where GROUP_ID=? and MEMBER_ID=? Hibernate: delete from RELATIONEX_GROUP_MEMBER where GROUP_ID=? and MEMBER_ID=? Hibernate: select count(group0_.id) as col_0_0_ from RELATIONEX_GROUP group0_ inner join RELATIONEX_GROUP_MEMBER members1_ on group0_.id=members1_.GROUP_ID inner join RELATIONEX_INDIVIDUAL individual2_ on members1_.MEMBER_ID=individual2_.id where individual2_.id=? limit ? ... [INFO] BUILD SUCCESS
Add the following lines to the test method to verify the removed relationship did not impact the inverse entity. We also add a delete of the initial owning entity.
log.debug("verifying inverse was not removed");
em.flush(); em.clear();
assertNotNull(em.find(Individual.class, individualB.getId()));
log.debug("removing initial owner");
em.remove(em.find(Group.class, group.getId()));
em.flush();
Rebuild the module and re-run the test method. Notice the check for the inverse still existing passes.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyUni ... -verifying inverse was not removed Hibernate: select individual0_.id as id44_0_, individual0_.name as name44_0_ from RELATIONEX_INDIVIDUAL individual0_ where individual0_.id=?
Notice the removal of one of the owning entities causes a search of the join table, the removal of the join table rows that are associated with the owning entity we want to delete, and then the removal of that entity.
-removing initial owner Hibernate: select group0_.id as id43_0_, group0_.name as name43_0_ from RELATIONEX_GROUP group0_ where group0_.id=? Hibernate: delete from RELATIONEX_GROUP_MEMBER where GROUP_ID=? Hibernate: delete from RELATIONEX_GROUP where id=? ... [INFO] BUILD SUCCESS
You have finished going through a many-to-many, uni-directional relationship. Like all many-to-many relationships, the two sides of the relationship must be linked through a join table. In this case the foreign keys to the entities were based on generated simple primary keys. In the next section we will change the relationship to be bi-directional so that we can navigate from either direction.
In the previous section we mapped a relationship only from the owning side. In this exercise we will map the many-to-many relationship from both the owning and inverse sides.
Put the following class in your src/main tree. This class provides an example of both the owning and inverse side of a many-to-many relationship because the design of the relationship is to originate and end with an entity class of the same type. The recursive nature of this example has nothing specifically to do with many-to-many relationships -- but it does make for an interesting example.
In this example, the "children" collection is the owning side and the "parent" collection is the inverse side.
package myorg.relex.many2many;
import java.util.HashSet;
import java.util.Set;
import java.util.UUID;
import javax.persistence.*;
/**
* This class provides an example of a many-to-many, bi-directional relationship that just
* happens to be recursive. Both ends of the relationship reference an instance of this class.
*/
@Entity
@Table(name="RELATIONEX_NODE")
public class Node {
@Id
@Column(length=36, nullable=false)
private String id;
@ManyToMany(cascade={CascadeType.PERSIST}, fetch=FetchType.LAZY)
@JoinTable(name="RELATIONEX_NODE_REL",
joinColumns=@JoinColumn(name="PARENT_ID"),
inverseJoinColumns=@JoinColumn(name="CHILD_ID"))
private Set<Node> children;
@ManyToMany(mappedBy="children", fetch=FetchType.EAGER)
private Set<Node> parents;
@Column(length=32, nullable=false)
private String name;
public Node() { id=UUID.randomUUID().toString(); }
public Node(String name) {
this();
this.name = name;
}
public Node(Node parent, String name) {
this();
this.name = name;
parent.getChildren().add(this);
getParents().add(parent);
}
public String getId() { return id; }
public String getName() { return name; }
public void setName(String name) {
this.name = name;
}
public Set<Node> getChildren() {
if (children==null) {
children = new HashSet<Node>();
}
return children;
}
public Set<Node> getParents() {
if (parents == null) {
parents = new HashSet<Node>();
}
return parents;
}
@Override
public int hashCode() {
return id==null?0:id.hashCode();
}
@Override
public boolean equals(Object obj) {
try {
if (this == obj) { return true; }
Node rhs = (Node)obj;
if (id==null) { return rhs.id==null; }
return id.equals(rhs.id);
} catch (Exception ex) { return false; }
}
}
Note that in this class design -- a convenience constructor is supplied that accepts a parent node, adds itself to that parent's children, and then adds the parent to this entity.
Add the new entity to the persistence unit.
<class>myorg.relex.many2many.Node</class>
Generate database schema for the new entity and its relationship. Notice how we only have a single entity class table and a join table that links rows from that table to one another. In a real application, we would probably want to add an additional CHECK constraint that the parent and child cannot be equal.
$ mvn clean process-test-classes; more target/classes/ddl/relationEx-createJPA.ddl ... create table RELATIONEX_NODE ( id varchar(36) not null, name varchar(32) not null, primary key (id) ); create table RELATIONEX_NODE_REL ( PARENT_ID varchar(36) not null, CHILD_ID varchar(36) not null, primary key (PARENT_ID, CHILD_ID) ); ... alter table RELATIONEX_NODE_REL add constraint FK7D1BC34C3F5FCCB4 foreign key (CHILD_ID) references RELATIONEX_NODE; alter table RELATIONEX_NODE_REL add constraint FK7D1BC34C57DC6666 foreign key (PARENT_ID) references RELATIONEX_NODE;
Add the following test method to your JUnit test case. This will create two entities and relate them through a business relationship of parent and child.
@Test
public void testManyToManyBi() {
log.info("*** testManyToManyBi ***");
log.debug("create instances");
Node one = new Node("one");
Node two = new Node(one,"two");
em.persist(one);
em.flush();
}
Build the module and run the test method. Notice how two rows are inserted into the entity table and then a row that relates them is added to the join table.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyBi ... -*** testManyToManyBi *** -create instances Hibernate: insert into RELATIONEX_NODE (name, id) values (?, ?) Hibernate: insert into RELATIONEX_NODE (name, id) values (?, ?) Hibernate: insert into RELATIONEX_NODE_REL (PARENT_ID, CHILD_ID) values (?, ?) ... [INFO] BUILD SUCCESS
Add the following lines to the test method. This will obtain new instances of the entities in our object tree and verify their contents. It also shows the impact of EAGER and LAZY fetching.
log.debug("getting new instances from owning side");
em.clear();
Node one2 = em.find(Node.class, one.getId());
assertNotNull("owning side not found", one2);
log.debug("checking owning side");
assertEquals("unexpected owning.name", one.getName(), one2.getName());
log.debug("checking parents");
assertEquals("unexpected parents.size", 0, one2.getParents().size());
log.debug("checking children");
assertEquals("unexpected children.size", 1, one2.getChildren().size());
assertEquals("unexpected child.name", two.getName(), one2.getChildren().iterator().next().getName());
Rebuild the module and re-run the test method. Notice that because of the EAGER fetch specification on parents, the join table and entity table is queried during the find() operation.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyBi ... -getting new instances from owning side Hibernate: select node0_.id as id45_1_, node0_.name as name45_1_, parents1_.CHILD_ID as CHILD2_45_3_, node2_.id as PARENT1_3_, node2_.id as id45_0_, node2_.name as name45_0_ from RELATIONEX_NODE node0_ left outer join RELATIONEX_NODE_REL parents1_ on node0_.id=parents1_.CHILD_ID left outer join RELATIONEX_NODE node2_ on parents1_.PARENT_ID=node2_.id where node0_.id=? -checking owning side -checking parents
Because of the LAZY fetch specification on children, the join table and child entities are not searched for until specifically requested.
-checking children Hibernate: select children0_.PARENT_ID as PARENT1_45_1_, children0_.CHILD_ID as CHILD2_1_, node1_.id as id45_0_, node1_.name as name45_0_ from RELATIONEX_NODE_REL children0_ inner join RELATIONEX_NODE node1_ on children0_.CHILD_ID=node1_.id where children0_.PARENT_ID=? Hibernate: select parents0_.CHILD_ID as CHILD2_45_1_, parents0_.PARENT_ID as PARENT1_1_, node1_.id as id45_0_, node1_.name as name45_0_ from RELATIONEX_NODE_REL parents0_ inner join RELATIONEX_NODE node1_ on parents0_.PARENT_ID=node1_.id where parents0_.CHILD_ID=? ... [INFO] BUILD SUCCESS
Add the following lines to the test method to add a few additional inverse entities. The new entities are registered with the owning side within the constructor. All are persisted during the subsequent call to persist() on the owning side because of the cascade=PERSIST definition on the owning side.
log.debug("adding more inverse instances");
Node twoB = new Node(one2, "twoB");
Node twoC = new Node(one2, "twoC");
em.persist(one2);
em.flush();
Rebuild the module and re-run the unit test. Notice the two inserts for the inverse entities and two inserts in the join table relating the new entities to the owning side. We now have one owning entity and three inverse entities related.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyBi ... -adding more inverse instances Hibernate: insert into RELATIONEX_NODE (name, id) values (?, ?) Hibernate: insert into RELATIONEX_NODE (name, id) values (?, ?) Hibernate: insert into RELATIONEX_NODE_REL (PARENT_ID, CHILD_ID) values (?, ?) Hibernate: insert into RELATIONEX_NODE_REL (PARENT_ID, CHILD_ID) values (?, ?) ... [INFO] BUILD SUCCESS
Add the following lines to the test method. This tests obtaining our object tree from the inverse side. This is a feature unique to bi-directional relationships.
log.debug("getting new instances from inverse side");
em.clear();
Node two2 = em.find(Node.class, two.getId());
assertNotNull("inverse node not found", two2);
log.debug("checking inverse side");
assertEquals("unexpected name", two.getName(), two2.getName());
log.debug("checking parents");
assertEquals("unexpected parents.size", 1, two2.getParents().size());
log.debug("checking children");
assertEquals("unexpected children.size", 0, two2.getChildren().size());
Rebuild the module and re-run the test method. Note the entity we call find() on this time has a parent and no children. Most of the work is done during the EAGER fetch on the parent relationship.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyBi ... -getting new instances from inverse side Hibernate: select node0_.id as id45_1_, node0_.name as name45_1_, parents1_.CHILD_ID as CHILD2_45_3_, node2_.id as PARENT1_3_, node2_.id as id45_0_, node2_.name as name45_0_ from RELATIONEX_NODE node0_ left outer join RELATIONEX_NODE_REL parents1_ on node0_.id=parents1_.CHILD_ID left outer join RELATIONEX_NODE node2_ on parents1_.PARENT_ID=node2_.id where node0_.id=? Hibernate: select parents0_.CHILD_ID as CHILD2_45_1_, parents0_.PARENT_ID as PARENT1_1_, node1_.id as id45_0_, node1_.name as name45_0_ from RELATIONEX_NODE_REL parents0_ inner join RELATIONEX_NODE node1_ on parents0_.PARENT_ID=node1_.id where parents0_.CHILD_ID=? -checking inverse side -checking parents -checking children Hibernate: select children0_.PARENT_ID as PARENT1_45_1_, children0_.CHILD_ID as CHILD2_1_, node1_.id as id45_0_, node1_.name as name45_0_ from RELATIONEX_NODE_REL children0_ inner join RELATIONEX_NODE node1_ on children0_.CHILD_ID=node1_.id where children0_.PARENT_ID=? ... [INFO] BUILD SUCCESS
Add the following lines to the test method to add an additional owning entity that will share a relation to one of the inverse entities. Since we didn't add a convenience method to add an existing child to a new parent, all of the calls must be made explicitly here. We need to add the child to the parent collection and add the parent to the child collection.
log.debug("adding owning entity");
Node oneB = new Node("oneB");
oneB.getChildren().add(two2);
two2.getParents().add(oneB);
em.persist(oneB);
em.flush();
Rebuild the module and re-run the test method. Notice how all we get is an insert for the new owning entity and a row in the join table relating the new entity to one of the existing ones.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyBi ... -adding owning entity Hibernate: insert into RELATIONEX_NODE (name, id) values (?, ?) Hibernate: insert into RELATIONEX_NODE_REL (PARENT_ID, CHILD_ID) values (?, ?) ... [INFO] BUILD SUCCESS
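A convenience method could have removed the need for the two explicit collection calls above. The sketch below is plain Java with the JPA annotations omitted; the addChild() method is hypothetical and not part of the lab's Node class. It keeps both sides of the bi-directional relationship consistent in a single call.

```java
import java.util.HashSet;
import java.util.Set;

// Simplified Node with a hypothetical convenience method for linking an
// existing child to an additional parent.
class Node {
    private final Set<Node> children = new HashSet<>();
    private final Set<Node> parents = new HashSet<>();
    public Set<Node> getChildren() { return children; }
    public Set<Node> getParents() { return parents; }
    // updates the owning side (children -- drives the join table writes)
    // and the inverse side (parents -- keeps the in-memory view consistent)
    public void addChild(Node child) {
        children.add(child);
        child.parents.add(this);
    }
}
```

With such a helper, the test body would reduce to a single addChild() call on the new owning entity before the persist().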
Add the following lines to the test method to verify the number of relationships that exist.
log.debug("checking relationships");
assertEquals("unexpected parents", 0,
em.createQuery("select count(p) from Node n, IN (n.parents) p where n=:node", Number.class)
.setParameter("node", one)
.getSingleResult().intValue());
assertEquals("unexpected parents", 2,
em.createQuery("select count(p) from Node n, IN (n.parents) p where n=:node", Number.class)
.setParameter("node", two)
.getSingleResult().intValue());
assertEquals("unexpected parents", 1,
em.createQuery("select count(p) from Node n, IN (n.parents) p where n=:node", Number.class)
.setParameter("node", twoB)
.getSingleResult().intValue());
assertEquals("unexpected parents", 1,
em.createQuery("select count(p) from Node n, IN (n.parents) p where n=:node", Number.class)
.setParameter("node", twoC)
.getSingleResult().intValue());
assertEquals("unexpected children", 3,
em.createQuery("select count(c) from Node n, IN (n.children) c where n=:node", Number.class)
.setParameter("node", one)
.getSingleResult().intValue());
assertEquals("unexpected children", 0,
em.createQuery("select count(c) from Node n, IN (n.children) c where n=:node", Number.class)
.setParameter("node", two)
.getSingleResult().intValue());
assertEquals("unexpected children", 1,
em.createQuery("select count(c) from Node n, IN (n.children) c where n=:node", Number.class)
.setParameter("node", oneB)
.getSingleResult().intValue());
Rebuild the module and re-run the unit test. This shows the assertions on the queries passing.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyBi ... -checking relationships Hibernate: select count(node2_.id) as col_0_0_ from RELATIONEX_NODE node0_ inner join RELATIONEX_NODE_REL parents1_ on node0_.id=parents1_.CHILD_ID inner join RELATIONEX_NODE node2_ on parents1_.PARENT_ID=node2_.id where node0_.id=? limit ? (x3 more) Hibernate: select count(node2_.id) as col_0_0_ from RELATIONEX_NODE node0_ inner join RELATIONEX_NODE_REL children1_ on node0_.id=children1_.PARENT_ID inner join RELATIONEX_NODE node2_ on children1_.CHILD_ID=node2_.id where node0_.id=? limit ? (x2 more) ... [INFO] BUILD SUCCESS
Add the following lines to the test method. This will remove a relationship between two of the nodes. Note the managed instance for the owning side had to be obtained from the persistence context so we could actually perform the removal.
log.debug("getting managed owning side");
assertNotNull(one = em.find(Node.class, one.getId()));
log.debug("removing relationship");
one.getChildren().remove(two);
two.getParents().remove(one);
em.flush();
assertEquals("unexpected children", 2,
em.createQuery("select count(c) from Node n, IN (n.children) c where n=:node", Number.class)
.setParameter("node", one)
.getSingleResult().intValue());
assertEquals("unexpected parents", 1,
em.createQuery("select count(p) from Node n, IN (n.parents) p where n=:node", Number.class)
.setParameter("node", two)
.getSingleResult().intValue());
Rebuild the module and re-run the test method. Notice that the provider appears to be loading the related entities prior to removing the relationship.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyBi ... -removing relationship Hibernate: select children0_.PARENT_ID as PARENT1_45_1_, children0_.CHILD_ID as CHILD2_1_, node1_.id as id45_0_, node1_.name as name45_0_ from RELATIONEX_NODE_REL children0_ inner join RELATIONEX_NODE node1_ on children0_.CHILD_ID=node1_.id where children0_.PARENT_ID=? Hibernate: select parents0_.CHILD_ID as CHILD2_45_1_, parents0_.PARENT_ID as PARENT1_1_, node1_.id as id45_0_, node1_.name as name45_0_ from RELATIONEX_NODE_REL parents0_ inner join RELATIONEX_NODE node1_ on parents0_.PARENT_ID=node1_.id where parents0_.CHILD_ID=? Hibernate: select parents0_.CHILD_ID as CHILD2_45_1_, parents0_.PARENT_ID as PARENT1_1_, node1_.id as id45_0_, node1_.name as name45_0_ from RELATIONEX_NODE_REL parents0_ inner join RELATIONEX_NODE node1_ on parents0_.PARENT_ID=node1_.id where parents0_.CHILD_ID=? Hibernate: delete from RELATIONEX_NODE_REL where PARENT_ID=? and CHILD_ID=? Hibernate: select count(node2_.id) as col_0_0_ ... [INFO] BUILD SUCCESS
Add the following lines to the test method to remove one of the owning entities.
log.debug("deleting owner");
em.remove(oneB);
em.flush();
assertEquals("unexpected parents", 0,
em.createQuery("select count(p) from Node n, IN (n.parents) p where n=:node", Number.class)
.setParameter("node", two)
.getSingleResult().intValue());
Rebuild the module and re-run the test method. Notice the provider removed owned relationships prior to removing the owning entity.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyBi ... -deleting owner Hibernate: delete from RELATIONEX_NODE_REL where PARENT_ID=? Hibernate: delete from RELATIONEX_NODE where id=? Hibernate: select count(node2_.id) as col_0_0_ from RELATIONEX_NODE node0_ inner join RELATIONEX_NODE_REL parents1_ on node0_.id=parents1_.CHILD_ID inner join RELATIONEX_NODE node2_ on parents1_.PARENT_ID=node2_.id where node0_.id=? limit ? ... [INFO] BUILD SUCCESS
Add the following lines to the test method. They will attempt to remove the inverse side of the relationship while a relationship still exists.
log.debug("deleting inverse");
assertNotNull(twoB = em.find(Node.class, twoB.getId()));
em.remove(twoB);
em.flush();
assertNull("inverse not deleted", em.find(Node.class, twoB.getId()));
Rebuild the module and re-run the test method. Notice the delete was never issued to the database. Worst case -- I would have expected a foreign key violation from the join table, but here the delete was never even flushed to the database.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyBi ... -deleting inverse ... Failed tests: testManyToManyBi(myorg.relex.Many2ManyTest): inverse not deleted ... [INFO] BUILD FAILURE
Change the test method to remove the relationship after attempting to remove the inverse side. This will help determine if the delete was waiting for the removal of the foreign key reference.
log.debug("deleting inverse");
assertNotNull(twoB = em.find(Node.class, twoB.getId()));
em.remove(twoB);
em.flush();
//assertNull("inverse not deleted", em.find(Node.class, twoB.getId()));
one.getChildren().remove(twoB);
em.flush();
assertNull("inverse not deleted", em.find(Node.class, twoB.getId()));
Rebuild the module and re-run the test method. Notice the relationship is removed, but nothing happens to the inverse side as a result.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyBi ... -deleting inverse Hibernate: delete from RELATIONEX_NODE_REL where PARENT_ID=? and CHILD_ID=? ... Failed tests: testManyToManyBi(myorg.relex.Many2ManyTest): inverse not deleted ... [INFO] BUILD FAILURE
Change the test method so the inverse entity is removed after the relationship is removed. At this point in time there are no longer any incoming references to the entity.
log.debug("deleting inverse");
assertNotNull(twoB = em.find(Node.class, twoB.getId()));
em.remove(twoB);
em.flush();
//assertNull("inverse not deleted", em.find(Node.class, twoB.getId()));
one.getChildren().remove(twoB);
em.flush();
//assertNull("inverse not deleted", em.find(Node.class, twoB.getId()));
em.remove(twoB);
em.flush();
assertNull("inverse not deleted", em.find(Node.class, twoB.getId()));
Rebuild the module and re-run the test method. Notice the removal of the relationship caused the removal of a specific row from the join table. The successful removal of the entity itself caused a delete from the entity table and of any relationships it owned from the join table. None of this worked, however, while a known relationship existed in the cache. That may have had something to do with the EAGER fetch setting for parent members.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.relex.Many2ManyTest#testManyToManyBi ... -deleting inverse Hibernate: delete from RELATIONEX_NODE_REL where PARENT_ID=? and CHILD_ID=? Hibernate: delete from RELATIONEX_NODE_REL where PARENT_ID=? Hibernate: delete from RELATIONEX_NODE where id=? Hibernate: select node0_.id as id45_1_, node0_.name as name45_1_, parents1_.CHILD_ID as CHILD2_45_3_, node2_.id as PARENT1_3_, node2_.id as id45_0_, node2_.name as name45_0_ from RELATIONEX_NODE node0_ left outer join RELATIONEX_NODE_REL parents1_ on node0_.id=parents1_.CHILD_ID left outer join RELATIONEX_NODE node2_ on parents1_.PARENT_ID=node2_.id where node0_.id=? ... [INFO] BUILD SUCCESS
You have finished going through an example of a many-to-many, bi-directional relationship. For fun, we used a recursive relationship in this case, where the same class was on both sides of the relationship and that class held both the owning and inverse sides of the relation so that it could navigate in either direction.
In this chapter we looked at many-to-many relationships. We stuck to simple primary keys since the details of using other compound primary key types are identical to what was covered in more detail earlier. The big difference here is that we could relate many entities to many other entities. As stated previously, this type of relationship is not that common because one tends to model properties of the relationship. In that case the many-to-many turns into a many-to-one/one-to-many pair with an association class in the middle.
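The association-class modeling mentioned above can be sketched as follows. The Member/Project/Membership names here are hypothetical and not part of this exercise's model (the MovieRole class in the upcoming query exercise appears to play a similar role); this is only an illustration of the mapping pattern, not a complete implementation.

```
import java.util.List;
import javax.persistence.*;

// Instead of Member <-many-to-many-> Project, the relationship itself
// becomes an entity that can carry its own properties (e.g., role).
@Entity
class Membership {
    @Id @GeneratedValue
    private int id;
    @ManyToOne
    private Member member;   // owning side of one many-to-one
    @ManyToOne
    private Project project; // owning side of the other many-to-one
    private String role;     // a property of the relationship itself
}

@Entity
class Member {
    @Id @GeneratedValue
    private int id;
    @OneToMany(mappedBy="member")
    private List<Membership> memberships; // inverse side
}

@Entity
class Project {
    @Id @GeneratedValue
    private int id;
    @OneToMany(mappedBy="project")
    private List<Membership> memberships; // inverse side
}
```

Each many-to-many pairing now maps to a Membership row, so deleting or updating a single relationship is an ordinary entity operation rather than a join-table manipulation.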
Table of Contents
To provide hands on experience
Creating JPA queries
Creating native SQL queries
Performing bulk database updates with JPA queries
Adding locks to JPA queries
At the completion of this exercise, the student will be able to
Create a JPAQL query
Create a SQL query
Set query parameters
Set paging controls
Register a named query
Define a SqlRowSetMapping
Perform a bulk database update using JPAQL and native SQL
Set the lock mode for a query
The primary purpose of this exercise is to provide coverage of the broader capabilities of JPA queries. Although some JPAQL features are exposed, it does not yet provide detailed coverage of the JPA query language or SQL. Detailed JPAQL coverage is expected to be a future enhancement to what is currently covered.
This exercise will start each test case with the following database model in place.
Add the following to your .m2/settings.xml file. This will allow you to resolve the exercise archetype and set a default database for the exercise.
<profiles> <profile> <id>webdev-repositories</id> <repositories> <repository> <id>webdev</id> <name>ejava webdev repository</name> <url>http://webdev.jhuep.com/~jcs/maven2</url> <releases> <enabled>true</enabled> <updatePolicy>never</updatePolicy> </releases> <snapshots> <enabled>false</enabled> </snapshots> </repository> <repository> <id>webdev-snapshot</id> <name>ejava webdev snapshot repository</name> <url>http://webdev.jhuep.com/~jcs/maven2-snapshot</url> <releases> <enabled>false</enabled> </releases> <snapshots> <enabled>true</enabled> <updatePolicy>daily</updatePolicy> </snapshots> </repository> </repositories> </profile> </profiles> <activeProfiles> <activeProfile>h2db</activeProfile> <!-- <activeProfile>h2srv</activeProfile> --> </activeProfiles>
Use the ejava.jpa:jpa-queryex-archetype to setup a new Maven project for this exercise. Activate the webdev-repositories profile (-Pwebdev-repositories) so that you can resolve the archetype off the Internet. The following should be run outside of the class example tree.
This archetype is specific to this exercise and is different from the archetype used by the previous exercises.
$ mvn archetype:generate -B -DarchetypeGroupId=info.ejava.examples.jpa -DarchetypeArtifactId=jpa-queryex-archetype -DarchetypeVersion=5.0.0-SNAPSHOT -DgroupId=myorg.queryex -DartifactId=queryEx -Pwebdev-repositories INFO] ---------------------------------------------------------------------------- [INFO] Using following parameters for creating project from Archetype: jpa-queryex-archetype:5.0.0-SNAPSHOT [INFO] ---------------------------------------------------------------------------- [INFO] Parameter: groupId, Value: myorg.queryex [INFO] Parameter: artifactId, Value: queryEx [INFO] Parameter: version, Value: 1.0-SNAPSHOT [INFO] Parameter: package, Value: myorg.queryex [INFO] Parameter: packageInPathFormat, Value: myorg/queryex [INFO] Parameter: version, Value: 1.0-SNAPSHOT [INFO] Parameter: package, Value: myorg.queryex [INFO] Parameter: groupId, Value: myorg.queryex [INFO] Parameter: artifactId, Value: queryEx [INFO] Project created from Archetype in dir: /Users/jim/proj/784/queryEx [INFO] ------------------------------------------------------------------------ [INFO] BUILD SUCCESS [INFO] ------------------------------------------------------------------------ [INFO] Total time: 2.797 s [INFO] Finished at: 2018-08-18T14:47:41-04:00 [INFO] Final Memory: 15M/223M [INFO] ------------------------------------------------------------------------
You should now have an instantiated template for a JPA project.
queryEx/ |-- pom.xml `-- src |-- main | `-- java | `-- myorg | `-- queryex | |-- Actor.java | |-- Director.java | |-- Movie.java | |-- MoviePK.java | |-- MovieRating.java | |-- MovieRole.java | |-- MovieRolePK.java | `-- Person.java `-- test |-- java | `-- myorg | `-- queryex | |-- BulkUpdateTest.java | |-- MovieFactory.java | |-- QueryBase.java | |-- QueryLocksTest.java | |-- QueryTest.java | `-- SQLQueryTest.java `-- resources |-- hibernate.properties |-- log4j.xml `-- META-INF `-- persistence.xml
Verify the instantiated template builds in your environment
Activate the h2db profile (and deactivate the h2srv profile) to use an embedded file as your database. This option does not require a server but is harder to inspect database state in between tests. Remember that the "!" character must be escaped ("\!") for bash shells.
queryEx> mvn clean test -Ph2db -P\!h2srv ... -HHH000401: using driver [org.h2.Driver] at URL [jdbc:h2:/Users/jim/proj/784/queryEx/target/h2db/ejava] ... [INFO] ------------------------------------------------------------------------ [INFO] BUILD SUCCESS [INFO] ------------------------------------------------------------------------
Start your database server
$ java -jar M2_REPO/com/h2database/h2/1.4.197/h2-1.4.197.jar
Activate the h2srv profile (and deactivate the h2db profile) to use a running H2 database server. This option provides more interaction with your database but does require the server to be running.
queryEx]$ mvn clean test -P\!h2db -Ph2srv ... -HHH000401: using driver [org.h2.Driver] at URL [jdbc:h2:tcp://127.0.0.1:9092/./h2db/ejava] ... [INFO] ------------------------------------------------------------------------ [INFO] BUILD SUCCESS [INFO] ------------------------------------------------------------------------
You may now import the instantiated template into Eclipse as an "Existing Maven Project"
In this chapter you will create simple JPAQL queries and work mostly with the outer JPA query framework.
In this section you will setup a JUnit test case to do work within this chapter.
Create a JUnit test case in src/test called QueryTest. Have this class extend QueryBase. Create an initial test method to verify the setup/teardown works correctly.
package myorg.queryex;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.junit.Test;
public class QueryTest extends QueryBase {
private static final Logger log = LoggerFactory.getLogger(QueryTest.class);
@Test
public void test() {}
}
Build the module and run the test case.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.QueryTest ... Tests run: 1, Failures: 0, Errors: 0, Skipped: 0 [INFO] ------------------------------------------------------------------------ [INFO] BUILD SUCCESS
You may remove the sample @Test at this time since we will be adding more below.
In this section you will execute a query that produces different types of results.
Create a query that will return multiple results.
Add the following test method to the test case to return multiple results from a JPAQL query.
@Test
public void testMulti() {
log.info("*** testMulti ***");
List<Movie> movies = em.createQuery(
"select m from Movie m " +
"order by title ASC", Movie.class)
.getResultList();
log.debug("result=" + movies);
assertEquals("unexpected number of movies", 7, movies.size());
}
Notice...
createQuery() accepts JPAQL syntax
supplying a type (Movie.class) for the second argument produces a type-safe assignment and eliminates the need to perform a cast
getResultList() - returns with zero to many elements
we have added an "order by" to get results in a specified order
Run the new test method. This should produce a query of the entity class' table and return multiple rows -- which we can display.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.QueryTest#testMulti ... Hibernate: select movie0_.ID as ID3_, movie0_.DIRECTOR_ID as DIRECTOR6_3_, movie0_.MINUTES as MINUTES3_, movie0_.RATING as RATING3_, movie0_.RELEASE_DATE as RELEASE4_3_, movie0_.TITLE as TITLE3_ from QUERYEX_MOVIE movie0_ -result=[Animal House (1978), Apollo 13 (1995), Diner (1982), Footloose (1984), Sleepers (1996), Tremors (1990), Wag The Dog (1997)] ... [INFO] BUILD SUCCESS
Create a query that will return a single result.
Add the following test method to the test case to return a single result from a JPAQL query.
@Test
public void testSingle() {
log.info("*** testSingle ***");
Movie movie = em.createQuery(
"select m from Movie m " +
"where m.title='Animal House'", Movie.class)
.getSingleResult();
log.debug("result=" + movie);
assertNotNull("no movie", movie);
}
Notice...
getSingleResult() - returns exactly one result
Run the new test method. This should produce a query of the entity class' table and return a single row -- which we can display.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.QueryTest#testSingle ... -*** testSingle *** Hibernate: select movie0_.ID as ID3_, movie0_.DIRECTOR_ID as DIRECTOR6_3_, movie0_.MINUTES as MINUTES3_, movie0_.RATING as RATING3_, movie0_.RELEASE_DATE as RELEASE4_3_, movie0_.TITLE as TITLE3_ from QUERYEX_MOVIE movie0_ where movie0_.TITLE='Animal House' limit ? -result=Animal House (1978) ... [INFO] BUILD SUCCESS
Create a query that fails in its attempt to locate a single result because no result is found.
Add the following test method to the test case.
@Test(expected=NoResultException.class)
public void testSingleNoResult() {
log.info("*** testSingleNoResult ***");
em.createQuery(
"select m from Movie m " +
"where m.title='Animal Hut'", Movie.class)
.getSingleResult();
log.debug("query did not produce expected exception");
}
Run the new test method. This should produce a query of the entity class' table, return no rows, and throw an expected javax.persistence.NoResultException
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.QueryTest#testSingleNoResult ... -*** testSingle *** Hibernate: select movie0_.ID as ID3_, movie0_.DIRECTOR_ID as DIRECTOR6_3_, movie0_.MINUTES as MINUTES3_, movie0_.RATING as RATING3_, movie0_.RELEASE_DATE as RELEASE4_3_, movie0_.TITLE as TITLE3_ from QUERYEX_MOVIE movie0_ where movie0_.TITLE='Animal Hut' limit ? ... [INFO] BUILD SUCCESS
Create a query that fails in its attempt to locate a single result because too many results are found
Add the following test method to the test case.
@Test(expected=NonUniqueResultException.class)
public void testSingleNonUniqueResult() {
log.info("*** testSingleNonUniqueResult ***");
em.createQuery(
"select m from Movie m " +
"where m.rating='R'", Movie.class)
.getSingleResult();
log.debug("query did not produce expected exception");
}
Run the new test method. This should produce a query of the entity class' table, locate multiple rows, and throw an expected javax.persistence.NonUniqueResultException
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.QueryTest#testSingleNonUniqueResult ... -*** testSingleNonUniqueResult *** Hibernate: select movie0_.ID as ID3_, movie0_.DIRECTOR_ID as DIRECTOR6_3_, movie0_.MINUTES as MINUTES3_, movie0_.RATING as RATING3_, movie0_.RELEASE_DATE as RELEASE4_3_, movie0_.TITLE as TITLE3_ from QUERYEX_MOVIE movie0_ where movie0_.RATING='R' limit ? ... [INFO] BUILD SUCCESS
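Since getSingleResult() throws for both zero and multiple rows, a common alternative is to call getResultList() and inspect its size. A minimal plain-Java sketch of that check (the findOne helper name is ours, and IllegalStateException stands in for javax.persistence.NonUniqueResultException to keep the example self-contained):

```java
import java.util.List;

public class SingleResultDemo {
    // Mimics getSingleResult()'s contract on top of a getResultList()-style
    // result, but returns null instead of throwing when nothing matched.
    static <T> T findOne(List<T> results) {
        if (results.isEmpty()) {
            return null;                      // getSingleResult() would throw NoResultException
        } else if (results.size() > 1) {
            throw new IllegalStateException(  // stand-in for NonUniqueResultException
                "non-unique result: " + results.size());
        }
        return results.get(0);
    }

    public static void main(String[] args) {
        System.out.println(findOne(List.of("Animal House"))); // exactly one row
        System.out.println(findOne(List.of()));               // no rows -> null, no exception
        try {
            findOne(List.of("Sleepers", "Diner"));            // many rows -> exception
        } catch (IllegalStateException ex) {
            System.out.println("caught: " + ex.getMessage());
        }
    }
}
```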
In this section you will pass in parameters to the JPAQL query.
Add the following test method to your existing test case to execute a query with a provided parameter argument.
@Test
public void testParameters() {
log.info("*** testParameters ***");
List<Movie> movies = em.createQuery(
"select m from Movie m " +
"where m.rating=:rating", Movie.class)
.setParameter("rating", MovieRating.R)
.getResultList();
log.debug("result=" + movies);
assertEquals("unexpected number of movies", 4, movies.size());
}
Notice the parameter name is passed in the setParameter() call and is prefaced within the query with the ":" character. A single parameter can appear in the query multiple times.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.QueryTest#testParameters ... -*** testParameters *** Hibernate: select movie0_.ID as ID3_, movie0_.DIRECTOR_ID as DIRECTOR6_3_, movie0_.MINUTES as MINUTES3_, movie0_.RATING as RATING3_, movie0_.RELEASE_DATE as RELEASE4_3_, movie0_.TITLE as TITLE3_ from QUERYEX_MOVIE movie0_ where movie0_.RATING=? -result=[Animal House (1978), Sleepers (1996), Wag The Dog (1997), Diner (1982)] ... [INFO] BUILD SUCCESS
Update the query specification to include items that match a date comparison. Be sure to update the expected count.
List<Movie> movies = em.createQuery(
"select m from Movie m " +
"where m.rating=:rating " +
"and m.releaseDate > :date", Movie.class)
.setParameter("rating", MovieRating.R)
.setParameter("date", new GregorianCalendar(1980, 0, 0).getTime(), TemporalType.DATE)
.getResultList();
log.debug("result=" + movies);
assertEquals("unexpected number of movies", 3, movies.size());
Re-run the test method. Notice the extra statement within the WHERE clause and fewer matches as a result of the updated query.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.QueryTest#testParameters ... -*** testParameters *** Hibernate: select movie0_.ID as ID3_, movie0_.DIRECTOR_ID as DIRECTOR6_3_, movie0_.MINUTES as MINUTES3_, movie0_.RATING as RATING3_, movie0_.RELEASE_DATE as RELEASE4_3_, movie0_.TITLE as TITLE3_ from QUERYEX_MOVIE movie0_ where movie0_.RATING=? and movie0_.RELEASE_DATE>? -result=[Sleepers (1996), Wag The Dog (1997), Diner (1982)] ... [INFO] BUILD SUCCESS
Update the query spec to include a LIKE search for text supplied by a parameter. Concatenate JPAQL wildcards to the beginning and ending of the supplied parameter.
List<Movie> movies = em.createQuery(
"select m from Movie m " +
"where m.rating=:rating " +
"and m.releaseDate > :date " +
"and m.title like concat(concat('%',:title),'%')", Movie.class)
.setParameter("rating", MovieRating.R)
.setParameter("date", new GregorianCalendar(1980, 0, 0).getTime(), TemporalType.DATE)
.setParameter("title", "The")
.getResultList();
log.debug("result=" + movies);
assertEquals("unexpected number of movies", 1, movies.size());
Re-run the test method with the additional search string.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.QueryTest#testParameters
...
-*** testParameters ***
Hibernate:
select
movie0_.ID as ID3_,
movie0_.DIRECTOR_ID as DIRECTOR6_3_,
movie0_.MINUTES as MINUTES3_,
movie0_.RATING as RATING3_,
movie0_.RELEASE_DATE as RELEASE4_3_,
movie0_.TITLE as TITLE3_
from
QUERYEX_MOVIE movie0_
where
movie0_.RATING=?
and movie0_.RELEASE_DATE>?
and (
movie0_.TITLE like (('%'||?)||'%')
)
-result=[Wag The Dog (1997)]
...
[INFO] BUILD SUCCESS
Update the query spec to make the text search case-insensitive
List<Movie> movies = em.createQuery(
"select m from Movie m " +
"where m.rating=:rating " +
"and m.releaseDate > :date " +
"and lower(m.title) like concat(concat('%',lower(:title)),'%')", Movie.class)
.setParameter("rating", MovieRating.R)
.setParameter("date", new GregorianCalendar(1980, 0, 0).getTime(), TemporalType.DATE)
.setParameter("title", "wag")
.getResultList();
log.debug("result=" + movies);
assertEquals("unexpected number of movies", 1, movies.size());
Re-run the test method with the case-insensitive search.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.QueryTest#testParameters
...
-*** testParameters ***
Hibernate:
select
movie0_.ID as ID3_,
movie0_.DIRECTOR_ID as DIRECTOR6_3_,
movie0_.MINUTES as MINUTES3_,
movie0_.RATING as RATING3_,
movie0_.RELEASE_DATE as RELEASE4_3_,
movie0_.TITLE as TITLE3_
from
QUERYEX_MOVIE movie0_
where
movie0_.RATING=?
and movie0_.RELEASE_DATE>?
and (
lower(movie0_.TITLE) like (('%'||lower(?))||'%')
)
-result=[Wag The Dog (1997)]
...
[INFO] BUILD SUCCESS
In this section you will control the amount of rows returned by using paging parameters.
Add the following test method to the existing test case to demonstrate paging capabilities.
@Test
public void testPaging() {
log.info("*** testPaging ***");
List<Movie> movies = new LinkedList<Movie>();
Set up the constant portion of the query up front.
TypedQuery<Movie> query = em.createQuery(
"select m from Movie m " +
"order by title", Movie.class)
.setMaxResults(3);
Loop through each page until an empty page is returned.
List<Movie> page=null;
int offset=0;
do {
page = query.setFirstResult(offset).getResultList();
log.debug("page=" + page);
movies.addAll(page);
offset += page.size();
log.debug("page.size=" + page.size() + ", offset=" + offset);
} while (page.size() > 0);
Evaluate the count of rows returned.
log.debug("result=" + movies);
assertEquals("unexpected number of movies", 7, movies.size());
}
It is recommended that you provide an "order by" with a consistent ordering when working with paging, to ensure each follow-on page uses the same sort as the prior page. Properties like "createTime" are good default choices when present.
Run the new test method to demonstrate paging.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.QueryTest#testPaging ... -*** testPaging *** Hibernate: select movie0_.ID as ID3_, movie0_.DIRECTOR_ID as DIRECTOR6_3_, movie0_.MINUTES as MINUTES3_, movie0_.RATING as RATING3_, movie0_.RELEASE_DATE as RELEASE4_3_, movie0_.TITLE as TITLE3_ from QUERYEX_MOVIE movie0_ order by movie0_.TITLE limit ? -page=[Animal House (1978), Apollo 13 (1995), Diner (1982)] -page.size=3, offset=3
The first page finishes with 3 rows. The row count is added to the offset for the next query.
Hibernate: select movie0_.ID as ID3_, movie0_.DIRECTOR_ID as DIRECTOR6_3_, movie0_.MINUTES as MINUTES3_, movie0_.RATING as RATING3_, movie0_.RELEASE_DATE as RELEASE4_3_, movie0_.TITLE as TITLE3_ from QUERYEX_MOVIE movie0_ order by movie0_.TITLE limit ? offset ? -page=[Footloose (1984), Sleepers (1996), Tremors (1990)] -page.size=3, offset=6
The second page finishes with 3 rows. The row count is added to the offset for the next query.
Hibernate: select movie0_.ID as ID3_, movie0_.DIRECTOR_ID as DIRECTOR6_3_, movie0_.MINUTES as MINUTES3_, movie0_.RATING as RATING3_, movie0_.RELEASE_DATE as RELEASE4_3_, movie0_.TITLE as TITLE3_ from QUERYEX_MOVIE movie0_ order by movie0_.TITLE limit ? offset ? -page=[Wag The Dog (1997)] -page.size=1, offset=7
The third page finishes with 1 row. The row count is added to the offset for the next query.
Hibernate: select movie0_.ID as ID3_, movie0_.DIRECTOR_ID as DIRECTOR6_3_, movie0_.MINUTES as MINUTES3_, movie0_.RATING as RATING3_, movie0_.RELEASE_DATE as RELEASE4_3_, movie0_.TITLE as TITLE3_ from QUERYEX_MOVIE movie0_ order by movie0_.TITLE limit ? offset ? -page=[] -page.size=0, offset=7
The fourth page finishes with 0 rows. This signals the loop to complete. The result is printed.
-result=[Animal House (1978), Apollo 13 (1995), Diner (1982), Footloose (1984), Sleepers (1996), Tremors (1990), Wag The Dog (1997)] ... [INFO] BUILD SUCCESS
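The paging loop above generalizes to any source of pages. A minimal plain-Java sketch (the fetchAll helper and its BiFunction page source are ours, standing in for query.setFirstResult(offset).setMaxResults(max).getResultList()):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BiFunction;

public class PagingDemo {
    // Accumulates all rows by repeatedly fetching pages of pageSize rows,
    // advancing the offset by the number of rows actually returned,
    // until an empty page signals the end of the result set.
    static <T> List<T> fetchAll(BiFunction<Integer, Integer, List<T>> pageSource, int pageSize) {
        List<T> all = new ArrayList<>();
        List<T> page;
        int offset = 0;
        do {
            page = pageSource.apply(offset, pageSize); // one "query" per page
            all.addAll(page);
            offset += page.size();                     // advance by rows returned
        } while (!page.isEmpty());                     // empty page ends the loop
        return all;
    }

    public static void main(String[] args) {
        // Simulate the 7-row, ordered movie table with an in-memory list.
        List<String> table = List.of("Animal House", "Apollo 13", "Diner",
                "Footloose", "Sleepers", "Tremors", "Wag The Dog");
        List<String> result = fetchAll(
            (offset, max) -> table.subList(offset, Math.min(offset + max, table.size())),
            3);
        System.out.println(result.size()); // 7 -- all pages accumulated
    }
}
```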
Use a named query to register re-usable queries.
Observe the @NamedQuery defined within the Movie entity.
@Entity
@Table(name="QUERYEX_MOVIE")
@NamedQueries(value = {
@NamedQuery(name="Movie.findByTitle", query=
"select m from Movie m " +
"where lower(m.title) like concat(concat('%',lower(:title)),'%')")
})
public class Movie implements Comparable<Movie>{
    ...
}
Add the following test method to your existing test case to demonstrate calling a named query.
@Test
public void testNamedQuery() {
log.info("*** testNamedQuery ***");
Movie movie = em.createNamedQuery("Movie.findByTitle", Movie.class)
.setParameter("title", "wag")
.getSingleResult();
log.debug("result=" + movie);
assertNotNull("no movie", movie);
}
Run the new test method to show the execution of the named query with a parameter supplied at runtime.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.QueryTest#testNamedQuery
...
-*** testNamedQuery ***
Hibernate:
select
movie0_.ID as ID3_,
movie0_.DIRECTOR_ID as DIRECTOR6_3_,
movie0_.MINUTES as MINUTES3_,
movie0_.RATING as RATING3_,
movie0_.RELEASE_DATE as RELEASE4_3_,
movie0_.TITLE as TITLE3_
from
QUERYEX_MOVIE movie0_
where
lower(movie0_.TITLE) like (('%'||lower(?))||'%') limit ?
-result=Wag The Dog (1997)
...
Retrieving an entity by its property values and having that entity be managed is a powerful capability provided by JPAQL queries. However, there are times when retrieving a simple value -- rather than the complete entity -- is a better solution.
In this section we will form a query to return a list of values from a single entity property that has a known type. This allows the query result to be placed into a convenient List for processing.
Add the following test method to demonstrate querying an entity model and a collection of values.
@Test
public void testValueQuery() {
log.info("*** testValueQuery ***");
List<String> titles = em.createQuery(
"select m.title from Movie m " +
"order by title ASC", String.class)
.getResultList();
for (String title : titles) {
log.debug(title);
}
assertEquals("unexpected number of titles", 7, titles.size());
}
Notice...
The data type returned in the list is declared in the query
The result is returned as a simple list of values
Run the new test method and observe that only the requested value from the entity is returned and not the entire entity.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.QueryTest#testValueQuery ... -*** testValueQuery *** Hibernate: select movie0_.TITLE as col_0_0_ from QUERYEX_MOVIE movie0_ order by movie0_.TITLE ASC -Animal House -Apollo 13 -Diner -Footloose -Sleepers -Tremors -Wag The Dog ... [INFO] BUILD SUCCESS
In the previous section we performed a query on a single property and the result returned a list. From that list we could use its size to determine how many entities matched the criteria. If that were our only purpose, that would be inefficient. Let's delegate the counting to the database and simply return the result.
Add the following test method to demonstrate querying an entity model and returning a single function result.
@Test
public void testResultValueQuery() {
log.info("*** testResultValueQuery ***");
int titleCount = em.createQuery(
"select count(m) from Movie m", Number.class)
.getSingleResult().intValue();
log.debug("titleCount=" + titleCount);
assertEquals("unexpected number of titles", 7, titleCount);
}
Notice...
The query returns the result of a JPAQL function
Adding the return type to the query function allows for type-safe usage of the result
The value can be retrieved using getSingleResult()
Run the new test method and observe the query produced.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.QueryTest#testResultValueQuery ... -*** testResultValueQuery *** Hibernate: select count(movie0_.ID) as col_0_0_ from QUERYEX_MOVIE movie0_ limit ? -titleCount=7 ... [INFO] BUILD SUCCESS
Notice how the count() of rows is calculated in the database and only the result is returned.
In this section you will query for multiple values from the entity. Results for multiple values are returned in a Java array. When the selected properties have different types, the array is an Object[].
Add the following test method to demonstrate querying an entity model and returning multiple values of different types.
@Test
public void testMultiValueQuery() {
log.info("*** testMultiValueQuery ***");
List<Object[]> results = em.createQuery(
"select m.title, m.releaseDate from Movie m " +
"order by title ASC", Object[].class)
.getResultList();
for (Object[] result : results) {
String title = (String)result[0];
Date releaseDate = (Date)result[1];
log.debug(String.format("%s (%s)", title, releaseDate));
}
assertEquals("unexpected number of results", 7, results.size());
}
Notice...
The query requests multiple values to be returned
The query is declared to return an Object[]
The query returns a List with each element in the List containing an Object[]
The elements of the Object[] are in the order expressed by the query
Run the new test method and observe the database query produced.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.QueryTest#testMultiValueQuery ... -*** testMultiValueQuery *** Hibernate: select movie0_.TITLE as col_0_0_, movie0_.RELEASE_DATE as col_1_0_ from QUERYEX_MOVIE movie0_ order by movie0_.TITLE ASC -Animal House (1978-06-01) -Apollo 13 (1995-06-30) -Diner (1982-04-02) -Footloose (1984-02-17) -Sleepers (1996-10-18) -Tremors (1990-01-19) -Wag The Dog (1997-12-25) ... [INFO] BUILD SUCCESS
Notice how the database query resulted in a request for multiple values of different type and the provider made these values available to the application using their assigned type.
Using Object[] arrays works, but it can lead to errors and less-than-ideal query result handling. In this section you will encapsulate each row of values returned from the query in an instance of a result class. The result class provides type-safe access to the returned values as well as any additional functionality we wish to assign.
Add the following result class as a static nested class within your test case. Notice it contains an attribute for each value we expect in the query response and contains a constructor that will process them in a specific order.
private static class MovieRelease {
public final String title;
public final Date releaseDate;
@SuppressWarnings("unused")
public MovieRelease(String title, Date releaseDate) {
this.title = title;
this.releaseDate = releaseDate;
}
}
Add the following test method to your test case. This test method will issue a similar query as before -- except this time it supplies a result class constructor expression to handle the returned values.
@Test
public void testResultClass() {
log.info("*** testResultClass ***");
String query = String.format("select new %s(m.title, m.releaseDate) " +
"from Movie m order by title ASC",
MovieRelease.class.getName());
List<MovieRelease> results = em.createQuery(query, MovieRelease.class)
.getResultList();
for (MovieRelease movie: results) {
log.debug(String.format("%s (%s)", movie.title, movie.releaseDate));
}
assertEquals("unexpected number of results", 7, results.size());
}
Notice each row of the query result creates an instance of our result class, passing the values into the constructor in a particular order. Our result class is not an entity and will not be managed within the persistence context.
Run the new test method and observe the JPAQL issued to the entity manager contains a constructor specification for the result class and query values desired. The resultant database query is identical to the one produced in the Object[] array case. The only difference is the provider handles the Object[] array processing for us.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.QueryTest#testResultClass ... -select new myorg.queryex.QueryTest$MovieRelease(m.title, m.releaseDate) from Movie m order by title ASC Hibernate: select movie0_.TITLE as col_0_0_, movie0_.RELEASE_DATE as col_1_0_ from QUERYEX_MOVIE movie0_ order by movie0_.TITLE ASC -Animal House (1978-06-01) -Apollo 13 (1995-06-30) -Diner (1982-04-02) -Footloose (1984-02-17) -Sleepers (1996-10-18) -Tremors (1990-01-19) -Wag The Dog (1997-12-25) ... [INFO] BUILD SUCCESS
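What the constructor expression spares us can be seen by doing the same mapping by hand. A plain-Java sketch with simulated Object[] rows (dates kept as Strings so the example stays self-contained):

```java
import java.util.List;
import java.util.stream.Collectors;

public class ResultClassDemo {
    // Same shape as the MovieRelease result class in the test case.
    static class MovieRelease {
        final String title;
        final String releaseDate;
        MovieRelease(String title, String releaseDate) {
            this.title = title;
            this.releaseDate = releaseDate;
        }
    }

    public static void main(String[] args) {
        // Simulated provider result: each row is an Object[] in select-clause
        // order, like the List<Object[]> from testMultiValueQuery.
        List<Object[]> rows = List.of(
            new Object[]{"Animal House", "1978-06-01"},
            new Object[]{"Wag The Dog", "1997-12-25"});

        // The manual equivalent of the constructor expression: one
        // result-class instance per row, values passed positionally.
        List<MovieRelease> releases = rows.stream()
            .map(r -> new MovieRelease((String) r[0], (String) r[1]))
            .collect(Collectors.toList());

        System.out.println(releases.size());       // 2
        System.out.println(releases.get(0).title); // Animal House
    }
}
```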
In the previous chapter we formed queries based on JPAQL -- which are based on entity and property names and the relationships defined within the entity class structure. In a real application, there is also a need to form queries that go outside the boundaries of the entity class model -- but should still be pushed to the database to be performed. JPA provides us an escape hatch to execute raw SQL queries. Unlike JPAQL, SQL queries need not have a direct relation to the JPA entity model. We will start with something simple and then move to more complex usage of the SQL query capability within JPA.
In this section you will setup a JUnit test case to do work within this chapter.
Create a JUnit test case in src/test called SQLQueryTest. Have this class extend QueryBase. Create an initial test method to verify the setup/teardown works correctly.
package myorg.queryex;
import static org.junit.Assert.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.junit.Test;
public class SQLQueryTest extends QueryBase {
private static final Logger log = LoggerFactory.getLogger(SQLQueryTest.class);
@Test
public void test() {}
}
Build the module and run the test case.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.SQLQueryTest ... Tests run: 1, Failures: 0, Errors: 0, Skipped: 0 [INFO] ------------------------------------------------------------------------ [INFO] BUILD SUCCESS
You may remove the sample @Test at this time since we will be adding more tests below.
In this section you will create a simple SQL Query using the entity manager.
Add the following test method to your JUnit test case. This test method will form a native SQL query, have it executed by the entity manager, and be provided with the result values. There is no need to open/close SQL Connections, create/close SQL Statements, or work with SQL ResultSets. The results are provided in a List from getResultList() or as a value from getSingleResult() that we must cast to the appropriate type.
@Test
public void testSQLQuery() {
log.info("*** testSQLQuery ***");
@SuppressWarnings("unchecked")
List<String> titles = em.createNativeQuery(
"select title from queryex_movie " +
"order by title ASC").getResultList();
for (String title : titles) {
log.debug(title);
}
assertEquals("unexpected number of titles", 7, titles.size());
}
Notice the query expressed is in terms of tables and columns in the database schema and not JPA entity and property names.
Run the new test method and observe the database query that resulted.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.SQLQueryTest#testSQLQuery ... -*** testSQLQuery *** Hibernate: select title from queryex_movie order by title ASC -Animal House -Apollo 13 -Diner -Footloose -Sleepers -Tremors -Wag The Dog ... [INFO] BUILD SUCCESS
Notice the query issued to the database is exactly what you entered.
You have completed issuing a native SQL query using the entity manager. Native SQL queries support many of the same features as JPAQL queries:
Single/Multiple Results
Parameters -- although the JPA spec limits this to ordinal parameters. Using named parameters for native SQL queries is not portable.
Named Queries
As you will see in the follow-on sections -- we can also use SQL queries to return managed entities. This will allow you to tweak the SQL used within a query and not give up receiving a managed entity in what is returned.
Since JPAQL queries can only be expressed in terms of the entity model, there may be times when a more complicated native SQL query is required to obtain the entities you wish to work with. In this section you will use the simplest form of this capability -- where nothing additional is needed except the specification of the entity. We can use this form when the result returns a single entity class.
Add the following test method to your existing JUnit test case. We will form a query that takes advantage of the knowledge that DIRECTOR and PERSON have a primary key join relationship and share the same primary key value. The select provided skips an unnecessary join of the intermediate DIRECTOR table and performs a join from MOVIE straight to the PERSON table -- where a column value is being evaluated.
@Test
public void testSQLResultMapping() {
log.info("*** testSQLResultMapping ***");
@SuppressWarnings("unchecked")
List<Movie> movies = em.createNativeQuery(
"select m.* from queryex_movie m " +
"join queryex_person p on p.id = m.director_id " +
"where p.first_name = 'Ron' " +
"order by title ASC", Movie.class).getResultList();
log.debug("result=" + movies);
for (Movie movie: movies) {
log.debug("em.contains(" + movie + ")=" + em.contains(movie));
assertTrue(movie + " not managed", em.contains(movie));
}
assertEquals("unexpected number of movies", 2, movies.size());
}
Notice...
The native SQL query is free to do whatever it takes to identify the entity of interest
The native SQL query is required to supply all columns required by the entity
The returned entity is managed
Note -- the example used above is not outside the capability of a JPAQL query. We are using it as a decent example showing some SQL complexity.
Run the new test method.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.SQLQueryTest#testSQLResultMapping -*** testSQLResultMapping *** Hibernate: select m.* from queryex_movie m join queryex_person p on p.id = m.director_id where p.first_name = 'Ron' order by title ASC -result=[Apollo 13 (1995), Tremors (1990)] -em.contains(Apollo 13 (1995))=true -em.contains(Tremors (1990))=true ... [INFO] BUILD SUCCESS
Notice the query executes whatever is in the native SQL and returns the columns required by the default entity mapping. The returned entity instances are managed by the persistence context -- any changes to these entities will cause a database update.
Add the following lines to access an object related to the returned entity.
log.debug("checking unmapped entity name");
assertEquals("unexpected director first name",
    "Ron", movies.get(0).getDirector().getPerson().getFirstName());
Notice that we are going to be traversing a few relationships during the above call. These entities will have to be loaded if they are not yet loaded in the persistence context.
Re-run the unit test and notice the extra calls to the database to retrieve the related entities on demand since they were not previously loaded into the persistence context during the previous query.
$ mvn clean test -Ph2db -Dtest=myorg.queryex.SQLQueryTest#testSQLResultMapping ... -result=[Apollo 13 (1995), Tremors (1990)] -em.contains(Apollo 13 (1995))=true -em.contains(Tremors (1990))=true -checking unmapped entity name Hibernate: select director0_.PERSON_ID as PERSON1_2_0_ from QUERYEX_DIRECTOR director0_ where director0_.PERSON_ID=? Hibernate: select person0_.ID as ID0_0_, person0_.BIRTH_DATE as BIRTH2_0_0_, person0_.FIRST_NAME as FIRST3_0_0_, person0_.LAST_NAME as LAST4_0_0_ from QUERYEX_PERSON person0_ where person0_.ID=? ... [INFO] BUILD SUCCESS
You have finished a quick look at loading a single entity using a SQL query. In the next section we will look at eagerly fetching more of the object graph into the persistence context during the initial query.
In the previous section you mapped a native SQL query to a single entity and had to do very little work besides specifying the targeted entity and supplying a query that would result in the targeted entity being populated and managed on return. In this section we will expand our requirements to loading a graph of related entity types during a single query. To do that we must leverage a @SqlResultSetMapping.
Define the following SqlResultSetMapping on the Movie entity. This will define a SqlResultSetMapping called "Movie.movieMapping" that will include Movie, Director, and Person entities. All will use their default column mapping.
@Entity
...
@SqlResultSetMappings({
@SqlResultSetMapping(name="Movie.movieMapping", entities={
@EntityResult(entityClass=Movie.class),
@EntityResult(entityClass=Director.class),
@EntityResult(entityClass=Person.class)
})
})
public class Movie implements Comparable<Movie>{
Add the following test method to your existing test case.
@Test
public void testSQLMultiResultMapping() {
log.info("*** testSQLMultiResultMapping ***");
@SuppressWarnings("unchecked")
List<Object[]> results = em.createNativeQuery(
"select * from queryex_movie m " +
"join queryex_director dir on dir.person_id = m.director_id " +
"join queryex_person p on p.id = dir.person_id " +
"where p.first_name = 'Ron' " +
"order by title ASC", "Movie.movieMapping").getResultList();
log.debug("query returned " + results.size() + " results");
for (Object[] result: results) {
Movie movie = (Movie)result[0];
Director director = (Director) result[1];
Person person = (Person)result[2];
log.debug("em.contains(" + movie + ")=" + em.contains(movie));
log.debug("em.contains(" + director + ")=" + em.contains(director));
log.debug("em.contains(" + person + ")=" + em.contains(person));
assertTrue(movie + " not managed", em.contains(movie));
assertTrue(director + " not managed", em.contains(director));
assertTrue(person + " not managed", em.contains(person));
}
assertEquals("unexpected number of movies", 2, results.size());
}
Notice...
We have replaced the specification of the Movie.class entity with a SqlResultSetMapping that includes Movie.class and several other entities
The query now returns a List of Object[] with each element in the Object[] containing an entity instance we specified in the @SqlResultSetMapping
What we get returned is a set of entities that are now managed by the container. The query is the same as in the previous section -- we are now instructing the provider to map those results to entity instances and have them managed.
Run the test method and observe the result.
$ mvn clean test -Ph2db -Dtest=myorg.queryex.SQLQueryTest#testSQLMultiResultMapping ... -*** testSQLMultiResultMapping *** Hibernate: select * from queryex_movie m join queryex_director dir on dir.person_id = m.director_id join queryex_person p on p.id = dir.person_id where p.first_name = 'Ron' order by title ASC
Up to here, we have what we expected. However in the next statements we see the PERSON being re-fetched for each row.
Hibernate: select person0_.ID as ID0_0_, person0_.BIRTH_DATE as BIRTH2_0_0_, person0_.FIRST_NAME as FIRST3_0_0_, person0_.LAST_NAME as LAST4_0_0_ from QUERYEX_PERSON person0_ where person0_.ID=? Hibernate: select person0_.ID as ID0_0_, person0_.BIRTH_DATE as BIRTH2_0_0_, person0_.FIRST_NAME as FIRST3_0_0_, person0_.LAST_NAME as LAST4_0_0_ from QUERYEX_PERSON person0_ where person0_.ID=? -query returned 2 results -em.contains(Apollo 13 (1995))=true -em.contains(Ron Howard)=true -em.contains(Ron Howard)=true -em.contains(Tremors (1990))=true -em.contains(Ron Underwood)=true -em.contains(Ron Underwood)=true ... [INFO] BUILD SUCCESS
Re-write the query to enumerate each column being selected in an attempt to expose the issue with the above query.
List<Object[]> results = em.createNativeQuery(
"select " +
"m.id, m.minutes, m.rating, m.release_date, m.title, m.director_id, " +
"dir.person_id, " +
"p.id, p.first_name, p.last_name, p.birth_date " +
"from queryex_movie m " +
"join queryex_director dir on dir.person_id = m.director_id " +
"join queryex_person p on p.id = dir.person_id " +
"where p.first_name = 'Ron' " +
"order by title ASC", "Movie.movieMapping").getResultList();
Notice there are two columns labeled "ID"; MOVIE.ID and PERSON.ID.
If you re-run the updated query you will observe the same results as before except for the explicit fields in the select. However, with the explicit naming of the fields -- you can now see the overlap between m.id and p.id.
$ mvn clean test -Ph2db -Dtest=myorg.queryex.SQLQueryTest#testSQLMultiResultMapping1 -*** testSQLMultiResultMapping *** Hibernate: select m.id, m.minutes, m.rating, m.release_date, m.title, m.director_id, dir.person_id, p.id, p.first_name, p.last_name, p.birth_date from queryex_movie m join queryex_director dir on dir.person_id = m.director_id join queryex_person p on p.id = dir.person_id where p.first_name = 'Ron' order by title ASC Hibernate: select person0_.ID as ID0_0_, ... Hibernate: select person0_.ID as ID0_0_, ... -query returned 2 results -em.contains(Apollo 13 (1995))=true -em.contains(Ron Howard)=true -em.contains(Ron Howard)=true -em.contains(Tremors (1990))=true -em.contains(Ron Underwood)=true -em.contains(Ron Underwood)=true [INFO] BUILD SUCCESS
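The trouble with two columns sharing the label "ID" can be pictured with a small plain-Java sketch (no database or JDBC driver involved; the labels and values are made up). The exact behavior of a duplicate label depends on the JDBC driver, but the ambiguity is the same: a reader keyed by column label cannot address both values, while an alias makes each addressable.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ColumnAliasSketch {

    // naive label-keyed row reader: duplicate labels silently collapse
    public static Map<String, Object> rowByLabel(String[] labels, Object[] values) {
        Map<String, Object> row = new LinkedHashMap<>();
        for (int i = 0; i < labels.length; i++) {
            row.put(labels[i], values[i]); // a repeated label overwrites the earlier value
        }
        return row;
    }

    public static void main(String[] args) {
        // both MOVIE.ID and PERSON.ID come back labeled "ID"
        Map<String, Object> clash = rowByLabel(
            new String[]{"ID", "TITLE", "ID"},
            new Object[]{"movie-1", "Tremors", 100});
        System.out.println("ambiguous row: " + clash); // only one ID survives

        // aliasing PERSON.ID as P_ID keeps both values addressable
        Map<String, Object> aliased = rowByLabel(
            new String[]{"ID", "TITLE", "P_ID"},
            new Object[]{"movie-1", "Tremors", 100});
        System.out.println("aliased row:   " + aliased);
    }
}
```

This is why the provider fell back to a per-row re-fetch of PERSON above, and why the @FieldResult alias in the next step resolves it.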
Leverage the @FieldResult specification within the @EntityResult fields attribute to be able to specify an alias for the "p.id" result as "p_id" to distinguish it from "m.id". The other columns for PERSON are fine with their default but re-specifying them seems to be required once one of the columns is specified.
@Entity
...
@SqlResultSetMappings({
...
@SqlResultSetMapping(name="Movie.movieMapping2", entities={
@EntityResult(entityClass=Movie.class),
@EntityResult(entityClass=Director.class),
@EntityResult(entityClass=Person.class, fields={
@FieldResult(name="id", column="p_id"),
@FieldResult(name="firstName", column="first_name"),
@FieldResult(name="lastName", column="last_name"),
@FieldResult(name="birthDate", column="birth_date")
})
})
})
public class Movie implements Comparable<Movie>{
Update the SQL query with the alias defined above. Make sure you update the @SqlResultSetMapping name in the query to match what you either added or updated above.
List<Object[]> results = em.createNativeQuery(
"select " +
"m.id, m.minutes, m.rating, m.release_date, m.title, m.director_id, " +
"dir.person_id, " +
"p.id as p_id, " + //NOTICE: the alias for PERSON.ID
"p.first_name, p.last_name, p.birth_date " +
"from queryex_movie m " +
"join queryex_director dir on dir.person_id = m.director_id " +
"join queryex_person p on p.id = dir.person_id " +
"where p.first_name = 'Ron' " +
"order by title ASC", "Movie.movieMapping2").getResultList();
Re-run the query with the alias in place and observe that everything is resolved within the results of the first query.
$ mvn clean test -Ph2db -Dtest=myorg.queryex.SQLQueryTest#testSQLMultiResultMapping2 ... -*** testSQLMultiResultMapping *** Hibernate: select m.id, m.minutes, m.rating, m.release_date, m.title, m.director_id, dir.person_id, p.id as p_id, p.first_name, p.last_name, p.birth_date from queryex_movie m join queryex_director dir on dir.person_id = m.director_id join queryex_person p on p.id = dir.person_id where p.first_name = 'Ron' order by title ASC -query returned 2 results -em.contains(Apollo 13 (1995))=true -em.contains(Ron Howard)=true -em.contains(Ron Howard)=true -em.contains(Tremors (1990))=true -em.contains(Ron Underwood)=true -em.contains(Ron Underwood)=true [INFO] BUILD SUCCESS
Add the following statements to the test method to verify that all related objects have been eagerly fetched/resolved within the initial query.
log.debug("checking unmapped entity name");
assertEquals("unexpected director first name",
"Ron", ((Movie)((Object[])results.get(0))[0]).getDirector().getPerson().getFirstName());
Re-run the test method to show that no additional queries are issued to the database when navigating the relationships within the object graph.
$ mvn clean test -Ph2db -Dtest=myorg.queryex.SQLQueryTest#testSQLMultiResultMapping2 -query returned 2 results -em.contains(Apollo 13 (1995))=true -em.contains(Ron Howard)=true -em.contains(Ron Howard)=true -em.contains(Tremors (1990))=true -em.contains(Ron Underwood)=true -em.contains(Ron Underwood)=true -checking unmapped entity name ... [INFO] BUILD SUCCESS
You have now covered the main points of querying for entities using native SQL.
In this chapter you used native SQL queries to get both values and managed entities from the database. Although much of the same can be achieved through JPAQL, and native SQL is not portable across database dialects, it does provide open-ended optimization and specialization of your database queries. While JPAQL is restricted to queries that are consistent with the entity model, the native SQL extension allows you to form queries that bypass unnecessary joins or navigate both directions of a uni-directional relationship.
Clearly the power of customization is not always worth the extra effort, but definitely keep this capability in mind when you find yourself using native SQL to locate primary key values and then following up that query with a JPAQL query or entity manager command to get a managed object using that primary key.
We leveraged Java Strings and class @Annotations to express the native SQL within this exercise. However, if you find the native SQL being written is too platform-specific and you need the flexibility to run against different database platforms, it is recommended the native SQL be placed in @NamedNativeQueries and defined within an XML mapping file. Once abstracted into the XML mapping file -- platform-specific versions of the XML mapping file can be created and used to keep the core DAO platform neutral.
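For reference, moving a native query into an XML mapping file might look like the following orm.xml fragment. The query name, file location, and SQL here are illustrative -- they are not taken from the class modules -- but the element names follow the standard JPA orm.xml schema.

```xml
<!-- META-INF/orm.xml sketch (illustrative names, not from the lab modules) -->
<entity-mappings xmlns="http://java.sun.com/xml/ns/persistence/orm"
                 version="2.0">
    <named-native-query name="Movie.findByDirectorFirstName"
                        result-class="myorg.queryex.Movie">
        <query><![CDATA[
            select m.* from queryex_movie m
            join queryex_person p on p.id = m.director_id
            where p.first_name = ?1
            order by m.title ASC
        ]]></query>
    </named-native-query>
</entity-mappings>
```

A platform-specific copy of this file can then be selected at build or deploy time while the DAO code continues to reference the query only by name.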
In most of the other chapters we primarily show different ways to use JPA to query the database. In this chapter we will use JPA queries to perform bulk updates to the database. This capability eliminates the need to query for entire entities, change a few properties, and perform a follow-on update -- which would be very inefficient for a large number of entities. To do so we will bypass the persistence context and perform work directly on the database.
It is worth noting that bulk updates bypass the cached entities in the persistence context and will not update their state or trigger defined constructs like cascade delete. You are responsible for either detaching, refreshing, or deleting the impacted entities when performing a bulk update.
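The staleness problem can be pictured with a plain-Java analogy -- two maps standing in for the database table and the persistence context cache. This is only an illustration of the concept; no JPA is involved and the names and values are made up.

```java
import java.util.HashMap;
import java.util.Map;

public class StaleCacheSketch {
    // stand-ins for the database table and the persistence context cache
    static Map<Integer, String> database = new HashMap<>();
    static Map<Integer, String> cache = new HashMap<>();

    public static void main(String[] args) {
        database.put(1, "Ron");
        cache.put(1, database.get(1));        // entity queried into the cache

        database.put(1, "Opie");              // bulk update writes straight to the table

        System.out.println("cache=" + cache.get(1));   // stale -- still the old value

        cache.put(1, database.get(1));        // em.refresh() analogue: re-read the row
        System.out.println("cache=" + cache.get(1));   // now current
    }
}
```

The exercises below show exactly this sequence with a real entity: query, bulk update, observe the stale cached copy, then refresh.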
In this section you will setup a JUnit test case to do work within this chapter.
Create a JUnit test case in src/test called BulkUpdateTest. Have this class extend QueryBase. Create an initial test method to verify the setup/teardown works correctly.
package myorg.queryex;
import static org.junit.Assert.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.junit.Test;
import org.junit.Before;
public class BulkUpdateTest extends QueryBase {
private static final Logger log = LoggerFactory.getLogger(BulkUpdateTest.class);
@Test
public void test(){}
}
Build the module and run the test case.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.BulkUpdateTest ... Tests run: 1, Failures: 0, Errors: 0, Skipped: 0 [INFO] ------------------------------------------------------------------------ [INFO] BUILD SUCCESS
You may remove the sample @Test at this time since we will be adding more tests below.
This test case will be changing the database -- so we need to execute the normal test case cleanup/populate in-between test methods to get back to a known state.
Add the following JUnit @Before method to your test case. This will perform the cleanup/populate methods in between each test method since test methods in this test case change the database. The parent class will take care of cleanup/populate prior to running the next test case.
@Before
public void setUpBulkUpdateTest() {
em.getTransaction().commit();
cleanup(em);
populate(em);
em.getTransaction().begin();
}
In this section we will update the properties of an entity directly in the database. To demonstrate the bulk query bypasses and invalidates the cache, a copy of the entity being changed will be purposely brought into the persistence context and queried at different points during the process.
Add the following test method to your existing test case. The test method starts out by getting a copy of the entity to be updated and placing it in the persistence context cache.
@Test
public void testUpdate() {
log.info("*** testUpdate ***");
log.debug("get a copy of the entity into the persistence context cache for demo");
String oldFirst = "Ron";
String oldLast = "Howard";
Director d = em.createQuery("select d from Director d JOIN FETCH d.person p " +
"where p.firstName=:firstName and p.lastName=:lastName", Director.class)
.setParameter("firstName", oldFirst)
.setParameter("lastName", oldLast)
.getSingleResult();
log.debug("entity in cache=" + d);
}
Run the test method and notice we found the entity we queried for. Since we used a JOIN FETCH -- the default LAZY fetch was ignored and both entities were loaded by the initial query.
$ mvn clean test -Dtest=myorg.queryex.BulkUpdateTest#testUpdate ... -*** testUpdate *** -get a copy of the entity into the persistence context cache for demo Hibernate: select director0_.PERSON_ID as PERSON1_2_0_, person1_.ID as ID0_1_, person1_.BIRTH_DATE as BIRTH2_0_1_, person1_.FIRST_NAME as FIRST3_0_1_, person1_.LAST_NAME as LAST4_0_1_ from QUERYEX_DIRECTOR director0_ inner join QUERYEX_PERSON person1_ on director0_.PERSON_ID=person1_.ID where person1_.FIRST_NAME=? and person1_.LAST_NAME=? limit ? -entity in cache=Ron Howard ... [INFO] BUILD SUCCESS
Add the following lines to the test method. This will perform the actual bulk update. Note the call will return the number of rows updated.
String newFirst = "Opie";
String newLast = "Taylor";
log.debug("performing bulk update");
int changes=em.createQuery("update Person p " +
"set p.firstName=:newFirst, p.lastName=:newLast " +
"where p.firstName=:oldFirst and p.lastName=:oldLast")
.setParameter("newFirst", newFirst)
.setParameter("newLast", newLast)
.setParameter("oldFirst", oldFirst)
.setParameter("oldLast", oldLast)
.executeUpdate();
log.debug("changes=" + changes);
assertEquals("unexpected changes", 1, changes);
Re-run the test method and note the database update command executed and the number of changes returned.
$ mvn clean test -Dtest=myorg.queryex.BulkUpdateTest#testUpdate ... -performing bulk update Hibernate: update QUERYEX_PERSON set FIRST_NAME=?, LAST_NAME=? where FIRST_NAME=? and LAST_NAME=? -changes=1 ... [INFO] BUILD SUCCESS
Add the following lines to the test method to inspect the entity still in the persistence context cache.
log.debug("entity still in cache has old values=" + d);
assertEquals("unexpected cache change", oldFirst, d.getFirstName());
assertEquals("unexpected cache change", oldLast, d.getLastName());
Re-run the test method to show the cache was bypassed and invalidated by the bulk database update.
$ mvn clean test -Dtest=myorg.queryex.BulkUpdateTest#testUpdate ... -entity still in cache has old values=Ron Howard ... [INFO] BUILD SUCCESS
Add the following lines to the test method to refresh the stale entities in the persistence context with changes to the database. We must reference each entity we want refreshed unless the relationship has defined cascade=REFRESH.
log.debug("refreshing cache with changes to database");
em.refresh(d.getPerson());
log.debug("refreshed entity in cache has new values=" + d);
assertEquals("unexpected cache change", newFirst, d.getFirstName());
assertEquals("unexpected cache change", newLast, d.getLastName());
}
Re-run the test method and observe how the entity within the persistence context has been updated with the current state of the database.
$ mvn clean test -Dtest=myorg.queryex.BulkUpdateTest#testUpdate ... -refreshing cache with changes to database Hibernate: select person0_.ID as ID0_0_, person0_.BIRTH_DATE as BIRTH2_0_0_, person0_.FIRST_NAME as FIRST3_0_0_, person0_.LAST_NAME as LAST4_0_0_ from QUERYEX_PERSON person0_ where person0_.ID=? -refreshed entity in cache has new values=Opie Taylor ... [INFO] BUILD SUCCESS
You have finished taking a quick look at performing bulk updates using JPAQL. In the next section we will do much the same thing except use native SQL.
In the previous section we used JPAQL to issue a bulk update to the database. In this section we will assume the SQL task to be performed is above and beyond what we can express in portable JPAQL and must be expressed in native SQL. That may not actually be the case in this example -- but you will get the point that anything goes with the bulk SQL update.
Add the following test method to your existing test case. This first part will query the database using JPAQL to determine which entities will be updated and to again demonstrate the issues with bulk updates and entries in the cache.
@Test
public void testSQLUpdate() {
log.info("*** testSQLUpdate ***");
log.debug("get copies of the entities into the persistence context cache for demo");
String genre="Crime Drama";
@SuppressWarnings("unchecked")
List<Movie> movies = em.createQuery("select m from Movie m JOIN m.genres g " +
"where g = :genre")
.setParameter("genre", genre)
.getResultList();
int genreCount=0;
for (Movie movie : movies) {
log.debug("entity in cache=" + movie + ", genres=" + movie.getGenres());
genreCount += movie.getGenres().contains(genre)?1:0;
}
assertTrue("no movies found for genre", movies.size()>0);
assertTrue("unexpected genre count", genreCount > 0);
}
Run the test method to load the targeted entities into the cache.
$ mvn clean test -Dtest=myorg.queryex.BulkUpdateTest#testSQLUpdate ... -*** testSQLUpdate *** -get copies of the entities into the persistence context cache for demo Hibernate: select movie0_.ID as ID3_, movie0_.DIRECTOR_ID as DIRECTOR6_3_, movie0_.MINUTES as MINUTES3_, movie0_.RATING as RATING3_, movie0_.RELEASE_DATE as RELEASE4_3_, movie0_.TITLE as TITLE3_ from QUERYEX_MOVIE movie0_ inner join QUERYEX_MOVIEGENRE genres1_ on movie0_.ID=genres1_.MOVIE_ID where genres1_.GENRE=? -entity in cache=Sleepers (1996), genres=[Buddy Film, Courtroom Drama, Crime, Crime Drama, Drama, Reunion Films] ... [INFO] BUILD SUCCESS
Add the following lines to the test method to perform the bulk update. Notice that we are using native table and column names in this update command.
log.debug("performing bulk update to remove genre=" + genre);
int changes=em.createNativeQuery("delete from QUERYEX_MOVIEGENRE g " +
"where g.genre=?1")
.setParameter(1, genre)
.executeUpdate();
log.debug("changes=" + changes);
assertEquals("unexpected changes", 1, changes);
Re-run the test method and observe the update issued to the database and the number of changed rows that are returned.
$ mvn clean test -Dtest=myorg.queryex.BulkUpdateTest#testSQLUpdate ... -performing bulk update to remove genre=Crime Drama Hibernate: delete from QUERYEX_MOVIEGENRE g where g.genre=? -changes=1 ... [INFO] BUILD SUCCESS
Add the following lines to your test method to inspect the entity still in the persistence context.
for (Movie movie : movies) {
log.debug("entity still in cache=" + movie + ", genres=" + movie.getGenres());
}
Re-run the test method to show the cached entity is still in its original fetched state.
$ mvn clean test -Dtest=myorg.queryex.BulkUpdateTest#testSQLUpdate ... -entity still in cache=Sleepers (1996), genres=[Buddy Film, Courtroom Drama, Crime, Crime Drama, Drama, Reunion Films] ... [INFO] BUILD SUCCESS
Add the following lines to the test method to refresh the stale entities.
log.debug("refreshing cached objects");
genreCount=0;
for (Movie movie : movies) {
em.refresh(movie);
log.debug("entity in cache=" + movie + ", genres=" + movie.getGenres());
genreCount += movie.getGenres().contains(genre)?1:0;
}
assertEquals("unexpected cache change", 0, genreCount);
Re-run the test method and observe how the REFRESH from the parent object is cascaded to the child elements because of a built-in rule for @ElementCollections to cascade all commands from parent to child.
$ mvn clean test -Dtest=myorg.queryex.BulkUpdateTest#testSQLUpdate ... -refreshing cached objects Hibernate: select movie0_.ID as ID3_0_, movie0_.DIRECTOR_ID as DIRECTOR6_3_0_, movie0_.MINUTES as MINUTES3_0_, movie0_.RATING as RATING3_0_, movie0_.RELEASE_DATE as RELEASE4_3_0_, movie0_.TITLE as TITLE3_0_ from QUERYEX_MOVIE movie0_ where movie0_.ID=? Hibernate: select genres0_.MOVIE_ID as MOVIE1_3_0_, genres0_.GENRE as GENRE0_ from QUERYEX_MOVIEGENRE genres0_ where genres0_.MOVIE_ID=? -entity in cache=Sleepers (1996), genres=[Reunion Films, Crime, Courtroom Drama, Drama, Buddy Film] ... [INFO] BUILD SUCCESS
You have completed an initial look at performing bulk database updates using SQL queries. This is very similar to the JPAQL technique -- except there is no constraint on how to form the database queries. One use I specifically have found for using native SQL bulk updates is to execute database SQL scripts created by SQL schema generation tools.
In this chapter we provided two basic examples of bulk database updates. This is a very efficient way to make changes to the database since it bypasses the entity model and associated business logic and works directly with the data in the database. The capability does come at a price. As the exercises showed, the entities within the cache are bypassed and made stale by the direct interaction with the database. This can make it very difficult to work with both persistence contexts and bulk updates at the same time. Bulk updates should be limited to their own transaction or the beginning of transactions using hybrid techniques.
In this chapter we will take a brief look at how you can incorporate database locks into your queries to help address race conditions. We will not be covering JPA locking in detail here. We will be limiting our coverage to how to integrate queries with JPA locks.
In this section you will setup a JUnit test case to do work within this chapter.
Create a JUnit test case in src/test called QueryLocksTest. Have this class extend QueryBase. Create an initial test method to verify the setup/teardown works correctly.
package myorg.queryex;
import static org.junit.Assert.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.junit.Test;
import org.junit.Before;
public class QueryLocksTest extends QueryBase {
private static final Logger log = LoggerFactory.getLogger(QueryLocksTest.class);
@Test
public void test(){}
}
Build the module and run the test case.
$ mvn clean test -P\!h2db -Ph2srv -Dtest=myorg.queryex.QueryLocksTest ... Tests run: 1, Failures: 0, Errors: 0, Skipped: 0 [INFO] ------------------------------------------------------------------------ [INFO] BUILD SUCCESS
You may remove the sample @Test at this time since we will be adding more tests below.
For this topic -- we will be looking at the case of a database writer that needs to either update or insert depending on the results of a query. To make this realistic, we will run the same code in multiple threads and purposely form a race condition where we hope to leverage locks to provide us one INSERT and multiple UPDATEs.
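Before wiring this into JPA, the shape of the problem can be sketched with an in-memory analogy. Here ConcurrentHashMap.putIfAbsent stands in for a locked query-then-act against the database: the atomic check guarantees exactly one "INSERT" no matter how the threads interleave. The names and counts are illustrative only -- the real exercise below must obtain the same guarantee from database locks.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.atomic.AtomicInteger;

public class UpsertRaceSketch {

    // run `count` racing writers; return {inserts, updates}
    public static int[] race(int count) throws InterruptedException {
        ConcurrentMap<String, String> table = new ConcurrentHashMap<>();
        AtomicInteger inserts = new AtomicInteger();
        AtomicInteger updates = new AtomicInteger();

        List<Thread> writers = new ArrayList<>();
        for (int i = 0; i < count; i++) {
            final String birthDate = "1969-05-" + (25 + i);
            writers.add(new Thread(() -> {
                // atomic check-then-act: only one thread sees "row missing"
                String prev = table.putIfAbsent("Anne Heche", birthDate);
                if (prev == null) {
                    inserts.incrementAndGet();          // this writer INSERTed
                } else {
                    table.put("Anne Heche", birthDate); // everyone else UPDATEs
                    updates.incrementAndGet();
                }
            }));
        }
        for (Thread t : writers) t.start();
        for (Thread t : writers) t.join();
        return new int[]{inserts.get(), updates.get()};
    }

    public static void main(String[] args) throws InterruptedException {
        int[] result = race(5);
        System.out.println("inserts=" + result[0] + " updates=" + result[1]);
    }
}
```

Without the atomic check-then-act, several threads could each observe "no row yet" and all attempt an INSERT -- which is precisely the race the database lock modes in this chapter are meant to prevent.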
Add the following JUnit @Before method to your test case. This will perform the cleanup/populate in between each test method since test methods in this test case change the database. The parent class will take care of cleanup/populate prior to running the next test case.
@Before
public void setUpLocksTest() {
em.getTransaction().commit();
cleanup(em);
populate(em);
}
Add the following enum to your test case. This will be used by the Writer objects to tell the main loop what they did individually to the database.
public static enum Action { INSERT, UPDATE, FAIL };
Add the following Writer class. This class will be used to perform a transaction with the database within its own Java Thread. The transaction is started during the constructor and finished during the run method.
private class Writer extends Thread {
private String context;
private Actor actor;
private LockModeType lockMode;
private EntityManager em_;
private Action action;
private int sleepTime=100;
private String errorText;
public Writer(String context, Actor actor, LockModeType lockMode) {
this.context = context;
this.actor = actor;
this.lockMode = lockMode;
em_ = emf.createEntityManager();
em_.getTransaction().begin();
log.debug(context + " transaction started");
}
public boolean isDone() { return action != null && em_==null; }
public String getContext() { return context; }
public Action getAction() { return action; }
public String getErrorText() { return errorText; }
public void run() {
//...
}
};
Implement the run() method for the Writer class. The method will search for the entity in the database and either create or update it depending on the result of the query.
public void run() {
try {
log.debug(context + " selecting with lockMode=" + lockMode);
List<Actor> actors = em_.createQuery(
"select a from Actor a JOIN a.person as p " +
"where p.firstName=:firstName and p.lastName=:lastName " +
"or p.firstName='" + context + "'", Actor.class)
.setLockMode(lockMode)
.setParameter("firstName", actor.getFirstName())
.setParameter("lastName", actor.getLastName())
.setMaxResults(1)
.getResultList();
Notice...
We are passing in a lockMode property into the query above.
We are performing an INSERT if nothing is returned -- else UPDATE
try {
log.debug(context + " sleeping " + sleepTime + " msecs");
Thread.sleep(sleepTime);
} catch (Exception ex){}
if (actors.size()==0) {
log.debug(context + " creating entity");
em_.persist(actor);
action=Action.INSERT;
} else {
log.debug(context + " updating entity");
actors.get(0).setBirthDate(actor.getBirthDate());
action=Action.UPDATE;
}
We finish up the method with a commit/rollback of the transaction and general accounting so the main loop will know what this instance did.
em_.flush();
log.debug(context + " committing transaction version=" + actor.getVersion());
em_.getTransaction().commit();
log.debug(context + " committed transaction version=" + actor.getVersion());
} catch (PersistenceException ex) {
log.debug(context + " failed " + ex);
em_.getTransaction().rollback();
action = Action.FAIL; errorText = ex.toString();
} finally {
em_.close(); em_=null;
}
}
Add a helper method to setup and execute each test. This helper will accept a LockModeType and count of threads to execute. The helper method will supply each thread with an instance to either INSERT or UPDATE. The primary key is unique -- so the thread will use a query based on the properties of the object.
protected int testUpsert(LockModeType lockMode, int count) {
List<Writer> writers = new ArrayList<QueryLocksTest.Writer>();
//create writer instances within their own thread
for (int i=0; i<count; i++) {
Date birthDate = new GregorianCalendar(1969+i, Calendar.MAY, 25).getTime();
Actor actor = new Actor(new Person("test-actor" + i)
.setFirstName("Anne")
.setLastName("Heche")
.setBirthDate(birthDate));
writers.add(new Writer("writer" + i, actor, lockMode));
}
//...
}
This portion of the helper method will cause the following output from each thread.
-writer0 transaction started
Add the following lines to the helper method to start each thread.
//start each of the threads
List<Writer> working = new ArrayList<Writer>();
for (Writer writer : writers) {
working.add(writer); writer.start();
}
This will produce the following output out of each thread. However, the queries will differ slightly depending on the LockModeType used.
-writer0 selecting with lockMode=NONE
Hibernate: select actor0_.PERSON_ID as PERSON1_1_, actor0_.version as version1_ from QUERYEX_ACTOR actor0_ inner join QUERYEX_PERSON person1_ on actor0_.PERSON_ID=person1_.ID where person1_.FIRST_NAME=? and person1_.LAST_NAME=? or person1_.FIRST_NAME='writer0' limit ?
-writer0 sleeping 100 msecs
The above code will also cause the following to be produced when no match is found by the query.
-writer0 creating entity
Hibernate: insert into QUERYEX_PERSON (BIRTH_DATE, FIRST_NAME, LAST_NAME, ID) values (?, ?, ?, ?)
Hibernate: insert into QUERYEX_ACTOR (version, PERSON_ID) values (?, ?)
The threads have been designed to delay processing between the select and the INSERT/UPDATE to simulate additional work and to cause neighboring threads to wait (if configured to do so).
-writer0 committing transaction version=0
-writer0 committed transaction version=0
Add the following lines to the helper method to wait for the threads to complete.
//run until all writers complete
while (!working.isEmpty()) {
try { Thread.sleep(100); } catch (Exception ex) {}
Iterator<Writer> itr = working.iterator();
while (itr.hasNext()) {
if (itr.next().isDone()) { itr.remove(); }
}
}
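The polling loop above works, but the same wait can also be expressed with Thread.join(), which blocks the caller until a thread terminates. The following self-contained sketch (an illustration only, not part of the exercise code) shows the technique:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

public class JoinExample {
    // starts count worker threads and waits for all of them using join()
    static int runWorkers(int count) throws InterruptedException {
        AtomicInteger completed = new AtomicInteger();
        List<Thread> working = new ArrayList<>();
        for (int i = 0; i < count; i++) {
            Thread t = new Thread(completed::incrementAndGet);
            working.add(t);
            t.start();
        }
        for (Thread t : working) {
            t.join(); // blocks until this thread terminates -- no sleep/poll needed
        }
        return completed.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("completed=" + runWorkers(5));
    }
}
```

The polling approach used in the exercise has the advantage of letting the main loop inspect each Writer's state while waiting, which a bare join() does not provide.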
Add the following lines to the helper method to query for the results and log them. Notice the use of a JOIN FETCH in the query to assure the query performs an EAGER fetch of Person in the same query as the Actor.
//get the resultant entries in database
List<Actor> actors = em.createQuery(
"select a from Actor a JOIN FETCH a.person as p " +
"where p.firstName=:firstName and p.lastName=:lastName", Actor.class)
.setParameter("firstName", "Anne")
.setParameter("lastName", "Heche")
.getResultList();
log.debug("actors=" + actors);
for (Writer w : writers) {
log.debug(String.format("%s => %s %s", w.getContext(), w.getAction(), w.getErrorText()==null?"":w.getErrorText()));
}
This will produce the following output during the test method.
Hibernate: select actor0_.PERSON_ID as PERSON1_1_0_, person1_.ID as ID0_1_, actor0_.version as version1_0_, person1_.BIRTH_DATE as BIRTH2_0_1_, person1_.FIRST_NAME as FIRST3_0_1_, person1_.LAST_NAME as LAST4_0_1_ from QUERYEX_ACTOR actor0_ inner join QUERYEX_PERSON person1_ on actor0_.PERSON_ID=person1_.ID where person1_.FIRST_NAME=? and person1_.LAST_NAME=?
-actors=[Anne Heche, version=0]
-writer0 => INSERT
Notice there is an audit of what each of the threads performed (INSERT or UPDATE) at the end of the above output.
Add the following line to the helper method to return the number of rows found in the database.
return actors.size();
Add the following test method to verify the code added above and the general working of the test case.
@Test
public void testSimple() {
log.info("*** testPersistentSimple ***");
assertEquals("unexpected number of actors", 1, testUpsert(LockModeType.NONE, 1));
}
Run the simple test method to verify the functionality of the test case. Since we use a single thread and no locking, the output is pretty straightforward.
$ mvn clean test -Dtest=myorg.queryex.QueryLocksTest#testSimple
...
-*** testPersistentSimple ***
-writer0 transaction started
-writer0 selecting with lockMode=NONE
Hibernate: select actor0_.PERSON_ID as PERSON1_1_, actor0_.version as version1_ from QUERYEX_ACTOR actor0_ inner join QUERYEX_PERSON person1_ on actor0_.PERSON_ID=person1_.ID where person1_.FIRST_NAME=? and person1_.LAST_NAME=? or person1_.FIRST_NAME='writer0' limit ?
-writer0 creating entity
Hibernate: insert into QUERYEX_PERSON (BIRTH_DATE, FIRST_NAME, LAST_NAME, ID) values (?, ?, ?, ?)
Hibernate: insert into QUERYEX_ACTOR (version, PERSON_ID) values (?, ?)
-writer0 sleeping 100 msecs
-writer0 committing transaction version=0
-writer0 committed transaction version=0
Hibernate: select actor0_.PERSON_ID as PERSON1_1_0_, person1_.ID as ID0_1_, actor0_.version as version1_0_, person1_.BIRTH_DATE as BIRTH2_0_1_, person1_.FIRST_NAME as FIRST3_0_1_, person1_.LAST_NAME as LAST4_0_1_ from QUERYEX_ACTOR actor0_ inner join QUERYEX_PERSON person1_ on actor0_.PERSON_ID=person1_.ID where person1_.FIRST_NAME=? and person1_.LAST_NAME=?
-actors=[Anne Heche, version=0]
-writer0 => INSERT
...
[INFO] BUILD SUCCESS
In this first section we will demonstrate the problem of implementing INSERT/UPDATE without the ability to form a database lock within the initial query. Without a lock -- the results of the initial query become invalidated by the time this transaction completes and we do not get the desired results.
Add the following test method to your existing test case. This will add additional threads to the simple test case run earlier.
@Test
public void testNONE() {
log.info("*** testNONE ***");
int count=testUpsert(LockModeType.NONE, 5);
for (int i=0; i<10 && count<=1; i++) {
//can't always trigger race condition -- so retry
cleanup(em);
populate(em);
count=testUpsert(LockModeType.NONE, 5);
}
assertTrue("unexpected number of actors", count > 1);
}
Due to the unpredictability of the race condition -- we may have to run the test more than once.
Run the new test method and observe the final results. Although each execution will differ slightly, the issue is that more than one thread creates an instance of the entity. You will find all INSERTs got their query results first and all UPDATEs got their results last.
$ mvn clean test -Dtest=myorg.queryex.QueryLocksTest#testNONE
...
-writer0 => INSERT
-writer1 => UPDATE
-writer2 => UPDATE
-writer3 => UPDATE
-writer4 => INSERT
...
[INFO] BUILD SUCCESS
In this section we will drop back to a single thread but add LockMode to the query so the difference can be easily seen.
Add the following test method to your existing test case.
@Test
public void testPessimisticWrite1() {
log.info("*** testPersistentWrite1 ***");
assertEquals("unexpected number of actors", 1, testUpsert(LockModeType.PESSIMISTIC_WRITE, 1));
}
Run the new test method and note the difference in the query output. The provider has added a "for update" at the end of the query. This will form a lock in the database. For the H2 database we are using -- this is a table lock.
$ mvn clean test -Dtest=myorg.queryex.QueryLocksTest#testPessimisticWrite1
...
-*** testPersistentWrite1 ***
-writer0 transaction started
-writer0 selecting with lockMode=PESSIMISTIC_WRITE
Hibernate: select actor0_.PERSON_ID as PERSON1_1_, actor0_.version as version1_ from QUERYEX_ACTOR actor0_ inner join QUERYEX_PERSON person1_ on actor0_.PERSON_ID=person1_.ID where person1_.FIRST_NAME=? and person1_.LAST_NAME=? or person1_.FIRST_NAME='writer0' limit ? for update
...
-writer0 => INSERT
...
[INFO] BUILD SUCCESS
In this section we will add several threads -- all using PESSIMISTIC_WRITE locks.
Add the following test method to your existing test case. This is the same LockMode as before except we have added additional threads.
@Test
public void testPessimisticWrite() {
log.info("*** testPersistentWrite ***");
assertEquals("unexpected number of actors", 1, testUpsert(LockModeType.PESSIMISTIC_WRITE, 5));
}
Run the new test method and notice we get a single INSERT and multiple UPDATEs every time the test is run. That is because all subsequent selects are blocked until the first thread commit()s its transaction.
$ mvn clean test -Dtest=myorg.queryex.QueryLocksTest#testPessimisticWrite
...
-writer0 => INSERT
-writer1 => UPDATE
-writer2 => UPDATE
-writer3 => UPDATE
-writer4 => UPDATE
...
[INFO] BUILD SUCCESS
This chapter exposed at least one problem that can be corrected using pessimistic locking and lockMode within the JPA query. The details of locking are outside the scope of this chapter and are also subject to the capability of the database and the connection modes selected. We want to leave you with just a taste of what you can do and how to express that within a JPA query.
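As a closing illustration, the serialization that PESSIMISTIC_WRITE provided can be mimicked outside of JPA with a plain java.util.concurrent lock. The sketch below is only an analogy (in the exercise the database, not the JVM, holds the lock), but it shows why a lock around the check-then-act sequence yields exactly one INSERT no matter how many threads race:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.locks.ReentrantLock;

public class UpsertSketch {
    private final Map<String, String> table = new HashMap<>();
    private final ReentrantLock lock = new ReentrantLock(); // stands in for the DB lock
    private int inserts, updates;

    // check-then-act guarded by the lock, like a query issued with PESSIMISTIC_WRITE
    void upsert(String key, String value) {
        lock.lock();
        try {
            if (!table.containsKey(key)) {
                table.put(key, value);
                inserts++;   // only the first thread takes this branch
            } else {
                table.put(key, value);
                updates++;   // every later thread sees the row and updates
            }
        } finally {
            lock.unlock();
        }
    }

    // runs the upsert from several concurrent threads; returns {inserts, updates}
    int[] run(int threads) throws InterruptedException {
        Thread[] workers = new Thread[threads];
        for (int i = 0; i < threads; i++) {
            final String v = "writer" + i;
            workers[i] = new Thread(() -> upsert("Anne Heche", v));
            workers[i].start();
        }
        for (Thread t : workers) t.join();
        return new int[]{inserts, updates};
    }

    public static void main(String[] args) throws InterruptedException {
        int[] counts = new UpsertSketch().run(5);
        System.out.println("inserts=" + counts[0] + " updates=" + counts[1]);
    }
}
```

Removing the lock()/unlock() calls reintroduces the race demonstrated by testNONE, where several threads can pass the containsKey check before any of them writes.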
Table of Contents
Create and deploy a basic EJB (without resource requirements or other dependencies)
Show different deployment options
Show how to access the EJB
Create an integration test for the EJB
Enhance development skills
At the completion of this topic, the student shall
be able to:
Create an EJB module
Create a simple @Stateless EJB in the EJB module
Add required and key aspects to the EJB
Deploy the EJB to the server using an EAR
Deploy the EJB to the server using a WAR
Integration test the EJB deployed to the server
Create deterministic JNDI names
Debug remote EJB code running on the server
Each component of the Java EE application will be developed as a separate Maven module. Each module will be placed in a flat structure under a common parent project. This parent project will be used to coordinate build goals across the entire application. The order in which it builds modules can be influenced by the configuration you supply; however, Maven will analyze dependencies at build time and either honor the actual dependency ordering or fail if you have expressed a circular dependency.
Create a root directory for the exercise. This will host the root project and sub-projects for the Impl, EJB, WAR, EAR, and RMI Test modules.
$ mkdir ejb-basichotel
Create a root project pom.xml file. This project will not have an associated artifact and is termed a "parent" (simple term) or "reactor" (techy term) project. It uses a special packaging type called "pom".
This project will also be used as the parent of all implementation modules so we can define re-usable definitions. Note that the definitions within this project are passive. They will not actively add any dependencies or plugins to inheriting child modules unless the child specifically references the defined artifact. Details for a potentially used artifact can be placed here (and consistently reused) and then briefly referenced in the child poms, or inherently referenced through the child module's packaging type (e.g., packaging=jar projects automatically bring in the maven-compiler-plugin).
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>myorg.basicejb</groupId>
<artifactId>basicejbEx</artifactId>
<packaging>pom</packaging>
<name>Basic EJB Exercise</name>
<version>1.0-SNAPSHOT</version>
<description>
This project is the root project for the example Java EE
Application.
</description>
<modules>
</modules>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<repositories>
</repositories>
<pluginRepositories>
</pluginRepositories>
<dependencyManagement>
<dependencies>
</dependencies>
</dependencyManagement>
<build>
<pluginManagement>
<plugins>
</plugins>
</pluginManagement>
</build>
<profiles>
</profiles>
</project>
It is very important that the packaging type be "pom" in this case. If you leave out this specification, Maven will default to packaging=jar, attempt to build a Java-based artifact, and ignore this project's responsibility as the root project to delegate to the child projects that build artifacts.
Add in the maven-compiler-plugin specification to the parent pom to make sure JDK 1.8 or above is used and that we use an up-to-date version of the plugin.
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<java.source.version>1.8</java.source.version>
<java.target.version>1.8</java.target.version>
<maven-compiler-plugin.version>3.1</maven-compiler-plugin.version>
</properties>
...
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>${maven-compiler-plugin.version}</version>
<configuration>
<source>${java.source.version}</source>
<target>${java.target.version}</target>
</configuration>
</plugin>
Test your root project by building it at this time.
$ mvn clean install
...
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Basic EJB Exercise 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ basicejbEx ---
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ basicejbEx ---
[INFO] Installing /home/jcstaff/proj/basicejbEx/pom.xml to /home/jcstaff/.m2/repository2/myorg/basicejb/basicejbEx/1.0-SNAPSHOT/basicejbEx-1.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 0.682 s
[INFO] Finished at: 2014-10-04T17:45:26-04:00
[INFO] Final Memory: 7M/105M
[INFO] ------------------------------------------------------------------------
The EJB module will be used to develop one of the EJB components. The term "EJB" gets a little overloaded at times. There are EJB classes, EJB components, and an EJB tier. EJB classes break out into the business-remote (aka @Remote) and business-local (aka @Local) interfaces, the EJB implementation class (either @Stateless, @Stateful, @Singleton, or @MessageDriven), and support classes. It is common to think of each cohesive pairing of @Remote, @Local, and implementation class as "an EJB". You can have many EJBs (the sets of classes) within an EJB component. An EJB component is materialized as a .jar and there can be many EJB components within your EJB tier. For this exercise we will have only one EJB component and start out with only one EJB. A second EJB will be added to the single EJB component in a later exercise to help support testing. A single Maven project can build a single EJB component.
Add an EJB module directory to your project tree.
$ mkdir basicejb-ejb
Add the outer shell of the EJB module's pom.xml.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<parent>
<artifactId>basicejbEx</artifactId>
<groupId>myorg.basicejb</groupId>
<version>1.0-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>
<artifactId>basicejb-ejb</artifactId>
<packaging>ejb</packaging>
<name>Basic EJB Exercise::EJB</name>
<description>
This project provides example usages of an EJB tier.
</description>
<dependencies>
</dependencies>
<build>
<plugins>
</plugins>
</build>
</project>
It is important to note that the packaging type is "ejb" in this case. If you leave out the packaging type, Maven will default to "jar" and not handle your module appropriately within the context of a Java EE application.
Add in the maven-ejb-plugin definition to the *parent* pom.xml with all properties and constructs that would be common across all EJB modules.
Tell maven explicitly to use EJB 3.x. Anything 3.0 and above should be fine. The plugin defaults to EJB 2.1 which requires a descriptor in META-INF/ejb-jar.xml. We won't be using a descriptor until later.
Tell maven to add all dependencies that have scope=compile to the Class-Path in the META-INF/MANIFEST.MF. This is a pure Java construct that JavaEE takes advantage of in order to resolve dependencies on the server.
# basicejbEx/pom.xml
<properties>
...
<maven-ejb-plugin.version>3.0.1</maven-ejb-plugin.version>
</properties>
<pluginManagement>
...
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-ejb-plugin</artifactId>
<version>${maven-ejb-plugin.version}</version>
<configuration>
<ejbVersion>3.2</ejbVersion>
<archive>
<manifest>
<addClasspath>true</addClasspath>
</manifest>
</archive>
</configuration>
</plugin>
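With addClasspath enabled, each scope=compile dependency of the module is listed in the generated META-INF/MANIFEST.MF. As a hypothetical illustration (the actual jar names depend on the dependencies you declare), the manifest entry takes the form:

```
Manifest-Version: 1.0
Class-Path: some-api-1.0.jar some-impl-1.0.jar
```

This is a standard Java jar construct that the application server uses to resolve the listed jars relative to the EJB jar within the deployed archive.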
Add in the maven-ejb-plugin declaration to the EJB/pom.xml. This will contain portions of the plugin definition that could be unique per EJB module.
Tell the plugin to create an ejb-client.jar file for remote clients. This will be populated using a set of include and exclude paths. In this case, we are telling it not to include any of our deployment descriptors in the META-INF directory as well as leaving out our EJB implementation class. This will produce an extra jar in the target directory called ${project.artifactId}-${project.version}-client.jar, which can be brought in with a dependency on the EJB module using a type element set to "ejb-client". We will do this in the RMI Test module.
<!-- tell the EJB plugin to build a client-jar -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-ejb-plugin</artifactId>
<configuration>
<generateClient>true</generateClient>
<clientExcludes>
<clientExclude>**/META-INF/*.xml</clientExclude>
<clientExclude>**/ejb/*EJB.class</clientExclude>
</clientExcludes>
</configuration>
</plugin>
</plugins>
</build>
Since the packaging type is "ejb" for this module, the maven-ejb-plugin is brought in (according to our pluginManagement definition) automatically. We are just extending the definition to include the ejb-client. If the plugin was not automatically brought in because of the packaging type -- the above specification in the build.plugins section of the EJB/pom.xml would have been enough to activate the plugin.
Add several dependencies to the EJB/pom.xml to account for the use of logging, annotations, and EJB constructs. Since we will be instantiating this code only on the server and not in a 2-tier approach, the pure JavaEE API from Sun/Oracle will do fine. Otherwise we should use the dependencies from Hibernate/JBoss.
# basicejbEx/basicejb-ejb/pom.xml
<dependencies>
<!-- core implementation dependencies -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>javax.ejb</groupId>
<artifactId>javax.ejb-api</artifactId>
<scope>provided</scope>
</dependency>
<!-- test dependencies -->
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
You should always declare a scope=provided dependency on the JavaEE API artifacts so that downstream clients of the module are free to supply their own version/provider of the API.
Attempt to validate the EJB module at the child level without further specification. This will fail because we are missing version information for the dependency artifacts we just added.
$ mvn validate
[ERROR] The project myorg.basicejb:basicejb-ejb:1.0-SNAPSHOT (/home/jcstaff/proj/basicejbEx/basicejb-ejb/pom.xml) has 5 errors
[ERROR] 'dependencies.dependency.version' for org.slf4j:slf4j-api:jar is missing. @ myorg.basicejb:basicejb-ejb:[unknown-version], .../basicejbEx/basicejb-ejb/pom.xml, line 22, column 21
[ERROR] 'dependencies.dependency.version' for javax.ejb:javax.ejb-api:jar is missing. @ myorg.basicejb:basicejb-ejb:[unknown-version], .../basicejbEx/basicejb-ejb/pom.xml, line 27, column 21
[ERROR] 'dependencies.dependency.version' for junit:junit:jar is missing. @ myorg.basicejb:basicejb-ejb:[unknown-version], .../basicejbEx/basicejb-ejb/pom.xml, line 34, column 21
[ERROR] 'dependencies.dependency.version' for org.slf4j:slf4j-log4j12:jar is missing. @ myorg.basicejb:basicejb-ejb:[unknown-version], .../basicejbEx/basicejb-ejb/pom.xml, line 39, column 21
[ERROR] 'dependencies.dependency.version' for log4j:log4j:jar is missing. @ myorg.basicejb:basicejb-ejb:[unknown-version], .../basicejbEx/basicejb-ejb/pom.xml, line 44, column 21
Update the parent pom.xml with the version specifications for these artifacts.
# basicejbEx/pom.xml
<properties>
...
<javax.ejb-api.version>3.2.2</javax.ejb-api.version>
<junit.version>4.12</junit.version>
<slf4j.version>1.7.25</slf4j.version>
<log4j.version>1.2.17</log4j.version>
...
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>javax.ejb</groupId>
<artifactId>javax.ejb-api</artifactId>
<version>${javax.ejb-api.version}</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
</dependency>
</dependencies>
</dependencyManagement>
Validate that maven can now resolve the new dependencies.
$ mvn validate
[INFO] BUILD SUCCESS
We are using only the maven validate phase at this point because we only want to validate whether maven can resolve all artifacts and not fully build an EJB component. The build will fail if you use a more advanced maven phase prior to adding the EJB source code in the following steps.
If you have not yet done so, you can now import your multi-module project into the IDE as an existing set of Maven projects. Since we have not yet linked the two -- you will need to import them in separate steps. Although the instructions are written using file system commands it is assumed that you can translate them into IDE actions and work at the level you desire.
Create the src tree for the EJB.
$ mkdir -p basicejb-ejb/src/main/java/org/myorg/basicejb/ejb
Add @Remote and @Local interfaces for the EJB. Place a simple ping() method in the @Remote interface for use as an end-to-end sanity check at the end of this exercise.
$ cat basicejb-ejb/src/main/java/org/myorg/basicejb/ejb/ReservationRemote.java
package org.myorg.basicejb.ejb;
import javax.ejb.Remote;
@Remote
public interface ReservationRemote {
void ping();
}
$ cat basicejb-ejb/src/main/java/org/myorg/basicejb/ejb/ReservationLocal.java
package org.myorg.basicejb.ejb;
import javax.ejb.Local;
@Local
public interface ReservationLocal {
}
Add a @Stateless EJB that will implement the provided @Remote and @Local interfaces. Implement @PostConstruct and @PreDestroy callbacks to intercept EJB lifecycle events. Add a logger and log statements to the methods so we can observe activity within the EJB during the test at the end of this exercise.
$ cat basicejb-ejb/src/main/java/org/myorg/basicejb/ejb/ReservationEJB.java
package org.myorg.basicejb.ejb;
import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
import javax.ejb.Stateless;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@Stateless
public class ReservationEJB implements ReservationLocal, ReservationRemote {
private static Logger logger = LoggerFactory.getLogger(ReservationEJB.class);
@PostConstruct
public void init() {
logger.debug("*** ReservationEJB.init() ***");
}
@PreDestroy
public void destroy() {
logger.debug("*** ReservationEJB.destroy() ***");
}
@Override
public void ping() {
logger.debug("ping called");
}
}
The EJB can be built at this time. You will notice the following in the output.
Our 3 Java files (@Remote, @Local, and @Stateless) were compiled
EJB 3 processing was performed on the target. This largely consisted only of MANIFEST Class-Path processing and the construction of an ejb-client.jar file at this point.
$ mvn package
[INFO] ------------------------------------------------------------------------
[INFO] Building Basic EJB Exercise::EJB 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
...
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ basicejb-ejb ---
...
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ basicejb-ejb ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 3 source files to /home/jcstaff/proj/basicejbEx/basicejb-ejb/target/classes
...
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ basicejb-ejb ---
...
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ basicejb-ejb ---
...
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ basicejb-ejb ---
...
[INFO] --- maven-ejb-plugin:2.4:ejb (default-ejb) @ basicejb-ejb ---
[INFO] Building EJB basicejb-ejb-1.0-SNAPSHOT with EJB version 3.2
[INFO] Building jar: .../basicejbEx/basicejb-ejb/target/basicejb-ejb-1.0-SNAPSHOT.jar
[INFO] Building EJB client basicejb-ejb-1.0-SNAPSHOT-client
[INFO] Building jar: .../basicejbEx/basicejb-ejb/target/basicejb-ejb-1.0-SNAPSHOT-client.jar
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
Module structure looks very similar to a POJO/JAR module
.
|-- basicejb-ejb
|   |-- pom.xml
|   |-- src
|   |   `-- main
|   |       `-- java
|   |           `-- org
|   |               `-- myorg
|   |                   `-- basicejb
|   |                       `-- ejb
|   |                           |-- ReservationEJB.java
|   |                           |-- ReservationLocal.java
|   |                           `-- ReservationRemote.java
|   `-- target
|       |-- basicejb-ejb-1.0-SNAPSHOT-client.jar
|       |-- basicejb-ejb-1.0-SNAPSHOT.jar
...
`-- pom.xml
The EJB.jar contains the full set of EJB classes/interfaces
$ jar tf target/basicejb-ejb-1.0-SNAPSHOT.jar
...
org/myorg/basicejb/ejb/ReservationLocal.class
org/myorg/basicejb/ejb/ReservationRemote.class
org/myorg/basicejb/ejb/ReservationEJB.class
...
The EJB-client.jar contains classes/interfaces required by remote clients
$ jar tf target/basicejb-ejb-1.0-SNAPSHOT-client.jar
...
org/myorg/basicejb/ejb/ReservationLocal.class
org/myorg/basicejb/ejb/ReservationRemote.class
...
Technically, Local interfaces should not be needed by remote clients. However, there was a time when remote clients of JBoss Stateful Session EJBs required access to all interface types, and old assembly habits die hard when it does not hurt to include an extra interface that is not needed.
Add the EJB module to the root pom.xml.
<modules>
<module>basicejb-ejb</module>
</modules>
Retest the build from the root.
[INFO] Scanning for projects...
...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Basic EJB Exercise ................................. SUCCESS [ 0.503 s]
[INFO] Basic EJB Exercise::EJB ............................ SUCCESS [ 2.457 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
Add a unit test to the EJB module to show that you can unit test functionality that does not require the container. This class will go into the src/test tree and will not be a part of the EJB.jar.
$ mkdir -p basicejb-ejb/src/test/java/org/myorg/basicejb/ejb
$ cat basicejb-ejb/src/test/java/org/myorg/basicejb/ejb/ReservationTest.java
package org.myorg.basicejb.ejb;
import org.junit.Before;
import org.junit.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class ReservationTest {
private static final Logger logger = LoggerFactory.getLogger(ReservationTest.class);
ReservationRemote reservatist;
@Before
public void setUp() {
reservatist=new ReservationEJB();
}
@Test
public void testPing() {
logger.info("*** testPing ***");
reservatist.ping();
}
}
Add a log4j.xml file to support logging unit test functionality that does not require the container.
$ mkdir -p basicejb-ejb/src/test/resources
$ cat basicejb-ejb/src/test/resources/log4j.xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration PUBLIC
"-//APACHE//DTD LOG4J 1.2//EN" "http://logging.apache.org/log4j/1.2/apidocs/org/apache/log4j/xml/doc-files/log4j.dtd">
<log4j:configuration
xmlns:log4j="http://jakarta.apache.org/log4j/"
debug="false">
<appender name="CONSOLE" class="org.apache.log4j.ConsoleAppender">
<param name="Target" value="System.out"/>
<layout class="org.apache.log4j.PatternLayout">
<param name="ConversionPattern" value="%d{HH:mm:ss,SSS} %-5p (%F:%L) -%m%n"/>
</layout>
</appender>
<appender name="logfile" class="org.apache.log4j.RollingFileAppender">
<param name="File" value="target/log4j-out.txt"/>
<param name="Append" value="false"/>
<param name="MaxFileSize" value="100KB"/>
<param name="MaxBackupIndex" value="1"/>
<layout class="org.apache.log4j.PatternLayout">
<param name="ConversionPattern"
value="%-5p %d{dd-MM HH:mm:ss,SSS} [%c] (%F:%M:%L) -%m%n"/>
</layout>
</appender>
<logger name="org.myorg">
<level value="debug"/>
<appender-ref ref="logfile"/>
</logger>
<root>
<priority value="info"/>
<appender-ref ref="CONSOLE"/>
</root>
</log4j:configuration>
Rebuild the EJB and/or entire application. You will notice the unit tests executed during the build.
$ mvn clean install
...
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ basicejb-ejb ---
[INFO] Surefire report directory: /home/jcstaff/proj/basicejbEx/basicejb-ejb/target/surefire-reports

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.myorg.basicejb.ejb.ReservationTest
21:50:10,382 INFO  (ReservationTest.java:20) -*** testPing ***
21:50:10,390 DEBUG (ReservationEJB.java:26) -ping called
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.218 sec

Results :

Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
...
[INFO] BUILD SUCCESS
Notice that we automatically get a surefire execution and our JUnit test run. We get this because surefire is automatically brought in by the pom's packaging type *and* we ended the Java class name with Test. Looking at the plugin page, the default name patterns include:
**/Test*.java
**/*Test.java
**/*TestCase.java
As discussed on that same web page -- you can expand or shrink that list with the use of includes and excludes. This is commonly done to focus your testing around a specific unit test or to temporarily exclude all unit tests from the build to focus on a specific IT test (discussed later).
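For example, to temporarily focus the build on a single unit test class you could add a surefire configuration fragment like the following (a hypothetical fragment -- substitute the pattern for the class you want to run):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <includes>
      <include>**/ReservationTest.java</include>
    </includes>
  </configuration>
</plugin>
```

Remove the fragment (or just the includes element) to restore the default name patterns.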
Verify this is what you have so far.
.
|-- basicejb-ejb
|   |-- pom.xml
|   `-- src
|       |-- main
|       |   `-- java
|       |       `-- org
|       |           `-- myorg
|       |               `-- basicejb
|       |                   `-- ejb
|       |                       |-- ReservationEJB.java
|       |                       |-- ReservationLocal.java
|       |                       `-- ReservationRemote.java
|       `-- test
|           |-- java
|           |   `-- org
|           |       `-- myorg
|           |           `-- basicejb
|           |               `-- ejb
|           |                   `-- ReservationTest.java
|           `-- resources
|               `-- log4j.xml
`-- pom.xml
This set of steps is in preparation for the next chapter, where you will be asked to deploy the EJB built above to the server. In this section you will be shown how to start/stop a standalone server and how to set up and start/stop an embedded server.
Depending on the origin of your server setup, either verify or put in place the following in the JBoss/Wildfly application server configuration file to address logging.
# wildfly-13.0.0.Final/standalone/configuration/standalone.xml
<logger category="org.myorg">
    <level name="DEBUG"/>
</logger>
...
<root-logger>
    <level name="INFO"/>
    <handlers>
        <handler name="CONSOLE"/>
        <handler name="FILE"/>
    </handlers>
</root-logger>
The default configuration will permit all log entries INFO and above to be written to the CONSOLE and FILE appenders. The extra lines added for org.myorg define a new logger that will have messages DEBUG and above logged. The CONSOLE appender is normally throttled to only log INFO and above, so you should not see any impact of the changes above in the server's console output. The extra debug will be placed in the standalone/log/server.log file.
$ tail -f standalone/log/server.log
...
06:57:32,031 INFO  [info.ejava.examples.ejb.basic.ejb.GreeterEJB] (default task-1) *** GreeterEJB:init(1127977520) ***
06:57:32,031 DEBUG [info.ejava.examples.ejb.basic.ejb.GreeterEJB] (default task-1) sayHello()
06:57:32,113 DEBUG [info.ejava.examples.ejb.basic.ejb.GreeterEJB] (default task-3) sayHello(cat inhat)
...
The standalone application server runs as a separate process, either on the same machine as the development machine or on a remote machine. This approach more closely resembles the target server setup and can help test certain production-like configurations. This is normally managed through scripts.
Start the JBoss/Wildfly application server. In the command shown below:
-Djboss.server.base.dir - used to point to the root of the profile. If you create other profiles (by copying the standalone directory) you can point the script to that directory using this java property flag. This flag is not needed if you use the default standalone directory.
-c standalone.xml - used to point to a specific profile configuration file within the ${jboss.server.base.dir}/configuration directory. If you create alternate configurations (by copying the standalone.xml file) you can point the script to that configuration using this argument. This is useful to switch between alternate configurations but not required if you use the default standalone.xml configuration file.
wildfly-13.0.0.Final$ ./bin/standalone.sh
=========================================================================
  JBoss Bootstrap Environment
  JBOSS_HOME: /Users/jim/apps/wildfly-13.0.0.Final
  JAVA: java
  JAVA_OPTS: -server -Xms64m -Xmx512m -XX:MetaspaceSize=96M -XX:MaxMetaspaceSize=256m -Djava.net.preferIPv4Stack=true -Djboss.modules.system.pkgs=org.jboss.byteman -Djava.awt.headless=true
=========================================================================
10:14:01,139 INFO  [org.jboss.modules] (main) JBoss Modules version 1.8.5.Final
10:14:01,372 INFO  [org.jboss.msc] (main) JBoss MSC version 1.4.2.Final
10:14:01,379 INFO  [org.jboss.threads] (main) JBoss Threads version 2.3.2.Final
10:14:01,474 INFO  [org.jboss.as] (MSC service thread 1-2) WFLYSRV0049: WildFly Full 13.0.0.Final (WildFly Core 5.0.0.Final) starting
...
10:14:03,969 INFO  [org.jboss.as] (Controller Boot Thread) WFLYSRV0025: WildFly Full 13.0.0.Final (WildFly Core 5.0.0.Final) started in 3104ms - Started 358 of 581 services (356 services are lazy, passive or on-demand)
Shutdown the server using the command line interface (CLI)
wildfly-13.0.0.Final$ ./bin/jboss-cli.sh --connect command=:shutdown
{
    "outcome" => "success",
    "result" => undefined
}
Restart the server.
wildfly-13.0.0.Final$ ./bin/standalone.sh
...
Shutdown the server by pressing Control-C keys
^C10:17:19,557 INFO  [org.jboss.as.server] (Thread-2) WFLYSRV0236: Suspending server with no timeout.
10:17:19,558 INFO  [org.jboss.as.ejb3] (Thread-2) WFLYEJB0493: EJB subsystem suspension complete
10:17:19,559 INFO  [org.jboss.as.server] (Thread-2) WFLYSRV0220: Server shutdown has been requested via an OS signal
...
10:17:19,616 INFO  [org.jboss.as] (MSC service thread 1-1) WFLYSRV0050: WildFly Full 13.0.0.Final (WildFly Core 5.0.0.Final) stopped in 53ms
There is no negative impact from stopping the application server using Control-C key sequence.
The embedded application server places the server in the same JVM as the IDE. This allows for easy start, stop, and debug options for managing the server. Since the server is running inside the IDE, it can differ greatly from what is used in production.
The following steps assume JBoss AS Tools have been installed. If the server setup does not list a Wildfly 13.x server option, please install that option. Use the latest version if Wildfly 13.x is not yet available.
Add a new Wildfly (latest).x Server off the Servers tab of the JavaEE IDE profile. Accept all defaults.
Figure 47.1. Choose New Server
Locate Server tab in the Eclipse JavaEE Profile
Right-Click in panel and select New Server
Choose the most recent Wildfly configuration closest to the current version
Start the server by right-clicking on the server and selecting Start.
Stop the server by right-clicking on the server and selecting Stop.
Created parent module
Houses re-usable property, dependency and plugin definitions
Delegates build to implementation modules
Has no technical artifacts of its own
Created EJB module
EJB modules have one or more EJBs
an "EJB" is made up of an EJB class and optional @Remote and @Local interfaces
Functional unit tests can be included in the EJB module
Server management
Control logging verbosity
Start/Stop standalone server
Create and Start/Stop embedded server
EARs are Java archives that are used to house the overall application, with all of its components. The EAR can contain many EJB and WAR components as well as their dependencies (and a little-used Java EE Client type). The EAR is used to deploy the contained modules to the server. It is the original JavaEE deployment type aside from a naked EJB.
Naked EJB deployments are rare because they provide no construct to be deployed with dependencies. Everything required by the EJB must be a part of the EJB. The EAR, on the other hand, can deploy an EJB and any dependencies such that they are loaded by the same class loader. Since Naked EJB deployments are of limited use and present very few additional technical challenges -- we will not be addressing that type of deployment.
Create remaining Maven modules to support EAR deployment
Deploy an EJB
Create an IT test to communicate with and test the EAR-based EJB
A single Maven module can house the development of a single EAR. The bulk of the project is solely within the pom.xml as nearly all of its contents are brought in through dependencies.
Create the sub-project directory for the EAR.
$ mkdir basicejb-ear
Add the initial entries for the EAR pom.xml.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<parent>
<groupId>myorg.basicejb</groupId>
<artifactId>basicejbEx</artifactId>
<version>1.0-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>
<artifactId>basicejb-ear</artifactId>
<packaging>ear</packaging>
<name>Basic EJB Exercise::EAR</name>
<description>
This project provides a sample EAR for the Java EE components
associated with the overall project.
</description>
<dependencies>
</dependencies>
</project>
It is important to note that the packaging type is "ear" in this case. If you leave this out, Maven will default to a standard "jar" packaging type and not build the EAR correctly.
Add the EJB dependency to the EAR. Use exclusions to keep any unwanted 3rd party .jars from being brought along.
<dependencies>
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>basicejb-ejb</artifactId>
<version>${project.version}</version>
<type>ejb</type>
<exclusions>
<!-- server doesn't want to see already provided jars -->
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
</exclusion>
</exclusions>
</dependency>
Since our EJB pom declared the dependency on slf4j-api as scope=provided, the above exclusion is not necessary, but it is included as an example of how this can be done.
Verify the EAR builds.
$ mvn clean verify
...
[INFO] Building Basic EJB Exercise::EAR 1.0-SNAPSHOT
...
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ basicejb-ear ---
...
[INFO] --- maven-ear-plugin:2.8:generate-application-xml (default-generate-application-xml) @ basicejb-ear ---
[INFO] Generating application.xml
...
[INFO] --- maven-resources-plugin:2.4.3:resources (default-resources) @ basicejb-ear ---
...
[INFO] --- maven-ear-plugin:2.8:ear (default-ear) @ basicejb-ear ---
[INFO] Copying artifact [ejb:myorg.basicejb:basicejb-ejb:1.0-SNAPSHOT] to [basicejb-ejb-1.0-SNAPSHOT.jar]
[INFO] Could not find manifest file: .../basicejbEx/basicejb-ear/target/basicejb-ear-1.0-SNAPSHOT/META-INF/MANIFEST.MF - Generating one
[INFO] Building jar: .../basicejbEx/basicejb-ear/target/basicejb-ear-1.0-SNAPSHOT.ear
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
Inspect the generated EAR archive. Notice how the EJB we developed in the previous chapter and included as a dependency here was brought into the archive.
$ jar tf target/basicejb-ear-1.0-SNAPSHOT.ear
...
basicejb-ejb-1.0-SNAPSHOT.jar
META-INF/application.xml
...
Inspect the generated application.xml. There is a copy in target/application.xml, so you do not have to unzip the archive. Notice how the archive was registered within this required descriptor as an EJB. Without this designation the EJB will not be recognized as an EJB module by the container when deployed.
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE application PUBLIC
"-//Sun Microsystems, Inc.//DTD J2EE Application 1.3//EN"
"http://java.sun.com/dtd/application_1_3.dtd">
<application>
<display-name>basicejb-ear</display-name>
<description>This project provides a sample EAR for the Java EE components
associated with the overall project.</description>
<module>
<ejb>basicejb-ejb-1.0-SNAPSHOT.jar</ejb>
</module>
</application>
Add the EAR to the *root* level module and verify everything builds from the root.
<modules>
<module>basicejb-ejb</module>
<module>basicejb-ear</module>
</modules>
$ mvn clean install
...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Basic EJB Exercise ................................. SUCCESS [ 0.435 s]
[INFO] Basic EJB Exercise::EJB ............................ SUCCESS [ 3.063 s]
[INFO] Basic EJB Exercise::EAR ............................ SUCCESS [ 0.557 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
This is what our project looks like so far.
.
|-- basicejb-ear
|   `-- pom.xml
|-- basicejb-ejb
|   |-- pom.xml
|   `-- src
|       |-- main
|       |   `-- java
|       |       `-- org
|       |           `-- myorg
|       |               `-- basicejb
|       |                   `-- ejb
|       |                       |-- ReservationEJB.java
|       |                       |-- ReservationLocal.java
|       |                       `-- ReservationRemote.java
|       `-- test
|           |-- java
|           |   `-- org
|           |       `-- myorg
|           |           `-- basicejb
|           |               `-- ejb
|           |                   `-- ReservationTest.java
|           `-- resources
|               `-- log4j.xml
`-- pom.xml
Any tests we implement within the EJB module itself would likely be POJO-level unit tests. EJB 3.2 does provide a means to create a lightweight EJB container to be used as a test harness, but this does not substitute for honest end-to-end testing using a server deployment of the EJB/EAR and external test clients. We will create an additional module to deploy the EAR, locate the server and EJB remote interface, and test the EJB through that interface. We can reuse tests from lower levels, but that will not be shown as part of this exercise. This module will have no target artifact (i.e., artifactId.jar) that we care about. One could do some tweaking of the pom.xml to keep that from being generated, but I have found that to only confuse Eclipse, so we'll just live with an empty, unused RMI Test .jar.
Create the sub-project directory for the RMI Test.
$ mkdir basicejb-test
Create the pom.xml for the RMI Test module.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<parent>
<artifactId>basicejbEx</artifactId>
<groupId>myorg.basicejb</groupId>
<version>1.0-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>
<artifactId>basicejb-test</artifactId>
<packaging>jar</packaging>
<name>Basic EJB Exercise::Remote Test</name>
<description>
This project provides an example RMI Test project.
</description>
<dependencies>
</dependencies>
<build>
<plugins>
</plugins>
</build>
</project>
Add the dependencies to the Test/pom.xml required to use logging and JUnit.
<!-- core dependencies -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<scope>test</scope>
</dependency>
<!-- test dependencies -->
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<scope>test</scope>
</dependency>
Like before, the maven pom.xml will not validate until we populate the parent pom with version information for the new dependencies added. Luckily we added these dependencyManagement declarations while adding the unit test within the EJB module.
Notice we will silently also inherit the maven-compiler-plugin definition from the parent. We don't have to repeat any work to get a properly configured compiler.
This begins to show how work we do at the parent pom.xml can be used to keep child modules consistent and allow child modules the flexibility to determine whether they should or should not include a particular dependency.
Add the dependencies required to be an RMI client of JBoss/Wildfly. The dependencies required and how we define them have varied over the years -- so I created a module ("info.ejava.examples.common:jboss-rmi-client") within the course examples as a single entry point.
<!-- dependencies used for remote interface -->
<dependency>
<groupId>info.ejava.examples.common</groupId>
<artifactId>jboss-rmi-client</artifactId>
<type>pom</type>
<scope>test</scope>
</dependency>
The dependency added above is just a "pom" dependency. However, it will bring in dependencies that bring in other dependencies related to a remote EJB and JMS client of JBoss.
Recent JBoss/Wildfly packaging has made my use of jboss-rmi-client across the course examples less valuable than in the past. If you take a look at the implementation of that module, you will see it simply creates a dependency on org.wildfly.bom:ejb-client-bom and org.wildfly.bom:jms-client-bom. You can reduce your dependencies by declaring a direct dependency on org.wildfly.bom:ejb-client-bom in this case.
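If you choose that route, a sketch of the direct declaration might look like the following. The artifact coordinates come from the paragraph above; the ${wildfly.version} property is an assumption and would need to be defined in your parent pom to match your Wildfly release:

```xml
<!-- direct declaration in place of jboss-rmi-client (sketch) -->
<dependency>
  <groupId>org.wildfly.bom</groupId>
  <artifactId>ejb-client-bom</artifactId>
  <version>${wildfly.version}</version> <!-- assumed property -->
  <type>pom</type>
  <scope>test</scope>
</dependency>
```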
Add a definition of the above dependency to your parent pom
<properties>
...
<ejava.version>5.0.0-SNAPSHOT</ejava.version>
...
<dependency>
<groupId>info.ejava.examples.common</groupId>
<artifactId>jboss-rmi-client</artifactId>
<version>${ejava.version}</version>
<type>pom</type>
</dependency>
Attempt to resolve all dependencies at this point. If you don't yet have a copy of the info.ejava.examples.common#jboss-rmi-client in your repository, this will fail. You can use the dependency:go-offline goal to make sure you have everything your project needs.
(from basicejb-test directory)
$ mvn dependency:go-offline
...
[WARNING] The POM for info.ejava.examples.common:jboss-rmi-client:pom:5.0.0-SNAPSHOT is missing, no dependency information available
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
...
[ERROR] Failed to execute goal on project basicejb-test: Could not resolve dependencies for project myorg.basicejb:basicejb-test:jar:1.0-SNAPSHOT: Could not find artifact info.ejava.examples.common:jboss-rmi-client:pom:5.0.0-SNAPSHOT -> [Help 1]
Add the repository information required to resolve info.ejava.examples.common#jboss-rmi-client and its dependencies from the Internet. We need one entry for the SNAPSHOT release. Place the following in the root module pom.xml.
<repositories>
<repository>
<id>webdev-snapshot</id>
<name>ejava webdev snapshot repository</name>
<url>http://webdev.jhuep.com/~jcs/maven2-snapshot</url>
<releases>
<enabled>false</enabled>
</releases>
<snapshots>
<enabled>true</enabled>
<updatePolicy>daily</updatePolicy>
</snapshots>
</repository>
</repositories>
(from basicejb-test directory)
$ mvn dependency:go-offline
...
[INFO] <<< maven-dependency-plugin:2.8:go-offline (default-cli) < :resolve-plugins @ basicejb-test <<<
Downloading: http://webdev.jhuep.com/~jcs/maven2-snapshot/info/ejava/examples/common/jboss-rmi-client/5.0.0-SNAPSHOT/maven-metadata.xml
Downloaded: http://webdev.jhuep.com/~jcs/maven2-snapshot/info/ejava/examples/common/jboss-rmi-client/5.0.0-SNAPSHOT/maven-metadata.xml (621 B at 0.8 KB/sec)
Downloading: http://webdev.jhuep.com/~jcs/maven2-snapshot/info/ejava/examples/common/jboss-rmi-client/5.0.0-SNAPSHOT/jboss-rmi-client-5.0.0-20141001.053140-16.pom
Downloaded: http://webdev.jhuep.com/~jcs/maven2-snapshot/info/ejava/examples/common/jboss-rmi-client/5.0.0-SNAPSHOT/jboss-rmi-client-5.0.0-20141001.053140-16.pom (5 KB at 77.3 KB/sec)
...
[INFO] BUILD SUCCESS
Maven keeps track of which repositories it has checked for a resource and when, so that it can throttle attempts to resolve artifacts during a normal build. Note that in the repository definition we created, we set the updatePolicy to daily. If you make an error and wish to coldstart Maven's knowledge of that artifact, simply delete its directory from the localRepository. You can also try adding the -U (update snapshots) flag to the command line.
$ rm -rf $HOME/.m2/repository/info/ejava/examples/common/jboss-rmi-client/5.0.0-SNAPSHOT
Create a JNDI configuration for JBoss Remoting. Use variable references to the server to better support different configurations. Place this file in src/test/resources. We will describe the contents of the file later once we get tangible values inserted.
$ mkdir -p basicejb-test/src/test/resources
$ vi basicejb-test/src/test/resources/jndi.properties
$ cat basicejb-test/src/test/resources/jndi.properties
#jndi.properties
java.naming.factory.initial=${java.naming.factory.initial}
java.naming.factory.url.pkgs=${java.naming.factory.url.pkgs}
java.naming.provider.url=${java.naming.provider.url}
#java.naming.security.principal=${jndi.user}
#java.naming.security.credentials=${jndi.password}
Build the RMI Test project after the jndi.properties file is in place.
$ mvn clean process-test-resources
...
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ basicejb-test ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
...
[INFO] BUILD SUCCESS
Notice the properties file was copied from the src/test tree to target/test-classes without modification. We need to make more changes so the properties within the file get assigned to actual values from our environment.
$ cat basicejb-test/target/test-classes/jndi.properties
#jndi.properties
java.naming.factory.initial=${java.naming.factory.initial}
java.naming.factory.url.pkgs=${java.naming.factory.url.pkgs}
java.naming.provider.url=${java.naming.provider.url}
#java.naming.security.principal=${jndi.user}
#java.naming.security.credentials=${jndi.password}
Add resource filtering to test resources in the pom.xml. This will cause the jndi.properties file to have its variables replaced with physical values when copied to the target tree.
<build>
<!-- filter test/resource files for profile-specific values -->
<testResources>
<testResource>
<directory>src/test/resources</directory>
<filtering>true</filtering>
<includes>
<include>**/*.properties</include>
</includes>
</testResource>
<testResource>
<directory>src/test/resources</directory>
<filtering>false</filtering>
<excludes>
<exclude>**/*.properties</exclude>
</excludes>
</testResource>
</testResources>
The above definition restricts filtering to a specific pattern of files, leaving all other files unfiltered. This is more verbose but recommended: filtering files that were not meant to be filtered causes issues. Binary files (e.g., PKI certs) and variables meant to be expanded in the deployment environment do not work well with filtering.
Rebuild the RMI Test module and notice two copies take place; one for the filtered set and a second for the unfiltered set.
$ mvn clean process-test-resources
...
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ basicejb-test ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Copying 0 resource
...
[INFO] BUILD SUCCESS
However, our jndi.properties file in the target tree still looks the same. That is because we have not defined the referenced variables in the environment.
$ cat basicejb-test/target/test-classes/jndi.properties
#jndi.properties
java.naming.factory.initial=${java.naming.factory.initial}
java.naming.factory.url.pkgs=${java.naming.factory.url.pkgs}
java.naming.provider.url=${java.naming.provider.url}
#java.naming.security.principal=${jndi.user}
#java.naming.security.credentials=${jndi.password}
Add the properties referenced in jndi.properties to the *root* pom.xml.
$ cat pom.xml
<properties>
...
<jboss.host>localhost</jboss.host>
<jboss.http.port>8080</jboss.http.port>
<jndi.user>known</jndi.user>
<jndi.password>password1!</jndi.password>
<java.naming.factory.initial>org.jboss.naming.remote.client.InitialContextFactory</java.naming.factory.initial>
<java.naming.provider.url>http-remoting://${jboss.host}:${jboss.http.port}</java.naming.provider.url>
<java.naming.factory.url.pkgs/>
</properties>
Rebuild the RMI Test module and note the contents of the jndi.properties file in the target/test-classes tree should be expanded with the properties defined in the root pom.xml.
$ mvn clean process-test-resources
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ basicejb-test ---
...
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Copying 0 resource
...
[INFO] BUILD SUCCESS
$ cat target/test-classes/jndi.properties
#jndi.properties
java.naming.factory.initial=org.wildfly.naming.client.WildFlyInitialContextFactory
java.naming.factory.url.pkgs=
java.naming.provider.url=http-remoting://127.0.0.1:8080
#java.naming.security.principal=known
#java.naming.security.credentials=password1!
java.naming.factory.initial - an implementation class for the InitialContext() created by the Java code. The WildFlyInitialContextFactory will be able to easily switch between JBoss Remoting and EJB Client JNDI names.
java.naming.factory.url.pkgs - a list of java packages to search in the classpath to resolve a well-known-named class handler for custom name prefixes (e.g., ejb:). However, since we are using the custom WildFlyInitialContextFactory as our factory.initial, there is no need to list anything here to resolve the "http-remoting" protocol and the "ejb:" naming prefix.
java.naming.provider.url - URL to the JNDI provider. JBoss uses the HTTP port and protocol to initiate communications for JNDI lookups.
java.naming.security.principal and java.naming.security.credentials - credentials to use when interacting with the server. In some cases we have the server locked down so that JNDI lookups require credentials prior to even getting to the application. If your Java code does not supply credentials -- this can be used to authenticate with the server.
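For reference, the same environment can be supplied to the InitialContext programmatically instead of through a classpath jndi.properties file. The following minimal sketch mirrors the filtered property values above (the localhost host and 8080 port are the root pom defaults); it only assembles the environment and prints one entry, and does not contact a server:

```java
import java.util.Hashtable;
import javax.naming.Context;

public class JndiEnvExample {
    public static void main(String[] args) {
        // Programmatic equivalent of the generated jndi.properties entries
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY,
                "org.wildfly.naming.client.WildFlyInitialContextFactory");
        env.put(Context.URL_PKG_PREFIXES, "");
        env.put(Context.PROVIDER_URL, "http-remoting://localhost:8080");
        // new InitialContext(env) would use these values in place of
        // any jndi.properties found on the classpath
        System.out.println(env.get(Context.PROVIDER_URL));
    }
}
```

Passing this Hashtable to new InitialContext(env) takes precedence over a jndi.properties file, which can be handy when values must be computed at runtime.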
Create a JUnit IT test that will lookup the EJB and invoke the ping method.
$ mkdir -p basicejb-test/src/test/java/org/myorg/basicejb/earejb
$ cat basicejb-test/src/test/java/org/myorg/basicejb/earejb/ReservationIT.java
package org.myorg.basicejb.earejb;
import javax.naming.InitialContext;
import javax.naming.NamingException;
import org.junit.Before;
import org.junit.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class ReservationIT {
private static final Logger logger = LoggerFactory.getLogger(ReservationIT.class);
private InitialContext jndi;
@Before
public void setUp() throws NamingException {
logger.debug("getting jndi initial context");
jndi=new InitialContext();
logger.debug("jndi={}", jndi.getEnvironment());
}
@Test
public void testPing() {
logger.info("*** testPing ***");
}
}
The above JUnit test has been purposely ended with the capital letters "IT" to represent an integration test. This will be treated differently from JUnit test cases ending with *Test. IT tests run during a later set of phases that account for deployment, testing, undeploy, and results verification in separate phases instead of the single test phase used by *Test test cases. *Test JUnit test cases are used for in-process unit tests. *IT test cases are used for integration tests that could span multiple processes requiring extra work. Unit tests are handled by the maven-surefire-plugin. Integration tests are handled by the maven-failsafe-plugin. More in a moment...
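For comparison with the surefire patterns listed earlier, the failsafe plugin's documented default name patterns are:

**/IT*.java

**/*IT.java

**/*ITCase.java

Our ReservationIT class matches the second pattern, which is why no extra include configuration is needed once failsafe is wired into the build.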
Add a log4j.xml file to configure Log4j loggers. You may use the same file you put into your EJB.
$ cp basicejb-ejb/src/test/resources/log4j.xml basicejb-test/src/test/resources/
$ cat basicejb-test/src/test/resources/log4j.xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration PUBLIC
"-//APACHE//DTD LOG4J 1.2//EN" "http://logging.apache.org/log4j/1.2/apidocs/org/apache/log4j/xml/doc-files/log4j.dtd">
<log4j:configuration
xmlns:log4j="http://jakarta.apache.org/log4j/"
debug="false">
<appender name="CONSOLE" class="org.apache.log4j.ConsoleAppender">
<param name="Target" value="System.out"/>
<layout class="org.apache.log4j.PatternLayout">
<param name="ConversionPattern" value="%d{HH:mm:ss,SSS} %-5p (%F:%L) -%m%n"/>
</layout>
</appender>
<appender name="logfile" class="org.apache.log4j.RollingFileAppender">
<param name="File" value="target/log4j-out.txt"/>
<param name="Append" value="false"/>
<param name="MaxFileSize" value="100KB"/>
<param name="MaxBackupIndex" value="1"/>
<layout class="org.apache.log4j.PatternLayout">
<param name="ConversionPattern"
value="%-5p %d{dd-MM HH:mm:ss,SSS} [%c] (%F:%M:%L) -%m%n"/>
</layout>
</appender>
<logger name="org.myorg">
<level value="debug"/>
<appender-ref ref="logfile"/>
</logger>
<root>
<priority value="info"/>
<appender-ref ref="CONSOLE"/>
</root>
</log4j:configuration>
Try building the Test module at this point. Notice that no tests attempted to run. That is because the "Tests run" reported are surefire unit tests and we have no unit tests in this module. All our tests (1) are integration tests. Okay...why didn't our integration test run? The failsafe plugin, unlike the surefire plugin, does not run automatically. We must wire it into the build.
You will also notice the extra resources copy for the log4j.xml we put into src/test/resources. It was not filtered because it did not match the include pattern.
$ cd basicejb-test; mvn clean install
...
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ basicejb-test ---
[INFO] Deleting /home/jcstaff/proj/basicejbEx/basicejb-test/target
...
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ basicejb-test ---
...
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ basicejb-test ---
...
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ basicejb-test ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Copying 1 resource
...
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ basicejb-test ---
[INFO] Compiling 1 source file to /home/jcstaff/proj/basicejbEx/basicejb-test/target/test-classes
...
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ basicejb-test ---
...
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ basicejb-test ---
...
[INFO] --- maven-install-plugin:2.4:install (default-install) @ basicejb-test ---
...
[INFO] BUILD SUCCESS
Declare the failsafe plugin to your RMI Test/pom.xml to cause our JUnit IT test to be attempted.
<plugins>
<!-- adds IT integration tests to the build -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
</plugin>
Define the failsafe plugin in the parent module. This definition will have...
version for the plugin
goals to execute for the plugin. Other than "help", these are the only two goals the plugin supports.
pre-integration-test - no plugin goals are bound to this build phase yet. This is where we will want to deploy the EAR.
integration-test failsafe goal is automatically bound to the integration-test build phase to run the IT tests.
post-integration-test - no plugin goals are bound to this build phase yet. This is where we will want to undeploy the EAR.
verify goal is automatically bound to the verify phase to evaluate the IT test results and potentially fail the build.
argLine definition that will permit remote debugging of the IT test if needed to be run within the full Maven lifecycle.
<properties>
...
<maven-failsafe-plugin.version>2.22.0</maven-failsafe-plugin.version>
...
<build>
<pluginManagement>
<plugins>
...
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<version>${maven-failsafe-plugin.version}</version>
<configuration>
<argLine>${surefire.argLine}</argLine>
</configuration>
<executions>
<execution>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
</execution>
</executions>
</plugin>
...
<profiles>
<profile> <!-- tells surefire/failsafe to run JUnit tests with remote debug -->
<id>debugger</id>
<activation>
<property>
<name>debugger</name>
</property>
</activation>
<properties>
<surefire.argLine>-Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000 -Xnoagent -Djava.compiler=NONE</surefire.argLine>
</properties>
</profile>
The debugger profile is one we have added before for the surefire plugin. Activating this profile during the build will cause failsafe to suspend until a debugger client connects and commands it to continue. This is only useful when you must run the IT test within the full Maven build. Ideally you would instead debug the IT test inside the IDE debugger.
Try building the Test module now that the failsafe plugin has been correctly declared. Notice how our IT test runs within the integration-test phase. It is not doing much yet but we are purposely taking baby steps to explain every corner of the multi-module build.
$ cd basicejb-test; mvn clean install
...
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ basicejb-test ---
...
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ basicejb-test ---
...
[INFO] --- maven-failsafe-plugin:2.22.0:integration-test (default) @ basicejb-test ---
...
-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.myorg.basicejb.earejb.ReservationIT
20:51:22,679 DEBUG (ReservationIT.java:18) -getting jndi initial context
20:51:22,815 INFO  (Xnio.java:92) -XNIO version 3.2.2.Final
20:51:22,925 INFO  (NioXnio.java:56) -XNIO NIO Implementation Version 3.2.2.Final
20:51:23,028 INFO  (EndpointImpl.java:69) -JBoss Remoting version 4.0.3.Final
20:51:23,180 DEBUG (ReservationIT.java:20) -jndi={java.naming.factory.initial=org.jboss.naming.remote.client.InitialContextFactory, java.naming.provider.url=http-remoting://localhost:8080, java.naming.factory.url.pkgs=, jboss.naming.client.ejb.context=true}
20:51:23,181 INFO  (ReservationIT.java:26) -*** testPing ***
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.767 sec - in org.myorg.basicejb.earejb.ReservationIT

Results :

Tests run: 1, Failures: 0, Errors: 0, Skipped: 0

[WARNING] File encoding has not been set, using platform encoding UTF-8, i.e. build is platform dependent!
[INFO]
[INFO] --- maven-failsafe-plugin:2.22.0:verify (default) @ basicejb-test ---
[INFO] Failsafe report directory: /home/jcstaff/proj/basicejbEx/basicejb-test/target/failsafe-reports
[WARNING] File encoding has not been set, using platform encoding UTF-8, i.e. build is platform dependent!
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ basicejb-test ---
...
[INFO] BUILD SUCCESS
The IT test is not yet doing enough to indicate whether the server is running or not.
Add an additional line to the @Before method. This will perform a remote lookup of the "jms" JNDI context. If the server is up and knows about this context, the lookup (and our test) will continue to succeed. However, if the server is down or does not know about the context, it will fail.
@Before
public void setUp() throws NamingException {
logger.debug("getting jndi initial context");
jndi=new InitialContext();
logger.debug("jndi={}", jndi.getEnvironment());
jndi.lookup("jms");
}
Re-run the IT test with the server stopped. Note that the test error occurs during the integration-test phase, but the build failure is not reported until the verify phase.
$ cd basicejb-test; mvn clean install
...
-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.myorg.basicejb.earejb.ReservationIT
...
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 5.668 sec <<< FAILURE! - in org.myorg.basicejb.earejb.ReservationIT
testPing(org.myorg.basicejb.earejb.ReservationIT)  Time elapsed: 5.471 sec  <<< ERROR!
javax.naming.CommunicationException: Failed to connect to any server. Servers tried: [http-remoting://localhost:8080 (Operation failed with status WAITING after 5000 MILLISECONDS)]
	at org.jboss.naming.remote.protocol.IoFutureHelper.get(IoFutureHelper.java:97)
	at org.jboss.naming.remote.client.HaRemoteNamingStore.failOverSequence(HaRemoteNamingStore.java:198)
	at org.jboss.naming.remote.client.HaRemoteNamingStore.namingStore(HaRemoteNamingStore.java:149)
	at org.jboss.naming.remote.client.HaRemoteNamingStore.namingOperation(HaRemoteNamingStore.java:130)
	at org.jboss.naming.remote.client.HaRemoteNamingStore.lookup(HaRemoteNamingStore.java:272)
	at org.jboss.naming.remote.client.RemoteContext.lookup(RemoteContext.java:87)
	at org.jboss.naming.remote.client.RemoteContext.lookup(RemoteContext.java:129)
	at javax.naming.InitialContext.lookup(InitialContext.java:411)
	at org.myorg.basicejb.earejb.ReservationIT.setUp(ReservationIT.java:22)

Results :

Tests in error:
  ReservationIT.setUp:22 » Communication Failed to connect to any server. Server...

Tests run: 1, Failures: 0, Errors: 1, Skipped: 0
...
[INFO] --- maven-failsafe-plugin:2.22.0:verify (default) @ basicejb-test ---
...
[INFO] BUILD FAILURE
Re-run the IT test with the server running.
$ cd basicejb-test; mvn clean install
...
[INFO] BUILD SUCCESS
Register the RMI Test module with the root pom and perform a root-level build.
<modules>
<module>basicejb-ejb</module>
<module>basicejb-ear</module>
<module>basicejb-test</module>
</modules>
$ mvn clean install
...
[INFO] Reactor Summary:
[INFO]
[INFO] Basic EJB Exercise ................................. SUCCESS [  0.564 s]
[INFO] Basic EJB Exercise::EJB ............................ SUCCESS [  5.183 s]
[INFO] Basic EJB Exercise::EAR ............................ SUCCESS [  0.951 s]
[INFO] Basic EJB Exercise::Remote Test .................... SUCCESS [  0.731 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
In the previous section we ended with the IT test communicating with the server's JNDI tree. In this section we will *finally* deploy the EAR, lookup the @Remote interface of our EJB, and invoke our first method.
We want tests to be as automated as possible. This allows us to simplify testing as well as leverage continuous integration techniques (e.g., CruiseControl, Hudson, Jenkins for nightly builds/tests). To help automate this we are going to leverage the Maven cargo plugin. Cargo itself is a Java library used to manage Java EE containers; the Maven cargo plugin simply makes it callable from within Maven. We will add the cargo plugin to the RMI Test project (to deploy the application) since the application is not ready to be deployed until after the EAR is built.
Declare the cargo plugin in the RMI Test pom.xml to deploy the EAR to JBoss. As always, we will only put what is specific to this module in the module's pom.xml.
$ cat basicejb-test/pom.xml
...
<build>
<plugins>
...
<!-- artifacts to deploy to server -->
<plugin>
<groupId>org.codehaus.cargo</groupId>
<artifactId>cargo-maven2-plugin</artifactId>
<configuration>
<deployables>
<deployable>
<groupId>${project.groupId}</groupId>
<artifactId>basicejb-ear</artifactId>
<type>ear</type>
</deployable>
</deployables>
</configuration>
</plugin>
Cargo requires the module to be deployed to also be a scope=compile dependency of the local module. Since this is a Test module with no dependents -- we can add that without concern.
<!-- cargo requires scope=compile dependencies on deployables -->
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>basicejb-ear</artifactId>
<type>ear</type>
<version>${project.version}</version>
</dependency>
Define the details of the cargo plugin in the parent pom. Since cargo is not specific to any one container -- there is a good bit that requires configuring. The details are a mouthful but, in short, this tells cargo to deploy our artifacts to a running JBoss server (of a specific version), listening on a specific admin address:port, and where to place the runtime logs from this activity.
<properties>
...
<cargo-maven2-plugin.version>1.4.3</cargo-maven2-plugin.version>
<cargo.containerId>wildfly9x</cargo.containerId>
<wildfly-controller-client.version>8.2.1.Final</wildfly-controller-client.version>
<jboss.mgmt.host>${jboss.host}</jboss.mgmt.host>
<jboss.mgmt.port>9990</jboss.mgmt.port>
...
<build>
<pluginManagement>
<plugins>
<plugin>
...
<groupId>org.codehaus.cargo</groupId>
<artifactId>cargo-maven2-plugin</artifactId>
<version>${cargo-maven2-plugin.version}</version>
<configuration>
<container>
<containerId>${cargo.containerId}</containerId>
<type>remote</type>
<log>target/server.log</log>
<output>target/output.log</output>
</container>
<configuration>
<type>runtime</type>
<properties>
<cargo.hostname>${jboss.mgmt.host}</cargo.hostname>
<cargo.jboss.management.port>${jboss.mgmt.port}</cargo.jboss.management.port>
</properties>
</configuration>
</configuration>
<dependencies>
<dependency>
<groupId>org.wildfly</groupId>
<artifactId>wildfly-controller-client</artifactId>
<version>${wildfly-controller-client.version}</version>
</dependency>
</dependencies>
<executions>
<execution>
<id>cargo-prep</id>
<phase>pre-integration-test</phase>
<goals>
<goal>redeploy</goal>
</goals>
</execution>
<execution>
<id>cargo-post</id>
<phase>post-integration-test</phase>
<goals>
<goal>undeploy</goal>
</goals>
</execution>
</executions>
</plugin>
I have not had time to investigate updating cargo to the latest configurations for WildFly. This is mostly because the older version (and containerId) still works with the newer software and a quick upgrade of the cargo version number was not immediately successful.
Rebuild the RMI Test module and note that the EAR is deployed to the JBoss server prior to running the integration tests with failsafe and is undeployed after they finish.
...
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ basicejb-test ---
...
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ basicejb-test ---
...
[INFO] --- cargo-maven2-plugin:1.4.3:redeploy (cargo-prep) @ basicejb-test ---
Oct 05, 2014 11:17:36 PM org.xnio.Xnio <clinit>
INFO: XNIO version 3.2.2.Final
Oct 05, 2014 11:17:36 PM org.xnio.nio.NioXnio <clinit>
INFO: XNIO NIO Implementation Version 3.2.2.Final
Oct 05, 2014 11:17:36 PM org.jboss.remoting3.EndpointImpl <clinit>
INFO: JBoss Remoting version 4.0.3.Final
...
[INFO] --- maven-failsafe-plugin:2.22.0:integration-test (default) @ basicejb-test ---
...
Running org.myorg.basicejb.earejb.ReservationIT
...
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
...
[INFO] --- cargo-maven2-plugin:1.4.3:undeploy (cargo-post) @ basicejb-test ---
...
[INFO] --- maven-failsafe-plugin:2.22.0:verify (default) @ basicejb-test ---
...
[INFO] BUILD SUCCESS
The following should have been output at the JBoss console and server.log. The "java:" names are JNDI names that can be used to locate the local and remote interfaces of our ReservationEJB.
23:17:37,356 INFO  [org.jboss.as.ejb3.deployment.processors.EjbJndiBindingsDeploymentUnitProcessor] (MSC service thread 1-1) JNDI bindings for session bean named ReservationEJB in deployment unit subdeployment "basicejb-ejb-1.0-SNAPSHOT.jar" of deployment "basicejb-ear-1.0-SNAPSHOT.ear" are as follows:

	java:global/basicejb-ear-1.0-SNAPSHOT/basicejb-ejb-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationLocal
	java:app/basicejb-ejb-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationLocal
	java:module/ReservationEJB!org.myorg.basicejb.ejb.ReservationLocal
	java:global/basicejb-ear-1.0-SNAPSHOT/basicejb-ejb-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
	java:app/basicejb-ejb-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
	java:module/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
	java:jboss/exported/basicejb-ear-1.0-SNAPSHOT/basicejb-ejb-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote

23:17:37,368 INFO  [org.jboss.weld.deployer] (MSC service thread 1-2) JBAS016005: Starting Services for CDI deployment: basicejb-ear-1.0-SNAPSHOT.ear
23:17:37,376 INFO  [org.jboss.weld.deployer] (MSC service thread 1-3) JBAS016008: Starting weld service for deployment basicejb-ear-1.0-SNAPSHOT.ear
23:17:38,335 INFO  [org.jboss.as.server] (management-handler-thread - 1) JBAS018559: Deployed "basicejb-ear-1.0-SNAPSHOT.ear" (runtime-name : "basicejb-ear-1.0-SNAPSHOT.ear")
The only name available to our external RMI client is the one that starts with "java:jboss/exported/". That JNDI name is available to external clients -- like our IT test.
java:jboss/exported/basicejb-ear-1.0-SNAPSHOT/basicejb-ejb-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
The following will be the base JNDI name of the EJB deployed by the EAR.
basicejb-ear-1.0-SNAPSHOT/basicejb-ejb-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
When your application does not deploy correctly, the most valuable information is typically in the server.log and not in the cargo client log. Applications usually fail to deploy because of a missing or mis-configured dependency/resource, and the server.log will be necessary to determine what to correct.
Each EJB interface will have an entry in the JNDI tree. Clients use the JNDI tree to locate the interface object they need based on a hierarchical name. The names available locally within the server were standardized in JavaEE 6. However, that specification did not cover external references -- so we have to peek at what JBoss reports as the exported name.
java:jboss/exported/basicejb-ear-1.0-SNAPSHOT/basicejb-ejb-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
java: - JNDI naming prefix used to determine which implementation is used to lookup the name. This specific prefix is for local names.
jboss/exported/ - names below this context are available outside the server and exclude this portion of the name.
basicejb-ear-1.0-SNAPSHOT - name of the deployable artifact. In this case the EAR was deployed and the name included the maven full artifact and version name.
basicejb-ejb-1.0-SNAPSHOT - name of the EJB component. It too has its full artifact name and version number applied by maven.
ReservationEJB! - name of the EJB. If not changed by the @Stateless annotation or deployment descriptor -- this will be the same name as the POJO class name.
org.myorg.basicejb.ejb.ReservationRemote - fully qualified class name of the remote interface.
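To make the decomposition above concrete, here is a small sketch using plain JDK string handling. This parsing helper is our own illustration for the exercise, not part of any JBoss or JNDI API:

```java
// Illustrative only: split the exported EJB JNDI name into the parts
// described above using plain String operations.
public class JndiNameParts {
    public static void main(String[] args) {
        String name = "basicejb-ear-1.0-SNAPSHOT/basicejb-ejb-1.0-SNAPSHOT/"
                + "ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote";

        String[] path = name.split("/");
        String earName = path[0];                 // deployed EAR artifact (with version)
        String moduleName = path[1];              // EJB module artifact (with version)
        String beanName = path[2].split("!")[0];  // EJB name (defaults to class name)
        String viewClass = path[2].split("!")[1]; // fully qualified remote interface

        System.out.println(earName);    // basicejb-ear-1.0-SNAPSHOT
        System.out.println(moduleName); // basicejb-ejb-1.0-SNAPSHOT
        System.out.println(beanName);   // ReservationEJB
        System.out.println(viewClass);  // org.myorg.basicejb.ejb.ReservationRemote
    }
}
```

This is the same name structure our IT test will rely on when the failsafe plugin passes the name in as a system property.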
Add the above JNDI name for the @Remote interface in the RMI Test failsafe configuration so that our IT test does not have to know about version numbers. The following is equivalent to passing -Djndi.name.reservation to the JVM.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<configuration>
<systemPropertyVariables>
<jndi.name.reservation>ejb:basicejb-ear-${project.version}/basicejb-ejb-${project.version}/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote</jndi.name.reservation>
</systemPropertyVariables>
</configuration>
</plugin>
When we prefix the JNDI name with "ejb:", the more EJB-aware and efficient EJB Client is used for communication with our EJB. Leaving the prefix off will still work, but communication would then fall back to JBoss Remoting -- which we will reserve for non-EJB communications like JMS. Get in the habit of always prefixing EJB JNDI names with "ejb:".
Add the dependency on the ejb-client.jar to the RMI Test module. This goes in the pom's root dependencies section. The jar was built by the maven-ejb-plugin when we built the EJB module.
<!-- brings in the EJB-client jar file w/o the EJB -->
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>basicejb-ejb</artifactId>
<version>${project.version}</version>
<type>ejb-client</type>
<scope>test</scope>
</dependency>
Add the handling of the provided JNDI name to the IT class by adding the following snippets of code. Note the JNDI name passed as a system property by failsafe.
import static org.junit.Assert.*;
...
public class ReservationIT {
...
private static final String reservationJNDI = System.getProperty("jndi.name.reservation");
@Before
public void setUp() throws NamingException {
assertNotNull("jndi.name.reservation not supplied", reservationJNDI);
...
logger.debug("jndi name:{}", reservationJNDI);
}
Add a lookup of the JNDI name and some debug output of the remote interface that came back. We should now have something we can communicate with.
import org.myorg.basicejb.ejb.ReservationRemote;
...
public class ReservationIT {
...
private ReservationRemote reservationist;
@Before
public void setUp() throws NamingException {
...
reservationist = (ReservationRemote) jndi.lookup(reservationJNDI);
logger.debug("reservationist={}", reservationist);
}
Add a call to the ReservationRemote.ping() method in the testPing() @Test method. This should complete our initial end-to-end IT test.
public class ReservationIT {
...
@Test
public void testPing() throws NamingException {
...
reservationist.ping();
}
}
Build the application from the root. Note the JNDI lookup of the @Remote interface and call to ping() that took place.
...
[INFO] --- cargo-maven2-plugin:1.4.3:redeploy (cargo-prep) @ basicejb-test ---
Oct 06, 2014 12:16:23 AM org.xnio.Xnio <clinit>
INFO: XNIO version 3.2.2.Final
Oct 06, 2014 12:16:23 AM org.xnio.nio.NioXnio <clinit>
INFO: XNIO NIO Implementation Version 3.2.2.Final
Oct 06, 2014 12:16:23 AM org.jboss.remoting3.EndpointImpl <clinit>
INFO: JBoss Remoting version 4.0.3.Final
...
[INFO] --- maven-failsafe-plugin:2.22.0:integration-test (default) @ basicejb-test ---
...
Running org.myorg.basicejb.earejb.ReservationIT
00:16:25,660 DEBUG (ReservationIT.java:23) -getting jndi initial context
...
00:16:26,033 DEBUG (ReservationIT.java:25) -jndi={java.naming.factory.initial=org.jboss.naming.remote.client.InitialContextFactory, java.naming.provider.url=http-remoting://localhost:8080, java.naming.factory.url.pkgs=, jboss.naming.client.ejb.context=true}
...
00:16:26,545 DEBUG (ReservationIT.java:28) -jndi name:basicejb-ear-1.0-SNAPSHOT/basicejb-ejb-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
00:16:26,745 DEBUG (ReservationIT.java:30) -reservationist=Proxy for remote EJB StatelessEJBLocator{appName='basicejb-ear-1.0-SNAPSHOT', moduleName='basicejb-ejb-1.0-SNAPSHOT', distinctName='', beanName='ReservationEJB', view='interface org.myorg.basicejb.ejb.ReservationRemote'}
00:16:26,746 INFO  (ReservationIT.java:35) -*** testPing ***
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.46 sec - in org.myorg.basicejb.earejb.ReservationIT
...
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
...
[INFO] --- cargo-maven2-plugin:1.4.3:undeploy (cargo-post) @ basicejb-test ---
...
[INFO] --- maven-failsafe-plugin:2.22.0:verify (default) @ basicejb-test ---
...
[INFO] BUILD SUCCESS
Look in the server.log for the following output showing the server-side EJB logging INFO and DEBUG messages.
2014-10-06 00:16:26,566 INFO  [org.jboss.ejb.client] (pool-1-thread-4) JBoss EJB Client version 2.0.1.Final
2014-10-06 00:16:26,894 DEBUG [org.myorg.basicejb.ejb.ReservationEJB] (EJB default - 1) *** ReservationEJB.init() ***
2014-10-06 00:16:26,898 DEBUG [org.myorg.basicejb.ejb.ReservationEJB] (EJB default - 1) ping called
2014-10-06 00:16:26,899 DEBUG [org.myorg.basicejb.ejb.ReservationEJB] (EJB default - 1) *** ReservationEJB.destroy() ***
Now that you have everything working, let's introduce a common mistake made when naming an IT test. Rename your ReservationIT test to TestReservationIT (or ReservationTest). Be sure to rename both the Java class and the file.
$ mv basicejb-test/src/test/java/org/myorg/basicejb/earejb/ReservationIT.java \
     basicejb-test/src/test/java/org/myorg/basicejb/earejb/TestReservationIT.java
...
public class TestReservationIT {
    private static final Logger logger = LoggerFactory.getLogger(TestReservationIT.class);
Attempt to build your RMI Test module. The problem here is that your IT test (which requires a JNDI name property passed to it) is being run during the test phase, while the failsafe configuration we put in place runs during the integration-test phase. It is running in the earlier test phase because the name of the class now matches the surefire pattern.
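The collision comes from the plugins' default include patterns: surefire picks up classes named Test*, *Test, *Tests, or *TestCase, while failsafe picks up IT*, *IT, or *ITCase. The following sketch models those documented defaults as simple regexes (our own illustration, not plugin code) to show why the rename matters:

```java
// Illustrative model of the surefire/failsafe default class-name patterns.
import java.util.List;
import java.util.regex.Pattern;

public class TestNamePatterns {
    static boolean matchesAny(String name, List<Pattern> patterns) {
        return patterns.stream().anyMatch(p -> p.matcher(name).matches());
    }

    public static void main(String[] args) {
        List<Pattern> surefire = List.of(
            Pattern.compile("Test.*"), Pattern.compile(".*Test"),
            Pattern.compile(".*Tests"), Pattern.compile(".*TestCase"));
        List<Pattern> failsafe = List.of(
            Pattern.compile("IT.*"), Pattern.compile(".*IT"),
            Pattern.compile(".*ITCase"));

        System.out.println(matchesAny("ReservationIT", surefire));     // false: skipped by surefire
        System.out.println(matchesAny("ReservationIT", failsafe));     // true:  run by failsafe
        System.out.println(matchesAny("TestReservationIT", surefire)); // true:  mistakenly run early
    }
}
```

The renamed class matches both plugins' patterns, so surefire runs it during the test phase -- before the application has been deployed.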
$ mvn clean verify
...
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ basicejb-test ---
...
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ basicejb-test ---
[INFO] Surefire report directory: /home/jcstaff/proj/basicejbEx/basicejb-test/target/surefire-reports

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.myorg.basicejb.earejb.TestReservationIT
Tests run: 1, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.209 sec <<< FAILURE!
testPing(org.myorg.basicejb.earejb.TestReservationIT)  Time elapsed: 0.013 sec  <<< FAILURE!
java.lang.AssertionError: jndi.name.reservation not supplied
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertNotNull(Assert.java:621)
	at org.myorg.basicejb.earejb.TestReservationIT.setUp(TestReservationIT.java:22)

Results :

Failed tests:   testPing(org.myorg.basicejb.earejb.TestReservationIT): jndi.name.reservation not supplied

Tests run: 1, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
Let's go one more (mistaken) step and assume all we have to do is copy our failsafe configuration to the surefire plugin. That should make the assert happy.
<plugins>
<!-- a mistaken step to attempt to correct an IT test setup problem -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<systemPropertyVariables>
<jndi.name.reservation>basicejb-ear-${project.version}/basicejb-ejb-${project.version}/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote</jndi.name.reservation>
</systemPropertyVariables>
</configuration>
</plugin>
<!-- adds IT integration tests to the build -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<configuration>
<systemPropertyVariables>
<jndi.name.reservation>basicejb-ear-${project.version}/basicejb-ejb-${project.version}/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote</jndi.name.reservation>
</systemPropertyVariables>
</configuration>
</plugin>
Attempt to build the RMI Test module with the assert for the JNDI name now satisfied and notice the error we get. The IT test is configured correctly and doing all the right things. The problem is that, with its current name, it matches the surefire criteria and is run prior to the application being deployed to the server.
$ mvn clean verify
...
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ basicejb-test ---
...
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ basicejb-test ---

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.myorg.basicejb.earejb.TestReservationIT
12:59:48,726 DEBUG (TestReservationIT.java:24) -getting jndi initial context
...
12:59:49,546 INFO  (TestReservationIT.java:36) -*** testPing ***
12:59:49,595 INFO  (VersionReceiver.java:103) -EJBCLIENT000017: Received server version 2 and marshalling strategies [river]
12:59:49,596 INFO  (RemotingConnectionEJBReceiver.java:215) -EJBCLIENT000013: Successful version handshake completed for receiver context EJBReceiverContext{clientContext=org.jboss.ejb.client.EJBClientContext@3f7b86b6, receiver=Remoting connection EJB receiver [connection=org.jboss.ejb.client.remoting.ConnectionPool$PooledConnection@46c93749,channel=jboss.ejb,nodename=fedora17x64-kde]} on channel Channel ID d8293998 (outbound) of Remoting connection 6762a5f3 to localhost/127.0.0.1:8080
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 1.091 sec <<< FAILURE!
testPing(org.myorg.basicejb.earejb.TestReservationIT)  Time elapsed: 0.891 sec  <<< ERROR!
java.lang.IllegalStateException: EJBCLIENT000025: No EJB receiver available for handling [appName:basicejb-ear-1.0-SNAPSHOT, moduleName:basicejb-ejb-1.0-SNAPSHOT, distinctName:] combination for invocation context org.jboss.ejb.client.EJBClientInvocationContext@6e9af2a2
	at org.jboss.ejb.client.EJBClientContext.requireEJBReceiver(EJBClientContext.java:749)
...

Tests in error:
  testPing(org.myorg.basicejb.earejb.TestReservationIT): EJBCLIENT000025: No EJB receiver available for handling [appName:basicejb-ear-1.0-SNAPSHOT, moduleName:basicejb-ejb-1.0-SNAPSHOT, distinctName:] combination for invocation context org.jboss.ejb.client.EJBClientInvocationContext@6e9af2a2

Tests run: 1, Failures: 0, Errors: 1, Skipped: 0
...
[INFO] BUILD FAILURE
Restore your IT test to its original name so that it does not get executed during the test phase. Also remove the surefire configuration since the JNDI name is never needed during a unit test and our RMI Test module does not run any unit tests.
$ mv basicejb-test/src/test/java/org/myorg/basicejb/earejb/TestReservationIT.java \
     basicejb-test/src/test/java/org/myorg/basicejb/earejb/ReservationIT.java
...
public class ReservationIT {
    private static final Logger logger = LoggerFactory.getLogger(ReservationIT.class);
Your build should now be working again.
$ mvn clean install
As a final sanity check, this is what your multi-module application should look like at this time.
.
|-- basicejb-ear
|   `-- pom.xml
|-- basicejb-ejb
|   |-- pom.xml
|   `-- src
|       |-- main
|       |   `-- java
|       |       `-- org
|       |           `-- myorg
|       |               `-- basicejb
|       |                   `-- ejb
|       |                       |-- ReservationEJB.java
|       |                       |-- ReservationLocal.java
|       |                       `-- ReservationRemote.java
|       `-- test
|           |-- java
|           |   `-- org
|           |       `-- myorg
|           |           `-- basicejb
|           |               `-- ejb
|           |                   `-- ReservationTest.java
|           `-- resources
|               `-- log4j.xml
|-- basicejb-test
|   |-- pom.xml
|   `-- src
|       `-- test
|           |-- java
|           |   `-- org
|           |       `-- myorg
|           |           `-- basicejb
|           |               `-- earejb
|           |                   `-- ReservationIT.java
|           `-- resources
|               |-- jndi.properties
|               `-- log4j.xml
`-- pom.xml
Created an EAR to deploy the EJB
EAR is a packaging construct with no executable code
EARs can deploy EJBs, WARs, and library JARs
Everything within the same EAR shares a common classloader and can pass data by reference using local interfaces
Deployed an EAR
Cargo plugin used to automate deploy/undeploy of EAR during build of RMI Test module
deploy/undeploy of EAR occurred during pre-integration-test and post-integration-test phases
Looked up @Remote interface in JNDI
Updated JNDI to use EJB-specific EJBClient
Create remaining Maven modules to support WAR deployment
Deploy an EJB within WAR
Create and deploy an EJB within WAR
Create an IT test to communicate with and test the WAR-based EJB
WARs are typically used to deploy web-tier components, and this WAR may do that at some point. However, right now we would like to take advantage of the WAR as a deployment artifact for EJBs. Starting with JavaEE 6, EJBs can be flexibly deployed either embedded directly within the WAR or, similar to an EAR, by hosting EJB archives brought in as dependencies.
Create the sub-project directory for the WAR.
$ mkdir basicejb-war
Add the initial entries for the WAR pom.xml.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<parent>
<groupId>myorg.basicejb</groupId>
<artifactId>basicejbEx</artifactId>
<version>1.0-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>
<artifactId>basicejb-war</artifactId>
<packaging>war</packaging>
<name>Basic EJB Exercise::WAR</name>
<description>
This project provides a sample WAR for the Java EE components
associated with the overall project.
</description>
<dependencies>
</dependencies>
<build>
</build>
</project>
It is important to note that the packaging type is "war" in this case. If you leave this out, Maven will default to a standard "jar" packaging type and not build a WAR.
Add the EJB dependency to the WAR. Use exclusions to keep any unwanted 3rd party .jars from being brought along.
<dependencies>
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>basicejb-ejb</artifactId>
<version>${project.version}</version>
<type>ejb</type>
<exclusions>
<!-- server doesn't want to see already provided jars -->
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
</exclusion>
</exclusions>
</dependency>
Since our WAR pom declared the dependency on slf4j-api as scope=provided, the above exclusion is not necessary, but it is included as an example of how this can be done.
Attempt to build the WAR. It should fail because we have not yet added a WEB-INF/web.xml, nor have we configured the plugin to tolerate its absence.
$ mvn clean package
...
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-war-plugin:2.2:war (default-war) on project basicejb-war: Error assembling WAR: webxml attribute is required (or pre-existing WEB-INF/web.xml if executing in update mode)
J2EE 1.4 and prior relied heavily (and exclusively) on XML deployment descriptors for component deployment definitions. Since JavaEE 5, components can be configured by convention and @Annotations to the point that we sometimes do not need the XML deployment descriptor at all.
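For perspective, the single @Stateless annotation on our ReservationEJB replaces what J2EE 1.4 would have required in an ejb-jar.xml descriptor. The following is an illustrative sketch of roughly the equivalent legacy entry -- we do not need this file in our build:

```xml
<!-- illustrative legacy descriptor; @Stateless makes this unnecessary -->
<ejb-jar>
  <enterprise-beans>
    <session>
      <ejb-name>ReservationEJB</ejb-name>
      <ejb-class>org.myorg.basicejb.ejb.ReservationEJB</ejb-class>
      <session-type>Stateless</session-type>
    </session>
  </enterprise-beans>
</ejb-jar>
```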
Add the following property and pluginManagement to your root pom.xml. The plugin definition allows our WAR to be deployed without a WEB-INF/web.xml deployment descriptor. The version is required once we explicitly mention the plugin.
<properties>
...
<maven-war-plugin.version>3.2.2</maven-war-plugin.version>
<build>
<pluginManagement>
<plugins>
...
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>${maven-war-plugin.version}</version>
<configuration>
<failOnMissingWebXml>false</failOnMissingWebXml>
</configuration>
</plugin>
Verify the WAR builds.
$ mvn clean package
...
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ basicejb-war ---
...
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ basicejb-war ---
...
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ basicejb-war ---
...
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ basicejb-war ---
...
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ basicejb-war ---
...
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ basicejb-war ---
...
[INFO] --- maven-war-plugin:3.2.2:war (default-war) @ basicejb-war ---
[INFO] Packaging webapp
[INFO] Assembling webapp [basicejb-war] in [/home/jcstaff/proj/basicejbEx/basicejb-war/target/basicejb-war-1.0-SNAPSHOT]
[INFO] Processing war project
[INFO] Webapp assembled in [34 msecs]
[INFO] Building war: /home/jcstaff/proj/basicejbEx/basicejb-war/target/basicejb-war-1.0-SNAPSHOT.war
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
Inspect the generated WAR archive. Notice how the EJB we developed in the previous chapter and included as a dependency here was brought into the archive.
$ jar tf target/basicejb-war-1.0-SNAPSHOT.war
...
WEB-INF/classes/
WEB-INF/lib/basicejb-ejb-1.0-SNAPSHOT.jar
...
Add a cargo-maven2-plugin declaration to the WAR module to deploy the WAR. Since we are deploying the local artifact and a WAR is a deployable, we do not need to specify this artifact as a deployable.
...
<build>
<plugins>
...
<!-- artifacts to deploy to server. this module by default -->
<plugin>
<groupId>org.codehaus.cargo</groupId>
<artifactId>cargo-maven2-plugin</artifactId>
</plugin>
Since we are deploying the local artifact, we do not need to specify a dependency or a deployable. However, this convenience can make cargo a pain to turn off on a per-module basis if our root pom had actively declared it and, as a result, wired it into every deployable module type build (i.e., all EJB, WAR, and EAR modules). That is one reason why it is nice to passively define a consistent use of plugins using pluginManagement in the root pom and then actively declare them on a per-module basis in the implementation modules.
Verify the WAR module builds, deploys to the server, and undeploys from the server as part of the build lifecycle.
$ mvn clean verify
...
[INFO] --- maven-war-plugin:3.2.2:war (default-war) @ basicejb-war ---
[INFO] Packaging webapp
[INFO] Assembling webapp [basicejb-war] in [/home/jcstaff/proj/basicejbEx/basicejb-war/target/basicejb-war-1.0-SNAPSHOT]
[INFO] Processing war project
[INFO] Webapp assembled in [56 msecs]
[INFO] Building war: /home/jcstaff/proj/basicejbEx/basicejb-war/target/basicejb-war-1.0-SNAPSHOT.war
[INFO]
[INFO] --- cargo-maven2-plugin:1.4.3:redeploy (cargo-prep) @ basicejb-war ---
Oct 11, 2014 2:11:09 AM org.xnio.Xnio <clinit>
INFO: XNIO version 3.2.2.Final
Oct 11, 2014 2:11:09 AM org.xnio.nio.NioXnio <clinit>
INFO: XNIO NIO Implementation Version 3.2.2.Final
Oct 11, 2014 2:11:09 AM org.jboss.remoting3.EndpointImpl <clinit>
INFO: JBoss Remoting version 4.0.3.Final
...
[INFO] --- cargo-maven2-plugin:1.4.3:undeploy (cargo-post) @ basicejb-war ---
...
[INFO] BUILD SUCCESS
Note the JNDI names printed in the server console and server.log.
02:15:34,284 INFO  [org.jboss.as.ejb3.deployment.processors.EjbJndiBindingsDeploymentUnitProcessor] (MSC service thread 1-4) JNDI bindings for session bean named ReservationEJB in deployment unit deployment "basicejb-war-1.0-SNAPSHOT.war" are as follows:

	java:global/basicejb-war-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationLocal
	java:app/basicejb-war-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationLocal
	java:module/ReservationEJB!org.myorg.basicejb.ejb.ReservationLocal
	java:global/basicejb-war-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
	java:app/basicejb-war-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
	java:module/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
	java:jboss/exported/basicejb-war-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote

02:15:34,309 INFO  [org.jboss.weld.deployer] (MSC service thread 1-4) JBAS016005: Starting Services for CDI deployment: basicejb-war-1.0-SNAPSHOT.war
02:15:34,317 INFO  [org.jboss.weld.deployer] (MSC service thread 1-1) JBAS016008: Starting weld service for deployment basicejb-war-1.0-SNAPSHOT.war
02:15:34,624 INFO  [org.wildfly.extension.undertow] (MSC service thread 1-1) JBAS017534: Registered web context: /basicejb-war-1.0-SNAPSHOT
02:15:34,636 INFO  [org.jboss.as.server] (management-handler-thread - 1) JBAS018559: Deployed "basicejb-war-1.0-SNAPSHOT.war" (runtime-name : "basicejb-war-1.0-SNAPSHOT.war")
The JNDI names starting with java:jboss/exported are especially important because they are available to remote clients.
java:jboss/exported/basicejb-war-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
The following will be the base JNDI name of the EJB deployed by the WAR.
/basicejb-war-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
Compare that to the base name used by the EJB deployed by the EAR. Notice that the WAR-deployed EJB has no application name and its module is named after the hosting WAR and not the imported EJB module.
/basicejb-war-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
basicejb-ear-1.0-SNAPSHOT/basicejb-ejb-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
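The naming pattern above can be sketched with a small helper. This is an illustrative, hypothetical class (not part of the exercise) that composes an EJB Client lookup name of the form ejb:&lt;app&gt;/&lt;module&gt;/&lt;bean&gt;!&lt;remote-interface&gt;; the application portion is empty for a WAR-only deployment.

```java
// Hypothetical helper (not part of the exercise) that composes the EJB Client
// lookup name pattern: ejb:<app>/<module>/<bean>!<remote-interface>
public class EjbName {
    public static String lookupName(String app, String module, String bean, String iface) {
        return String.format("ejb:%s/%s/%s!%s", app, module, bean, iface);
    }

    public static void main(String[] args) {
        // WAR-deployed EJB: no EAR, so the application portion is empty
        System.out.println(lookupName("", "basicejb-war-1.0-SNAPSHOT",
                "ReservationEJB", "org.myorg.basicejb.ejb.ReservationRemote"));
        // EAR-deployed EJB: application and EJB module names are both present
        System.out.println(lookupName("basicejb-ear-1.0-SNAPSHOT", "basicejb-ejb-1.0-SNAPSHOT",
                "ReservationEJB", "org.myorg.basicejb.ejb.ReservationRemote"));
    }
}
```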
Add the WAR to the *root* level module and verify everything builds from the root.
<modules>
<module>basicejb-ejb</module>
<module>basicejb-ear</module>
<module>basicejb-test</module>
<module>basicejb-war</module>
</modules>
$ mvn clean install -DskipTests
...
[INFO] Reactor Summary:
[INFO]
[INFO] Basic EJB Exercise ................................. SUCCESS [ 0.448 s]
[INFO] Basic EJB Exercise::EJB ............................ SUCCESS [ 2.550 s]
[INFO] Basic EJB Exercise::EAR ............................ SUCCESS [ 0.432 s]
[INFO] Basic EJB Exercise::Remote Test .................... SUCCESS [ 5.934 s]
[INFO] Basic EJB Exercise::WAR ............................ SUCCESS [ 0.458 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
This is what our project looks like so far.
.
|-- basicejb-ear
|   `-- pom.xml
|-- basicejb-ejb
|   |-- pom.xml
|   `-- src
|       |-- main
|       |   `-- java
|       |       `-- org
|       |           `-- myorg
|       |               `-- basicejb
|       |                   `-- ejb
|       |                       |-- ReservationEJB.java
|       |                       |-- ReservationLocal.java
|       |                       `-- ReservationRemote.java
|       `-- test
|           |-- java
|           |   `-- org
|           |       `-- myorg
|           |           `-- basicejb
|           |               `-- ejb
|           |                   `-- ReservationTest.java
|           `-- resources
|               `-- log4j.xml
|-- basicejb-test
|   |-- pom.xml
|   `-- src
|       `-- test
|           |-- java
|           |   `-- org
|           |       `-- myorg
|           |           `-- basicejb
|           |               `-- earejb
|           |                   `-- ReservationIT.java
|           `-- resources
|               |-- jndi.properties
|               `-- log4j.xml
|-- basicejb-war
|   `-- pom.xml
`-- pom.xml
One advantage the WAR module has over the EAR module is that it can contain production code, unit tests, and IT tests. One could argue that is too much to place in a single module, but there is no reason to limit our options before we know which layout best fits our application. We could create the RMI IT test for the WAR-deployed EJB in a separate module -- but we already did that for the EAR and the steps would be much the same. In this section we will implement the RMI IT test within the WAR itself. This is a reasonable approach for smaller applications.
Add the dependencies to the WAR/pom.xml required to use logging and JUnit.
<!-- core dependencies -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<scope>provided</scope>
</dependency>
<!-- test dependencies -->
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<scope>test</scope>
</dependency>
The parent pom.xml should already have a dependencyManagement definition for these dependencies.
Notice that we again silently inherit the maven-compiler-plugin definition from the parent; we don't have to repeat any work to get a properly configured compiler. This again shows how work done in the parent pom.xml keeps child modules consistent while allowing each child the flexibility to decide whether to include a particular dependency.
The dependency on slf4j-api was made scope=provided instead of scope=test to allow us to use this dependency in the src/main tree when we later add classes within the WAR module.
Add the dependencies required to be an RMI client of JBoss/Wildfly. We can again leverage the info.ejava.examples.common:jboss-rmi-client dependency to automatically bring these dependencies in.
<!-- dependencies used for remote interface -->
<dependency>
<groupId>info.ejava.examples.common</groupId>
<artifactId>jboss-rmi-client</artifactId>
<type>pom</type>
<scope>test</scope>
</dependency>
The above dependency on jboss-rmi-client should be scope=test. If we made it scope=compile or scope=runtime, it and its dependencies would be included in the WAR. We don't want them in the WAR; we only want these dependencies available to the RMI IT test client.
We should already have a dependencyManagement definition for the jboss-rmi-client module in the parent pom from an earlier chapter when we did this same action for the EAR-based RMI IT test.
Create a JNDI configuration by copying your jndi.properties from the EAR-based RMI Test module. Place this file in src/test/resources of the WAR.
$ mkdir -p basicejb-war/src/test/resources
$ cp basicejb-test/src/test/resources/*.properties basicejb-war/src/test/resources/
$ cat basicejb-war/src/test/resources/jndi.properties
#jndi.properties
java.naming.factory.initial=${java.naming.factory.initial}
java.naming.factory.url.pkgs=${java.naming.factory.url.pkgs}
java.naming.provider.url=${java.naming.provider.url}
#java.naming.security.principal=${jndi.user}
#java.naming.security.credentials=${jndi.password}
Add a log4j.xml file to configure Log4j loggers. You may use a copy of the file you put into your EJB and RMI Test.
$ cp basicejb-test/src/test/resources/log4j.xml basicejb-war/src/test/resources/
$ cat basicejb-test/src/test/resources/log4j.xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration PUBLIC
"-//APACHE//DTD LOG4J 1.2//EN" "http://logging.apache.org/log4j/1.2/apidocs/org/apache/log4j/xml/doc-files/log4j.dtd">
<log4j:configuration
xmlns:log4j="http://jakarta.apache.org/log4j/"
debug="false">
<appender name="CONSOLE" class="org.apache.log4j.ConsoleAppender">
<param name="Target" value="System.out"/>
<layout class="org.apache.log4j.PatternLayout">
<param name="ConversionPattern" value="%d{HH:mm:ss,SSS} %-5p (%F:%L) -%m%n"/>
</layout>
</appender>
<appender name="logfile" class="org.apache.log4j.RollingFileAppender">
<param name="File" value="target/log4j-out.txt"/>
<param name="Append" value="false"/>
<param name="MaxFileSize" value="100KB"/>
<param name="MaxBackupIndex" value="1"/>
<layout class="org.apache.log4j.PatternLayout">
<param name="ConversionPattern"
value="%-5p %d{dd-MM HH:mm:ss,SSS} [%c] (%F:%M:%L) -%m%n"/>
</layout>
</appender>
<logger name="org.myorg">
<level value="debug"/>
<appender-ref ref="logfile"/>
</logger>
<root>
<priority value="info"/>
<appender-ref ref="CONSOLE"/>
</root>
</log4j:configuration>
Add resource filtering to test resources in the WAR/pom.xml. This will cause the jndi.properties file to have variables replaced with physical values when copied to the target tree.
<build>
<!-- filter test/resource files for profile-specific values -->
<testResources>
<testResource>
<directory>src/test/resources</directory>
<filtering>true</filtering>
<includes>
<include>**/*.properties</include>
</includes>
</testResource>
<testResource>
<directory>src/test/resources</directory>
<filtering>false</filtering>
<excludes>
<exclude>**/*.properties</exclude>
</excludes>
</testResource>
</testResources>
As done before in the RMI Test module, the above will filter files that match a specific name pattern and then copy the remaining files without filtering. It is important that you do not accidentally filter a file that was not meant to be filtered. This can corrupt a binary file or expand a variable at compile time that was meant to be expanded at runtime.
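Maven performs the filtering itself at build time, but the substitution it applies to jndi.properties can be mimicked in a few lines of plain Java. The class below is a rough, hypothetical sketch of that behavior: each ${name} token is replaced with a property value, and unresolved tokens are left untouched.

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Rough sketch of the substitution Maven resource filtering performs --
// for illustration only; Maven does this itself during process-test-resources.
public class FilterSketch {
    private static final Pattern TOKEN = Pattern.compile("\\$\\{([^}]+)\\}");

    public static String filter(String line, Map<String, String> props) {
        Matcher m = TOKEN.matcher(line);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            // unresolved ${...} tokens are left as-is
            String value = props.getOrDefault(m.group(1), m.group(0));
            m.appendReplacement(sb, Matcher.quoteReplacement(value));
        }
        m.appendTail(sb);
        return sb.toString();
    }
}
```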
Rebuild just the WAR and verify the jndi.properties file was filtered and its variables properly expanded.
$ mvn clean process-test-resources
...
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ basicejb-war ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO] Copying 1 resource
...
[INFO] BUILD SUCCESS
$ cat basicejb-war/target/test-classes/jndi.properties
#jndi.properties
java.naming.factory.initial=org.wildfly.naming.client.WildFlyInitialContextFactory
java.naming.factory.url.pkgs=
java.naming.provider.url=http-remoting://127.0.0.1:8080
#java.naming.security.principal=known
#java.naming.security.credentials=password1!
Copy your JUnit IT test from the EAR-based RMI Test module and place it in your src/test tree. Use a new package directory (warejb versus earejb) for the copied IT test.
$ mkdir -p basicejb-war/src/test/java/org/myorg/basicejb/warejb/
$ cp basicejb-test/src/test/java/org/myorg/basicejb/earejb/ReservationIT.java \
     basicejb-war/src/test/java/org/myorg/basicejb/warejb/ReservationIT.java
Modify the Java package spec to match the new directory. The rest is exactly what we covered in the EAR deploy section.
package org.myorg.basicejb.warejb; <<<<<<<<<<<<<<<<<<<<<<
import static org.junit.Assert.*;
import javax.naming.InitialContext;
import javax.naming.NamingException;
import org.junit.Before;
import org.junit.Test;
import org.myorg.basicejb.ejb.ReservationRemote;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class ReservationIT {
private static final Logger logger = LoggerFactory.getLogger(ReservationIT.class);
private static final String reservationJNDI = System.getProperty("jndi.name.reservation");
private InitialContext jndi;
private ReservationRemote reservationist;
@Before
public void setUp() throws NamingException {
assertNotNull("jndi.name.reservation not supplied", reservationJNDI);
logger.debug("getting jndi initial context");
jndi=new InitialContext();
logger.debug("jndi={}", jndi.getEnvironment());
logger.debug("jndi name:{}", reservationJNDI);
reservationist = (ReservationRemote) jndi.lookup(reservationJNDI);
logger.debug("reservationist={}", reservationist);
}
@Test
public void testPing() throws NamingException {
logger.info("*** testPing ***");
reservationist.ping();
}
}
Attempt to build at this point. The IT test will be compiled but not run because we have not yet declared the failsafe plugin in our WAR module to execute the JUnit IT tests.
$ mvn clean verify
...
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ basicejb-war ---
...
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ basicejb-war ---
...
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ basicejb-war ---
...
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ basicejb-war ---
...
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ basicejb-war ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 1 source file to /home/jcstaff/proj/basicejbEx/basicejb-war/target/test-classes
...
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ basicejb-war ---
...
[INFO] --- maven-war-plugin:3.2.2:war (default-war) @ basicejb-war ---
...
[INFO] --- cargo-maven2-plugin:1.4.3:redeploy (cargo-prep) @ basicejb-war ---
...
[INFO] --- cargo-maven2-plugin:1.4.3:undeploy (cargo-post) @ basicejb-war ---
...
[INFO] BUILD SUCCESS
Declare the failsafe plugin to your WAR/pom.xml to cause our JUnit IT test to be attempted.
<plugins>
<!-- adds IT integration tests to the build -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<configuration>
<systemPropertyVariables>
<jndi.name.reservation>ejb:/basicejb-war-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote</jndi.name.reservation>
</systemPropertyVariables>
</configuration>
</plugin>
Remember to pass a jndi.name.reservation system property into the JVM using the failsafe configuration. The JNDI name differs slightly from the EAR-based form and can be obtained from the names printed on the JBoss console or server.log.
java:jboss/exported/basicejb-war-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
Be sure to include the leading "/" prior to the module name when there is no EAR (e.g., "ejb:/").
This JNDI name does not contain an EAR application name at the beginning because there is no EAR. It contains a WAR module name instead of an EJB module name and then the rest of the EJB is the same.
Attempt to build at this point. The WAR should be deployed during the pre-integration-test phase, the IT test run during the integration-test phase, the WAR undeployed during the post-integration-test phase, and the results checked for failure during the verify phase.
$ mvn clean verify ... [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ basicejb-war --- ... [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ basicejb-war --- ... [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ basicejb-war --- ... [INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ basicejb-war --- ... [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ basicejb-war --- ... [INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ basicejb-war --- ... [INFO] --- maven-war-plugin:3.2.2:war (default-war) @ basicejb-war --- ... [INFO] --- cargo-maven2-plugin:1.4.3:redeploy (cargo-prep) @ basicejb-war --- ... [INFO] --- maven-failsafe-plugin:2.17:integration-test (default) @ basicejb-war --- ... Running org.myorg.basicejb.warejb.ReservationIT ... 02:30:58,927 DEBUG (ReservationIT.java:26) -jndi={ java.naming.factory.initial=org.jboss.naming.remote.client.InitialContextFactory, java.naming.provider.url=http-remoting://localhost:8080, java.naming.factory.url.pkgs=org.jboss.ejb.client.naming, jboss.naming.client.ejb.context=true} ... 02:30:59,231 DEBUG (ReservationIT.java:29) -jndi name:ejb:/basicejb-war-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote 02:30:59,247 DEBUG (ReservationIT.java:31) -reservationist=Proxy for remote EJB StatelessEJBLocator{ appName='', moduleName='basicejb-war-1.0-SNAPSHOT', distinctName='', beanName='ReservationEJB', view='interface org.myorg.basicejb.ejb.ReservationRemote'} 02:30:59,248 INFO (ReservationIT.java:36) -*** testPing *** ... Tests run: 1, Failures: 0, Errors: 0, Skipped: 0 ... [INFO] --- cargo-maven2-plugin:1.4.3:undeploy (cargo-post) @ basicejb-war --- ... [INFO] --- maven-failsafe-plugin:2.17:verify (default) @ basicejb-war --- ... [INFO] BUILD SUCCESS
Notice that we now get a failsafe execution and our JUnit IT test runs after the cargo deployment. We get this because we added a declaration of the failsafe plugin to the WAR module *and* we ended the Java class name with IT. Looking at the failsafe plugin page, the default name patterns include:
**/IT*.java
**/*IT.java
**/*ITCase.java
As discussed on that same web page, you can expand or shrink that list with the use of includes and excludes. This is commonly done to focus your testing on a specific IT test or to temporarily exclude an IT test that requires further work.
Verify that everything builds from the root module.
$ cd ..; mvn clean install
...
[INFO] Basic EJB Exercise ................................. SUCCESS [ 0.178 s]
[INFO] Basic EJB Exercise::EJB ............................ SUCCESS [ 3.014 s]
[INFO] Basic EJB Exercise::EAR ............................ SUCCESS [ 0.412 s]
[INFO] Basic EJB Exercise::Remote Test .................... SUCCESS [ 5.144 s]
[INFO] Basic EJB Exercise::WAR ............................ SUCCESS [ 2.571 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
In the previous section we deployed an EJB imported from an external EJB module. In this section we will embed a new EJB within the WAR module. This type of packaging can be used by JSP/Servlet writers to invoke lightweight, container-injected POJOs that can form transaction boundaries and provide other EJB functionality -- without having to create a separate EJB module. The more you think of the EJB as "for internal use only" and the smaller and more self-contained your application is, the more this packaging scheme makes sense, even though it can blur the boundaries between architectural layers.
Create a source directory for Java classes that will be included in the production WAR. This source is placed in src/main/java but will end up in WEB-INF/classes once the WAR is built.
$ mkdir -p basicejb-war/src/main/java
Create a package directory for our EJB and interface classes.
$ mkdir -p basicejb-war/src/main/java/org/myorg/basicejb/webejb/
Create the following EJB @Remote interface in the WAR/src/main/java directory.
$ cat basicejb-war/src/main/java/org/myorg/basicejb/webejb/ShopperRemote.java
package org.myorg.basicejb.webejb;
import javax.ejb.Remote;
@Remote
public interface ShopperRemote {
int ping();
void close();
}
Create the following Stateful EJB class in the WAR/src/main/java directory. In the previous example our ReservationEJB was stateless and could not maintain any conversational state with the client. In this example we will make the EJB stateful -- which means memory will be allocated for each client to house information specific to their conversation with the EJB instance.
$ cat basicejb-war/src/main/java/org/myorg/basicejb/webejb/ShopperEJB.java
package org.myorg.basicejb.webejb;
import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
import javax.ejb.Remove;
import javax.ejb.Stateful;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@Stateful
public class ShopperEJB implements ShopperRemote {
private static Logger logger = LoggerFactory.getLogger(ShopperEJB.class);
//we can only track conversation state here if we are stateful
private int counter=0;
@PostConstruct
public void init() {
logger.debug("*** ShopperEJB({}).init() ***", super.hashCode());
}
@PreDestroy
public void destroy() {
logger.debug("*** ShopperEJB({}).destroy() ***", super.hashCode());
}
@Override
public int ping() {
logger.debug("ping({}) called, returned {}", super.hashCode(), counter);
return counter++;
}
@Override
@Remove
public void close() {
logger.debug("close({}) called", super.hashCode());
}
}
We have added information to the ping call debug text that will provide us an indication which EJB instance is being called.
Annotate a business method with @javax.ejb.Remove for the client to call -- to inform the container when the client is done with the server-side state. Otherwise the container will hold onto the state until the defined timeout.
Deploy the WAR at this point so we can be sure the EJB was created correctly and to verify the JNDI name created. Note that we are not yet done with the module. The problem is we have transitioned from an IT test-only module to one that also hosts EJB code. We are missing a few dependencies.
$ mvn clean pre-integration-test
...
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ basicejb-war ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 2 source files to /home/jcstaff/proj/basicejbEx/basicejb-war/target/classes
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] /home/jcstaff/proj/basicejbEx/basicejb-war/src/main/java/org/myorg/basicejb/webejb/ShopperEJB.java:[6,17] package javax.ejb does not exist
[ERROR] /home/jcstaff/proj/basicejbEx/basicejb-war/src/main/java/org/myorg/basicejb/webejb/ShopperRemote.java:[3,17] package javax.ejb does not exist
[ERROR] /home/jcstaff/proj/basicejbEx/basicejb-war/src/main/java/org/myorg/basicejb/webejb/ShopperRemote.java:[5,2] cannot find symbol
  symbol: class Remote
[ERROR] /home/jcstaff/proj/basicejbEx/basicejb-war/src/main/java/org/myorg/basicejb/webejb/ShopperEJB.java:[11,2] cannot find symbol
  symbol: class Stateful
[INFO] 4 errors
...
[INFO] BUILD FAILURE
Add the following dependency to the WAR/pom.xml to account for the use of EJB types.
# basicejb-war/pom.xml
<!-- for EJBs embedded in WAR module -->
<dependency>
<groupId>javax.ejb</groupId>
<artifactId>javax.ejb-api</artifactId>
<scope>provided</scope>
</dependency>
You should always declare a scope=provided dependency on the JavaEE API artifacts so they are not unnecessarily deployed in the WAR to the server. The server already has a compliant version of these APIs and implementations for those APIs.
Re-deploy the WAR with the compilation dependency corrected and look for the newly added EJB to show up in the console or server.log.
$ mvn clean pre-integration-test
...
[INFO] BUILD SUCCESS
java:global/basicejb-war-1.0-SNAPSHOT/ShopperEJB!org.myorg.basicejb.webejb.ShopperRemote
java:app/basicejb-war-1.0-SNAPSHOT/ShopperEJB!org.myorg.basicejb.webejb.ShopperRemote
java:module/ShopperEJB!org.myorg.basicejb.webejb.ShopperRemote
java:jboss/exported/basicejb-war-1.0-SNAPSHOT/ShopperEJB!org.myorg.basicejb.webejb.ShopperRemote
java:global/basicejb-war-1.0-SNAPSHOT/ShopperEJB
java:app/basicejb-war-1.0-SNAPSHOT/ShopperEJB
java:module/ShopperEJB
We are especially interested in the java:jboss/exported name.
java:jboss/exported/basicejb-war-1.0-SNAPSHOT/ShopperEJB!org.myorg.basicejb.webejb.ShopperRemote
This will form the base of our JNDI name used for the IT test.
/basicejb-war-1.0-SNAPSHOT/ShopperEJB!org.myorg.basicejb.webejb.ShopperRemote
The main difference between the stateless and stateful JNDI names appears when we use EJB Client. When using the "ejb:" naming prefix we must append "?stateful" to the end of a stateful EJB's name to tell EJB Client we are communicating with a stateful EJB. EJB Client, unlike JBoss Remoting, understands EJB communication and will set up communication with the EJB in an efficient manner. The extra text is not appropriate outside of the "ejb:" naming.
ejb:/basicejb-war-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
ejb:/basicejb-war-1.0-SNAPSHOT/ShopperEJB!org.myorg.basicejb.webejb.ShopperRemote?stateful
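The rule above can be captured in a tiny, hypothetical helper: the "?stateful" hint is only meaningful for names using the "ejb:" prefix understood by EJB Client.

```java
// Hypothetical helper enforcing the rule above: append "?stateful" only to
// "ejb:" lookup names -- the hint has no meaning for other naming prefixes.
public class StatefulName {
    public static String statefulLookup(String ejbLookupName) {
        if (!ejbLookupName.startsWith("ejb:")) {
            throw new IllegalArgumentException(
                "?stateful only applies to the ejb: naming prefix");
        }
        return ejbLookupName + "?stateful";
    }
}
```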
Create the following IT test in your WAR/src/test/java directory tree.
$ cat basicejb-war/src/test/java/org/myorg/basicejb/warejb/ShopperIT.java
package org.myorg.basicejb.warejb;
import static org.junit.Assert.*;
import javax.naming.InitialContext;
import javax.naming.NamingException;
import org.junit.Before;
import org.junit.Test;
import org.myorg.basicejb.webejb.ShopperRemote;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class ShopperIT {
private static final Logger logger = LoggerFactory.getLogger(ShopperIT.class);
private static final String shopperJNDI = System.getProperty("jndi.name.shopper");
private InitialContext jndi;
@Before
public void setUp() throws NamingException {
assertNotNull("jndi.name.shopper not supplied", shopperJNDI);
logger.debug("getting jndi initial context");
jndi=new InitialContext();
logger.debug("jndi={}", jndi.getEnvironment());
}
@Test
public void testPing() throws NamingException {
logger.info("*** testPing ***");
ShopperRemote shopper1=null;
try {
shopper1= (ShopperRemote) jndi.lookup(shopperJNDI);
for (int i=0; i<10; i++) {
int counter1=shopper1.ping();
assertEquals("unexpected count from shopper1", i, counter1);
}
} finally {
if (shopper1!=null) { shopper1.close(); }
}
}
}
Notice the difference in the way we constructed the stateful instance for our IT test. Since the EJB is stateful, we create a reference to it close to where we will use it. Stateful EJBs have (in-memory) state related to a specific client and will be unique to each client.
If you are going to implement a server-side stateful session, you should be sure to close it from the client-side when done. Otherwise the resources will remain allocated until the configured timeout.
Add the new EJB's jndi name to the failsafe configuration using the "jndi.name.shopper" token found in the IT class' System.getProperty() statement. Remember to add the "?stateful" to the end of the JNDI name. This will help the client library setup appropriately for communication with this EJB.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<configuration>
<systemPropertyVariables>
<jndi.name.reservation>ejb:/basicejb-war-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote</jndi.name.reservation>
<jndi.name.shopper>ejb:/basicejb-war-1.0-SNAPSHOT/ShopperEJB!org.myorg.basicejb.webejb.ShopperRemote?stateful</jndi.name.shopper>
</systemPropertyVariables>
</configuration>
</plugin>
Build the WAR and note our IT test passes -- verifying the stateful behavior of the EJB. Note the output in the server.log. It shows each call returning to the same bean instance.
$ mvn clean verify
...
Running org.myorg.basicejb.warejb.ReservationIT
...
Running org.myorg.basicejb.warejb.ShopperIT
...
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0
...
[INFO] BUILD SUCCESS
2018-08-19 17:30:21,688 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-1) *** ShopperEJB(1087164002).init() ***
2018-08-19 17:30:21,697 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-1) ping(1087164002) called, returned 0
2018-08-19 17:30:21,701 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-1) ping(1087164002) called, returned 1
...
2018-08-19 17:30:21,737 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-1) ping(1087164002) called, returned 9
2018-08-19 17:30:21,741 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-1) close(1087164002) called
2018-08-19 17:30:21,742 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-1) *** ShopperEJB(1087164002).destroy() ***
Update the IT test to add a second instance used concurrently with the first.
@Test
public void testPing() throws NamingException {
logger.info("*** testPing ***");
ShopperRemote shopper1=null;
ShopperRemote shopper2=null;
try {
shopper1= (ShopperRemote) jndi.lookup(shopperJNDI);
shopper2= (ShopperRemote) jndi.lookup(shopperJNDI);
for (int i=0; i<10; i++) {
int counter1=shopper1.ping();
int counter2=shopper2.ping();
assertEquals("unexpected count from shopper1", i, counter1);
assertEquals("unexpected count from shopper2", i, counter2);
}
} finally {
if (shopper1!=null) { shopper1.close(); }
if (shopper2!=null) { shopper2.close(); }
}
}
Re-build the WAR and note the IT test continues to pass -- showing we have two independent instances. Note the output in the server.log showing the two sets of calls went to separate, consistent instances on the server.
$ mvn clean verify
...
Running org.myorg.basicejb.warejb.ReservationIT
...
Running org.myorg.basicejb.warejb.ShopperIT
...
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0
...
[INFO] BUILD SUCCESS
2018-08-19 17:37:20,686 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-2) *** ShopperEJB(1430132394).init() ***
2018-08-19 17:37:20,688 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-1) *** ShopperEJB(1087164002).init() ***
2018-08-19 17:37:20,694 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-2) ping(1430132394) called, returned 0
2018-08-19 17:37:20,697 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-1) ping(1087164002) called, returned 0
2018-08-19 17:37:20,699 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-2) ping(1430132394) called, returned 1
2018-08-19 17:37:20,701 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-1) ping(1087164002) called, returned 1
...
2018-08-19 17:37:20,735 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-2) ping(1430132394) called, returned 9
2018-08-19 17:37:20,737 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-1) ping(1087164002) called, returned 9
2018-08-19 17:37:20,739 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-2) close(1430132394) called
2018-08-19 17:37:20,739 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-2) *** ShopperEJB(1430132394).destroy() ***
2018-08-19 17:37:20,741 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-1) close(1087164002) called
2018-08-19 17:37:20,742 DEBUG [org.myorg.basicejb.webejb.ShopperEJB] (default task-1) *** ShopperEJB(1087164002).destroy() ***
Notice how the EJB @PreDestroy callback was immediately invoked just after the @Remove method was invoked and before the next business method was called on the server.
Take a look at the produced WAR. Notice it contains the imported EJB archive in the WEB-INF/lib directory and embedded EJB classes in WEB-INF/classes directory. These are standard locations in a WAR for deploying executable code in the WAR's classloader.
$ jar tf basicejb-war/target/basicejb-war-1.0-SNAPSHOT.war
...
WEB-INF/lib/basicejb-ejb-1.0-SNAPSHOT.jar
WEB-INF/classes/org/myorg/basicejb/webejb/ShopperRemote.class
WEB-INF/classes/org/myorg/basicejb/webejb/ShopperEJB.class
...
Now that you have this working, experiment by removing the "?stateful" from the JNDI name in the WAR/pom.
<jndi.name.shopper>ejb:/basicejb-war-1.0-SNAPSHOT/ShopperEJB!org.myorg.basicejb.webejb.ShopperRemote</jndi.name.shopper>
Attempt to build and run the IT tests. Without the "?stateful" hint you will not get beyond the first call to ping(), because the client library no longer knows it must establish a stateful session for the call.
[INFO] Running org.myorg.basicejb.warejb.ShopperIT
17:23:30,284 DEBUG (ShopperIT.java:24) -getting jndi initial context
17:23:30,285 DEBUG (ShopperIT.java:26) -jndi={java.naming.factory.initial=org.wildfly.naming.client.WildFlyInitialContextFactory, java.naming.provider.url=http-remoting://127.0.0.1:8080, java.naming.factory.url.pkgs=}
17:23:30,286 INFO  (ShopperIT.java:31) -*** testPing ***
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.098 s <<< FAILURE! - in org.myorg.basicejb.warejb.ShopperIT
[ERROR] testPing(org.myorg.basicejb.warejb.ShopperIT)  Time elapsed: 0.098 s  <<< ERROR!
javax.ejb.EJBException: java.lang.IllegalStateException: WFLYEJB0234: Session id hasn't been set for stateful component: ShopperEJB
Caused by: java.lang.IllegalStateException: WFLYEJB0234: Session id hasn't been set for stateful component: ShopperEJB
...
[ERROR] Errors:
[ERROR]   ShopperIT.testPing » EJB java.lang.IllegalStateException: WFLYEJB0234: Session...
Tests run: 2, Failures: 0, Errors: 1, Skipped: 0
[INFO] BUILD FAILURE
Restore the JNDI name and verify your project tree looks like the following at this point.
.
|-- basicejb-ear
|   `-- pom.xml
|-- basicejb-ejb
|   |-- pom.xml
|   `-- src
|       |-- main
|       |   `-- java
|       |       `-- org
|       |           `-- myorg
|       |               `-- basicejb
|       |                   `-- ejb
|       |                       |-- ReservationEJB.java
|       |                       |-- ReservationLocal.java
|       |                       `-- ReservationRemote.java
|       `-- test
|           |-- java
|           |   `-- org
|           |       `-- myorg
|           |           `-- basicejb
|           |               `-- ejb
|           |                   `-- ReservationTest.java
|           `-- resources
|               `-- log4j.xml
|-- basicejb-test
|   |-- pom.xml
|   `-- src
|       `-- test
|           |-- java
|           |   `-- org
|           |       `-- myorg
|           |           `-- basicejb
|           |               `-- earejb
|           |                   `-- ReservationIT.java
|           `-- resources
|               |-- jndi.properties
|               `-- log4j.xml
|-- basicejb-war
|   |-- pom.xml
|   `-- src
|       |-- main
|       |   |-- java
|       |   |   `-- org
|       |   |       `-- myorg
|       |   |           `-- basicejb
|       |   |               `-- webejb
|       |   |                   |-- ShopperEJB.java
|       |   |                   `-- ShopperRemote.java
|       |   `-- webapp
|       `-- test
|           |-- java
|           |   `-- org
|           |       `-- myorg
|           |           `-- basicejb
|           |               `-- warejb
|           |                   |-- ReservationIT.java
|           |                   `-- ShopperIT.java
|           `-- resources
|               |-- jndi.properties
|               `-- log4j.xml
`-- pom.xml
Deploy EJB Dependency
Part of the flexible deployment enhancement made in JavaEE 6
Avoids requirement for separate EAR module just to add EJB behavior to an existing WAR
Retains encapsulation of the EJB. It is deployed as an EJB.jar within the WAR.
Deploy Embedded EJB
Another option in the flexible deployment feature
Avoids requirement for separate EJB module just to add EJB behavior to an existing WAR
Conceptually, the EJBs in this mode are likely a small extension of the web code
No built-in feature for creating an EJB-client for remote clients. Must be done manually using JAR plugin
Stateless EJB
No per-client conversational state maintained
Like calling a function with supporting backend resources initialized and enterprise requirements (e.g., security, transactions) enforced
All information passed into and returned from the call except what is accessed/stored in backend resources (i.e., database)
Easier to scale and load balance because each call may go to a separate instance and server
Stateful EJB
Dedicated resources allocated on the server per instance
Holds state in-memory or serialized to temporary storage
Like calling a method of an object which is caching information in a multi-step transaction
Harder to scale since all calls either must return to the same instance or the state must be shared across the cluster
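The contrast above can be sketched in plain Java. This is a conceptual analogy only — real session beans are annotated with @Stateless/@Stateful and are instantiated, pooled, and invoked by the container; the classes below are hypothetical stand-ins:

```java
import java.util.ArrayList;
import java.util.List;

public class SessionTypes {
    // Stateless analogy: everything the call needs is passed in, so any
    // instance (on any server) can service any request.
    static class ReservationService {
        int price(int nights, int ratePerNight) {
            return nights * ratePerNight;   // no instance fields touched
        }
    }

    // Stateful analogy: the instance caches per-client conversational state,
    // so every call in the conversation must reach this same instance.
    static class ShopperSession {
        private final List<String> cart = new ArrayList<>();
        void add(String item) { cart.add(item); }
        int itemCount() { return cart.size(); }
    }

    public static void main(String[] args) {
        ReservationService anyInstance = new ReservationService();
        System.out.println(anyInstance.price(3, 100));   // → 300

        ShopperSession myInstance = new ShopperSession();
        myInstance.add("room");
        myInstance.add("breakfast");
        System.out.println(myInstance.itemCount());      // → 2
    }
}
```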
In the previous sections we used several commands as part of the build. We will now take a moment to describe a few of them so it is more evident that they exist and how they can be useful.
Maven builds modules in phases and in a specific phase ordering. We won't discuss all of them here, but we will hit the ones of importance to server-side development. Try each of the following commands from the root/parent directory and also watch the output in the server.log.
$ tail -n 999 -f standalone/log/server.log
$ mvn (phase)
mvn clean
Delete previously built artifacts below the target tree(s).
mvn process-test-classes
Compile the main and test classes and process (e.g., filter) the test resources below the target tree(s).
mvn test
Run unit tests (i.e., classes matching surefire file name pattern)
mvn package
Build Java, EJB, WAR, and EAR archives
mvn pre-integration-test
Deploy deployable archives for the module. Note this is on a per-module basis. If you have separate IT modules, each module is responsible for deploying its required artifacts. This is a very useful phase to invoke when you want to run the IT tests within the IDE, outside of the Maven framework, once the artifacts are deployed to the server.
mvn integration-test
Run integration tests (IT tests) (i.e., classes matching failsafe file name pattern)
mvn post-integration-test
Undeploy deployed archives for the module. Note this is on a per-module basis. If you have separate IT modules, each module is responsible for undeploying its required artifacts. If these artifacts are needed by a downstream module build, they will need to be re-deployed by that downstream module.
mvn verify
Evaluate the IT test results and wait to fail until this point so that the deployed artifacts can be undeployed.
mvn install
Install the built artifacts into the local repository to make them available to downstream modules when building outside the context of a common parent. i.e., If you build from the parent of an EJB and EAR, the EAR will be given knowledge of the EJB through the parent. If you build only the EAR, the latest installed version of the EJB in the local repository will be used.
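A defining property of the default lifecycle is that invoking a phase also runs every phase that precedes it. The sketch below models that rule for the abbreviated phase list above (an illustration of the ordering only, not of how Maven is actually implemented):

```java
import java.util.Arrays;
import java.util.List;

public class LifecycleDemo {
    // Abbreviated ordering of the default-lifecycle phases discussed above
    static final List<String> PHASES = Arrays.asList(
            "process-test-classes", "test", "package",
            "pre-integration-test", "integration-test",
            "post-integration-test", "verify", "install");

    // "mvn <phase>" executes every phase up to and including <phase>
    static List<String> run(String phase) {
        return PHASES.subList(0, PHASES.indexOf(phase) + 1);
    }

    public static void main(String[] args) {
        System.out.println(run("package"));   // unit tests run; IT phases do not
        System.out.println(run("verify"));    // deploy, run ITs, undeploy, evaluate
    }
}
```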
It is common, when building a multi-module application, that the first set of modules builds fine but an error is encountered when building one of the later modules.
Re-build just the WAR and RMI Test modules by naming the module to start with. Based on the order specified in the parent pom, we will start with the RMI Test module.
<modules>
    <module>basicejb-ejb</module>
    <module>basicejb-ear</module>
    <module>basicejb-test</module>
    <module>basicejb-war</module>
</modules>
$ mvn clean install -rf :basicejb-test
...
[INFO] Reactor Summary:
[INFO]
[INFO] Basic EJB Exercise::Remote Test .................... SUCCESS [ 8.018 s]
[INFO] Basic EJB Exercise::WAR ............................ SUCCESS [ 3.030 s]
...
[INFO] BUILD SUCCESS
There are times when you are looking to build one of the child modules or a branch of the child module tree but you do not want the build to continue from that point. You likely know that you can execute a build in that module's directory or you can stay in the same directory and point to the module you would like built. This technique is useful when you configure a build server to build specific bootstrap modules (in your tree checked out from CM) prior to moving on to build the remaining modules.
The parameter to -f is a file reference and not a module name. You can use this to refer to a parent (e.g., mvn clean -f ../pom.xml), siblings, as well as children. Let's say you are in src/test/resources actively editing jndi.properties. You can issue a build command that looks like
$ mvn clean process-test-resources -f ../../../pom.xml
Re-build just the RMI Test module from the root directory by specifying the module to the build.
$ mvn clean install -f basicejb-test/pom.xml
...
[INFO] BUILD SUCCESS
There are times when you would like to undeploy a deployed archive without re-deploying again first. This is common as part of the "mvn clean" lifecycle and allows you to undeploy everything you have before deploying the next version on a multi-deployment setup.
<profiles>
<!-- this profile allows the EAR to be undeployed before it is deleted
during the clean target. This behavior requires the EAR to be
present, so it cannot be part of the default behavior. It is
only activated when -Pundeploy is present so that
normal cleans do not fail. -->
<profile>
<id>undeploy</id>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.cargo</groupId>
<artifactId>cargo-maven2-plugin</artifactId>
<executions>
<execution>
<id>undeploy-ear</id>
<phase>pre-clean</phase>
<goals>
<goal>undeploy</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
Deploy the EAR and WAR artifacts to the server and note that they are deployed using the console or server.log.
$ mvn clean pre-integration-test
# server.log
2014-10-11 17:24:02,553 INFO [org.jboss.as.server] (management-handler-thread - 1) JBAS018559: Deployed "basicejb-ear-1.0-SNAPSHOT.ear" (runtime-name : "basicejb-ear-1.0-SNAPSHOT.ear")
...
2014-10-11 17:24:04,157 INFO [org.jboss.as.server] (management-handler-thread - 1) JBAS018559: Deployed "basicejb-war-1.0-SNAPSHOT.war" (runtime-name : "basicejb-war-1.0-SNAPSHOT.war")
This is the point at which we can run our IT tests within the IDE once we resolve a few remaining dependencies on the failsafe plugins (later...)
Undeploy the EAR and WAR artifacts from the server and note they are undeployed using the output of the console or server.log.
$ mvn clean -Pundeploy
2014-10-11 17:24:15,019 INFO [org.jboss.as.server] (management-handler-thread - 1) JBAS018558: Undeployed "basicejb-ear-1.0-SNAPSHOT.ear" (runtime-name: "basicejb-ear-1.0-SNAPSHOT.ear")
...
2014-10-11 17:24:15,662 INFO [org.jboss.as.server] (management-handler-thread - 1) JBAS018558: Undeployed "basicejb-war-1.0-SNAPSHOT.war" (runtime-name: "basicejb-war-1.0-SNAPSHOT.war")
To execute a single IT test case use the following
$ mvn verify -f basicejb-test -Dit.test=org.myorg.basicejb.earejb.ReservationIT
To execute a single IT test case method use the following
$ mvn verify -f basicejb-test -Dit.test=org.myorg.basicejb.earejb.ReservationIT#testPing
Maven and surefire have made it very tough to skip unit tests and only run a specific IT test. However, you can always add a temporary set of includes/excludes to the surefire configuration in order to achieve the sometimes-desired result of turning off any preceding unit tests. Note that this is not an issue when IT tests are hosted in a separate module.
Leverage build lifecycle phases to build what is needed
Keeps from executing the entire build lifecycle
Required in some cases (deployment) to achieve desired results (stay deployed)
Re-start the build at a specific module
Saves time from building unchanged modules
User must know that skipped modules are not needed
Undeploy module artifact
Enables cleanup prior to a re-deploy
Good for when having multiple application deployments with dependencies between them
Although the instructions thus far have provided details at the filesystem and Maven level, you have surely started importing and developing the projects within your IDE by now. The one gotcha remaining to solve is those pesky JNDI names and keeping our Java code ignorant of version numbers. Version numbers have a place, but they rarely have a place in lookups when the version# can be updated with even trivial releases.
basicejb-ear-1.0-SNAPSHOT/basicejb-ejb-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
We initially solved the problem by getting the pom involved with expanding the version# and passing the result to the IT test as a system property. That worked well within the Maven build. However, we want to develop and run our IT tests outside of Maven once we have leveraged enough of Maven to get started.
<jndi.name.reservation>
ejb:basicejb-ear-${project.version}/basicejb-ejb-${project.version}/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
</jndi.name.reservation>
private static final String reservationJNDI = System.getProperty("jndi.name.reservation");
In this section we are going to add configuration options that will eliminate the version#s from the JNDI names.
basicejb-ear/basicejb-ejb/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
This will permit our IT tests to form a reasonable default. We can still override this default with a system property.
private static final String reservationJNDI = System.getProperty("jndi.name.reservation",
"ejb:basicejb-ear/basicejb-ejb/ReservationEJB!"+ReservationRemote.class.getName());
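The fallback behavior of the two-argument System.getProperty() can be exercised in isolation. The sketch below uses the property name from the text; the override value is a made-up placeholder:

```java
public class JndiDefaultDemo {
    static String reservationJndi() {
        // the second argument is returned only when the property is unset
        return System.getProperty("jndi.name.reservation",
                "ejb:basicejb-ear/basicejb-ejb/ReservationEJB!"
                + "org.myorg.basicejb.ejb.ReservationRemote");
    }

    public static void main(String[] args) {
        System.out.println(reservationJndi());   // the default is used

        // failsafe (or -D on the command line) would supply an override
        System.setProperty("jndi.name.reservation", "ejb:some/other/Name");
        System.out.println(reservationJndi());   // the override wins
        System.clearProperty("jndi.name.reservation");
    }
}
```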
Build the RMI Test module and notice the application/EAR portion of the JNDI name printed to the server's console and server.log contains the version# of the EAR.
$ mvn verify -f basicejb-test
java:jboss/exported/basicejb-ear-1.0-SNAPSHOT/basicejb-ejb-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
Add the following declaration of the EAR plugin and add a configuration option to change the applicationName to be the artifactId for the module.
<build>
<plugins>
<!-- provide properties here to impact the EAR packaging -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-ear-plugin</artifactId>
<configuration>
<!-- eliminates use of version in EAR JNDI name portion -->
<applicationName>${project.artifactId}</applicationName>
</configuration>
</plugin>
</plugins>
</build>
Notice that, with the lack of a maven-ear-plugin specification, the EAR plugin defaults to generating an application.xml according to a J2EE 1.3 DTD.
$ cat basicejb-ear/target/application.xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE application PUBLIC
"-//Sun Microsystems, Inc.//DTD J2EE Application 1.3//EN"
"http://java.sun.com/dtd/application_1_3.dtd">
<application>
<display-name>basicejb-ear</display-name>
<description>This project provides a sample EAR for the Java EE components
associated with the overall project.</description>
<module>
<ejb>basicejb-ejb-1.0-SNAPSHOT.jar</ejb>
</module>
</application>
Since this is our first reference to the EAR plugin, add a pluginManagement definition of this plugin in the parent pom.xml.
<properties>
...
<maven-ear-plugin.version>3.0.1</maven-ear-plugin.version>
...
<build>
<pluginManagement>
<plugins>
...
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-ear-plugin</artifactId>
<version>${maven-ear-plugin.version}</version>
<configuration>
<version>7</version>
</configuration>
</plugin>
Once we add the plugin definition to use JavaEE 7, the EAR plugin generates an application.xml according to a JavaEE 7 schema. There is no noticeable difference other than the change in schema versus DTD reference.
$ cat basicejb-ear/target/application.xml
<?xml version="1.0" encoding="UTF-8"?>
<application xmlns="http://xmlns.jcp.org/xml/ns/javaee"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee http://xmlns.jcp.org/xml/ns/javaee/application_7.xsd" version="7">
<application-name>basicejb-ear</application-name>
<description>This project provides a sample EAR for the Java EE components
associated with the overall project.</description>
<display-name>basicejb-ear</display-name>
<module>
<ejb>basicejb-ejb-1.0-SNAPSHOT.jar</ejb>
</module>
</application>
Rebuild and redeploy the EAR and note the application portion of the JNDI name is now a fixed value. Since our IT test still references the older name it fails.
java:jboss/exported/basicejb-ear/basicejb-ejb-1.0-SNAPSHOT/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
$ mvn clean install -rf basicejb-ear
...
18:09:46,132 INFO (ReservationIT.java:37) -*** testPing ***
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.894 s <<< FAILURE! - in org.myorg.basicejb.earejb.ReservationIT
[ERROR] testPing(org.myorg.basicejb.earejb.ReservationIT) Time elapsed: 0.767 s <<< ERROR!
org.jboss.ejb.client.RequestSendFailedException: EJBCLIENT000409: No more destinations are available
	at org.myorg.basicejb.earejb.ReservationIT.testPing(ReservationIT.java:39)
...
[INFO] Reactor Summary:
[INFO]
[INFO] Basic EJB Exercise::EAR ............................ SUCCESS [ 1.533 s]
[INFO] Basic EJB Exercise::Remote Test .................... FAILURE [ 5.656 s]
[INFO] Basic EJB Exercise::WAR ............................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
Update the RMI Test/pom.xml to eliminate the version# of the application and re-run the prior build. We should be working again but still with a version# in the EJB.
<jndi.name.reservation>ejb:basicejb-ear/basicejb-ejb-${project.version}/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote</jndi.name.reservation>
$ mvn clean install -rf basicejb-ear
...
[INFO] Reactor Summary:
[INFO]
[INFO] Basic EJB Exercise::EAR ............................ SUCCESS [ 1.573 s]
[INFO] Basic EJB Exercise::Remote Test .................... SUCCESS [ 5.851 s]
[INFO] Basic EJB Exercise::WAR ............................ SUCCESS [ 3.068 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
Add a modules section to the EAR plugin declaration. The default behavior relied on the dependency declarations, which caused the EJB.jar to be hosted in the EAR with its fully qualified version. Here we supply the specific name we want for the EJB.jar. This will get reflected in the JNDI name.
I also ask that you add a defaultLibBundleDir specification to the descriptor at this time. This will define a directory within the EAR (similar to WEB-INF/lib) whose contents will be added to the global classpath of components loaded by the EAR. We don't have a use for that yet, but this is the last tweak we will be making to EARs for a while.
<build>
<plugins>
<!-- provide properties here to impact the EAR packaging -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-ear-plugin</artifactId>
<configuration>
<!-- names directory within EAR to place jars to be in classpath -->
<defaultLibBundleDir>lib</defaultLibBundleDir>
<!-- eliminates use of version in EAR JNDI name portion -->
<applicationName>${project.artifactId}</applicationName>
<modules>
<!-- eliminates use of the version in the EJB JNDI name -->
<ejbModule>
<groupId>${project.groupId}</groupId>
<artifactId>basicejb-ejb</artifactId>
<bundleFileName>basicejb-ejb.jar</bundleFileName>
</ejbModule>
</modules>
</configuration>
</plugin>
</plugins>
</build>
Rebuild and redeploy the EAR and note the module portion of the JNDI name is now a fixed value. Since our IT test still references the older name it fails.
java:jboss/exported/basicejb-ear/basicejb-ejb/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote
$ mvn clean install -rf basicejb-ear
...
java.lang.IllegalStateException: EJBCLIENT000025: No EJB receiver available for handling [appName:basicejb-ear, moduleName:basicejb-ejb-1.0-SNAPSHOT, distinctName:] combination for invocation context org.jboss.ejb.client.EJBClientInvocationContext@3d01e5eb
	at org.jboss.ejb.client.EJBClientContext.requireEJBReceiver(EJBClientContext.java:749)
	...
	at org.myorg.basicejb.earejb.ReservationIT.testPing(ReservationIT.java:37)
...
[INFO] Reactor Summary:
[INFO]
[INFO] Basic EJB Exercise::EAR ............................ SUCCESS [ 1.369 s]
[INFO] Basic EJB Exercise::Remote Test .................... FAILURE [ 5.643 s]
[INFO] Basic EJB Exercise::WAR ............................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
Update the RMI Test/pom.xml to eliminate the version# of the module and re-run the prior build. We should be working again, but still with a version# in the WAR.
<jndi.name.reservation>ejb:basicejb-ear/basicejb-ejb/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote</jndi.name.reservation>
$ mvn clean install -rf basicejb-ear
...
[INFO] Reactor Summary:
[INFO]
[INFO] Basic EJB Exercise::EAR ............................ SUCCESS [ 1.424 s]
[INFO] Basic EJB Exercise::Remote Test .................... SUCCESS [ 5.811 s]
[INFO] Basic EJB Exercise::WAR ............................ SUCCESS [ 3.000 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
Update your IT test to assign a reasonable default for the JNDI name. To test the change, comment out the system property specification in the failsafe configuration.
public class ReservationIT {
private static final String reservationJNDI = System.getProperty("jndi.name.reservation",
"ejb:basicejb-ear/basicejb-ejb/ReservationEJB!"+ReservationRemote.class.getName());
<systemPropertyVariables>
<!--
<jndi.name.reservation>ejb:basicejb-ear/basicejb-ejb/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote</jndi.name.reservation>
-->
</systemPropertyVariables>
$ mvn clean install -rf basicejb-ear
...
[INFO] Reactor Summary:
[INFO]
[INFO] Basic EJB Exercise::EAR ............................ SUCCESS [ 1.522 s]
[INFO] Basic EJB Exercise::Remote Test .................... SUCCESS [ 5.810 s]
[INFO] Basic EJB Exercise::WAR ............................ SUCCESS [ 3.119 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
If you are not sure if this is using your default JNDI name you can optionally munge the value in the IT test and note it will then fail.
You may optionally re-enable the system property for the JNDI name but I would suggest against it until there is a need to derive the name a different way. It can get confusing when a default gets derived from multiple locations.
Note the JNDI name of the WAR when it was deployed above. We want to remove the version# from the deployed WAR.
java:jboss/exported/basicejb-war-1.0-SNAPSHOT/ShopperEJB!org.myorg.basicejb.webejb.ShopperRemote
If we were deploying the WAR within an EAR, we would have added the following to the EAR plugin.
<webModule>
<groupId>${project.groupId}</groupId>
<artifactId>basicejb-war</artifactId>
<contextRoot>basicejb-war</contextRoot>
</webModule>
Since we deploy a naked WAR, we cannot use the standard EAR technique. However, we can accomplish our goal by adding a jboss-specific deployment descriptor (jboss-web.xml) to the WAR (WEB-INF directory).
$ mkdir basicejb-war/src/main/webapp/WEB-INF/
$ cat basicejb-war/src/main/webapp/WEB-INF/jboss-web.xml
<jboss-web>
<!-- needed to always assure that version# does not get into JNDI name -->
<context-root>basicejb-war</context-root>
</jboss-web>
The source directory for the WEB-INF and WAR content data is in src/main/webapp. Artifacts in src/main/resources will end up in WEB-INF/classes.
Rebuild and redeploy the WAR and note the module portion of the JNDI name is now a fixed value. Since our IT test still references the older name it fails.
java:jboss/exported/basicejb-war/ShopperEJB!org.myorg.basicejb.webejb.ShopperRemote
$ mvn clean install -rf basicejb-war
...
testPing(org.myorg.basicejb.warejb.ReservationIT) Time elapsed: 0.766 sec <<< ERROR!
java.lang.IllegalStateException: EJBCLIENT000025: No EJB receiver available for handling [appName:, moduleName:basicejb-war-1.0-SNAPSHOT, distinctName:] combination for invocation context org.jboss.ejb.client.EJBClientInvocationContext@1fafcae2
	at org.jboss.ejb.client.EJBClientContext.requireEJBReceiver(EJBClientContext.java:749)
	...
	at com.sun.proxy.$Proxy5.ping(Unknown Source)
	at org.myorg.basicejb.warejb.ReservationIT.testPing(ReservationIT.java:37)

testPing(org.myorg.basicejb.warejb.ShopperIT) Time elapsed: 0 sec <<< ERROR!
javax.naming.NamingException: Failed to create proxy
	at org.jboss.ejb.client.EJBClientContext.requireEJBReceiver(EJBClientContext.java:813)
	...
	at javax.naming.InitialContext.lookup(InitialContext.java:411)
	at org.myorg.basicejb.warejb.ShopperIT.testPing(ShopperIT.java:32)
...
Tests in error:
  ReservationIT.testPing:37 » IllegalState EJBCLIENT000025: No EJB receiver avai...
  ShopperIT.testPing:32 » Naming Failed to create proxy
...
[INFO] BUILD FAILURE
I thought it was interesting that the IT test for the stateless EJB does not fail until the call is actually made to ping(). The IT test for the stateful EJB fails during the JNDI lookup().
Update the WAR/pom.xml to eliminate the version# of the module for both JNDI names and re-run the prior build. We should be working again but still with a version# in the WAR.
<jndi.name.reservation>ejb:/basicejb-war/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote</jndi.name.reservation>
<jndi.name.shopper>ejb:/basicejb-war/ShopperEJB!org.myorg.basicejb.webejb.ShopperRemote?stateful</jndi.name.shopper>
$ mvn clean install -rf basicejb-war
...
[INFO] BUILD SUCCESS
Be sure to append "?stateful" to your stateful EJB JNDI name. We need this for "ejb:" naming contexts for the EJB Client to work correctly. There should be no issue promoting those details to the client since a client will be designed differently when working with a stateless versus a stateful EJB.
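The "ejb:" lookup names used throughout follow the pattern ejb:&lt;app&gt;/&lt;module&gt;/&lt;bean&gt;!&lt;remote-interface&gt;, with the "?stateful" suffix appended for stateful beans and an empty &lt;app&gt; portion for a naked WAR. A small sketch of that pattern (the helper method is hypothetical — the course code builds these strings inline):

```java
public class EjbName {
    // ejb:<app>/<module>/<bean>!<remote-interface>[?stateful]
    static String lookup(String app, String module, String bean,
                         String iface, boolean stateful) {
        return "ejb:" + app + "/" + module + "/" + bean + "!" + iface
                + (stateful ? "?stateful" : "");
    }

    public static void main(String[] args) {
        // a naked WAR deployment has an empty <app> portion
        System.out.println(lookup("", "basicejb-war", "ShopperEJB",
                "org.myorg.basicejb.webejb.ShopperRemote", true));
        // → ejb:/basicejb-war/ShopperEJB!org.myorg.basicejb.webejb.ShopperRemote?stateful
    }
}
```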
Update your IT tests to assign a reasonable default for the JNDI name. To test the change, comment out the system property specification in the failsafe configuration.
public class ReservationIT {
private static final String reservationJNDI = System.getProperty("jndi.name.reservation",
"ejb:/basicejb-war/ReservationEJB!"+ReservationRemote.class.getName());
public class ShopperIT {
private static final String shopperJNDI = System.getProperty("jndi.name.shopper",
"ejb:/basicejb-war/ShopperEJB!"+ShopperRemote.class.getName()+"?stateful");
<systemPropertyVariables>
<!--
<jndi.name.reservation>ejb:/basicejb-war/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote</jndi.name.reservation>
<jndi.name.shopper>ejb:/basicejb-war/ShopperEJB!org.myorg.basicejb.webejb.ShopperRemote?stateful</jndi.name.shopper>
-->
</systemPropertyVariables>
$ mvn clean install -rf basicejb-ear
...
[INFO] BUILD SUCCESS
If you are not sure if this is using your default JNDI name you can optionally munge the value in the IT test and note it will then fail.
You may optionally re-enable the system property for the JNDI names but I would suggest against it until there is a need to derive the name a different way. It can get confusing when a default gets derived from multiple locations. If we end up with several IT tests forming the same JNDI name, I would suggest moving the construction of the JNDI name to a test utility class.
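If that refactoring is done, such a utility might look like the sketch below. The class and method names are illustrative, not part of the course modules, and Runnable stands in for a real remote interface in the usage line:

```java
public class JndiUtil {
    // Prefers a system-property override; otherwise derives the default
    // "ejb:" name for an EJB deployed in a naked WAR.
    static String warEjbJndi(String propName, String warName, String beanName,
                             Class<?> remoteIface, boolean stateful) {
        String dflt = "ejb:/" + warName + "/" + beanName + "!"
                + remoteIface.getName() + (stateful ? "?stateful" : "");
        return System.getProperty(propName, dflt);
    }

    public static void main(String[] args) {
        // Runnable is only a placeholder for a remote interface type
        System.out.println(warEjbJndi("jndi.name.shopper", "basicejb-war",
                "ShopperEJB", Runnable.class, true));
    }
}
```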
EAR-based names
impacted using the META-INF/application.xml
application.xml can be auto-built and configured using EAR plugin
EAR name configured with application-name element of application.xml
EJB name configured with the name of the EJB module within the EAR
EJB module can be renamed using ejbModule.bundleFileName of EAR plugin
WAR name configured with module.web.context-root element of application.xml
WAR-based names
no JavaEE/WAR standard for naming a naked-deployed WAR
could deploy the WAR within an EAR to help control the JNDI name
JBoss uses the context-root supplied in WEB-INF/jboss-web.xml
Reasonable defaults in IT classes
Makes them IDE-friendly
Reduces dependency on full Maven build lifecycle for everything you do
Can be developed, run, and re-factored more quickly
Figure 51.1. Sample META-INF/application.xml
<?xml version="1.0" encoding="UTF-8"?>
<application xmlns="http://xmlns.jcp.org/xml/ns/javaee"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee http://xmlns.jcp.org/xml/ns/javaee/application_7.xsd" version="7">
<application-name>basicejb-ear</application-name>
<description>This project provides a sample EAR for the Java EE components
associated with the overall project.</description>
<display-name>basicejb-ear</display-name>
<module>
<web>
<web-uri>(example web module).war</web-uri>
<context-root>(example-fixed-web-module-name)</context-root>
</web>
</module>
<module>
<ejb>basicejb-ejb.jar</ejb>
</module>
<library-directory>lib</library-directory>
</application>
Figure 51.2. Sample EAR plugin configuration
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-ear-plugin</artifactId>
<configuration>
<!-- names directory within EAR to place jars to be in classpath -->
<defaultLibBundleDir>lib</defaultLibBundleDir>
<!-- eliminates use of version in EAR JNDI name portion -->
<applicationName>${project.artifactId}</applicationName>
<modules>
<webModule>
<groupId>${project.groupId}</groupId>
<artifactId>(example-web-module-artifactId)</artifactId>
<contextRoot>(example-fixed-web-module-name)</contextRoot>
</webModule>
<!-- eliminates use of the version in the EJB JNDI name -->
<ejbModule>
<groupId>${project.groupId}</groupId>
<artifactId>basicejb-ejb</artifactId>
<bundleFileName>basicejb-ejb.jar</bundleFileName>
</ejbModule>
</modules>
</configuration>
</plugin>
Figure 51.3. Sample WEB-INF/jboss-web.xml
<jboss-web>
<!-- needed to always assure that version# does not get into JNDI name -->
<context-root>basicejb-war</context-root>
</jboss-web>
Up until now we have kept our focus away from the IDE and onto the filesystem and Maven configurations to give better insight into the structure of a multi-module application and to show how IDE-agnostic the build process is. This will be very important when you check in your modules to be built/tested by a build server (e.g., CruiseControl, Hudson, or Jenkins). However, we want to speed up the process to do actual development. We want to spend as much of our time as possible within a development/test-bench that promotes the efficient development of Java code. The command-line builds will still be leveraged, but all those copy/pastes/edits I asked you to do would more than likely be created with the use of a Java/JavaEE-knowledgeable IDE. The examples in this section use Eclipse. Any other credible IDE (e.g., NetBeans, IntelliJ) should work in a similar manner and offer comparable functionality.
In this section we want to break the cycle of having to issue a heavyweight Maven build every time we want to run an IT test. We can do this by making JUnit tests "IDE-friendly". What I mean by that is: allow the IDE environment to derive usable values without relying on system properties being passed in from failsafe. Failsafe won't be involved with running your JUnit tests within the IDE. That means you can get the usable values either through reasonable defaults or through the use of filtered property files in the classpath. Looking up the JNDI name for a remote EJB provides an excellent example of both techniques.
To derive properties for the InitialContext -- we used filtered property files.
#jndi.properties
java.naming.factory.initial=${java.naming.factory.initial}
java.naming.factory.url.pkgs=${java.naming.factory.url.pkgs}
java.naming.provider.url=${java.naming.provider.url}
#java.naming.security.principal=${jndi.user}
#java.naming.security.credentials=${jndi.password}
That got expanded by our initial Maven build to contain the following. The Maven/Eclipse plugin (m2e) within Eclipse will continue to keep this up to date. It has a plugin that provides an understanding of what the maven-resources-plugin would do outside of the IDE and re-creates that functionality within the IDE environment.
#jndi.properties
java.naming.factory.initial=org.wildfly.naming.client.WildFlyInitialContextFactory
java.naming.factory.url.pkgs=
java.naming.provider.url=http-remoting://127.0.0.1:8080
#java.naming.security.principal=known
#java.naming.security.credentials=password1!
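The expanded file is an ordinary java.util.Properties file that the naming runtime reads from the test classpath. Its structure can be checked with a few lines of plain Java (a sketch — the literal string below stands in for the filtered file contents):

```java
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.Properties;

public class JndiPropsDemo {
    // Parses properties-file text; IOException cannot occur for a StringReader
    static Properties parse(String text) {
        Properties p = new Properties();
        try {
            p.load(new StringReader(text));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return p;
    }

    public static void main(String[] args) {
        // stand-in for src/test/resources/jndi.properties after filtering
        String filtered =
            "java.naming.factory.initial=org.wildfly.naming.client.WildFlyInitialContextFactory\n"
          + "java.naming.factory.url.pkgs=\n"
          + "java.naming.provider.url=http-remoting://127.0.0.1:8080\n";

        Properties p = parse(filtered);
        System.out.println(p.getProperty("java.naming.provider.url"));
        // → http-remoting://127.0.0.1:8080
    }
}
```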
There is nothing magical about jndi.properties. We can add more properties to that file (not common) or create other property files (e.g., it.properties) to be filtered and made available to the IT test. However, our other option was to pass in a system property (-DpropertyName=value) using the failsafe configuration.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<configuration>
<systemPropertyVariables>
<jndi.name.reservation>ejb:basicejb-ear/basicejb-ejb/ReservationEJB!org.myorg.basicejb.ejb.ReservationRemote</jndi.name.reservation>
</systemPropertyVariables>
</configuration>
</plugin>
Anticipating that the system property might not always be available or necessary in 99-100% of the IT tests once we stabilized some values, we started processing the system properties by deriving a reasonable default. This can be overridden, but when not overridden it will work 99-100% of the time as well.
private static final String reservationJNDI = System.getProperty("jndi.name.reservation",
"ejb:basicejb-ear/basicejb-ejb/ReservationEJB!"+ReservationRemote.class.getName());
Let's show this in action by loading our application into the IDE, deploying the EAR and WAR, and executing the IT tests from within Eclipse.
If you have not already done so, import your multi-module project into the IDE. For Eclipse that involves using Right-Click, Import Existing Maven Projects.
We made several incremental changes to the plugins as we built the end-to-end application. If you initially loaded the project prior to making many of these incremental changes Eclipse could have some issues upgrading the "facet" of a particular module on the fly. That was the case in my environment so I ...
Removed all modules related to this assignment from Eclipse
Deleted all .settings directories and .project and .classpath files from the filesystem
Re-imported the modules fresh and everything cleared with the "facet" JavaEE versions.
Right Click on a Working Group or Select File
Select "Import Existing Maven Projects"
Navigate to the root module
Click OK. Eclipse will show you a list of modules at and under the root module.
Select all modules in the tree. They should now show up in the various explorer tabs
Leverage existing Maven configuration to deploy the EAR and WAR artifacts to the server.
This can be done at the command line using techniques you are already familiar with.
$ mvn pre-integration-test
The Eclipse/Maven integration can also provide access to that functionality through a re-usable Maven Build Run Configuration.
Right Click on the Root Module and select "Run As"
Select "Maven build...". An "Edit Configuration" panel is displayed
Type "mvn pre-integration-test" for Name
Type or select the variable "${project_loc}" for Base directory
Type "pre-integration-test" for Goals
Select "Skip Tests" option
Click on the "Common" tab
Select "Run" in the Display in favorites menu
Click Apply to save the run configuration
Click Run to have it start and deploy the WAR and EAR to the server. This should provide the same output you experienced previously at the command line. The Maven command output is displayed in an Eclipse Console window and the server output is available in the server console and server.log
Select the root module again
With the root module actively selected, click the down-arrow to the right of the green circle with the white arrow.
Select the "mvn pre-integration-test" build configuration you pinned to this menu by selecting it as a favorite in the "Common" tab
Notice that the EAR and WAR get re-built and re-deployed again. You can use this or the command-line technique at any time to achieve the same goal of deploying the application(s) to the server. If you want to deploy just the EAR or WAR -- select that specific module. However, keep in mind that the build will be constrained to that module and will not pick up changes in upstream modules whose changes have not been installed into the local repository.
Run the IT tests by selecting either the module, folder, or package with the IT tests or the specific IT test or specific test method and right-click, Run As, JUnit test.
You should see the IT test run, the IT test communicate with the EJB on the server, and see activity in the server.log.
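Behind the scenes, the IT test reaches the remote EJB through a JNDI lookup. A minimal sketch of the naming environment behind that lookup is shown below -- the factory class and URL scheme mirror the java.naming.* properties used later in the ejb-parent pom; the host and port values are assumed defaults for a local server install:

```java
import java.util.Hashtable;
import javax.naming.Context;

/** Assembles the JNDI environment an IT test client would use to reach the server. */
public class JndiEnv {
    // assumed defaults matching jboss.host / jboss.http.port for a local install
    static final String HOST = "localhost";
    static final int PORT = 8080;

    static Hashtable<String, String> env() {
        Hashtable<String, String> env = new Hashtable<>();
        // WildFly naming client factory (the java.naming.factory.initial value)
        env.put(Context.INITIAL_CONTEXT_FACTORY,
                "org.wildfly.naming.client.WildFlyInitialContextFactory");
        // the java.naming.provider.url value, built from host and port
        env.put(Context.PROVIDER_URL, "http-remoting://" + HOST + ":" + PORT);
        return env;
    }

    public static void main(String[] args) {
        System.out.println(env().get(Context.PROVIDER_URL)); // http-remoting://localhost:8080
    }
}
```

The real test would pass this environment to `new InitialContext(env)` and look up the EJB's remote interface by its JNDI name; that part requires a running server and the wildfly naming client on the classpath, so it is omitted here.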
Make a change to the ReservationIT to invoke ping() several times in a row in a for-loop.
@Test
public void testPing() throws NamingException {
    logger.info("*** testPing ***");
    for (int i=0; i<10; i++) {
        reservationist.ping();
    }
}
Rerun the ReservationIT#testPing() @Test method once your changes are complete. Notice you get multiple sets of debug on the server.
2014-10-11 23:42:19,474 DEBUG [org.myorg.basicejb.ejb.ReservationEJB] (EJB default - 8) *** ReservationEJB.init() *** 2014-10-11 23:42:19,475 DEBUG [org.myorg.basicejb.ejb.ReservationEJB] (EJB default - 8) ping called 2014-10-11 23:42:19,475 DEBUG [org.myorg.basicejb.ejb.ReservationEJB] (EJB default - 8) *** ReservationEJB.destroy() *** 2014-10-11 23:42:19,521 DEBUG [org.myorg.basicejb.ejb.ReservationEJB] (EJB default - 7) *** ReservationEJB.init() *** 2014-10-11 23:42:19,522 DEBUG [org.myorg.basicejb.ejb.ReservationEJB] (EJB default - 7) ping called 2014-10-11 23:42:19,522 DEBUG [org.myorg.basicejb.ejb.ReservationEJB] (EJB default - 7) *** ReservationEJB.destroy() *** ...
Set a breakpoint inside the for-loop where the method calls reservationist.ping() and choose Debug As this time.
Keep clicking the green, right-facing arrow at the top or the yellow step over/into arrows to move forward. Notice that you have access to the state variables of the IT test while you move forward with the test.
Getting to the point where you can run and debug the IT test within the IDE accounts for much of the work involved in being able to debug the code on the server-side. At this point you have that code surrounded and all you have to do is peek inside. Let's look at how this is done with a remote server first.
Start the server with the debug option
wildfly-13.0.0.Final$ ./bin/standalone.sh -c standalone.xml --debug 8787 ... ========================================================================= Listening for transport dt_socket at address: 8787 18:23:37,054 INFO [org.jboss.modules] (main) JBoss Modules version 1.8.5.Final
The server continues to run normally until a debugger client connects and a breakpoint is hit. Until then there is no noticeable impact, so you can leave this setting active on your development server instance all the time.
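The --debug flag is shorthand for starting the server JVM with a JDWP debug agent. It is roughly equivalent to adding the following JVM option (the exact form may vary by Wildfly version; suspend=n is what lets the server start without waiting for a debugger):

```
-agentlib:jdwp=transport=dt_socket,address=8787,server=y,suspend=n
```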
Attach a debugger client to the server using the following...
Select the RMI Test module to make it the active project within Eclipse.
Click on the arrow to the right of the green bug icon on the top menu bar and select Debug Configurations...
Create a new "Remote Java Application"
Change the Port to 8787 to match your JBoss configuration.
Select the Source tab, click Add..., select Java Project and press Add. A selection box of Eclipse projects is displayed for you to select from.
Select all modules that are a part of this application and press OK to continue.
Click Debug to start the debugging session. Nothing exciting will occur until we set and cause a breakpoint to be hit.
Create a breakpoint in the ReservationEJB ping() call and re-run the IT test in either Run As or Debug As mode. Your selection there only pertains to the IT test client and not the server. Notice you are now stopped on the server-side.
Click the red bent line icon on the top menu bar to detach from the server and release control of the execution.
In the previous section we showed how the stand-alone server required some setup, but made it easy to establish a debugging session once the one-time setup was complete. In this method we are going to show how that setup can get even easier if you are willing and able to run the application server in the same JVM as the IDE.
Shutdown the standalone server instance.
Activate the JavaEE profile, select the server you setup in the environment setup instructions, right click, and select Debug. The server should start as usual.
The server will restart and likely still have the EAR and WAR deployed from the previous section. Ignore that they are already deployed and re-deploy using the "mvn pre-integration-test" Build Configuration you created earlier. Be sure to have the appropriate module selected prior to executing this Build Configuration so that ${project_loc} can get assigned a project.
Select the ReservationIT and execute it using Debug As. It matters what you select this time because both the client IT test and server will be part of the same debugging session.
You should soon be looking at a breakpoint in ReservationIT.ping(). Select continue in order to hit the breakpoint on the server-side. If you get a prompt asking for the path to the source for the EJB, register the java projects as you did in the previous section.
I have found that I always have to restart the test for the new source paths to take effect.
At this point you are in a similar state as with the stand-alone server. However, if you click the red box icon for stop, you will kill both the IT test and the server. In the standalone server environment the kill only stops the IT test.
Much faster code, compile, test development cycle
Enables debugging
More complex but not complicated
Consistent with debugging true remote server
Some independence between remote and local code
Embedded Server
Easy to setup
Requires server to be local
Local and remote code accessed seamlessly
One last point before we wrap up. During the Maven module setup chapters you went through some extra hoops to separate plugin and dependency definitions from their declaration in the implementation modules. We placed those definitions, by default, into the parent module for the exercise. In this chapter we will strip that parent of the generic setup and move it into a re-usable parent that can serve other projects based on EJB and related technologies.
Create a re-usable root pom that defines dependencies and plugins suitable for use by multiple applications
Relieve the parent pom of the multi-module project of tracking those details
Create a new module that will be used as a root pom. To cut down on redundancy, have that pom inherit from the jpa-parent pom created in the jpa portion of the course. Our EJB solutions will soon need the details contained in the data tier parent pom.
$ mkdir ../ejb-parent
$ cat ../ejb-parent/pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<parent>
<groupId>info.ejava.examples.jpa</groupId>
<artifactId>jpa-parent</artifactId>
<version>4.0.0-SNAPSHOT</version>
<relativePath/>
</parent>
<modelVersion>4.0.0</modelVersion>
<groupId>org.myorg</groupId>
<artifactId>ejb-parent</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>pom</packaging>
<name>EJB Parent POM</name>
<description>
This parent pom is intended to provide common and re-usable
definitions and constructs across EJB related modules.
It extends and does not repeat the details contained in
the jpa-parent module.
</description>
<properties>
</properties>
<repositories>
</repositories>
<pluginRepositories>
</pluginRepositories>
<dependencyManagement>
<dependencies>
</dependencies>
</dependencyManagement>
<build>
<pluginManagement>
<plugins>
</plugins>
</pluginManagement>
</build>
<profiles>
</profiles>
</project>
Update the current root/parent pom to inherit from this new pom.
<parent>
<groupId>org.myorg</groupId>
<artifactId>ejb-parent</artifactId>
<version>1.0-SNAPSHOT</version>
<relativePath>../ejb-parent</relativePath>
</parent>
<modelVersion>4.0.0</modelVersion>
<groupId>myorg.basicejb</groupId>
<artifactId>basicejbEx</artifactId>
<packaging>pom</packaging>
Verify the exercise still builds.
$ mvn clean install
Remove from the current parent pom the properties that are already defined in the jpa-parent.
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<java.source.version>...</java.source.version>
<java.target.version>...</java.target.version>
<ejava.version>...</ejava.version>
<junit.version>...</junit.version>
<slf4j.version>...</slf4j.version>
<log4j.version>...</log4j.version>
<maven-compiler-plugin.version>...</maven-compiler-plugin.version>
Define the properties unique to EJB development in the ejb-parent pom. Remove them from the current parent pom.
<properties>
<javax.ejb-api.version>3.2.2</javax.ejb-api.version>
<cargo-maven2-plugin.version>1.4.15</cargo-maven2-plugin.version>
<maven-ear-plugin.version>3.0.1</maven-ear-plugin.version>
<maven-ejb-plugin.version>3.0.1</maven-ejb-plugin.version>
<maven-failsafe-plugin.version>2.22.0</maven-failsafe-plugin.version>
<maven-war-plugin.version>3.2.2</maven-war-plugin.version>
<cargo.containerId>wildfly9x</cargo.containerId>
<jboss.host>localhost</jboss.host>
<jboss.http.port>8080</jboss.http.port>
<jboss.mgmt.host>${jboss.host}</jboss.mgmt.host>
<jboss.mgmt.port>9990</jboss.mgmt.port>
<jndi.user>known</jndi.user>
<jndi.password>password1!</jndi.password>
<java.naming.factory.initial>org.wildfly.naming.client.WildFlyInitialContextFactory</java.naming.factory.initial>
<java.naming.provider.url>http-remoting://${jboss.host}:${jboss.http.port}</java.naming.provider.url>
<java.naming.factory.url.pkgs/>
</properties>
Verify the exercise still builds.
$ mvn clean install
Remove from the current parent pom the dependencyManagement definitions that are already defined in the jpa-parent.
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${slf4j.version}</version>
</dependency>
Define the dependencyManagement definitions unique to EJB development in the ejb-parent pom. Remove them from the current parent pom.
<dependencyManagement>
<dependencies>
<dependency>
<groupId>javax.ejb</groupId>
<artifactId>javax.ejb-api</artifactId>
<version>${javax.ejb-api.version}</version>
</dependency>
<dependency>
<groupId>info.ejava.examples.common</groupId>
<artifactId>jboss-rmi-client</artifactId>
<version>${ejava.version}</version>
<type>pom</type>
</dependency>
</dependencies>
</dependencyManagement>
Verify the exercise still builds.
$ mvn clean install
Remove from the current parent pom the pluginManagement definitions that are already defined in the jpa-parent.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>${maven-compiler-plugin.version}</version>
<configuration>
<source>${java.source.version}</source>
<target>${java.target.version}</target>
</configuration>
</plugin>
Define the pluginManagement definitions unique to EJB development in the ejb-parent pom. Remove them from the current parent pom.
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-ejb-plugin</artifactId>
<version>${maven-ejb-plugin.version}</version>
<configuration>
<ejbVersion>3.2</ejbVersion>
<archive>
<manifest>
<addClasspath>true</addClasspath>
</manifest>
</archive>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>${maven-war-plugin.version}</version>
<configuration>
<failOnMissingWebXml>false</failOnMissingWebXml>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-ear-plugin</artifactId>
<version>${maven-ear-plugin.version}</version>
<configuration>
<version>7</version>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<version>${maven-failsafe-plugin.version}</version>
<configuration>
<argLine>${surefire.argLine}</argLine>
</configuration>
<executions>
<execution>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.codehaus.cargo</groupId>
<artifactId>cargo-maven2-plugin</artifactId>
<version>${cargo-maven2-plugin.version}</version>
<configuration>
<container>
<containerId>${cargo.containerId}</containerId>
<type>remote</type>
<log>target/server.log</log>
<output>target/output.log</output>
</container>
<configuration>
<type>runtime</type>
<properties>
<cargo.hostname>${jboss.mgmt.host}</cargo.hostname>
<cargo.jboss.management.port>${jboss.mgmt.port}</cargo.jboss.management.port>
</properties>
</configuration>
</configuration>
<dependencies>
<dependency>
<groupId>org.wildfly</groupId>
<artifactId>wildfly-controller-client</artifactId>
<version>${wildfly.version}</version>
</dependency>
</dependencies>
<executions>
<execution>
<id>cargo-prep</id>
<phase>pre-integration-test</phase>
<goals>
<goal>redeploy</goal>
</goals>
</execution>
<execution>
<id>cargo-post</id>
<phase>post-integration-test</phase>
<goals>
<goal>undeploy</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</pluginManagement>
</build>
Verify the exercise still builds.
$ mvn clean install
Define the repository declarations we added for EJB development that were not in the jpa-parent pom. Remember that we used the class maven repository to locate the jboss-rmi-client archive in order to more easily bring JBoss dependencies into the RMI client.
<repositories>
<repository>
<id>webdev-snapshot</id>
<name>ejava webdev snapshot repository</name>
<url>http://webdev.jhuep.com/~jcs/maven2-snapshot</url>
<releases>
<enabled>false</enabled>
</releases>
<snapshots>
<enabled>true</enabled>
<updatePolicy>daily</updatePolicy>
</snapshots>
</repository>
</repositories>
Verify the exercise still builds.
$ mvn clean install
Define the profiles we added for EJB development that were not in the jpa-parent pom. We added the following profile definition to be able to perform remote debugging on an IT test running within a full Maven build lifecycle.
<profiles>
<profile> <!-- tells surefire/failsafe to run JUnit tests with remote debug -->
<id>debugger</id>
<activation>
<property>
<name>debugger</name>
</property>
</activation>
<properties>
<surefire.argLine>-Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000 -Xnoagent -Djava.compiler=NONE</surefire.argLine>
</properties>
</profile>
</profiles>
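With the property-based activation above, the profile can be enabled during a full build lifecycle straight from the command line. Since the argLine uses suspend=y, the forked IT test JVM will pause on port 8000 until a debugger client attaches:

```shell
$ mvn verify -Ddebugger
```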
Verify the exercise still builds.
$ mvn clean install
The following is what should remain in the application's root pom. Other root application poms can look just as simple now that we have off-loaded the dependency and plugin specifics to the ejb-parent and jpa-parent root poms.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<parent>
<groupId>org.myorg</groupId>
<artifactId>ejb-parent</artifactId>
<version>1.0-SNAPSHOT</version>
<relativePath>../ejb-parent</relativePath>
</parent>
<modelVersion>4.0.0</modelVersion>
<groupId>myorg.basicejb</groupId>
<artifactId>basicejbEx</artifactId>
<packaging>pom</packaging>
<name>Basic EJB Exercise</name>
<version>1.0-SNAPSHOT</version>
<description>
This project is the root project for the example Java EE
Application.
</description>
<modules>
<module>basicejb-ejb</module>
<module>basicejb-ear</module>
<module>basicejb-test</module>
<module>basicejb-war</module>
</modules>
</project>
Built on: 2019-08-22 07:11 EST
Abstract
This exercise demonstrates how to integrate JPA-based solutions into the EJB tier and how to address many of the issues that can be encountered when introducing server-side technology -- specifically the remote interface with the client.
Table of Contents
Integrate JPA persistence units with server-side deployed EJBs
Implement EJB business methods using a JPA persistence context
Access EJB business methods using remote interface
Identify how a remote interface and local business interfaces are different and the role of the DTO
Develop solutions for remote interfaces that are appropriate for specific application requirements
At the completion of this topic, the student shall
be able to:
Integrate a JPA persistence unit into a EAR-deployed EJB
Integrate a JPA persistence unit into a WAR-deployed EJB
Integrate external business and DAO logic into an EJB
Develop lazy-load solutions for marshalling data to remote client (pre-touching and join fetches)
Develop solutions for when marshalling managed entities as DTOs (classpath and cleansing)
Develop solutions that create independent remote (DTO) and data tier (BO/entity) layers
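The last two objectives revolve around the DTO pattern. As a minimal sketch of the "cleansing" idea -- copy only the fields the remote client needs, leaving lazily-loaded collections and provider proxies behind -- consider the following. The class shapes here are simplified stand-ins for the exercise's Event/EventDTO classes, not their actual definitions:

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch of marshalling a managed entity to a remote client as a cleansed DTO. */
public class DtoSketch {
    /** Stand-in for an entity (BO) with a relationship that could be lazily loaded. */
    static class Event {
        int id;
        String name;
        List<String> tickets = new ArrayList<>(); // imagine a lazy collection
        Event(int id, String name) { this.id = id; this.name = name; }
    }

    /** Stand-in DTO: plain fields only, safe to serialize across the remote interface. */
    static class EventDTO {
        final int id;
        final String name;
        final int ticketCount;
        EventDTO(Event e) {            // touch only what has been fetched
            this.id = e.id;
            this.name = e.name;
            this.ticketCount = e.tickets.size(); // summarize, don't drag the collection
        }
    }

    public static void main(String[] args) {
        Event bo = new Event(1, "opening night");
        bo.tickets.add("seat-1A");
        EventDTO dto = new EventDTO(bo);
        System.out.println(dto.name + ":" + dto.ticketCount); // opening night:1
    }
}
```

In a real deployment the constructor-copy happens on the server-side, before the provider's proxies leave the persistence context; the exercise explores what goes wrong when this is skipped.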
In this section of the lab we will be getting the environment ready and getting you familiar with the modules you will be working with.
Start the application server either at the command line or within IDE
Figure 54.1. Start the Application Server (command line)
$ ./bin/standalone.sh ... 22:48:07,175 INFO [org.jboss.as] (Controller Boot Thread) JBAS015874: WildFly 8.1.0.Final "Kenny" started in 56096ms - Started 630 of 694 services (131 services are lazy, passive or on-demand)
Start off by building the solution to verify the environment is set up correctly.
Change directory to the jpatickets-labsol
directory
Build, deploy, and test the solution
Figure 54.2. Build, Deploy, and Test Solution
$ mvn clean install ... [INFO] Reactor Summary: [INFO] [INFO] EJB::JPA Tickets Lab::Solution ..................... SUCCESS [ 0.556 s] [INFO] EJB::JPA Tickets Lab::Solution::Impl ............... SUCCESS [ 11.543 s] [INFO] EJB::JPA Tickets Lab::Solution::EJB ................ SUCCESS [ 0.948 s] [INFO] EJB::JPA Tickets Lab::Solution::EAR ................ SUCCESS [ 0.730 s] [INFO] EJB::JPA Tickets Lab::Solution::WAR ................ SUCCESS [ 1.280 s] [INFO] EJB::JPA Tickets Lab::Solution::IT Test ............ SUCCESS [ 16.556 s] [INFO] ------------------------------------------------------------------------ [INFO] BUILD SUCCESS
Now build the starting point for the exercise. Many things have been turned off but it should build successfully.
Change directory to the jpatickets-labex
directory
Build initial state of exercise. Notice all the tests have been skipped
Figure 54.3. Build Initial Exercise State
$ mvn clean install ... Tests run: 10, Failures: 0, Errors: 0, Skipped: 10 ... [INFO] ------------------------------------------------------------------------ [INFO] Reactor Summary: [INFO] [INFO] EJB::JPA Tickets Lab::Exercise ..................... SUCCESS [ 0.690 s] [INFO] EJB::JPA Tickets Lab::Exercise::Impl ............... SUCCESS [ 11.621 s] [INFO] EJB::JPA Tickets Lab::Exercise::EJB ................ SUCCESS [ 1.021 s] [INFO] EJB::JPA Tickets Lab::Exercise::EAR ................ SUCCESS [ 0.747 s] [INFO] EJB::JPA Tickets Lab::Exercise::WAR ................ SUCCESS [ 0.846 s] [INFO] EJB::JPA Tickets Lab::Exercise::IT Test ............ SUCCESS [ 7.229 s] [INFO] ------------------------------------------------------------------------ [INFO] BUILD SUCCESS
Import the jpatickets-labex
module and its children into your
IDE to begin the changes to the exercise. The following shows the source
trees for the modules you will be working with.
The overall project is a six (6) module project with one (1) parent and five (5) child modules: impl, ejb, ear, war, and test.
jpatickets-labex |-- jpatickets-labex-impl | |-- pom.xml | `-- src | |-- main | | `-- java | `-- test | |-- java | `-- resources |-- jpatickets-labex-ejb | |-- pom.xml | `-- src | `-- main | |-- java | `-- resources |-- jpatickets-labex-ear | `-- pom.xml |-- jpatickets-labex-war | |-- pom.xml | `-- src | `-- main | |-- java | |-- resources | `-- webapp-filtered |-- jpatickets-labex-test | |-- pom.xml | `-- src | `-- test | |-- java | `-- resources `-- pom.xml
The Impl module provides the business objects (BOs), data access objects (DAOs), and business logic for the application. The BOs are mapped to the database using JPA and the DAOs implement the details of accessing the BOs in the database. The business logic handles any logic issues above the data tier (minimal in this case). The impl module contains a JPA persistence unit and test infrastructure that we will want to use for reference when deploying one to the server-side. There are two main interfaces to the business logic: VenueMgmt and EventMgmt. The EJB tier simply needs to instantiate this logic, inject the objects with required resource dependencies (i.e., a persistence context/EntityManager), provide remote and local access, and define transaction semantics to the container. You will not need to modify this module during the exercise but may be asked to look through it.
jpatickets-labex/jpatickets-labex-impl/ |-- pom.xml `-- src |-- main | `-- java | `-- org | `-- myorg | `-- jpatickets | |-- bl | | |-- EventMgmtImpl.java | | |-- EventMgmt.java | | |-- UnavailableException.java | | |-- VenueMgmtImpl.java | | `-- VenueMgmt.java | |-- bo | | |-- Address.java | | |-- Event.java | | |-- Seat.java | | |-- SeatPK.java | | |-- Ticket.java | | |-- TicketPK.java | | `-- Venue.java | `-- dao | |-- EventMgmtDAOImpl.java | |-- EventMgmtDAO.java | |-- VenueDAOImpl.java | `-- VenueDAO.java `-- test |-- java | `-- org | `-- myorg | `-- jpatickets | `-- bl | |-- TicketsFactory.java | `-- TicketsTest.java `-- resources |-- log4j.properties `-- META-INF `-- persistence.xml
The EJB module contains two primary EJBs for the exercise (VenueMgmtEJB and EventMgmtEJB) and a third/fourth EJB we will not address (TicketsInitEJB and TicketsInitTxEJB). I refer to it as a third/fourth EJB because "TicketsInit" is one logical EJB that has been split into two physical EJBs. TicketsInitEJB is the remote facade for IT clients to initiate a DB reset. TicketsInitTxEJB is an EJB that wraps the drop() and create() schema actions into separate transactions. It happens to use BEAN-managed transactions for those actions. You will primarily be working with the VenueMgmt and EventMgmt EJBs and the persistence unit that will be defined in META-INF/persistence.xml
jpatickets-labex/jpatickets-labex-ejb/ |-- pom.xml `-- src `-- main |-- java | `-- org | `-- myorg | `-- jpatickets | |-- dto | | `-- EventDTO.java | `-- ejb | |-- EventMgmtEJB.java | |-- EventMgmtRemote.java | |-- TicketsInitEJB.java | |-- TicketsInitRemote.java | |-- TicketsInitTxEJB.java | |-- VenueMgmtEJB.java | `-- VenueMgmtRemote.java `-- resources `-- META-INF `-- persistence.xml
The EAR module contains no source other than a pom.xml. It is exclusively used to deploy the EJB to the server with its Impl module dependency. When identifying the entity classes in the EAR-deployed EJB persistence.xml as a "jar-file", it will be important to know the path of the Impl module within the EAR.
jpatickets-labex/jpatickets-labex-ear/ `-- pom.xml
jpatickets-labex/jpatickets-labex-ear/target/jpatickets-labex-ear-4.0.0-SNAPSHOT |-- jpatickets-labex-ejb.jar |-- lib | `-- jpatickets-labex-impl-4.0.0-SNAPSHOT.jar `-- META-INF `-- application.xml
The WAR module is provided to demonstrate persistence unit deployment through an imported EJB and an embedded persistence.xml. It is strictly being used as a deployment platform and does not contain a user interface.
jpatickets-labex-war/src/ `-- main |-- java | `-- org | `-- myorg | `-- jpatickets | `-- webejb | `-- WebVenueMgmtEJB.java |-- resources | `-- META-INF | `-- persistence.xml `-- webapp-filtered `-- WEB-INF `-- jboss-web.xml
The RMI IT Test module is used to deploy the EAR and WAR and initiate tests within the exercise. When you start the exercise most/all of the @Tests have been deactivated with an @Ignore. You will turn them on to expose issues with the implementation and then implement a fix. At the end of the exercise all @Tests should be enabled and passing.
During this setup you:
Started your application server so applications could be deployed to it
Built, deployed, and tested the solution to verify your environment is correct
Built the starting state of the exercise to verify it could build in its initial state
Imported the exercise into your IDE to begin making changes to the exercise copy
In this chapter we will form a persistence unit to be deployed to the server-side. The data tier has been implemented for you and is located in the "impl" module. We will use the persistence unit from that module as a starting point.
This exercise relies on an EJB to reset the DB in between IT tests. Something may not be getting closed out, because after ~10 re-deploys of the application the re-deploy is stalled by a failure to get a DB connection. When this occurred, the only way I could resolve it was to kill the application server (kill -9 on Linux). It did not occur often -- but often enough to place this warning and work-around here.
In this section we will deploy a persistence unit within an EAR-based EJB deployment. The exercise starts off with a missing persistence unit in the EJB, the injection of the persistence context into the EJBs commented out, and the test of the EJB @Ignored.
Remove the @Ignore from the venueEAR() @Test in VenueMgmtIT.java. This will activate the test and request the server-side to create a Venue.
Figure 55.1. Activate venueEAR() @Test
jpatickets-labex-test/src/ `-- test |-- java | `-- org | `-- myorg | `-- jpatickets | `-- ejbclient | `-- VenueMgmtIT.java
@Test
//@Ignore
public void venueEAR() throws NamingException {
    logger.info("*** venueEAR ***");
    VenueMgmtRemote venueMgmt=tf.lookup(VenueMgmtRemote.class, VENUE_JNDINAME);
    Venue venue = tf.makeVenue();
    venueMgmt.createVenue(venue, 1, 2, 3);
    assertNotNull("could not locate venue:" + venue.getId(), venueMgmt.getVenue(venue.getId()));
}
Attempt to build the application from the parent pom.
Figure 55.2. Build Application from Parent POM
jpatickets-labex]$ mvn clean install ... Tests in error: VenueMgmtIT.venueEAR:33 » EJB error creating venue:java.lang.NullPointerExcept... Tests run: 10, Failures: 0, Errors: 1, Skipped: 9 ... [INFO] Reactor Summary: [INFO] [INFO] EJB::JPA Tickets Lab::Exercise ..................... SUCCESS [ 0.652 s] [INFO] EJB::JPA Tickets Lab::Exercise::Impl ............... SUCCESS [ 11.386 s] [INFO] EJB::JPA Tickets Lab::Exercise::EJB ................ SUCCESS [ 0.902 s] [INFO] EJB::JPA Tickets Lab::Exercise::EAR ................ SUCCESS [ 0.561 s] [INFO] EJB::JPA Tickets Lab::Exercise::WAR ................ SUCCESS [ 0.760 s] [INFO] EJB::JPA Tickets Lab::Exercise::IT Test ............ FAILURE [ 8.761 s] [INFO] ------------------------------------------------------------------------ [INFO] BUILD FAILURE
There is a NullPointerException on the server-side because the @PersistenceContext is not being injected and the EntityManager passed to the DAO is null. You will fix this in the next step.
Inject a persistence context (i.e., EntityManager) for the "jpatickets-labex" persistence unit for VenueMgmtEJB.
There are two types of JPA injections that can be done in JavaEE: @PersistenceUnit and @PersistenceContext. @PersistenceUnit defines an injection of an EntityManagerFactory and is usually only done in EJBs using BEAN-managed transactions. The TicketsInitTxEJB uses that construct. @PersistenceContext defines the injection of an EntityManager. Both express the persistence unit name with the "unitName" attribute. The "name" attribute is used for JNDI ENC injection. We will be defining our injections directly, without going through the indirection of the ENC.
Since VenueMgmtEJB and EventMgmtEJB are both @Stateless EJBs, the persistence contexts are defined to be transaction-scoped. That means there will be a single transaction for each sequence of interactions with the persistence context. You have no option to commit and begin a new transaction when it is transaction-scoped. To do otherwise you would need either a @Stateful or @Singleton EJB.
Figure 55.3. Inject @PersistenceContext into EJB
jpatickets-labex-ejb/src/ `-- main |-- java | `-- org | `-- myorg | `-- jpatickets | `-- ejb | |-- VenueMgmtEJB.java
@Stateless
public class VenueMgmtEJB implements VenueMgmtRemote {
    @PersistenceContext(unitName="jpatickets-labex")
    private EntityManager em;
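For contrast, the following is a sketch of the @Stateful alternative mentioned above. The bean name here is hypothetical; PersistenceContextType.EXTENDED is the standard JPA attribute that keeps the persistence context open across business methods instead of scoping it to each transaction:

```java
@Stateful
public class VenueSessionEJB {   // hypothetical stateful session facade
    // EXTENDED: the persistence context lives for the life of the bean instance,
    // spanning multiple method calls and transactions
    @PersistenceContext(unitName="jpatickets-labex", type=PersistenceContextType.EXTENDED)
    private EntityManager em;
}
```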
Attempt to build and deploy application from parent pom. This will fail since we do not yet have the persistence unit defined. We have a persistence.xml file with "a" persistence unit defined, but it is not the one VenueMgmtEJB is trying to use.
Figure 55.4. Attempt to Deploy Application without Persistence Unit
$ mvn clean install ... [INFO] Reactor Summary: [INFO] [INFO] EJB::JPA Tickets Lab::Exercise ..................... SUCCESS [ 0.632 s] [INFO] EJB::JPA Tickets Lab::Exercise::Impl ............... SUCCESS [ 16.222 s] [INFO] EJB::JPA Tickets Lab::Exercise::EJB ................ SUCCESS [ 1.833 s] [INFO] EJB::JPA Tickets Lab::Exercise::EAR ................ SUCCESS [ 0.958 s] [INFO] EJB::JPA Tickets Lab::Exercise::WAR ................ SUCCESS [ 1.715 s] [INFO] EJB::JPA Tickets Lab::Exercise::IT Test ............ FAILURE [ 5.867 s] [INFO] ------------------------------------------------------------------------ [INFO] BUILD FAILURE
2014-11-29 23:45:05,849 ERROR [org.jboss.msc.service.fail] (MSC service thread 1-4) MSC000001: Failed to start service jboss.deployment.subunit."jpatickets-labex-ear-4.0.0-SNAPSHOT.ear"."jpatickets-labex-ejb.jar".INSTALL: org.jboss.msc.service.StartException in service j ... Caused by: org.jboss.as.server.deployment.DeploymentUnitProcessingException: JBAS011047: Component class org.myorg.jpatickets.ejb.VenueMgmtEJB for component VenueMgmtEJB has errors: JBAS011440: Can't find a persistence unit named jpatickets-labex in subdeployment "jpatickets-labex-ejb.jar" of deployment "jpatickets-labex-ear-4.0.0-SNAPSHOT.ear"
Open the existing META-INF/persistence.xml in the EJB. When you are done, this file will have two (2) persistence units defined. The second one is already defined and is used by TicketsInitTxEJB to reset the database in between IT tests. The first one can be built using parts of the "impl" module. (I am choosing to ignore what the schema-gen persistence unit could provide you and focusing on what we can and cannot carry forward from the unit test.)
Figure 55.5. Starting EJB persistence.xml
jpatickets-labex-ejb/src/ `-- main `-- resources `-- META-INF `-- persistence.xml
<persistence ...
<!-- TODO: supply jpatickets-labex persistence unit
<persistence-unit name="jpatickets-labex">
</persistence-unit>
-->
<!-- This persistence unit is used exclusively for managing the schema
during IT tests.
-->
<persistence-unit name="jpatickets-schemagen-labex">
...
</persistence-unit>
</persistence>
Open the existing META-INF/persistence.xml in the impl module src/test/resources directory. We will use that as a starting point and discuss the differences. Note that we placed the unit test's persistence.xml within the src/test tree so that it would be available for unit testing but would not get placed in the JAR and deployed to the server.
Figure 55.6.
jpatickets-labex-impl/src/ `-- test `-- resources `-- META-INF `-- persistence.xml
<persistence-unit name="jpatickets-test">
<!-- used on the server-side
<jta-data-source>java:jboss/datasources/ExampleDS</jta-data-source>
-->
<!-- jarfile shortcut can be used when deploying EJB within EAR
<jar-file>lib/jpatickets-labsol-impl-${project.version}.jar</jar-file>
-->
<!-- classes must be enumerated when deploying outside of EAR -->
<class>org.myorg.jpatickets.bo.Venue</class>
<class>org.myorg.jpatickets.bo.Address</class>
<class>org.myorg.jpatickets.bo.Seat</class>
<class>org.myorg.jpatickets.bo.Event</class>
<class>org.myorg.jpatickets.bo.Ticket</class>
<properties>
<!-- this applies to both unit-test and server-side environments -->
<property name="hibernate.dialect" value="${hibernate.dialect}"/>
<!-- these can be helpful in both environments when debugging
<property name="hibernate.jdbc.batch_size" value="0"/>
<property name="hibernate.show_sql" value="true"/>
<property name="hibernate.format_sql" value="true"/>
-->
<!-- this applies to only the ***demo***-nature of this example -->
<property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
<!-- these apply only to the unit test environment -->
<property name="javax.persistence.jdbc.driver" value="${jdbc.driver}"/>
<property name="javax.persistence.jdbc.url" value="${jdbc.url}"/>
<property name="javax.persistence.jdbc.user" value="${jdbc.user}"/>
<property name="javax.persistence.jdbc.password" value="${jdbc.password}"/>
</properties>
</persistence-unit>
Create the outer element of the persistence unit in the EJB. Note that we have picked a different name than the persistence unit used during the unit tests. This helps keep things straight if they are ever both deployed to the server.
Figure 55.7. Persistence Unit Outer Element
jpatickets-labex-ejb/src/
`-- main
    `-- resources
        `-- META-INF
            `-- persistence.xml
<persistence-unit name="jpatickets-labex">
</persistence-unit>
Add a jta-data-source for obtaining connections. The jta-data-source references the JNDI name of a javax.sql.DataSource defined in the application server. The components will borrow physical connections from the DataSource during transactions, form lightweight logical connections, and close them when complete, which frees the physical connection for other clients.
The jta-data-source takes the place of the "javax.persistence.jdbc" connection properties. Those properties are only necessary when creating direct connections to the database, and the server will have already done that for us.
Figure 55.8. Add jta-data-source to persistence unit in EJB
jpatickets-labex-ejb/src/
`-- main
    `-- resources
        `-- META-INF
            `-- persistence.xml
<!-- used on the server-side -->
<jta-data-source>java:jboss/datasources/ExampleDS</jta-data-source>
<!-- these apply only to the unit test environment
<property name="javax.persistence.jdbc.driver" value="${jdbc.driver}"/>
<property name="javax.persistence.jdbc.url" value="${jdbc.url}"/>
<property name="javax.persistence.jdbc.user" value="${jdbc.user}"/>
<property name="javax.persistence.jdbc.password" value="${jdbc.password}"/>
-->
The JNDI name in the server's configuration (standalone.xml) must match the JNDI name in the jta-data-source element of the persistence unit.
Figure 55.9. ExampleDS in Wildfly's standalone.xml
<datasources>
    <datasource jndi-name="java:jboss/datasources/ExampleDS" pool-name="ExampleDS"
                enabled="true" use-java-context="true">
    ...
</datasources>
Attempt to build, deploy, and test the application with the persistence unit defined with only a jta-data-source. The build will fail because of an unknown entity.
Figure 55.10. Build Fails with Unknown Entity
$ mvn clean install
...
00:17:11,110 INFO (VenueMgmtIT.java:29) -*** venueEAR ***
Tests run: 3, Failures: 0, Errors: 1, Skipped: 2, Time elapsed: 1.521 sec <<< FAILURE! - in org.myorg.jpatickets.ejbclient.VenueMgmtIT
venueEAR(org.myorg.jpatickets.ejbclient.VenueMgmtIT) Time elapsed: 1.197 sec <<< ERROR!
javax.ejb.EJBException: error creating venue:java.lang.IllegalArgumentException: Unknown entity: org.myorg.jpatickets.bo.Venue
The entity is unknown because the entity class is part of a separate JAR file and must be brought into the persistence unit using one of several techniques.
Since the EJB is being deployed to the server within an EAR -- let's take advantage of a shorthand way of referring to every @Entity in the impl.jar. Notice the impl.jar is in the lib directory of the EAR, so we can reference it there. We can also take advantage of Maven filtering to take care of the version number.
Figure 55.11.
jpatickets-labex-ear/target/jpatickets-labex-ear-4.0.0-SNAPSHOT
|-- jpatickets-labex-ejb.jar
|-- lib
|   `-- jpatickets-labex-impl-4.0.0-SNAPSHOT.jar
`-- META-INF
    `-- application.xml
<persistence-unit name="jpatickets-labex">
<!-- used on the server-side -->
<jta-data-source>java:jboss/datasources/ExampleDS</jta-data-source>
<!-- jarfile shortcut can be used when deploying EJB within EAR -->
<jar-file>lib/jpatickets-labex-impl-${project.version}.jar</jar-file>
</persistence-unit>
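The ${project.version} placeholder above is only replaced if Maven resource filtering is turned on for the directory containing persistence.xml. If the literal ${project.version} text shows up in the deployed file, check that the EJB module's pom enables filtering. A minimal sketch, assuming the default src/main/resources layout:

```xml
<build>
    <resources>
        <resource>
            <directory>src/main/resources</directory>
            <!-- expand ${...} placeholders (e.g., ${project.version}) at build time -->
            <filtering>true</filtering>
        </resource>
    </resources>
</build>
```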
While doing a quick dry-run of this exercise, I noticed that I could not get this step to work as planned and am not currently sure why it is broken. If this is not working for you -- please update to the following. There will be a later step that asks you to make that exact edit for a separate reason when using WARs.
<persistence-unit name="jpatickets-labex">
<jta-data-source>java:jboss/datasources/ExampleDS</jta-data-source>
<!--
<jar-file>lib/jpatickets-labex-impl-${project.version}.jar</jar-file>
-->
<class>org.myorg.jpatickets.bo.Venue</class>
<class>org.myorg.jpatickets.bo.Address</class>
<class>org.myorg.jpatickets.bo.Seat</class>
<class>org.myorg.jpatickets.bo.Event</class>
<class>org.myorg.jpatickets.bo.Ticket</class>
</persistence-unit>
Add some debug output from the persistence unit.
Figure 55.12. Printing SQL Debug From Persistence Unit
<properties>
<!-- these can be helpful in both environments when debugging
<property name="hibernate.jdbc.batch_size" value="0"/>
-->
<property name="hibernate.show_sql" value="true"/>
<property name="hibernate.format_sql" value="false"/>
</properties>
Build, deploy, and test the application by building from the parent pom. This should be successful. If you look in the server.log you should see the SQL output from the show_sql property we enabled above, creating and getting a venue.
Figure 55.13.
$ mvn clean install
[INFO] Reactor Summary:
[INFO]
[INFO] EJB::JPA Tickets Lab::Exercise ..................... SUCCESS [ 1.462 s]
[INFO] EJB::JPA Tickets Lab::Exercise::Impl ............... SUCCESS [ 12.172 s]
[INFO] EJB::JPA Tickets Lab::Exercise::EJB ................ SUCCESS [ 0.991 s]
[INFO] EJB::JPA Tickets Lab::Exercise::EAR ................ SUCCESS [ 1.831 s]
[INFO] EJB::JPA Tickets Lab::Exercise::WAR ................ SUCCESS [ 1.622 s]
[INFO] EJB::JPA Tickets Lab::Exercise::IT Test ............ SUCCESS [ 8.931 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
*** VenueMgmtEJB:init(869033952) ***
*** VenueMgmtEJB:destroy(869033952) ***
insert into JPATICKETS_VENUE (CITY, STATE, STREET, POSTAL_CODE, NAME, VENUE_ID) values (?, ?, ?, ?, ?, ?)
insert into JPATICKET_SEAT (POSTION, ROW, SECTION, VENUE_ID) values (?, ?, ?, ?)
insert into JPATICKET_SEAT (POSTION, ROW, SECTION, VENUE_ID) values (?, ?, ?, ?)
...
*** VenueMgmtEJB:init(477412492) ***
select venue0_.VENUE_ID as VENUE_ID1_2_0_, venue0_.CITY as CITY2_2_0_, venue0_.STATE as STATE3_2_0_, venue0_.STREET as STREET4_2_0_, venue0_.POSTAL_CODE as POSTAL_C5_2_0_, venue0_.NAME as NAME6_2_0_ from JPATICKETS_VENUE venue0_ where venue0_.VENUE_ID=?
*** VenueMgmtEJB:destroy(477412492) ***
Notice with the @Stateless EJB, we get an init/destroy pair for every business method call because we have not placed this EJB in a pool. Notice too that the destroy for the createVenue() is printed prior to the database INSERTs. That is because the persistence context was injected by the container and the end-of-transaction flush() did not occur until after the business method exited. The init/destroy pair for the getVenue() occur around the SELECT because queries are performed immediately and do not wait until the end of a transaction (if one is active).
You have completed this section of the exercise. You now have an initial persistence unit defined and deployed to the server. The persistence unit is injected into the EJB and has required entities registered. The EJB is able to expose business methods that use the persistence unit to create and get a Venue.
In this section we will deploy the EJB from the previous section using a WAR instead of an EAR. We will do this to demonstrate that the jar-file shortcut only works for EAR-based deployments.
Activate the venueImportedEJB() @Test within VenueMgmtIT.java of the test module.
Figure 55.14. Activate venueImportedEJB() @Test
@Test
//@Ignore
public void venueImportedEJB() throws NamingException {
logger.info("*** venueImportedEJB ***");
VenueMgmtRemote venueMgmt=tf.lookup(VenueMgmtRemote.class, WEBIMPORTED_VENUE_JNDINAME);
Venue venue = tf.makeVenue();
venueMgmt.createVenue(venue, 1, 2, 3);
assertNotNull("could not locate venue:" + venue.getId(), venueMgmt.getVenue(venue.getId()));
}
Attempt to build the application from the root. Notice the EAR-based EJB test still passes but the WAR-based imported EJB cannot resolve the entities (like before).
Figure 55.15. Build Failure - Unknown Entity (again)
jpatickets-labex-war/target/jpatickets-labex-war
`-- WEB-INF
    `-- lib
        |-- jpatickets-labex-ejb-4.0.0-SNAPSHOT.jar
        `-- jpatickets-labex-impl-4.0.0-SNAPSHOT.jar
$ mvn clean install
Tests in error:
  VenueMgmtIT.venueImportedEJB:45 » EJB error creating venue:java.lang.IllegalAr...
Tests run: 10, Failures: 0, Errors: 1, Skipped: 8
...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] EJB::JPA Tickets Lab::Exercise ..................... SUCCESS [ 0.538 s]
[INFO] EJB::JPA Tickets Lab::Exercise::Impl ............... SUCCESS [ 11.080 s]
[INFO] EJB::JPA Tickets Lab::Exercise::EJB ................ SUCCESS [ 1.473 s]
[INFO] EJB::JPA Tickets Lab::Exercise::EAR ................ SUCCESS [ 0.787 s]
[INFO] EJB::JPA Tickets Lab::Exercise::WAR ................ SUCCESS [ 1.368 s]
[INFO] EJB::JPA Tickets Lab::Exercise::IT Test ............ FAILURE [ 10.664 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
01:04:47,063 ERROR [org.myorg.jpatickets.ejb.VenueMgmtEJB] (EJB default - 5) error creating venue: java.lang.IllegalArgumentException: Unknown entity: org.myorg.jpatickets.bo.Venue
The jar-file element unfortunately will not resolve a JAR file within anything besides an EAR -- even if we adjusted the path to match the WAR layout. We will have to use one of the other, more portable techniques to register the entity classes with the persistence unit.
Register the entity classes using class elements. This is more verbose, but it will pull the entities from the classpath formed by either the EAR or WAR, and we will not have to worry about the location of the impl.jar containing the @Entity classes. Remove the jar-file reference.
Figure 55.16. Entities Registered using class Elements
jpatickets-labex-ejb/src/
`-- main
    `-- resources
        `-- META-INF
            `-- persistence.xml
<persistence-unit name="jpatickets-labex">
<!-- used on the server-side -->
<jta-data-source>java:jboss/datasources/ExampleDS</jta-data-source>
<!-- jarfile shortcut can be used when deploying EJB within EAR
<jar-file>lib/jpatickets-labex-impl-${project.version}.jar</jar-file>
-->
<!-- classes must be enumerated when deploying outside of EAR -->
<class>org.myorg.jpatickets.bo.Venue</class>
<class>org.myorg.jpatickets.bo.Address</class>
<class>org.myorg.jpatickets.bo.Seat</class>
<class>org.myorg.jpatickets.bo.Event</class>
<class>org.myorg.jpatickets.bo.Ticket</class>
<properties>
Build, Deploy, and Re-run the IT test for the WAR-based imported EJB. This should now complete successfully.
Figure 55.17. Successful Build of WAR-based Imported EJBs
$ mvn clean install
...
[INFO] Reactor Summary:
[INFO]
[INFO] EJB::JPA Tickets Lab::Exercise ..................... SUCCESS [ 0.640 s]
[INFO] EJB::JPA Tickets Lab::Exercise::Impl ............... SUCCESS [ 19.197 s]
[INFO] EJB::JPA Tickets Lab::Exercise::EJB ................ SUCCESS [ 0.793 s]
[INFO] EJB::JPA Tickets Lab::Exercise::EAR ................ SUCCESS [ 0.543 s]
[INFO] EJB::JPA Tickets Lab::Exercise::WAR ................ SUCCESS [ 0.909 s]
[INFO] EJB::JPA Tickets Lab::Exercise::IT Test ............ SUCCESS [ 9.691 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
You have completed two of the three deployment alternatives. The first and second were actually the same EJB artifact. One was deployed as an EAR-based EJB. The other deployed as a WAR-based EJB. We learned that jar-file references only work in EAR-based deployments but naming each entity using the class element works in both environments.
In this section we will make a simple copy of the VenueMgmtEJB and implement it as an EJB embedded within the WAR. That means it will not be part of any EJB.jar and will not be in WEB-INF/lib; it will be located in WEB-INF/classes.
Activate the venueWAR() @Test within the VenueMgmtIT.java IT test in the test module.
Figure 55.18. Activate IT test for WAR-based Embedded EJB
jpatickets-labex-test/src/
`-- test
    |-- java
    |   `-- org
    |       `-- myorg
    |           `-- jpatickets
    |               `-- ejbclient
    |                   `-- VenueMgmtIT.java
@Test
//@Ignore
public void venueWAR() throws NamingException {
logger.info("*** venueWAR ***");
VenueMgmtRemote venueMgmt=tf.lookup(VenueMgmtRemote.class, WEBVENUE_JNDINAME);
Venue venue = tf.makeVenue();
venueMgmt.createVenue(venue, 1, 2, 3);
assertNotNull("could not locate venue:" + venue.getId(), venueMgmt.getVenue(venue.getId()));
}
Attempt to build the application from the parent pom. This should fail with a familiar error -- Unknown entity.
Figure 55.19. Build, Deploy, and Test WAR-embedded Persistence Unit (fail)
$ mvn clean install
...
Tests run: 3, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 1.748 sec
...
javax.ejb.EJBException: error creating venue:java.lang.IllegalArgumentException: Unknown entity: org.myorg.jpatickets.bo.Venue
...
[INFO] Reactor Summary:
[INFO]
[INFO] EJB::JPA Tickets Lab::Exercise ..................... SUCCESS [ 0.906 s]
[INFO] EJB::JPA Tickets Lab::Exercise::Impl ............... SUCCESS [ 25.933 s]
[INFO] EJB::JPA Tickets Lab::Exercise::EJB ................ SUCCESS [ 0.806 s]
[INFO] EJB::JPA Tickets Lab::Exercise::EAR ................ SUCCESS [ 0.470 s]
[INFO] EJB::JPA Tickets Lab::Exercise::WAR ................ SUCCESS [ 0.773 s]
[INFO] EJB::JPA Tickets Lab::Exercise::IT Test ............ FAILURE [ 10.174 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
Note in this case we have an entirely new persistence.xml. It is embedded within the WAR and deployed to WEB-INF/classes/META-INF. The source location within the WAR module is src/main/resources.
Figure 55.20. WAR-based Embedded persistence.xml
jpatickets-labex-war/target/jpatickets-labex-war
`-- WEB-INF
    |-- classes
    |   |-- META-INF
    |   |   `-- persistence.xml
    |   `-- org
    |       `-- myorg
    |           `-- jpatickets
    |               `-- webejb
    |                   `-- WebVenueMgmtEJB.class
jpatickets-labex-war/src/
`-- main
    |-- resources
    |   `-- META-INF
    |       `-- persistence.xml
<!-- TODO: supply jpatickets-labexsweb persistence unit -->
<persistence-unit name="jpaticketsweb-labex">
</persistence-unit>
We are currently missing all of the internals of this persistence unit.
Add jta-data-source and entity class references.
Figure 55.21. WAR-based Embedded Persistence Unit (filled in)
<persistence-unit name="jpaticketsweb-labex">
<jta-data-source>java:jboss/datasources/ExampleDS</jta-data-source>
<class>org.myorg.jpatickets.bo.Venue</class>
<class>org.myorg.jpatickets.bo.Address</class>
<class>org.myorg.jpatickets.bo.Seat</class>
<class>org.myorg.jpatickets.bo.Event</class>
<class>org.myorg.jpatickets.bo.Ticket</class>
</persistence-unit>
Build, deploy, and test the WAR-based embedded persistence unit. This should now pass and you should notice that three (3) IT tests were actually executed and the rest are still ignored.
Figure 55.22. Build, Deploy, and Test WAR-embedded Persistence Unit (success)
Tests run: 10, Failures: 0, Errors: 0, Skipped: 7
...
[INFO] Reactor Summary:
[INFO]
[INFO] EJB::JPA Tickets Lab::Exercise ..................... SUCCESS [ 0.740 s]
[INFO] EJB::JPA Tickets Lab::Exercise::Impl ............... SUCCESS [ 10.747 s]
[INFO] EJB::JPA Tickets Lab::Exercise::EJB ................ SUCCESS [ 0.880 s]
[INFO] EJB::JPA Tickets Lab::Exercise::EAR ................ SUCCESS [ 0.566 s]
[INFO] EJB::JPA Tickets Lab::Exercise::WAR ................ SUCCESS [ 0.860 s]
[INFO] EJB::JPA Tickets Lab::Exercise::IT Test ............ SUCCESS [ 11.401 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
You have finished the third of three (3) different deployment types for a persistence unit. In this last option you deployed a persistence unit for explicit use by beans embedded within the WAR. Note the beans in the WAR also had access to the persistence unit within the imported EJB if they wanted to share a persistence context at runtime (quite likely).
In this exercise you deployed a persistence unit within an EAR-deployed and a WAR-deployed EJB, and embedded a persistence unit within a WAR for use by embedded EJBs and other beans. From this point you should be able to begin layering your implementation around the injected @PersistenceContext/EntityManager and addressing issues that are of concern to remote access to the EJB.
In this chapter we are going to focus on identifying issues that occur when interfacing with a JPA-based EJB and especially through a remote interface. To be fair -- not all of the issues covered in this chapter are unique to JPA but the integration between the Java business objects and database persistence technology is so seamless that it will definitely come up within this context.
A Data Transfer Object (DTO) is anything you pass between the client and server to express information. DTOs could be based on XML or JSON with class representations on either end. They could be simple Serializable Java data structures or business objects (BOs) that are also mapped as JPA entities. Whatever the approach -- one key requirement is that all Java objects passed through an RMI interface to a remote client must implement Serializable. Let's demonstrate that in this section.
Activate the eventSerializable() @Test within EventMgmtIT.java
Figure 56.1. Activate eventSerializable() @Test
jpatickets-labex-test/src/
`-- test
    |-- java
    |   `-- org
    |       `-- myorg
    |           `-- jpatickets
    |               `-- ejbclient
    |                   |-- EventMgmtIT.java
@Test
//@Ignore
public void eventSerializable() throws UnavailableException {
logger.info("*** eventSerializable ***");
Venue venue = venueMgmt.createVenue(tf.makeVenue(), 1, 2, 2);
Event event = eventMgmt.createEvent(tf.makeEvent(), venue);
assertNotNull("null tickets for event", event);
logger.info("event.tickets.class={}", event.getTickets().getClass());
}
Build, Deploy, and Test the application from the parent module
You can use "mvn clean install -Dit.test=org.myorg.jpatickets.ejbclient.EventMgmtIT#testMethod" to execute a specific IT test method. You can obtain the fully qualified name of the class/testMethod within Eclipse by selecting the method, right-clicking, and choosing Copy Qualified Name.
Figure 56.2. Build, Deploy, and Test Serialization Problem
$ mvn clean install
...
Running org.myorg.jpatickets.ejbclient.EventMgmtIT
22:12:14,708 INFO (EventMgmtIT.java:37) -*** eventSerializable ***
Tests run: 8, Failures: 0, Errors: 1, Skipped: 7, Time elapsed: 0.189 sec <<< FAILURE! - in org.myorg.jpatickets.ejbclient.EventMgmtIT
eventSerializable(org.myorg.jpatickets.ejbclient.EventMgmtIT) Time elapsed: 0.155 sec <<< ERROR!
java.lang.IllegalStateException: EJBCLIENT000025: No EJB receiver available for handling [appName:jpatickets-labex-ear, moduleName:jpatickets-labex-ejb, distinctName:] combination for invocation context org.jboss.ejb.client.EJBClientInvocationContext@18218b5
This problem can be hard to solve because there is no pointer to the problem. We have a DTO in one of our interface methods that has not implemented Serializable. I can usually tell this is the case when one or more other methods work, a specific method does not, and the client simply reports "No EJB receiver...".
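One way to narrow down the offending class without re-deploying is to run a local serialization round trip over each DTO type used by the failing method. The sketch below uses hypothetical stand-in DTO classes (not the actual lab classes) to show the check:

```java
import java.io.*;

// hypothetical DTO that correctly implements Serializable
class GoodDTO implements Serializable {
    private static final long serialVersionUID = 1L;
    String name = "The Fillmore";
}

// hypothetical DTO that forgot Serializable -- marshaling it through an
// RMI interface fails the same way the EJB call did
class BadDTO {
    String name = "oops";
}

public class SerializationCheck {
    // returns true if the object survives serialization
    static boolean isMarshalable(Object o) {
        try (ByteArrayOutputStream bos = new ByteArrayOutputStream();
             ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
            return true;
        } catch (NotSerializableException ex) {
            return false;
        } catch (IOException ex) {
            throw new UncheckedIOException(ex);
        }
    }

    public static void main(String[] args) {
        System.out.println(isMarshalable(new GoodDTO())); // true
        System.out.println(isMarshalable(new BadDTO()));  // false
    }
}
```

Running this check against each DTO in the interface points directly at the class that needs implements Serializable.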
Update Event to implement Serializable within Event.java in the impl module.
Also uncomment the injection of the @PersistenceContext within the EventMgmtEJB. We can uncomment this because we successfully deployed the persistence unit in the previous chapter. If you forget to inject the @PersistenceContext -- then you will see the NullPointerException we encountered before.
Figure 56.3. Add implements Serializable to DTO
jpatickets-labex-impl/src
|-- main
|   `-- java
|       `-- org
|           `-- myorg
|               `-- jpatickets
|                   |-- bo
|                   |   |-- Event.java
import java.io.Serializable;
...
public class Event implements Serializable {
jpatickets-labex-ejb/src/
|-- main
|   |-- java
|   |   `-- org
|   |       `-- myorg
|   |           `-- jpatickets
|   |               `-- ejb
|   |                   |-- EventMgmtEJB.java
@Stateless
public class EventMgmtEJB implements EventMgmtRemote {
@PersistenceContext(unitName="jpatickets-labex")
private EntityManager em;
Attempt to build, deploy, and test the application in this state. You will have fixed the previous problem but uncovered a new problem with a Hibernate class not being found.
Figure 56.4. Build, Deploy, and Test Provider ClassNotFoundException Problem
$ mvn clean install
...
22:31:28,343 INFO (EventMgmtIT.java:37) -*** eventSerializable ***
Tests run: 8, Failures: 0, Errors: 1, Skipped: 7, Time elapsed: 0.285 sec <<< FAILURE! - in org.myorg.jpatickets.ejbclient.EventMgmtIT
eventSerializable(org.myorg.jpatickets.ejbclient.EventMgmtIT) Time elapsed: 0.268 sec <<< ERROR!
javax.ejb.EJBException: java.lang.ClassNotFoundException: org.hibernate.collection.internal.PersistentBag
You have finished this section of the exercise, where we identified a DTO passed through the remote interface that did not implement Serializable. We were only able to determine that by inspecting all DTO classes used by the interface method called, because there was no specific error message pointing us to the class in error. In the next section we will address the ClassNotFoundException.
The previous section fixed the issue with non-Serializable DTOs; now we are faced with a result telling us "ClassNotFoundException: org.hibernate.collection.internal.PersistentBag". Depending on the situation, it could have been this or a different Hibernate class.
So where does this Hibernate class come from? It comes from the fact that the EJB returned the result of a JPA query/access -- a managed entity with proxy class instances attached to watch for changes. These proxy classes were serialized along with the business objects they were watching. The following is a gist of the code executed on the server. It has the business logic and DAO layers removed to show the raw JPA call underneath the layers.
Figure 56.5. EventMgmtEJB.getEvent() Returns Managed @Entity
@Override
@TransactionAttribute(TransactionAttributeType.SUPPORTS)
public Event getEvent(int id) {
logger.debug("getEvent({})", id);
return em.find(Event.class, id);
}
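The reason a Hibernate class shows up on the client can be reproduced with plain JDK serialization: the concrete runtime class of each field travels with the serialized object graph. Here a hypothetical Bag class stands in for Hibernate's PersistentBag:

```java
import java.io.*;
import java.util.*;

// stand-in for org.hibernate.collection.internal.PersistentBag
class Bag<E> extends ArrayList<E> {}

public class ProxyCapture {
    public static void main(String[] args) throws Exception {
        // what the provider assigns to a lazy collection like Event.tickets
        List<String> tickets = new Bag<>();
        tickets.add("A1");

        // serialize the graph as RMI marshaling would
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(tickets);
        }

        // deserialization requires Bag on the classpath -- a client without
        // the class fails with ClassNotFoundException, like the IT test did
        Object copy = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray())).readObject();
        System.out.println(copy.getClass().getSimpleName()); // Bag
    }
}
```

The declared type (List) does not matter; it is the runtime type that gets written to the stream, so the receiver must be able to load it.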
We have a choice in how to fix this. We can either add the JAR that contains the missing class to the client's classpath or we can attempt to remove the provider class entirely and pass a pure POJO back to the client. In this section -- we will use the classpath solution. This solution might be appropriate for internal system interfaces but not when you have clients from external organizations. In that latter case -- you likely would not use RMI anyway over more web-friendly technologies like JAX-RS.
Add the JAR that contains the missing class as a dependency of the RMI IT Test module. This is hibernate-core.
Figure 56.6. (option) Add hibernate-core Dependency to Resolve ClassNotFoundException
<!-- necessary to supply hibernate classes when marshaling managed entities as DTOs -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
</dependency>
Note that the above dependency -- without a version number -- depends upon the root pom defining a dependencyManagement section with the version for implementation modules to use.
Figure 56.7. Provider Artifact Dependency Management
<properties>
<hibernate-entitymanager.version>5.3.1.Final</hibernate-entitymanager.version>
</properties>
<dependencyManagement>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<version>${hibernate-entitymanager.version}</version>
</dependency>
</dependencyManagement>
Re-test the application. Note that since the changes we made were only on the client-side, there is no need to re-build or even re-deploy the entire application. If you are using an IDE you can just make the IT test code change and re-run the test without re-deploying.
Figure 56.8. Successfully Resolve Class with new Client Dependency
$ mvn clean install
...
Running org.myorg.jpatickets.ejbclient.EventMgmtIT
23:30:58,787 INFO (EventMgmtIT.java:37) -*** eventSerializable ***
23:30:59,113 INFO (EventMgmtIT.java:41) -event.tickets.class=class org.hibernate.collection.internal.PersistentBag
Tests run: 9, Failures: 0, Errors: 0, Skipped: 8, Time elapsed: 0.395 sec - in org.myorg.jpatickets.ejbclient.EventMgmtIT
Note the PersistentBag class is still there. Our client just knows what to do with it now.
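A preview of the alternative approach: instead of shipping hibernate-core to the client, the server side can copy provider collections into plain java.util types before returning. A minimal sketch, using hypothetical stand-ins for the entity and provider classes rather than the actual lab types:

```java
import java.util.*;

// hypothetical stand-ins; real code would use org.myorg.jpatickets.bo.Ticket
// and Hibernate's PersistentBag
class Ticket {}
class ProviderBag<E> extends ArrayList<E> {}

public class PojoCleanser {
    // copy the (already loaded) provider collection into a plain ArrayList
    // so no provider classes are marshaled back to the client
    static List<Ticket> toPojoList(List<Ticket> tickets) {
        return tickets == null ? null : new ArrayList<>(tickets);
    }

    public static void main(String[] args) {
        List<Ticket> managed = new ProviderBag<>();
        managed.add(new Ticket());
        List<Ticket> clean = toPojoList(managed);
        System.out.println(clean.getClass().getName()); // java.util.ArrayList
    }
}
```

With the provider collection replaced, the client only deserializes JDK and application classes.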
You have finished solving the missing provider class by adding a classpath dependency for the client on our persistence provider JAR. That may be a suitable answer for an internal client but may not be the solution for a remote client from a different application. In the next section(s) we will look at alternative approaches.
Before we introduce more solutions -- let's introduce another problem with re-using managed JPA entity classes as DTOs. In this case we are going to access more than just the initial object, and the reference was never realized on the server-side before marshaling to the client. Once the client attempts to access the missing data -- it is too late.
Activate the next two @Tests (eventLazy() and eventTouchedSome()). They both make calls to getEvent() and then attempt to access portions of the returned event. eventLazy() catches the exception and reports success when the anticipated error occurs. eventTouchedSome() does not attempt to catch the anticipated error and will fail until corrected.
Figure 56.9. Enable Lazy Load Tests
jpatickets-labex-test/src/
`-- test
    |-- java
    |   `-- org
    |       `-- myorg
    |           `-- jpatickets
    |               `-- ejbclient
    |                   |-- EventMgmtIT.java
@Test
//@Ignore
public void eventLazy() throws UnavailableException {
logger.info("*** eventLazy ***");
Venue venue = venueMgmt.createVenue(tf.makeVenue(), 1, 2, 2);
Event event = eventMgmt.createEvent(tf.makeEvent(), venue);
event=eventMgmt.getEvent(event.getId());
assertNotNull("null tickets for event", event.getTickets());
try {
assertTrue("no tickets for event", event.getTickets().size() > 0);
fail("did not get expected lazy-load exception");
} catch (Exception ex) {
logger.info("caught expected lazy-load exception:" + ex);
}
}
@Test
//@Ignore
public void eventTouchedSome() throws UnavailableException {
logger.info("*** eventTouchedSome ***");
Venue venue = venueMgmt.createVenue(tf.makeVenue(), 1, 2, 2);
Event event = eventMgmt.createEvent(tf.makeEvent(), venue);
event=eventMgmt.getEvent(event.getId());
// event=eventMgmt.getEventTouchedSome(event.getId());
assertNotNull("null tickets for event", event.getTickets());
assertTrue("no tickets for event", event.getTickets().size() > 0);
for (Ticket t: event.getTickets()) {
try {
assertNotNull("no ticket price:" + t, t.getPrice());
fail("did not get expected lazy-load exception");
} catch (Exception ex) {
logger.info("caught expected lazy-load exception:" + ex);
}
}
}
Re-test the application with eventLazy() and eventTouchedSome() activated. The first will pass and the second will fail in their current state.
Figure 56.10. Build, Deploy, and Test Lazy Load Problem
EventMgmtIT.java:48) -*** eventLazy ***
EventMgmtIT.java:58) -caught expected lazy-load exception:org.hibernate.LazyInitializationException: failed to lazily initialize a collection of role: org.myorg.jpatickets.bo.Event.tickets, could not initialize proxy - no Session
EventMgmtIT.java:65) -*** eventTouchedSome ***
Tests run: 9, Failures: 0, Errors: 1, Skipped: 6, Time elapsed: 1.019 sec <<< FAILURE! - in org.myorg.jpatickets.ejbclient.EventMgmtIT
eventTouchedSome(org.myorg.jpatickets.ejbclient.EventMgmtIT) Time elapsed: 0.229 sec <<< ERROR!
org.hibernate.LazyInitializationException: failed to lazily initialize a collection of role: org.myorg.jpatickets.bo.Event.tickets, could not initialize proxy - no Session
    at org.hibernate.collection.internal.AbstractPersistentCollection.throwLazyInitializationException(AbstractPersistentCollection.java:572)
    at org.hibernate.collection.internal.AbstractPersistentCollection.withTemporarySessionIfNeeded(AbstractPersistentCollection.java:212)
    at org.hibernate.collection.internal.AbstractPersistentCollection.readSize(AbstractPersistentCollection.java:153)
    at org.hibernate.collection.internal.PersistentBag.size(PersistentBag.java:278)
    at org.myorg.jpatickets.ejbclient.EventMgmtIT.eventTouchedSome(EventMgmtIT.java:72)
Update eventTouchedSome() to call getEventTouchedSome() instead of getEvent().
Figure 56.11. Correct Lazy Load Problem by Pre-Touching on Server-side
jpatickets-labex-test/src/
`-- test
    |-- java
    |   `-- org
    |       `-- myorg
    |           `-- jpatickets
    |               `-- ejbclient
    |                   |-- EventMgmtIT.java
// event=eventMgmt.getEvent(event.getId());
event=eventMgmt.getEventTouchedSome(event.getId());
The difference between the two methods is that getEventTouchedSome() specifically accesses some of the properties of the event in order to trigger the lazy-loads in time, on the server-side.
Figure 56.12. Pre-Touching EJB Implementation
jpatickets-labex-ejb/src/
|-- main
|   |-- java
|   |   `-- org
|   |       `-- myorg
|   |           `-- jpatickets
|   |               `-- ejb
|   |                   |-- EventMgmtEJB.java
@Override
public Event getEventTouchedSome(int id) {
logger.debug("getEventTouchedSome({})", id);
Event event = getEvent(id);
//touch the ticket collection to load tickets prior to marshaling back
event.getTickets().size();
return event;
}
Re-test the application now that you have updated the eventTouchedSome() @Test method. The tests will both pass, but notice that there were more lazy-loads to resolve.
Figure 56.13. Build, Deploy, and Test Pre-Touching Solution
$ mvn clean install
...
Running org.myorg.jpatickets.ejbclient.EventMgmtIT
EventMgmtIT.java:37) -*** eventSerializable ***
EventMgmtIT.java:41) -event.tickets.class=class org.hibernate.collection.internal.PersistentBag
EventMgmtIT.java:48) -*** eventLazy ***
EventMgmtIT.java:58) -caught expected lazy-load exception:org.hibernate.LazyInitializationException: failed to lazily initialize a collection of role: org.myorg.jpatickets.bo.Event.tickets, could not initialize proxy - no Session
EventMgmtIT.java:65) -*** eventTouchedSome ***
EventMgmtIT.java:78) -caught expected lazy-load exception:org.hibernate.LazyInitializationException: could not initialize proxy - no Session
EventMgmtIT.java:78) -caught expected lazy-load exception:org.hibernate.LazyInitializationException: could not initialize proxy - no Session
...
Tests run: 9, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 1.09 sec - in org.myorg.jpatickets.ejbclient.EventMgmtIT
Activate the eventTouchedMore() @Test method to demonstrate a possible solution to completing client access to the necessary data.
Figure 56.14. Activate eventTouchedMore() to Demonstrate Full Load
jpatickets-labex-test/src/
`-- test
    |-- java
    |   `-- org
    |       `-- myorg
    |           `-- jpatickets
    |               `-- ejbclient
    |                   |-- EventMgmtIT.java
@Test
//@Ignore
public void eventTouchedMore() throws UnavailableException {
logger.info("*** eventTouchedMore ***");
Venue venue = venueMgmt.createVenue(tf.makeVenue(), 1, 2, 2);
Event event = eventMgmt.createEvent(tf.makeEvent(), venue);
event=eventMgmt.getEventTouchedMore(event.getId()); //<===
assertNotNull("null tickets for event", event.getTickets());
assertTrue("no tickets for event", event.getTickets().size() > 0);
for (Ticket t: event.getTickets()) {
assertNotNull("no ticket price:" + t, t.getPrice());
assertTrue("unexpected ticket price:" + t, t.getPrice().intValue() > 0);
Seat s = t.getSeat();
assertNotNull("null seat", s);
}
}
This solution works because the getEventTouchedMore() method on the server-side does a more complete walk of the object graph before returning it to the client.
Figure 56.15. Deeper Pre-Touch Solution Allowing More Data Returned
jpatickets-labex-ejb/src/
|-- main
|   |-- java
|   |   `-- org
|   |       `-- myorg
|   |           `-- jpatickets
|   |               |-- dto
|   |               |   `-- EventDTO.java
|   |               `-- ejb
|   |                   |-- EventMgmtEJB.java
@Override
public Event getEventTouchedMore(int id) {
logger.debug("getEventTouchedMore({})", id);
Event event = getEvent(id);
//touch ticket collection and all seats to load both prior to marshaling back
event.getTickets().size();
event.getVenue().getName();
for (Ticket t: event.getTickets()) {
t.getSeat().getPosition();
}
return event;
}
Re-test the application with the eventTouchedMore() @Test method activated. This test will pass because the client now receives a serialized object that was fully loaded prior to being marshaled back.
Figure 56.16. Build, Deploy, and Test Deeper Pre-Touch Solution
$ mvn clean install -Dit.test=org.myorg.jpatickets.ejbclient.EventMgmtIT#eventTouchedMore
...
00:50:33,856 INFO (EventMgmtIT.java:86) -*** eventTouchedMore ***
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.928 sec - in org.myorg.jpatickets.ejbclient.EventMgmtIT
...
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
You have completed implementing a solution for lazy-load exceptions that occur on the client when marshaling back managed entity classes. In this solution the server-side pre-touched every entity that was required in the response to the client. It worked functionally, but if you were paying attention to the activity on the server-side you should have noticed a lot of extra database activity going on to resolve those lazy-loads. This approach is inefficient and we will look at how to improve it in a later section of this exercise.
Figure 56.17. Excessive Server-side Lazy Loads during Pre-Touching
EventMgmtEJB] (EJB default - 8) *** EventMgmtEJB:init(752888223) ***
EventMgmtEJB] (EJB default - 8) getEventTouchedMore(182)
EventMgmtEJB] (EJB default - 8) getEvent(182)
Hibernate: select event0_.EVENT_ID as EVENT_ID1_0_0_, event0_.EVENT_NAME as EVENT_NA2_0_0_, event0_.START_TIME as START_TI3_0_0_, event0_.VENUE_ID as VENUE_ID4_0_0_ from JPATICKETS_EVENT event0_ where event0_.EVENT_ID=?
Hibernate: select tickets0_.EVENT_ID as EVENT_ID1_0_0_, tickets0_.EVENT_ID as EVENT_ID1_1_0_, tickets0_.VENUE_ID as VENUE_ID0_1_0_, tickets0_.SECTION as SECTION0_1_0_, tickets0_.ROW as ROW0_1_0_, tickets0_.POSITION as POSITION0_1_0_, tickets0_.EVENT_ID as EVENT_ID1_1_1_, tickets0_.VENUE_ID as VENUE_ID0_1_1_, tickets0_.SECTION as SECTION0_1_1_, tickets0_.ROW as ROW0_1_1_, tickets0_.POSITION as POSITION0_1_1_, tickets0_.VENUE_ID as VENUE_ID4_1_1_, tickets0_.SECTION as SECTION5_1_1_, tickets0_.ROW as ROW6_1_1_, tickets0_.POSITION as POSITION7_1_1_, tickets0_.PRICE as PRICE2_1_1_, tickets0_.SOLD as SOLD3_1_1_ from JPATICKETS_TICKET tickets0_ where tickets0_.EVENT_ID=?
Hibernate: select venue0_.VENUE_ID as VENUE_ID1_2_0_, venue0_.CITY as CITY2_2_0_, venue0_.STATE as STATE3_2_0_, venue0_.STREET as STREET4_2_0_, venue0_.POSTAL_CODE as POSTAL_C5_2_0_, venue0_.NAME as NAME6_2_0_ from JPATICKETS_VENUE venue0_ where venue0_.VENUE_ID=?
Hibernate: select seat0_.POSTION as POSTION1_3_0_, seat0_.ROW as ROW2_3_0_, seat0_.SECTION as SECTION3_3_0_, seat0_.VENUE_ID as VENUE_ID4_3_0_ from JPATICKET_SEAT seat0_ where seat0_.POSTION=? and seat0_.ROW=? and seat0_.SECTION=? and seat0_.VENUE_ID=?
Hibernate: select seat0_.POSTION as POSTION1_3_0_, seat0_.ROW as ROW2_3_0_, seat0_.SECTION as SECTION3_3_0_, seat0_.VENUE_ID as VENUE_ID4_3_0_ from JPATICKET_SEAT seat0_ where seat0_.POSTION=? and seat0_.ROW=? and seat0_.SECTION=? and seat0_.VENUE_ID=?
Hibernate: select seat0_.POSTION as POSTION1_3_0_, seat0_.ROW as ROW2_3_0_, seat0_.SECTION as SECTION3_3_0_, seat0_.VENUE_ID as VENUE_ID4_3_0_ from JPATICKET_SEAT seat0_ where seat0_.POSTION=? and seat0_.ROW=? and seat0_.SECTION=? and seat0_.VENUE_ID=?
Hibernate: select seat0_.POSTION as POSTION1_3_0_, seat0_.ROW as ROW2_3_0_, seat0_.SECTION as SECTION3_3_0_, seat0_.VENUE_ID as VENUE_ID4_3_0_ from JPATICKET_SEAT seat0_ where seat0_.POSTION=? and seat0_.ROW=? and seat0_.SECTION=? and seat0_.VENUE_ID=?
EventMgmtEJB] (EJB default - 8) *** EventMgmtEJB:destroy(752888223) ***
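The pattern behind all this extra SQL is the classic "N+1 selects" problem: one query for the event, one for the tickets collection, plus one additional query for each lazily-touched seat. The arithmetic can be sketched with a small, self-contained counter (hypothetical classes, not the lab code):

```java
import java.util.ArrayList;
import java.util.List;

public class NPlusOneSketch {
    static int queryCount = 0;                 // stands in for SQL statements issued

    // a lazily-loaded child: first access costs one simulated "query"
    static class LazySeat {
        private boolean loaded = false;
        String position() {
            if (!loaded) { queryCount++; loaded = true; }  // simulated lazy-load SELECT
            return "A1";
        }
    }

    public static void main(String[] args) {
        int numTickets = 4;
        List<LazySeat> seats = new ArrayList<>();
        for (int i = 0; i < numTickets; i++) seats.add(new LazySeat());

        queryCount++;                          // 1 query: load the event
        queryCount++;                          // 1 query: load the tickets collection
        for (LazySeat s : seats) s.position(); // N queries: one per pre-touched seat

        System.out.println("queries=" + queryCount); // 2 + N = 6 for N=4
    }
}
```

With two base queries plus one per seat, the cost of pre-touching grows linearly with the size of the object graph -- which is exactly the shape of the server log above.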
Let's address a possible solution to two problems: lazy-loads and provider classes. In a previous section we added to the client's classpath to resolve the provider class(es). In this section we will take a different approach and cleanse the managed entity by creating a new instance and copying over the data. This will not only solve our provider class problem -- it will also coincidentally solve our lazy-load issue because any information we copy into the pure POJOs is accessed while still within the server-side persistence context.
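The essence of the cleansing technique can be demonstrated without Hibernate at all: copy the state of the "managed" instance into a fresh object backed by plain JDK collections. A minimal sketch using hypothetical stand-in classes (not the lab's Event/Ticket entities):

```java
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

public class CleansingSketch {
    static class Event {
        int id;
        List<Integer> ticketPrices;  // in JPA this would be a provider-supplied collection
        Event(int id, List<Integer> prices) { this.id = id; this.ticketPrices = prices; }
    }

    // build a new POJO and copy state -- the copy references only JDK classes
    static Event toCleansed(Event managed) {
        return new Event(managed.id, new ArrayList<>(managed.ticketPrices));
    }

    public static void main(String[] args) {
        // LinkedList plays the role of org.hibernate...PersistentBag here
        Event managed = new Event(1, new LinkedList<>(List.of(25, 50)));
        Event cleansed = toCleansed(managed);
        System.out.println(managed.ticketPrices.getClass().getSimpleName());  // LinkedList
        System.out.println(cleansed.ticketPrices.getClass().getSimpleName()); // ArrayList
    }
}
```

The copy constructor `new ArrayList<>(...)` iterates the source collection, so in the real EJB this copying is also what forces any lazy state to resolve while the session is still open.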
Activate the eventCleansed() @Test. This test will inspect the class type of the tickets collection returned and fail if the getter() returns a provider-based class.
Figure 56.18. Activate eventCleansed() @Test
jpatickets-labex-test/src/
`-- test
    |-- java
    |   `-- org
    |       `-- myorg
    |           `-- jpatickets
    |               `-- ejbclient
    |                   |-- EventMgmtIT.java
@Test
//@Ignore
public void eventCleansed() throws UnavailableException {
logger.info("*** eventCleansed ***");
Venue venue = venueMgmt.createVenue(tf.makeVenue(), 1, 2, 2);
Event event = eventMgmt.createEvent(tf.makeEvent(), venue);
logger.info("event.tickets.class={}", event.getTickets().getClass());
assertTrue("missing provider class", event.getTickets().getClass().getName().contains("org.hibernate"));
event=eventMgmt.getEvent(event.getId());
// event=eventMgmt.getEventCleansed(event.getId());
logger.info("(cleansed)event.tickets.class={}", event.getTickets().getClass());
assertFalse("unexpected provider class", event.getTickets().getClass().getName().contains("org.hibernate"));
}
Re-test the application with the eventCleansed() @Test activated. This test will fail because the getter() returned an object instance that contained a provider class.
Figure 56.19. Uncleansed Object Returned
$ mvn clean install
EventMgmtIT.java:47) -*** eventCleansed ***
EventMgmtIT.java:51) -event.tickets.class=class org.hibernate.collection.internal.PersistentBag
EventMgmtIT.java:55) -(cleansed)event.tickets.class=class org.hibernate.collection.internal.PersistentBag
Tests run: 9, Failures: 1, Errors: 0, Skipped: 7, Time elapsed: 0.965 sec <<< FAILURE! - in org.myorg.jpatickets.ejbclient.EventMgmtIT
eventCleansed(org.myorg.jpatickets.ejbclient.EventMgmtIT) Time elapsed: 0.343 sec <<< FAILURE!
java.lang.AssertionError: unexpected provider class
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.assertTrue(Assert.java:41)
at org.junit.Assert.assertFalse(Assert.java:64)
at org.myorg.jpatickets.ejbclient.EventMgmtIT.eventCleansed(EventMgmtIT.java:56)
Update the implementation of the eventCleansed() @Test method to call getEventCleansed() instead of getEvent()
Figure 56.20. Update @Test to obtain DTO Cleansed of Provider Classes
jpatickets-labex-test/src/
`-- test
    |-- java
    |   `-- org
    |       `-- myorg
    |           `-- jpatickets
    |               `-- ejbclient
    |                   |-- EventMgmtIT.java
// event=eventMgmt.getEvent(event.getId());
event=eventMgmt.getEventCleansed(event.getId());
The new call invokes a wrapper method within the EJB that creates a new Event POJO and copies over state information from the managed Event entity. The chain usually continues (e.g., to Tickets) until we have created clean POJO clones free of all provider classes.
Figure 56.21. EventMgmtEJB.getCleansedEvent()
jpatickets-labex-ejb/src/
|-- main
|   |-- java
|   |   `-- org
|   |       `-- myorg
|   |           `-- jpatickets
|   |               `-- ejb
|   |                   |-- EventMgmtEJB.java
@Override
public Event getEventCleansed(int id) {
logger.debug("getCleansedEvent({})", id);
Event event = getEvent(id);
return toCleansed(event);
}
private Event toCleansed(Event bo) {
Event pojo = new Event(bo.getId());
pojo.setName(bo.getName());
pojo.setStartTime(bo.getStartTime());
List<Ticket> tickets = new ArrayList<>(bo.getTickets().size());
for (Ticket t: bo.getTickets()) {
tickets.add(toCleansed(t, pojo)); //add each cleansed ticket to the new collection
}
pojo.setTickets(tickets);
return pojo;
}
private Ticket toCleansed(Ticket bo, Event event) {
//example cleansing is stopping here for the example
Ticket pojo = new Ticket(event, bo.getSeat());
pojo.setPrice(bo.getPrice());
pojo.setSold(bo.isSold());
return pojo;
}
Re-test the application now that the client has been updated to call getEventCleansed() to get a POJO that does not contain provider classes. This test should now pass.
Figure 56.22. Build, Deploy, and Test DTO Cleansing Solution
$ mvn clean install
...
Running org.myorg.jpatickets.ejbclient.EventMgmtIT
...
00:01:28,198 INFO (EventMgmtIT.java:47) -*** eventCleansed ***
00:01:28,265 INFO (EventMgmtIT.java:51) -event.tickets.class=class org.hibernate.collection.internal.PersistentBag
00:01:28,293 INFO (EventMgmtIT.java:55) -(cleansed)event.tickets.class=class java.util.ArrayList
Tests run: 9, Failures: 0, Errors: 0, Skipped: 7, Time elapsed: 0.516 sec - in org.myorg.jpatickets.ejbclient.EventMgmtIT
You have completed a solution approach where the server creates a cleansed POJO to return to the client rather than requiring an update to the client classpath. However, look at the lazy-load activity we triggered on the server side. This likely occurred during the previous solution as well, but it is easier to spot with the explicit cloning calls being made within the EJB. We will look to optimize this in a future section of this exercise. Just know for now that this solution is not 100% perfect.
Figure 56.23. BO/DTO Cleansing Triggering Inefficient Lazy Loads
EventMgmtEJB] (EJB default - 6) *** EventMgmtEJB:init(1746126146) ***
EventMgmtEJB] (EJB default - 6) getCleansedEvent(161)
EventMgmtEJB] (EJB default - 6) getEvent(161)
[stdout] (EJB default - 6) Hibernate: select event0_.EVENT_ID as EVENT_ID1_0_0_, event0_.EVENT_NAME as EVENT_NA2_0_0_, event0_.START_TIME as START_TI3_0_0_, event0_.VENUE_ID as VENUE_ID4_0_0_ from JPATICKETS_EVENT event0_ where event0_.EVENT_ID=?
[stdout] (EJB default - 6) Hibernate: select tickets0_.EVENT_ID as EVENT_ID1_0_0_, tickets0_.EVENT_ID as EVENT_ID1_1_0_, tickets0_.VENUE_ID as VENUE_ID0_1_0_, tickets0_.SECTION as SECTION0_1_0_, tickets0_.ROW as ROW0_1_0_, tickets0_.POSITION as POSITION0_1_0_, tickets0_.EVENT_ID as EVENT_ID1_1_1_, tickets0_.VENUE_ID as VENUE_ID0_1_1_, tickets0_.SECTION as SECTION0_1_1_, tickets0_.ROW as ROW0_1_1_, tickets0_.POSITION as POSITION0_1_1_, tickets0_.VENUE_ID as VENUE_ID4_1_1_, tickets0_.SECTION as SECTION5_1_1_, tickets0_.ROW as ROW6_1_1_, tickets0_.POSITION as POSITION7_1_1_, tickets0_.PRICE as PRICE2_1_1_, tickets0_.SOLD as SOLD3_1_1_ from JPATICKETS_TICKET tickets0_ where tickets0_.EVENT_ID=?
EventMgmtEJB] (EJB default - 6) *** EventMgmtEJB:destroy(1746126146) ***
In this section we will enlist some direct help from the DAO to fetch data from the database in a manner that is more efficient and tuned for returning information to the caller.
Activate the eventFetchedSome() @Test method. This implementation will allow the client to access additional information but not all of the event information without additional work.
Figure 56.24. Activate eventFetchSome() @Test
jpatickets-labex-test/src/
`-- test
    |-- java
    |   `-- org
    |       `-- myorg
    |           `-- jpatickets
    |               `-- ejbclient
    |                   |-- EventMgmtIT.java
@Test
//@Ignore
public void eventFetchedSome() throws UnavailableException {
logger.info("*** eventFetchedSome ***");
Venue venue = venueMgmt.createVenue(tf.makeVenue(), 1, 2, 2);
Event event = eventMgmt.createEvent(tf.makeEvent(), venue);
event=eventMgmt.getEventFetchedSome(event.getId());
assertNotNull("null tickets for event", event.getTickets());
assertTrue("no tickets for event", event.getTickets().size() > 0);
for (Ticket t: event.getTickets()) {
try {
assertNotNull("no ticket price:" + t, t.getPrice());
fail("did not get expected lazy-load exception");
} catch (Exception ex) {
logger.info("caught expected lazy-load exception:" + ex);
}
}
}
The above client call will now be calling a JPA-QL query instead of a simple em.find(). In the figure below I have removed the business logic and DAO layers from the call and exposed the raw NamedQuery -- which is defined in the Event @Entity class. The JPA-QL performs a "join fetch" on the tickets to pre-load that collection. That allows the client to call event.getTickets().size() without suffering a lazy-load error and it allows the information to be more efficiently accessed from the database.
Figure 56.25. Partial Pre-Fetching Solution
jpatickets-labex-ejb/src/
|-- main
|   |-- java
|   |   `-- org
|   |       `-- myorg
|   |           `-- jpatickets
|   |               `-- ejb
|   |                   |-- EventMgmtEJB.java
@Override
public Event getEventFetchedSome(int id) {
logger.debug("getEventFetchedSome({})", id);
List<Event> events = em.createNamedQuery("JPATicketEvent.fetchEventTickets",
Event.class)
.setParameter("eventId", id)
.getResultList();
return events.isEmpty() ? null : events.get(0);
}
jpatickets-labex-impl/src/
|-- main
|   `-- java
|       `-- org
|           `-- myorg
|               `-- jpatickets
|                   |-- Event.java
@Entity
@NamedQueries({
@NamedQuery(name="JPATicketEvent.fetchEventTickets",
query="select e from Event e "
+ "join fetch e.tickets "
+ "where e.id=:eventId")
})
public class Event implements Serializable /* */ {
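One caveat about collection fetch joins worth noting: with Hibernate 5-era providers, getResultList() can return the parent Event once per joined Ticket row, which is why implementations commonly add "distinct" to the JPA-QL or, as here for a single-id lookup, simply take the first element. The row-duplication effect can be sketched with plain collections (the ids below are illustrative, not from the lab):

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;

public class FetchJoinRowsSketch {
    public static void main(String[] args) {
        // a collection fetch join produces one JDBC row per (event, ticket) pair,
        // so the same event id can appear once per ticket in the result list
        List<Integer> eventIdsPerRow = List.of(42, 42, 42, 42); // event 42 joined to 4 tickets

        // de-duplicating, as "select distinct" (or taking get(0) for a
        // single-id lookup) effectively does
        List<Integer> distinctEvents = new ArrayList<>(new LinkedHashSet<>(eventIdsPerRow));

        System.out.println("rows=" + eventIdsPerRow.size());   // rows=4
        System.out.println("events=" + distinctEvents.size()); // events=1
    }
}
```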
Test the newly activated eventFetchedSome() @Test method and notice how the subsequent query for the tickets after getting the event has been replaced by a single, more complex query -- which should be faster because of the reduced number of database calls.
Figure 56.26. Build, Deploy, and Test Partial Pre-Fetching Solution
$ mvn clean install -Dit.test=org.myorg.jpatickets.ejbclient.EventMgmtIT#eventFetchedSome
...
EventMgmtIT.java:121) -*** eventFetchedSome ***
EventMgmtIT.java:133) -caught expected lazy-load exception:org.hibernate.LazyInitializationException: could not initialize proxy - no Session
...
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.175 sec - in org.myorg.jpatickets.ejbclient.EventMgmtIT
01:04:21,380 INFO (ChannelAssociation.java:458) -EJBCLIENT000016: Channel Channel ID e5553882 (outbound) of Remoting connection 374287a9 to localhost/127.0.0.1:8080 can no longer process messages
Results :
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
EventMgmtEJB] (EJB default - 2) *** EventMgmtEJB:init(533059749) ***
EventMgmtEJB] (EJB default - 2) getEventFetchedSome(183)
Hibernate: select event0_.EVENT_ID as EVENT_ID1_0_0_, tickets1_.EVENT_ID as EVENT_ID1_1_1_, tickets1_.VENUE_ID as VENUE_ID0_1_1_, tickets1_.SECTION as SECTION0_1_1_, tickets1_.ROW as ROW0_1_1_, tickets1_.POSITION as POSITION0_1_1_, event0_.EVENT_NAME as EVENT_NA2_0_0_, event0_.START_TIME as START_TI3_0_0_, event0_.VENUE_ID as VENUE_ID4_0_0_, tickets1_.VENUE_ID as VENUE_ID4_1_1_, tickets1_.SECTION as SECTION5_1_1_, tickets1_.ROW as ROW6_1_1_, tickets1_.POSITION as POSITION7_1_1_, tickets1_.PRICE as PRICE2_1_1_, tickets1_.SOLD as SOLD3_1_1_, tickets1_.EVENT_ID as EVENT_ID1_0_0__, tickets1_.EVENT_ID as EVENT_ID1_1_0__, tickets1_.VENUE_ID as VENUE_ID0_1_0__, tickets1_.SECTION as SECTION0_1_0__, tickets1_.ROW as ROW0_1_0__, tickets1_.POSITION as POSITION0_1_0__ from JPATICKETS_EVENT event0_ inner join JPATICKETS_TICKET tickets1_ on event0_.EVENT_ID=tickets1_.EVENT_ID where event0_.EVENT_ID=?
Lets take this a step further and attempt to optimize getting more information from the event. Activate the eventFetchedMore() @Test method.
Figure 56.27. Activate eventFetchMore() @Test
jpatickets-labex-test/src/
`-- test
    |-- java
    |   `-- org
    |       `-- myorg
    |           `-- jpatickets
    |               `-- ejbclient
    |                   |-- EventMgmtIT.java
@Test
//@Ignore
public void eventFetchedMore() throws UnavailableException {
logger.info("*** eventFetchedMore ***");
Venue venue = venueMgmt.createVenue(tf.makeVenue(), 1, 2, 2);
Event event = eventMgmt.createEvent(tf.makeEvent(), venue);
event=eventMgmt.getEventFetchedMore(event.getId());
assertNotNull("null tickets for event", event.getTickets());
assertTrue("no tickets for event", event.getTickets().size() > 0);
for (Ticket t: event.getTickets()) {
assertNotNull("no ticket price:" + t, t.getPrice());
assertTrue("unexpected ticket price:" + t, t.getPrice().intValue() > 0);
Seat s = t.getSeat();
assertNotNull("null seat", s);
}
}
The above call succeeds because the EJB and supporting DAO execute a deeper "join fetch" to satisfy the more complete information need.
Figure 56.28. Deeper Pre-Fetching Solution
jpatickets-labex-ejb/src/
|-- main
|   |-- java
|   |   `-- org
|   |       `-- myorg
|   |           `-- jpatickets
|   |               |-- dto
|   |               |   `-- EventDTO.java
|   |               `-- ejb
|   |                   |-- EventMgmtEJB.java
@Override
public Event getEventFetchedMore(int id) {
logger.debug("getEventFetchedMore({})", id);
List<Event> events = em.createNamedQuery("JPATicketEvent.fetchEventTicketsSeats",
Event.class)
.setParameter("eventId", id)
.getResultList();
return events.isEmpty() ? null : events.get(0);
}
jpatickets-labex-impl/src/
|-- main
|   `-- java
|       `-- org
|           `-- myorg
|               `-- jpatickets
|                   |-- Event.java
@Entity
@NamedQueries({
@NamedQuery(name="JPATicketEvent.fetchEventTicketsSeats",
query="select e from Event e "
+ "join fetch e.venue "
+ "join fetch e.tickets t "
+ "join fetch t.seat "
+ "where e.id=:eventId")
})
public class Event implements Serializable /* */ {
Test the newly activated eventFetchedMore() @Test and notice how the information was obtained from the database. This query by itself may be expensive, but it will be faster than the EJB tier issuing multiple, separate queries to obtain the same information.
Figure 56.29. Build, Deploy, and Test Deeper Pre-Fetching Solution
$ mvn clean install -Dit.test=org.myorg.jpatickets.ejbclient.EventMgmtIT#eventFetchedMore
...
01:29:52,688 INFO (EventMgmtIT.java:141) -*** eventFetchedMore ***
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.812 sec - in org.myorg.jpatickets.ejbclient.EventMgmtIT
EventMgmtEJB] (EJB default - 6) *** EventMgmtEJB:init(454399101) ***
EventMgmtEJB] (EJB default - 6) getEventFetchedMore(184)
Hibernate: select event0_.EVENT_ID as EVENT_ID1_0_0_, venue1_.VENUE_ID as VENUE_ID1_2_1_, tickets2_.EVENT_ID as EVENT_ID1_1_2_, tickets2_.VENUE_ID as VENUE_ID0_1_2_, tickets2_.SECTION as SECTION0_1_2_, tickets2_.ROW as ROW0_1_2_, tickets2_.POSITION as POSITION0_1_2_, seat3_.POSTION as POSTION1_3_3_, seat3_.ROW as ROW2_3_3_, seat3_.SECTION as SECTION3_3_3_, seat3_.VENUE_ID as VENUE_ID4_3_3_, event0_.EVENT_NAME as EVENT_NA2_0_0_, event0_.START_TIME as START_TI3_0_0_, event0_.VENUE_ID as VENUE_ID4_0_0_, venue1_.CITY as CITY2_2_1_, venue1_.STATE as STATE3_2_1_, venue1_.STREET as STREET4_2_1_, venue1_.POSTAL_CODE as POSTAL_C5_2_1_, venue1_.NAME as NAME6_2_1_, tickets2_.VENUE_ID as VENUE_ID4_1_2_, tickets2_.SECTION as SECTION5_1_2_, tickets2_.ROW as ROW6_1_2_, tickets2_.POSITION as POSITION7_1_2_, tickets2_.PRICE as PRICE2_1_2_, tickets2_.SOLD as SOLD3_1_2_, tickets2_.EVENT_ID as EVENT_ID1_0_0__, tickets2_.EVENT_ID as EVENT_ID1_1_0__, tickets2_.VENUE_ID as VENUE_ID0_1_0__, tickets2_.SECTION as SECTION0_1_0__, tickets2_.ROW as ROW0_1_0__, tickets2_.POSITION as POSITION0_1_0__ from JPATICKETS_EVENT event0_ inner join JPATICKETS_VENUE venue1_ on event0_.VENUE_ID=venue1_.VENUE_ID inner join JPATICKETS_TICKET tickets2_ on event0_.EVENT_ID=tickets2_.EVENT_ID inner join JPATICKET_SEAT seat3_ on tickets2_.VENUE_ID=seat3_.POSTION and tickets2_.SECTION=seat3_.ROW and tickets2_.ROW=seat3_.SECTION and tickets2_.POSITION=seat3_.VENUE_ID where event0_.EVENT_ID=?
EventMgmtEJB] (EJB default - 6) *** EventMgmtEJB:destroy(454399101) ***
You have completed a solution for lazy-loads that can be more efficient than simply pre-touching the managed entities prior to marshaling them back to the client. If there is no other reason for bringing up this topic -- it is to introduce the concept of the DAO supplying methods that help support the construction of the DTOs that are necessary to express parts of the service/system to remote clients.
In this section we will look at adding a new implementation approach for the DTO pattern player. We will implement the DTO using a separate, Serializable class. This allows the service to separate the database mapping and business implementation concerns of the BO/entity from the information transferred to remote clients, and to design client-appropriate DTO views that are independent of the data tier implementation.
In this section we are going to start with a brute force technique and then look to add direct DAO support. The brute force technique will look a lot like the cleansed DTO approach except we are using an entirely different class.
Activate the eventLazyDTO() @Test method. This method will receive a new POJO DTO class instance from the server that summarizes the key information a client needs to know about an Event.
Figure 56.30. Activate Brute Force DTO @Test
jpatickets-labex-test/src/
`-- test
    |-- java
    |   `-- org
    |       `-- myorg
    |           `-- jpatickets
    |               `-- ejbclient
    |                   |-- EventMgmtIT.java
@Test
//@Ignore
public void eventLazyDTO() throws UnavailableException {
logger.info("*** eventLazyDTO ***");
Venue venue = venueMgmt.createVenue(tf.makeVenue(), 1, 2, 2);
Event event = tf.makeEvent();
int eventId = eventMgmt.createEvent(tf.makeEvent(), venue).getId();
EventDTO dto=eventMgmt.getEventLazyDTO(eventId);
logger.debug("eventDTO={}", dto);
assertEquals("unexpected eventName", event.getName(), dto.getEventName());
assertEquals("unexpected venueName", venue.getName(), dto.getVenueName());
assertEquals("unexpected startTime", event.getStartTime(), dto.getStartTime());
assertTrue("no tickets for event", dto.getNumTickets() > 0);
}
The EventDTO removes the need for the Venue entity and just carries the venueName. It removes the need to carry all the tickets and replaces that with just the number of tickets.
Figure 56.31. EventDTO Class
jpatickets-labex-ejb/src/
|-- main
|   |-- java
|   |   `-- org
|   |       `-- myorg
|   |           `-- jpatickets
|   |               |-- dto
|   |               |   `-- EventDTO.java
public class EventDTO implements Serializable {
private int id;
private String eventName;
private String venueName;
private Date startTime;
private int numTickets;
I called this first implementation "brute-force" earlier in the introduction. That is because the EJB method simply walks the event managed entity and grabs what it needs -- loaded or not. As you should expect, this will cause a significant number of lazy-loads. We will improve shortly.
Figure 56.32. Brute-Force getEventLazyDTO()
jpatickets-labex-ejb/src/
|-- main
|   |-- java
|   |   `-- org
|   |       `-- myorg
|   |           `-- jpatickets
|   |               |-- dto
|   |               |   `-- EventDTO.java
@Override
public EventDTO getEventLazyDTO(int id) {
logger.debug("getEventDTO({})", id);
Event event = eventMgmt.getEvent(id);
return toEventDTO(event);
}
private EventDTO toEventDTO(Event event) {
EventDTO dto = new EventDTO();
dto.setId(event.getId());
dto.setEventName(event.getName());
dto.setVenueName(event.getVenue().getName());
dto.setStartTime(event.getStartTime());
dto.setNumTickets(event.getTickets().size());
return dto;
}
Run the newly activated eventLazyDTO() @Test method. Notice there was a database access for the Event, a second for the Venue (to get the venueName), and a third for the Tickets (to get the count of tickets).
Figure 56.33. Lazy Loads from Brute Force DTO Construction
$ mvn clean install -Dit.test=org.myorg.jpatickets.ejbclient.EventMgmtIT#eventLazyDTO
...
01:54:23,564 INFO (EventMgmtIT.java:159) -*** eventLazyDTO ***
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.853 sec - in org.myorg.jpatickets.ejbclient.EventMgmtIT
EventMgmtEJB] (EJB default - 4) *** EventMgmtEJB:init(1695166532) ***
EventMgmtEJB] (EJB default - 4) getEventDTO(186)
Hibernate: select event0_.EVENT_ID as EVENT_ID1_0_0_, event0_.EVENT_NAME as EVENT_NA2_0_0_, event0_.START_TIME as START_TI3_0_0_, event0_.VENUE_ID as VENUE_ID4_0_0_ from JPATICKETS_EVENT event0_ where event0_.EVENT_ID=?
Hibernate: select venue0_.VENUE_ID as VENUE_ID1_2_0_, venue0_.CITY as CITY2_2_0_, venue0_.STATE as STATE3_2_0_, venue0_.STREET as STREET4_2_0_, venue0_.POSTAL_CODE as POSTAL_C5_2_0_, venue0_.NAME as NAME6_2_0_ from JPATICKETS_VENUE venue0_ where venue0_.VENUE_ID=?
Hibernate: select tickets0_.EVENT_ID as EVENT_ID1_0_0_, tickets0_.EVENT_ID as EVENT_ID1_1_0_, tickets0_.VENUE_ID as VENUE_ID0_1_0_, tickets0_.SECTION as SECTION0_1_0_, tickets0_.ROW as ROW0_1_0_, tickets0_.POSITION as POSITION0_1_0_, tickets0_.EVENT_ID as EVENT_ID1_1_1_, tickets0_.VENUE_ID as VENUE_ID0_1_1_, tickets0_.SECTION as SECTION0_1_1_, tickets0_.ROW as ROW0_1_1_, tickets0_.POSITION as POSITION0_1_1_, tickets0_.VENUE_ID as VENUE_ID4_1_1_, tickets0_.SECTION as SECTION5_1_1_, tickets0_.ROW as ROW6_1_1_, tickets0_.POSITION as POSITION7_1_1_, tickets0_.PRICE as PRICE2_1_1_, tickets0_.SOLD as SOLD3_1_1_ from JPATICKETS_TICKET tickets0_ where tickets0_.EVENT_ID=?
EventMgmtEJB] (EJB default - 4) *** EventMgmtEJB:destroy(1695166532) ***
Activate the last @Test within EventMgmtIT.java: eventFetchedDTO(). This @Test demonstrates what can be done to add DAO support to build DTO responses.
Figure 56.34. Activate Fetched DTO @Test
jpatickets-labex-test/src/
`-- test
    |-- java
    |   `-- org
    |       `-- myorg
    |           `-- jpatickets
    |               `-- ejbclient
    |                   |-- EventMgmtIT.java
@Test
//@Ignore
public void eventFetchedDTO() throws UnavailableException {
logger.info("*** eventFetchedDTO ***");
Venue venue = venueMgmt.createVenue(tf.makeVenue(), 1, 2, 2);
Event event = tf.makeEvent();
int eventId = eventMgmt.createEvent(tf.makeEvent(), venue).getId();
EventDTO dto=eventMgmt.getEventFetchedDTO(eventId);
logger.debug("eventDTO={}", dto);
assertEquals("unexpected eventName", event.getName(), dto.getEventName());
assertEquals("unexpected venueName", venue.getName(), dto.getVenueName());
assertEquals("unexpected startTime", event.getStartTime(), dto.getStartTime());
assertTrue("no tickets for event", dto.getNumTickets() > 0);
}
The EJB invokes a NamedQuery on the DAO (layers have been removed for clarity) that is tuned to provide the information required for the EventDTO. The DAO returns the information (event, venueName, and numTickets) in a Map since the DTO is located in the EJB module and not accessible to the DAO code in the Impl module. The EJB uses the information from the map to populate the DTO prior to returning it to the client.
Figure 56.35. Fetched DTO EJB/DAO Calls
jpatickets-labex-ejb/src/
|-- main
|   |-- java
|   |   `-- org
|   |       `-- myorg
|   |           `-- jpatickets
|   |               |-- dto
|   |               |   `-- EventDTO.java
|   |               `-- ejb
|   |                   |-- EventMgmtEJB.java
@Override
public EventDTO getEventFetchedDTO(int eventId) {
@SuppressWarnings("unchecked")
List<Object[]> rows = em.createNamedQuery("JPATicketEvent.fetchEventDTO")
.setParameter("eventId", eventId)
.getResultList();
Map<String, Object> dtoData = new HashMap<String, Object>();
if (!rows.isEmpty()) {
Object[] row = rows.get(0);
Event event = (Event) row[0];
String venueName = (String) row[1];
Number numTickets = (Number) row[2];
dtoData.put(EventMgmtDAO.EVENT, event);
dtoData.put(EventMgmtDAO.VENUE_NAME, venueName);
dtoData.put(EventMgmtDAO.NUM_TICKETS, numTickets.intValue());
}
return toEventDTO(dtoData);
}
private EventDTO toEventDTO(Map<String, Object> dtoData) {
EventDTO dto = new EventDTO();
Event event = (Event) dtoData.get(EventMgmtDAO.EVENT);
String venueName = (String) dtoData.get(EventMgmtDAO.VENUE_NAME);
int numTickets = (Integer) dtoData.get(EventMgmtDAO.NUM_TICKETS);
dto.setId(event.getId());
dto.setEventName(event.getName());
dto.setStartTime(event.getStartTime());
dto.setVenueName(venueName);
dto.setNumTickets(numTickets);
return dto;
}
To complete our exercise, I will also explain the DAO query. The DAO uses a @NamedNativeQuery with custom native SQL and a @SqlResultSetMapping. The native SQL is used to obtain an Event entity, the name of the Venue, and perform an aggregate count() of the tickets within the DB. The @SqlResultSetMapping is used to realize a managed Event instance, a venueName String, and numTickets Number from the returned columns. If you look back at the EJB/DAO processing above you will see the information coming back in three (3) elements of an Object array.
Figure 56.36. Fetched DTO JPA Support
jpatickets-labex-impl/src/
|-- main
|   `-- java
|       `-- org
|           `-- myorg
|               `-- jpatickets
|                   |-- Event.java
@Entity
@Table(name="JPATICKETS_EVENT")
@NamedNativeQueries({
@NamedNativeQuery(name="JPATicketEvent.fetchEventDTO",
query="select event.EVENT_ID, event.EVENT_NAME, event.START_TIME, event.VENUE_ID, "
+ "venue.NAME venueName, count(ticket.*) numTickets "
+ "from JPATICKETS_EVENT event "
+ "join JPATICKETS_VENUE venue on venue.VENUE_ID = event.VENUE_ID "
+ "join JPATICKETS_TICKET ticket on ticket.EVENT_ID = event.EVENT_ID "
+ "where event.EVENT_ID = :eventId "
+ "group by event.EVENT_ID, event.EVENT_NAME, event.START_TIME, event.VENUE_ID, venue.NAME",
resultSetMapping="JPATicketEvent.EventDTOMapping")
})
@SqlResultSetMappings({
@SqlResultSetMapping(name="JPATicketEvent.EventDTOMapping",
entities={
@EntityResult(entityClass=Event.class)
},
columns={
@ColumnResult(name="venueName", type=String.class),
@ColumnResult(name="numTickets", type=Long.class)
}
)
})
public class Event implements Serializable /* */ {
Test the application using the newly activated eventFetchedDTO() @Test method. This should result in a very efficient and recognizable query issued to the DB for exactly the information we need for the DTO.
Figure 56.37. Build, Deploy, and Test Fetched DTO Solution
$ mvn clean install -Dit.test=org.myorg.jpatickets.ejbclient.EventMgmtIT#eventFetchedDTO
...
02:03:18,325 INFO (EventMgmtIT.java:175) -*** eventFetchedDTO ***
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.519 sec - in org.myorg.jpatickets.ejbclient.EventMgmtIT
EventMgmtEJB] (EJB default - 8) *** EventMgmtEJB:init(793182011) ***
Hibernate: select event.EVENT_ID, event.EVENT_NAME, event.START_TIME, event.VENUE_ID, venue.NAME venueName, count(ticket.*) numTickets from JPATICKETS_EVENT event join JPATICKETS_VENUE venue on venue.VENUE_ID = event.VENUE_ID join JPATICKETS_TICKET ticket on ticket.EVENT_ID = event.EVENT_ID where event.EVENT_ID = ? group by event.EVENT_ID, event.EVENT_NAME, event.START_TIME, event.VENUE_ID, venue.NAME
EventMgmtEJB] (EJB default - 8) *** EventMgmtEJB:destroy(793182011) ***
You have finished implementing the most aggressive/complete solution for forming DTO instances for return to the client. This approach separated the DTO from the BO/entity class so the remote interface was not tied to providing the exact representation to remote clients that it used internally. We implemented the mapping first using brute-force information transfer at the remote facade level and then improved the implementation by creating a DAO query that was tuned to specific responses.
You have finished coverage of various remote interface issues to address when designing remote interfaces for EJBs -- and specifically EJBs that make use of JPA. In our coverage you discovered and implemented solutions for:
Provider classes missing
Lazy-load exceptions on the client
More efficient fetches of DTO information
Separating the DTO and BO/entity implementations into separate classes