
Thursday, November 29, 2012

Weblets 1.3 released

Just a short notice that Weblets 1.3 has been released on Maven Central. To include Weblets in your own project, simply add the following entries to the dependencies section of your pom.xml:

<dependency>
    <groupId>com.github.weblets</groupId>
    <artifactId>weblets-api</artifactId>
    <version>1.3</version>
</dependency>

and for the implementation use the following dependency:

<dependency>
    <groupId>com.github.weblets</groupId>
    <artifactId>weblets-impl</artifactId>
    <version>1.3</version>
</dependency>

For a summary of the new features see either the Weblets documentation or the blog post linked below.

 

Links

[1] Weblets homepage

[2] Weblets 1.2 and 1.3 features blogpost

[3] Weblets documentation

Tuesday, November 27, 2012

TomEE and Maven

Introduction

TomEE is a lightweight application server which provides the full JEE Web Profile. It delivers the following technologies within a simple WAR container:

  • CDI - Apache OpenWebBeans
  • EJB - Apache OpenEJB
  • JPA - Apache OpenJPA
  • JSF - Apache MyFaces
  • JSP - Apache Tomcat
  • JSTL - Apache Tomcat
  • JTA - Apache Geronimo Transaction
  • Servlet - Apache Tomcat
  • Javamail - Apache Geronimo JavaMail
  • Bean Validation - Apache BVal

This means you can simply drop your EJBs into a single WAR and be done with your application. The same goes for CDI beans and entity beans: all reachable from a single WAR.

Now, to provide an example of such a configuration, I have written a three-tier hello world which does the basic things an enterprise application performs:

  • Displays a UI
  • Calls a service layer
  • Calls the database
  • Performs authentication
  • Uses internationalization

And all of it from within a simple Maven start.

This blog entry is a step-by-step guide through this program.

About the program and getting started

To get started, download the program from GitHub via http://www.github.com/werpu/tomeemaven

Once downloaded, simply run it via mvn install tomee:run and then point your browser to http://localhost:8080/tomeemaven. The rest is very barebones and self-explanatory.

You will be brought to a landing page which loads some data over a service layer from the database. Next is a login page which does basic authentication and then brings you into a restricted area.

Now the program itself is as barebones as it gets; the more interesting part is the setup, which this post will guide you through.

Maven Setup

The barebones maven setup is a simple war file:

<groupId>com.github.werpu</groupId>
<artifactId>tomeemaven</artifactId>
<packaging>war</packaging>

Now to the first interesting part. Normally, in a Maven and Tomcat configuration, you have a huge list of dependencies which your app uses. Our setup has reduced the dependency list to two entries.

<dependencies>
    <!-- contains the entire jee api we only use a subset of it-->
    <dependency>
        <groupId>javax</groupId>
        <artifactId>javaee-api</artifactId>
        <version>6.0</version>
        <!-- scope provided means the container delivers it -->
        <scope>provided</scope>
    </dependency>
    <!-- hsqldb references, we start hsqldb as server -->
    <dependency>
        <groupId>org.hsqldb</groupId>
        <artifactId>hsqldb</artifactId>
        <version>2.2.8</version>
        <scope>provided</scope>
    </dependency>
</dependencies>

As you can see, both dependencies have <scope>provided</scope>, which means the application server supplies them at runtime.

The more interesting entry is the javaee-api dependency, which provides the JEE APIs for the code at compile time. We only use a subset of them, but for simplicity the full API include is enough to get you started.

The second entry is the HSQLDB driver, which we need because we start HSQLDB directly in server mode. (Note that TomEE already ships with HSQLDB, so we do not have to bundle anything here.)

The next part is the TomEE Maven plugin:

<plugin>
    <groupId>org.apache.openejb.maven</groupId>
    <artifactId>tomee-maven-plugin</artifactId>
    <version>1.0.0</version>
    <configuration>
        <tomeeVersion>1.5.0</tomeeVersion>
        <tomeeClassifier>plus</tomeeClassifier>
        <debugPort>5005</debugPort>
    </configuration>
</plugin>

This basically is our TomEE starter; we do not have to do anything else about its configuration. With this entry, mvn clean install tomee:run becomes possible and our WAR is automatically picked up as the application WAR.

The last part is the OpenJPA Maven plugin, which performs compile-time bytecode enhancement of our entities.

<plugin>
    <groupId>org.apache.openjpa</groupId>
    <artifactId>openjpa-maven-plugin</artifactId>
    <version>2.2.0</version>
    <executions>
        <execution>
            <id>mappingtool</id>
            <phase>process-classes</phase>
            <goals>
                <goal>enhance</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <includes>
            com/github/werpu/tomeedemo/orm/*
        </includes>
    </configuration>
</plugin>

This is basically the entire Maven setup.

File structure

The folder hierarchy follows a strict Maven setup with small enhancements for the TomEE Maven plugin.

It is as follows:

(Figure: folder hierarchy)

The following folders are of interest:

  • src/main/java — the Java sources
  • src/main/resources — the resources folder hosting beans.xml and persistence.xml; its contents are later compiled into the WEB-INF/classes folder along with the Java sources
  • src/main/webapp — the web application folder
  • src/main/tomee — hosts the TomEE configuration file overrides (more on this important folder later)
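Roughly sketched, the layout looks like this (contents abbreviated; a plain illustration of the list above, not a verbatim listing of the repository):

```text
tomeemaven
├── pom.xml
└── src
    └── main
        ├── java          (Java sources)
        ├── resources     (beans.xml, persistence.xml → WEB-INF/classes)
        ├── tomee         (tomcat-users.xml, tomee.xml overrides)
        └── webapp        (web application root, web.xml etc.)
```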

The main difference to a standard Maven layout is the src/main/tomee folder.

It currently hosts two files:

  • tomcat-users.xml
  • tomee.xml

tomee.xml hosts a new database resource which connects to a locally running HSQLDB server. Apart from that, it is a plain copy of the file shipped with the standard TomEE distribution.

tomcat-users.xml is the standard Tomcat UserDatabase realm file, but provides one additional user: the login user for our secured area.

We will go into more detail on both files later. The important part is that they replace the original files during the build and are bundled into the running TomEE once you issue mvn install tomee:run on the command line.

That way you can override any of the standard settings of TomEE with your custom settings.

(Figure: override config files)

 

3 Tiers

The UI layer

Here we use plain JSF as provided by TomEE (in our case Apache MyFaces 2.1.9, which at the time of writing is close to the most recent release of Apache MyFaces).

The main difference is that we use CDI beans instead of JSF managed beans. Here is an example page controller implemented as a CDI bean:

@Named
@RequestScoped
public class HelloView
{
    @EJB
    HelloEJB helloEjb;

    //---------------------------- getter and setter ---------------------------------
    public String getHelloWorld()
    {
        return helloEjb.getHello();
    }
}

The main difference to a standard Tomcat setup is that we can inject an EJB directly into our managed bean, thanks to TomEE.

The EJB can be bundled straight into the WAR file; no need for an EAR and complicated classloader hierarchies.

This brings us directly to our second layer.

 

The service/DAO layer

We have implemented the service layer as a stateless session bean; since we use JPA, we omit separate DAOs for simplicity.

The service does not do too much: it writes to the database if no data is present, otherwise it reads from it.

@Stateless
public class HelloEJB
{
    @PersistenceContext(name = "Demo_Unit")
    EntityManager em;

    public String getHello()
    {
        Query query = em.createQuery("select hello from HelloEntity hello");
        List<?> results = query.getResultList(); // run the query only once
        if (results.isEmpty())
        {
            HelloEntity entity = new HelloEntity();
            em.persist(entity);
            return entity.getHelloWorld();
        }
        return ((HelloEntity) results.get(0)).getHelloWorld();
    }
}

This brings us straight to

 

The database layer

 

Centralized database connection

The database layer is a little more complicated because it needs several configuration files.

First we have to deal with a centralized, container-managed database connection.

TomEE utilizes a resource entry in tomee.xml to enable the centralized database connection (which can theoretically be shared across multiple applications).

Here is our entry:

<Resource id="MyDataSource" type="DataSource">
        JdbcDriver org.hsqldb.jdbcDriver
        JdbcUrl    jdbc:hsqldb:hsql://localhost:9001/testdb2
        UserName   sa
        Password
        JtaManaged true
</Resource>

What is strikingly odd here is that this is not the syntax used in a standard Tomcat context.xml configuration; instead, TomEE follows its own syntax, which is explained more thoroughly in the TomEE documentation. (I have not yet tested whether a context.xml database connection also works; given that there is the web.xml method, I probably never will, see below.)

In our case we connect to a running HSQLDB server which is started automatically with the application (more on that later).

The important parts are the following:

  1. the data source id: id="MyDataSource"
  2. the type being a DataSource: type="DataSource"
  3. JtaManaged being set to true

The rest is standard JDBC.
The main advantages of this method are the centralized connection, which can be shared across multiple web applications, and the additional configuration options compared to the web.xml method. The downside is that it is centrally managed, which means you have to alter config entries in the TomEE directory. In our case, the project does that automatically during deployment of the web application.
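As an aside, and not part of the demo project: such a container-managed DataSource could presumably also be consumed directly via resource injection rather than through JPA. A minimal sketch (class and method names hypothetical):

```java
import javax.annotation.Resource;
import javax.ejb.Stateless;
import javax.sql.DataSource;
import java.sql.Connection;
import java.sql.SQLException;

// Hypothetical sketch: injecting the centralized "MyDataSource"
// resource directly into a stateless session bean.
@Stateless
public class RawJdbcBean
{
    @Resource(name = "MyDataSource")
    DataSource ds;

    public boolean ping() throws SQLException
    {
        try (Connection con = ds.getConnection())
        {
            return con.isValid(2); // simple connectivity check
        }
    }
}
```

This only works inside container-managed components; in the demo, the persistence unit references the same resource instead.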
 

Web application managed database connections

 
As one of the readers of this blog has pointed out, the method mentioned before is centralized but non-standard (not JEE). This means that for a real deployment you have to change the configuration of your TomEE, and worse, if you move to a different application server you have to set up the connection again for that server; in some cases this is not wanted. Since we use Servlet 3.0, there is a standard way to define database connections on a per-webapp basis: we simply add the following code to our web.xml.
 
<data-source>
    <name>MyDataSource</name>
    <class-name>org.hsqldb.jdbcDriver</class-name>
    <url>jdbc:hsqldb:hsql://localhost:9001/testdb2</url>
    <user>sa</user>
    <password></password>
</data-source>
 
The advantage of this method is its simplicity: you do not tamper with TomEE's config directory. The downside is that MyDataSource can only be used within the context of this web application, and you lose some advanced configuration options the centralized method provides.
 

Persistence.xml

Now to the second part, the persistence.xml. As already discussed, we use OpenJPA for persistence because it is already present in the container. Hibernate or EclipseLink would also be possible by adjusting the pom.xml: remove the OpenJPA enhancement plugin and add the other jars. An example which does exactly that can be found in the TomEE documentation.

For our persistence.xml we use the following entries:

<persistence xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd"
    version="2.0">


    <!-- simply all annotated persistent entities will be part of this unit-->
    <persistence-unit name="Demo_Unit">
        <provider>org.apache.openjpa.persistence.PersistenceProviderImpl</provider>
        <jta-data-source>MyDataSource</jta-data-source>
        <class>com.github.werpu.tomeedemo.orm.HelloEntity</class>

        <properties>
            <property name="openjpa.jdbc.DBDictionary" value="hsql"/>

            <property name="openjpa.jdbc.SynchronizeMappings" value="buildSchema(ForeignKeys=true)"/>

            <!-- disable runtime instrumentation -->
            <property name="openjpa.DynamicEnhancementAgent" value="false"/>

            <property name="openjpa.Log" value="DefaultLevel=WARN, Runtime=INFO, Tool=INFO, SQL=TRACE"/>

            <!-- with those two settings openjpa behaves better for merge and detach -->
            <property name="openjpa.DetachState" value="loaded(DetachedStateField=true)"/>
            <property name="openjpa.Compatibility" value="IgnoreDetachedStateFieldForProxySerialization=true"/>

            <property name="openjpa.jdbc.MappingDefaults"
                      value="ForeignKeyDeleteAction=restrict, JoinForeignKeyDeleteAction=restrict"/>

        </properties>

    </persistence-unit>

</persistence>

The important parts are the following:

  • <persistence-unit name="Demo_Unit">, which provides the persistence unit name used by our EJB.
  • <jta-data-source>MyDataSource</jta-data-source> as the data source definition.
  • <class>com.github.werpu.tomeedemo.orm.HelloEntity</class> for our entity class.

The rest are internal OpenJPA settings: foreign key handling, the use of compile-time instead of runtime instrumentation (see the Maven config above), and a helper which preserves the detach state in serialization scenarios to speed up subsequent merges. None of this is of importance for this blog.

Now, last but not least, our entity class, which is used by our service:

 

@Entity
public class HelloEntity
{
    @Id
    @GeneratedValue
    Integer id;

    @Column(name = "HELLO_WORLD")
    String helloWorld = "helloWorld from entity";

    @Version
    protected Integer optlock;

    public Integer getId()
    {
        return id;
    }

    public void setId(Integer id)
    {
        this.id = id;
    }

    public String getHelloWorld()
    {
        return helloWorld;
    }

    public void setHelloWorld(String helloWorld)
    {
        this.helloWorld = helloWorld;
    }

    public Integer getOptlock()
    {
        return optlock;
    }

    public void setOptlock(Integer optlock)
    {
        this.optlock = optlock;
    }
}

 

Authentication

We use standard container-managed authentication to limit user access to certain pages of our application. Deeper knowledge of the mechanisms of this authentication method is outside the scope of this blog, but you can find information in the Oracle tutorial linked below.

The application has a secured page which can be accessed from our landing page. For simplicity, we use the standard tomcat-users.xml file and the UserDatabase realm defined by Tomcat as standard.

All we do is add another user and a new role to the realm:

 

<role name="authenticated-user" />
<!-- we use a new user in our standard realm for the authenticated pages -->
<user name="test" password="user" roles="authenticated-user" />

The web.xml now needs the following entries to secure the pages:

 

<!--
authentication , we use browser authentication
and the tomcat-users.xml file as authentication realm
-->
<security-constraint>
    <web-resource-collection>
        <web-resource-name>Authenticated Pages</web-resource-name>
        <url-pattern>/user/*</url-pattern>
    </web-resource-collection>
    <auth-constraint>
        <role-name>authenticated-user</role-name>
    </auth-constraint>
</security-constraint>

<login-config>
    <auth-method>FORM</auth-method>
    <realm-name>UserDatabase</realm-name>
    <form-login-config>
        <form-login-page>/login/login.jsf</form-login-page>
        <form-error-page>/login/error.jsf</form-error-page>
    </form-login-config>
</login-config>

What we do here is make all pages under /user/* accessible only to users with the role authenticated-user defined before.

<realm-name>UserDatabase</realm-name>

refers to the standard realm in which we defined our new user and new role.

We also define a login page and an error page for the standard form login.

The login looks as follows:

<form method="POST" action="j_security_check">
        <input type="text" name="j_username"/>
        <br />
        <input type="password" name="j_password"/>
        <br />
        <input type="submit" value="Submit"></input>
        <input type="reset" value="Reset"></input>
</form>

This is a standard login form for container-based authentication; it has to follow a fixed naming convention (j_security_check, j_username, j_password), hence the non-JSF markup.

The Database

We could make it simpler by running our database in embedded mode, which would look like the following:

 

<Resource id="MyDataSource" type="DataSource">
    JdbcDriver org.hsqldb.jdbcDriver
    JdbcUrl jdbc:hsqldb:file:data/hsqldb/testdb
    UserName sa
    Password
    JtaManaged true
</Resource>

This, however, has the downside that you cannot connect to the database with third-party tools while it is running. Although simplicity is the primary objective of the demo project this blog is based upon, we stray from that path in the case of the database: our primary aim is a database which runs in server mode. Normally this would be an Oracle, MySQL or PostgreSQL database; in our case we simply want the database to start once we start our own server.

For this we use a small trick: a servlet starts the database. (A context listener would theoretically also suffice.)

The servlet has to be started via loadOnStartup = 1:

@WebServlet(urlPatterns = "/db/*", loadOnStartup = 1)
public class HSQLDBStartupServlet extends HttpServlet
{
    Logger log = Logger.getLogger(HSQLDBStartupServlet.class.getName());
    
    @Override
    public void init() throws ServletException
    {
        super.init();
        try
        {
            log.info("Starting Database");
            HsqlProperties p = new HsqlProperties();
            p.setProperty("server.database.0", "file:data/hsqldb");
            p.setProperty("server.dbname.0", "testdb2");
            p.setProperty("server.port", "9001");
            Server server = new Server();
            server.setProperties(p);
            server.setLogWriter(null);
            server.setErrWriter(null);
            server.start();
        }
        catch (ServerAcl.AclFormatException afex)
        {
            throw new ServletException(afex);
        }
        catch (IOException ioex)
        {
            throw new ServletException(ioex);
        }
    }
}
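The context-listener alternative mentioned above would look roughly like this; a sketch using the same HSQLDB properties, not part of the demo project:

```java
import java.io.IOException;

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
import javax.servlet.annotation.WebListener;

import org.hsqldb.persist.HsqlProperties;
import org.hsqldb.server.Server;
import org.hsqldb.server.ServerAcl;

// Hypothetical alternative to the startup servlet: a listener which
// starts HSQLDB when the webapp is deployed and stops it on undeploy.
@WebListener
public class HSQLDBStartupListener implements ServletContextListener
{
    private Server server;

    @Override
    public void contextInitialized(ServletContextEvent sce)
    {
        try
        {
            HsqlProperties p = new HsqlProperties();
            p.setProperty("server.database.0", "file:data/hsqldb");
            p.setProperty("server.dbname.0", "testdb2");
            p.setProperty("server.port", "9001");
            server = new Server();
            server.setProperties(p);
            server.start();
        }
        catch (ServerAcl.AclFormatException | IOException ex)
        {
            throw new IllegalStateException(ex);
        }
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce)
    {
        if (server != null)
        {
            server.shutdown(); // stop the database with the webapp
        }
    }
}
```

A listener also gives us a natural place to shut the database down again, which the servlet variant omits.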

 

And the entry in the tomee.xml looks like the following:

<Resource id="MyDataSource" type="DataSource">
    JdbcDriver org.hsqldb.jdbcDriver
    JdbcUrl    jdbc:hsqldb:hsql://localhost:9001/testdb2
    UserName   sa
    Password
    JtaManaged true
</Resource>

Almost the same as before, but the JdbcUrl parameter is different: it now uses a TCP/IP connection instead of direct file access.

Summary

This sums up our little walkthrough of the demo application. Keep in mind that the application is a good kickstarter for your own projects, but for a real production environment some adjustments have to be made: authentication should go against a database, passwords must be stored as hashes, authentication has to go through HTTPS, and the database schema should not be generated from the entities.

But all of this would sacrifice the simplicity of the demo project. Feel free to comment or to fork the project for your own purposes or improvements. Any participation is welcome, as always in open source projects.

 

Links 

  1. TomEE Maven project
  2. Apache TomEE project
  3. OpenJPA project
  4. Mark Struberg's blog (JPA and CDI information)
  5. Oracle server-side authentication tutorial
  6. How to start HSQLDB in a webapp


Thursday, November 8, 2012

Marrying Scala with Apache MyFaces Part 4 - Apache MyFaces Extension Scripting

Introduction

In parts 1, 2 and 3 we looked at various implementation aspects of enabling Scala with Apache MyFaces.
Part 4 introduces a framework which allows a zero-restart configuration for Scala-based JSF artifacts such as components, managed beans, renderers, etc.

This project is a relatively new extension to the Apache MyFaces project, which has already spawned several side projects such as CODI or Ext-Val.

The project's name is Ext-Scripting.

What is Ext-Scripting? It is an ongoing effort to integrate scripting languages into Apache MyFaces in such a way that the user gets a zero-restart configuration while writing code.

The following video gives more details on what Ext-Scripting can provide:



Ext-Scripting itself is not a Scala-specific project: as of now it also supports Groovy and dynamically recompiled Java, and soon JRuby.
One of the major aspects of Ext-Scripting was to provide extensive documentation, which you can find on the project site.

In this blog we will trim down all the information provided to give you the bare essentials in the shortest possible time.

Ext-Script in action

Ext-Script allows you to edit JSF artifacts like managed beans directly and recompiles those artifacts on the fly. This is very similar to JSP; however, with JSP you were limited to stateless pages, while Ext-Scripting allows you to edit most JSF artifacts no matter whether they are stateless or stateful.

It uses sophisticated dependency detection mechanisms at the bytecode level to reload exactly the artifacts it needs to keep the system in a stable state.

The following video (with a Java example) demonstrates what exactly happens:



So what does this mean for Scala? Once Ext-Scripting is enabled, you can start to write managed beans, components, navigation handlers, phase listeners, etc. directly in Scala and have Ext-Scripting do the heavy lifting for you.

Your server restart times can be reduced by up to 90% simply by using the dynamic recompilation feature. You can even use the JSF annotations dynamically, by moving them, deleting them, etc.
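To make this concrete: a hot-editable bean can be as plain as the following (a Java example in the spirit of the video above, a Scala bean would look analogous; all names are hypothetical):

```java
import javax.faces.bean.ManagedBean;
import javax.faces.bean.RequestScoped;

// Hypothetical example: drop this class under the configured source
// path and Ext-Scripting recompiles it on the next request after
// every edit, without a server restart.
@ManagedBean(name = "greetingBean")
@RequestScoped
public class GreetingBean
{
    public String getGreeting()
    {
        // change this string, save, reload the page: no restart needed
        return "hello from a dynamically recompiled bean";
    }
}
```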

Getting started

Maven config

The easiest way to get started with Ext-Scripting is simply to take one of the example projects provided by the distribution and start hacking, or to use Maven. We will focus on the Maven configuration in this blog. To add Ext-Scripting, all you need is a working WAR configuration; then add the following entries to your dependencies section:

<dependency>
    <groupId>org.apache.myfaces.extensions.scripting</groupId>
    <artifactId>extscript-myfaces20-bundle</artifactId>
    <version>1.0.4</version>
</dependency>


And for the Scala support:

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.9.1</version>
</dependency>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-compiler</artifactId>
    <version>2.9.1</version>
</dependency>


This should add all the needed dependencies automatically to your project.

web.xml

General setup: to have Apache MyFaces pick up Ext-Scripting, we have to add the following context parameter to the web.xml:

<context-param>
     <description>
         Enables our scripting engine support plugins
     </description>
     <param-name>org.apache.myfaces.FACES_INIT_PLUGINS</param-name>
     <param-value>
         org.apache.myfaces.extensions.scripting.servlet.StartupServletContextPluginChainLoader
     </param-value>
</context-param>

Setting up your source directories

Now you could theoretically start to code: Ext-Script picks up sources hosted under WEB-INF/scala in your deployment directory. Under normal circumstances this is undesirable, however, because you want Ext-Scripting to pick up the sources from your real source directories for editing.

For this there are a number of context parameters in the web.xml which allow Ext-Scripting to pick up your sources from the source directories themselves.

For Scala, the following context parameter is relevant:

<context-param>
    <description>Additional comma separated loader paths to allow direct editing on the sources directory instead of the deployment dir
    </description>
    <param-name>org.apache.myfaces.extensions.scripting.scala.LOADER_PATHS</param-name>
    <param-value>
        <some project path>/src/main/webapp/WEB-INF/scala
    </param-value>
</context-param>


That is all it takes to have Ext-Script pick up your sources from the Scala path, and you can start. Happy coding.

[1] Link to Apache MyFaces Ext-Scripting
[2] Marrying JSF and Scala Part 1
[3] Marrying JSF and Scala Part 2
[4] Marrying JSF and Scala Part 3

Ext-Script and TomEE

Introduction:

I just wanted to inform everyone that Apache MyFaces Extension-Scripting works out of the box with Apache TomEE.

 

 What is Apache MyFaces Ext-Scripting?


For all the people who don't know what Ext-Script is (probably most of you): Apache MyFaces Extension Scripting, Ext-Script for short, is an Apache MyFaces extension which provides scripting language support and dynamic recompilation for Apache MyFaces. This means you can code the JSF parts of your project in Groovy, Scala, Java and soon JRuby, and you get JSP-like dynamic recompilation of those parts for free.

It does not work with all application servers due to server limitations (mostly in the classloader area), but so far it has been tested with:

  • Tomcat
  • Jetty
  • Websphere Liberty Profile
  • Glassfish and
  • TomEE

More information about Apache MyFaces Ext-Scripting can be found on the project site.

This blog post is about TomEE and how to get Ext-Script up and running on this excellent lightweight server.

 

What is TomEE:


TomEE is a project which provides the full JEE Web Profile on top of Tomcat. It does not touch Tomcat itself but provides the entire JEE web profile stack (and more) within a standard Tomcat WAR container. That's right: no EAR anymore, just drop in a WAR and you get JSF, CDI, etc. out of the box from the container.
More information on Apache TomEE can be found on the project site. No more bundling of JSF, CDI and friends; everything is provided by a lightweight servlet container with great startup times.

 

Combination of both


So how do we combine the two? The answer is simple: drop
Ext-Scripting:

  • extscript-myfaces20-bundle-1.0.4.jar

and its dependencies :

  • commons-beanutils.jar version 1.8.3 or above
  • commons-codec.jar version 1.3 or above
  • commons-collections.jar version 3.2 or above
  • commons-io.jar version 1.4 or above
  • groovy-all.jar version 1.7.2 or above
  • scala-compiler.jar version 2.10.0-M2 or above
  • scala-library.jar version 2.10.0-M2 or above

into your WAR's WEB-INF/lib (note: if you use Maven, it will do the job for you) and set the following configuration entries:

<context-param>
        <description>
            Initializes the plugins for our groovy handlers
        </description>
        <param-name>org.apache.myfaces.FACES_INIT_PLUGINS</param-name>
        <param-value> 
            org.apache.myfaces.extensions.scripting.jsf.startup.StartupServletContextPluginChainLoader
        </param-value>
</context-param>


This is the most important entry; in a normal servlet container we would now be set and could start coding. For TomEE, however, we have to set additional parameters:

 <context-param>
        <description>Additional comma separated loader paths to allow direct editing on the sources directory instead
            of the deployment dir
        </description>
        <param-name>org.apache.myfaces.extensions.scripting.groovy.LOADER_PATHS</param-name>
        <param-value>
            /whatever/myfaces20-extscript-helloworld/src/main/webapp/WEB-INF/groovy
        </param-value>
</context-param>

<context-param>
        <description>Additional comma separated loader paths to allow direct editing on the sources directory instead
            of the deployment dir
        </description>
        <param-name>org.apache.myfaces.extensions.scripting.java.LOADER_PATHS</param-name>
        <param-value>
            /whatever/myfaces20-extscript-helloworld/src/main/webapp/WEB-INF/java
        </param-value>
</context-param>
    
<context-param>
        <description>Additional comma separated loader paths to allow direct editing on the sources directory instead
            of the deployment dir
        </description>
        <param-name>org.apache.myfaces.extensions.scripting.scala.LOADER_PATHS</param-name>
        <param-value>
            /whatever/myfaces20-extscript-helloworld/src/main/webapp/WEB-INF/scala
        </param-value>
</context-param>

<context-param>
        <description>Additional comma separated loader paths to allow direct editing on the source
        directories of the deployment dir.
        </description>
        <param-name>org.apache.myfaces.extensions.scripting.ruby.LOADER_PATHS</param-name>
        <param-value>
            /whatever/extscript-examples/myfaces20-example/src/main/webapp/WEB-INF/ruby
        </param-value>
</context-param>


Those paths point to your source directories hosting the source files. (Note that the Ruby loader path is only valid with the latest 1.0.5-SNAPSHOT builds; 1.0.4 has no Ruby support.)

Once done, you can start editing the code hosted in the respective directories.

More information about setting up Ext-Script can be found in the Ext-Script documentation.

Friday, November 25, 2011

JSF Ajax and Multiple Forms

This blog post is about a problem around which roughly 80% of all JSF Ajax questions on the MyFaces mailing list revolve: namely, how do I handle JSF Ajax in a multi-form scenario?

JSF Ajax and multiple forms: a standard case which should be easy, right?
Let's have a look at a small example:

<h:form id="firstForm" prependId="false">
    <h:panelGroup layout="block" id="renderTarget1"/>
    <h:inputText id="first_input" value="#{multiFormBean.inputText1}"/>
    <h:commandButton action="#{multiFormBean.doSubmit1}" value="submit1"
                     onclick="jsf.ajax.request(this, event, {execute:'firstForm', render:'renderTarget1'}); return false;">
    </h:commandButton>
</h:form>
<h:form id="secondForm" prependId="false">
    <h:panelGroup layout="block" id="renderTarget2"/>
    <h:inputText id="second_input" value="#{multiFormBean.inputText2}"/>
    <h:commandButton action="#{multiFormBean.doSubmit2}" value="submit2"
                     onclick="jsf.ajax.request(this, event, {execute:'secondForm', render:'renderTarget2'}); return false;">
    </h:commandButton>
</h:form>


As we see here, there are two forms, each updating a component within itself via Ajax.
Now, what happens if we submit the forms alternately?

After a while we run into a "ViewRoot cannot be found" exception on the server side.

We did everything right, why do we face this issue?

The answer lies in a bug in the JSF Ajax protocol, more precisely the way the ViewState is processed.

Let's have a look at an Ajax response:

<?xml version="1.0" encoding="utf-8"?>
<partial-response>
    <changes>
        <update id="renderTarget1"><![CDATA[<div id="renderTarget1"></div>]]></update>
        <update id="javax.faces.ViewState"><![CDATA[T2Sk3fhPxh/3V4FRMOA6jXjkc/Z9bOAzqA4dBeXHVrNpgC/cMgAi7bRUgL/faSQmOhYwkpPoG/cv8YZeyNlxa6B0r81ssbeWnBWZ5g7NG4K8Tt+E87ihH1XErK0AUsYTw8zKRDKGFrj7IKs25F3zdhmy6WV1EaSSIxyD5w==]]></update>
    </changes>
</partial-response>


Here we see the root cause of the problem: there is a parameter carrying the ViewState under the identifier javax.faces.ViewState; however, it is not clear where it belongs.

In practice, a ViewState must be attached to a form, so the issuing form definitely must receive it. But what about the other forms?

And here is the root cause of the error: only the issuing form is updated, while the ViewState of the second form, which depends on the view root rather than on the form, is left stale.

This image shows exactly what happens:


As you can see, only one form is updated; the second form now holds a ViewState which is no longer current and at some point is dropped from the ViewState history, a classic concurrency issue.
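The "dropped from the ViewState history" part can be illustrated with a tiny sketch. This is an illustration only, not JSF internals; the class name and the capacity of two are made up for the example:

```javascript
// Illustrative sketch only -- not JSF internals. It models a server-side
// ViewState history with a fixed capacity to show how the stale form's
// state eventually falls out and triggers "ViewRoot cannot be found".
class ViewStateHistory {
  constructor(capacity) {
    this.capacity = capacity;
    this.states = [];
  }
  issue(state) {
    this.states.push(state);
    if (this.states.length > this.capacity) {
      this.states.shift(); // oldest state is dropped
    }
  }
  restore(state) {
    return this.states.includes(state);
  }
}

const history = new ViewStateHistory(2);
history.issue("state-1"); // both forms render with state-1
history.issue("state-2"); // only firstForm's hidden field is updated
history.issue("state-3"); // only firstForm's hidden field is updated again

// secondForm still holds state-1, which has been evicted:
console.log(history.restore("state-3")); // true  -> firstForm submits fine
console.log(history.restore("state-1")); // false -> secondForm fails
```

The second form keeps submitting its original, ever older ViewState until it falls out of the history, which is exactly the intermittent failure described above.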

So the solution would be to update all forms in the HTML document, right?

Theoretically yes, but there is one API which prevents this simple solution: Portlets.
In a portlet environment you have multiple view roots, all belonging to different JSF session instances on the server.


The next logical solution would be to update all JSF elements under the ViewRoot.

Again, a good idea, but the protocol prevents it. On the pure client side we do not have any marker or indicator which tells the Ajax code where the current ViewRoot begins.

To mitigate this protocol problem, an extension was added to the JSF 2.1 spec. According to the spec you have to add the second form manually as a render target, and both implementations follow this. Here is a snippet which shows the solution:

<h:form id="firstForm" prependId="false">
<h:panelGroup layout="block" id="renderTarget1"/>
<h:inputText id="first_input" value="#{multiFormBean.inputText1}"/>
<h:commandButton action="#{multiFormBean.doSubmit1}" value="submit1"
onclick="javascript:jsf.ajax.request(this, event, {execute:'firstForm', render:'renderTarget1 secondForm'}); return false;">
</h:commandButton>
</h:form>
<h:form id="secondForm" prependId="false">
<h:panelGroup layout="block" id="renderTarget2"/>
<h:inputText id="second_input" value="#{multiFormBean.inputText2}"/>
<h:commandButton action="#{multiFormBean.doSubmit2}" value="submit2"
onclick="javascript:jsf.ajax.request(this, event, {execute:'secondForm', render:'renderTarget2 firstForm'}); return false;">
</h:commandButton>
</h:form>


Here we can see that each button has added the other form as a render target.

Now both forms will be updated and the ViewState will always be up to date.


This solution, while being spec compliant, is not satisfactory. Sometimes you don't know whether a certain form is still present after the Ajax request. Is there still a way to update all forms in the page, or at least to take the form id out of the equation?

Unfortunately not within the bounds of the specification. However, being aware of this issue, I have added to Apache MyFaces, in addition to the spec behavior, two other ways to resolve it.

First, you don't have to define the form itself as a render target; you can define any element within a form as a render target, and the enclosing form will be updated.

<h:form id="firstForm" prependId="false">
<h:panelGroup layout="block" id="renderTarget1"/>
<h:inputText id="first_input" value="#{multiFormBean.inputText1}"/>
<h:commandButton action="#{multiFormBean.doSubmit1}" value="submit1"
onclick="javascript:jsf.ajax.request(this, event, {execute:'firstForm', render:'renderTarget1 renderTarget2'}); return false;">
</h:commandButton>
</h:form>
<h:form id="secondForm" prependId="false">
<h:panelGroup layout="block" id="renderTarget2"/>
<h:inputText id="second_input" value="#{multiFormBean.inputText2}"/>
<h:commandButton action="#{multiFormBean.doSubmit2}" value="submit2"
onclick="javascript:jsf.ajax.request(this, event, {execute:'secondForm', render:'renderTarget1 renderTarget2'}); return false;">
</h:commandButton>
</h:form>


Secondly, probably the best solution: a configuration parameter which forces MyFaces to update all forms in a page.

With the following code snippet you can enable this mechanism:
window.myfaces = window.myfaces || {};
myfaces.config = myfaces.config || {};
//set the config part
myfaces.config.no_portlet_env = true;


Once you have added this snippet of JavaScript, MyFaces will be in no-portlet mode and will enforce an update of all forms with the current ViewState.

Note: While this method won't break your code if you switch to Mojarra, you will not get the benefits, because Mojarra does not yet have a similar solution to this issue.

Donnerstag, 24. November 2011

JSF Ajax Encoding

The entire story started when we got the request to encode everything in ISO-8859-15, including the Ajax cycle. At first I thought this would be easy: just change the encoding on the JavaScript side and let the server handle the rest. The encoding was easily detectable on the JavaScript side simply by checking the XHTML's encoding (the meta tag in the head would have been another option, but since we are nailed down to XHTML anyway due to Facelets, we have an easier way).
Easy, I thought, but then I ran into browser hell.

The problem normally would be easily resolvable: the XHR object has the option of setting a content type request header:

xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded; charset=ISO-8859-15");

The problem now is the browsers themselves. Testing the dynamic encoding on various browsers produced the following results:

Browser     Actual Encoding
Mozilla 7   UTF-8
Chrome      UTF-8
IE          ISO-8859-15
Opera       ISO-8859-15


So what does this mean? Only Opera and IE got it right, which means the path of allowing non-UTF-8 submits is blocked for now.

However, JSF automatically deals with the problem properly. While I implemented most of the Ajax part of MyFaces, I have to admit the actual encoding part was provided by another project, namely j4fry, whose implementors worked on that part, so I never gave it a second thought. However, both implementations deal with the issue in the same way.

First of all, Ajax submits are UTF-8 encoded. At first glance this could pose problems with non-UTF-8 pages; it turns out there are none.

The solution both implementations follow is to encode the actual key/value pair parameters into a UTF-8 URL-encoded representation.

Both implementations seem to apply the encodeURIComponent function of JavaScript.
Now, no matter what content type the page has, a proper UTF-8 representation of the original content will always be passed down.
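This is easy to verify: encodeURIComponent always emits the percent-escaped UTF-8 bytes of a character, regardless of the page's declared charset (the parameter name below is just taken from the examples later in this post):

```javascript
// encodeURIComponent always percent-encodes the UTF-8 bytes of a character,
// independent of the page's declared charset. The euro sign U+20AC is the
// three UTF-8 bytes E2 82 AC:
const encoded = encodeURIComponent("€");
console.log(encoded); // "%E2%82%AC"

// A full key/value pair, serialized the way a UTF-8 URL-encoded submit
// would carry it (parameter name borrowed from the examples below):
const pair = encodeURIComponent("testform:inputtext")
           + "="
           + encodeURIComponent("Blörks über");
console.log(pair); // "testform%3Ainputtext=Bl%C3%B6rks%20%C3%BCber"
```

Since the escaped result consists of ASCII characters only, it survives any page encoding unharmed.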

Given that the response is then also UTF-8 encoded, what happens with the response? After all, the page needs to handle it properly in its own encoding.
Here MyFaces and Mojarra differ slightly. Both, in compliance with the spec, encode the response in an XML CDATA block. However, MyFaces additionally escapes non-ASCII chars into their entity representation, while Mojarra simply pushes the content byte by byte into the CDATA block.


Here is an example:

Mojarra:

<?xml version='1.0' encoding='UTF-8'?>
<partial-response>
<changes>
<update id="testform"><![CDATA[
<form id="testform"
name="testform" method="post"
action="/experimental/iso-8859-15.jsf"
enctype="application/x-www-form-urlencoded">
<input type="hidden" name="testform" value="testform" />
Press fast for a delay
<input id="testform:inputtext"
type="text" name="testform:inputtext"
value="€€€€€€" />
<span id="testform:outputtext">€€€€€€</span>
<span id="testform:output">Blörks 6 zahl über...$%</span>
<a id="testform:pressme" href="#"
onclick="mojarra.ab(this,event,'action','@this testform:inputtext testform:pressme','testform');return false">
Über Pressen</a>
</form>]]></update>
<update id="javax.faces.ViewState"><![CDATA[-3413056635477888908:-884780313822768314]]></update>
</changes>
</partial-response>


Here it is clearly visible that the CDATA block has a different encoding than the outer UTF-8 encoded XML. In the final page representation all the special chars are visible again, as they should be.

However, MyFaces goes one step further and additionally escapes the content to get rid of the non-ASCII characters altogether.

<?xml version="1.0" encoding="utf-8"?>
<partial-response>
<changes>
<update id="testform"><![CDATA[
<form id="testform" name="testform"
method="post" action="/experimental/iso-8859-15.jsf"
enctype="application/x-www-form-urlencoded">
Press fast for a delay
<input id="testform:inputtext"
name="testform:inputtext"
type="text" value="&euro;&euro;&euro;&euro;&euro;" />
<span id="testform:outputtext">&euro;&euro;&euro;&euro;&euro;</span>
<span id="testform:output">Bl&ouml;rks 33 zahl &uuml;ber...$%</span>
<a href="#"
onclick="jsf.util.chain(document.getElementById('testform:pressme'), event,'jsf.ajax.request(\'testform:pressme\',event,{execute:\'@this testform:inputtext testform:pressme\',render:\'testform\',\'javax.faces.behavior.event\':\'action\'})'); return false;"
id="testform:pressme" name="testform:pressme">&Uuml;ber Pressen</a>
<input type="hidden" name="testform_SUBMIT" value="1" /></form>]]>
</update>
<update id="javax.faces.ViewState">
<![CDATA[X4esYFadbDOTm2+eJEayTfM3M/7uX1jS5+ZVXaklBMszHBFWKU4GwDucc84o8baxZJzgvTk8yLtX6ptwas+25gXNr8ivBBplW4WUOPqc0q87kwchVx50nD0YDYZKVUndUQ3X+09LD6e0YRrTgmBa4EMJkro=]]></update>
</changes>
</partial-response>


This comes from the fact that MyFaces also escapes special chars on a full page refresh, so the core behavior regarding the partial response is the same.
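The extra escaping step can be sketched in a few lines. This is an illustration only, not the actual MyFaces renderer (which runs in Java and, as the example above shows, also emits named entities such as &amp;euro;); it just shows the principle of keeping the CDATA payload pure ASCII:

```javascript
// Simplified illustration, not the actual MyFaces renderer: replace every
// character above the ASCII range with a numeric character reference, so
// the partial-response CDATA block contains only ASCII bytes.
function escapeNonAscii(html) {
  return html.replace(/[\u0080-\uFFFF]/g,
    ch => "&#x" + ch.charCodeAt(0).toString(16).toUpperCase() + ";");
}

console.log(escapeNonAscii("Blörks über...€"));
// "Bl&#xF6;rks &#xFC;ber...&#x20AC;"
```

The browser then resolves the entities when the fragment is inserted into the page, so the special characters reappear correctly regardless of the page encoding.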

So what happens, for instance, if you just tamper with the UTF-8 header?
You automatically run into problems due to the URI-encoded UTF-8 representation of the parameters. In the worst case you trigger a server error because of non-decodable parameters; in the best case, if you pass down ASCII-only chars, it will get through; in the normal case you will get junk which is wrongly decoded.
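The decoding failure is easy to reproduce on the client side with decodeURIComponent, which likewise expects percent-escaped UTF-8:

```javascript
// %E4 is "ä" in ISO-8859-15, but as a lone byte it is an invalid UTF-8
// sequence, so UTF-8 decoding fails -- the client-side analogue of the
// server error described above.
let failed = false;
try {
  decodeURIComponent("%E4"); // throws URIError: malformed URI sequence
} catch (e) {
  failed = e instanceof URIError;
}
console.log(failed); // true

// The correctly UTF-8 encoded form decodes cleanly:
console.log(decodeURIComponent("%C3%A4")); // "ä"
```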

See here an example on IE9 and Mojarra:




The question remains: are both safe from a client point of view? Theoretically they should be, since everything is embedded in a CDATA block.
However, I cannot say whether the browser engines really swallow everything embedded in a CDATA block (i.e. every byte combination outside of their encoding).


It comes down again to golden rule #1 in web programming: use UTF-8 and never bother with other encodings if you can. If you have to, be prepared to enter the valley of tears; after all, UTF-8 has been the fix for all the encoding issues in the web space for almost a decade now.

Dienstag, 8. November 2011

Introducing the MyFaces jsf.js Modular Includes

Introduction:

Why modular jsf.js includes? There is only one jsf.js file, right?

Since the introduction of jsf.js into JSF there has been only one jsf.js file which serves all Ajax needs.
However, at least in Apache MyFaces, a lot of extensions have been added, such as:
  • iframe ajax fileupload
  • html5 form detection
  • client side i18n message support for various languages (including Chinese)
  • legacy browser support down to Blackberry 4.7, Windows Mobile 6.5 and IE6
  • browser optimisations for browsers supporting dom level3
  • queue control for the ajax request queue
  • timeout handling
  • better error handling
  • delay handling
  • modular structure which can be changed at runtime to replace it with your own  implementation of subparts
  • clear distinction between api and impl on codelevel
(For more info on many of those extended features, see the following link.)

Most of those extensions will make it one way or another into the official JSF spec. However, different scenarios have different needs.
While most legacy mobile browsers are now slowly being phased out (and support for them can be seen as deprecated), IE6 for the time being is still supported, with a burden of legacy code.
In a pure modern mobile environment, however, you need smaller files and no legacy code at all. Many users are also perfectly happy with a pure implementation of jsf.js according to the spec, without any extra features. Other users only need a subset of those features and want to leave out the rest (i18n support, for instance).

The advantage of dropping code is smaller file sizes, faster code due to fewer lines which have to be processed, and fewer fallbacks which have to be pre-checked. This is worthwhile to explore in modern mobile browser environments and in situations where you simply want to be as lean as possible.

So what is the solution for all this?

The MyFaces jsf.js Modular Includes

As of MyFaces 2.1.4, a modular include system will be added. Simply by adding the following parameters to your web.xml you will be able to determine what gets included:

<context-param>
<param-name>org.apache.myfaces.JSF_JS_MODE</param-name>
<param-value>normal</param-value>
</context-param>
<!--
note: modular jsf.js includes are only enabled in production;
in development the normal jsf.js is always included
-->
<context-param>
<param-name>javax.faces.PROJECT_STAGE</param-name>
<param-value>Production</param-value>
</context-param>

With the parameter

org.apache.myfaces.JSF_JS_MODE

and the allowed values of
  • normal
  • modern
  • minimal-modern
you will be able to adjust which version of jsf.js is included. If you need extra functionality you can include the subparts excluded from your version as separate Javascript resources via normal jsf mechanisms.

The following image shows the version of each file and the extra functionality it provides:

Modular include parts

As you can see, you can stack and mix all the modules you need: simply choose a base jsf.js, then stack the extra functionality you need on top by adding separate includes.

That way you can trade off file size, number of includes and features according to your needs.

Example on how to Enable the Modular Includes with Additional Features

The following example will include the minimal-modern jsf.js and will stack all the extra functionality on top of it.

web.xml

<context-param>
<param-name>org.apache.myfaces.JSF_JS_MODE</param-name>
<param-value>minimal-modern</param-value>
</context-param>
<!--
note: modular jsf.js includes are only enabled in production;
in development the normal jsf.js is always included
-->
<context-param>
<param-name>javax.faces.PROJECT_STAGE</param-name>
<param-value>Production</param-value>
</context-param>


In your page template:

<h:outputScript name="jsf.js" library="javax.faces" target="head" />
<h:outputScript name="jsf-i18n.js" library="myfaces" target="head" />
<h:outputScript name="jsf-legacy.js" library="myfaces" target="head" />
<h:outputScript name="jsf-experimental.js" library="myfaces" target="head" />


As you can see, all you need to do is additionally reference the correct js files, which are hosted under the myfaces resource library, and you are ready to go.