
shibboleth-dev - Integration JUnit testing

  • From: "Howard Gilbert" <>
  • To: <>
  • Subject: Integration JUnit testing
  • Date: Mon, 22 Aug 2005 15:16:41 -0400

JUnit testing of the Shibboleth SP can be difficult for several reasons:

 

Much of the SP depends on an environment created by the SPContext and the parsing of the SPConfig. Fortunately, this entire block of code was designed with JUnit in mind, so the whole structure can be built with a few setup statements. The setup also decides which configuration file will be used, and individual elements of the configuration can be changed dynamically, so a sequence of tests can be run with slightly different versions of a single configuration element.

 

However, the SP depends on SAML data that it has no logic to generate. OpenSAML often requires that this data be signed (so it is not possible to modify a template) and it must contain current timestamps. Having looked over the code, I see no reasonable solution to this problem except to get the IdP classes to generate the required data dynamically in each run of the test.

 

These two things tend to produce what is more commonly called an "integration" test rather than a "unit" test. Although a few classes can be tested in relative isolation, in many cases you need to run a very large block of code to create the larger environment. I do not intend to give up entirely on finding ways to bypass some setup and run tests more efficiently, but the first challenge is to figure out how to run the integrated components in a JUnit environment.

 

For a full functional test, one might choose a technology like Cactus. That product assumes that the application is dependent on the J2EE environment and so it runs the tests under Tomcat. However, Shibboleth does not need this level of support.

 

After looking at many different alternatives, I decided to try out Mockrunner, or specifically the mockrunner-servlet.jar. This creates mock versions of all the objects in the Servlet environment. It does not use configuration files, so the bad news is that each object has to be set up programmatically, but the good news is that the objects can be changed by the test code to run several different cases directly, without a lot of secondary configuration files to maintain.

 

It has a small set of additional dependencies (Jakarta-oro, jdom, and nekohtml) and with mockrunner-servlet itself, all four jar files can be placed in a separate directory of the project. I suggest subdirectory "testlib" under the Shibboleth java project. Then you add them to the Eclipse project build libraries and to any JUnit test classpath in Ant.
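To make the classpath step concrete, an Ant fragment along these lines would pick up everything in testlib for the JUnit task. This is a sketch; the target and path names, and the test class name, are illustrative rather than taken from the Shibboleth build file:

```xml
<!-- Put the four Mockrunner-related jars (plus compiled classes) on the
     test classpath. Directory names here are hypothetical. -->
<path id="test.classpath">
  <pathelement location="build/classes"/>
  <fileset dir="testlib" includes="*.jar"/>
</path>

<target name="test" depends="compile">
  <junit fork="true">
    <classpath refid="test.classpath"/>
    <!-- Hypothetical test class name -->
    <test name="SSOIntegrationTest"/>
  </junit>
</target>
```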

 

You can test both Filters and Servlets. You can test a chain of Filters and one Servlet, simulating the most complex real situation. I tested it against the Shib Filter (the RM) and the IdP SSO.

 

The test case creates an instance of WebMockObjectFactory that creates instances of all the required objects (MockServletContext, MockFilterConfig, MockHttpServletRequest, MockHttpServletResponse, etc.). You then create a ServletTestModule connected to the factory.

 

A TestCase then simulates the configuration by setting init-params into the MockServletContext, MockServletConfig, and MockFilterConfig (as would be done in the WEB-INF/web.xml). The ServletTestModule is used to create instances of the Filter and Servlet classes and init() them using the simulated configuration.
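For reference, the init-param that the TestCase sets programmatically corresponds to a WEB-INF/web.xml entry like the following sketch. The parameter name is the one used later in this note; the file path is illustrative:

```xml
<web-app>
  <context-param>
    <param-name>IdPConfigFile</param-name>
    <param-value>file:/usr/local/shibboleth-idp/etc/idp.xml</param-value>
  </context-param>
</web-app>
```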

 

The Request object can then be set up with URLs, parameters, headers, and other values (like a RemoteUser). You can call the Filter processing entry point or the doGet or doPost methods.

 

The Filter or Servlet method is called and returns. Mockrunner doesn't itself follow redirects or forwards, which is good because the TestCase is simpler this way.

 

Let me annotate an example where the IdP SSO is called to generate an Assertion. First the objects are created and some fields are set up:

 

    // The Factory creates the Request, Response, Session, etc.
    WebMockObjectFactory factory = new WebMockObjectFactory();

    // The TestModule runs the Servlet and Filter methods in the simulated container
    ServletTestModule testModule = new ServletTestModule(factory);

    // Now simulated Servlet API objects
    MockServletContext servletContext = factory.getMockServletContext();
    MockFilterConfig filterConfig = factory.getMockFilterConfig();
    MockHttpServletResponse response = factory.getMockResponse();
    MockHttpServletRequest request = factory.getMockRequest();

    // Servlet object
    private IdPResponder sso;

 

Then the one necessary configuration parameter that would normally be in the IdP web.xml is set:

 

        // ServletContext (argument to Filters and Servlets)
        servletContext.setInitParameter("IdPConfigFile", "file:/C:/usr/local/shibboleth-idp/etc/idp.xml");
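As an aside, a hard-coded Windows-style file: URL like the one above can be built portably from a relative path with java.io.File.toURI(), the same idiom the SP configuration step later in this note uses. A minimal sketch (the path is illustrative):

```java
import java.io.File;

public class ConfigUrlDemo {
    public static void main(String[] args) {
        // Build a platform-correct file: URL from a relative path
        // instead of hand-writing "file:/C:/...". The path itself is
        // a stand-in for wherever idp.xml actually lives.
        String url = new File("etc/idp.xml").toURI().toString();
        System.out.println(url);
    }
}
```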

 

Then Mockrunner creates an instance of the Servlet and initializes it:

 

        // Create an instance of the Servlet class and call its init()
        sso = (IdPResponder) testModule.createServlet(IdPResponder.class);

 

If you are going to do any processing with the SP, you also need to initialize it. There is a cheat here: the initServiceProvider() method is inherited from SPTestCase, but it doesn't contain any Mockrunner code.

 

        // Initialize an SP Context and Config
        String configFileName = new File("data/spconfig.xml").toURI().toString();
        initServiceProvider(configFileName);

 

Now I like to separate the request parameters that aren't likely to change from test case to test case:

 

        request.setRemoteAddr("127.0.0.1");
        request.setContextPath("/shibboleth-idp");
        request.setProtocol("HTTP/1.1");
        request.setScheme("https");
        request.setServerName("idp.example.org");
        request.setServerPort(443);

 

These are separated from the ones that are probably test-case specific:

 

    void setRequestUrls(String suffix) {
        request.setRequestURI("https://idp.example.org/shibboleth-idp/"+suffix);
        request.setRequestURL("https://idp.example.org/shibboleth-idp/"+suffix);
        request.setServletPath("/shibboleth-idp/"+suffix);
    }

 

And

 

        setRequestUrls("SSO");
        testModule.addRequestParameter("target", "https://nonsense");
        testModule.addRequestParameter("shire", "https://sp.example.org/Shibboleth.sso/SAML/POST");
        testModule.addRequestParameter("providerId", "https://sp.example.org/shibboleth");
        request.setRemoteUser("BozoTClown");

 

Now the SSO Servlet is called:

 

        testModule.doGet();

 

The Servlet sets various things into the attributes of the HttpServletRequest, then it tries to forward to IdP.jsp. Mockrunner allows one to set up forwarding, but I don't do it. Therefore, when doGet() ends, no output has been generated, but the important values are available directly:

 

        String bin64assertion = (String) request.getAttribute("assertion");
        String assertion = new String(Base64.decodeBase64(bin64assertion.getBytes()));
        String handlerURL = (String) request.getAttribute("shire");
        String targetURL = (String) request.getAttribute("target");
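The decode step above uses the Jakarta commons-codec Base64 class. As a self-contained illustration of the same round trip, here is the equivalent with the JDK's java.util.Base64 (the assertion text is a stand-in, since a real assertion comes out of the request attribute):

```java
import java.util.Base64;

public class AssertionDecodeDemo {
    public static void main(String[] args) {
        // Stand-in for the base64 string the servlet stores in the
        // "assertion" request attribute.
        String bin64assertion = Base64.getEncoder()
                .encodeToString("<saml:Assertion/>".getBytes());

        // Recover the SAML XML. getMimeDecoder() also tolerates the
        // line breaks that some encoders insert into long values.
        String assertion = new String(
                Base64.getMimeDecoder().decode(bin64assertion));
        System.out.println(assertion);
    }
}
```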

 

These values can then be used directly as simulated Request parameters in the input to the SP processing. This step can be avoided if you use attribute push, which will not solve your problem if you are trying to test the AA query, but is perfectly reasonable when you are trying to test various ARP-AAP combinations.

 

It is possible to mock up the call to the AA, but it will require some thought to decide how to intercept the SP end of the exchange, which currently builds the socket and calls the URL directly.



