Unit Testing

From Opentaps Wiki
Revision as of 21:56, 29 September 2008

How to Write Unit Tests

opentaps 1.0

For opentaps 1.0, you would write a set of JUnit tests in a class, then define the class in an XML testdef file like this:

    <test-suite suite-name="entitytests">
        <test-case case-name="security-tests">
            <junit-test-suite class-name="com.opensourcestrategies.crmsfa.test.SecurityTests"/>
        </test-case>
    </test-suite>

You can define multiple test cases per testdef XML file. Then, add the testdef file to your ofbiz-component.xml, like this:

   <test-suite loader="main" location="testdef/crmsfa_tests.xml"/>

Then, when you do

  $ ant run-tests

your tests will be run.

opentaps 0.9

In opentaps 0.9, you would write your JUnit test class and add it to the base/config/test-containers.xml file, in the "junit-container" element at the bottom, like this:

    <container name="junit-container" class="org.ofbiz.base.container.JunitContainer">
        <property name="base-test" value="org.ofbiz.base.test.BaseUnitTests"/>
        <property name="entity-test" value="org.ofbiz.entity.test.EntityTestSuite"/>
        <property name="service-test" value="org.ofbiz.service.test.ServiceEngineTests"/>
        <property name="crm-security" value="com.opensourcestrategies.crmsfa.test.SecurityTests"/>  <!-- your unit tests -->
        <property name="usps-test" value="org.ofbiz.shipment.thirdparty.usps.UspsServicesTests"/>
        <property name="jxunit-test" value="net.sourceforge.jxunit.JXTestCase"/>
    </container>

Then you would do

 $ ant run-tests

Your tests will run alongside the existing OFBiz test suites.


  • Use a "test" delegator to point your tests at a separate database, and make sure the "test" delegator defined in framework/entity/config/entityengine.xml points to the right database.
  • The opentaps tests are commented out in hot-deploy/component-load.xml by default, so don't forget to activate them.

Where are the Unit Tests?

All opentaps unit tests are located in hot-deploy/opentaps-tests.

There are also a small number of unit tests from ofbiz in their respective modules, such as framework/entity for the entity engine unit tests.

Setting Up For Unit Testing

We recommend that you create a separate database on the same database server for testing purposes and install all demo data into it. Let's say that this database is called "opentaps_testing". Then, edit the file framework/entity/config/entityengine.xml and define opentaps_testing as a new datasource, called "localmysqltesting" or "localpostgrestesting". Next, load the demo data into the testing database by editing the default delegator:

   <delegator name="default" entity-model-reader="main" entity-group-reader="main" entity-eca-reader="main" distributed-cache-clear-enabled="false">
        <group-map group-name="org.ofbiz" datasource-name="localXXXtesting"/>
    </delegator>

Then do an

  $ ant run-install

to install all the seed and demo data into the testing database. Then you can edit the default delegator back to your original delegator, and set the test delegator to the testing database:

    <delegator name="test" entity-model-reader="main" entity-group-reader="main" entity-eca-reader="main">
        <group-map group-name="org.ofbiz" datasource-name="localXXXtesting"/>
    </delegator>

All unit tests should be run with the test delegator. This can be done by obtaining the "test" delegator by name and using it to instantiate a dispatcher, or you can just write a test suite which extends the OpentapsTestCase base class, which does this for you.

If you need to modify port settings for the testing instance, you should edit the file framework/base/config/test-containers.xml.

Unit Testing Strategies

These are some strategies for unit testing:

  • Transaction comparison - Compare the transaction produced with a sample transaction, possibly pre-loaded into the system. For example, post a paycheck to the ledger, then compare it with test data of a correct ledger transaction to make sure that they are equivalent. Equivalence is a very important concept: two sets of transactions can never be identical, since at a minimum they would have different ID numbers, and they would probably reference other transactions with different IDs. For example, each order would have a different orderId and different inventory item IDs reserved against it. However, two orders may be considered equivalent if they have the same set of items, prices, shipping methods, customer, addresses, tax and promotional amounts, etc.
  • State change - Compare the state of the system before and after a transaction has occurred. For example, check the inventory of an item, then ship an order, and check the resulting inventory to make sure that it is correctly decremented. This could get very complex: shipping an order could change customer balances, invoices, ledger postings, and inventory. Multiple tests could be run off the same transaction event.
  • Absolute state check - At all times, certain relationships must hold. For example, the sum of all debits must equal the sum of all credits.
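
The state-change strategy can be sketched in plain Java, with a toy in-memory inventory standing in for the real inventory services (the class, method names, and product ID here are invented for illustration, not opentaps APIs):

```java
import java.util.HashMap;
import java.util.Map;

/** Toy stand-in for inventory state, used to illustrate the state-change strategy. */
public class StateChangeExample {
    private final Map<String, Integer> qoh = new HashMap<>();

    public StateChangeExample(String productId, int initialQty) {
        qoh.put(productId, initialQty);
    }

    public int getQoh(String productId) {
        return qoh.get(productId);
    }

    /** Stand-in for "ship an order": decrements quantity on hand. */
    public void shipOrder(String productId, int qty) {
        qoh.put(productId, qoh.get(productId) - qty);
    }

    /** The state-change test: capture state, run the transaction, verify the delta. */
    public static boolean testShipmentDecrementsInventory() {
        StateChangeExample inventory = new StateChangeExample("TEST-PRODUCT", 100);
        int before = inventory.getQoh("TEST-PRODUCT");
        inventory.shipOrder("TEST-PRODUCT", 5);
        int after = inventory.getQoh("TEST-PRODUCT");
        return (before - after) == 5;   // compare the change, not an absolute value
    }
}
```

The point is that the assertion is on the difference between the before and after states, so the test does not depend on what else is already in the database.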

Tests should be written against the services that create the original data. For example, if you are writing tests against CRMSFA activity, you can use users from the demo data set, but you should use the CRMSFA activity services to create or update your activities. Otherwise, if you create those activities with some other method, future changes to the services to create activities will not be covered by your unit tests.

Tests should be run against a dedicated testing database with demo and seed data rather than production data. Therefore, the tests generally should set up their own initial conditions and run to completion, but they do not need to "tear down" and remove all data created by the tests. (This would be very impractical: imagine creating and shipping an order. To tear it down would involve reverting order data, customer information, inventory data, shipment data, invoices and payments, and accounting entries.) A good test for the tests is that if you ran the test suite in succession multiple times, they should pass during the second and third runs as well as the first run.
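
The rerun requirement can be sketched in plain Java (the classes below are invented stand-ins, not opentaps APIs): a test that generates its own unique IDs for the data it creates never collides with data left behind by earlier runs, so no tear-down is needed.

```java
import java.util.HashSet;
import java.util.Set;
import java.util.concurrent.atomic.AtomicInteger;

/** Sketch of a re-runnable test: it creates its own data under unique IDs and never tears down. */
public class RerunnableTestSketch {
    // Stand-in for the test database: IDs of records created so far.
    private static final Set<String> database = new HashSet<>();
    private static final AtomicInteger sequence = new AtomicInteger(10000);

    /** Each run generates a fresh ID, so a second or third run cannot hit a primary-key conflict. */
    static String createTestRecord() {
        String id = "TEST-" + sequence.incrementAndGet();
        database.add(id);
        return id;
    }

    /** Simulates running the same test twice in succession; both runs must pass. */
    public static boolean runsPassOnRepeat() {
        String firstRun = createTestRecord();
        String secondRun = createTestRecord();
        return database.contains(firstRun)
            && database.contains(secondRun)
            && !firstRun.equals(secondRun);
    }
}
```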

A Unit Testing Tutorial

IMPORTANT: Each unit test method must start with the word "test" -- it must be called testXXX(), not tryXXX() or verifyXXX().

Now let's walk through a particular unit test and see how it works. The one we're looking at is the testProductionRunTransactions method of ProductionRunTests.java. This test verifies that a standard manufacturing process works correctly and checks the inventory results and financial statements. As you read through the code, you will notice that it does the following:

  1. Sets up by first receiving the raw materials (MAT_A_COST and MAT_B_COST) into inventory
  2. Checks the initial state by getting the GL account balances and the initial inventory quantities, both ATP and QOH
  3. Runs through the production run
  4. Checks the final state by getting the GL account balances and the inventory quantities for the raw materials and the finished product
  5. Verifies the following:
    1. The changes in inventory quantities are correct: the raw materials' quantities are reduced because they are consumed, and the finished product's quantity is increased because it is produced.
    2. The changes in GL account balances are correct: the increase in inventory value is offset by raw materials and manufacturing expenses.
    3. The financial statements are in balance at all times.
    4. The financial transactions created by this production run are in agreement with the reference transactions MFGTEST-1, -2, -3. This is done by finding all new financial transactions after the production run began, since they should only be generated by the production run.
    5. The unit value of the finished product is correct.

Along the way, the tests will verify that all the services are run correctly and return success as well.

The test case uses the classes InventoryAsserts and FinancialAsserts to obtain information and run tests on the inventory balances and financial statement values. This is a common "delegation" pattern to separate the code for testing assertions to new classes. It also uses methods such as assertMapDifferenceCorrect and assertTransactionEquivalence which are inherited from the OpentapsTestCase and FinancialsTestCase base classes.

For comparisons of GL account changes, we have set a convention so that increases in the balances of debit GL accounts are positive values, and increases in the balances of credit GL accounts are negative values. So, if a transaction caused an inventory GL account to increase by 100 and accounts payable to increase by 100 as well, the GL account changes would be {INVENTORY: 100, ACCOUNTS_PAYABLE: -100}.
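
This convention can be sketched in plain Java (int amounts for brevity; the real opentaps code uses BigDecimal, and the inherited assertMapDifferenceCorrect plays a role similar to accountChanges below, which is an invented name):

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Illustrates the sign convention: debit-account increases positive, credit-account increases negative. */
public class GlChangeConvention {
    /** Per-account difference of two balance maps (after minus before). */
    public static Map<String, Integer> accountChanges(Map<String, Integer> before,
                                                      Map<String, Integer> after) {
        Map<String, Integer> changes = new LinkedHashMap<>();
        for (String account : after.keySet()) {
            changes.put(account, after.get(account) - before.getOrDefault(account, 0));
        }
        return changes;
    }

    /** Under the convention, the changes of a balanced transaction sum to zero. */
    public static int netChange(Map<String, Integer> changes) {
        return changes.values().stream().mapToInt(Integer::intValue).sum();
    }
}
```

Here the balance maps are assumed to already carry the sign convention (credit balances stored as negatives), so a balanced transaction nets out to zero across all accounts.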

Note that in this case we used the receiveInventoryProduct service to receive inventory, called the various production run services, and then compared the results against pre-stored AcctgTrans and AcctgTransEntries. With other tests, such as those for invoices and payments, we have used pre-existing Invoice and Payment records stored in the hot-deploy/opentaps-tests component and merely changed their status codes to verify the results. This brings up an interesting question: what should be done by calling services, and what with pre-existing data?

Our recommendation is this:

  • What you are testing must be done with the services that you would normally use as part of your application. For example, when we are testing GL postings, the user does not actually call the "postInvoiceToGl" service anywhere on the screen. Instead, she would set the invoice status, and the services would run behind the scenes. Therefore, it would not do to call the "postInvoiceToGl" service directly in the test, or worse, to create the results of that service manually in the database. Instead, we should call the service that sets the invoice status, which is the same one accessed via the controller.
  • For everything else, do whatever is easiest to set up the pre-conditions for testing. Calling receiveInventoryProduct is pretty easy compared to storing InventoryItem and InventoryItemDetail, so we decided to use that. Calling createInvoice and createInvoiceItem would have been many lines of code, so we just stored an invoice in the database.

Creating Reference Data Sets

In many tests you will see comparisons against pre-stored AcctgTrans and AcctgTransEntries. These are reference data sets which are used to compare actual transactions' results and make sure that they are consistent with the reference. Reference data sets are created in the following way:

  1. Run through a set of business transactions, such as creating an invoice and marking it as READY.
  2. Go to Webtools > XML Data Export and select the entities to export. In this case, it might be the Invoice, InvoiceItem, AcctgTrans, AcctgTransEntry entities. Export them either to a file or to a browser and copy them to a file.
  3. Edit the file of transactions and change the following:
    1. Change all IDs from the system-generated 100xx values to something like "XXX-TEST-###" so that they will not cause primary key conflicts.
    2. For AcctgTrans, change the glFiscalTypeId of each AcctgTrans from "ACTUAL" to "REFERENCE" so they will not interfere with actual records.
    3. Remove references to entities which would not be part of the reference set. For example, the invoice might be part of the reference set, but the workEffortId, inventoryItemId, etc. referenced by AcctgTransEntry would not be.
  4. Test by loading the new entity XML into your dedicated testing database. It should cause no conflicts.
  5. Add it to the opentaps-tests component's ofbiz-component.xml so that it will load for future tests, and commit it!
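
The result of those edits might look like this hypothetical entity XML fragment (AcctgTrans and AcctgTransEntry are the real entity names, and MFGTEST-1 is one of the reference IDs mentioned above; the account numbers and amounts are purely illustrative):

```xml
<!-- Hypothetical reference data set: IDs renamed to MFGTEST-1, glFiscalTypeId changed to REFERENCE -->
<entity-engine-xml>
    <AcctgTrans acctgTransId="MFGTEST-1" glFiscalTypeId="REFERENCE"/>
    <AcctgTransEntry acctgTransId="MFGTEST-1" acctgTransEntrySeqId="00001"
        glAccountId="140000" debitCreditFlag="D" amount="100.00"/>
    <AcctgTransEntry acctgTransId="MFGTEST-1" acctgTransEntrySeqId="00002"
        glAccountId="500000" debitCreditFlag="C" amount="100.00"/>
</entity-engine-xml>
```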

Running a Unit Test from Beanshell

After you have written a lot of unit tests, running all of them could take a long time. Fortunately, you can use beanshell to run just one unit test at a time to speed up your development. To do this, you would need to telnet into your beanshell port, then instantiate an object of the unit tests class, and run your test method:

si-chens-computer:~ sichen$ telnet localhost 9990
Trying ::1...
Connected to localhost.
Escape character is '^]'.
BeanShell 2.0b4 - by Pat Niemeyer (pat@pat.net)
bsh % import org.opentaps.tests.purchasing.MrpTests;
bsh % mrpTests = new MrpTests();
bsh % mrpTests.testMrpPurchasedProduct();
bsh %

If the test succeeded, you would see no messages on your beanshell console. If it failed, you would see a stack trace. In both cases, you should see the log messages in runtime/logs/ofbiz.log or runtime/logs/console.log.

To make your life even simpler, you can put all of this into a .bsh file of your own, like myMrpTests.bsh, and then just call it with the source method from the beanshell console:

bsh % source("myMrpTests.bsh");
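
For example, a myMrpTests.bsh corresponding to the console session above might contain (the class and method names are the ones from that session; the file depends on the running opentaps instance):

```java
// myMrpTests.bsh -- runs a single opentaps unit test from the beanshell console
import org.opentaps.tests.purchasing.MrpTests;

mrpTests = new MrpTests();          // beanshell allows untyped ("loose") variables
mrpTests.testMrpPurchasedProduct(); // a failure prints a stack trace to the console
```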

Debugging Unit Tests with IntelliJ

The default run-tests task does a global compile first. To skip this, you can redefine the run-tests target in build.xml as follows:

    <target name="run-tests">
        <java jar="ofbiz.jar" fork="true">
            <arg value="test"/>
        </java>
    </target>

Using a debugger can help speed up development of the unit tests. You can enable debugging by specifying the JVM arguments for your debugging system. For instance, if you use the IntelliJ IDE, the run-tests target becomes:

    <target name="run-tests">
        <java jar="ofbiz.jar" fork="true">
            <jvmarg value="${memory.max.param}"/>
            <jvmarg value="-Xdebug"/>
            <jvmarg value="-Xnoagent"/>
            <jvmarg value="-Djava.compiler=NONE"/>
            <jvmarg value="-Xrunjdwp:transport=dt_socket,server=y,suspend=n,address=5005"/>
            <arg value="test"/>
        </java>
    </target>

You should be able to attach the debugger immediately after running ant run-tests. Don't forget to recompile the component where your tests live.

Another tip is to comment out all unnecessary test suites. Unfortunately, this involves searching every ofbiz-component.xml. One way to find them, if you're on a POSIX OS, is to use find:

$  find . -name ofbiz-component.xml -exec grep test-suite {} \; -print

Warning about running Unit Tests in MySQL

When running unit tests against MySQL, be aware that if the tests use transactions, an assert failure can issue a rollback on the transaction, leaving the data in the database in an inconsistent state and potentially allowing the test to pass even though it actually failed.