4.1.1 to 5.0.2 Upgrade Notes

Note

Since the KFS 5.0 and KFS 5.0.1 releases were internal, non-public releases, there are three sets of upgrade scripts to move from KFS 4.1.1 to KFS 5.0.2. These scripts are in the following locations:

work/db/upgrades/4.1.1_5.0
work/db/upgrades/5.0_5.0.1
work/db/upgrades/5.0.1_5.0.2

Scripts in all three locations are referenced throughout this guide.


Rice Server Upgrade

KFS 5.x depends on Rice 2.x; specifically, KFS 5.0.2 was written against Rice 2.1.6.  This creates logistical issues for the KFS upgrade due to new constraints placed upon KIM data.  For this reason, a portion of the KFS upgrade must be performed *before* the Rice server’s database is updated.

For more information on upgrading the Rice server, please see the Rice application’s release notes:

http://site.kuali.org/rice/2.0.0/reference/html/release-notes.html#upgrade-guide 

Please note: the “Upgrading a Client Application” section should be reviewed, but most of the items noted there have already been taken care of as part of the KFS 5.0.2 upgrade.  They are mainly intended for implementors who have built directly on top of the Rice framework.

Rice Server Data Upgrade

The KFS project has provided Liquibase scripts to perform the non-workflow updates to the Rice database necessary for KFS functionality.

The scripts for this upgrade are within the KFS distribution at the following locations:

work/db/upgrades/4.1.1_5.0/rice_server
work/db/upgrades/5.0_5.0.1/rice_server
work/db/upgrades/5.0.1_5.0.2/rice_server

KIM Data

For KFS 5.0.2, there are two types of changes to the KIM data.  The first is clean-up in preparation for Rice 2.0: KIM permissions and responsibilities needed to be cleaned up to satisfy the new constraints.  These statements are in a Liquibase script:

work/db/upgrades/4.1.1_5.0/rice_server/kim_upgrade_pre_rice_20.xml 

After the rest of the Rice server upgrade is complete, you can apply the new KIM data for KFS 5.0.2.

work/db/upgrades/4.1.1_5.0/rice_server/kim_upgrade.xml
work/db/upgrades/5.0_5.0.1/rice_server/kim_upgrade.xml
work/db/upgrades/5.0.1_5.0.2/rice_server/kim_upgrade.xml

A summary of the changes to KIM roles, permissions, and responsibilities is in this spreadsheet:

5.0.2 KIM Data Changes 

KEW Data (after Rice 2.0 Upgrade)

Aside from the workflow updates, there are a few changes which should be made directly against the Rice tables.  These are applied as database scripts because ingesting them as workflow XML would leave existing documents in routing either with the wrong information or unusable.

work/db/upgrades/4.1.1_5.0/rice_server/kew_upgrade.xml
work/db/upgrades/5.0_5.0.1/rice_server/kew_upgrade.xml
work/db/upgrades/5.0.1_5.0.2/rice_server/kew_upgrade.xml (empty - no changes)

In the first script listed above, one of the changesets contains a list of all document type codes which were part of KFS as of KFS 4.1.1.  Include any custom KFS document types you have created in this script, or create a separate upgrade script using the same command for your custom documents.

System Parameters (after Rice 2.0 Upgrade)

KFS 5.0 changed a number of parameters.  See the 5.0.2 Parameter Changes document for more information.  These scripts must be run after the Rice structure upgrade script below, as they use the updated Rice 2.0 table names.

work/db/upgrades/4.1.1_5.0/rice_server/parameter_updates.xml
work/db/upgrades/5.0_5.0.1/rice_server/parameter_updates.xml
work/db/upgrades/5.0.1_5.0.2/rice_server/parameter_updates.xml

Rice Application Server / Database Upgrade

Database Data/Structure

The Rice team provided scripts in their project for upgrading Oracle and MySQL client and server databases.  For consistency, the KFS team has extracted those scripts and embedded them into Liquibase.  The main scripts for this are:

work/db/upgrades/4.1.1_5.0/rice_server/rice-server-script.xml
work/db/upgrades/5.0_5.0.1/rice_server/rice-server-script.xml
work/db/upgrades/5.0.1_5.0.2/rice_server/rice-server-script.xml 

The SQL scripts in these directories were copied from the Rice project.  They are specific to MySQL and are leveraged by the rice-server-script.xml Liquibase scripts on that database platform.

Rice Application Server Changes

Due to changes in Rice, the use of ${application.url} on the KFS documents will no longer work.  The kew_upgrade.xml script above changes all references on baseline KFS documents to use ${kfs.url}.

The other half of making that work is to add a “kfs.url” property to the rice-config.xml file used for your rice servers and have it point back to the appropriate base URL for your KFS application.  This will be the server including the base path (if any).  (E.g., for local developers, this would be: http://localhost:8080/kfs-dev)
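For example, such a property entry in rice-config.xml might look like the following; the URL shown is the local-developer example above, so substitute your institution’s base URL:

```xml
<!-- In rice-config.xml on the Rice server (example value only): -->
<param name="kfs.url">http://localhost:8080/kfs-dev</param>
```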

Workflow XML Updates

Rice-Provided

The Rice-provided scripts have been copied.  If you are not going to use the new KRMS module yet and do not use eDocLite, the only one you would *need* to run is the 12-06-2011-ComponentMaintenanceDocument-doctype.xml file.

All the scripts are in:

work/db/upgrades/4.1.1_5.0/workflow/rice_provided 

KFS Scripts

The KFS scripts are broken out into eight files:

  • work/db/upgrades/4.1.1_5.0/workflow/purchasing_document_updates.xml
  • work/db/upgrades/4.1.1_5.0/workflow/workflow_attribute_updates.xml
  • work/db/upgrades/4.1.1_5.0/workflow/workflow_document_updates.xml
  • work/db/upgrades/5.0_5.0.1/workflow/purchasing_document_updates.xml
  • work/db/upgrades/5.0_5.0.1/workflow/workflow_attribute_updates.xml
  • work/db/upgrades/5.0_5.0.1/workflow/workflow_document_updates.xml
  • work/db/upgrades/5.0.1_5.0.2/workflow/workflow_attribute_updates.xml
  • work/db/upgrades/5.0.1_5.0.2/workflow/workflow_document_updates.xml

The work/db/upgrades/5.0_5.0.1/workflow/workflow_attribute_updates.xml file contains the updated attribute definitions used by workflow to customize document search.  These *must* be ingested. (This replaces the work/db/upgrades/4.1.1_5.0/workflow/workflow_attribute_updates.xml file.) The work/db/upgrades/5.0.1_5.0.2/workflow/workflow_attribute_updates.xml file contains the updated attribute definition for document search security.  This *must* be ingested as well.

The workflow document update files (work/db/upgrades/4.1.1_5.0/workflow/workflow_document_updates.xml, work/db/upgrades/5.0_5.0.1/workflow/workflow_document_updates.xml, and work/db/upgrades/5.0.1_5.0.2/workflow/workflow_document_updates.xml) contain changes to various KFS documents.  These changes should be reviewed and adapted for your institution as needed.

The purchasing document update files (work/db/upgrades/4.1.1_5.0/workflow/purchasing_document_updates.xml and work/db/upgrades/5.0_5.0.1/workflow/purchasing_document_updates.xml) contain all KFS purchasing transactional documents present in KFS 5.0.  The version in 5.0_5.0.1 replaces the version in 4.1.1_5.0 and contains some additional docType changes.  Because of the changes for the purchasing application document status, every transactional document needed to be updated.  As such, if you are using the purchasing module, you must review and adapt these files to match any changes made by your institution and update the nextAppDocStatus attributes accordingly.  You will also probably have to make changes to the document code itself to manage some of the application status transitions.

DO NOT change the values of any of the application document statuses without first reviewing the new purap code, as those strings are referenced within the application to control certain aspects of the module.

KFS Database Upgrade

Database Structure

There are multiple components to the database upgrade.

  1. Upgrade Rice Client table Structure

  2. Upgrade KFS table structure and add new reference data

  3. Upgrade Maintainable document XML

  4. Upgrade Workflow XML

  5. Upgrade Purchasing documents for application document status changes

In some cases, the changes cannot be made on the existing tables or are destructive in nature.  In these cases, the table is copied to a table named <original name>_BKUP (truncated if necessary) and left behind for later review and deletion.

Rice Client Tables

The changes to the rice client tables are fairly minimal and are run via the following Liquibase scripts:

work/db/upgrades/4.1.1_5.0/rice-client-script.xml
work/db/upgrades/5.0.1_5.0.2/rice-client-script.xml

The SQL scripts in these directories were copied from the Rice project and are leveraged by the rice-client-script.xml Liquibase scripts.

KFS Tables

The KFS changes are broken into three master scripts in each upgrade directory, each of which runs a series of other scripts with changes specific to each module:

work/db/upgrades/4.1.1_5.0/db 
  • master-structure-script.xml
  • master-constraint-script.xml
  • master-data-script.xml
work/db/upgrades/5.0_5.0.1/db
  • master-structure-script.xml
  • master-constraint-script.xml (empty - no changes)
  • master-data-script.xml (empty - no changes)
work/db/upgrades/5.0.1_5.0.2/db
  • master-structure-script.xml
  • master-constraint-script.xml (empty - no changes)
  • master-data-script.xml (empty - no changes)

Data Updates

New Reference Data

Capital Asset Module

There are two new tables in the CAM module with data which should be reviewed.

  • CM_AST_PMT_DST_CD_T : Asset Payment Distribution Type Code

    • The primary values in this table are hard-coded into the application, but the descriptions can be changed.

  • CM_AST_PMT_DOC_TYP_T : Asset Payment Document Type

    • This table contains the list of “eligible” document types for asset payments.  It is used to provide a more manageable list of document types when using the lookups on the Asset Payment or Asset Global documents.

    • This table’s data should be modified to match the documents in use at your institution.

Labor Distribution

  • LD_LBR_BFT_RT_CAT_T : Labor Benefit Rate Category

Contains only the default value of “--”.  This row must be inserted, as it is referenced as the default value in the LD_BENEFITS_CALC_T and CA_ACCOUNT_T tables.

Pre-Disbursement Processor

  • PDP_PMT_CHG_CD_T : Payment Change Code

    • Addition of new value: “RC” for the Reissue/Cancel action.  The “RC” value is hard-coded into the application and must not be changed.

Purchasing / Accounts Payable

  • PUR_COMM_T : Purchasing Commodity Code

    • Addition of a new “default” commodity code (99200000) for unordered items received from the vendor.

    • This commodity code can be changed to match an institution’s commodity code set.  It is defined in the new UNORDERED_ITEM_DEFAULT_COMMODITY_CODE system parameter.

Historical Maintenance Document Conversion

The Rice upgrade has introduced some potential changes to the structure of the XML used to store maintenance documents.  All maintenance documents are stored in a local database table (KRNS_MAINT_DOC_T).  If you do not perform this upgrade, it may not be possible to open maintenance documents created prior to the Rice 2.0 upgrade.  (If you are running a standalone Rice instance, there is also a copy of this table in the Rice server database which will need to be upgraded; it, however, contains only Rice documents such as parameters and roles.)

The Rice team has provided some information on this process under the “Maintainable XML” heading in the release notes linked above.  This conversion must be run *after* the application upgrade but before you bring the system up for use.

Cash Document Structure

NOTE: Unlike other data updates, this one is performed in the work/db/upgrades/4.1.1_5.0/db/01_structure/fp-module-structure-updates.xml script, because the data updates are intertwined with table relationship changes.  These changes affect the following tables:
  • FP_DEPOSIT_HDR_T

  • FP_COIN_DTL_T

  • FP_CHECK_DTL_T

  • FP_CURRENCY_DTL_T

  • FP_CASH_RCPT_DOC_T

Summary: 

  1. A cashier status code (CSHR_STAT_CD) was added to each of the detail tables above to replace the “FDOC_COLUMN_TYP_CD” column.  The upgrade uses the existing value in that field to set the new CSHR_STAT_CD value *if* the FDOC_TYP_CD on the record is “CR” or “CM”.  Otherwise, it leaves the new column at its default value of “X”.

    1. Exception to the above:  For the “CM” document type in the FP_CHECK_DTL_T table, the FDOC_COLUMN_TYP_CD of “I” is converted to an “R” in the CSHR_STAT_CD column.

  2. The FDOC_TYP_CD of “CM” is changed to “CMD” to match the document type code.

  3. The FDOC_COLUMN_TYP_CD is then dropped from the three detail tables.  This requires the dropping and rebuilding of the primary and foreign key relationships between the tables.

ALERT: The update logic DELETES records from the FP_CURRENCY_DTL_T and FP_COIN_DTL_T tables where the FDOC_TYP_CD is “CH” or “BCS”.  These records appear to be redundant under the new implementation, but you should make backups of these tables before running the upgrade.

NOTE: The MySQL script that moves the ICR accounts from the CA_ACCOUNT_T table to the new CA_ICR_ACCT_T table does not run cleanly through Liquibase and has been commented out.  Review the coa-module-data-updates.xml script, then adapt and run the conversion script manually.

Capital Asset Total Costs Update

A pair of statements in the

work/db/upgrades/4.1.1_5.0/db/04_optional/cam-module-data-updates.xml

file recalculates Capital Asset Total Costs that may have been corrupted by a since-corrected bug in the CAMS locking mechanism.  Review and execute these statements if you have previously implemented the CAMS module.

Purchasing Application Document Status

As part of the purchasing document status change, the “codes” in the purchasing tables need to be translated into workflow “Application Document Statuses”.  Because the workflow tables and KFS tables could be in different databases, a simple database script could not be provided.  Instead, the KFS team developed a conversion program in the form of a batch step which can be run:

org.kuali.kfs.module.purap.batch.MigratePurapStatCodeToWorkflowDocumentStep

Since this step is meant to be run only once, it is not included in the out-of-the-box Spring configuration.  However, it needs to be wired up before it can be run by the BatchStepRunner.  See the attached spring-purap.xml file for an example of how to do that.
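A minimal wiring sketch is shown below; the bean id and the parent bean name "step" are assumptions based on how KFS batch steps are typically defined, so refer to the attached spring-purap.xml for the authoritative configuration:

```xml
<!-- Illustrative wiring only: the bean id and parent="step" are assumptions;
     see the attached spring-purap.xml for the actual configuration. -->
<bean id="migratePurapStatCodeToWorkflowDocumentStep"
      parent="step"
      class="org.kuali.kfs.module.purap.batch.MigratePurapStatCodeToWorkflowDocumentStep" />
```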

This step, when run, iterates over all purchasing documents one at a time, converting each code into the new status *via the values in the related code table*.  For requisitions, for example, this is PUR_REQS_STAT_T.  The names in this table become the new Application Document Statuses stored in the workflow header table.  So, if you have changed any of these labels, you *must* make matching changes in the workflow XML you ingest for that document type.

Please note that you must ingest the new workflow XML for the purchasing documents before running the batch step above.

Also, since the status fields are gone, the method of updating the status has changed.  New methods have been added to FinancialSystemTransactionalDocumentBase:

  • getApplicationDocumentStatus()

  • setApplicationDocumentStatus(String applicationDocumentStatus)

  • updateAndSaveAppDocStatus(String applicationDocumentStatus)

Update Institution Customizations

Install Groovy

The automated conversion program runs using the Groovy scripting language.  The instructions below assume that you have Groovy installed and on your path.

You can download a Groovy installer from: http://groovy.codehaus.org/ 

Run Against Local Customizations

It is important that you do not point this application at a directory containing the baseline code.  The conversion program is a "run-once" application, as there are multiple layers of conversions; do not run it more than once over the same files.

The script is:

kfs/work/db/upgrades/4.1.1_5.0/rice-20-code-upgrade-scripts/UpdateRiceReferences.groovy

run it from the

kfs/work/db/upgrades/4.1.1_5.0/rice-20-code-upgrade-scripts 

directory as:

groovy UpdateRiceReferences.groovy /path/to/your/custom/source

where the argument points to the directory containing your customization files (Java source, XML, and JSP/TAG files).  If these are in separate locations, the script can be run against each individually.

The resulting code will not compile in many places; the script handles about 95% of what would otherwise be manual conversions.  It converts classes and packages, and also converts some of the KEW and KIM API calls which have changed.

It does not convert EBO references to Rice objects like Campus, Country, State, and PostalCode.  These conversions must be done manually, as described below.

We have outlined a number of the major change areas below.  However, IU created a fairly detailed document during their conversion, available here:

Kuali Rice 2.0 Upgrade Notes - Indiana University

Update Rice ExternalizableBusinessObject Usage

Due to the way that the “location” objects of Country, State, PostalCode and Campus were named, it was not possible for the script to rename these properly.  Additionally, the one-line service method KFS depended on for refreshing these objects when necessary has been removed in Rice 2.0.  This has resulted in much more verbose (though probably more efficient and correct) getter methods for these ExternalizableBusinessObjects.

First, the object names have changed.  While the previous names exist, they are no longer of a data type which can be used with the Data Dictionary.  So, any place where these objects are being used on a document, they must be changed into their “EBO” form by adding “Ebo” to the end of the class name.  (E.g., Campus → CampusEbo)

Second, the getters need to be altered in any business objects which have these defined as related objects.

Getter: CampusEbo

 

public CampusEbo getOrganizationPhysicalCampus() {
    if ( StringUtils.isBlank(organizationPhysicalCampusCode) ) {
        organizationPhysicalCampus = null;
    } else {
        if ( organizationPhysicalCampus == null || !StringUtils.equals( organizationPhysicalCampus.getCode(), organizationPhysicalCampusCode) ) {
            ModuleService moduleService = SpringContext.getBean(KualiModuleService.class).getResponsibleModuleService(CampusEbo.class);
            if ( moduleService != null ) {
                Map<String,Object> keys = new HashMap<String, Object>(1);
                keys.put(LocationConstants.PrimaryKeyConstants.CODE, organizationPhysicalCampusCode);
                organizationPhysicalCampus = moduleService.getExternalizableBusinessObject(CampusEbo.class, keys);
            } else {
                throw new RuntimeException( "CONFIGURATION ERROR: No responsible module found for EBO class.  Unable to proceed." );
            }
        }
    }
    return organizationPhysicalCampus;
}

 

Getter: CountryEbo

 

public CountryEbo getOrganizationCountry() {
    if ( StringUtils.isBlank(organizationCountryCode) ) {
        organizationCountry = null;
    } else {
        if ( organizationCountry == null || !StringUtils.equals( organizationCountry.getCode(), organizationCountryCode) ) {
            ModuleService moduleService = SpringContext.getBean(KualiModuleService.class).getResponsibleModuleService(CountryEbo.class);
            if ( moduleService != null ) {
                Map<String,Object> keys = new HashMap<String, Object>(1);
                keys.put(LocationConstants.PrimaryKeyConstants.CODE, organizationCountryCode);
                organizationCountry = moduleService.getExternalizableBusinessObject(CountryEbo.class, keys);
            } else {
                throw new RuntimeException( "CONFIGURATION ERROR: No responsible module found for EBO class.  Unable to proceed." );
            }
        }
    }
    return organizationCountry;
}

 

Getter: StateEbo

 

public StateEbo getAccountState() {
    if ( StringUtils.isBlank(accountStateCode) || StringUtils.isBlank(accountCountryCode) ) {
        accountState = null;
    } else {
        if ( accountState == null || !StringUtils.equals( accountState.getCode(), accountStateCode) || !StringUtils.equals(accountState.getCountryCode(), accountCountryCode ) ) {
            ModuleService moduleService = SpringContext.getBean(KualiModuleService.class).getResponsibleModuleService(StateEbo.class);
            if ( moduleService != null ) {
                Map<String,Object> keys = new HashMap<String, Object>(2);
                keys.put(LocationConstants.PrimaryKeyConstants.COUNTRY_CODE, accountCountryCode);
                keys.put(LocationConstants.PrimaryKeyConstants.CODE, accountStateCode);
                accountState = moduleService.getExternalizableBusinessObject(StateEbo.class, keys);
            } else {
                throw new RuntimeException( "CONFIGURATION ERROR: No responsible module found for EBO class.  Unable to proceed." );
            }
        }
    }
    return accountState;
}

 

Getter: PostalCodeEbo

 

public PostalCodeEbo getPostalZip() {
    if ( StringUtils.isBlank(organizationZipCode) || StringUtils.isBlank(organizationCountryCode) ) {
        postalZip = null;
    } else {
        if ( postalZip == null || !StringUtils.equals( postalZip.getCode(), organizationZipCode) || !StringUtils.equals(postalZip.getCountryCode(), organizationCountryCode ) ) {
            ModuleService moduleService = SpringContext.getBean(KualiModuleService.class).getResponsibleModuleService(PostalCodeEbo.class);
            if ( moduleService != null ) {
                Map<String,Object> keys = new HashMap<String, Object>(2);
                keys.put(LocationConstants.PrimaryKeyConstants.COUNTRY_CODE, organizationCountryCode);
                keys.put(LocationConstants.PrimaryKeyConstants.CODE, organizationZipCode);
                postalZip = moduleService.getExternalizableBusinessObject(PostalCodeEbo.class, keys);
            } else {
                throw new RuntimeException( "CONFIGURATION ERROR: No responsible module found for EBO class.  Unable to proceed." );
            }
        }
    }
    return postalZip;
}

 

Third, the property names of the primary key (and other) fields within these objects have changed in Rice 2.0.  You will need to change the “relationshipDefinitions” in your business object data dictionary files to match the new (simplified) names.

 

CampusEbo
  • campusCode → code
  • campusName → name

CountryEbo
  • postalCountryCode → code
  • postalCountryName → name

StateEbo
  • postalCountryCode → countryCode
  • postalStateCode → code
  • postalStateName → name

PostalCodeEbo
  • postalCountryCode → countryCode
  • postalStateCode → stateCode
  • postalCode → code
  • postalCityName → cityName
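As an illustration of the renames above, a relationshipDefinition referencing CountryEbo would change its target attribute from postalCountryCode to code.  The sketch below uses the standard KNS data dictionary bean parents; the source attribute name is a hypothetical example:

```xml
<!-- Hypothetical relationship on a business object referencing CountryEbo;
     the target name changes from "postalCountryCode" to "code". -->
<bean parent="RelationshipDefinition">
    <property name="objectAttributeName" value="organizationCountry"/>
    <property name="primitiveAttributes">
        <list>
            <bean parent="PrimitiveAttributeDefinition"
                  p:sourceName="organizationCountryCode" p:targetName="code"/>
        </list>
    </property>
</bean>
```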

 

Impacting API Changes

Parameter Service

The ParameterService changed significantly, especially with regard to updating parameters.  (Most code should not be updating parameters anyway.)  In addition, some methods have changed their meaning and/or return type.

getParameterValues() returns Collection<String> instead of List<String>

In Rice 2.x, all of the calls which retrieved lists of parameters now return Collections.  For the most part, no code in KFS actually needed the results to be Lists, but the local variables which held the values were declared as such.  It is not safe to simply cast these results to List; redeclaring the variables as Collection<String> resolves this API change.
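The hazard can be seen with plain JDK collections.  The sketch below is not KFS or Rice code; the getParameterValues method is a stand-in that merely mimics the new API shape:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.LinkedHashSet;
import java.util.List;

public class CollectionVsList {
    // Stand-in for the Rice 2.x shape: the returned Collection is not
    // necessarily backed by a List, so casting it to List is unsafe.
    public static Collection<String> getParameterValues() {
        return Collections.unmodifiableCollection(
                new LinkedHashSet<String>(Arrays.asList("A", "B", "C")));
    }

    public static void main(String[] args) {
        Collection<String> values = getParameterValues();
        // List<String> bad = (List<String>) values; // would throw ClassCastException
        List<String> ok = new ArrayList<String>(values); // safe: copy instead of cast
        System.out.println(ok); // prints [A, B, C]
    }
}
```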

getParameterXxxx() no longer throws an exception if the parameter does not exist

In Rice 1.x applications, it was fairly common to call parameterExists() before retrieving a parameter to prevent exceptions from being thrown.  In Rice 2.x, parameter retrieval returns null if the parameter does not exist.  The parameterExists() method still exists but is now largely unnecessary.

Requests for "sub-parameters" are now explicit

In Rice 1.x, the only distinction between normal list parameters and parameters which contained sub-lists of values (e.g., EX=1234,5678;IN=9870) was the presence of an additional "constrainingValue" parameter to the method call.

In Rice 2.x, that behavior must be invoked explicitly by calling getSubParameterValue[s]AsString().

In addition, the non-sub-parameter APIs now take an additional parameter with a default value to return if the parameter does not exist.  The main problem is that the conversion script cannot tell the difference, so queries for sub-parameters will be mapped to this (incorrect) API.  All uses of the ParameterService must therefore be reviewed for parameters which contain sub-parameters.
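To make the sub-parameter format concrete, the sketch below parses a string such as "EX=1234,5678;IN=9870" into its constraining values.  This is purely illustrative of the data format and is not the Rice implementation:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SubParameterDemo {
    // Splits a sub-parameter string like "EX=1234,5678;IN=9870" into a map
    // keyed by the constraining value, e.g. {EX=[1234, 5678], IN=[9870]}.
    public static Map<String, String[]> parseSubParameters(String raw) {
        Map<String, String[]> result = new LinkedHashMap<String, String[]>();
        for (String entry : raw.split(";")) {
            String[] parts = entry.split("=", 2);  // constraining value, then its list
            result.put(parts[0], parts[1].split(","));
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, String[]> subs = parseSubParameters("EX=1234,5678;IN=9870");
        System.out.println(String.join(",", subs.get("EX"))); // prints 1234,5678
    }
}
```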

Removal of ParameterEvaluator API

The ParameterEvaluator methods have been removed from the ParameterService and are no longer supported by Rice; however, the methods used by KFS have been moved to the ParameterEvaluatorService.  The Groovy conversion script attempts to adjust usage of these methods to the new API but does not catch them all.  The API methods work the same as before; only the service reference needs to be updated.

KFS Spring XML Changes

Kuali Service Bus Exporting

In Rice 2.0, significant changes were made to the Kuali Service Bus.  The main one which will affect implementors is the change to how services are exported.  In Rice 1.x, there was a bean called KSBServiceExporter.  It is now gone, replaced by multiple service exporter beans, depending on the type of service being exported.

Exporting Callback Services

The main types exported by KFS were the KIM type services.  These use a special type of exporter called the "CallbackServiceExporter"; a number of examples are shown below.  For more, all the exported beans have been moved into explicit files for each module, named spring-xxx-bus-exports.xml.  The main change from the Rice 1.x exporters is that the interface being implemented *must* be specified as part of the definition.

 

<bean class="org.kuali.rice.ksb.api.bus.support.CallbackServiceExporter" p:serviceBus-ref="rice.ksb.serviceBus"
    p:callbackService-ref="financialSystemDocumentTypePermissionTypeService"
    p:localServiceName="financialSystemDocumentTypePermissionTypeService"
    p:serviceInterface="org.kuali.rice.kim.framework.permission.PermissionTypeService" />
 
<bean class="org.kuali.rice.ksb.api.bus.support.CallbackServiceExporter"
    p:serviceBus-ref="rice.ksb.serviceBus"
    p:callbackService-ref="financialSystemUserRoleTypeService"
    p:localServiceName="financialSystemUserRoleTypeService"
    p:serviceInterface="org.kuali.rice.kim.framework.role.RoleTypeService" />
 
<bean class="org.kuali.rice.ksb.api.bus.support.CallbackServiceExporter"
    p:serviceBus-ref="rice.ksb.serviceBus"
    p:callbackService-ref="KFSDocumentSearchCustomizer"
    p:localServiceName="KFSDocumentSearchCustomizer"
    p:serviceInterface="org.kuali.rice.kew.framework.document.search.DocumentSearchCustomizer" />
 
<bean class="org.kuali.rice.ksb.api.bus.support.CallbackServiceExporter"
    p:serviceBus-ref="rice.ksb.serviceBus"
    p:callbackService-ref="FinancialSystemSearchableAttribute"
    p:localServiceName="FinancialSystemSearchableAttribute"
    p:serviceInterface="org.kuali.rice.kew.framework.document.attribute.SearchableAttribute" />

 

Exporting Non-Callback Services

If you are exporting new services for consumption by other applications, you will use a different exporter class:

  • org.kuali.rice.ksb.api.bus.support.ServiceBusExporter

  • org.kuali.rice.ksb.api.bus.support.PropertyConditionalServiceBusExporter

These have a slightly different set of properties than the exporter above.  The example below uses the second type; the only difference between the two is the presence of the “exportIf” property.

 

<bean class="org.kuali.rice.ksb.api.bus.support.PropertyConditionalServiceBusExporter" p:serviceBus-ref="rice.ksb.serviceBus">
    <property name="serviceDefinition">
        <bean class="org.kuali.rice.ksb.api.bus.support.SoapServiceDefinition"
            p:localServiceName="awardInterfaceService"
            p:service-ref="awardInterfaceService"
            p:jaxWsService="true"
            p:serviceInterface="edu.sampleu.kfs.module.cg.service.AwardInterfaceService" />
    </property>
    <property name="exportIf">
        <list>
            <value>awardInterfaceServiceSOAP.expose</value>
        </list>
    </property>
</bean>

 

Specification of Key Fields on DD Relationships

When you set up relationships in the data dictionary (as is needed for ExternalizableBusinessObjects), you *must* specify all the primary key fields for the relationship to take effect.  Otherwise, the framework will silently ignore the relationship and you will not get the automatic quickfinders or inquiry links.

This should normally be the case, but in a number of places in the baseline system the country code was left off the postal code relationships (and “US” was assumed for the country code).  The relationship also requires that a field be present for each key.  So, if you have postal code relationships on business objects that do not have a country code field, you will need to add one as a property, but it can be hard-coded.  See Account.java/Account.xml for an example of this.
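A sketch of a postal code relationship carrying both key fields follows; the bean parents are the standard KNS data dictionary ones, and the attribute names are hypothetical:

```xml
<!-- Hypothetical postal code relationship: both primary key fields
     (countryCode and code) must be mapped for it to take effect. -->
<bean parent="RelationshipDefinition">
    <property name="objectAttributeName" value="postalZip"/>
    <property name="primitiveAttributes">
        <list>
            <bean parent="PrimitiveAttributeDefinition"
                  p:sourceName="countryCode" p:targetName="countryCode"/>
            <bean parent="PrimitiveAttributeDefinition"
                  p:sourceName="postalZipCode" p:targetName="code"/>
        </list>
    </property>
</bean>
```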

Application/Tomcat Server Changes

Tomcat Version

  • KFS 4.1.1 runs under Tomcat 5.5 or Tomcat 6.0.

  • Rice 2.0 requires Tomcat 6.0 or 7.0.

  • KFS 5.0.2 is untested with Tomcat 7.0.

Therefore, we recommend running under Tomcat 6.0, as other combinations are untested at this time.

Tomcat Libraries

KFS deployment instructions required that some jar files be copied into the tomcat/common/lib (Tomcat 5.5) or tomcat/lib (Tomcat 6) directories, as they were not part of the WAR file.  These libraries need to be upgraded to the current versions in the build/external/appserver directory.

 

Old Library                   New Library
commons-cli-1.0.jar           (same)
commons-logging-1.1.jar       (same)
connector-1_5.jar             connector-api-1.5.jar
howl.jar                      howl-1.0.1-1.jar
jotm-2.0.10.jar               jotm-core-2.1.9.jar
jotm_iiop_stubs.jar           (remove)
jotm_jrmp_stubs.jar           (remove)
jta-spec1_0_1.jar             jta-1.1.jar
objectweb-datasource.jar      carol-interceptors-1.0.1.jar
ow_carol.jar                  carol-3.0.6.jar
p6spy-1.3-patched.jar         (same)
xapool-1.5.0-patch3.jar       (same)

Startup Properties

There are no new required configuration properties for Tomcat.  There is an optional configuration property which specifies a file (outside the KFS WAR file) from which KFS will read configuration properties at runtime:

-Dadditional.kfs.config.locations=PATH_TO_CONFIG_FILE

In addition, you may want to review the JVM memory settings, as the memory requirements have gone up.  For a server supporting any number of concurrent users, you will want at least 4g of heap memory and a MaxPermSize of 512m.
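For example, on Tomcat these settings might be supplied via CATALINA_OPTS; the config file path below is a placeholder, not a required location:

```shell
# Example JVM settings for Tomcat 6 (values from the guidance above;
# the properties file path is hypothetical).
export CATALINA_OPTS="-Xmx4g -XX:MaxPermSize=512m \
  -Dadditional.kfs.config.locations=/opt/kfs/kfs-config.properties"
```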

Running Liquibase Scripts

Liquibase is a Java program designed for applying changes to a database.  It has the advantage of being cross-platform as well as tracking previously applied changes.  That is, if you use Liquibase to run your upgrade process and some part of it fails, it will detect the portions which ran successfully and will not run them again after you correct the problem and re-execute.

Run Changes Directly into Database

The easiest way to run Liquibase scripts is to copy the liquibase-sample.properties file to liquibase.properties and fill in the appropriate database information.  Then run the runlog.sh script (or the command within it), passing in the location of the file you want to execute.

java -jar liquibase-2.0.5.jar --logLevel=finest --changeLogFile=rice/kns_upgrade.xml update
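A liquibase.properties file for this setup might look like the following; the connection values shown are placeholders, not real settings:

```properties
# Illustrative liquibase.properties (Oracle example; substitute your own
# driver, connection URL, and credentials)
driver: oracle.jdbc.OracleDriver
classpath: ojdbc6.jar
url: jdbc:oracle:thin:@dbhost:1521:KFS
username: kfsuser
password: changeme
```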

Run Changes to SQL Script

Another nice feature of Liquibase is that it can simply generate the SQL for you, if you do not have a DBA who will run Liquibase scripts against your databases.  The command below runs the script, except that it dumps the SQL commands to the console instead of executing them.  (Note that you still need a database to run against, but no updates will be applied.)

java -jar liquibase-2.0.5.jar --logLevel=finest --changeLogFile=rice/kns_upgrade.xml updateSQL

All the paths in the scripts assume that you have changed into the directory of the script (.xml file) before running the tool.

For more info on Liquibase changelogs and changesets, see http://www.liquibase.org/manual/overview

Liquibase Version Upgrade

For KFS 5.x, we have upgraded from Liquibase 1.9.5 to 2.0.5.  Running changes directly against the database may require you to add columns to your DATABASECHANGELOG table.  See the Liquibase 1.x to 2.0 Upgrade Guide for more details.