IBM i (AS400, iSeries) application performance tuning software
Tips & Techniques

How many customers or end-users recommend purging historical transaction data to improve system performance?

Answer: NONE!

Keeping 12 years of closed order history in your 'Open Order' database file is affecting system performance and hardware expenditures in ways that you can't even imagine.

Allowing users to query or run SQL over a General Ledger transaction history file containing 7 years of old transactions consumes enormous system resources without giving those users acceptable response times.

Tips & Techniques

Don't purge data, MOVE data!

By creating a simple procedure that runs nightly on your system(s), you can build a data extraction process that tremendously improves the performance of queries against your transaction history files.

Set up a job that simply extracts today's data or this month's data into a separate physical file object on your system. Even extracting the current year's data would be a big improvement. Teach your query and SQL users how to point their requests for recent data to this new database file. If they need access to older transactions, show them how to point their queries back to the original file.
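For illustration, here is a minimal sketch of such a nightly procedure in CL, assuming a release that includes the IBM RUNSQL command (an RPG or embedded SQL program can do the same job on older releases). ORDHIST, ORDCUR, ORDDATE and the APPDTA library are all placeholder names - substitute your own:

    PGM
    /* Clear yesterday's extract, then re-select this month's rows */
    CLRPFM     FILE(APPDTA/ORDCUR)
    RUNSQL     SQL('INSERT INTO APPDTA.ORDCUR +
                    SELECT * FROM APPDTA.ORDHIST +
                    WHERE ORDDATE >= CURRENT DATE - +
                          (DAY(CURRENT DATE) - 1) DAYS') +
               COMMIT(*NONE) NAMING(*SQL)
    ENDPGM

The date arithmetic resolves to the first day of the current month, so the extract always holds month-to-date transactions.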

Perhaps your end-users don't access query tools or SQL directly. In that case, some minor CL programming changes can usually toggle them back and forth between these two database files.

The performance of any application can usually be directly attributed to data volumes. The worst-performing application in the world will perform acceptably over small datasets. Regardless of whether you are using SQL, RPG or COBOL for end-user inquiries - the less data there is to process, the better the performance and response time.

Of course, good database tuning remains critically important. Make sure both of these files are properly indexed.
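For example, matching SQL indexes over both files might look like this (again assuming RUNSQL; the index names and key columns are illustrative only - key on the columns your users actually select and sort by):

    RUNSQL     SQL('CREATE INDEX APPDTA.ORDCUR_IX +
                    ON APPDTA.ORDCUR (ORDDATE, CUSTNO)') +
               COMMIT(*NONE) NAMING(*SQL)
    RUNSQL     SQL('CREATE INDEX APPDTA.ORDHIST_IX +
                    ON APPDTA.ORDHIST (ORDDATE, CUSTNO)') +
               COMMIT(*NONE) NAMING(*SQL)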

Frequently Asked Questions

How can I do any of this with purchased software and no access to source code?

You can easily create this data extraction process with the IBM commands WRKQRY or STRSQL, or, as a last resort, with the CPYF command using record selection (INCREL) criteria. Use the WRKJOBSCDE command to schedule the job to run on a nightly, weekly or monthly basis, depending upon your users' need for 'current' data.
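As a sketch using only those IBM-supplied commands - every object name, field name and year below is a placeholder - a current-year extract and its nightly schedule entry might look like this:

    /* CPYF's INCREL parameter does the record selection - here,  */
    /* this year's orders                                         */
    CPYF       FROMFILE(APPDTA/ORDHIST) TOFILE(CURDTA/ORDHIST) +
               MBROPT(*REPLACE) INCREL((*IF ORDYEAR *EQ 2024))

    /* Run it nightly at 2 AM, assuming the CPYF is wrapped in a  */
    /* CL program named MYLIB/EXTRACT; WRKJOBSCDE shows the entry */
    ADDJOBSCDE JOB(EXTRACT) CMD(CALL PGM(MYLIB/EXTRACT)) +
               FRQ(*WEEKLY) SCDDAY(*ALL) SCDTIME(020000)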

To get your existing application to point to the 'current' data versus the 'old' data, you would be amazed at what you can accomplish with simple 'Library List' changes.

Create the new object in a new library, keeping the object name the same. Make sure to duplicate all indexes or logical files into this new library. Then just 'toggle' the library list to switch between datasets.
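A sketch of that toggle, assuming hypothetical libraries CURDTA (the extract), HSTDTA (the full history) and APPPGM (your application programs):

    /* Point the job at the current extract...                    */
    CHGLIBL    LIBL(CURDTA APPPGM QGPL QTEMP)

    /* ...or swap a single library without rebuilding the list    */
    RMVLIBLE   LIB(CURDTA)
    ADDLIBLE   LIB(HSTDTA) POSITION(*FIRST)

Because the object names are identical in both libraries, existing programs and queries simply resolve to whichever dataset sits first in the library list - no source code changes required.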

How do we identify the transaction history files that would most benefit from these techniques?

An easy approach is to use the IBM command DSPFD with the *MBR option to create an output file containing information for your most business-critical database libraries. Using a query tool, you can then analyze this output file and the 'Activity Statistics' for all of the physical files used by your application software.
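For example (APPDTA and MBRSTATS are placeholder names):

    /* Member-level statistics for every physical file in APPDTA  */
    DSPFD      FILE(APPDTA/*ALL) TYPE(*MBR) OUTPUT(*OUTFILE) +
               FILEATR(*PF) OUTFILE(QTEMP/MBRSTATS)

The output file is created in the layout of IBM model file QSYS/QAFDMBR; sort it descending on the logical reads column to rank your busiest files.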

The information of highest importance is the data regarding 'Logical Reads'. The database files on your system with the most logical reads are likely some of the largest transaction history files and the most queried files - with the poorest database and SQL tuning.

For existing Workload Performance Series customers, our Disk Navigator tool automates this process of analyzing 'Disk Activity Statistics'.

Another approach, for environments with extensive SQL-based application software, would be to use the IBM command STRDBMON. This command starts the IBM 'Database Monitor', which captures every query or SQL request that runs on your system. The data is written to an output file that can be queried for analysis purposes. The ENDDBMON command stops the operating system process and leaves you with lots of data to analyze.

Using a query tool, you can retrieve various kinds of information about the queries and SQL running on your system. Looking at the 'Index Advised' information is of critical importance. Analyzing the SQL 'Elapsed Time' statistics can be of enormous value as well.
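A sketch of that collection cycle (DBMONDTA is a placeholder name; keep the collection window short on busy production systems):

    /* Monitor every job on the system...                         */
    STRDBMON   OUTFILE(QGPL/DBMONDTA) JOB(*ALL)

    /* ...let a representative workload run, then stop it         */
    ENDDBMON   JOB(*ALL)

The output file uses the layout of IBM model file QSYS/QAQQDBMN; rows where the 'index advised' column (QQIDXA) contains 'Y' point directly at candidate indexes.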

Our Query Optimizer tool automates this data collection and analysis process - with built-in controls to ensure that the negative impact of running the STRDBMON command on Production systems is minimized.

New Features

The Workload Performance Series software contains many built-in features for effectively dealing with its own database purge and archive needs. As an application performance analysis and tuning toolset, our software needs to set a good example, and ensuring that it does is a top priority.

Our automated 'Database Maintenance' features give you the ability to customize data retention for each module within our software. You can define the number of days of data to retain for each of our tools, or set the retention to *NOMAX if desired. These features also track the amount of DASD used, the number of records and the number of data samples.

A nightly job, run via our 'Automated Scheduling' features, uses the pre-defined data retention table and performs the actual data purge. As an additional measure to ensure that DASD usage stays under control, all database files within our software are created with 'Reuse Deleted Records' set to *YES.
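Purely as a hypothetical sketch of the kind of retention-driven purge such a job performs - every name below is invented, and the real job reads the retention days per module from its retention table:

    /* Delete samples older than the retention period (90 is just */
    /* an example value)                                          */
    RUNSQL     SQL('DELETE FROM TOOLDTA.SAMPLES +
                    WHERE SMPDATE < CURRENT DATE - 90 DAYS') +
               COMMIT(*NONE) NAMING(*SQL)

    /* 'Reuse Deleted Records' lets later inserts reclaim space   */
    CHGPF      FILE(TOOLDTA/SAMPLES) REUSEDLT(*YES)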

The automated dynamic creation and purging of 'Database Views' within our software is likely one of its most unique and sophisticated features. Throughout the software, you can dynamically decide as a user how you want to see and access the data. You can customize how the data gets summarized. The columns of data that appear within the online inquiries, and in the generated PDFs and Microsoft Excel spreadsheets, are completely under your control.

Unlike SQL, WRKQRY and the OPNQRYF command, our software dynamically builds permanent logical files based on user selections and behavior. These new indexes on the system are automatically deleted via our 'Database Maintenance' features based on the data retention that you define for our 'VIEWS' module. Permanent logical files get created as needed and get deleted if unused for a pre-defined number of days.

The performance implications of dynamic SQL running over years' worth of historical data - without proper purging, archiving and database tuning - are significant.

We have a special offer for our existing Workload Performance Series customers. Use our Disk Navigator and Query Optimizer tools, as mentioned earlier, to identify the transaction history files that most significantly impact the performance and capacity of your system(s).

Once these files are identified, contact us via phone or email. We would like to discuss your specific database purge and archive needs. Consider having MB Software custom build an interface and application which would provide you with all of the features and functionality of our software - but for your transaction history data.

The custom-built, OS/400 trigger-based or import-file-based interface will be provided at no additional charge. LPAR-based licensing and annual software maintenance fees will apply based on your specific needs.

Special Offer

We invite you to visit our web site at www.mb-software.com and take us up on our Free 30 Day Trial offer. Existing Workload Performance Series customers: consider our special offer to custom build an interface and application that resolves many of your database purge & archive needs.