DB2 Forum 2018 First Quarter Meeting
Wednesday, February 14, 2018
8:30 a.m. - 12:30 p.m.
The Original Pancake House
1505 William D Tate Ave.
** Directions to The Original Pancake House **
The Original Pancake House is in Grapevine on William D Tate Ave, just south of 114 and east of 121. It
is between Buffalo Wild Wings and Freebirds World Burrito, across the street from Classic Chevrolet.
Meeting Cost: $10 for non-members, free to members. Membership is $25 per
year per person --or-- $100 for 5 people in a company --or-- $350 for
unlimited people in a company. You can register at the meeting.
This meeting is sponsored by
- SEGUS, Inc.
Compliance with Compliments! Viable Db2 z/OS Workload Tracking
Audit and compliance are needs that many companies want, and have, to fulfill. There are different ways and tools
that promise to meet them, but what can they really do, and what are the associated costs?
This presentation introduces Db2 technology exploitation that captures every DML, DDL, and DCL statement executed
in a Db2 environment, along with identification details. Learn how you can run audit analytics against a long-term
repository, pinpointing who executed a query, when, and from where. Analyze your entire workload to
understand access patterns and abnormalities.
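As a sketch of the kind of audit analytics described above, a long-term repository could be queried along these lines. The table and column names here are hypothetical illustrations, not the actual SEGUS repository schema:

```sql
-- Hypothetical audit-repository table; real names depend on the product.
-- Who ran DDL against EMPLOYEE-related objects in the last 7 days, and from where?
SELECT EXEC_USER,          -- authorization ID that ran the statement
       CLIENT_WRKSTNNAME,  -- workstation the request came from
       EXEC_TIMESTAMP,     -- when it was executed
       STMT_TEXT           -- the captured SQL text
  FROM AUDIT.WORKLOAD_REPOSITORY
 WHERE STMT_TYPE IN ('CREATE', 'ALTER', 'DROP')
   AND STMT_TEXT LIKE '%EMPLOYEE%'
   AND EXEC_TIMESTAMP > CURRENT TIMESTAMP - 7 DAYS
 ORDER BY EXEC_TIMESTAMP DESC;
```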
Roy Boxwell has more than 31 years of experience in MVS, OS/390, and z/OS environments – 29 of those in Db2.
He specializes in installation, migration, and performance monitoring and tuning. Roy leads the SEG development
team at SEGUS responsible for the real-time database maintenance solutions. He is also an active participant,
speaker, and contributor on the IDUG Db2 Listserv and sends out a monthly Db2 z/OS newsletter.
Roy Boxwell - SEGUS, Inc.
Db2 12 Continuous Delivery – New Challenges for Deployment
Fundamental changes in the Db2 z world often lead to concerns. Let’s face it – some changes force us to change!
While a Db2 version migration usually took months, or even years, there will be no new Db2 version after 12, only
continuous code drops. This will have a tremendous impact on migration strategies, because we have to find a
reliable way to test these code deliveries in a fraction of the time. If we succeed, business divisions will become
enthused at how quickly new technology becomes available for new applications. This presentation describes the
differences between Code, Catalog, Function, and Application Levels, how you can control them, and how you can
fall back in case of anomalies. It also illustrates how we can still be proactive in testing without burning weeks and
months. Learn how to choose from four different levels of testing and a new way of automation. CD-Screening
allows you to pick and choose from KPI-based test automation. The levels include simple anomaly alerting, access
path verification, clone pre-apply, and even workload capture/replay to easily discover different behavior resulting
from a new code level. By joining this presentation, you’ll learn how to align Continuous Delivery to your Continuous
Deployment.
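As a minimal sketch of the levels the abstract mentions, Db2 12 for z/OS exposes them through commands and a special register. The specific function-level names below are examples:

```sql
-- Display the current code, catalog, and function levels (Db2 command,
-- issued from the console or an equivalent interface):
--   -DISPLAY GROUP
-- Activate a new function level once the required code level is in place:
--   -ACTIVATE FUNCTION LEVEL (V12R1M501)
-- Fall back in case of anomalies by activating a lower (star) function level:
--   -ACTIVATE FUNCTION LEVEL (V12R1M500)
-- Applications opt in per package via the APPLCOMPAT bind option, or
-- dynamically per connection:
SET CURRENT APPLICATION COMPATIBILITY = 'V12R1M500';
```

Note that falling back restricts new-function use for applications, but it does not undo catalog changes already made by a higher catalog level.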
Roy Boxwell - SEGUS, Inc.
Db2 z/OS Lies, Damn Lies, and Statistics
“Lies, damned lies, and statistics” – attributed to Benjamin Disraeli, Prime Minister of the United Kingdom (1868, 1874-1880)
The above line may, or may not, have been spoken well over 100 years ago, but the need for statistics, and above
all accurate statistics, is more important than ever in the Db2 world of today. Bad statistics are the number one
cause of bad access paths! How could that happen? Usually bad timing is the number one reason, followed by
statistics that were updated manually and then forgotten about. Db2 delivers various features and enhancements
for both static and dynamic SQL, but unfortunately there is still no common way to stabilize both.
This presentation explains the Db2 Optimizer’s behavior and discusses the pros and cons of Plan Stability and
Stabilized Dynamic Queries. It additionally introduces your lifesaver for when a critical access path goes
ballistic: a simple and solid “RUNSTATS recovery”. The basis is a complete copy of all production-relevant
statistical data in the Db2 catalog. This copy should be taken regularly and the data saved away. The best way
to do this is to use a Generation Data Group (GDG), which automatically stores multiple “copies” of the data.
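A minimal sketch of the copy step behind such a “RUNSTATS recovery”. The save table `SAVE.SYSTABLES_STATS` is a hypothetical name; the catalog table and statistics columns are the real ones the optimizer reads:

```sql
-- Keep a dated copy of the table-level statistics the optimizer uses,
-- so they can be put back if a later RUNSTATS produces bad access paths.
INSERT INTO SAVE.SYSTABLES_STATS
SELECT CURRENT TIMESTAMP,   -- when this snapshot was taken
       CREATOR, NAME,       -- which table the statistics belong to
       CARDF,               -- number of rows
       NPAGESF,             -- number of pages with rows
       PCTPAGES,            -- percentage of active pages
       STATSTIME            -- when RUNSTATS last updated these values
  FROM SYSIBM.SYSTABLES
 WHERE TYPE = 'T';
```

To “recover”, the saved values would be written back to the catalog with SQL UPDATEs (which requires the appropriate authority), after which the affected packages can be rebound. In practice the same snapshot would also cover the index and column statistics tables, such as SYSIBM.SYSINDEXES and SYSIBM.SYSCOLUMNS.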