***********
Overview:
***********

Some court documents online do not have headnote data in them, so a process had to be devised to insert the headnote segments into the online documents. Generally, the vendor sends headnote data embedded in a document. In addition to the headnote data, there may be docket info, date info, and/or cite info. This gives the process a means of matching the vendor docs with the online docs.

Once a match is found, a job has to be submitted to capture lock keys from the online docs. This is required because the databases are part of the database warehouse: the headnote docs are inserted online with a direct segment replacement update, and without the lock keys the update will fail.

Here is what the process does:

- Based on the data set name, either copy VISF data to the headnote tmp directory (for datasets containing visf data) OR convert michie printer CSV data to L-N visf if CSV is detected. Now a visf dataset containing docs with headnote data exists.
- Run the deser job to capture the online $00/docket/date/cite segments.
- Build a table consisting of the online docket/date/$00 segment info.
- Read the headnote visf file and capture each headnote document's docket & filed date info. Note: some docs do not have a filed date segment, so the Decided date is used instead.
- Scan the online table for a matching docket & date. If a match is found, assign the online document number to the headnote document. A total of four match attempts are made:
  1. docket + date, mm/dd/yyyy format.
  2. docket + date, mm/yyyy format (some New Hampshire docs are 1 day different...).
  3. docket only.
  4. remaining unmatched docs are rescanned for a match on the cite segment.

Unmatched docs are kept and a message is sent so that a data tech can attempt to match the docs manually. Dupes are also kept in a file and a message is sent so that a data tech can look and see which doc is the correct one.
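The four match passes above can be sketched roughly as follows. This is only an illustration: the real matching is done by hn_dkt_match.xom and cite_match.xom, and the table field layout used here (docnum|docket|filed date|cite) is an assumption, not the actual table format.

```shell
#!/bin/sh
# Illustrative four-pass match against a table of online doc info.
# Field layout docnum|docket|filed_date|cite is assumed for the sketch.
TABLE=online_table.txt

# Sample table for illustration only.
cat > "$TABLE" <<'EOF'
00057101|99-CV-123|07/15/1999|531 S.E.2d 22
00057102|98-CV-456|03/02/1998|530 S.E.2d 10
EOF

match_doc() {
    dkt=$1; dat=$2; cit=$3
    # Pass 1: docket + full filed date (mm/dd/yyyy).
    hit=$(awk -F'|' -v d="$dkt" -v t="$dat" '$2==d && $3==t {print $1}' "$TABLE")
    [ -n "$hit" ] && { echo "$hit"; return; }
    # Pass 2: docket + month/year only (some dates differ by a day).
    my=$(echo "$dat" | sed 's|^\(..\)/../\(....\)$|\1/\2|')
    hit=$(awk -F'|' -v d="$dkt" -v m="$my" \
        '$2==d { split($3, p, "/"); if (p[1] "/" p[3] == m) print $1 }' "$TABLE")
    [ -n "$hit" ] && { echo "$hit"; return; }
    # Pass 3: docket only.
    hit=$(awk -F'|' -v d="$dkt" '$2==d {print $1}' "$TABLE")
    [ -n "$hit" ] && { echo "$hit"; return; }
    # Pass 4: cite only (leftovers are rescanned on the cite segment).
    awk -F'|' -v c="$cit" '$4==c {print $1}' "$TABLE"
}
```

A docket-only pass (pass 3) can of course return more than one hit; in the real process those dupes are kept in a file for a data tech to resolve.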
More than one online doc may have the same docket/date or cite.

The document numbers of matching documents are placed into the jcl of a deser job that will now deser the matched docs and, in the process, capture lock keys. Note: the lock keys are essential because the update process is a Direct Segment Replacement (DSR). Since these databases are part of the database warehouse, the update will fail without these lock keys.

******************************
Headnote process - job stream
******************************

trigger_hn.sh
  - CSV_to_visf.sh - hn_CSV_visf.xom

start_hn.sh
  - run_trigger_jcl.sh - load.sh - eldm.trigger.jcl - I8EHN*1 (Add first opc appl.) - SI02ELDMHDN**1
  - run.sfg.jcl.sh - load.sh - sfg.key.jcl - I4E9999* (deser online visf) - mail.sh
  - load.sh
  - mail.sh

hn_table_bld.sh
  - clean_CR.xom (cleanup visf)
  - NC_hn_table_bld.xom (build table of online docs)

hn_match.sh
  - run_trigger_jcl.sh - load.sh - eldm.trigger.jcl - I8EHN*2 (add second opc appl.) - SI02ELDMHDN**2
  - load.sh - eldm.opstat.jcl - I8ELDMHN (opstat job - delete unneeded jobs)
  - run_lkey_jcl.sh - load.sh - sfg.key.jcl.1st + sfg.key.jcl.2nd - I4E9999* (deser - capture locks)
  - cite_match.xom (match based on cites only)
  - clean_CR.xom (clean up visf data)
  - hn_dkt_match.xom (match docs based on docket & date)
  - hn_format_docnum.xom (make sure docno file is formatted)

hn_insert_lkey.sh
  - mail.sh
  - run_pipeline_oe.sh - load.sh - I3P9999P (submit jcl for pipeline oe)
  - clean_CR.xom (clean up visf data)
  - insert_lkey.xom (insert lock keys into headnote data)

Other files:
  hn_function.h - omnimark functions used by various *.xom programs.
  run.jcl.sh - Allows the user to resubmit jcl manually.
  hnote.env - Headnote environment file. Sets up variables, paths, etc.
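As an illustration of how matched document numbers end up in the deser jcl, a docnum file could be folded into SFGCPP DOC= control cards as sketched below. The real formatting is done by hn_format_docnum.xom; the seven-numbers-per-card layout with comma continuation is inferred from the sample job output shown under Common Errors and may not be the exact rule.

```shell
#!/bin/sh
# Hedged sketch: turn a one-number-per-line docnum file into SFGCPP
# DOC= control cards.  Layout (7 numbers per card, continuation cards
# starting with a comma) is an inference from a sample job log.
build_doc_cards() {
    awk 'NR == 1      { printf "DOC=%s", $1; next }
         (NR - 1) % 7 { printf ",%s", $1; next }
                      { printf "\n,%s", $1 }
         END          { print "" }' "$1"
}
```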
*************
Notes:
*************

omnimark troubleshooting:
Most omnimark executables (*.xom) contain a variable 'test-switch'. Uncommenting the test-switch so that it is activated will send additional troubleshooting info to the ERROR file.

Accessing AUSM in TSO:
  To check update:  'd.a'  '=2'  'l 9999'
  To check history: 'd.a'  '=1'  'l 9999'  'h'

Accessing OPC in TSO:
  To check eldm process:          'A.O'  '=5.2'  application: '*ELDM*'
  To check eldm process in ERROR: 'A.O'  '=5.4'  application: '*ELDM*'

***************
Common Errors:
***************

ERROR messages - any omnimark program (*.xom):
'Error while trying to authorize the program (Code = 9602003)'.
root cause: the omnimark license daemon may be down, or all licenses are in use. This can occur from time to time; we have a limited number of licenses.
action: Wait a few minutes, then restart. If this happens repeatedly, there may be a problem with the license file (deleted, file system unmounted, etc.). Contact the help desk and have USA investigate.

***********************************************************************************************************************
Deser job I4E9999*
root cause: not enough space on device.
action: Wait an hour or so for dasd to clear & rerun.

14.49.13 JOB31816  VAMX9S3 VAM REDIRECTED TO STAGE1 STAGE1.PI00.LPA2B.VISF.D2758.B224909.SFG
14.49.15 JOB31816  VAM0039 NO VOLUME SELECTED FOR DSN=STAGE1.PI00.LPA2B.VISF.D2758.B224909.SFG
14.49.15 JOB31816  VAM0039A VOLUMES NOT SELECTED FOR ONE OR MORE OF THE FOLLOWING REASONS:
14.49.15 JOB31816  VAM0039M NOT ENOUGH SPACE FOR PRIMARY ALLOCATION
14.49.15 JOB31816  -I4E2758B SFGBLD SFGBLD FLUSH 0 .00 .00 .0 0 0 0 0 0

***********************************************************************************************************************
Deser job I4E9999*
root cause: current generation has not been promoted.
action: Wait until the current gen is promoted, then rerun.
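For the license error above, the "wait a few minutes then restart" action could be automated with a retry wrapper like the sketch below. This is illustrative only: the real scripts are restarted manually, and the ERROR file handling here is an assumption.

```shell
#!/bin/sh
# Hedged sketch: retry a command when the omnimark license error
# (Code = 9602003) appears in its ERROR output; give up after 3 tries.
run_with_retry() {
    tries=0
    while [ $tries -lt 3 ]; do
        "$@" 2> ERROR && return 0
        if grep -q '9602003' ERROR; then
            tries=$((tries + 1))
            echo "license busy, retry $tries after delay" >&2
            sleep "${RETRY_DELAY:-300}"   # wait a few minutes for a license
        else
            return 1                       # some other failure: do not retry
        fi
    done
    echo "license still unavailable; contact the help desk" >&2
    return 1
}
```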
step SFGPP rc= 2000
ISFAR: Unable to allocate dataset PL00.LPA4C.D002744.G1877.S000001
R15: 4 -- Error RC: 1708 -- Info Code: 0002

***********************************************************************************************************************
Update failure: i4c2744c rc=2721 - data warehouse problem
root cause: an invalid lock key was somehow captured by the program. Too many spaces, a space in the wrong place, or bad syntax in the $00: segment could cause this type of problem.
action: Check the other files of the job output and try to find the problem document. Note: check the doc after the doc listed in the output as well. Sometimes the program abends at the wrong time, and the last document number processed is given as the problem instead of the actual bad document.

UPII_CLPRMS +I+ UPICHK COMMAND PARM 1 = 2744
UPII_CLPRMS +I+ UPICHK COMMAND PARM 2 = 1915
UPII_CLPRMS +I+ UPICHK COMMAND PARM 3 = Y
UPII_CLPRMS +I+ UPICHK COMMAND PARM 4 = 3145728
UPII_CLPRMS +I+ UPICHK COMMAND PARM 5 = 100
UPII_CLPRMS +I+ UPICHK COMMAND PARM 6 = P
UPII_CLPRMS +I+ UPICHK COMMAND PARM 7 = P
UPII_HSHINF +I+ HASH TABLE LOADING COMPLETE; STATISTICS FOLLOW
  TOTAL HASH ENTRIES STORED: 52289
  COLLISIONS ON ORIG DOC NUMBER: 6532
  COLLISIONS ON LNI NUMBER: 22893
  LONGEST PATH TO FIND ORIG DOC NUMBER: 12
  LONGEST PATH TO FIND LNI NUMBER: 117
UPII_BCECHO +I+ 0113 DIRSEGI STAGE2T.PI02.LPA3P.VISF.D2744.B513531
UPII_BCECHO +I+ 0123 SFCNTL PL00.LPA4C.LSER.D2744.G1914V00 SEL NNN
UPII_BCECHO +I+ 0133 IIIFIP STAGE2T.PI02.LPA3P.VISF.D2744.B513531 REP
UPIE_BACCDN +E+ ACC TABLE CONTAINS DOC-NO = ZERO, POSITION = 0
  ORIGINATING ROUTINE: MakeAVPump

***********************************************************************************************************************
lock key deser problem: I4E9999*
Online document already locked: rc = 2000; rc = 5005.
action: determine which userid has locked the documents. This can be done by contacting the team leader (team 6).
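A rough pre-update check for the whitespace problems described above might look like the sketch below. The actual $00: segment syntax is not documented here, so the doubled-space heuristic is purely an assumption for illustration, not the real validation rule.

```shell
#!/bin/sh
# Hedged sketch: print line numbers of $00: segments that contain two or
# more consecutive spaces (an assumed symptom of a bad lock key).
check_00_segments() {
    grep -n '^\$00:' "$1" | grep '  '
}
```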
The user can be contacted and asked to finish their work so that the locks are removed. If the userid is the elsa userid, then the locks were probably left by a previous failed run; the team leader can remove these locks. No further processing is possible until the locks are removed.

16.31.08 JOB28504  IEC130I DWLOG DD STATEMENT MISSING
16.31.08 JOB28504  +SFG: Logging bypassed: DWLOG file could not be opened.
16.31.08 JOB28504  +SFG: No documents processed before DW0009 failure
16.31.08 JOB28504  +SFG: (E) Terminatable Status from Lock Request.
  DWMI_CALL_STATUS = 5005
  DWLK_LDH_STATUS = X' 00000004 '
  DWLK_DOC_INFO = ' 0D1C3TTK-4TM0-0039-453Y-00000-00 '
16.31.08 JOB28504  +SFG18 - DW0009 failure caused program abend.
16.31.09 JOB28504  ICH408I USER(UELSA13 ) GROUP(ELSAIDS ) NAME(MITCHELL, MARTHA )
  PI00.LPA2B.VISF.D2744.B452220.SFG CL(DATASET ) VOL(C29000)
  WARNING: INSUFFICIENT AUTHORITY - TEMPORARY ACCESS ALLOWED
  FROM PI00.L%%2B.* (G) ACCESS INTENT(ALTER ) ACCESS ALLOWED(READ )
16.31.09 JOB28504  -I4E2744D SFGBLD SFGBLD 2000 465 .00 .00 .0 980 0 0 0 0 0

file 8 - document already locked:
DWLAGK: *** AN ERROR HAS BEEN TRAPPED.
DWLAGK: *** USE RESULTS WITH CAUTION.
DWLAGK: *** CALL_STATUS = 5005
DW0009: *** AN ERROR HAS BEEN TRAPPED.
DW0009: *** USE RESULTS WITH CAUTION.
DW0009: *** CALL_STATUS = 5005
DW0009: ERROR_CODE = 5005
DW0009: ERROR_MESSAGE = RESULT_NONE_LOCKED
IBM910I 'ONCODE'=9050 AN ABEND HAS OCCURRED, USER CODE= 3000-00000000
  IN STATEMENT 15 AT OFFSET +0000A0 IN PROCEDURE WITH ENTRY SFGBLD

***********************************************************************************************************************
I4E9999* lock key deser job.
rc 2000 is a generic return code, so you need to go through the job output to find the message below.
root cause: 2 headnote docs with the same doc number: 00057099. The vendor sent 2 docs with identical docket/date or cite info. Both docs were assigned the same number, and this causes a problem with the deser job.
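To collect the locked document ids from failed job output (so the team leader knows which locks to clear), something like the sketch below could be used. The DWLK_DOC_INFO line format is taken from the sample output above and may vary in real job logs.

```shell
#!/bin/sh
# Hedged sketch: extract DWLK_DOC_INFO values (the locked doc ids)
# from a deser job output file.
locked_docs() {
    grep 'DWLK_DOC_INFO' "$1" | sed "s/.*DWLK_DOC_INFO *= *' *//; s/ *'.*//"
}
```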
This error should now be trapped in the match script, but it is documented here in case duplicate headnote docs do make it to this point.

*** WARNING *** INVALID PARAMETERS IGNORED: 1
VALID PARMS ARE: DDDD,GGGG WHERE D = DATABASE ID, AND G = GENERATION NUMBER.
SFGCPP CONTROL CARDS:
DOCTYPE=ORIG
DOC=00057101,00057316,00057089,00057088,00057093,00057098,00057096
,00057102,00057092,00057099,00057099,00057090,00057091,00058006
,00057097,00058005,00057317,00057318,00058004,00058003,00057095
SEG=20
NO ORIGINAL DOCUMENT FOUND: 00057099

***********************************************************************************************************************
Restarting:

start_hn.sh
  Run from the command line. This will start the process from the beginning. A visf file with the naming convention hn<court>00.visf, where <court> is a two-letter court id (ex. nc), needs to be placed in the headnote home directory - /elsa/eldm_hnotes.

trigger_hn.sh [1 or 2]
  Run from the command line. This will add opc application part 1 or part 2 for a given court, ex. trigger_hn.sh nc1. Rarely used.

Mainframe jobs
  Reset the job stream back to 'R' (ready status) in OPC 5.2 or 5.4, or use 'run.jcl.sh' to resubmit jobs. A backup copy of the edited jcl is kept in the headnote tmp directory /elsa/eldm_hnotes/tmp. The mainframe jobs were not put directly under OPC control because it is hoped to remove them at some point in the future.

scripts
  OPC is the place to restart the table build, match, and insert-lock-key scripts. Restarting scripts outside of OPC can cause an extra OPC application to be added. If the match step is rerun, check OPC first to see if part 2 of a given court already exists. If so, delete it - the match automatically adds part 2 each time it is rerun, so rekicking it off will add a new part 2 each time. Checking for and deleting part two will prevent OPC from getting out of sync.
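The duplicate-number trap mentioned above amounts to checking the docnum file for repeats before the numbers go into the SFG jcl. A minimal sketch, assuming one document number per line:

```shell
#!/bin/sh
# Hedged sketch: report any document number assigned to more than one
# headnote doc (the condition that broke the deser job above).
find_dupes() {
    sort "$1" | uniq -d
}
```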
OPC can be checked by using option 5.2.

***********************************************************************************************************************
Comprehensive listing of Headnote files:

CSV_to_visf.sh        Script that converts michie CSV data to visf.
Clean-Visf.h          Contains a function that cleans visf and does character substitution or deletion to get data ready to update.
NC_dkt_match.xom      Input is North Carolina visf from eldm. Data has visf tags, but no segments - so the match is done by pattern matching.
NC_hn_match.sh        Script that runs the North Carolina match process.
cite_match.xom        Program that attempts to match docs based on the cite.
clean_CR.xom          Removes new lines from data.
eldm.opstat.jcl       Generic jcl template that deletes unneeded jobs from an OPC application.
eldm.opstat.jcl.nc    North Carolina jcl template that deletes unneeded jobs from an OPC application.
eldm.opstat.jcl.nh    New Hampshire jcl template that deletes unneeded jobs from an OPC application.
eldm.opstat.jcl.vt    Vermont jcl template that deletes unneeded jobs from an OPC application.
eldm.trigger.jcl      Adds an OPC application to the current plan.
hn_CSV_visf.xom       Conversion program that converts michie CSV data to visf.
hn_dkt_match.xom      Match program that matches visf-tagged and segmented vendor headnote data with online docs.
hn_format_docnum.xom  Formats the docnum file to be sure it is ready to be inserted into the SFG key jcl.
hn_function.h         Various functions used by omnimark programs for this project.
hn_insert_lkey.sh     Inserts lock key information in the $00: segment of headnote docs so that they can be updated.
hn_match.sh           Script that runs the headnote match program.
hn_table_bld.sh       Script that builds the table of online document info.
hn_table_bld.xom      Conversion program that builds a table of online docs - based on doc number, filed date, & docket.
hn_vsf.sh             eldm function that builds eldm headnote docs for North Carolina.
hnote.env             Environment file that contains global variables used by almost all headnote shell scripts.
hnote.env.casenet     Test environment file set up for testing in casenet.
hnote.env.test        Test environment file set up for testing in the tdd.
insert_lkey.xom       Program that inserts lock key information into headnote docs.
load.sh               Program that submits jcl to the internal reader on the mainframe.
pipeline.jcl          Template for pipeline OE jcl.
run.jcl.sh            Script made to submit edited jcl manually.
run.sfg.jcl.sh        Runs the SFG (Serial File Get) program to capture designated segments from all online documents in a database.
run_lkey_jcl.sh       Runs the SFG program that captures docs + lock keys for specified documents only from a database.
run_pipeline_oe.sh    Runs pipeline OE, which will eventually submit the headnote batch for update into AUSM.
run_trigger_jcl.sh    Runs the OPC trigger job that will add an application to OPC for a court - either part 1 or part 2.
sfg.jcl               Template for SFG (serial file get) jcl.
sfg.jcl.tdd           Production and the TDD were out of sync, so certain overrides were put into the jcl to get it to run on the TDD.
sfg.key.jcl.1st       First half of the jcl built to do SFG on lock key docs.
sfg.key.jcl.2nd       Second half of the SFG jcl built to capture lock key docs.
sfg.key.jcl.2nd.tdd   Production and the TDD were out of sync, so certain overrides were put into the jcl to get it to run on the TDD.
start_hn.sh           The cron job that starts the headnote process; it starts when data is detected in the home directory.
trigger_hn.sh         Created for a data technician to start the headnote process after michie has cataloged mainframe data. Works for either visf or CSV data.