Stumped by SQL

Collapse
X
 
  • Filter
  • Time
  • Show
Clear All
new posts

  • mwbrock@hotmail.com
    replied
    Thanks, but submitting as a new job isn't an option for this particular application. When the primary job has completed, the summary file needs to be ready for download by the ETL processes for the data warehouse.

    I have no doubt that submitting the final RUNSQLSTM as a new job would work just fine, but unfortunately that doesn't solve the actual problem. The job should not be hanging in TIMW to begin with...I guess I'll open up a database support call next week.



  • satid
    replied
    Since the problem occurred in the CL program you mentioned, and the last RUNSQLSTM worked fine when you submitted it individually, you could modify that CL program to use SBMJOB for the last RUNSQLSTM to see if the problem goes away. If so, then we know we should focus on that CL program to investigate why the last RUNSQLSTM hangs when not submitted.
    Last edited by satid; 07-02-2009, 08:42 PM.
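    A minimal sketch of that change, reusing the source member names from the original CL; the job name and job queue are placeholders:

    ```
    /* Submit the final RUNSQLSTM as its own batch job          */
    /* (JOB and JOBQ names here are illustrative)               */
    SBMJOB CMD(RUNSQLSTM SRCFILE(MBSRC) SRCMBR(CRTSUMTBL) +
                 COMMIT(*NONE) ERRLVL(20) ALWCPYDTA(*NO)) +
           JOB(CRTSUMTBL) JOBQ(QBATCH)
    ```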



  • mwbrock@hotmail.com
    replied
    Originally posted by don.johnson31
    You may have already tried it but job log is always a good place to look. I wonder if there is a clue there.

    There is not, unfortunately. The last entry in the job log is just the RUNSQLSTM command. No further information.



  • mwbrock@hotmail.com
    replied
    Thank you, satid, I do appreciate your willingness to help. I've been on the AS/400 (and all its other names) since 1989.

    Also, running V5R4, current on PTFs. :-)



  • satid
    replied
    Originally posted by mwbrock@hotmail.com
    ...The problem is that when the job reaches this final step, it hangs in a TIMW status, where it just sits and consumes CPU until I end it.

    If I submit the RUNSQLSTM as a separate job after creating and updating the input table, it runs just fine and produces the summary output as expected.
    When the job is in TIMW status and consuming CPU, run the command WRKACTJOB, put option 5 in front of that job, and press Enter. In the menu that appears, select option 10 (Display job log), press Enter, and then press F10. Browse the last few messages in the log to see if any look out of place and could provide a clue as to what is happening. Move the cursor to a message line and press F1 to see details of the message, which may also indicate how to recover from the issue.

    Please also make sure you specify the parameter NAMING(*SYS) for the RUNSQLSTM command.

    I also suspect you may have a low PTF level. What OS release are you using? Run the command DSPPTF and check the topmost entry under the "PTF ID" column to see whether it is too low.
    Last edited by satid; 07-02-2009, 01:50 AM.



  • don.johnson31
    replied
    You may have already tried it but job log is always a good place to look. I wonder if there is a clue there.



  • mwbrock@hotmail.com
    replied
    Thanks Don, I appreciate your response. I'm certain the query isn't choking the box...it easily handles much larger and more complex queries than this, across much larger tables as well. Also, this query runs just fine when submitted as a separate job after the input file is built (that isn't an option for this particular application, though; everything needs to run in a single job). The query also runs fine in interactive SQL.

    I've run it through Visual Explain and there is an index recommended. I hadn't planned on building it because this is a temp file that will go away after this query produces the summary file. However, I went ahead and built the index as an intermediate step, after creating the temp file and before producing the summary file. No luck. The job still hung in a TIMW status until I ended it.

    ...Michael
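    For reference, creating an index as an intermediate step would look something like this; the column list below is only illustrative, guessed from the grouping columns in the final query, and the actual advised columns may differ:

    ```sql
    -- Hypothetical index over the temp file's grouping columns
    -- (index name and columns are placeholders)
    create index mbddl/tempfile_ix
        on mbddl/tempfile (fld01, year, month);
    ```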



  • don.johnson31
    replied
    Indexes

    It sounds like the SQL query is choking your machine. Many times this is because the system is having to dynamically create indexes. You will first have to determine what indexes, if any, are needed.

    To do this you can use a nifty feature called SQL Performance Monitor. You can access it multiple ways, but I use iSeries Navigator, also known as Ops Navigator. In the connection to your machine, drill down into the Databases option and click on your database (often the machine name). You will then see SQL Performance Monitor. Right-click it and choose New --> SQL Performance Monitor. A wizard window will ask questions about the monitor you want to run; the options vary depending on your version of Client Access. Basically, give the monitor a name and select the schema (which means library). I recommend narrowing it down to your job so you don't get all kinds of junk in the monitor results. Once you have finished the wizard and clicked Finish, run your query. I recommend having everything ready beforehand so you don't have to do data setup after the monitor has started, limiting what you are monitoring to just the query. After the query has run, click the SQL Performance Monitor option in Ops Nav again and the monitor will appear in the right pane. Right-click it and click Analyze. It will take a minute, but a window will open showing a Summary. Click the "index creates advised" option in the list. You will have to play around with the results, but this should give you somewhere to start looking if the system is having to create indexes on the fly. I hope this helps.



  • mwbrock@hotmail.com
    started a topic Stumped by SQL

    Stumped by SQL

    If anyone could help with this, I would certainly appreciate it. I have a CL program that needs to copy data into a temporary table, run a couple of RPG programs to update the table, and then produce summary output using SQL (I am replacing an old RUNQRY with RUNSQLSTM). The problem is that when the job reaches this final step, it hangs in a TIMW status, where it just sits and consumes CPU until I end it.

    If I submit the RUNSQLSTM as a separate job after creating and updating the input table, it runs just fine and produces the summary output as expected. I had a similar problem on a CL I did last week which began to work correctly after I specified ALWCPYDTA(*NO) on the RUNSQLSTM. However, that doesn't work with this one for some reason.

    I have used just about every method one could imagine to create the temporary table, including DYNSQL (from the SEQUEL product, specifying SERVER(*LOCAL)), another RUNSQLSTM to drop, create, and insert into the table, and good old CPYF frmrcd(1). All of this works just as expected, but I get the same results from the final step of creating the summary each time.

    I have commented out the RPG applications that update the data to see if that was the issue...it isn't.

    The CL looks like this:



    RUNSQLSTM SRCFILE(mbsrc) SRCMBR(RESENGS) +
              COMMIT(*NONE) ERRLVL(20)

        Drop table mbddl/tempfile;
        Create table mbddl/tempfile like mbrock/prodfile;
        Insert into mbddl/tempfile
          select * from mbrock/prodfile;

    (have also used CPYF, etc instead of the above)

    ...rpg applications run here to update tempfile...

    RUNSQLSTM SRCFILE(mbsrc) SRCMBR(crtsumtbl) +
              COMMIT(*NONE) ERRLVL(20) ALWCPYDTA(*NO)

        drop table mbrock/sumtable;

        create table mbrock/sumtable as (
          select '1' as breaklvl, ' ' as overflow,
                 t.fld01, ((t.year*100)+t.month) as proddt,
                 sum(t.fld02) as fld0201, sum(t.fld03) as fld0301,
                 ...etc for about 40 more fields...
                 sum(t.fld43) as fld4301, sum(t.fld44) as fld4401
            from mbddl/tempfile t
           where t.fld01 = 100
             and t.fldxx = '2'
           group by t.fld01, ((t.year*100)+t.month)
           order by t.fld01, proddt )
          with data;


    The SQL is long and ugly because I am feeding a data warehouse and need to use the same field names that RUNQRY has been generating. Also, I needed to substitute the field and table names in this post for client discretion.

    HELP?!?!?