SQL Error 802 Type 6 - Numeric Handling for Joins?

    I hit my head against this all day yesterday, but haven't found a clear explanation of this error or a way around it.

    I'm trying to run an SQL query over a physical file that is externally described by DDS.
    The field that is failing actually represents a date and is supposed to hold CYYMMDD values, but the DDS defines it as NUMERIC(7,0). I say "supposed to" because when I look through the file myself (using DBU), I can see that some accounts, particularly early ones, have incomplete or completely missing values. I don't know whether that is what's causing the problem.

    This initial query works cleanly when run from STRSQL or through SQuirrel SQL Client (v3.3.0):
    However, this only gives me part of the data set I need. The subsequent query is what fails:
    SELECT AMKEY as "Account", AMFDT as "Frst Ord Date", AMLDT as "Last Ord Date", 
        OMORD# as "OrdNumOpen", OMFRST as "First", OMPRDT as "Purch Date" 
    WHERE AMCUS = 'C' AND AMFDT > 2110000 and OMFRST = 'Y'
    Error when run from SQuirrel SQL Client v3.3.0 (syntax altered accordingly)
    Error: [SQL0802] Data conversion or data mapping error. 
    Cause . . . . . :   Error type 6 has occurred. Error types and their meanings are: 
                            ... 6 -- Numeric data that is not valid. ...
    Recovery  . . . :   The error was caused by data that was not valid or that was too large.  
       Look at the previously listed messages in the job log (DSPJOBLOG command) or press F10 
       (Display messages in job log) on this display to determine what row and columns were 
       involved in the error.  Correct the data and then try the request again.
    SQLState:  22023
    ErrorCode: -802
    Error when run in STRSQL
    Selection error involving AMFDT.
    The job log continues:
    CPD4019: Select or omit error on field AMFDT member FILE1.
    Message . . . . :   Select or omit error on field AMFDT member FILE1.      
    Cause . . . . . :   A select or omit error occurred in record 0, record format FILE1RA, 
                           member number 1 of file FILE1 in library LIB, because of condition 6
                           of the following conditions:                                    
                            ...   6 - A data mapping error occurred on the field before the 
                                       select or omit operation was attempted.  ...
    I've been running Google searches, but I'm just not clear on what is causing this error. All I know is that it's related to the AMFDT field.
    Can SQL simply not handle a numeric field that is completely blank? Or is something more subtle going on here? (Or something blatantly obvious that I'm apparently oblivious to?)
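    For background on why a "blank" numeric field trips error type 6: a DDS NUMERIC(7,0) field is stored as zoned decimal, one EBCDIC byte per digit, and a record whose field was never set can hold blanks (X'40') instead of digit bytes (X'F0'–X'F9'). The sketch below is a simplified Python model of that validation (hypothetical bytes, not the actual file contents; real Db2 for i checking is more involved):

```python
# Simplified model of zoned-decimal (EBCDIC) validation.
# Each byte: high nibble = zone, low nibble = digit. In this model, non-sign
# bytes must carry zone X'F'; the last byte's zone holds the sign
# (F or C positive, D negative). A blank (X'40') has zone 4, so it is
# not valid numeric data -- roughly what "error type 6" reports.

def decode_zoned(raw: bytes) -> int:
    """Decode a zoned-decimal field, raising on invalid data."""
    if not raw:
        raise ValueError("empty field")
    digits = []
    for i, b in enumerate(raw):
        zone, digit = b >> 4, b & 0x0F
        is_sign_byte = (i == len(raw) - 1)
        valid_zone = zone in (0xF, 0xC, 0xD) if is_sign_byte else zone == 0xF
        if not valid_zone or digit > 9:
            raise ValueError(f"numeric data that is not valid at byte {i}: {b:#04x}")
        digits.append(digit)
    value = int("".join(map(str, digits)))
    sign = -1 if (raw[-1] >> 4) == 0xD else 1
    return sign * value

# A seven-digit value like those in AMFDT: digits 2,1,1,0,0,0,0
print(decode_zoned(bytes.fromhex("F2F1F1F0F0F0F0")))  # 2110000

# A field left blank in an old record: seven EBCDIC blanks X'40'
try:
    decode_zoned(bytes.fromhex("40404040404040"))
except ValueError as e:
    print("mapping error:", e)
```

    The point of the sketch: the bad byte is detected while the field is being mapped, before any comparison against it can happen.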

    Absolutely any insight or guidance for additional investigation would be hugely appreciated!
    Thank you!

  • #2
    Simply put, this is saying that somewhere in your file you have a decimal data error in the specified field.

    That might be blanks, or other data that does not conform to zoned-decimal format.

    To see whether that is the case, you could try:

    SELECT DISTINCT HEX(CHAR(AMFDT)) FROM FILE1

    This will show you whether bad data is present, but it won't pinpoint the record doing the damage.
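    As a sketch of what that hex output distinguishes (made-up bytes, not the real file; the actual check runs in SQL): valid zoned digits dump as F0–F9 bytes, while a never-initialized field shows up as a run of 40s (EBCDIC blanks). To pinpoint rows, Db2 for i also offers the RRN(...) scalar function, which could be selected alongside the hex to get relative record numbers.

```python
# Sketch: what the hex of a zoned NUMERIC(7,0) field looks like for good
# and bad records (hypothetical bytes and record numbers).

def field_hex(raw: bytes) -> str:
    return raw.hex().upper()

def looks_valid_zoned(raw: bytes) -> bool:
    # Simplified: every byte must be X'F0'..X'F9'. (A real sign byte may
    # also carry zone C or D; this model ignores that.)
    return all(0xF0 <= b <= 0xF9 for b in raw)

records = {
    1: bytes.fromhex("F2F1F1F0F1F1F5"),   # a filled-in seven-digit value
    2: bytes.fromhex("40404040404040"),   # never initialized: all blanks
}

for rrn, raw in records.items():
    print(rrn, field_hex(raw), "OK" if looks_valid_zoned(raw) else "BAD")
```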



    • #3
      Thank you so much! I've just been stumped at this.

      I ran this query, and I'll admit I'm not entirely clear on exactly what I'm looking at in the hex, but paging through the results quickly, I am seeing entries of ++++++++++++++++

      So I'm going to guess that those are the records that could be triggering the SQL error. Is that accurate?

      Again, HUGE thanks!


      • #4
        Actually, I hadn't yet filtered out the types of accounts that wouldn't have this data filled in. If I do that, I lose the ++++ "entries".

        I'm guessing I'll have to save a filtered segment of the data off to a new table and try joining that to the original table I want. Maybe that will bypass this error.

        So, would the correct takeaway be that a field defined as numeric cannot be completely blank for SQL to handle it correctly? It must contain at least a 0?

        Thank you!


        • #5
          A field defined as numeric or decimal must actually be numeric or decimal!



          • #6
            And logically that makes perfect sense. I think this comes down to a collision between how the DDS was applied over legacy data and my own misunderstanding of how WHERE conditions are evaluated in a complex SQL query.
            • The field in question is only filled in for a certain type of record (one with another flag set).
            • The data is used by programs as a date, not a number, but redefining it as a date field and then updating it could conflict with existing programs unless it were converted back every time (CYYMMDD -> YYYY-MM-DD and back?).
              (Current handling is mainly through internal program definitions, so this is not an issue for the current system.)
            • The DDS defined the field in the way that made the most sense, but we never went back and applied a default value to fields that would otherwise be blank.
              (I'm not sure what additional implications that might have.)
            • I tried to combine two WHERE filters into a single statement, thinking that when the first condition failed, the engine would stop and skip those records.
              In reality, SQL may process all of the conditions before eliminating the records, so I needed to break the query into two steps: first filter out the records that would be bad, THEN join that result set to the second table as needed.
              Having done that, it seems to work quite well.
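            The order-of-evaluation point in that last bullet can be sketched outside SQL (hypothetical rows; field names borrowed from the thread): a combined filter causes the engine to map AMFDT for every row it examines, so one bad value kills the whole query, while selecting the right record type into an intermediate set first means the date comparison only ever sees valid values.

```python
# Sketch of the two-step workaround. AMFDT holds CYYMMDD as text here;
# blanks mean "never filled in". (Hypothetical data, not the real file.)

rows = [
    {"AMKEY": "A1", "AMCUS": "C", "AMFDT": "2110115"},
    {"AMKEY": "A2", "AMCUS": "P", "AMFDT": "       "},  # flag not set: no date
    {"AMKEY": "A3", "AMCUS": "C", "AMFDT": "2120301"},
]

def amfdt_value(row):
    # Stands in for the engine mapping the field: bad data raises,
    # no matter which predicate was written first in the WHERE clause
    # (SQL gives no short-circuit-order guarantee the way Python's
    # `and` does).
    return int(row["AMFDT"])

def one_step():
    # Combined WHERE: AMFDT > 2110000 AND AMCUS = 'C'.
    # The blank row still gets its AMFDT mapped, so this blows up.
    return [r for r in rows if amfdt_value(r) > 2110000 and r["AMCUS"] == "C"]

def two_step():
    # Step 1: keep only the record type guaranteed to have the date set.
    customers = [r for r in rows if r["AMCUS"] == "C"]
    # Step 2: the date comparison now only sees valid values.
    return [r for r in customers if amfdt_value(r) > 2110000]

try:
    one_step()
except ValueError:
    print("one-step query fails on the blank AMFDT")

print([r["AMKEY"] for r in two_step()])  # ['A1', 'A3']
```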

            Thank you very much for your help!