
I'm loading data into my table through SQL*Loader. The data load completes successfully, but I'm getting a garbage (repetitive) value in a particular column for all rows.

After inserting: column TERM_AGREEMENT gets the value '806158336' for every record. My csv file contains at most 3-digit data for that column, but I'm forced to set my column definition to NUMBER(10).

   LOAD DATA
    infile '/ipoapplication/utl_file/LBR_HE_Mar16.csv'
    REPLACE
    INTO TABLE LOAN_BALANCE_MASTER_INT
    fields terminated by ',' optionally enclosed by '"'
    (
    ACCOUNT_NO,
    CUSTOMER_NAME,
    LIMIT,
    REGION,

    TERM_AGREEMENT INTEGER
    )

create table LOAN_BALANCE_MASTER_INT
(
  ACCOUNT_NO             NUMBER(30),
  CUSTOMER_NAME          VARCHAR2(70),
  LIMIT                  NUMBER(30),

  PRODUCT_DESC           VARCHAR2(30),
  SUBPRODUCT_CODE        NUMBER,

  ARREARS_INT            NUMBER(20,2),
  IRREGULARITY           NUMBER(20,2),
  PRINCIPLE_IRREGULARITY NUMBER(20,2),


  TERM_AGREEMENT         NUMBER(10)
)
  • Could 806158336 be the account number of the next line? Commented May 30, 2016 at 11:42
  • No, the account number is only 7 digits. And how is it giving me the same value for 3987 rows? I even ran the following command (AIX): cat LBR_HE_Mar16.csv | grep 806158336, and there were no records found. Commented May 31, 2016 at 3:27

2 Answers


INTEGER is a binary data type. If you're importing a csv file, the numbers are presumably stored as plain text, so you should use INTEGER EXTERNAL. The EXTERNAL clause specifies character data that represents a number.
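
As a sketch, here is how the control file from the question would look with that change (same abbreviated field list as in the question; only the TERM_AGREEMENT definition differs):

    LOAD DATA
    infile '/ipoapplication/utl_file/LBR_HE_Mar16.csv'
    REPLACE
    INTO TABLE LOAN_BALANCE_MASTER_INT
    fields terminated by ',' optionally enclosed by '"'
    (
    ACCOUNT_NO,
    CUSTOMER_NAME,
    LIMIT,
    REGION,

    TERM_AGREEMENT INTEGER EXTERNAL
    )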

Edit: The issue seems to be the termination character of the file. You should be able to solve this issue by editing the INFILE line this way:

INFILE '/ipoapplication/utl_file/LBR_HE_Mar16.csv' "STR X'5E204D'"

Where '5E204D' is the hexadecimal for '^ M'. To get the hexadecimal value you can use the following query:

SELECT utl_raw.cast_to_raw ('^ M') AS hexadecimal FROM dual;

Hope this helps.


10 Comments

I used INTEGER because without it I was getting the error ORA-01722: invalid number, even though in my csv file I changed the column's format to 'General'. Even with INTEGER EXTERNAL I'm getting the same error; records are getting rejected.
If you can provide an example line of your csv, it would help solve the issue.
File on server: 3041295,Mr. ASHVINBHAI PRAVINBHAI DESAI,600000,WEST,216,SURAT BRANCH,1,29-Jan-09,03-Jan-21,7201,7201-HOME EQUITY,101,HOME EQUITY FINANCE - RES,Fully Advanced,80 5816,19.25,600000,381412,381386,381386,0,0,0,0,-27,0,0,0,03-Feb-09,10239,02-Apr-16,03-Apr-16,58,85,0,0,0,ECS,PROCESSED,,3,29-Jan-09,0,0,0,,1171.26,120,N,1,AKOPD2862M,0^ M
I believe that '^M' at the end is the issue, introduced when the file is FTPed to the server. I read somewhere to use 'dos2unix' for it, but I'm getting a 'command not found' error. How come I'm getting those characters at the end when I put the file on the server?
Now I'm getting this: SQL*Loader-510: Physical record in data file (/ipoapplication/utl_file/HEr16.csv) is longer than the maximum (1048576). SQL*Loader-2026: the load was aborted because SQL*Loader cannot continue.

I actually solved this issue on my own. Firstly, thanks to @Gary_W and @Alessandro for their inputs. Really appreciate your help, guys; I learned some new things in the process. Here's the new fragment which worked, and I got the correct data for the last column:

 LOAD DATA
    infile '/ipoapplication/utl_file/LBR_HE_Mar16.csv'
    REPLACE
    INTO TABLE LOAN_BALANCE_MASTER_INT
    fields terminated by ',' optionally enclosed by '"'
    (
    ACCOUNT_NO,
    CUSTOMER_NAME,
    LIMIT,
    REGION,

    TERM_AGREEMENT INTEGER Terminated by Whitespace
    )
  • 'Terminated by whitespace' - I went through some SQL*Loader threads and used 'terminated by whitespace' on the last column of the ctl file. It worked; this time I didn't even have to use 'INTEGER' or 'EXTERNAL' or EXPRESSION '..' for conversion. Just one thing: can you guys now let me know what could possibly have been creating the issue? What was there in my csv file in that column, and how did adding this solve it? Thanks.
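
A plausible explanation, reading the sample line posted in the comments above (which ends in '0^ M'): the '^ M' is most likely a carriage return, i.e. the file arrived in DOS/Windows format and every line carries a trailing CR. With fields terminated by ',', that CR stays attached to the last field, so TERM_AGREEMENT never receives a clean number: declared as a binary INTEGER, the raw bytes are interpreted directly as a binary integer (hence the same garbage value on every row), and declared as plain character data they raise ORA-01722. Declaring the last field TERMINATED BY WHITESPACE makes SQL*Loader end the field at the first whitespace character, which covers the carriage return, so only the digits are loaded. A minimal annotated sketch of just the last field, under that assumption:

    -- assumed DOS-format line ending:            ...,AKOPD2862M,0<CR>
    -- TERM_AGREEMENT INTEGER                     -- raw bytes read as a binary integer -> repeated garbage value
    -- TERM_AGREEMENT                             -- character data '0<CR>' -> ORA-01722: invalid number
    TERM_AGREEMENT TERMINATED BY WHITESPACE       -- field ends at the CR, so only the digits are loaded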

