SQL*Loader User's Guide

Oracle's SQL*Loader loads external data into database tables. Its basic capabilities are:

1) load data from one or more data files, with several data types in the same load
2) load fixed-format, delimited (free-format), and variable-length data
3) load binary and packed-decimal data
4) load data into multiple tables in a single run
5) combine several physical records into one logical record
6) split a single record and load the pieces into one or more tables
7) generate unique key values during the load
8) load data from files on disk or on tape
9) report errors encountered during the load
10) automatically convert character (integer string) data in the file to packed decimal before loading it into a column.

1.2 Control file: the control file is a text file written in a language that SQL*Loader understands. It tells SQL*Loader where to find the data to be loaded and how to parse and interpret that data. The control file consists of three parts (a small combined sketch follows this list):

- global option lines, such as the number of records to skip;
- INFILE clauses specifying the input data;
- the description of the data fields.
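As a rough sketch only (the table, file names, and option values here are invented for illustration), the three parts map onto a control file like this:

OPTIONS (SKIP=1, ERRORS=50)        -- global options: skip a header line, allow up to 50 errors
LOAD DATA
INFILE 'emp.dat'                   -- INFILE clause: where the input data comes from
INTO TABLE emp                     -- data description: target table and its fields
FIELDS TERMINATED BY ','
(empno, ename, sal)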

1.3 Input files: apart from the control file, SQL*Loader's input is the data itself. SQL*Loader reads data from one or more files named in the control file. If the data is placed in the control file itself, write INFILE * in the control file. When the file contains fixed-length records, use INFILE 'file' "fix n", where n is the record length in bytes:

load data
infile 'example.dat' "fix 11"
into table example
fields terminated by ',' optionally enclosed by '"'
(col1 char(5),
 col2 char(7))
example.dat:
001, cd, 0002, fghi,
00003, lmn,
"PQRS"
0005, uvwx,

When the file contains variable-length records, use INFILE 'file' "var n", where n is the number of bytes in the length prefix of each record. For example:

load data
infile 'example.dat' "var 3"
into table example
fields terminated by ',' optionally enclosed by '"'
(col1 char(5),
 col2 char(7))
example.dat:
009hello, cd, 010world, im,
012my, name is,

1.4 Bad file: the bad file contains the records rejected by SQL*Loader, typically records that do not conform to the specified format or that violate database constraints.
The bad file name is given by the BADFILE parameter of the SQL*Loader command (or in the control file).

1.5 Log file and log information: when SQL*Loader starts executing, it automatically creates a log file. The log file contains a summary of the load, including any error messages.
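For instance, a typical command line (the user and file names are only placeholders) that names the bad and log files explicitly might look like this:

sqlldr scott/tiger control=emp.ctl data=emp.dat bad=emp.bad log=emp.log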

The syntax of the control file is as follows:

OPTIONS ([SKIP=integer] [LOAD=integer]
         [ERRORS=integer] [ROWS=integer]
         [BINDSIZE=integer] [SILENT={ALL | FEEDBACK | ERROR | DISCARD}])
LOAD [DATA]
[{INFILE | INDDN} {file | *}
    ["STREAM" | "RECORD" | "FIX length" [BLOCKSIZE size] | "VAR [length]"]
    [{BADFILE | BADDN} file]
    [{DISCARDFILE | DISCARDDN} file]
    [{DISCARDS | DISCARDMAX} integer]]
[{INFILE | INDDN} ...]
[APPEND | REPLACE | INSERT]
[RECLEN integer]
[{CONCATENATE integer |
  CONTINUEIF {[THIS | NEXT] (start[:end]) | LAST}
    operator {'string' | X'hex'}}]
INTO TABLE [user.]table
[APPEND | REPLACE | INSERT]
[WHEN condition [AND condition] ...]
[FIELDS [delimiter]]
(
  column {
    RECNUM | CONSTANT value |
    SEQUENCE ({integer | MAX | COUNT} [, increment]) |
    [POSITION ({start[:end] | *[+integer]})]
    datatype
    [TERMINATED [BY] {WHITESPACE | [X]'character'}]
    [[OPTIONALLY] ENCLOSED [BY] [X]'character']
    [NULLIF condition]
    [DEFAULTIF condition]
  }
  [, ...]
)
[INTO TABLE ...]
[BEGINDATA]

1) Specifying the data files to load (a short sketch follows this list):

1. INFILE and INDDN are synonyms; they name the data file to be loaded. * means the data follows in the control file itself, after BEGINDATA.
2. STREAM reads the data one byte at a time; a newline marks a new physical record (a logical record may consist of several physical records).
3. RECORD uses the file and record management services of the host operating system. Use this method when the data is in the control file.
4. FIX length reads records that are exactly length bytes long.
5. VAR reads records whose length is contained in the first bytes of each record (two bytes in this description); records may differ in length, up to a default maximum of 8 KB.
6. BADFILE and BADDN are synonyms; they name the file that receives the records Oracle could not load into the database.
7. DISCARDFILE and DISCARDDN are synonyms; they name the file that receives the records that did not pass the record selection (WHEN) criteria.
8. DISCARDS and DISCARDMAX are synonyms; the integer gives the maximum number of records that may be discarded.
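A hedged sketch showing these clauses together (the file and table names are invented):

LOAD DATA
INFILE 'example.dat'
BADFILE 'example.bad'          -- rejected records are written here
DISCARDFILE 'example.dsc'      -- records that fail the WHEN clause are written here
DISCARDMAX 100                 -- stop after 100 discarded records
APPEND
INTO TABLE example
WHEN (1:1) = 'D'
FIELDS TERMINATED BY ','
(col1, col2)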

2) Loading method:

1. APPEND adds rows to the table.
2. INSERT adds rows to an empty table (the load stops with an error if the table already contains records).
3. REPLACE empties the table first, then loads the data.
4. RECLEN is used in two cases: 1) when sqlldr cannot calculate the record length automatically, or 2) when the user wants complete records written to the bad file. In the latter case Oracle would normally write only the offending part of a bad record to the bad file; giving the full record length lets the entire record be written there.

3) Specifying how physical records are combined:

1. CONCATENATE lets the user give an integer specifying how many physical records are combined into one logical record.

4) Building logical records:

1. THIS checks the condition against the current record; if it is true, the next record is read and appended.
2. NEXT checks the condition against the next record; if it is true, the next record is appended to the current record.
3. start:end gives the columns of the string to examine in the THIS or NEXT record when deciding whether to continue, for example: CONTINUEIF NEXT (1:3) = 'WAG' or CONTINUEIF NEXT (1:3) = X'0d03'.

5) Specifying the table to be loaded:

1. INTO TABLE is followed by the table name.
2. WHEN is similar to the WHERE clause of a SELECT. It is used to select the records to load, for example: WHEN (3:5) = 'SSM' AND (22) = '*'.

6) Delimited and enclosed fields in the record:

1. FIELDS gives the field delimiters for the record. The FIELDS format is:

FIELDS [TERMINATED [BY] {WHITESPACE | [X]'character'}]
[[OPTIONALLY] ENCLOSED [BY] [X]'character']

TERMINATED: after one field has been read, the terminator marks where the next field begins.
WHITESPACE means any whitespace character: space, tab, newline, form feed, or carriage return. A specific terminator character is written in single quotes; a hexadecimal value can be given as, for example, X'1B'.
ENCLOSED: the data is read between two enclosing delimiters; this allows the data itself to contain the TERMINATED character. OPTIONALLY means the enclosing characters may be present or absent; OPTIONALLY also requires TERMINATED.
If both ENCLOSED and TERMINATED are given, the order in which they appear determines the order in which they are evaluated.

7) Defining the columns (a combined sketch follows this list):

column is the name of the table column. The value of the column can come from:
RECNUM: the logical record number; the first record is 1, the second is 2, and so on.
CONSTANT: the given constant value.
SEQUENCE: a sequence that can start from any number, with the format:
SEQUENCE ({integer | MAX | COUNT} [, increment])
POSITION: the column's position in the logical record, given as an absolute or relative value, with the format:
POSITION ({start[:end] | *[+integer]})
start is the starting position;
* means start immediately after the preceding field;
+ gives an offset from that position.
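A hedged sketch combining these column sources (the table, file, and column names are invented):

LOAD DATA
INFILE 'demo.dat'
INTO TABLE demo
(line_no  RECNUM,                  -- the logical record number
 src      CONSTANT 'FLATFILE',     -- the same constant value for every row
 id       SEQUENCE (MAX, 1),       -- continue from the current maximum value in the column
 code     POSITION (1:4)  CHAR,
 descr    POSITION (*:30) CHAR     -- starts right after the end of the previous field
)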

8) Defining the data types:
Fourteen data types can be defined:
CHAR
DATE
DECIMAL EXTERNAL
DECIMAL
DOUBLE
FLOAT
FLOAT EXTERNAL
GRAPHIC
GRAPHIC EXTERNAL
INTEGER
INTEGER EXTERNAL
SMALLINT
VARCHAR
VARGRAPHIC

1. Character data
CHAR [(length)] [delimiter]
The default length is 1.

2. Date data
DATE [(length)] ['date_format'] [delimiter]
The value is converted with the TO_DATE function using the given format.

3. Decimal in character format
DECIMAL EXTERNAL [(length)] [delimiter]
The number is in ordinary character form (not binary: one character per digit).

4. Packed-decimal data
DECIMAL (digits [, precision])

5. Double-precision binary floating point
DOUBLE

6. Single-precision binary floating point
FLOAT

7. Floating point in character format
FLOAT EXTERNAL [(length)] [delimiter]

8. Double-byte character string data
GRAPHIC [(length)]

9. Double-byte character string data in character format
GRAPHIC EXTERNAL [(length)]

10. Full-word binary integer
INTEGER

11. Integer in character format
INTEGER EXTERNAL

12. Half-word binary integer
SMALLINT

13. Variable-length character string
VARCHAR

14. Variable-length double-byte character string
VARGRAPHIC

A small combined example follows.
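As an illustration only (the column names, positions, and date mask are invented), a field list might combine several of these types like this:

(id       POSITION(1:6)   INTEGER EXTERNAL,
 hired    POSITION(7:17)  DATE "DD-MON-YYYY",
 salary   POSITION(18:25) DECIMAL EXTERNAL,
 name     POSITION(26:45) CHAR
)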

2.2 Writing the control file (CTL)

The control file describes:
1. the name of each data file;
2. the format of each data file;
3. the attributes of each field of each record in the data files;
4. the column attributes of the target ORACLE table;
5. the data definitions;
6. other options.

Requirements on the data files:

Data type designations:
CHAR               character
INTEGER EXTERNAL   integer
DECIMAL EXTERNAL   floating point

3.1 Contents of the data files

A data file can be an ordinary OS file, or the data can be placed directly in the control file. The data in the file can be:
1. binary or character format: the loader can read a binary file (treating it as characters);
2. fixed format: the data, data types, and data lengths of every record are fixed;
3. variable format: each record has at least one variable-length field; a record can be one continuous string. Field boundaries (such as name, age) are marked with ',' as the field separator, with optional quotes enclosing the data;
4. the loader can combine several consecutive fields or physical records into one logical record.

The log file produced by a run includes:
1. the run date and the software version number;
2. the names of all input and output files, the information shown on the command line, and supplementary messages;
3. a report for each table loaded: the table name, the load mode (initial load, append, or update), and the columns selected for loading;
4. a data error report: error codes and a report of discarded records;
5. a report for each load: rows loaded, rows skipped, rows rejected, and rows discarded;
6. a statistical summary: space used (bind array size and length); records read, records loaded, records skipped, records rejected, records discarded; and the run time.

============================================================

A short summary of sqlldr usage

sqlldr userid=lgone/tiger control=a.ctl

LOAD DATA
INFILE 't.dat'            -- the file to import
-- INFILE 'tt.date'       -- add more INFILE clauses to import several files
-- INFILE *               -- the data to import follows BEGINDATA at the end of the control file

INTO TABLE table_name     -- the table being loaded
BADFILE 'c:\bad.txt'      -- the location of the bad file

-- ************* How the rows are loaded into the table
APPEND                    -- the data is added after the table's existing rows
-- INSERT                 -- load only into an empty table; if the table already has data, sqlldr stops (the default)
-- REPLACE                -- all existing data in the table is deleted first
-- TRUNCATE               -- existing data is removed with a TRUNCATE statement before loading

-- ************* TERMINATED can be given here for the whole record, or per field in the field list
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
-- loading this data:        10,lg,"""LG""","lg,lg"
-- gives these table values: 10   lg   "LG"   lg,lg
-- TERMINATED BY X'09'       -- the terminator given in hexadecimal; '09' is the tab character
-- TERMINATED BY WHITESPACE  -- for data such as: 10 lg lg

-- ************* TRAILING NULLCOLS allows table columns with no corresponding value in the record to be loaded as NULL

-- ************* The table's field list
(
  col_1, col_2, col_filler FILLER    -- FILLER: the value of this column is not loaded
                                     -- e.g. for the data lg,lg,not only lg and lg are loaded
)
-- when FIELDS TERMINATED BY ',' is not declared globally:
-- (
--   col_1 [integer external] TERMINATED BY ',',
--   col_2 [date "dd-mon-yyyy"] TERMINATED BY ',',
--   col_3 [char] TERMINATED BY ',' OPTIONALLY ENCLOSED BY 'lg'
-- )
-- when FIELDS TERMINATED BY ',' is not declared, POSITION tells sqlldr where each field is:
-- (
--   col_1 position(1:2),
--   col_2 position(3:10),
--   col_3 position(*:16),           -- this field starts right after the end of the previous field
--   col_4 position(1:16),
--   col_5 position(3:10) char(8)    -- the field's type can also be specified
-- )

BEGINDATA                 -- used together with INFILE *; the data to import follows
10,SQL,what
20,lg,show

============================================================
Note: the data values after BEGINDATA must not be preceded by spaces.

1 ***** Ordinary load
LOAD DATA
INFILE *
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(DEPTNO,
 DNAME,
 LOC
)
BEGINDATA
10,Sales,"""USA"""
20,Accounting,"Virginia,USA"
30,Consulting,Virginia
40,Finance,Virginia
50,"Finance","",Virginia      -- the loc column will be empty
60,"Finance",,Virginia        -- the loc column will be empty

2 ***** FIELDS TERMINATED BY WHITESPACE and FIELDS TERMINATED BY X'09'
LOAD DATA
INFILE *
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY WHITESPACE
-- FIELDS TERMINATED BY X'09'
(DEPTNO,
 DNAME,
 LOC
)
BEGINDATA
10 Sales Virginia

3 ***** Loading only selected columns
LOAD DATA
INFILE *
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(DEPTNO,
 FILLER_1 FILLER,     -- the field "Something Not To Be Loaded" below will not be loaded
 DNAME,
 LOC
)
BEGINDATA
20,Something Not To Be Loaded,Accounting,"Virginia,USA"

4 ***** POSITION example
LOAD DATA
INFILE *
INTO TABLE DEPT
REPLACE
(DEPTNO      position(1:2),
 DNAME       position(*:16),    -- this field starts right after the end of the previous field
 LOC         position(*:29),
 ENTIRE_LINE position(1:29)
)
BEGINDATA
10Accounting Virginia,USA

5 ***** TRAILING NULLCOLS, functions, and date expressions
LOAD DATA
INFILE *
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY ','
TRAILING NULLCOLS      -- ENTIRE_LINE below has no directly corresponding value in the
                       -- BEGINDATA records (e.g. the first line 10,Sales,Virginia,1/5/2000),
                       -- so without TRAILING NULLCOLS the rows would be rejected
(DEPTNO,
 DNAME "upper(:dname)",            -- apply a function to the field
 LOC "upper(:loc)",
 LAST_UPDATED date 'dd/mm/yyyy',   -- one way of expressing the date format; 'DD-MON-YYYY' etc. also work
 ENTIRE_LINE ":deptno||:dname||:loc||:last_updated"
)
BEGINDATA
10,Sales,Virginia,1/5/2000
20,Accounting,Virginia,21/6/1999
30,Consulting,Virginia,5/1/2000
40,Finance,Virginia,15/3/2001

6 ***** Using a custom function      -- to handle dates that arrive in several formats
create or replace
function my_to_date (p_string in varchar2) return date
as
  type fmtArray is table of varchar2(25);

  l_fmts fmtArray := fmtArray('dd-mon-yyyy', 'dd-month-yyyy',
                              'dd/mm/yyyy',
                              'dd/mm/yyyy hh24:mi:ss');
  l_return date;
begin
  -- try each format in turn until one succeeds
  for i in 1 .. l_fmts.count
  loop
    begin
      l_return := to_date(p_string, l_fmts(i));
    exception
      when others then null;
    end;
    exit when l_return is not null;
  end loop;

  if (l_return is null)
  then
    -- fall back to interpreting the string as seconds since 01-01-1970 GMT
    l_return :=
      new_time(to_date('01011970', 'ddmmyyyy') + 1/24/60/60 *
               p_string, 'GMT', 'EST');
  end if;

  return l_return;
end;
/

LOAD DATA
INFILE *
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(DEPTNO,
 DNAME "upper(:dname)",
 LOC "upper(:loc)",
 LAST_UPDATED "my_to_date(:last_updated)"    -- use the custom function
)
BEGINDATA
10,Sales,Virginia,01-april-2001
20,Accounting,Virginia,13/04/2001
30,Consulting,Virginia,14/04/2001 12:02:02
40,Finance,Virginia,987268297
50,Finance,Virginia,02-apr-2001
60,Finance,Virginia,Not a date

7 ***** Merging multiple physical rows into one row
LOAD DATA
INFILE *
concatenate 3       -- the CONCATENATE keyword joins this many physical records into one row
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY ','
(DEPTNO,
 DNAME "upper(:dname)",
 LOC "upper(:loc)",
 LAST_UPDATED date 'dd/mm/yyyy'
)
BEGINDATA
10,Sales,            -- these three lines are treated as the single line 10,Sales,Virginia,1/5/2000
Virginia,
1/5/2000

-- This example could also use CONTINUEIF LAST = ',':
-- it tells sqlldr that whenever a line ends with a comma, the next line is appended to it.

LOAD DATA
INFILE *
continueif this (1:1) = '-'     -- when a line begins with the continuation character '-',
                                -- the next line is joined to it as one record
-- e.g.  -10,Sales,Virginia,
--       1/5/2000               become the single line 10,Sales,Virginia,1/5/2000
-- (1:1) means column 1 through column 1; CONTINUEIF NEXT or CONTINUEIF LAST is often more convenient
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY ','
(DEPTNO,
 DNAME "upper(:dname)",
 LOC "upper(:loc)",
 LAST_UPDATED date 'dd/mm/yyyy'
)
BEGINDATA
-10,Sales,Virginia,             -- loads as 10,Sales,Virginia,1/5/2000
1/5/2000
-40,                            -- loads as 40,Finance,Virginia,13/04/2001
Finance,Virginia,13/04/2001

8 ***** Loading the record number of each row

load data
infile *
into table t
replace
(seqno RECNUM,                -- load the record (line) number
 text position(1:1024)
)
BEGINDATA
fsdfasj          -- this row is loaded into table t with seqno automatically set to its record number
fasdjfasdfl      -- likewise for this row, and so on

9 ***** Loading data that contains newlines
Note: Unix and Windows use different line terminators (\n versus \r\n).
<1> Use a marker other than a real newline
LOAD DATA
INFILE *
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(DEPTNO,
 DNAME "upper(:dname)",
 LOC "upper(:loc)",
 LAST_UPDATED "my_to_date(:last_updated)",
 COMMENTS "replace(:comments, '\n', chr(10))"    -- use replace() to turn the marker into a real newline
)
BEGINDATA
10,Sales,Virginia,01-april-2001,This is the Sales\nOffice in Virginia
20,Accounting,Virginia,13/04/2001,This is the Accounting\nOffice in Virginia
30,Consulting,Virginia,14/04/2001 12:02:02,This is the Consulting\nOffice in Virginia
40,Finance,Virginia,987268297,This is the Finance\nOffice in Virginia

<2> Use the FIX attribute
LOAD DATA
INFILE demo17.dat "fix 101"
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(DEPTNO,
 DNAME "upper(:dname)",
 LOC "upper(:loc)",
 LAST_UPDATED "my_to_date(:last_updated)",
 COMMENTS
)
demo17.dat:
10,Sales,Virginia,01-april-2001,This is the Sales
Office in Virginia
20,Accounting,Virginia,13/04/2001,This is the Accounting
Office in Virginia
30,Consulting,Virginia,14/04/2001 12:02:02,This is the Consulting
Office in Virginia
40,Finance,Virginia,987268297,This is the Finance
Office in Virginia

-- This load stores the line break in the database column. The following variant does not,
-- but it requires a different data format:

LOAD DATA
INFILE demo18.dat "fix 101"
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(DEPTNO,
 DNAME "upper(:dname)",
 LOC "upper(:loc)",
 LAST_UPDATED "my_to_date(:last_updated)",
 COMMENTS
)
demo18.dat:
10,Sales,Virginia,01-april-2001,"This is the Sales
Office in Virginia"
20,Accounting,Virginia,13/04/2001,"This is the Accounting
Office in Virginia"
30,Consulting,Virginia,14/04/2001 12:02:02,"This is the Consulting
Office in Virginia"
40,Finance,Virginia,987268297,"This is the Finance
Office in Virginia"

<3> Use the VAR attribute
LOAD DATA
INFILE demo19.dat "var 3"
-- the first 3 bytes of each record give the record's length; the first record
-- starts with 071, so that record is 71 bytes long
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(DEPTNO,
 DNAME "upper(:dname)",
 LOC "upper(:loc)",
 LAST_UPDATED "my_to_date(:last_updated)",
 COMMENTS
)
demo19.dat:
07110, Sales, Virginia ,01-april-2001, This is the Sales
Office in Virginia
07820, Accounting, Virginia, 13/04/2001, This is the Accounting
Office in Virginia
08730, Consulting, Virginia, 14/04/2001 12:02:02, This is the Consulting
Office in Virginia
07140, Finance, Virginia, 987268297, This is the Finance
Office in Virginia

<4> Use the STR attribute
-- the most flexible option: you can define your own record terminator;
-- for a Windows carriage return + line feed it is chr(13)||chr(10)

-- In this example each record ends with | followed by \r\n:
select utl_raw.cast_to_raw('|' || chr(13) || chr(10)) from dual;
-- result: 7C0D0A

LOAD DATA
INFILE demo20.dat "str X'7C0D0A'"
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(DEPTNO,
 DNAME "upper(:dname)",
 LOC "upper(:loc)",
 LAST_UPDATED "my_to_date(:last_updated)",
 COMMENTS
)
demo20.dat:
10,Sales,Virginia,01-april-2001,This is the Sales
Office in Virginia|
20,Accounting,Virginia,13/04/2001,This is the Accounting
Office in Virginia|
30,Consulting,Virginia,14/04/2001 12:02:02,This is the Consulting
Office in Virginia|
40,Finance,Virginia,987268297,This is the Finance
Office in Virginia|

============================================================
Loading data with the NULLIF clause

10-jan-200002350Flipper seemed unusually hungry today.
10510-jan-200009945Spread over three meals.

id position(1:3) nullif id = blanks     -- the condition can be BLANKS or another expression
-- Below is another example; in the first data line the column N is loaded as NULL:
LOAD DATA
INFILE *
INTO TABLE T
REPLACE
(N position(1:2) integer external nullif n = '1',
 V position(3:8)
)
BEGINDATA
1 10
20lg
------------------------------------------------------------

If the log file's language or date format is not what you expect (or date fields fail to convert), you may need to set the NLS_LANG or NLS_DATE_FORMAT environment variables.
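For example, on a Unix shell (the territory, character set, and date mask below are purely illustrative):

export NLS_LANG=AMERICAN_AMERICA.UTF8
export NLS_DATE_FORMAT='DD-MON-YYYY'
sqlldr scott/tiger control=loader.ctl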

============================================================

Oracle SQL*Loader usage guide (reposted)

SQL*Loader is Oracle's tool for importing external data into the database, comparable to DB2's Load utility, but it offers more choices: it supports several load modes, selective loading, and multi-table loads.
How to use the SQL*Loader tool: we use Oracle's sqlldr utility to import the data. For example:
sqlldr scott/tiger control=loader.ctl
The control file (loader.ctl) loads an external, delimiter-separated data file. loader.ctl looks like this:
load data
infile 'c:\data\mydata.csv'
into table emp
fields terminated by "," optionally enclosed by '"'
(empno, empname, sal, deptno)

mydata.csv looks like this:
10001,"Scott Tiger",1000,40
10002,"Frank Naude",500,20

The following is a sample control file that specifies the records by position. "*" indicates that the data is in the control file itself, identified by the BEGINDATA keyword that follows.
load data
infile *
replace
into table departments
(dept     position(02:05) char(4),
 deptname position(08:27) char(20)
)
begindata
COSC COMPUTER SCIENCE
ENGL ENGLISH LITERATURE
MATH Mathematics
POLY POLITICAL SCIENCE

Unloading data
Oracle does not provide a tool to export data to a file. However, we can use SQL*Plus to select and format the data and spool the output to a file:
set echo off newpage 0 space 0 pagesize 0 feed off head off trimspool on
spool oradata.txt
select col1 || ',' || col2 || ',' || col3
  from tab1
 where col2 = 'XYZ';
spool off

Alternatively, you can use the UTL_FILE PL/SQL package:
rem Remember to set utl_file_dir = 'c:\oradata' in initSID.ora
declare
  fp utl_file.file_type;
begin
  fp := utl_file.fopen('c:\oradata', 'tab1.txt', 'w');
  utl_file.putf(fp, '%s, %s\n', 'TextField', 55);
  utl_file.fclose(fp);
end;
/

Of course, you can also use third-party tools such as SQLWays, Quest's TOAD, and so on.

Loading delimited (variable-length) data, for example:
LOAD DATA
INFILE *
INTO TABLE load_delimited_data
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(data1,
 data2
)
BEGINDATA
11111,AAAAAAAAAA
22222,"A,B,C,D,"

Here is an example of importing fixed-position (fixed-length) data:
LOAD DATA
INFILE *
INTO TABLE load_positional_data
(data1 POSITION(1:5),
 data2 POSITION(6:15)
)
BEGINDATA
11111AAAAAAAAAA
22222BBBBBBBBBB

Skipping rows of data:
The "SKIP n" keyword specifies how many rows to skip at the start of the import. For example:
LOAD DATA
INFILE *
INTO TABLE load_positional_data
SKIP 5
(data1 POSITION(1:5),
 data2 POSITION(6:15)
)
BEGINDATA
11111AAAAAAAAAA
22222BBBBBBBBBB

Modifying data as it is imported:
You can transform the data while it is being loaded into the database. Note that this is only available for conventional-path loads, not for direct-path loads:
LOAD DATA
INFILE *
INTO TABLE modified_data
(rec_no "my_db_sequence.nextval",
 region CONSTANT '31',
 time_loaded "to_char(SYSDATE, 'HH24:MI')",
 data1 POSITION(1:5)   ":data1/100",
 data2 POSITION(6:15)  "upper(:data2)",
 data3 POSITION(16:22) "to_date(:data3, 'YYMMDD')"
)
BEGINDATA
11111AAAAAAAAAA991201
22222BBBBBBBBBB990112

LOAD DATA
INFILE 'mail_orders.txt'
BADFILE 'bad_orders.txt'
APPEND
INTO TABLE mailing_list
FIELDS TERMINATED BY ","
(addr,
 city,
 state,
 zipcode,
 mailing_addr  "decode(:mailing_addr, null, :addr, :mailing_addr)",
 mailing_city  "decode(:mailing_city, null, :city, :mailing_city)",
 mailing_state
)

Importing data into multiple tables, for example:
LOAD DATA
INFILE *
REPLACE
INTO TABLE emp
WHEN empno != ' '
(empno  POSITION(1:4)   INTEGER EXTERNAL,
 ename  POSITION(6:15)  CHAR,
 deptno POSITION(17:18) CHAR,
 mgr    POSITION(20:23) INTEGER EXTERNAL
)
INTO TABLE proj
WHEN projno != ' '
(projno POSITION(25:27) INTEGER EXTERNAL,
 empno  POSITION(1:4)   INTEGER EXTERNAL
)

Importing selected records:
In the following example, (01) refers to the first character of the record and (30:37) to characters 30 through 37:
LOAD DATA
INFILE 'mydata.dat' BADFILE 'mydata.bad' DISCARDFILE 'mydata.dis'
APPEND
INTO TABLE my_selective_table
WHEN (01) <> 'H' and (01) <> 'T' and (30:37) = '19991217'
(
 region      CONSTANT '31',
 service_key POSITION(01:11) INTEGER EXTERNAL,
 call_b_no   POSITION(12:29) CHAR
)

Skipping some fields during the import:
You can skip over data with POSITION(x:y); from Oracle8i onwards this can also be done by declaring a FILLER field. A FILLER field is read from the data file but is not loaded:
LOAD DATA
TRUNCATE INTO TABLE T1
FIELDS TERMINATED BY ','
(field1,
 field2 FILLER,
 field3
)

Importing multiple physical rows as one record:
You can use one of the following two clauses to combine multiple rows of data into one record:

CONCATENATE: use when SQL*Loader should always combine the same number of physical records to form one logical record.

CONTINUEIF: use when a condition indicates that multiple records should be treated as one, e.g. a '#' character in column 1.

SQL*Loader commits:
Under normal circumstances a commit is issued after all the data in the data file has been loaded.
You can also specify the ROWS= parameter so that a commit is issued after every given number of records.
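For example, a command line of this shape (the user and file names are only placeholders) commits after every 1000 rows:

sqlldr scott/tiger control=loader.ctl rows=1000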

Improving SQL*Loader performance (a sketch follows this list):
1) A simple and easily overlooked point: make sure the table being loaded has no indexes and/or constraints (primary key). Otherwise, even with the ROWS= parameter, load performance drops noticeably.
2) You can add DIRECT=TRUE to improve load performance. Of course, in many cases this parameter cannot be used.
3) Specify the UNRECOVERABLE option to turn off database logging for the load. This option can only be used together with DIRECT.
4) You can run multiple load jobs at the same time.
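As a hedged illustration (the file, table, and user names are invented), a direct-path load with logging disabled might look like this:

-- control file: UNRECOVERABLE precedes LOAD DATA
UNRECOVERABLE
LOAD DATA
INFILE 'mydata.dat'
APPEND
INTO TABLE emp
FIELDS TERMINATED BY ','
(empno, ename, deptno)

-- command line: request the direct path
sqlldr scott/tiger control=loader.ctl direct=true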

The difference between conventional-path and direct-path loads:
A conventional load imports the data using INSERT statements. A direct-path load (DIRECT=TRUE) bypasses much of the database's SQL processing and writes the data directly into the database's data files.