Oracle Data Pump Research (Part 1)

Posted by zhanglei_itput on 2010-06-30

Oracle Data Pump is a newer, faster and more flexible alternative to the "exp" and "imp" utilities used in previous Oracle versions.

1. Getting Started

       Before you can use Data Pump, you must create a directory object that the database can access. The directory object is only a pointer to a physical directory; creating it does not actually create the physical directory on the file system.

CONN sys/password@db10g AS SYSDBA

-- Unlock the demo account used throughout the examples.
ALTER USER scott IDENTIFIED BY tiger ACCOUNT UNLOCK;
GRANT CREATE ANY DIRECTORY TO scott;

-- The path must already exist on the database server's file system.
CREATE OR REPLACE DIRECTORY test_dir AS '/u01/app/oracle/oradata/';
GRANT READ, WRITE ON DIRECTORY test_dir TO scott;
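
To confirm that the directory object exists and points at the intended path, you can query the data dictionary (ALL_DIRECTORIES is a standard view):

SELECT directory_name, directory_path
FROM   all_directories
WHERE  directory_name = 'TEST_DIR';
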
2. Table Exports/Imports
expdp scott/tiger@db10g tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=expdpEMP_DEPT.log

impdp scott/tiger@db10g tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=impdpEMP_DEPT.log

The TABLE_EXISTS_ACTION=APPEND parameter allows data to be imported into existing tables.
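
For example, to append the exported rows to the existing EMP and DEPT tables, reusing the dump file from above (the other accepted values are SKIP, TRUNCATE and REPLACE):

impdp scott/tiger@db10g tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=impdpEMP_DEPT.log table_exists_action=APPEND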
 
3. Schema Exports/Imports
expdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log

impdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=impdpSCOTT.log
 
4. Database Exports/Imports
expdp system/password@db10g full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=expdpDB10G.log

impdp system/password@db10g full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=impdpDB10G.log
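
Note that a full export requires the EXP_FULL_DATABASE role (and a full import the IMP_FULL_DATABASE role), which is why these examples connect as SYSTEM rather than SCOTT. To let another user run full exports, the role can be granted directly:

GRANT EXP_FULL_DATABASE TO scott;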
 
5. Miscellaneous Information
  All Data Pump ".dmp" and ".log" files are created on the Oracle server, not on the client machine.
    1. Advanced Queuing
      All Data Pump actions are performed by multiple jobs (server processes, not DBMS_JOB jobs). These jobs are controlled by a master control process which uses Advanced Queuing. At runtime an advanced queue table, named after the job name, is created and used by the master control process. The table is dropped on completion of the Data Pump job. The job and the advanced queue can be named using the JOB_NAME parameter.
Cancelling the client process does not stop the associated Data Pump job. Issuing "ctrl+c" on the client during a job stops the client output and presents a command prompt; typing "status" at this prompt lets you monitor the current job.
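
A detached job can also be reattached from a new session using the ATTACH parameter. A minimal sketch, assuming the export was started with JOB_NAME=SCOTT_EXP (an illustrative name; the default is system-generated, e.g. SYS_EXPORT_SCHEMA_01):

expdp scott/tiger@db10g attach=SCOTT_EXP

Export> status
Export> continue_client

At the interactive prompt, "status" reports the state of the job and its workers, "continue_client" resumes logging output in the client, and "stop_job" stops the job so it can be restarted later.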
    2. PARALLEL parameter
        Data pump performance can be improved by using the PARALLEL parameter. This should be used in conjunction with the "%U" wildcard in the DUMPFILE parameter to allow multiple dumpfiles to be created or read.
The DBA_DATAPUMP_JOBS view can be used to monitor the current jobs.
expdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR parallel=4 dumpfile=SCOTT_%U.dmp logfile=expdpSCOTT.log
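
The corresponding import reads the multi-part dump file using the same "%U" wildcard, and the running job can be watched from another session via the view mentioned above:

impdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR parallel=4 dumpfile=SCOTT_%U.dmp logfile=impdpSCOTT.log

SELECT owner_name, job_name, operation, state
FROM   dba_datapump_jobs;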
 
    3. INCLUDE and EXCLUDE parameters 
      The INCLUDE and EXCLUDE parameters can be used to limit the export/import to specific objects. When INCLUDE is used, only the objects it specifies are included in the export; when EXCLUDE is used, all objects except those it specifies are included.
expdp scott/tiger@db10g schemas=SCOTT include=TABLE:"IN ('EMP', 'DEPT')" directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log

expdp scott/tiger@db10g schemas=SCOTT exclude=TABLE:"= 'BONUS'" directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log
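
Both parameters also accept an object type with no name clause at all; for example, to omit optimizer statistics from the schema export:

expdp scott/tiger@db10g schemas=SCOTT exclude=STATISTICS directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log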
 
To run a job like this without a parameter file, you need to escape the special characters in the INCLUDE clause. Incorrect escaping can result in errors such as: ksh: syntax error: '(' unexpected.
Command-line examples (on Windows, type the parameters on one single line):

Windows:
D:\> expdp system/manager DIRECTORY=my_dir DUMPFILE=exp_tab.dmp LOGFILE=exp_tab.log
SCHEMAS=scott INCLUDE=TABLE:\"IN ('EMP', 'DEPT')\"
Unix:
% expdp system/manager DIRECTORY=my_dir DUMPFILE=exp_tab.dmp LOGFILE=exp_tab.log \
SCHEMAS=scott INCLUDE=TABLE:\"IN \(\'EMP\', \'DEPT\'\)\"
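
A simpler alternative is to put the filter in a parameter file, where no shell escaping is needed at all. A sketch using a hypothetical file exp_tab.par with these contents:

DIRECTORY=my_dir
DUMPFILE=exp_tab.dmp
LOGFILE=exp_tab.log
SCHEMAS=scott
INCLUDE=TABLE:"IN ('EMP', 'DEPT')"

% expdp system/manager parfile=exp_tab.par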

 
