To work with Oracle Data Pump we first have to unlock the SCOTT user and create a directory object. The directory object is only a pointer to a physical directory; creating it does not actually create the physical directory on the file system.
First, start up the database and its associated services. Then, from the root prompt, do the following.
su - oracle
export ORACLE_SID=orcl
sqlplus / as sysdba
ALTER USER scott IDENTIFIED BY tiger ACCOUNT UNLOCK;
CREATE OR REPLACE DIRECTORY test_dir AS '/u01/app/oracle/oradata/orcl';
GRANT READ, WRITE ON DIRECTORY test_dir TO scott;
Note: The physical directory must exist on the file system before any Data Pump job uses it; CREATE DIRECTORY itself neither checks nor creates the path.
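Since the directory object is only a pointer, create the operating-system directory (if it does not already exist) and then confirm the object in the data dictionary. A minimal check, assuming the path used above:
$ mkdir -p /u01/app/oracle/oradata/orcl
SQL> select directory_name, directory_path from dba_directories where directory_name = 'TEST_DIR';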
TABLE-BASED EXPORT AND IMPORT
export ORACLE_SID=orcl
$ expdp scott/tiger tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=expEMP_DEPT.log
export ORACLE_SID=orcl
$ impdp scott/tiger tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=impEMP_DEPT.log
The CONTENT parameter can be added to any of the expdp/impdp commands in this section:
CONTENT={ALL | DATA_ONLY | METADATA_ONLY}
# ALL loads both the metadata and the row data from the source dump file.
# DATA_ONLY loads only row data into the tables; no database objects are created.
# METADATA_ONLY creates the database objects only; no row data is inserted.
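For example, to re-create just the EMP and DEPT table definitions from the dump file above without loading any rows, a METADATA_ONLY import would look like this (a sketch reusing the dump file from the table-based example; the log file name is arbitrary):
$ impdp scott/tiger tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp content=METADATA_ONLY logfile=impMetaOnly.log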
SCHEMA-BASED EXPORT AND IMPORT
export ORACLE_SID=orcl
$ expdp scott/tiger schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expSCOTT.log
export ORACLE_SID=orcl
$ impdp scott/tiger schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=impSCOTT.log
FULL DATABASE EXPORT AND IMPORT
For this to work you must log in as the SYSTEM user (or another user with the EXP_FULL_DATABASE and IMP_FULL_DATABASE roles), not as the SYS user; Oracle recommends against running Data Pump as SYSDBA.
export ORACLE_SID=orcl
$ expdp system/password full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=expDB10G.log
export ORACLE_SID=orcl
$ impdp system/password full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=impDB10G.log
In the above commands you can also add an additional parameter, JOB_NAME.
Example syntax:
$ expdp system/password full=y directory=TEST_DIR dumpfile=DB10G.dmp job_name=qq
You can view Data Pump jobs from the following view:
SQL> select * from dba_datapump_jobs;
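For a quicker overview you can select just the identifying columns (OWNER_NAME, JOB_NAME, OPERATION and STATE are standard columns of this view):
SQL> select owner_name, job_name, operation, state from dba_datapump_jobs;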
Even if you press Ctrl+C at the command prompt or exit from the client side, the Data Pump job will continue to run on the server.
To reattach to the running job, enter the following command.
$ expdp system/password attach=qq
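Once attached, the client leaves you at the interactive Export> prompt, where the STATUS command shows the current progress of the job:
Export> STATUS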
If an export or import job needs to be stopped temporarily, press Ctrl+C and issue STOP_JOB at the prompt that appears.
press CTRL+C
Export> STOP_JOB=IMMEDIATE
Are you sure you wish to stop this job ([y]/n): y
In order to resume the job, reattach to it and issue CONTINUE_CLIENT.
export ORACLE_SID=orcl
$ expdp system/password ATTACH=qq
Export> CONTINUE_CLIENT
Note: After reattaching to the job, a DBA who does not want to continue with the export can instead kill it by typing KILL_JOB.
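For example, assuming the job qq from above is still running (the confirmation prompt may vary slightly between versions):
$ expdp system/password attach=qq
Export> KILL_JOB
Are you sure you wish to stop this job ([y]/n): y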
IMPORT SCENARIOS:
REMAP_SCHEMA: loads all the objects from the source schema into the target schema.
$ expdp system/sys schemas=scott directory=test_dir dumpfile=scott.dmp
$ impdp system/sys remap_schema=scott:hr directory=test_dir dumpfile=scott.dmp
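To confirm the remap, you can list what was created under the target schema; a quick sanity check run as a privileged user:
SQL> select object_name, object_type from dba_objects where owner = 'HR';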
Datapump "NETWORK_LINK" feature With this feature data can be directly imported from another database without doing any exports and imports as it is done with the traditional approach. In this test scenario i will import scott users objects directly over the network from dell database to the target(dup) database Source dell Target dup Requirements for source 1) listener and database should be up. Requirements for Target 1) create services in tnsnames.ora. Service name created by me (abc) # tnsnames.ora Network Configuration File: /u01/app/oracle/product/ 11.2.0/db_1/network/admin/tnsnames.ora # Generated by Oracle configuration tools.http://docs.oracle.com/cd/E11882_01/server.112/e22490/toc.htm#BEGINABC = (DESCRIPTION = (ADDRESS_LIST = (ADDRESS = (PROTOCOL = TCP)(HOST = 192.168.1.39)(PORT = 1521)) ) (CONNECT_DATA = (SERVICE_NAME = dell) ) )
Check accessibility using tnsping before going further.
[oracle@newyork ~]$ tnsping abc
TNS Ping Utility for Linux: Version 11.2.0.1.0 - Production on 23-MAY-2013 22:29:54
Copyright (c) 1997, 2009, Oracle. All rights reserved.
Used parameter files:
Used TNSNAMES adapter to resolve the alias
Attempting to contact (DESCRIPTION = (ADDRESS_LIST = (ADDRESS = (PROTOCOL = TCP)(HOST = 192.168.1.39)(PORT = 1521))) (CONNECT_DATA = (SERVICE_NAME = dell)))
OK (100 msec)
2) Create a Data Pump directory object.
SQL> create or replace directory test_dir as '/u01';
Directory created.
3) Create a public database link.
SQL> create public database link abc using 'abc';
Database link created.
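Before running the import it is worth sanity-checking that the link actually reaches the source database:
SQL> select * from dual@abc;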
4) Now run the import from the target (dup).
[oracle@newyork ~]$ impdp system/sys directory=test_dir logfile=NetImpScott.log network_link=abc schemas=scott table_exists_action=replace
Import: Release 11.2.0.1.0 - Production on Thu May 23 22:39:41 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYSTEM"."SYS_IMPORT_SCHEMA_01": system/******** directory=test_dir logfile=NetImpScott.log network_link=abc schemas=scott table_exists_action=replace
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 88.18 MB
Processing object type SCHEMA_EXPORT/USER
ORA-31684: Object type USER:"SCOTT" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
. . imported "SCOTT"."TEST"     1835008 rows
. . imported "SCOTT"."DEPT"           4 rows
. . imported "SCOTT"."EMP"           14 rows
. . imported "SCOTT"."SALGRADE"       5 rows
. . imported "SCOTT"."BONUS"          0 rows
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Job "SYSTEM"."SYS_IMPORT_SCHEMA_01" completed with 1 error(s) at 22:41:56
The single error (ORA-31684) just reports that the SCOTT user already existed on the target; the tables themselves were replaced because of table_exists_action=replace.
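Once the import has completed, the database link can be dropped if it is no longer needed:
SQL> drop public database link abc;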
References:
http://docs.oracle.com/cd/E11882_01/server.112/e22490/toc.htm