Export or Import - Taking a full backup using expdp while excluding unused big tables
~ Taking a full backup using expdp (RAC / Non-RAC) while excluding unused big tables
~ Applies to: Oracle Database 10.2.x and above, on any platform
About this document:
Sometimes, due to space constraints, a full database has to be moved to a pre-prod / UAT / DEV environment without its unused or archive tables, for example to test specific scenarios, train users, or rehearse a patch movement.
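As a first step, a simple dictionary query can help identify the biggest candidate tables to leave out; the 10 GB threshold below is only an example:

-- List the largest tables in the database (the 10 GB threshold is just an example)
select owner, segment_name, round(sum(bytes)/1024/1024/1024,2) size_gb
from dba_segments
where segment_type in ('TABLE','TABLE PARTITION')
group by owner, segment_name
having sum(bytes) > 10*1024*1024*1024
order by size_gb desc;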
Once the tables to exclude have been identified, you can follow the method below:
expdp system/<password> \
full=y \
directory=data_pump \
dumpfile=oraprod%U.dmp \
logfile=oraprod_exp.log \
parallel=4 \
filesize=20G \
exclude=statistics \
exclude=table:\"IN \(\'EMPLOYEE_BKP\',\'TEMPDATA\',\'TRANSACTION_ARC1\',\'TRANSACTION_ARC2\'\)\"
Note:
1) The 'parallel' parameter makes the export faster by running multiple Data Pump workers in parallel.
2) The 'filesize' parameter splits the dump into multiple pieces of the given size.
3) Use '%U' in the dumpfile name whenever the 'filesize' parameter is used, so that Data Pump can create multiple dump files.
4) The EXCLUDE=TABLE name clause is compared against table names only (stored in uppercase unless created with quoted lowercase names), not against owner.table_name, which is why the list above contains plain table names. On most shells the quotes and parentheses must be escaped on the command line; a parameter file, as shown below, avoids this.
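If the shell escaping of the EXCLUDE clause becomes awkward, the same parameters can be kept in a parameter file instead; the file name expdp_full_excl.par is only an example, and no shell escaping is needed inside the file.

Contents of expdp_full_excl.par (example file name):
full=y
directory=data_pump
dumpfile=oraprod%U.dmp
logfile=oraprod_exp.log
parallel=4
filesize=20G
exclude=statistics
exclude=table:"IN ('EMPLOYEE_BKP','TEMPDATA','TRANSACTION_ARC1','TRANSACTION_ARC2')"

Run the export with:
expdp system/<password> parfile=expdp_full_excl.par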
-- Find the Data Pump export jobs that are currently executing
SQL> select * from dba_datapump_jobs where state='EXECUTING';
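To check the progress of a running job, you can also attach to it from another session; the job name SYS_EXPORT_FULL_01 below is only an illustration, use the real job name returned by the query above:

-- attach to the running export job (job name taken from dba_datapump_jobs)
expdp system/<password> attach=SYS_EXPORT_FULL_01

-- useful interactive commands at the Export> prompt
Export> status            -- show current worker progress
Export> continue_client   -- return to logging mode
Export> stop_job          -- stop the job (it can be restarted later via attach)
Export> kill_job          -- abort the job and drop its master table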
Note: Use CLUSTER=N with expdp/impdp when exporting from or importing into a multi-node RAC database (the CLUSTER parameter is available from 11.2 onwards). The import can also run with 'parallel'; keep '%U' in the dumpfile name so that impdp can read all the dump pieces, as in the example below.
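A matching import on the target could look like the sketch below, assuming the same 'data_pump' directory object exists on the target database; the log file name oraprod_imp.log is just an example:

impdp system/<password> \
full=y \
directory=data_pump \
dumpfile=oraprod%U.dmp \
logfile=oraprod_imp.log \
parallel=4 \
cluster=n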
Hope this will help you.
Thanks