Oct 1, 2013

Export or Import - Taking consistent backup using expdp


~ Taking a consistent backup using expdp (RAC / Non-RAC)
~ Applies to: Oracle 10g/11g

About Document: Sometimes management asks for a pre-prod database environment to test scenarios, train users, or stage patches. But prod and UAT/pre-prod may not be on the same platform, so you cannot simply restore from an RMAN backup (the ENDIAN_FORMAT must be compatible; check using the v$transportable_platform view). In that case you can restore from an expdp (logical) backup instead, and for that you need a consistent backup. With expdp you can take a backup as of a snapshot time (SCN) by following the method below:

Note: You can also use Oracle GoldenGate to migrate a database across platforms. Other third-party tools can be used as well.

expdp \
full=y \
directory=DATA_PUMP \
dumpfile=oraprod_full.dmp \
logfile=oraprod_full_exp.log \
exclude=statistics \
FLASHBACK_SCN=25330457434 \
COMPRESSION=ALL

-- To find FLASHBACK_SCN

SQL> select name, current_scn from v$database;

Note: COMPRESSION=ALL produces a compressed dump file; it can be omitted if compression is not needed. This is an 11g new feature (it requires the Advanced Compression option).
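To avoid copying the SCN by hand, the two steps above can be scripted. A minimal sketch, assuming a local SYSDBA connection and a DATA_PUMP directory object (all names are illustrative):

```shell
#!/bin/sh
# Capture the current SCN, then run a full export consistent as of that SCN.
# The connection method and directory object name are assumptions.
SCN=$(sqlplus -s "/ as sysdba" <<'EOF'
set heading off feedback off pagesize 0
select current_scn from v$database;
EOF
)
SCN=$(echo "$SCN" | tr -d '[:space:]')

expdp \'/ as sysdba\' \
  full=y \
  directory=DATA_PUMP \
  dumpfile=oraprod_full.dmp \
  logfile=oraprod_full_exp.log \
  exclude=statistics \
  flashback_scn="$SCN"
```

This simply removes the manual copy/paste step; the export itself is identical to the command above.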

-- Find currently export executing jobs

SQL> select * from dba_datapump_jobs where state='EXECUTING';

-- Current Status of export /import
SQL> select sid,serial#,sofar,totalwork,start_time,
sysdate,time_remaining,message
from v$session_longops 
where opname like '%IMPORT%';
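The same view can also report the percentage complete. A variant of the query above (the '%PORT%' filter is an assumption that matches both EXPORT and IMPORT job names):

```sql
-- Percentage complete of running Data Pump jobs (export or import)
select sid, serial#, opname,
       round(sofar/totalwork*100, 2) as pct_done,
       time_remaining
  from v$session_longops
 where opname like '%PORT%'   -- matches both EXPORT and IMPORT
   and totalwork > 0
   and sofar <> totalwork;
```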

Note: Use CLUSTER=N in expdp/impdp when exporting from or importing into a RAC database. You can add the PARALLEL clause to make expdp/impdp faster.
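Putting those two notes together, a sketch of a parfile for a parallel export from one RAC node (directory and file names are illustrative):

```text
# myrac_backup.par -- parallel expdp from a single RAC instance
full=y
directory=data_pump
dumpfile=oraprod%U.dmp    # %U lets each parallel worker write its own file
logfile=oraprod_exp.log
cluster=N                 # keep all workers on the invoking instance
parallel=4                # match the dumpfile %U set to the parallel degree
exclude=statistics
```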

"flashback_time" can also be used for a consistent backup with "expdp":

Flashback_time at the current time:

The syntax flashback_time=systimestamp makes the export consistent as of the current time.

$ cat mybackup1.par

full=y
directory=data_pump
dumpfile=oraprod%U.dmp
logfile=oraprod_exp.log
parallel=4
filesize=20G
exclude=statistics
flashback_time=systimestamp


$ expdp parfile=mybackup1.par

Flashback_time at a specific time:

Instead of the current time, you can specify any time that falls within the database's flashback capability (limited by undo retention and undo tablespace size), using the to_timestamp function.

Oracle notes that the system change number (SCN) that most closely matches the specified time is found, and this SCN is used to enable the Flashback utility. The export operation is performed with data that is consistent up to this SCN.
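You can preview which SCN a given flashback_time will resolve to by using the TIMESTAMP_TO_SCN function (this works only while the time-to-SCN mapping is still retained by the database):

```sql
-- Preview the SCN that a given flashback_time would resolve to
select timestamp_to_scn(
         to_timestamp('02-08-2014 10:10:00', 'DD-MM-YYYY HH24:MI:SS')
       ) as scn
  from dual;
```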

$cat mybackup2.par

full=y
directory=data_pump
dumpfile=oraprod%U.dmp
logfile=oraprod_exp.log
parallel=4
filesize=20G
exclude=statistics
flashback_time="to_timestamp('02-08-2014 10:10:00', 'DD-MM-YYYY HH24:MI:SS')"

$ expdp parfile=mybackup2.par
OR
$ expdp \'/ as sysdba\' parfile=mybackup2.par   (connects as SYSDBA without prompting for credentials)


Import a single table with remap:

See the sample example:

impdp
directory=data_pump
dumpfile=PROD_full_%U.dmp
logfile=PROD_EMP_imp.log
TABLE_EXISTS_ACTION=TRUNCATE
TABLES=HR.EMP
REMAP_TABLE=HR.EMP:EMP_NEW

Note: FULL, SCHEMAS and TABLES are mutually exclusive Data Pump modes, so only TABLES is specified here.
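To land the table in a different schema as well, REMAP_SCHEMA can be combined with REMAP_TABLE. A sketch, assuming a target schema HR_TEST (schema and file names are illustrative):

```text
# emp_remap.par -- import HR.EMP into HR_TEST.EMP_NEW
directory=data_pump
dumpfile=PROD_full_%U.dmp
logfile=PROD_EMP_imp.log
tables=HR.EMP
remap_schema=HR:HR_TEST
remap_table=HR.EMP:EMP_NEW
table_exists_action=TRUNCATE
```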

Common Errors and troubleshoot during export / import:

Error:(1)

Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
ORA-31693: Table data object "HR"."EMPLOYEE" failed to load/unload and is being skipped due to error:
ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-14400: inserted partition key does not map to any partition

Issue Description:
1) These errors appear when the backup was taken without disabling foreign key constraints and is imported into a table, or when the target table does not have the required partition.

Solution:

1) Add the following parameter to the import script, then recreate the foreign keys:

DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS

2) Create the required partition, or use interval partitioning if a valid range partition scheme is defined.

Error:(2)

ORA-31693: Table data object "DEPT"."DEPTMASTER" failed to load/unload and is being skipped due to error:
ORA-31617: unable to open dump file "/oraclebkp/exp/test_full_14may14.dmp" for write
ORA-19505: failed to identify file "/oraclebkp/exp/test_full_14may14.dmp"
ORA-27037: unable to obtain file status
IBM AIX RISC System/6000 Error: 2: No such file or directory
Additional information: 3

Issue Description:

This issue usually appears when taking a logical backup using 'expdp' in a RAC environment: with the default CLUSTER=Y, parallel worker processes may run on other instances that cannot access the dump file directory.

Solution:

Add below property in your main export script while exporting using 'expdp'.

CLUSTER=N

Hope this document helps. Don't forget to post a comment.
Thanks

