Data Pump Export/Import Performance Tips


Oracle Data Pump offers many benefits and performance gains over the original Export/Import utilities. We can increase Data Pump export/import performance considerably by applying the following methods:

Export performance tips:

Use parallelism in Data Pump.
The PARALLEL parameter (a tuning parameter) lets a Data Pump job dynamically increase and decrease its resource consumption. As a rule of thumb, set the worker (PARALLEL) count to twice the number of CPUs (two workers per CPU), and supply exactly as many dump files as workers.

The degree of parallelism is directly proportional to memory consumption, CPU usage, and I/O bandwidth usage.

To maximize parallelism, use a substitution variable in your dump file names (for example, dump_file%U.dmp). Spreading the dump files across multiple disks or channels also helps increase I/O performance.
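As a quick sketch of the rule of thumb above, a starting PARALLEL degree can be derived from the CPU count on Linux (the variable names here are illustrative, not part of any Oracle tool):

```shell
# Rule of thumb from the text: PARALLEL = 2 x number of CPUs.
# Remember to supply as many dump files as workers via the %U substitution variable.
cpus=$(nproc)              # number of CPUs visible to the OS
parallel=$((2 * cpus))     # suggested degree of parallelism
echo "suggested setting: PARALLEL=${parallel} dumpfile=dump_file%U.dmp"
```

Treat the result as a starting point; the right degree also depends on available memory and I/O bandwidth, as noted above.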

Consider the following example, in which we export the full database using the PARALLEL parameter; this export creates four dump files in the dump directory.
[oracle@oracle ~]$ expdp system/manager directory=data_pump_bkup dumpfile=full_db_export%U.dmp logfile=Parallel_export.log full=y PARALLEL=4

Master table "SYSTEM"."SYS_EXPORT_FULL_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYSTEM.SYS_EXPORT_FULL_01 is:
/home/oracle/app/data_pump_bkup/full_db_export01.dmp
/home/oracle/app/data_pump_bkup/full_db_export02.dmp
/home/oracle/app/data_pump_bkup/full_db_export03.dmp
/home/oracle/app/data_pump_bkup/full_db_export04.dmp
Job "SYSTEM"."SYS_EXPORT_FULL_01" successfully completed at 11:18:21

Note:
This feature is limited to Oracle Enterprise Edition (11g and onward).

Use large undo and temporary tablespaces.
Create large undo and temporary tablespaces and assign them to the user who will perform the export or import.
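As a sketch of this step (the tablespace names, datafile paths, and sizes below are assumptions; adjust them for your environment):

```sql
-- Hypothetical names, paths, and sizes; adjust for your system.
CREATE UNDO TABLESPACE undotbs_big
  DATAFILE '/u01/oradata/ORCL/undotbs_big01.dbf' SIZE 8G AUTOEXTEND ON;

CREATE TEMPORARY TABLESPACE temp_big
  TEMPFILE '/u01/oradata/ORCL/temp_big01.dbf' SIZE 8G AUTOEXTEND ON;

-- Switch the instance to the new undo tablespace and
-- assign the temporary tablespace to the exporting/importing user.
ALTER SYSTEM SET undo_tablespace = 'UNDOTBS_BIG';
ALTER USER system TEMPORARY TABLESPACE temp_big;
```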

Use DBMS_STATS procedures.
The Export Data Pump utility relies on internal performance data and database statistics, so running the following procedures right before a Data Pump export can improve performance:

GATHER_SYSTEM_STATS Procedure
GATHER_FIXED_OBJECTS_STATS Procedure
GATHER_DICTIONARY_STATS Procedure

Execute the following DBMS_STATS procedures as the SYS user (as sysdba):

SQL> EXEC DBMS_STATS.GATHER_FIXED_OBJECTS_STATS (NULL);

PL/SQL procedure successfully completed.

SQL> EXEC DBMS_STATS.GATHER_DICTIONARY_STATS;

PL/SQL procedure successfully completed.

SQL> EXEC DBMS_STATS.GATHER_SYSTEM_STATS;

PL/SQL procedure successfully completed.

Import performance tips:

Increase the size of pga_aggregate_target.
Increasing PGA_AGGREGATE_TARGET helps Data Pump import performance.
Check the memory available on your system (subtracting the SGA size) before choosing a value for pga_aggregate_target.
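For example (the 4G value below is an assumption; derive yours from physical memory minus the SGA, as described above):

```sql
-- Check the current SGA and PGA settings first (SQL*Plus).
SHOW PARAMETER sga_target
SHOW PARAMETER pga_aggregate_target

-- Hypothetical value: physical memory minus the SGA, with headroom for the OS.
ALTER SYSTEM SET pga_aggregate_target = 4G SCOPE = BOTH;
```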

Use large undo and temporary tablespaces.
As with export, create large undo and temporary tablespaces and assign them to the user who will perform the import.

Exclude statistics.
Import performance also improves if we exclude statistics at import time, because Data Pump uses only a single thread to import statistics regardless of the parallelism setting.

Don't forget to run the DBMS_STATS procedures immediately after the import completes.
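A minimal sketch of this tip as an impdp parameter file (the directory, dump file, and log file names below are assumptions matching the earlier export example):

```
# import_nostats.par -- hypothetical parameter file
directory=data_pump_bkup
dumpfile=full_db_export%U.dmp
logfile=import_nostats.log
full=y
parallel=4
exclude=STATISTICS
```

Run it with impdp system/manager parfile=import_nostats.par, then gather statistics with DBMS_STATS as noted above.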

******************************************************************
There is one more important export parameter:

The CONSISTENT=Y parameter in exp.

It makes all of the queries run by exp execute "as of the same point in time", consistent with regard to each other.

Imagine you started an export of the EMP and DEPT tables at 9am.

EMP started exporting at 9am and DEPT at 9:15am.

Now, the EMP data would be "as of 9am", but the DEPT data would be as of 9:15am. What if
you import that data now -- will it work? Maybe, maybe not -- perhaps at 9:10am, someone
fired the last employee in department 50 and deleted department 50. Your EMP export
would have that employee in deptno 50, but your DEPT export would not have that deptno.
The import would fail.

If you use CONSISTENT=Y, Oracle exports all of the data "as of 9am", so deptno=50
will be included in both tables and the import will succeed.
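For example, the scenario above could be avoided like this (the connect string and file names are assumptions; note that CONSISTENT=Y belongs to the legacy exp utility, while Data Pump achieves the same effect with FLASHBACK_TIME or FLASHBACK_SCN):

```shell
# Legacy export: both tables exported as of the same point in time.
exp scott/tiger tables=EMP,DEPT consistent=y file=emp_dept.dmp

# Data Pump equivalent: a consistent export as of the current timestamp.
# (FLASHBACK_TIME=SYSTIMESTAMP is 12c+ shorthand; earlier releases need
#  flashback_time="TO_TIMESTAMP(...)".)
expdp scott/tiger tables=EMP,DEPT dumpfile=emp_dept%U.dmp flashback_time=systimestamp
```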

Thank you.
