
Saturday, July 11, 2015

Datapump Export using Datapump API

The Data Pump API, DBMS_DATAPUMP, provides a high-speed mechanism to move all or part of the data and metadata for a site from one database to another. The Data Pump Export and Data Pump Import utilities are based on the Data Pump API.

In this post, let's see how to perform a schema-level export using the Data Pump API.
There are a few reasons (that I can think of right now) to perform an export/import through the API rather than the command-line interface:
1. When you don't have access to the database server to perform an export, but you do have access to the database and know a pre-created directory object.
2. When you perform a remote operation on a database from another server (say, an application server) and would like to perform the export from the same connection.
3. When you would like to schedule a database job from within the database itself, without the need for OS scripts to invoke the export.

As a prerequisite, to perform an export using the Data Pump API, the EXP_FULL_DATABASE role has to be granted to the user performing the export directly, rather than through another role. (I have no documentation reference for this, but in my tests the export failed when the role was granted through another role.)
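For example, from a DBA account (the user name SCOTT and the directory path below are placeholders; adjust them to your environment):

-- grant the export role directly to the exporting user, not via another role
GRANT EXP_FULL_DATABASE TO scott;

-- pre-create the directory object that the procedure below refers to
CREATE OR REPLACE DIRECTORY datapump AS '/u01/app/oracle/dpdump';
GRANT READ, WRITE ON DIRECTORY datapump TO scott;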
I've also modified the API code to accept the schema name to be exported as an input, so that we can store it as a procedure and invoke it as and when required. The code is as below.
CREATE OR REPLACE PROCEDURE Proc_bkp_schema 
-- orabliss.blogspot.com
(v_schema IN VARCHAR2)
IS
   dp_handle    NUMBER;
   job_status   VARCHAR2 (30);
   v_dt         NUMBER;
   v_sch_name   VARCHAR2(100); -- widened from 30 so long schema names fit in the IN (...) expression
   v_filename   VARCHAR2(100);
   v_logname    VARCHAR2(100);
BEGIN
   v_dt := TO_NUMBER (TO_CHAR (SYSDATE, 'yyyymmdd')); -- date stamp for file names
   v_sch_name := 'IN ('''||v_schema||''')'; -- schema filter expression
   v_filename := v_schema||'_'||v_dt||'.dmp'; -- dumpfile name
   v_logname  := v_schema||'_'||v_dt||'.log'; -- logfile name

   -- schema export mode
   dp_handle := DBMS_DATAPUMP.open (operation => 'EXPORT', job_mode => 'SCHEMA');

   -- dump file
   DBMS_DATAPUMP.add_file (
      handle      => dp_handle,
      filename    => v_filename,
      directory   => 'DATAPUMP',
      filetype    => SYS.DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);

   -- log file
   DBMS_DATAPUMP.add_file (
      handle      => dp_handle,
      filename    => v_logname,
      directory   => 'DATAPUMP',
      filetype    => SYS.DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

   -- specify schema name
   DBMS_DATAPUMP.metadata_filter (handle   => dp_handle,
                                  name     => 'SCHEMA_EXPR',
                                  value    => v_sch_name);

   DBMS_DATAPUMP.start_job (dp_handle);

   DBMS_DATAPUMP.wait_for_job (handle => dp_handle, job_state => job_status);

   DBMS_OUTPUT.put_line (
         'DataPump Export - '
      || TO_CHAR (SYSDATE, 'DD/MM/YYYY HH24:MI:SS')
      || ' Status '
      || job_status);

   DBMS_DATAPUMP.detach (handle => dp_handle);
EXCEPTION
   WHEN OTHERS THEN
      -- detach from the job on failure so the handle is not left open, then re-raise
      DBMS_DATAPUMP.detach (handle => dp_handle);
      RAISE;
END;
/

The procedure can be called as below (with SERVEROUTPUT enabled in SQL*Plus so the status message is visible).
set serveroutput on
begin PROC_BKP_SCHEMA ('ORACLE'); end;
/
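While the job is running, you can also monitor it from another session; a simple check, assuming you have access to the DBA views:

SELECT owner_name, job_name, state
  FROM dba_datapump_jobs
 WHERE owner_name = 'ORACLE';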

The procedure above accepts a single schema as input. The same procedure can be called for several users through a FOR loop, as sketched below, to export several schemas.
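A minimal sketch of such a loop (the schema names HR and SCOTT are placeholders); each call creates its own Data Pump job with its own dump file:

DECLARE
   TYPE t_schemas IS TABLE OF VARCHAR2 (30);
   v_schemas   t_schemas := t_schemas ('HR', 'SCOTT'); -- placeholder names
BEGIN
   FOR i IN 1 .. v_schemas.COUNT
   LOOP
      PROC_BKP_SCHEMA (v_schemas (i)); -- one export job per schema
   END LOOP;
END;
/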
You might also make use of metadata_filter with 'NAME_EXPR' to filter out only the required tables, and data_filter to filter out partitions of a table, using the Data Pump API; see the sketch below.
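As a sketch, the calls below would go inside the procedure (after the SCHEMA_EXPR filter and before start_job); the table name EMP and partition name P2024 are hypothetical:

   -- limit the export to the named tables only
   DBMS_DATAPUMP.metadata_filter (handle      => dp_handle,
                                  name        => 'NAME_EXPR',
                                  value       => 'IN (''EMP'')',
                                  object_path => 'TABLE');

   -- export only the partitions matching the expression, for a given table
   DBMS_DATAPUMP.data_filter (handle      => dp_handle,
                              name        => 'PARTITION_EXPR',
                              value       => 'IN (''P2024'')',
                              table_name  => 'EMP',
                              schema_name => v_schema);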

Happy working!

2 comments:

  1. Nice! =D

    Be careful with schema export! If your data is spread among tablespaces (for example, you have LOBs on a separate tablespace), the schema export can be inconsistent.

    1. Hi Gonzalo,

      I have not tried the scenario you mentioned. I'll try to perform one when I find time. Thanks!! :)
