
remap_datafile="'/ora01/dbfile/O11R2/tbsp301.dbf':'/ora02/O11R2/tb3.dbf'"

When Data Pump creates the tablespaces, any datafile path that matches the string to the left of the colon (:) is replaced with the text to the right of the colon.
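For example, a complete import parameter file that uses this remapping might look like the following; the credentials, directory, dump file name, and FULL mode are assumptions that follow this chapter's conventions:

userid=darl/foo
directory=dp
dumpfile=full.dmp
full=y
remap_datafile="'/ora01/dbfile/O11R2/tbsp301.dbf':'/ora02/O11R2/tb3.dbf'"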

Changing Segment and Storage Attributes

When importing, you can alter the storage attributes of a table by using the TRANSFORM parameter. The general syntax for this parameter is

TRANSFORM=transform_name:value[:object_type]

When you use SEGMENT_ATTRIBUTES:n for the transformation name, you can remove the following segment attributes during an import:

• Physical attributes

• Storage attributes

• Tablespaces

• Logging

Here’s an example that removes the segment attributes:

$ impdp darl/foo directory=dp dumpfile=inv.dmp transform=segment_attributes:n

You can remove just the storage clause by using STORAGE:n:

$ impdp darl/foo directory=dp dumpfile=inv.dmp transform=storage:n
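The optional third element of the TRANSFORM syntax limits the transformation to a particular object type. For example, this variation (a sketch using the chapter's sample credentials) removes the storage clause for tables only:

$ impdp darl/foo directory=dp dumpfile=inv.dmp transform=storage:n:table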

Importing into a Different Tablespace from the Original

Sometimes you’re exporting out of one database and want to import objects into another database, but you want to change the tablespaces for the tables and indexes—in other words, create the objects in the destination database but in tablespaces different than the source database configuration.

This example remaps the user as well as the tablespace. The original user and tablespace are HEERA and TBSP1, respectively. This command imports the TICKET table into the CHAYA user and the V_DATA tablespace:

$ impdp darl/foo directory=dp dumpfile=rm.dmp remap_schema=HEERA:CHAYA \

remap_tablespace=TBSP1:V_DATA tables=heera.ticket

The REMAP_TABLESPACE feature doesn’t re-create tablespaces. It only instructs Data Pump to place objects in different tablespaces (from where they were originally exported). When importing, if the tablespace that you’re placing the object in doesn’t exist, Data Pump throws an error.
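Therefore, create any target tablespaces before running the import. Here's a minimal sketch; the datafile path and size are illustrative:

CREATE TABLESPACE v_data DATAFILE '/ora02/O11R2/v_data01.dbf' SIZE 100M;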

Changing the Size of Datafiles

You can change the size of the datafiles when importing by using the TRANSFORM parameter with the PCTSPACE option. For example, if you want the tablespaces to be created at 20 percent of the original size, specify the following:

userid=darl/foo
directory=dp
dumpfile=phredstg.dmp
full=y
include=tablespace:"like 'TBSP%'"
transform=pctspace:20
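Assuming the preceding lines are saved in a file named pct.par (the name is illustrative), you invoke the import as follows:

$ impdp parfile=pct.par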

The tablespaces are created with datafiles 20 percent of their original size. The extent allocation sizes are also 20 percent of their original definition. This feature is useful when you export production data and then import it into a smaller database. In these scenarios, you may be filtering out some of the production data via the SAMPLE or QUERY parameters (discussed in the next section).

Filtering Data and Objects

Data Pump has a vast array of mechanisms to filter data and metadata. You can influence what is excluded or included in a Data Pump export or import in the following ways:

• Use the QUERY parameter to export or import subsets of data.

• Use the SAMPLE parameter to export a percentage of the rows in a table.

• Use the CONTENT parameter to exclude or include data and metadata.

• Use the EXCLUDE parameter to specifically name items to be excluded.

• Use the INCLUDE parameter to name the items to be included (thereby excluding other non-dependent items not included in the list).

• Use parameters like SCHEMAS to specify that you only want a subset of the database's objects (those that belong to the specified user or users).

Examples of each of these techniques are described in the following subsections.


Note
You can’t use EXCLUDE and INCLUDE at the same time. These parameters are mutually exclusive.

Specifying a Query

You can use the QUERY parameter to instruct Data Pump to write to a dump file only rows that meet a certain criterion. You may want to do this if you’re re-creating a test environment and only need subsets of the data. Keep in mind that this technique is unaware of any foreign-key constraints that may be in place, so you can’t blindly restrict the data sets without considering parent/child relationships.

The QUERY parameter has this general syntax:

QUERY = [schema.][table_name:] query_clause

The query clause can be any valid SQL clause. The query must be enclosed by either double quotes or single quotes. I recommend using double quotes because you may need single quotes embedded in the query to handle VARCHAR data. Also, you should use a parameter file so that there is no confusion about how the operating system interprets the quotation marks.

This example uses a parameter file and limits the rows exported for two tables. Here’s the parameter file used when exporting:

userid=darl/foo
directory=dp
dumpfile=inv.dmp
tables=inv,reg
query=inv:"WHERE inv_desc='Book'"
query=reg:"WHERE reg_id <=20"

Say you place the previous lines of code in a file named inv.par. The export job references the parameter file as shown:

$ expdp parfile=inv.par

The resulting dump file only contains rows filtered by the QUERY parameters. Again, be mindful of any parent/child relationships, and ensure that what gets exported won’t violate any constraints on the import.

You can also specify a query when importing data. Here’s a parameter file that limits the rows imported into the INV table based on the INV_ID column:

userid=darl/foo
directory=dp
dumpfile=inv.dmp
tables=inv,reg
query=inv:"WHERE inv_id > 10"

The previous text is placed in a file named inv2.par and is referenced during the import as follows:

$ impdp parfile=inv2.par

Only the rows in the INV table that are filtered via the query are imported.
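The general syntax also lets you qualify the table with a schema name. For example, had the table been exported from the chapter's sample HEERA schema, the query line could be written like this:

query=heera.inv:"WHERE inv_id > 10"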

Exporting a Percentage of the Data

When exporting, the SAMPLE parameter instructs Data Pump to retrieve a certain percentage of rows based on a number you provide. Data Pump doesn’t keep track of parent/child relationships when exporting. Therefore, this approach doesn’t work well when you have tables linked via foreign-key constraints and you’re trying to randomly select a percentage of rows.

Here’s the general syntax for this parameter:

SAMPLE=[[schema_name.]table_name:]sample_percent

For example, if you want to export 10 percent of the data in a table, do so as follows:

$ expdp darl/foo directory=dp tables=inv sample=10 dumpfile=inv.dmp

This next example exports two tables, but only 30 percent of the REG table's data:

$ expdp darl/foo directory=dp tables=inv,reg sample=reg:30 dumpfile=inv.dmp
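Per the general syntax, the table name in SAMPLE can also be schema-qualified. The following sketch assumes the exporting user has the privileges needed to export tables in the HEERA schema:

$ expdp darl/foo directory=dp tables=heera.reg sample=heera.reg:30 dumpfile=reg.dmp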


Note
The SAMPLE parameter is only valid for exports.

Excluding Objects from the Export File

For export, the EXCLUDE parameter instructs Data Pump not to export specified objects (whereas the INCLUDE parameter instructs Data Pump to include only specific objects in the export file). The EXCLUDE parameter has this general syntax:

EXCLUDE=object_type[:name_clause] [, ...]

The OBJECT_TYPE refers to a database object like TABLE or INDEX. To see which object types can be filtered, view the OBJECT_PATH column of DATABASE_EXPORT_OBJECTS, SCHEMA_EXPORT_OBJECTS, or TABLE_EXPORT_OBJECTS. For example, if you want to view what schema-level objects can be filtered, run this query:

SELECT
object_path
FROM schema_export_objects
WHERE object_path NOT LIKE '%/%';

Here’s a small snippet of the output:

OBJECT_PATH
------------------
STATISTICS
SYNONYM
SYSTEM_GRANT
TABLE
TABLESPACE_QUOTA
TRIGGER

The EXCLUDE parameter instructs Data Pump export to filter out specific objects from the export. For example, say you're exporting a table but want to exclude the indexes and grants:

$ expdp darl/foo directory=dp dumpfile=inv.dmp tables=inv exclude=index,grant

You can filter at a more granular level by using NAME_CLAUSE. The NAME_CLAUSE option of EXCLUDE allows you to specify a SQL filter. To exclude indexes whose names start with the string "INV", you use the following:

exclude=index:"LIKE 'INV%'"

The previous line requires quotation marks; in these scenarios, I recommend using a parameter file, because it's hard to predict how the operating system will interpret quotation marks on the command line. Here's a parameter file that contains an EXCLUDE clause:

userid=darl/foo
directory=dp
dumpfile=inv.dmp
tables=inv
exclude=index:"LIKE 'INV%'"
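Assuming you save these lines in a file named exc.par (the name is illustrative), the export is invoked as follows:

$ expdp parfile=exc.par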

A few aspects of the EXCLUDE clause may seem counterintuitive. For example, consider the following export parameter file:

userid=darl/foo
directory=dp
dumpfile=sch.dmp
exclude=schema:"='HEERA'"

If you attempt to exclude a user in this manner, an error is thrown. This is because the default mode of export is SCHEMA level, and Data Pump can't exclude and include a schema at the same time. If you want to exclude a user from an export file, specify FULL mode and exclude the user:

userid=darl/foo
directory=dp
dumpfile=sch.dmp
exclude=schema:"='HEERA'"
full=y
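Again, assuming the preceding lines are saved in a file named sch.par (an illustrative name), the export runs as follows:

$ expdp parfile=sch.par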

Excluding Statistics

If you want to exclude statistics from your export job, you can do so using the EXCLUDE parameter. Here’s an example:

$ expdp darl/foo directory=dp dumpfile=invp.dmp tables=f_sales exclude=statistics

By default, when you export a table object, any statistics are also exported. You can prevent statistics from being imported via the EXCLUDE parameter:

$ impdp darl/foo directory=dp dumpfile=invp.dmp tables=f_sales exclude=statistics

When importing, if you attempt to exclude statistics from a dump file that didn't originally include the statistics, then you receive this error:

ORA-39168: Object path STATISTICS was not found.

You also receive this error if the objects in the exported dump file never had statistics generated for them.

Including Only Specific Objects in an Export File

Use the INCLUDE parameter to include only certain database objects in the export file. The following example only exports the procedures and functions that a user owns:

$ expdp darl/foo dumpfile=proc.dmp directory=datapump include=procedure,function

The proc.dmp file that is created only contains the DDL required to re-create any procedures and functions the user owns.

When using INCLUDE, you can also specify that only specific PL/SQL objects should be exported:

$ expdp darl/foo directory=datapump dumpfile=ss.dmp \
include=function:\"=\'IS_DATE\'\"

When you’re only exporting specific PL/SQL objects, because of the issues with having to escape quote marks on the operating-system command line, I recommend using a parameter file. When you use a parameter file, you don’t have escape the quote marks. The following shows the contents of a parameter file that exports specific objects:

directory=datapump
dumpfile=pl.dmp
include=function:"='ISDATE'",procedure:"='DEPTREE_FILL'"
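Because the preceding parameter file doesn't include a USERID line, supply the credentials when invoking it. Assuming the file is saved as pl.par (an illustrative name):

$ expdp darl/foo parfile=pl.par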

If you specify an object that doesn’t exist, Data Pump throws an error but continues with the export operation:

ORA-39168: Object path FUNCTION was not found.

Exporting Table, Index, Constraint, and Trigger DDL

Suppose you want to export the DDL associated with tables, indexes, constraints, and triggers in your database. To do this, use the FULL export mode and only include tables:

$ expdp darl/foo directory=dp dumpfile=phredstg.dmp content=metadata_only full=y \
include=table

When you export an object, Data Pump also exports any dependent objects. So, when you export a table, you also get indexes, constraints, and triggers associated with the table.
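If you want that DDL in a readable script rather than a dump file, one approach is to run an import against the dump file with the SQLFILE parameter, which writes the DDL to a file instead of executing it. Here's a sketch that reuses the dump file from the previous example; the SQL file name is illustrative:

$ impdp darl/foo directory=dp dumpfile=phredstg.dmp sqlfile=phredstg.sql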

Excluding Objects from Import

In general, you can use the same techniques you use to filter objects in exports to exclude objects from an import. Use the EXCLUDE parameter; for example, to exclude triggers and procedures from being imported:

$ impdp darl/foo dumpfile=d.dmp directory=dp exclude=TRIGGER,PROCEDURE

You can further refine what is excluded by adding a SQL clause. For example, say you don't want to import triggers that begin with the letter B. Here's what the parameter file looks like:

userid=darl/foo
directory=dp
dumpfile=h.dmp
schemas=HEERA
exclude=trigger:"like 'B%'"
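Assuming the preceding lines are saved in a file named exc_trig.par (an illustrative name), the import is invoked as follows:

$ impdp parfile=exc_trig.par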

Including Objects in Import

You can use the INCLUDE parameter to reduce what is imported. Suppose you have a schema from which you want to import tables that begin with the letter A. Here's the parameter file:

userid=darl/foo
directory=dp
dumpfile=h.dmp
schemas=HEERA
include=table:"like 'A%'"

If you place the previous text in a file named h.par, then you can invoke the import as follows:

$ impdp parfile=h.par

In this example, the HEERA schema must already exist. Only tables that start with the letter A are imported.

Common Data Pump Tasks

This section describes common tasks and features you can use with Data Pump. Many of these features are standard parts of Data Pump, such as creating a consistent export or taking action when imported objects already exist in the database. Other features, such as compression and encryption, require the Enterprise Edition of Oracle and/or an extra license. I'll point out these requirements (if relevant) for the Data Pump feature being described.

Creating a Consistent Export

A consistent export means that all data in the export file is consistent as of a point in time or a system change number (SCN). When you're exporting an active database with many parent-child tables, you should ensure that you get a consistent snapshot of the data. You create a consistent export by using either the FLASHBACK_SCN or FLASHBACK_TIME parameter.
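For example, one approach (a sketch; the SCN value shown is illustrative) is to query the database's current SCN and then pass that value to the export via FLASHBACK_SCN:

SELECT current_scn FROM v$database;

$ expdp darl/foo directory=dp dumpfile=full.dmp full=y flashback_scn=5555555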
