Set the Oracle globalization variables required for your locale; a default locale is assumed if none are set. Also add the Instant Client directory to your PATH. If it is not set, then an absolute or relative path must be used to start the utilities provided in the Tools package. On Windows, the environment may be configured using SET commands in a command prompt, as in the sketch below, or made permanent by setting Environment Variables in System Properties.
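For example, a minimal sketch of the command-prompt approach, assuming a hypothetical install directory of C:\instantclient and an American English, UTF-8 locale (adjust both values for your system):

    SET PATH=C:\instantclient;%PATH%
    SET NLS_LANG=AMERICAN_AMERICA.AL32UTF8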
For example, to set environment variables in Windows using System Properties, open System from the Control Panel, click the Advanced tab and then click Environment Variables. After the Tools Instant Client package is installed and configured, you can connect to a database.
The utilities supplied in the Tools Instant Client always run remotely from the database server. It is assumed that an Oracle instance is up and running on the server and that its TNS listener is running. For the Data Pump Export and Import clients, the dump files reside on the remote server; an Oracle directory object must exist on the server and should have the appropriate permissions.
To connect to a database, you must specify the database using an Oracle Net connection identifier. For example, on UNIX, if your tnsnames.ora file defines a Net Service Name for the target database, you can point the TNS_ADMIN environment variable at the directory containing that file and connect with the service name.
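A sketch of that pattern, using a hypothetical /home/user1/tnsnames.ora that defines a service name MYDB, with Data Pump Export as the client (the dmpdir directory object and hr.dmp file name are also placeholders, and the directory object must already exist on the server):

    export TNS_ADMIN=/home/user1
    expdp hr@MYDB SCHEMAS=hr DIRECTORY=dmpdir DUMPFILE=hr.dmp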
When uninstalling the Tools Instant Client, the files to be deleted should be in the Instant Client directory that you specified at installation. Be sure you do not remove any Oracle home files; the same caution applies when uninstalling the complete Instant Client.

In the following example, dependent data is loaded from an external data file. There is not yet any benefits information available, so that column is shown as NULL in the dependents data file. Information about the load is written to a log file.
The log file records which table was loaded, how many rows were processed, and any errors that occurred.

Oracle Database XE provides command-line utilities for exporting and importing data: Data Pump Export and Import (expdp and impdp) and the original Export and Import utilities (exp and imp). The following sections provide an overview of each utility and of when you might want to use each one. The Data Pump Export utility exports data and metadata into a set of operating system files called a dump file set. The Data Pump Import utility imports an export dump file set into a target Oracle database.
A dump file set is made up of one or more disk files that contain table data, database object metadata, and control information. The files are written in a proprietary, binary format, which means that the dump file set can be imported only by the Data Pump Import utility.
The dump file set can be imported to the same database or it can be moved to another system and loaded into the Oracle database there. Because the dump files are written by the database, rather than by the Data Pump client application, you must create directory objects for the directories to which files will be written. A directory object is a database object that is an alias for a directory in the host operating system's file system. Data Pump Export and Import enable you to move a subset of the data and metadata.
This is done by using Data Pump parameters to specify export and import modes, as well as various filtering criteria. You can also perform exports and imports over a network. In a network export, the data from the source database instance is written to a dump file set on the connected database instance. In a network import, a target database is loaded directly from a source database with no intervening dump files.
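As a sketch of the network case: assuming a database link named source_db exists in the target database and points to the source database (the link name, schema, and directory object here are hypothetical), a schema can be pulled across directly with the Data Pump Import NETWORK_LINK parameter:

    impdp hr DIRECTORY=dmpdir SCHEMAS=hr NETWORK_LINK=source_db LOGFILE=net_imp.log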
Performing the operation over the network in this way allows the export and import to run concurrently, minimizing total elapsed time. Data Pump Export and Import also provide a set of interactive commands so that you can monitor and modify ongoing export and import jobs.

In the following example, suppose that you want to make some changes to the HR sample schema and then test those changes without affecting the current HR schema. You could export the HR schema and then import it into a new HRDEV schema, where you could perform development work and conduct testing.
To do this, take the following steps. First, at the operating system command prompt, issue the command appropriate to your operating system to create the directory where the exported files will be placed. Then, at the SQL prompt, enter the commands to create a directory object named dmpdir for the directory that you just created and to grant read and write access on it to user HR; a sketch of these commands follows.
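A sketch of those two steps, assuming a Linux host and a hypothetical /tmp/dpdump directory (any directory that the database server process can write to will do):

    mkdir /tmp/dpdump

    sqlplus system
    SQL> CREATE OR REPLACE DIRECTORY dmpdir AS '/tmp/dpdump';
    SQL> GRANT READ, WRITE ON DIRECTORY dmpdir TO hr;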
Next, export the HR schema to a dump file in the dmpdir directory using the Data Pump Export client. Finally, import that dump file into the new HRDEV schema using the Data Pump Import client; if a table already exists in the target schema, it is replaced with the table from the export file. As the import operation takes place, progress messages are displayed on screen, and the same output is written to the import log file.
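For example, a sketch of the export and import commands, with hypothetical file names (schema.dmp for the dump file, expschema.log and impschema.log for the logs) and the REMAP_SCHEMA parameter used to load the HR objects into HRDEV:

    expdp system SCHEMAS=hr DIRECTORY=dmpdir DUMPFILE=schema.dmp LOGFILE=expschema.log

    impdp system SCHEMAS=hr DIRECTORY=dmpdir DUMPFILE=schema.dmp REMAP_SCHEMA=hr:hrdev TABLE_EXISTS_ACTION=replace LOGFILE=impschema.log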
The original Export and Import utilities provide a simple way for you to transfer data objects between Oracle databases. They are invoked with the exp and imp commands, respectively. When you run the Export utility against an Oracle database, objects such as tables are extracted, followed by their related objects such as indexes, comments, and grants, if any. The extracted data is written to an export dump file. The dump file is an Oracle binary-format dump file that can be read only by the Import utility, and the version of the Import utility cannot be earlier than the version of the Export utility used to create the dump file.
As with Data Pump Export and Import, data exported with the Export utility can be imported with the Import utility into the same or a different Oracle database. See Oracle Database Utilities for further information about the Export and Import utilities and for examples of how to use them.
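As a rough sketch of a schema-level transfer with these original utilities, using hypothetical file names (FROMUSER/TOUSER plays roughly the role that REMAP_SCHEMA plays for Data Pump):

    exp system OWNER=hr FILE=hr.dmp LOG=exphr.log

    imp system FILE=hr.dmp FROMUSER=hr TOUSER=hrdev LOG=imphr.log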
The following terms are used when discussing ways to move data.

Exporting: copying database data to external files for import into another Oracle database only. The files are in a proprietary binary format.

Importing: copying data into the database from external files that were created by exporting from another Oracle database.

Unloading: copying database data to external text files for consumption by another Oracle database or by another application, such as a spreadsheet application. The text files are in an industry-standard format such as tab-delimited or comma-delimited (CSV).

Besides the Import (imp) and Export (exp) utilities described above, step-by-step wizards are available for loading and unloading data. The wizards have the following features. You can load or unload XML files or delimited-field text files, such as comma-delimited files.
You can load by copying and pasting from a spreadsheet. You can omit (skip) columns when loading or unloading. You can load into an existing table or create a new table from the loaded data. When loading into a new table, column names can be taken from the loaded data.
Limitations include the following: the wizards load and unload table data only.

Another way to load data is with SQL*Loader. Its input can use stream record format, in which records are differentiated by record terminators rather than by a declared length (a declared length is a characteristic of variable record format); the record terminator can comprise both printable and nonprintable characters, and the SQL*Loader parser can recognize stream record format. SQL*Loader uses several files during the course of its operation: input data files, the control file, the log file, the bad file, and the discard file. A sketch of a typical invocation follows.
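This is a hedged sketch with hypothetical file names; the bad and discard files are written only when records are rejected or discarded:

    sqlldr hr CONTROL=dependents.ctl DATA=dependents.dat LOG=dependents.log BAD=dependents.bad DISCARD=dependents.dsc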
The control file specifies information such as where the input data comes from, the data storage format, the loader configuration, and how the loader should manipulate the input data. The input data file uses one of three formats: fixed record format, variable record format, or stream record format. The control file's INFILE clause can specify the input data file's format as well as its name.
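As a sketch of how the format can be stated in the INFILE clause (the file names are placeholders): a fixed-format file is declared with "fix" and a record length, a variable-format file with "var" and the size of the leading length field, and a stream-format file with "str" and its terminator string:

    INFILE 'example1.dat' "fix 80"
    INFILE 'example2.dat' "var 3"
    INFILE 'example3.dat' "str '|\n'"

If no format string is given, stream record format with the default record terminator is assumed.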
It specifies information such as the location of the data, how to parse and interpret the data, and where to write the data in the database, and it can also contain input data itself. Control files are loosely organized into three sections. The first section of the file contains session-wide information, such as bind size and which records to skip.
The second section describes the data to be loaded; this information includes the table's name and its column names. The optional third section of the file contains the input data itself. A small sketch of a control file follows.
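This is a minimal sketch only, assuming a hypothetical comma-delimited dependents.dat file and a DEPENDENTS table whose column names are invented for illustration:

    -- dependents.ctl: load comma-delimited dependent records
    LOAD DATA
    INFILE 'dependents.dat'
    APPEND
    INTO TABLE dependents
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (dependent_id, first_name, last_name, relationship, benefits)

The comment on the first line uses the two-hyphen comment syntax described below, and the trailing list maps data fields, in order, to table columns.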
The log file records the progress of the load; this information includes descriptions of any errors that occur during the load. The load session terminates if log-file creation fails, so a log file is mandatory. When SQL*Loader cannot load a record, for example because of a data error, it stores such a record in a bad file. Discard files, which are optional, contain records filtered out of the input data because they don't match the record-selection criteria specified in the control file. Regarding control-file syntax: a comment must be preceded by two hyphens, and the code is free-format and case-insensitive.