Home » Articles » Misc » Here

SQL Developer 3.1 Data Pump Wizards (expdp, impdp)

SQL Developer 3.1 includes a neat GUI interface for Data Pump, allowing you to do on-the-fly exports and imports without having to remember the expdp/impdp command line syntax. This article gives an overview of these wizards.

Related articles.

- Data Pump (expdp, impdp) : All Articles

Getting Started

The data pump wizards are accessible from the DBA browser (View > DBA). If no connections are available, click the "+" icon, select the appropriate connection from the drop-down list and click the "OK" button. In this case I will be using the "system" connection.

Expanding the connection node in the tree lists a number of functions, including "Data Pump". Expanding the "Data Pump" node displays the "Export Jobs" and "Import Jobs" nodes, which can be used to monitor running data pump jobs. This tree will be the starting point for the operations listed in the following sections.

Data Pump Export Wizard

Right-click on either the "Data Pump" or "Export Jobs" tree node and select the "Data Pump Export Wizard..." menu option.

Check the connection details are correct and select the type of export you want to perform, then click the "Next" button. The screens that follow will vary depending on the type of export you perform. In this case I will do a simple schema export.

For the schema export, we must select the schema to be exported. To do this, highlight the schema of interest in the left-hand "Available" pane, then click the ">" button to move it to the right-hand "Selected" pane. When you are happy with your selection, click the "Next" button.

If you have any specific include/exclude filters, add them and click the "Next" button.

If you want to apply a WHERE clause to any or all of the tables, enter the details in the "Table Data" screen, then click the "Next" button.

The "Options" screen allows you to increase the parallelism of the export, name the log file and control the read-consistent point in time if necessary. When you have selected your specific options, click the "Next" button.

Enter a suitable dump file name by double-clicking on the default name, choose the appropriate action should the file already exist, then click the "Next" button.

The default is to run the job immediately. If you want to schedule the export to run at a later time, or at regular intervals, enter the details here. Click the "Next" button.

Check the summary information is correct. If you need to keep a copy of the job you have just defined, click on the "PL/SQL" tab to see the code. When you are ready, click the "Finish" button.

Once the job is initiated, it can be seen under the "Export Jobs" node of the tree, where it can be monitored. As normal, the dump file and log file are located in the specified directory on the database server.

Data Pump Import Wizard

In this section we will import the SCOTT schema, exported in the previous section, into a new user. The new user was created as follows.

CREATE USER scott_copy IDENTIFIED BY scott_copy;
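For comparison, a schema export like the one defined through the wizard could be run from the command line with expdp. This is a sketch only: the connection string, directory object and file names below are assumptions, not values taken from the article.

```
# Command-line equivalent of a simple schema export of SCOTT.
# Connection, directory and file names are illustrative assumptions.
expdp system@orcl schemas=SCOTT directory=DATA_PUMP_DIR \
  dumpfile=SCOTT.dmp logfile=expdpSCOTT.log
```

The "Options" screen settings map to parameters such as PARALLEL and LOGFILE; the "PL/SQL" tab on the summary screen shows the DBMS_DATAPUMP code the wizard generates instead.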
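Importing the exported SCOTT schema into the scott_copy user from the command line would rely on the REMAP_SCHEMA parameter of impdp. Again, this is a hedged sketch with assumed connection, directory and file names:

```
# Import the SCOTT dump file, remapping objects into SCOTT_COPY.
# Connection, directory and file names are illustrative assumptions.
impdp system@orcl schemas=SCOTT directory=DATA_PUMP_DIR \
  dumpfile=SCOTT.dmp logfile=impdpSCOTT.log \
  remap_schema=SCOTT:SCOTT_COPY
```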