In Oracle Database 23ai, you can copy the dump files even before the export completes.
This saves time during your migration because you can start moving files to the target host while the export is in progress. Additionally, it potentially saves disk space because you can move the files away from the source host.
Which Files Can You Move?
After starting the Data Pump export, you must connect to the database using:
- The same user who started the export
- A user with the DATAPUMP_EXP_FULL_DATABASE role
Replace the job owner and name, and execute:
```sql
set serverout on
declare
   l_job_owner varchar2(30) := 'DPUSER';
   l_job_name  varchar2(30) := 'SYS_EXPORT_FULL_01';
   l_handle    number;
   l_stsmask   integer := dbms_datapump.ku$_status_job_status;
   l_job_state varchar2(30);
   l_status    ku$_status;
   l_dump_file ku$_dumpfile;
begin
   -- Attach to the running job and fetch its status
   l_handle := dbms_datapump.attach(l_job_name, l_job_owner);
   dbms_datapump.get_status(l_handle, l_stsmask, NULL, l_job_state, l_status);
   dbms_datapump.detach(l_handle);
   for i in l_status.job_status.files.first .. l_status.job_status.files.last loop
      l_dump_file := l_status.job_status.files(i);
      -- Skip the dump file template entry
      if l_dump_file.file_type = dbms_datapump.ku$_dumpfile_type_template then
         continue;
      end if;
      -- A file is complete when Data Pump has written it to its full size
      if l_dump_file.file_bytes_written = l_dump_file.file_size then
         dbms_output.put_line('DONE: ' || l_dump_file.file_name);
      end if;
   end loop;
end;
/
```
The code lists all the files that Data Pump has marked as completed. Data Pump no longer writes to those files, and you can safely start moving them to the target system.
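From the shell, you can post-process that output to drive a copy job. A minimal sketch, using sample lines in place of a live SQL*Plus session (the file names and the hypothetical `sqlplus` call in the comment are illustrative only):

```shell
# Extract the completed file names from "DONE: <file>" lines.
# In a real migration the printf would be replaced by something like
#   sqlplus -s dpuser@pdb1 @list_done_files.sql
# (connect string and script name are made up for this example).
printf 'DONE: /u01/dp/exp01.dmp\nDONE: /u01/dp/exp02.dmp\n' |
  grep '^DONE: ' |
  cut -d' ' -f2-
# Prints:
# /u01/dp/exp01.dmp
# /u01/dp/exp02.dmp
```

Each resulting path could then be handed to scp or a similar tool to move the file to the target host.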
Prerequisites
- You must use multiple dump files. You do that by including %L in your DUMPFILE specification:
  DUMPFILE=exp%L.dmp
- You must specify a file size that allows Data Pump to rotate into multiple files when they are full. I suggest 5G, which is also a good setting for most cloud migrations:
  FILESIZE=5G
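Putting both prerequisites together, a hypothetical expdp invocation might look like this (the user, service, directory, and log file names are made up):

```
expdp dpuser@pdb1 \
   full=y \
   directory=DATA_PUMP_DIR \
   dumpfile=exp%L.dmp \
   filesize=5G \
   logfile=exp_full.log
```

Data Pump then rotates into a new dump file (exp01.dmp, exp02.dmp, and so on) each time the current file reaches 5 GB.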
What About
- The code examines the dump file's maximum size and the bytes written to determine whether the file is full. This is not the same as the physical size of the file in the operating system, so you can't use the file size information from the file system.
- What about using rsync or similar tools? I guess that could work; however, we didn't test that.
- You could query the Data Pump control table for the information:
  ```sql
  SELECT file_name
  FROM   <control_table>
  WHERE  process_order = -21
  AND    file_max_size = completed_bytes
  AND    file_max_size <> 0
  ORDER  BY file_name;
  ```
  However, querying the control table is not a supported approach. Use the PL/SQL above.
Happy exporting!
Daniel:
Thanks so much for suggesting this approach during the migration; it will save migration time. If my database is small, I can migrate all data by importing dump files, or by using RMAN to back up the database and restore it to a new target host, such as a cloud service. Am I right?
Frank
Hi Frank,
If your Oracle Database is small, you might be able to import directly over a network link. But that requires SQL*Net connectivity (port 1521) from the cloud back to the on-prem machine. In my experience that is often not possible for security reasons, plus there are some limitations on network import. Most of the time an export to dump file is still faster; however, a network import might be more convenient.
RMAN backups or even standby databases are good ways for migrating as well, provided the source is on a compatible platform.
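For reference, a network-mode import is driven by the NETWORK_LINK parameter, which points at a database link back to the source; a hypothetical invocation (user, link, and log names are made up) could look like:

```
impdp dpuser@targetpdb \
   full=y \
   network_link=SOURCEDB_LINK \
   logfile=imp_net.log
```

No dump files are written in this mode; the data streams over the database link.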
Regards,
Daniel