When you migrate Oracle Database, you often need to transport statistics using DBMS_STATS. Also, during migrations, you want to minimize the downtime. How can you transfer statistics as quickly as possible?
Export and Import
The easiest solution is to use dbms_stats.export_database_stats and dbms_stats.import_database_stats. However, these procedures have no parallel capabilities, so in an Oracle Database with many objects, they take a long time.
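For reference, a minimal database-level transfer could look like the sketch below. The staging table name DBSTATS and its owner STATUSER are placeholder names, and the staging table must be moved to the target database between the export and the import:

```sql
--Create a staging table to hold the exported statistics
exec dbms_stats.create_stat_table(ownname => 'STATUSER', stattab => 'DBSTATS');

--On the source: export all statistics into the staging table
exec dbms_stats.export_database_stats(stattab => 'DBSTATS', statown => 'STATUSER');

--On the target (after moving the staging table): import the statistics
exec dbms_stats.import_database_stats(stattab => 'DBSTATS', statown => 'STATUSER');
```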
Parallel Export and Import
dbms_stats also allows you to export and import statistics at the schema or table level. You can use this to your advantage and run multiple export or import commands at the same time.
Imagine a database with four schemas.
If you export database stats, the database creates a list of statistics from all four schemas. Then, it exports them one by one.
- Session 1:
exec dbms_stats.export_database_stats( ... );
In contrast, if you start four different sessions, you can export one schema in each session and thus finish much faster.
- Session 1:
exec dbms_stats.export_schema_stats(ownname=>'SCHEMA1');
- Session 2:
exec dbms_stats.export_schema_stats(ownname=>'SCHEMA2');
- Session 3:
exec dbms_stats.export_schema_stats(ownname=>'SCHEMA3');
- Session 4:
exec dbms_stats.export_schema_stats(ownname=>'SCHEMA4');
Benchmark
This benchmark gives you an idea of how much time you can save.
It is an Oracle E-Business Suite (EBS) database with 1,400 schemas and 150,000 tables/indexes.
I compare exporting and importing statistics at the database level and at the schema level. For the schema level, I divide the schemas into chunks and process them simultaneously from separate sessions.
| Method | Time to export | Time to import | Total |
|---|---|---|---|
| Database | 4m 5s | 21m 5s | 25m 10s |
| Schema, parallel 8 | 57s | 3m 58s | 4m 55s |
| Schema, parallel 16 | 53s | 3m 45s | 4m 38s |
I can save more than 20 minutes in this benchmark by doing the work in parallel. Between parallel 8 and parallel 16, there is not much to gain, probably because of contention.
How to Export and Import in Parallel Using DBMS_STATS
Here is some simple code you can use to export and import in parallel.
I haven’t coded for years, so use at your own risk 🙂
- Create a dedicated user for the statistics staging table and create a control table with the list of schemas to process.
```sql
--Recreate the dedicated user
drop user statuser cascade;
grant dba to statuser identified by statuser;
alter user statuser quota unlimited on users;

--Create the DBMS_STATS staging table
exec dbms_stats.create_stat_table('STATUSER', 'DBSTATS', 'USERS');

--Create and populate the control table
declare
   l_stmt           VARCHAR2(1000);
   l_ctl_tab_name   ALL_TABLES.TABLE_NAME%TYPE := 'DOH$STATXFERCTL';
   l_stat_tab_owner ALL_USERS.USERNAME%TYPE    := 'STATUSER';
begin
   --Order schemas by how many tables they have
   l_stmt := 'CREATE TABLE ' || l_stat_tab_owner || '.' || l_ctl_tab_name;
   l_stmt := l_stmt || ' as select o.owner stat_id, count(o.table_name) as cnt,'
                    || ' 0 exp_status, 0 imp_status'
                    || ' from dba_tables o, dba_users u'
                    || ' where u.username not in (''SYS'', ''SYSTEM'', ''' || l_stat_tab_owner || ''')'
                    || ' and u.oracle_maintained = ''N'''
                    || ' and o.owner = u.username'
                    || ' group by o.owner'
                    || ' order by 2 desc';
   execute immediate l_stmt;
end;
/
```
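Before starting any worker sessions, it can be worth sanity-checking the control table. With the names used above, this query should list the largest schemas first, with both status columns at 0:

```sql
--Largest schemas first; exp_status and imp_status should all be 0
select stat_id, cnt, exp_status, imp_status
from   statuser.doh$statxferctl
order  by cnt desc;
```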
- Start any number of database sessions and run the code below in each. The code selects a schema from the control table and performs a schema-level export of statistics into the staging table.
```sql
declare
   l_cur_stat_id  ALL_USERS.USERNAME%TYPE;
   l_statown      ALL_USERS.USERNAME%TYPE    := 'STATUSER';
   l_stattab      ALL_TABLES.TABLE_NAME%TYPE := 'DBSTATS';
   l_ctl_tab_name ALL_TABLES.TABLE_NAME%TYPE := 'DOH$STATXFERCTL';
   l_stmt         VARCHAR2(1000);
begin
   begin
      --Loop until the select raises no_data_found
      while true loop
         --Pick the largest unprocessed schema to export it first
         --(the inline view is needed so rownum is applied after the order by)
         l_stmt := 'select stat_id from (select stat_id from '
                   || l_statown || '.' || l_ctl_tab_name
                   || ' where exp_status = 0 order by cnt desc) where rownum = 1';
         execute immediate l_stmt into l_cur_stat_id;

         --Try to claim the schema; the exp_status = 0 predicate ensures
         --only one session wins if several sessions picked the same schema
         l_stmt := 'update ' || l_statown || '.' || l_ctl_tab_name
                   || ' set exp_status = 1 where stat_id = :x1 and exp_status = 0';
         execute immediate l_stmt using l_cur_stat_id;
         if sql%rowcount = 0 then
            --Another session claimed this schema; pick the next one
            rollback;
            continue;
         end if;
         --Commit to release the row lock on the control table
         commit;

         --Perform the schema level export into the DBMS_STATS staging table
         dbms_stats.export_schema_stats(
            ownname => l_cur_stat_id,
            stattab => l_stattab,
            statid  => l_cur_stat_id,
            statown => l_statown);

         --Mark the schema as completed in the control table
         l_stmt := 'update ' || l_statown || '.' || l_ctl_tab_name
                   || ' set exp_status = 2 where stat_id = :x1';
         execute immediate l_stmt using l_cur_stat_id;
         commit;
      end loop;
   exception
      when no_data_found then
         --Raised by the select when there are no more schemas to process
         null;
   end;
end;
/
```
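While the export workers run, you can follow their progress from a separate session. In the control table, status 0 means pending, 1 in progress, and 2 done:

```sql
--How many schemas are pending (0), in progress (1), and done (2)
select exp_status, count(*) as schemas
from   statuser.doh$statxferctl
group  by exp_status;
```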
- Now move the STATUSER schema (the DBSTATS staging table and the control table) to the target database, for example with Data Pump, and run the import. Start any number of database sessions and run the code below in each. The code selects a schema from the control table and performs a schema-level import of statistics from the staging table.
```sql
declare
   l_cur_stat_id  ALL_USERS.USERNAME%TYPE;
   l_statown      ALL_USERS.USERNAME%TYPE    := 'STATUSER';
   l_stattab      ALL_TABLES.TABLE_NAME%TYPE := 'DBSTATS';
   l_ctl_tab_name ALL_TABLES.TABLE_NAME%TYPE := 'DOH$STATXFERCTL';
   l_stmt         VARCHAR2(1000);
   no_stats_imp   EXCEPTION;
   PRAGMA EXCEPTION_INIT(no_stats_imp, -20000);
begin
   begin
      --Loop until the select raises no_data_found
      while true loop
         --Pick the largest unprocessed schema to import it first
         --(the inline view is needed so rownum is applied after the order by)
         l_stmt := 'select stat_id from (select stat_id from '
                   || l_statown || '.' || l_ctl_tab_name
                   || ' where imp_status = 0 order by cnt desc) where rownum = 1';
         execute immediate l_stmt into l_cur_stat_id;

         --Try to claim the schema; the imp_status = 0 predicate ensures
         --only one session wins if several sessions picked the same schema
         l_stmt := 'update ' || l_statown || '.' || l_ctl_tab_name
                   || ' set imp_status = 1 where stat_id = :x1 and imp_status = 0';
         execute immediate l_stmt using l_cur_stat_id;
         if sql%rowcount = 0 then
            --Another session claimed this schema; pick the next one
            rollback;
            continue;
         end if;
         --Commit to release the row lock on the control table
         commit;

         begin
            --Perform the schema level import from the DBMS_STATS staging table
            dbms_stats.import_schema_stats(
               ownname => l_cur_stat_id,
               stattab => l_stattab,
               statid  => l_cur_stat_id,
               statown => l_statown);
         exception
            when no_stats_imp then
               --An empty schema, i.e., no tables or indexes
               null;
         end;

         --Mark the schema as completed in the control table
         l_stmt := 'update ' || l_statown || '.' || l_ctl_tab_name
                   || ' set imp_status = 2 where stat_id = :x1';
         execute immediate l_stmt using l_cur_stat_id;
         commit;
      end loop;
   exception
      when no_data_found then
         --Raised by the select when there are no more schemas to process
         null;
   end;
end;
/
```
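Once the import workers have drained the control table, a quick check confirms that nothing was left behind. Any rows returned here are schemas that did not complete the import:

```sql
--Schemas that have not completed the import
select stat_id, imp_status
from   statuser.doh$statxferctl
where  imp_status <> 2;
```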