How to set up Streams Replication between two Databases
Purpose of Document:
This document describes how to set up Streams Replication between two Oracle 10.2.0.x instances.
Note:
In the examples below, the following servers are used:
Instance: DGDBPROD - Source database (ora256.dce.harvard.edu)
Instance: DGDBSTRM - Destination database (ora256.dce.harvard.edu)
The examples below show how to replicate DML & DDL changes in
the joel schema from the DGDBPROD instance to the DGDBSTRM instance.
I) First, create a Streams admin user (strmadmin) in both database instances; this user will manage the replication of data between the two instances.
Execute the following on both source and destination database instances, adjusting the datafile path for each:
On DGDBPROD:
CREATE TABLESPACE streams_tbs
DATAFILE '/u02/cscie271/oradata/DGDBPROD/streams_tbs.dbf' SIZE 25M;
On DGDBSTRM:
CREATE TABLESPACE streams_tbs
DATAFILE '/u03/cscie271/oradata/DGDBSTRM/streams_tbs.dbf' SIZE 25M;
CREATE USER strmadmin IDENTIFIED BY ******** DEFAULT
TABLESPACE streams_tbs TEMPORARY TABLESPACE temp QUOTA UNLIMITED ON streams_tbs;
GRANT CONNECT, RESOURCE, AQ_ADMINISTRATOR_ROLE, DBA TO strmadmin;
BEGIN
DBMS_STREAMS_AUTH.GRANT_ADMIN_PRIVILEGE(
grantee => 'strmadmin',
grant_privileges => true);
END;
/
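To confirm the grants took effect, you can query the DBA_STREAMS_ADMINISTRATOR view (available in 10.2) on each instance; a minimal sanity check:

```sql
-- Run as SYS or strmadmin on each instance; expect one row for STRMADMIN
SELECT username, local_privileges, access_from_remote
FROM dba_streams_administrator
WHERE username = 'STRMADMIN';
```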
II) Next create database links to and from each database that is part of the replication – i.e. on
DGDBPROD create a link to DGDBSTRM and on DGDBSTRM create a link to DGDBPROD.
Also, create a capture queue for the Streams capture on DGDBPROD (source) and an apply queue for the Streams apply on DGDBSTRM (target).
On DGDBPROD execute:
CONNECT strmadmin/*****@dgdbprod
CREATE DATABASE LINK DGDBSTRM CONNECT TO strmadmin IDENTIFIED BY ***** USING 'DGDBSTRM';
EXEC DBMS_STREAMS_ADM.SET_UP_QUEUE(queue_name=>'STRMADMIN.CAP_DGDBPROD');
On DGDBSTRM execute:
CONNECT strmadmin/*****@DGDBSTRM
CREATE DATABASE LINK DGDBPROD CONNECT TO strmadmin IDENTIFIED BY ***** USING 'DGDBPROD';
EXEC DBMS_STREAMS_ADM.SET_UP_QUEUE(queue_name=>'STRMADMIN.APPLY_DGDBSTRM');
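Before moving on, it is worth verifying that the links resolve and that both queues exist. A quick check, assuming the setup above:

```sql
-- On DGDBPROD: confirm the link to DGDBSTRM works (and vice versa on DGDBSTRM)
SELECT * FROM dual@DGDBSTRM;

-- On each instance: confirm the queue and its queue table were created
SELECT name, queue_table, enqueue_enabled, dequeue_enabled
FROM dba_queues
WHERE owner = 'STRMADMIN';
```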
III) Next, prepare the target database for Streams apply by creating an apply process.
On DGDBSTRM:
For a schema:
BEGIN
DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
schema_name => 'joel',
streams_type => 'apply',
streams_name => 'apply_stream',
queue_name => 'strmadmin.APPLY_DGDBSTRM',
include_dml => true,
include_ddl => true,
source_database => 'DGDBPROD',
inclusion_rule => true);
END;
/
Or For a table:
BEGIN
DBMS_STREAMS_ADM.ADD_TABLE_RULES(
table_name => 'joel.emp',
streams_type => 'apply',
streams_name => 'apply_stream',
queue_name => 'strmadmin.APPLY_DGDBSTRM',
include_dml => true,
include_ddl => true,
include_tagged_lcr => true,
source_database => 'DGDBPROD',
inclusion_rule => true);
END;
/
Now specify an apply user:
BEGIN
DBMS_APPLY_ADM.ALTER_APPLY(
apply_name => 'apply_stream',
apply_user => 'strmadmin');
END;
/
Now make sure that the apply process doesn’t abort on error:
BEGIN
DBMS_APPLY_ADM.SET_PARAMETER(
apply_name => 'apply_stream',
parameter => 'disable_on_error',
value => 'n');
END;
/
Now start the apply process:
BEGIN
DBMS_APPLY_ADM.START_APPLY(
apply_name => 'apply_stream');
END;
/
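Once started, the apply process should report ENABLED. A quick status check on DGDBSTRM:

```sql
-- STATUS should be 'ENABLED'; if it aborted, ERROR_MESSAGE in DBA_APPLY
-- explains why, and failed transactions accumulate in DBA_APPLY_ERROR
SELECT apply_name, queue_name, status
FROM dba_apply;
```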
IV) Next, prepare the source database for Streams capture by building capture and propagation rules on the source; these rules determine which changes are captured.
On source DGDBPROD, turn on supplemental logging for each table being replicated (or turn it on for the whole database):
connect SYS/password as SYSDBA
ALTER TABLE joel.emp
ADD SUPPLEMENTAL LOG GROUP dept_pk(deptno) ALWAYS;
Or
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA
(PRIMARY KEY, UNIQUE, FOREIGN KEY) COLUMNS;
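To verify which supplemental logging is in effect, query V$DATABASE (database level) and DBA_LOG_GROUPS (table level):

```sql
-- Database-level supplemental logging flags (YES/NO)
SELECT supplemental_log_data_min,
       supplemental_log_data_pk,
       supplemental_log_data_ui,
       supplemental_log_data_fk
FROM v$database;

-- Table-level log groups for the replicated schema
SELECT log_group_name, table_name, always
FROM dba_log_groups
WHERE owner = 'JOEL';
```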
On source DGDBPROD, now create capture rules:
For a schema:
BEGIN
DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
schema_name => 'JOEL',
streams_type => 'capture',
streams_name => 'CAPTURE_JOEL_DGDBPROD',
queue_name => 'strmadmin.CAP_DGDBPROD',
include_dml => true,
include_ddl => true,
source_database => 'DGDBPROD',
inclusion_rule => true);
END;
/
For a table:
BEGIN
DBMS_STREAMS_ADM.ADD_TABLE_RULES(
table_name => 'JOEL.EMP',
streams_type => 'capture',
streams_name => 'CAPTURE_JOEL_EMP_DGDBPROD',
queue_name => 'strmadmin.CAP_DGDBPROD',
include_dml => true,
include_ddl => true,
include_tagged_lcr => true,
source_database => 'DGDBPROD',
inclusion_rule => true);
END;
/
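ADD_SCHEMA_RULES/ADD_TABLE_RULES implicitly create the capture process named by streams_name; it stays disabled until it is started in step VI. To confirm it exists on DGDBPROD:

```sql
-- Expect STATUS = 'DISABLED' at this point; step VI enables it
SELECT capture_name, queue_name, status
FROM dba_capture;
```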
On source DGDBPROD, now create propagation rules:
For a schema:
BEGIN
DBMS_STREAMS_ADM.ADD_SCHEMA_PROPAGATION_RULES(
schema_name => 'JOEL',
streams_name => 'CAPTURE_JOEL_DGDBPROD',
source_queue_name => 'strmadmin.CAP_DGDBPROD',
destination_queue_name => 'strmadmin.APPLY_DGDBSTRM@DGDBSTRM',
include_dml => true,
include_ddl => true,
source_database => 'DGDBPROD',
inclusion_rule => true);
END;
/
For a table:
BEGIN
DBMS_STREAMS_ADM.ADD_TABLE_PROPAGATION_RULES(
table_name => 'JOEL.EMP',
streams_name => 'CAPTURE_JOEL_DGDBPROD',
source_queue_name => 'strmadmin.CAP_DGDBPROD',
destination_queue_name => 'strmadmin.APPLY_DGDBSTRM@DGDBSTRM',
include_dml => true,
include_ddl => true,
include_tagged_lcr => true,
source_database => 'DGDBPROD',
inclusion_rule => true,
queue_to_queue => true);
END;
/
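The propagation created above can be checked on DGDBPROD; a populated ERROR_MESSAGE usually points at a database-link or queue problem:

```sql
-- One row per propagation between the two queues
SELECT propagation_name, source_queue_name,
       destination_queue_name, destination_dblink, status
FROM dba_propagation;
```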
V) Next, instantiate the schema objects from source to target.
Export from the Source Database:
Specify the OBJECT_CONSISTENT=Y clause on the export
command. By doing this, an export is
performed that is consistent for each individual object at a particular system
change number (SCN).
exp USERID=SYSTEM/******@DGDBPROD OWNER=JOEL FILE=joel.dmp
LOG=exportTables.log OBJECT_CONSISTENT=Y STATISTICS=NONE
Import into the Destination Database:
Specify the STREAMS_INSTANTIATION=Y clause in the import
command. By doing this, the streams metadata
is updated with the appropriate information in the destination database
corresponding to the SCN that is recorded in the export file.
imp USERID=SYSTEM/******@DGDBSTRM FULL=Y CONSTRAINTS=Y
FILE=joel.dmp IGNORE=Y COMMIT=Y LOG=importTables.log STREAMS_INSTANTIATION=Y
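If the apply process later complains that no instantiation SCN is set, confirm the import recorded one. This can be checked on DGDBSTRM:

```sql
-- Objects with a recorded instantiation SCN for this source database
SELECT source_object_owner, source_object_name, instantiation_scn
FROM dba_apply_instantiated_objects
WHERE source_database = 'DGDBPROD';
```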
VI) Next, start the capture process on the source (DGDBPROD). Connect as strmadmin:
exec DBMS_CAPTURE_ADM.START_CAPTURE(capture_name => 'CAPTURE_JOEL_DGDBPROD');
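With capture, propagation, and apply all running, a simple smoke test is to change a row on the source and watch it arrive on the target. This sketch assumes joel.emp has the standard EMP column layout; allow a few seconds for propagation:

```sql
-- On DGDBPROD, as joel:
INSERT INTO joel.emp (empno, ename, deptno) VALUES (9999, 'STREAMS', 10);
COMMIT;

-- On DGDBSTRM, after a short delay the row should appear:
SELECT empno, ename FROM joel.emp WHERE empno = 9999;
```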