Unlocking the World of Business Intelligence with SQLBI

1. Objectives of Moving Data
🔹 Introduction
In the context of PL/SQL and Oracle databases,
data movement refers to the processes and techniques involved in transferring
data from one location to another. This movement may be required within the
same database, across different databases, or even between different platforms.
Data movement is a critical task in database administration and data
integration projects, enabling systems to handle growing data volumes
efficiently and ensuring that data is available, consistent, and optimized for
performance.
🔹 Key Objectives
Typical objectives include consolidating data from multiple sources for reporting, migrating data between databases or platforms, keeping data available and consistent across systems, and staging data where it can be processed most efficiently.
🔹 Example Scenario
A retail company with branches in multiple
cities collects sales data locally. This data must be moved to the headquarters
database every night to perform consolidated reporting, trend analysis, and
forecasting.
2. Moving Data: General Architecture
🔹 Overview
The architecture for moving data in Oracle
PL/SQL-based systems includes several layers and tools that ensure efficient
and reliable data transfer. The general architecture focuses on the source and
target systems, tools or utilities used for data movement, data formats, and
optional transformation processes that may occur during transit.
🔹 Components
The main components are the source system, the target system, the extraction and loading tools or utilities, the data formats used in transit, and any optional transformation steps applied along the way.
🔹 Process Flow
Data is extracted from the source, written to an intermediate file or staging area, optionally transformed, and then loaded into the target system.
🔹 Diagram
[Source] → [Extraction Tool] → [Intermediate File] → [Loading Tool] → [Target System]
3. Directory Object: Overview
🔹 Definition
A Directory Object is a database object that
points to a specific directory on the server’s file system. Oracle uses it to
grant secure access to directories for reading from or writing to files during
data movement.
🔹 Importance
Directory objects decouple database code from physical paths: privileges are granted on the directory object rather than on the operating-system directory, so file access can be controlled entirely from within the database.
4. Creating Directory Objects
🔹 Syntax
CREATE OR REPLACE DIRECTORY directory_name AS 'absolute_path';
🔹 Example
CREATE OR REPLACE DIRECTORY data_dir AS '/u01/oracle/datafiles';
GRANT READ, WRITE ON DIRECTORY data_dir TO scott;
🔹 Notes
Creating a directory object requires the CREATE ANY DIRECTORY privilege. Oracle does not verify that the path exists when the object is created; errors surface only when the directory is actually used. The path must be on the database server (or on storage visible to it), not on the client machine.
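Once a directory object exists, PL/SQL packages such as UTL_FILE can read and write files through it. A minimal sketch, assuming the data_dir object and grant from the example above (the file name test.txt is illustrative):

```sql
-- Write a line to a file in the directory that DATA_DIR points to.
-- Assumes the data_dir object and READ/WRITE grant shown above.
DECLARE
  f UTL_FILE.FILE_TYPE;
BEGIN
  f := UTL_FILE.FOPEN('DATA_DIR', 'test.txt', 'w');  -- open for writing
  UTL_FILE.PUT_LINE(f, 'Hello from PL/SQL');
  UTL_FILE.FCLOSE(f);
END;
/
```

Note that UTL_FILE takes the directory object name ('DATA_DIR'), not the physical path.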
5. Data Pump: Overview
🔹 What is Data Pump?
Oracle Data Pump is a powerful utility for
fast data and metadata export/import operations between Oracle databases. It
replaces the older EXP/IMP tools and provides greater performance, flexibility,
and manageability.
🔹 Features
Key features include parallel execution, restartable jobs, fine-grained object selection, network-mode transfers between databases, and interfaces ranging from command-line tools to a PL/SQL API.
6. Data Pump: Benefits
🔹 High Performance
Data Pump leverages direct path and parallel execution to transfer large datasets quickly.
🔹 Flexible Control
Users can filter specific tables, schemas, or rows to export/import only relevant data.
🔹 Restartable Jobs
Interrupted jobs can resume from the point of failure, saving time and effort.
🔹 Versatile Interfaces
Data Pump jobs can be executed via command-line tools, Oracle Enterprise Manager, or PL/SQL APIs.
🔹 Better Monitoring
Detailed logs and job status reports help DBAs track the process.
7. Data Pump Export and Import: Overview
🔹 Export (expdp)
Used to extract data from the source database and write it to dump files.
🔹 Import (impdp)
Used to load data from dump files into the target database.
🔹 Basic Syntax
expdp hr/hr schemas=HR directory=data_dir dumpfile=hr.dmp logfile=exp.log
impdp hr/hr schemas=HR directory=data_dir dumpfile=hr.dmp logfile=imp.log
8. Data Pump Utility: Interfaces and Modes
🔹 Interfaces
Data Pump can be driven from the expdp/impdp command-line clients (directly or through parameter files), from interactive-command mode, from Oracle Enterprise Manager, or programmatically through the DBMS_DATAPUMP PL/SQL package.
🔹 Modes
Jobs run in one of several modes: full database (FULL), schema (SCHEMAS), table (TABLES), tablespace (TABLESPACES), or transportable tablespace (TRANSPORT_TABLESPACES).
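The PL/SQL interface is the DBMS_DATAPUMP package. A minimal sketch of a schema-mode export, assuming the data_dir directory object from section 4 (the dump and log file names are illustrative):

```sql
-- Schema-mode export of HR through the DBMS_DATAPUMP API.
-- File names hr_api.dmp / hr_api.log are illustrative; DATA_DIR must exist.
DECLARE
  h     NUMBER;
  state VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');

  -- Attach the dump file and the log file to the job
  DBMS_DATAPUMP.ADD_FILE(h, 'hr_api.dmp', 'DATA_DIR');
  DBMS_DATAPUMP.ADD_FILE(h, 'hr_api.log', 'DATA_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Restrict the job to the HR schema
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''HR'')');

  -- Start the job and block until it completes
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, state);
  DBMS_OUTPUT.PUT_LINE('Job finished with state: ' || state);
END;
/
```

This is what tools like Enterprise Manager call under the covers; the API is useful when exports must be scheduled or triggered from within the database itself.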
9. Fine-Grained Object Selection
🔹 Concept
Oracle Data Pump allows selective export/import of database objects using the INCLUDE and EXCLUDE parameters.
🔹 Example
Export only the EMPLOYEES and DEPARTMENTS tables:
expdp hr/hr tables=employees,departments directory=data_dir dumpfile=hr_objs.dmp
🔹 Benefits
Smaller dump files, shorter export/import times, and the ability to refresh only the objects that are actually needed.
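The INCLUDE parameter itself contains quotes that many shells mangle, so it is commonly placed in a parameter file. A sketch, with illustrative file and object names:

```shell
# exp_hr.par -- parameter file avoids shell-quoting problems:
#   schemas=HR
#   directory=data_dir
#   dumpfile=hr_inc.dmp
#   include=TABLE:"IN ('EMPLOYEES','DEPARTMENTS')"
expdp hr/hr parfile=exp_hr.par
```

EXCLUDE follows the same object-type syntax, e.g. exclude=INDEX to skip all indexes.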
10. Advanced Feature: Sampling
🔹 Purpose
To export a representative subset of a table’s data, typically used for testing, training, or analysis.
🔹 Syntax
expdp hr/hr tables=employees sample=20 directory=data_dir dumpfile=emp_sample.dmp
Here SAMPLE=20 exports approximately 20 percent of the table’s rows.
🔹 Use Cases
Building lightweight test or development copies, populating training environments, and profiling data without moving the full table.
11. Export Options: Files
🔹 Common Parameters
DUMPFILE (one or more dump files), LOGFILE (the job log), FILESIZE (maximum size of each dump file), and DIRECTORY (the directory object where files are written).
🔹 Example
expdp hr/hr tables=employees directory=data_dir \
  dumpfile=emp1.dmp,emp2.dmp logfile=exp_emp.log filesize=50M
🔹 Notes
When FILESIZE is reached, the export continues in the next file in the DUMPFILE list; the %U substitution variable (for example dumpfile=emp%U.dmp) lets Data Pump generate file names automatically.
12. Data Pump File Locations
🔹 Directory-Based Storage
All dump files and logs must be stored in a server-side directory specified by a valid directory object.
🔹 Security
Access to the files is governed by READ and WRITE grants on the directory object, so users never need operating-system access to the database server.
🔹 Best Practices
Use dedicated directories for Data Pump files, grant access only to the accounts that need it, and monitor free space in the target directory before large exports.
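Existing directory objects and the server paths they point to can be checked from the data dictionary, for example:

```sql
-- List directory objects and their physical paths.
-- Requires DBA views; non-DBA accounts can query ALL_DIRECTORIES instead.
SELECT directory_name, directory_path
FROM   dba_directories
ORDER  BY directory_name;
```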