Identifying tables for Hybrid Columnar Compression (HCC) in Oracle Exadata involves analyzing the workload and usage patterns of your database tables.
Here's a basic script that can help you identify candidate tables for HCC based on criteria such as table size, read-versus-write activity, and the frequency of full table scans.
Example:
SET LINESIZE 200
SET PAGESIZE 1000
-- Identify large tables with heavy read activity and frequent full table scans
SELECT
    t.owner,
    t.table_name,
    t.num_rows,
    t.blocks,
    t.avg_row_len,
    (SELECT NVL(SUM(s.value), 0)
       FROM v$segment_statistics s
      WHERE s.owner = t.owner
        AND s.object_name = t.table_name
        AND s.statistic_name = 'physical reads') AS physical_reads,
    (SELECT NVL(SUM(s.value), 0)
       FROM v$segment_statistics s
      WHERE s.owner = t.owner
        AND s.object_name = t.table_name
        AND s.statistic_name = 'physical writes') AS physical_writes,
    (SELECT NVL(SUM(s.value), 0)
       FROM v$segment_statistics s
      WHERE s.owner = t.owner
        AND s.object_name = t.table_name
        AND s.statistic_name = 'segment scans') AS full_table_scans
FROM
    dba_tables t
WHERE
    t.num_rows > 100000  -- Filter for tables with more than 100k rows (tune this threshold as needed)
    AND t.owner NOT IN ('SYS', 'SYSTEM')  -- Exclude system schemas
    AND t.table_name NOT IN (SELECT mview_name FROM dba_mviews)  -- Exclude materialized views
ORDER BY
    t.num_rows DESC;
Explanation:
- Filtering for Large Tables: The script filters for tables with more than 100,000 rows. Adjust this threshold to suit your environment.
- Excluding System Schemas and Materialized Views: SYS/SYSTEM objects and materialized views are excluded from the analysis.
- Checking Full Table Scans: The script reports how often each table is scanned in full, as HCC is particularly beneficial for tables that are read mostly through full table scans.
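It can also help to see which candidate tables are already compressed before making changes. A quick data-dictionary check (the schema filter mirrors the script above):

```sql
-- Show current compression settings for non-system tables
SELECT owner,
       table_name,
       compression,       -- ENABLED / DISABLED
       compress_for       -- e.g. QUERY HIGH, ARCHIVE LOW; NULL when not compressed
  FROM dba_tables
 WHERE owner NOT IN ('SYS', 'SYSTEM')
   AND compression = 'ENABLED'
 ORDER BY owner, table_name;
```

Tables that already show QUERY or ARCHIVE values in COMPRESS_FOR can be dropped from the candidate list.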
Once you have identified the tables, you can apply HCC using the ALTER TABLE command.
For example:
ALTER TABLE owner.table_name MOVE COMPRESS FOR QUERY HIGH;
This command rebuilds the table with the QUERY HIGH compression level. Other levels are available depending on your needs: COMPRESS FOR QUERY LOW, COMPRESS FOR ARCHIVE LOW, and COMPRESS FOR ARCHIVE HIGH. Note that ALTER TABLE ... MOVE rebuilds the segment, which marks dependent indexes UNUSABLE; rebuild them afterwards with ALTER INDEX ... REBUILD.
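Before rebuilding a large table, you can estimate the expected ratio with the DBMS_COMPRESSION package. The sketch below assumes a scratch tablespace named USERS and a candidate table APP.SALES; these are placeholders, and the procedure's signature and constant names vary by release (e.g. COMP_FOR_QUERY_HIGH in 11g vs. COMP_QUERY_HIGH in 12c and later), so check the DBMS_COMPRESSION reference for your version:

```sql
-- Estimate the HCC QUERY HIGH compression ratio before moving the table.
-- 'USERS', 'APP', and 'SALES' are placeholders for your own environment.
SET SERVEROUTPUT ON
DECLARE
  l_blkcnt_cmp    PLS_INTEGER;
  l_blkcnt_uncmp  PLS_INTEGER;
  l_row_cmp       PLS_INTEGER;
  l_row_uncmp     PLS_INTEGER;
  l_cmp_ratio     NUMBER;
  l_comptype_str  VARCHAR2(100);
BEGIN
  DBMS_COMPRESSION.GET_COMPRESSION_RATIO(
    scratchtbsname => 'USERS',
    ownname        => 'APP',
    objname        => 'SALES',
    subobjname     => NULL,
    comptype       => DBMS_COMPRESSION.COMP_QUERY_HIGH,
    blkcnt_cmp     => l_blkcnt_cmp,
    blkcnt_uncmp   => l_blkcnt_uncmp,
    row_cmp        => l_row_cmp,
    row_uncmp      => l_row_uncmp,
    cmp_ratio      => l_cmp_ratio,
    comptype_str   => l_comptype_str);
  DBMS_OUTPUT.PUT_LINE('Estimated compression ratio: ' || l_cmp_ratio);
END;
/
```

Running the estimate first lets you confirm that the space savings justify the cost of the MOVE and the index rebuilds.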