Processing large datasets efficiently
When you need to process thousands or millions of rows in PL/SQL, doing it one row at a time in a cursor loop is painfully slow: every fetch and every DML statement inside the loop forces a context switch between the PL/SQL engine and the SQL engine. BULK COLLECT and FORALL minimize these context switches by moving data in batches.
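For contrast, here is the row-by-row pattern the recipes below replace. This is a minimal sketch against the classic SCOTT emp demo table; the 5% raise is made up for illustration. Each fetch and each UPDATE crosses the engine boundary once:

DECLARE
BEGIN
  -- One context switch per fetch, plus one per UPDATE:
  -- slow when the row count is large.
  FOR r IN (SELECT empno, sal FROM emp) LOOP
    UPDATE emp
       SET sal = r.sal * 1.05   -- hypothetical raise, for illustration
     WHERE empno = r.empno;
  END LOOP;
END;
/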
Recipe #1 - BULK COLLECT to fetch multiple rows at once
Instead of fetching one row at a time, BULK COLLECT loads an entire result set (or a batch) into a collection:
DECLARE
  TYPE t_emp_tab IS TABLE OF emp%ROWTYPE;
  l_employees t_emp_tab;
BEGIN
  SELECT * BULK COLLECT INTO l_employees
    FROM emp
   WHERE deptno = 10;

  FOR i IN 1 .. l_employees.COUNT LOOP
    DBMS_OUTPUT.put_line(l_employees(i).ename
      || ' earns ' || l_employees(i).sal);
  END LOOP;
END;
/
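BULK COLLECT is not limited to %ROWTYPE targets: each item in the select list can be fetched into its own collection. A sketch, fetching two columns into parallel collections:

DECLARE
  TYPE t_name_tab IS TABLE OF emp.ename%TYPE;
  TYPE t_sal_tab  IS TABLE OF emp.sal%TYPE;
  l_names t_name_tab;
  l_sals  t_sal_tab;
BEGIN
  -- Each select-list item gets its own target collection,
  -- in order, and all collections end up with the same COUNT.
  SELECT ename, sal
    BULK COLLECT INTO l_names, l_sals
    FROM emp
   WHERE deptno = 10;

  FOR i IN 1 .. l_names.COUNT LOOP
    DBMS_OUTPUT.put_line(l_names(i) || ': ' || l_sals(i));
  END LOOP;
END;
/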
Recipe #2 - BULK COLLECT with LIMIT to control memory
For very large tables, loading the entire result set into memory at once can exhaust session (PGA) memory. Use LIMIT to process the rows in fixed-size batches:
DECLARE
  TYPE t_emp_tab IS TABLE OF emp%ROWTYPE;
  l_employees t_emp_tab;
  CURSOR c_emp IS SELECT * FROM emp;
BEGIN
  OPEN c_emp;
  LOOP
    FETCH c_emp BULK COLLECT INTO l_employees LIMIT 1000;
    EXIT WHEN l_employees.COUNT = 0;

    -- Process each batch
    FOR i IN 1 .. l_employees.COUNT LOOP
      DBMS_OUTPUT.put_line(l_employees(i).ename);
    END LOOP;
  END LOOP;
  CLOSE c_emp;
END;
/
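One classic pitfall with LIMIT: testing the cursor's %NOTFOUND attribute immediately after the fetch silently drops the final partial batch, because %NOTFOUND becomes TRUE as soon as a fetch returns fewer rows than LIMIT. A sketch of the safe placement, if you prefer %NOTFOUND over the empty-collection test used above:

DECLARE
  TYPE t_emp_tab IS TABLE OF emp%ROWTYPE;
  l_employees t_emp_tab;
  CURSOR c_emp IS SELECT * FROM emp;
BEGIN
  OPEN c_emp;
  LOOP
    FETCH c_emp BULK COLLECT INTO l_employees LIMIT 1000;

    -- Do NOT write "EXIT WHEN c_emp%NOTFOUND;" here: that would
    -- skip the last batch whenever it holds fewer than 1000 rows.
    FOR i IN 1 .. l_employees.COUNT LOOP
      DBMS_OUTPUT.put_line(l_employees(i).ename);
    END LOOP;

    -- Safe here: the partial batch above has already been processed.
    EXIT WHEN c_emp%NOTFOUND;
  END LOOP;
  CLOSE c_emp;
END;
/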
Recipe #3 - FORALL for bulk DML operations
FORALL binds the whole collection and sends the DML statement to the SQL engine in a single batch, which is dramatically faster than issuing it once per iteration in a regular FOR loop:
DECLARE
  TYPE t_empno_tab IS TABLE OF emp.empno%TYPE;
  l_empnos t_empno_tab;
BEGIN
  SELECT empno BULK COLLECT INTO l_empnos
    FROM emp
   WHERE deptno = 20;

  FORALL i IN 1 .. l_empnos.COUNT
    UPDATE emp
       SET sal = sal * 1.10
     WHERE empno = l_empnos(i);

  DBMS_OUTPUT.put_line(SQL%ROWCOUNT || ' employees updated.');
END;
/
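Besides the total in SQL%ROWCOUNT, FORALL also populates SQL%BULK_ROWCOUNT, a collection attribute that records how many rows the i-th execution affected. That makes it easy to spot elements whose DML matched nothing. A sketch; 7369 and 7499 are the classic SCOTT demo empnos, and 99999 is a deliberately nonexistent one, chosen for illustration:

DECLARE
  TYPE t_empno_tab IS TABLE OF emp.empno%TYPE;
  l_empnos t_empno_tab := t_empno_tab(7369, 7499, 99999);
BEGIN
  FORALL i IN 1 .. l_empnos.COUNT
    UPDATE emp
       SET sal = sal * 1.10
     WHERE empno = l_empnos(i);

  -- SQL%BULK_ROWCOUNT(i) = rows affected by the i-th execution.
  FOR i IN 1 .. l_empnos.COUNT LOOP
    IF SQL%BULK_ROWCOUNT(i) = 0 THEN
      DBMS_OUTPUT.put_line('No employee found for empno ' || l_empnos(i));
    END IF;
  END LOOP;
END;
/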
Recipe #4 - FORALL with SAVE EXCEPTIONS
By default, FORALL stops at the first error. Use SAVE EXCEPTIONS to continue processing and handle errors afterwards:
DECLARE
  TYPE t_num_tab IS TABLE OF NUMBER;
  l_values t_num_tab := t_num_tab(10, 0, 20, 0, 30);
  e_dml_errors EXCEPTION;
  PRAGMA EXCEPTION_INIT(e_dml_errors, -24381);
BEGIN
  FORALL i IN 1 .. l_values.COUNT SAVE EXCEPTIONS
    INSERT INTO test_table VALUES (100 / l_values(i));
EXCEPTION
  WHEN e_dml_errors THEN
    FOR i IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
      DBMS_OUTPUT.put_line('Error at index '
        || SQL%BULK_EXCEPTIONS(i).ERROR_INDEX
        || ': ' || SQLERRM(-SQL%BULK_EXCEPTIONS(i).ERROR_CODE));
    END LOOP;
END;
/
Bulk operations commonly run 10x to 50x faster than equivalent row-by-row code. Reach for BULK COLLECT and FORALL whenever you work with large data sets.
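The two techniques combine naturally: fetch in batches with BULK COLLECT ... LIMIT, then push each batch back to the SQL engine with FORALL. A sketch of the full pattern, assuming a hypothetical emp_archive table with the same column layout as emp:

DECLARE
  TYPE t_emp_tab IS TABLE OF emp%ROWTYPE;
  l_employees t_emp_tab;
  CURSOR c_emp IS SELECT * FROM emp;
BEGIN
  OPEN c_emp;
  LOOP
    FETCH c_emp BULK COLLECT INTO l_employees LIMIT 1000;
    EXIT WHEN l_employees.COUNT = 0;

    -- One batched round trip to the SQL engine per 1000-row batch.
    FORALL i IN 1 .. l_employees.COUNT
      INSERT INTO emp_archive VALUES l_employees(i);

    COMMIT;  -- optional: committing per batch keeps undo usage small
  END LOOP;
  CLOSE c_emp;
END;
/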