Sometimes you might want to update some rows of a table in a database. If the number of rows is small, it's okay to fire an update statement directly:
UPDATE table_name SET column_name = new_value WHERE condition;
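For example, a one-off fix to a single row is harmless to run directly (the table, column, and values here are just illustrative):

-- Small, targeted update: touches one row, commits immediately.
UPDATE employees SET salary = 55000 WHERE employee_id = 1234;
COMMIT;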
The problem comes when the 'condition' part matches a huge number of rows (say, more than a million). On a production database, firing the above query in one go can consume a lot of the machine's memory and undo space, and may hamper the database's availability. How do we solve this? We do the update in batches. A batch is a logical group of rows that gets updated and committed in a single transaction. Production tables usually have a column similar to CREATED_ON or CREATION_DATE, and grouping on such an attribute can be a good option, though it depends entirely on the nature of the data in the table. The block below performs the update in batches based on CREATED_ON, taking 10 days' worth of rows at a time.
Suppose you want to update the 'salary' column for all employees. Say you gave every employee a hike of Rs. 1000 :)
DECLARE
  start_date        DATE;           -- newest CREATED_ON; we walk backwards from here
  end_date          DATE;           -- oldest CREATED_ON; we stop once we pass it
  batch_size        NUMBER := 10;   -- days per batch; change this according to the data and the memory of the db machine
  updated_row_count NUMBER;
BEGIN
  SELECT MAX(created_on), MIN(created_on) INTO start_date, end_date FROM employees;
  LOOP
    DBMS_OUTPUT.PUT_LINE('Updating rows that were created between '
                         || (start_date - batch_size) || ' and ' || start_date);
    UPDATE employees
       SET salary = salary + 1000
     WHERE created_on <= start_date             -- upper bound is inclusive so the newest row is updated too
       AND created_on >  start_date - batch_size;
    updated_row_count := SQL%ROWCOUNT;          -- number of rows updated in this batch
    COMMIT;                                     -- one transaction per batch
    DBMS_OUTPUT.PUT_LINE('Updated ' || updated_row_count || ' rows.');
    EXIT WHEN start_date - batch_size < end_date;  -- this batch already covered the oldest rows
    start_date := start_date - batch_size;
  END LOOP;
END;
/
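One practical note: the DBMS_OUTPUT messages are only visible if your client's output buffer is enabled. In SQL*Plus or SQL Developer, run this once before executing the block:

SET SERVEROUTPUT ON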
That's it, guys. A small word of caution: before making updates that affect a big number of rows, take a backup of the table. I hope this helps.
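A quick way to take that backup in Oracle is a CREATE TABLE ... AS SELECT; the backup table name below is just illustrative:

-- Snapshot of the table before the batch update; drop it once the results are verified.
CREATE TABLE employees_backup AS SELECT * FROM employees;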