My scenario is a large number of individual updates vs. a bulk delete/bulk insert. I have historical sales data for multiple customers going back years. Until I receive verified data (on the 15th of the following month), I adjust the sales numbers every day to reflect the current state as reported by another source; this means overwriting at most 45 days of sales per customer each day. On any given day there may be no changes, or there may be a few.

I can either write the logic to find the differences and update/delete/insert only the affected records, or I can just blow away yesterday's numbers and insert today's. The latter approach is clearly simpler, but if it's going to kill the table's performance due to churn, then it's worth writing the extra logic to identify the handful (or none) of records that changed and only update/delete/insert those.
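To make the simpler option concrete, here is roughly what I mean by "blow away and re-insert", written as a minimal sketch in SQL Server-style T-SQL. The `dbo.Sales` and `dbo.SalesStaging` tables, their columns, and the 45-day window expression are all illustrative stand-ins, not my actual schema:

```sql
-- Replace the unverified rolling window wholesale, inside one transaction.
BEGIN TRANSACTION;

-- Delete yesterday's version of the window for the customers in today's feed.
DELETE s
FROM dbo.Sales AS s
WHERE s.SaleDate >= DATEADD(DAY, -45, CAST(GETDATE() AS date))
  AND s.CustomerId IN (SELECT CustomerId FROM dbo.SalesStaging);

-- Re-insert today's numbers as delivered by the other source.
INSERT INTO dbo.Sales (CustomerId, SaleDate, Amount)
SELECT CustomerId, SaleDate, Amount
FROM dbo.SalesStaging;

COMMIT TRANSACTION;
```

Wrapping both statements in one transaction means readers never see a half-replaced window, but it also means the whole window's worth of rows is logged as deletes plus inserts every day, which is exactly the churn I'm worried about.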
So I'm replacing the records, and there may be some relationship between the old records and the new ones, but in general I don't necessarily want to match the old data against the new data; that would be an extra step and would result in a mix of deletes, updates, and inserts. Also, relatively few fields would actually change (at most 7 out of 20, or 2 out of 15).
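For comparison, if I did match old to new, I assume the diff logic could be collapsed into a single MERGE rather than hand-written compare/update/delete/insert code. Again a sketch against the same hypothetical tables, in SQL Server-style syntax (the `WHEN NOT MATCHED BY SOURCE` branch is SQL Server-specific; other engines would need a separate DELETE for rows that vanish from the feed):

```sql
MERGE dbo.Sales AS t
USING dbo.SalesStaging AS s
    ON t.CustomerId = s.CustomerId
   AND t.SaleDate   = s.SaleDate
-- Touch only rows whose values actually changed (NULLs would need explicit handling).
WHEN MATCHED AND t.Amount <> s.Amount THEN
    UPDATE SET t.Amount = s.Amount
-- Insert rows that are new in today's feed.
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerId, SaleDate, Amount)
    VALUES (s.CustomerId, s.SaleDate, s.Amount)
-- Delete rows that were in yesterday's window but are gone from today's feed,
-- restricted so verified history outside the window is never touched.
WHEN NOT MATCHED BY SOURCE
     AND t.SaleDate >= DATEADD(DAY, -45, CAST(GETDATE() AS date)) THEN
    DELETE;
```

With few or no changes on most days, this would write far fewer rows than the wholesale replacement, at the cost of the extra matching step and comparison logic.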
With the delete-and-insert approach, records that are likely to be retrieved together will have been inserted at the same time and should therefore be physically close to each other. Does that locality make up for the performance loss caused by the churn, and is it better than the undo/redo cost of all those individual record updates?