ir.translation, import: push the algo. to SQL, improve performance
During a language import (translation files), we need to push a big batch of
translation records into the database. Each record needs update-or-insert
logic (against existing translations), and sometimes resolution through
ir.model.data.
Doing this loop in Python was slow: per record it invoked two read()s (plus
one more for ir.model.data) and one insert or update, and it churned the ORM
cache (filled and cleared at each iteration).
Instead, follow the old-school database recipe for mass record insertion
(sketched in code after this list):
- create a temporary table w/o indexes or constraints
- quickly populate the temp with all records of the batch
(through a dedicated "cursor" object)
- process the table, doing the lookups with set-based SQL queries (yes,
  this is exactly the kind of data-processing loop SQL handles efficiently)
- insert all records from the temp table into ir.model.data
- let all constraints of ir.model.data fire (implicitly) at the end of that
  single insert query.
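
A minimal sketch of the recipe, not the exact code of this commit: it assumes
a psycopg2-style cursor cr, a hypothetical temp table tmp_ir_translation_import
with illustrative column names, and that the translation records themselves end
up in ir_translation while ir_model_data is only consulted to resolve res_id.

    import psycopg2

    # hypothetical connection and batch; in the server these come from the
    # registry cursor and the parsed .po file
    cr = psycopg2.connect("dbname=mydb").cursor()
    batch = [("res.partner,name", "fr_FR", "model", "res.partner",
              "partner_x", "Source", "Traduction")]

    # 1. temp table: no indexes, no constraints, gone at commit
    cr.execute("""
        CREATE TEMP TABLE tmp_ir_translation_import (
            name varchar, lang varchar, type varchar,
            imd_model varchar, imd_name varchar, res_id integer,
            src text, value text
        ) ON COMMIT DROP
    """)

    # 2. bulk-populate it with the whole batch
    cr.executemany(
        "INSERT INTO tmp_ir_translation_import"
        " (name, lang, type, imd_model, imd_name, src, value)"
        " VALUES (%s, %s, %s, %s, %s, %s, %s)",
        batch)

    # 3. resolve ir.model.data references with one set-based query
    cr.execute("""
        UPDATE tmp_ir_translation_import t
           SET res_id = d.res_id
          FROM ir_model_data d
         WHERE t.imd_name IS NOT NULL
           AND d.model = t.imd_model AND d.name = t.imd_name
    """)

    # 4. update translations that already exist ...
    cr.execute("""
        UPDATE ir_translation irt
           SET value = t.value
          FROM tmp_ir_translation_import t
         WHERE irt.name = t.name AND irt.lang = t.lang
           AND irt.type = t.type AND irt.res_id = t.res_id
    """)

    # 5. ... and insert the missing ones in one statement; the constraints
    #    on the target table fire implicitly at the end of this query
    cr.execute("""
        INSERT INTO ir_translation (name, lang, type, res_id, src, value)
        SELECT t.name, t.lang, t.type, t.res_id, t.src, t.value
          FROM tmp_ir_translation_import t
         WHERE NOT EXISTS (
               SELECT 1 FROM ir_translation irt
                WHERE irt.name = t.name AND irt.lang = t.lang
                  AND irt.type = t.type AND irt.res_id = t.res_id)
    """)

The sketch ignores records without a res_id (e.g. code translations) and
duplicates within the batch; a real import has to deal with both.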
This improves translation import performance by at least ~3x.
bzr revid: xrg@linux.gr-20110608162059-rfy1vvwp8w66ry0i