[CDBI] Make CDBI go fast

Perrin Harkins pharkins at gmail.com
Thu Feb 15 04:34:47 GMT 2007


On 2/14/07, Michael G Schwern <schwern at gmail.com> wrote:
> I've been handed a project to make Class::DBI go fast.

That does sound like a fun thing to work on.  I have to ask though, is
there a really solid reason to invest the time there?
Rose::DB::Object and DBIx::Class have most of the features you're
talking about adding already.  Maybe the time would be better spent on
transitional tools or a compatibility API.  I don't mean to be a
naysayer, since I am a current user of Class::DBI.  It's just that the
duplication of effort seems very clear.

> I've put in a patch which makes iterators fetch from a statement handle on demand.

Keep in mind that the behavior of the DBD drivers when fetching rows
from a handle varies.  With MySQL, the driver will fetch all the rows
into client memory as soon as you execute the statement, unless you
explicitly tell it not to (the mysql_use_result attribute).  With
PostgreSQL, I think you have to use cursors to prevent it from loading
all the rows when you execute.
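For reference, the difference looks roughly like this in raw DBI.
This is only a sketch; the connection details and table name are
placeholders, and the exact attribute and cursor behavior should be
checked against the DBD::mysql and DBD::Pg docs:

```perl
use DBI;

# Placeholder connection; $dsn, $user, $pass are not real values.
my $dbh = DBI->connect($dsn, $user, $pass, { RaiseError => 1 });

# MySQL: by default DBD::mysql stores the whole result set in client
# memory at execute time.  mysql_use_result streams rows from the
# server one fetch at a time instead.
my $sth = $dbh->prepare('SELECT * FROM big_table',
                        { mysql_use_result => 1 });
$sth->execute;
while (my $row = $sth->fetchrow_hashref) {
    # each fetch pulls one row over the wire
}

# PostgreSQL: DBD::Pg also buffers the full result set on execute, so
# incremental fetching means declaring a cursor (inside a transaction)
# and FETCHing in batches.
$dbh->begin_work;
$dbh->do('DECLARE big_cur CURSOR FOR SELECT * FROM big_table');
while (my $rows = $dbh->selectall_arrayref('FETCH 1000 FROM big_cur')) {
    last unless @$rows;
    # process up to 1000 rows at a time
}
$dbh->do('CLOSE big_cur');
$dbh->commit;
```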

> * A bulk insert method
> * A bulk delete method
>
> Calling ->insert over and over again is inefficient.  Having to load an object only to delete it is even worse.  Bulk insert and delete methods would be handy.
>
> For insert the syntax it could be as simple as...
>
>     Class->bulk_insert({ foo => 42 }, { foo => 23 }, { foo => 99 });

Bulk delete is covered by the other tools mentioned above, but they
don't support any kind of bulk insert that I'm aware of.  To make bulk
inserts really fast, you have to use database-specific extensions,
like MySQL's multi-row INSERT syntax.
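A bulk_insert like the one proposed could be built on that syntax with
plain DBI.  This is a hypothetical sketch (the table and column names
are made up), not how any of the tools above actually do it:

```perl
# Rows in the shape the proposed API takes.
my @rows = ({ foo => 42 }, { foo => 23 }, { foo => 99 });

# Build one multi-row statement: INSERT INTO my_table (foo)
# VALUES (?), (?), (?)
my $placeholders = join ', ', ('(?)') x @rows;
my $sql = "INSERT INTO my_table (foo) VALUES $placeholders";

# One prepare, one execute, one round trip to the server.
my $sth = $dbh->prepare($sql);
$sth->execute(map { $_->{foo} } @rows);
```

The win comes from doing one round trip and one statement parse instead
of one per row; on MySQL this is usually much faster than a loop over
single-row inserts.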

- Perrin


