Gotchas while bulk loading CouchDB

I’ve got ~15k rows in MSSQL 2005 that I want to migrate into CouchDB, with one row becoming one document. I have a CLR-UDF that writes n rows to a schema-bound XML file, and an XSL transform that converts the schema-bound XML to JSON.

With these existing tools I’m thinking I can go MSSQL to XML to JSON. If I batch n rows per JSON file, I can script cURL to loop through the files and POST each one to CouchDB via the _bulk_docs bulk API.

Will this work? Has anybody done a migration like this before? Can you recommend a better way?
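That pipeline should work; the main _bulk_docs gotcha is that each POST body must wrap the documents in a {"docs": [...]} envelope and be sent with Content-Type: application/json. A minimal sketch of the batching step (the batch size, document IDs, and field names here are made up for illustration):

```python
import json

def make_bulk_payloads(docs, batch_size=1000):
    """Split a list of documents into _bulk_docs POST bodies.

    CouchDB's _bulk_docs endpoint expects {"docs": [...]} as the
    request body, so each batch is wrapped accordingly.
    """
    for i in range(0, len(docs), batch_size):
        yield json.dumps({"docs": docs[i:i + batch_size]})

# Example: ~15k row-documents batched 1000 at a time -> 15 POST bodies.
rows = [{"_id": f"row-{n}", "value": n} for n in range(15000)]
payloads = list(make_bulk_payloads(rows))

# Each payload can then be written to a file and POSTed, e.g.:
#   curl -X POST http://localhost:5984/mydb/_bulk_docs \
#        -H 'Content-Type: application/json' --data-binary @batch_01.json
```

If the XSL transform can emit the {"docs": [...]} envelope directly, the Python step collapses to just the cURL loop.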

Error “stopped because I can’t continue” in SQL*Loader – DIRECT mode

When trying to load a large text file into an Oracle database using SQL*Loader, we get the following errors:

SQL*Loader-926: OCI-Error; uldlfca:OCIDirPathColArrayLoadStream for table <myTabele>
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
SQL*Loader-925: Error in uldlgs: OCIStmtExecute (ptc_hp)

This only happens in DIRECT mode; with the conventional path method everything is fine (but a lot slower). So I assume it can’t be a problem with the data or the general parts of the control file.

While the error message is quite amusing, what can I do to get everything to work?

Versions: SQL*Loader 9.2.0.1; the database is 10.2.0.3.0 (64-bit)

EDIT
After some more experimenting, it seems the problems are caused by using SQL functions to convert some of the input. When I remove the functions (with the corresponding changes to the table definition), everything works fine. Is it possible that I can’t use functions in a direct-path load? The documentation says that starting with version 9.x it should work …
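For reference, the kind of control file that triggers this is one with a SQL string applied to a column. This fragment is a hypothetical reconstruction (the table, file, and column names are made up), not the actual control file from the failing load:

```
-- Hypothetical SQL*Loader control file: a SQL expression ("UPPER(:name)")
-- is applied to one column. In 9i and later the docs say such expressions
-- are supported in direct path (DIRECT=TRUE) loads, which is what makes
-- the failure above surprising.
LOAD DATA
INFILE 'input.dat'
INTO TABLE my_table
FIELDS TERMINATED BY ','
(
  id,
  name "UPPER(:name)"
)
```

One thing worth checking given the version mismatch above: a 9.2.0.1 client loading into a 10.2 database is exactly the kind of combination where direct-path SQL expressions can hit client-side OCI bugs that the conventional path avoids.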