I have about 50,000 records to import into a Magento store; the file is about 50 MB. What I have already tested:
I strongly recommend that you avoid writing raw SQL at all costs: you will almost certainly spend days writing code to map the attribute IDs, and you will probably still get it wrong. Raw SQL also bypasses the indexing and other system updates that Magento relies on.
If speed is your issue, I suggest that you consider uRapidFlow from Unirgy. The usual disclaimers apply: I have no affiliation with Unirgy, but my observation has been that the quality of their work is excellent.
HTH, JD
It's very hard to create products with raw SQL queries, because Magento uses the EAV pattern to store products: a single product is spread across catalog_product_entity plus several attribute value tables.
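For comparison, here is a minimal sketch of creating one product through the model layer instead of raw SQL; the attribute set ID 4, website ID 1 and the example values are assumptions for a stock install:

$product = Mage::getModel('catalog/product');
$product->setSku('EXAMPLE-SKU-001')          // any unique SKU
    ->setAttributeSetId(4)                   // 4 = "Default" attribute set on a stock install (assumption)
    ->setTypeId('simple')
    ->setName('Example product')
    ->setDescription('Long description')
    ->setShortDescription('Short description')
    ->setPrice(9.99)
    ->setWebsiteIds(array(1))                // default website (assumption)
    ->setStatus(Mage_Catalog_Model_Product_Status::STATUS_ENABLED)
    ->setVisibility(Mage_Catalog_Model_Product_Visibility::VISIBILITY_BOTH)
    ->setStockData(array('is_in_stock' => 1, 'qty' => 100));
$product->save();                            // one save() writes all the EAV tables and keeps indexes consistent

A single save() touches the entity table and the per-type value tables for you, which is exactly the mapping you would otherwise have to reproduce by hand in SQL.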
If you disable the indexer while your load runs, then re-enable it and reindex afterwards, it should improve your load time:
// Lock the indexer so each product save doesn't trigger its own index update
$indexer = Mage::getSingleton('index/indexer');
$indexer->lockIndexer();

// ... run your processing ...

$indexer->unlockIndexer();

// Reindex everything once the import is done
$processes = $indexer->getProcessesCollection();
foreach ($processes as $process) {
    // echo 'Processing: ' . $process->getIndexerCode() . "\n";
    $process->reindexEverything();
}
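An alternative sketch, assuming you prefer switching the index processes to manual mode rather than locking the indexer singleton:

// Put every index process into manual mode so saves don't trigger per-row reindexing
$processes = Mage::getSingleton('index/indexer')->getProcessesCollection();
foreach ($processes as $process) {
    $process->setMode(Mage_Index_Model_Process::MODE_MANUAL)->save();
}

// ... run your processing ...

// Switch back to real-time mode and rebuild each index once
foreach ($processes as $process) {
    $process->setMode(Mage_Index_Model_Process::MODE_REAL_TIME)->save();
    $process->reindexEverything();
}

Either way, the point is to pay the indexing cost once at the end instead of on every one of the 50,000 saves.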
Occasionally I've noticed bulk inserts that work by first creating a template model...
$blankProduct = Mage::getModel('catalog/product');
...then avoiding the construction of a new model for each record...
$newProduct = clone $blankProduct;
$newProduct->setIsMassupdate(true);
// ... set the remaining attributes for this record ...
$newProduct->save();
It's slightly more efficient, but probably not enough to bring an import of that size down to a reasonable time on its own.
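Putting the clone idea together with the indexer advice above, a rough sketch of the import loop might look like this; $rows and its keys are hypothetical placeholders for however you parse your 50 MB file, and the attribute set and website IDs are assumptions for a stock install:

$blankProduct = Mage::getModel('catalog/product')
    ->setAttributeSetId(4)            // assumption: "Default" attribute set
    ->setTypeId('simple')
    ->setWebsiteIds(array(1))
    ->setStatus(Mage_Catalog_Model_Product_Status::STATUS_ENABLED);

foreach ($rows as $row) {             // $rows: your parsed file (hypothetical)
    $newProduct = clone $blankProduct;
    $newProduct->setIsMassupdate(true)
        ->setSku($row['sku'])
        ->setName($row['name'])
        ->setPrice($row['price'])
        ->setStockData(array('is_in_stock' => 1, 'qty' => $row['qty']));
    $newProduct->save();
}

Even so, 50,000 individual save() calls will take a while; if that's still too slow, the built-in Dataflow/import profiles or a dedicated extension such as the uRapidFlow one mentioned above are the usual next step.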