I keep getting this error when running one of my scripts:
PHP Fatal error: Allowed memory size of 1073741824 bytes exhausted (tried to allocate 71 bytes) in ... lib/symfony-1.4.11/lib/plugins/sfDoctrinePlugin/lib/vendor/doctrine/Doctrine/Connection/Statement.php on line 246, ...
The following is a stripped-down version of the script that triggers the error:
public function executeImportFile(sfWebRequest $request)
{
    ini_set('memory_limit', '1024M');
    set_time_limit(0);

    // more code here...

    $files = scandir($workspace.'/'.$directory);
    foreach ($files as $file) {
        if ($file != "." && $file != "..") {
            $path = $workspace.'/'.$directory.'/'.$file;
            $this->importfile($path);
        }
    }
}
protected function importfile($path)
{
    $connection = sfContext::getInstance()
        ->getDatabaseManager()
        ->getDatabase('doctrine')
        ->getDoctrineConnection();

    $connection->beginTransaction();
    try {
        // more code here...

        // read each line of a csv file
        while ($data = $reader->read()) {
            // send the line to another private function to be processed
            // and then written to the database
            $this->writewave($data);
        }
        $connection->commit();
    } catch (Exception $e) {
        $connection->rollback();
        throw $e; // rethrow so failures aren't silently swallowed
    }
}
The script reads all the CSV files in a folder (each file contains tens of thousands of lines), processes each line, and writes it to the database inside a Doctrine transaction.
While I don't think I need to set the memory and time limits in both functions, the script dies once Doctrine uses up all of the allocated 1 GB of memory.
It normally stops after processing about 10 files; allocating more memory lets it process a few more files, but it still crashes eventually.
Is there anything I'm missing here that's causing the memory not to be freed after each file is processed?
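To see how much each file contributes to the growth, one option is to log memory usage after every import. This is only a diagnostic sketch, not part of the original script; the `logMemory` helper name is hypothetical, but `memory_get_usage()` and `error_log()` are core PHP:

```php
<?php
// Hypothetical helper: report current memory usage with a label,
// so growth can be tracked across files in the import loop.
function logMemory(string $label): int
{
    $bytes = memory_get_usage(true); // total memory allocated from the system
    error_log(sprintf('%s: %.1f MB', $label, $bytes / 1048576));
    return $bytes;
}

// Intended usage inside the loop from executeImportFile():
// foreach ($files as $file) {
//     $this->importfile($path);
//     logMemory($file); // if this climbs steadily, objects aren't being freed
// }
```

If the logged value rises by a similar amount after every file, that points to per-file objects (e.g. hydrated Doctrine records) surviving past each iteration rather than a single large allocation.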
Mohd Shakir Zakaria http://www.mohdshakir.net