I have a CSV file containing 118350 lines, and I want to save each line in my database table. I read the entire file into an array, parse every line to make some modifications, and then save the content to the database with a PHP program. The problem is that the script saves only about 927 lines per run, so I have to run it again and again to save the rest of the data.
My PHP code is:
<?php
$connection = mysql_connect('localhost', 'user', 'password') or die(mysql_error($connection));
$select_db = mysql_select_db('dbname', $connection) or die(mysql_error($connection));

$file = file('ip-to-country.csv');
$data = str_replace("\"", "", $file, $j); // remove the double-quote characters from each line

for ($i = 0; $i < count($data); $i++) {
    $content = explode(',', $data[$i]);
    $ins = "insert into iplocation (startiprang, endiprang, concode, concode3, country)
            values ($content[0], $content[1], '$content[2]', '$content[3]', '$content[4]')";
    $result = mysql_query($ins, $connection);
}
echo "done";
?>
Is there any function which can store all the file data in an array without limitation? My file size is approx 6 MB. I hope you understand what I want. Thanks in advance.
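For reference, here is a minimal sketch of what I am thinking of trying instead: reading the file line by line with fgetcsv() so the whole file never has to sit in memory at once, and escaping the string values before the insert. The table and column names (iplocation, startiprang, endiprang, concode, concode3, country) are just taken from my code above; I have not verified this approach yet:

<?php
// Sketch: stream the CSV row by row instead of loading it all with file().
$connection = mysql_connect('localhost', 'user', 'password') or die(mysql_error());
mysql_select_db('dbname', $connection) or die(mysql_error($connection));

$handle = fopen('ip-to-country.csv', 'r');
if ($handle === false) {
    die('could not open ip-to-country.csv');
}

while (($row = fgetcsv($handle)) !== false) {
    // fgetcsv() already strips the surrounding double quotes from each field
    $start    = (int) $row[0];
    $end      = (int) $row[1];
    $concode  = mysql_real_escape_string($row[2], $connection);
    $concode3 = mysql_real_escape_string($row[3], $connection);
    $country  = mysql_real_escape_string($row[4], $connection);

    $ins = "insert into iplocation (startiprang, endiprang, concode, concode3, country)
            values ($start, $end, '$concode', '$concode3', '$country')";
    mysql_query($ins, $connection) or die(mysql_error($connection));
}

fclose($handle);
echo "done";
?>

Is this the right direction, or is there a better way?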