
PHP script to import csv data into mysql

February 19th, 2007

This is a simple script that will allow you to import CSV data into your database. It comes in handy because you can simply edit the appropriate fields, upload it along with the CSV file, call it from the web, and it will do the rest.

It allows you to specify the delimiter used in the CSV file, whether it is a comma, a tab, etc. It also allows you to choose the line separator and to save the output to a file (known as an SQL data dump).

It also permits you to include an empty field at the beginning of each row, which is usually an auto increment integer primary key.

This script is useful mainly if you don't have phpMyAdmin, or you don't want the hassle of logging in and prefer a solution that takes a few clicks, or you are simply a command-prompt person.
Just make sure the table is already created before trying to dump the data.
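In outline, the core of such an import loop looks roughly like this (a sketch only — the connection details, table name, and file name below are placeholders, and this uses mysqli rather than whatever the downloadable script uses internally):

```php
<?php
// Placeholder connection details -- edit to match your setup.
$databasehost     = "localhost";
$databasename     = "test";
$databasetable    = "mytable";
$databaseusername = "user";
$databasepassword = "pass";
$csvfile          = "data.csv";
$fieldseparator   = ",";
$addauto          = 1; // prepend an empty field for an auto_increment key

$con = mysqli_connect($databasehost, $databaseusername,
                      $databasepassword, $databasename)
       or die(mysqli_connect_error());

$handle = fopen($csvfile, "r") or die("Cannot open $csvfile");
while (($fields = fgetcsv($handle, 0, $fieldseparator)) !== false) {
    // Escape each value before building the INSERT statement.
    $escaped = array_map(function ($v) use ($con) {
        return "'" . mysqli_real_escape_string($con, $v) . "'";
    }, $fields);
    $values = implode(",", $escaped);
    if ($addauto) {
        $values = "''," . $values; // empty slot for the auto-increment column
    }
    mysqli_query($con, "INSERT INTO $databasetable VALUES($values)")
        or die(mysqli_error($con));
}
fclose($handle);
mysqli_close($con);
?>
```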
Kindly post a comment if you have any bug reports.

Download file here


  1. peace
    April 13th, 2010 at 10:52 | #1

    How do I skip the first line/row of the csv file, which is a header, and insert the rest of the data into the db?
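    One way to do that, assuming the script loops over the parsed lines (the variable names here are illustrative, not the script's own):

    ```php
    <?php
    $handle  = fopen("data.csv", "r") or die("Cannot open file");
    $isFirst = true;
    while (($fields = fgetcsv($handle, 0, ",")) !== false) {
        if ($isFirst) {        // first row is the header
            $isFirst = false;  // remember we have seen it
            continue;          // and do not insert it
        }
        // ... build and run the INSERT for $fields as before ...
    }
    fclose($handle);
    ?>
    ```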

  2. Carl
    April 18th, 2010 at 11:00 | #2

    Like Gary, I want to be able to overwrite an existing record by updating it, not deleting all the data in the db. So somewhere around here:

    if($addauto)
        $query = "insert into $databasetable values('','$linemysql');";
    else
        $query = "insert into $databasetable values('$linemysql');";

    How would I check to see if the record exists and then Update rather than Insert?
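    One common approach is to let MySQL do the check with INSERT ... ON DUPLICATE KEY UPDATE (a sketch; it assumes the table has a PRIMARY KEY or UNIQUE index that identifies the record, and the column names here are illustrative):

    ```php
    <?php
    // Sketch: upsert instead of plain insert. Assumes `id` is a
    // PRIMARY KEY (or UNIQUE) column, so MySQL can detect the clash
    // and update the existing row instead of inserting a new one.
    $query = "INSERT INTO $databasetable (id, name, amount)
              VALUES ('$id', '$name', '$amount')
              ON DUPLICATE KEY UPDATE
                  name = VALUES(name),
                  amount = VALUES(amount)";
    mysql_query($query) or die(mysql_error());
    ?>
    ```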

  3. peace
    April 20th, 2010 at 11:22 | #3

    The script is reading an extra line from the csv file, e.g. if the csv file has 25 records, it shows there are 26 records. Can you please tell me where I am going wrong?

    I have added a check for the first line and continue the loop if it is the first line.

    thanks
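    A likely cause is a trailing newline at the end of the file: splitting the file contents on the line separator then yields one final empty entry, which gets counted as a record. Skipping blank lines avoids the off-by-one (a sketch, assuming the script splits the whole file on $lineseparator):

    ```php
    <?php
    $inserted = 0;
    foreach (explode($lineseparator, $csvcontent) as $line) {
        $line = trim($line, " \t\r\n");
        if ($line === "") {
            continue; // a trailing newline produces one empty entry -- skip it
        }
        // ... build and run the INSERT for $line as before ...
        $inserted++;
    }
    echo "Inserted $inserted records\n";
    ?>
    ```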

  4. Ap.Muthu
    April 28th, 2010 at 06:02 | #4

    The following did the trick for me:
    LOAD DATA LOW_PRIORITY LOCAL INFILE
    'C:\\Documents and Settings\\Administrator\\Desktop\\SigmaS1.csv'
    INTO TABLE `mydb`.`mytable`
    FIELDS ESCAPED BY '\\'
    TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\r\n'
    (`user`, `pwd`, `sno`, `amt`, `ActiveRec`);

  5. alejandro
    May 11th, 2010 at 04:03 | #5

    This was very useful to me… but I needed to make a modification to read in large files. I was confused at first, but then realized the script loads the whole file into memory (right?). So here is a customized version of the code, which reads in lines one at a time. It reads in files that look like:
    begin file:
    2,3,-1
    1,-2,4
    end file (with many more values). I wanted each line to be a table entry with an id of the line number, as well as the first id representing the size of the table (which will not change later).

    <?php

    /********************************/
    /* Code at https://legend.ws/blog/tips-tricks/csv-php-mysql-import/
    /* Edit the entries below to reflect the appropriate values
    /********************************/
    $databasehost = "localhost";
    $databasename = "tomoku";

    $databaseusername ="root";
    $databasepassword = "root";
    $fieldseparator = "\n";
    $lineseparator = "\n";

    // this code reads tatami files into the database

    $con = @mysql_connect($databasehost,$databaseusername,$databasepassword) or die(mysql_error());
    @mysql_select_db($databasename) or die(mysql_error());

    //this loops over a bunch of files
    for($r=2; $r<14; $r++){
    for($c=$r; $c
