[mysql] Import CSV to mysql table

What is the best/fastest way to upload a CSV file into a MySQL table? I would like the first row of data to be used as the column names.

Found this:

How to import CSV file to MySQL table

But the only answer there was to use a GUI rather than the shell?

This question is related to mysql csv import load-data-infile database-table



Import CSV Files into mysql table

LOAD DATA LOCAL INFILE 'd:\\Site.csv' INTO TABLE `siteurl` FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\r\n';

Escape Sequence   Character Represented
\0                An ASCII NUL (0x00) character
\b                A backspace character
\n                A newline (linefeed) character
\r                A carriage return character
\t                A tab character
\Z                ASCII 26 (Control+Z)
\N                NULL

Visit: http://www.webslessons.com/2014/02/import-csv-files-using-php-and-mysql.html
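
The statement above loads every line, including the header. Since the question asks for the first row to be treated as column names rather than data, a hedged variant is to skip that row with IGNORE 1 LINES and map the columns explicitly (the column names below are hypothetical placeholders):

LOAD DATA LOCAL INFILE 'd:\\Site.csv'
INTO TABLE `siteurl`
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(`site_id`, `site_url`);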


I wrote some code to do this; I'll put in a few snippets:

$dir = getcwd(); // Get current working directory where this .php script lives
$fileList = scandir($dir); // scan the directory where this .php lives and make array of file names

Then get the CSV headers so you can tell mysql how to import (note: make sure your mysql columns exactly match the csv columns):

//extract headers from .csv for use in import command
$headers = str_replace("\"", "`", array_shift(file($path)));
$headers = str_replace("\n", "", $headers);

Then send your query to the mysql server:

mysqli_query($cons, '
        LOAD DATA LOCAL INFILE "'.$path.'"
            INTO TABLE '.$dbTable.'  
            FIELDS TERMINATED by \',\' ENCLOSED BY \'"\'
            LINES TERMINATED BY \'\n\'
            IGNORE 1 LINES
            ('.$headers.')
            ;
        ') or die(mysqli_error($cons));
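
For reference, assuming a hypothetical file users.csv whose header line is "id","name","email" and a table named users, the statement the snippet above ends up sending looks roughly like this:

LOAD DATA LOCAL INFILE "/path/to/users.csv"
    INTO TABLE users
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
    (`id`,`name`,`email`);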

I wrestled with this for some time. The problem lies not in how to load the data, but how to construct the table to hold it. You must generate a DDL statement to build the table before importing the data.

That's particularly difficult if the table has a large number of columns.

Here's a Python script that (almost) does the job:

#!/usr/bin/python
import sys
import csv

# get file name (and hence table name) from command line
# exit with usage if no suitable argument
if len(sys.argv) < 2:
    sys.exit('Usage: ' + sys.argv[0] + ': input CSV filename')
ifile = sys.argv[1]

# emit the standard invocation
print('create table ' + ifile + ' (')

with open(ifile + '.csv') as inputfile:
    reader = csv.DictReader(inputfile)
    for row in reader:
        # emit one column definition per CSV header field, then stop
        for item in row.keys():
            print('`' + item + '` TEXT,')
        break
    print(')\n')

The problem it leaves to solve is that the final field name and data type declaration is terminated with a comma, and the MySQL parser won't tolerate that.

Of course it also has the problem that it uses the TEXT data type for every field. You would rather use something like VARCHAR(64), but if the table has several hundred columns, that many VARCHAR columns will push the row past MySQL's row-size limit, which is why TEXT is used here.

This also seems to break at the maximum column count for MySQL. That's when it's time to move to Hive or HBase if you are able.
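
For reference, the DDL the script is aiming for looks like this once the trailing comma after the last column is removed by hand (the table and column names here are hypothetical):

CREATE TABLE mydata (
    `col_a` TEXT,
    `col_b` TEXT,
    `col_c` TEXT
);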


If you have the ability to install phpMyAdmin, there is an Import section where you can import CSV files into your database. There is even a checkbox to indicate that the first line of the file contains the table column names (if this is unchecked, the first line will become part of the data).


Here's a simple PHP command line script that will do what you need:

<?php

$host = 'localhost';
$user = 'root';
$pass = '';
$database = 'database';

// Note: this uses the legacy mysql_* extension (PHP 5 and earlier).
$db = mysql_connect($host, $user, $pass);
mysql_query("use $database", $db);

/********************************************************************************/
// Parameters: filename.csv table_name

$argv = $_SERVER['argv'];

if(!empty($argv[1])) { $file = $argv[1]; }
else {
    echo "Please provide a file name\n"; exit;
}
if(!empty($argv[2])) { $table = $argv[2]; }
else {
    // Default the table name to the file name without its extension
    $table = pathinfo($file);
    $table = $table['filename'];
}

/********************************************************************************/
// Get the first row to create the column headings

$fp = fopen($file, 'r');
$frow = fgetcsv($fp);

$columns = '';
foreach($frow as $column) {
    if($columns) $columns .= ', ';
    $columns .= "`$column` varchar(250)";
}

$create = "create table if not exists $table ($columns);";
mysql_query($create, $db);

/********************************************************************************/
// Import the data into the newly created table.

$file = $_SERVER['PWD'].'/'.$file;
$q = "load data infile '$file' into table $table fields terminated by ',' ignore 1 lines";
mysql_query($q, $db);

?>

It will create a table based on the first row and import the remaining rows into it. Here is the command line syntax:

php csv_import.php csv_file.csv table_name

Use the TablePlus application:

  • Right-click on the table name in the right panel
  • Choose Import... > From CSV
  • Choose the CSV file
  • Review the column matching and hit Import
  • All done!


Here's how I did it in Python using csv and the MySQL Connector:

import csv
import mysql.connector

credentials = dict(user='...', password='...', database='...', host='...')
connection = mysql.connector.connect(**credentials)
cursor = connection.cursor(prepared=True)

# newline='' lets the csv module handle line endings itself (Python 3)
stream = open('filename.csv', newline='')
csv_file = csv.DictReader(stream, skipinitialspace=True)

# create the table from the header row, one VARCHAR(255) column per field
query = 'CREATE TABLE t ('
query += ','.join('`{}` VARCHAR(255)'.format(column) for column in csv_file.fieldnames)
query += ')'
cursor.execute(query)

# insert the remaining rows with a prepared statement
for row in csv_file:
    query = 'INSERT INTO t SET '
    query += ','.join('`{}` = ?'.format(column) for column in row.keys())
    cursor.execute(query, list(row.values()))

connection.commit()  # mysql.connector does not autocommit by default
stream.close()
cursor.close()
connection.close()

Key points

  • Use prepared statements for the INSERT
  • Open the CSV in text mode with newline='' so the csv module can handle line endings itself
  • Some CSV files may need tweaking, such as the skipinitialspace option.
  • If 255 isn't wide enough you'll get errors on INSERT and have to start over.
  • Adjust column types, e.g. ALTER TABLE t MODIFY `Amount` DECIMAL(11,2);
  • Add a primary key, e.g. ALTER TABLE t ADD `id` INT PRIMARY KEY AUTO_INCREMENT;

As others have mentioned, load data local infile works just fine. I tried the PHP script that Hawkee posted, but it didn't work for me. Rather than debug it, here's what I did:

1) Copy/paste the header row of the CSV file into a txt file and edit it with emacs. Add a comma and CR between each field to get each one on its own line.
2) Save that file as FieldList.txt.
3) Edit the file to include definitions for each field (most were varchar, but quite a few were int(x)). Add create table tablename ( to the beginning of the file and ) to the end of the file. Save it as CreateTable.sql.
4) Start the mysql client with input from the CreateTable.sql file to create the table.
5) Start the mysql client, copy/paste in most of the 'LOAD DATA INFILE' command, substituting your table name and CSV file name. Paste in the FieldList.txt file. Be sure to include 'IGNORE 1 LINES' before pasting in the field list (see the sketch below).

Sounds like a lot of work, but easy with emacs.....
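
For reference, CreateTable.sql and the final load command end up looking something like this (the table name, field names, and types here are hypothetical):

CREATE TABLE mytable (
    `name`  VARCHAR(64),
    `qty`   INT(11),
    `price` VARCHAR(64)
);

LOAD DATA LOCAL INFILE 'mydata.csv'
INTO TABLE mytable
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(`name`, `qty`, `price`);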


I have Googled many ways to import CSV into MySQL, including LOAD DATA INFILE, MySQL Workbench, etc.

When I used the MySQL Workbench import button, you first need to create the empty table on your own and set each column type on your own. Note: you have to add an ID column at the end as primary key, not null and auto_increment, otherwise the import button will not be visible later. However, when I started loading the CSV file, nothing loaded; it seems like a bug. I gave up.

Luckily, the easiest way I have found so far is to use Oracle's MySQL for Excel. You can download it from here: mysql for excel

This is what you are going to do: open the CSV file in Excel; on the Data tab, find the MySQL for Excel button.

Select all the data and click Export to MySQL. Remember to set an ID column as the primary key.

When finished, go to MySQL Workbench to alter the table; for example, a currency column should be DECIMAL(19,4) for large amounts or DECIMAL(10,2) for regular use, and other field types may be set to VARCHAR(255).
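
For example, a minimal sketch of that last step (the table and column names are hypothetical):

ALTER TABLE mytable
    MODIFY `amount` DECIMAL(19,4),
    MODIFY `notes`  VARCHAR(255);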


If you start the mysql client as mysql -u <user> -p --local-infile, it will work fine.
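
The server side has to allow it too; from the mysql prompt you can check the setting and, with a privileged account, enable it (these are standard MySQL statements):

SHOW GLOBAL VARIABLES LIKE 'local_infile';
SET GLOBAL local_infile = 1;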


To load data from a text file or CSV file, the command is:

load data local infile 'file-name.csv'
into table table-name
fields terminated by '' enclosed by '' lines terminated by '\n' (column-name);

In the above command, in my case there is only one column to be loaded, so there is nothing for "terminated by" and "enclosed by" and I kept them empty; otherwise you can enter the separating character, e.g. , (comma) or " or ; or anything else.

For people who are using MySQL version 5 and above:

Before loading the file into MySQL, make sure that the two lines below are added inside /etc/mysql/my.cnf.

To edit my.cnf, the command is:

sudo vi /etc/mysql/my.cnf

[mysqld]  
local-infile

[mysql]  
local-infile  

First create a table in the database with the same number of columns as in the CSV file.
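
For example, a minimal sketch of that step for the cardinfo table used below (the column names and types here are hypothetical):

CREATE TABLE cardinfo (
    `card_number` VARCHAR(32),
    `holder_name` VARCHAR(255),
    `expiry_date` VARCHAR(16)
);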

Then use the following query:

LOAD DATA INFILE 'D:/Projects/testImport.csv' INTO TABLE cardinfo
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
