Here are three methods for importing large SQL files

Method 1: BigDump

http://www.ozerov.de/bigdump.php

I used a script called BigDump.php, a staggered SQL dump importer. It executes only a small part of the huge dump and restarts itself. The next session starts where the last was stopped to keep you from running into your server’s limits. Instructions for using it are on the BigDump website. Basically you place your SQL file in a folder on your server, along with the bigdump.php file. You edit that file with your database information and then visit the page on your server and set the import to go. This is a fairly quick process and will save you a lot of time.
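
If you work from the command line, the upload step might look something like this (the username, hostname, and folder here are placeholders for your own server):

# example only: adjust user, host, and target folder for your server
scp mybackup.sql bigdump.php user@yourdomain.com:/var/www/vhosts/yourdomain.com/httpdocs/dump/

After that, browse to that folder on your domain (here, /dump/bigdump.php), confirm the database settings you edited into the file, and start the import.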

 

Method 2: SSH

If you have root access to your server, create a backup over SSH.

cd to the directory where you want to store your dump file:

cd /var/www/vhosts/yourdomain.com/httpdocs/sqldump

Then tell mysql to dump it out:

mysqldump --add-drop-table -u dbuser -p your_db > mybackup.sql

It will prompt you for your password. Everything should dump out without a problem.

*Note: The directory where you’re sending the dump file needs to be writable by the user you’re logged in as. Also, you don’t really want to leave it in that folder permanently; just drop it there, then pull it off your server so you can work with it locally if you need to.
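
As a rough sketch of that workflow (the usernames, hostnames, and paths are placeholders), you can also compress the dump as it is written and then pull it down to your own machine:

# on the server: dump with DROP TABLE statements and gzip in one pass
mysqldump --add-drop-table -u dbuser -p your_db | gzip > mybackup.sql.gz

# from your local machine: fetch the file, then remove it from the server
scp user@yourdomain.com:/var/www/vhosts/yourdomain.com/httpdocs/sqldump/mybackup.sql.gz .
ssh user@yourdomain.com "rm /var/www/vhosts/yourdomain.com/httpdocs/sqldump/mybackup.sql.gz"
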
How to Import a Large SQL File

Importing is usually a bit messier. If you’re duplicating this database for testing purposes or moving your site to a new server, a large dump file, even when gzipped, can pose some challenges in the import process. This is the part where things can really stall out, and you need a few tricks up your sleeve.

Ordinarily I would just use phpMyAdmin to import the file, but because this one is so large, the first thing you should do is try importing via SSH:

mysql -u db_user -p your_db < mybackup.sql
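
If your dump is gzipped, you don’t need to unpack it first; a minimal sketch (same placeholder names as above) is to stream it straight into mysql:

gunzip < mybackup.sql.gz | mysql -u db_user -p your_db
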
Common Error when using SSH:

"got a packet bigger than ‘max_allowed_packet’ bytes"

Now what? Depending on the size of your file, running these commands may fix it for you:

mysql> set global max_allowed_packet=1000000000;
mysql> set global net_buffer_length=1000000;

Then try importing your database again. If it still doesn’t work, you might try splitting the file up.
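
Before resorting to splitting, it may be worth running the whole sequence end to end. This is only a sketch, using the same placeholder names as above; note that SET GLOBAL needs an account with enough privileges (typically root), that the new values only apply to connections opened after you set them, and that they reset when MySQL restarts:

# connect as an administrative user and raise the limits
mysql -u root -p
mysql> set global max_allowed_packet=1000000000;
mysql> set global net_buffer_length=1000000;
mysql> exit

# reconnect and retry the import, raising the client-side packet limit as well
mysql --max_allowed_packet=1000000000 -u db_user -p your_db < mybackup.sql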

 

Method 3: Splitting up your SQL file

Download a program that will split up your SQL file, such as SQLDumpSplitter2. This is a free SQL dump file splitter that lets you choose how big each piece will be, then automatically cuts and saves the split SQL files. Yes, this is a pain, but it is sometimes the only way to get the job done.

http://www.rusiczki.net/2007/01/24/sql-dump-file-splitter/

Create the splits and then upload them to a directory on your server.

If you want to restore the dump, you have to execute yourbackup_DataStructure.sql first because it contains the table structure. After that, you can execute the other .sql files, which contain the data for the now-existing tables. Using SSH, cd to your directory again and run the structure file first:

mysql -u db_user -p db_name < yourbackup_DataStructure.sql

Then your splits:
mysql -u db_user -p db_name < yourbackup_1.sql
mysql -u db_user -p db_name < yourbackup_2.sql
etc…
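
If the splitter produced a lot of numbered pieces, a small shell loop saves retyping. This is only a sketch assuming the filenames shown above; -p will prompt for the password once per file, and the glob runs the pieces in alphabetical order, which is usually fine for plain INSERT data:

# load the table structure first, then each numbered piece
mysql -u db_user -p db_name < yourbackup_DataStructure.sql
for f in yourbackup_[0-9]*.sql; do
    mysql -u db_user -p db_name < "$f"
done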

 


2 comments on “Here are three methods for importing large SQL files”

  1. Philip

    Hi there,

    I just wanted to let you know that there is a rewrite of that program now available: https://philiplb.de/sqldumpsplitter3/

  2. Philip

    Hi there,
    I just wanted to let you know that I’ve rewritten the SQLDumpSplitter. 🙂
    Now cross platform and with a real parser:
    https://philiplb.de/sqldumpsplitter3/
