612 votes

Is there a way to restrict certain tables from the mysqldump command?

For example, I'd use the following syntax to dump only table1 and table2:

mysqldump -u username -p database table1 table2 > database.sql

But is there a similar way to dump all the tables except table1 and table2? I haven't found anything in the mysqldump documentation, so is brute-force (specifying all the table names) the only way to go?


11 Answers

1029 votes

You can use the --ignore-table option. So you could do

mysqldump -u USERNAME -pPASSWORD DATABASE --ignore-table=DATABASE.table1 > database.sql

There is no whitespace after -p (this is not a typo).

To ignore multiple tables, use this option multiple times; this has been documented to work since at least version 5.0.
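
For example, to skip both table1 and table2 in a single dump:

mysqldump -u USERNAME -pPASSWORD DATABASE --ignore-table=DATABASE.table1 --ignore-table=DATABASE.table2 > database.sql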

If you want an alternative way to ignore multiple tables you can use a script like this:

#!/bin/bash
PASSWORD=XXXXXX
HOST=XXXXXX
USER=XXXXXX
DATABASE=databasename
DB_FILE=dump.sql
EXCLUDED_TABLES=(
table1
table2
table3
table4
tableN   
)
 
IGNORED_TABLES_STRING=''
for TABLE in "${EXCLUDED_TABLES[@]}"
do
   IGNORED_TABLES_STRING+=" --ignore-table=${DATABASE}.${TABLE}"
done

echo "Dump structure"
mysqldump --host=${HOST} --user=${USER} --password=${PASSWORD} --single-transaction --no-data --routines ${DATABASE} > ${DB_FILE}

echo "Dump content"
mysqldump --host=${HOST} --user=${USER} --password=${PASSWORD} ${DATABASE} --no-create-info --skip-triggers ${IGNORED_TABLES_STRING} >> ${DB_FILE}
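
To restore the resulting dump later, a plain mysql invocation is enough; a minimal sketch, assuming the target database already exists:

mysql --host=HOST --user=USER -p databasename < dump.sql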

132 votes

Building on @Brian-Fisher's answer and addressing the comments of some of the people on this post: I have a bunch of huge (and unnecessary) tables in my database, so I wanted to skip their contents when copying but keep the structure:

mysqldump -h <host> -u <username> -p <schema> --no-data > db-structure.sql
mysqldump -h <host> -u <username> -p <schema> --no-create-info --ignore-table=schema.table1 --ignore-table=schema.table2 > db-data.sql

The two resulting files are structurally sound, but the dumped data is now ~500 MB instead of 9 GB, which is much better for me. I can now import these two files into another database for testing purposes without having to worry about manipulating 9 GB of data or running out of disk space.
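
The corresponding import is just two plain mysql invocations; a sketch, assuming the target schema already exists:

mysql -h <host> -u <username> -p <schema> < db-structure.sql
mysql -h <host> -u <username> -p <schema> < db-data.sql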

65 votes

For multiple databases:

mysqldump -u user -p --ignore-table=db1.tbl1 --ignore-table=db2.tbl1 --databases db1 db2 ..

59 votes

Another example of ignoring multiple tables:

/usr/bin/mysqldump -uUSER -pPASS --ignore-table={db_test.test1,db_test.test3} db_test > db_test.sql

Use --ignore-table with a brace-expanded list of tables, using syntax like:

--ignore-table={db_test.table1,db_test.table3,db_test.table4}
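
Note that the braces are expanded by the shell, not by mysqldump, so this form requires bash or zsh. A quick way to check what actually gets passed:

# prints: --ignore-table=db_test.table1 --ignore-table=db_test.table3
echo --ignore-table={db_test.table1,db_test.table3}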

Extra:

Import database

 # if file is .sql
 mysql -uUSER  -pPASS db_test < backup_database.sql
 # if file is .sql.gz
 gzip -dc < backup_database.sql.gz | mysql -uUSER -pPASSWORD db_test

A simple script to ignore tables and export to .sql.gz to save space:

#!/bin/bash

#tables to ignore
_TIGNORE=(
my_database.table1
my_database.table2
my_database.tablex
)

#create text for ignore tables
_TDELIMITED="$(IFS=" "; echo "${_TIGNORE[*]/#/--ignore-table=}")"

#don't forget to include user and password
/usr/bin/mysqldump -uUSER -pPASSWORD --events ${_TDELIMITED} --databases my_database | gzip -v > backup_database.sql.gz


Note: tested on Ubuntu Server with MySQL Ver 14.14 Distrib 5.5.55.

8 votes

To exclude some tables' data, but not their structure, here is how I do it:

Dump the database structure of all tables, without any data:

mysqldump -u user -p --no-data database > database_structure.sql

Then dump the database with data, except the excluded tables, and do not dump the structure:

mysqldump -u user -p --no-create-info \
    --ignore-table=database.table1 \
    --ignore-table=database.table2 database > database_data.sql

Then, to load it into a new database:

mysql -u user -p newdatabase < database_structure.sql
mysql -u user -p newdatabase < database_data.sql

2 votes

You can use the mysqlpump command with the

--exclude-tables=name

option. It specifies a comma-separated list of tables to exclude.

The syntax of mysqlpump is very similar to mysqldump's, but it is far more performant. More information on how to use the exclude option is available here: https://dev.mysql.com/doc/refman/5.7/en/mysqlpump.html#mysqlpump-filtering
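
A minimal sketch, assuming a database named db_test and placeholder credentials (output is redirected from standard output here):

mysqlpump -u USER -p --exclude-tables=table1,table2 db_test > db_test.sql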

1 vote

In general, you need this feature when you don't want to, or don't have time to, deal with a huge table. In that case, it's better to use mysqldump's --where option to limit the result set, for example:

mysqldump -uuser -ppass database --where="1 = 1 LIMIT 500000" > resultset.sql

This works because mysqldump appends the given condition to the SELECT it issues for each table, so every table's dump is capped at 500,000 rows.

0 votes

Dump all databases with all tables but skip certain tables

on github: https://github.com/rubo77/mysql-backup.sh/blob/master/mysql-backup.sh

#!/bin/bash
# mysql-backup.sh

if [ -z "$1" ] ; then
  echo
  echo "ERROR: root password parameter missing."
  exit 1
fi
DB_host=localhost
MYSQL_USER=root
MYSQL_PASS=$1
MYSQL_CONN="-u${MYSQL_USER} -p${MYSQL_PASS}"
#MYSQL_CONN=""

BACKUP_DIR=/backup/mysql/

mkdir -p "$BACKUP_DIR"

MYSQLPATH=/var/lib/mysql/

IGNORE="database1.table1, database1.table2, database2.table1,"

# strpos $1 $2 [$3]
# strpos haystack needle [optional offset of an input string]
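# note: the optional offset branch relies on a substr helper that is not defined in this script;
# strpos is only called without an offset below, so that branch is never used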
strpos()
{
    local str=${1}
    local offset=${3}
    if [ -n "${offset}" ]; then
        str=`substr "${str}" ${offset}`
    else
        offset=0
    fi
    str=${str/${2}*/}
    if [ "${#str}" -eq "${#1}" ]; then
        return 0
    fi
    echo $((${#str}+${offset}))
}

cd $MYSQLPATH
for i in */; do
    if [ $i != 'performance_schema/' ] ; then 
    DB=`basename "$i"` 
    #echo "backup $DB->$BACKUP_DIR$DB.sql.lzo"
    mysqlcheck "$DB" $MYSQL_CONN --silent --auto-repair >/tmp/tmp_grep_mysql-backup
    grep -E -B1 "note|warning|support|auto_increment|required|locks" /tmp/tmp_grep_mysql-backup>/tmp/tmp_grep_mysql-backup_not
    grep -v "$(cat /tmp/tmp_grep_mysql-backup_not)" /tmp/tmp_grep_mysql-backup

    tbl_count=0
    for t in $(mysql -NBA -h $DB_host $MYSQL_CONN -D $DB -e 'show tables') 
    do
      found=$(strpos "$IGNORE" "$DB"."$t,")
      if [ "$found" == "" ] ; then 
        echo "DUMPING TABLE: $DB.$t"
        mysqldump -h $DB_host $MYSQL_CONN $DB $t --events --skip-lock-tables | lzop -3 -f -o $BACKUP_DIR/$DB.$t.sql.lzo
        tbl_count=$(( tbl_count + 1 ))
      fi
    done
    echo "$tbl_count tables dumped from database '$DB' into dir=$BACKUP_DIR"
    fi
done
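
The script expects the MySQL root password as its only argument, so (assuming it is saved as mysql-backup.sh and made executable) a run looks like:

./mysql-backup.sh 'MY_ROOT_PASSWORD'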

With a little help from https://stackoverflow.com/a/17016410/1069083

It uses lzop, which is much faster; see: http://pokecraft.first-world.info/wiki/Quick_Benchmark:_Gzip_vs_Bzip2_vs_LZMA_vs_XZ_vs_LZ4_vs_LZO
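
To restore one of the per-table dumps, lzop can decompress to stdout and pipe into mysql; a sketch with placeholder names:

lzop -dc /backup/mysql/DATABASE.TABLE.sql.lzo | mysql -u root -p DATABASE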

0 votes

I like rubo77's solution; I hadn't seen it before I modified Paul's. This one will back up a single database, excluding any tables you don't want. It then gzips the dump and deletes any backup files over 8 days old. I will probably use two versions of this: one that does a full backup (minus the logs table) once a day, and another that backs up just the most important, most frequently changing tables every hour, via a couple of cron jobs (see the crontab sketch after the script).

#!/bin/sh
PASSWORD=XXXX
HOST=127.0.0.1
USER=root
DATABASE=MyFavoriteDB

now="$(date +'%d_%m_%Y_%H_%M')"
filename="${DATABASE}_db_backup_$now"
backupfolder="/opt/backups/mysql"
DB_FILE="$backupfolder/$filename"
logfile="$backupfolder/"backup_log_"$(date +'%Y_%m')".txt

EXCLUDED_TABLES=(
logs
)
IGNORED_TABLES_STRING=''
for TABLE in "${EXCLUDED_TABLES[@]}"
do
   IGNORED_TABLES_STRING+=" --ignore-table=${DATABASE}.${TABLE}"
done

echo "Dump structure started at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
mysqldump --host=${HOST} --user=${USER} --password=${PASSWORD} --single-transaction --no-data --routines ${DATABASE}  > ${DB_FILE} 
echo "Dump structure finished at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
echo "Dump content"
mysqldump --host=${HOST} --user=${USER} --password=${PASSWORD} ${DATABASE} --no-create-info --skip-triggers ${IGNORED_TABLES_STRING} >> ${DB_FILE}
gzip ${DB_FILE}

find "$backupfolder" -name ${DATABASE}_db_backup_* -mtime +8 -exec rm {} \;
echo "old files deleted" >> "$logfile"
echo "operation finished at $(date +'%d-%m-%Y %H:%M:%S')" >> "$logfile"
echo "*****************" >> "$logfile"
exit 0
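
A rough sketch of the two cron jobs mentioned above (the script names, paths and schedule are assumptions, not part of the answer):

# daily full backup at 02:00, hourly backup of the most important tables
0 2 * * * /opt/backups/mysql_full_backup.sh >/dev/null 2>&1
0 * * * * /opt/backups/mysql_hourly_backup.sh >/dev/null 2>&1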

0 votes

For the sake of completeness, here is a script (which could actually be a one-liner) to back up a database while excluding (ignoring) all of its views. The database name is assumed to be employees:

ignore=$(mysql --login-path=root1 INFORMATION_SCHEMA \
    --skip-column-names --batch \
    -e "select 
          group_concat(
            concat('--ignore-table=', table_schema, '.', table_name) SEPARATOR ' '
          ) 
        from tables 
        where table_type = 'VIEW' and table_schema = 'employees'")

mysqldump --login-path=root1 --column-statistics=0 --no-data employees $ignore > "./backups/som_file.sql"   

You can adapt the logic of the query; in general, using GROUP_CONCAT and CONCAT you can generate almost any desired string or shell command.
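
For example, a variation that ignores every table whose name starts with a (hypothetical) tmp prefix instead of the views:

ignore=$(mysql --login-path=root1 INFORMATION_SCHEMA \
    --skip-column-names --batch \
    -e "select
          group_concat(
            concat('--ignore-table=', table_schema, '.', table_name) SEPARATOR ' '
          )
        from tables
        where table_schema = 'employees' and table_name like 'tmp%'")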

0 votes

Skip certain tables with mysqlpump

Suppose there are some test tables in some databases and you want to exclude them from the backup. You can do that with the --exclude-tables option, which will exclude tables named test across all databases:

shell> mysqlpump --exclude-tables=test --resultfile=backup_excluding_test.sql
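
Since the option takes a comma-separated list, several tables can be excluded at once (test_archive is a hypothetical second name, and output is simply redirected here):

shell> mysqlpump --exclude-tables=test,test_archive > backup_excluding_tests.sql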