How to restore influxdb from local backup || InfluxDB - ruby-on-rails

I am trying to import a client-provided InfluxDB backup into my local InfluxDB instance, but I get the following error.
ENV: Ubuntu
The InfluxDB service is already running.
(Screenshot of the error was attached here.)
Trying to restore the InfluxDB backup.

First of all, you are using the wrong command: it should be influxd, not influx. Follow the steps below to restore your database.
1. Find your data directory by running influxd config and copying the dir value from the [data] section.
2. List the database names by running SHOW DATABASES in the influx shell.
3. Restore with: influxd restore -database {database_name} -datadir {your_data_dir_path} /path/to/your/backup
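Put together, the steps above look like this (the database name mydb and the paths are placeholders for your own values):

```shell
# 1. Find the data directory: look for the "dir" key under [data].
influxd config | grep -A 3 '\[data\]'

# 2. Confirm the database name from the influx shell.
influx -execute 'SHOW DATABASES'

# 3. Restore (the service should be stopped while restoring).
sudo service influxdb stop
influxd restore -database mydb -datadir /var/lib/influxdb/data /path/to/your/backup
sudo service influxdb start
```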

Related

PG_Restore local backup to docker container using pipe viewer

I was using the following command to restore from a backup:
pv backup.sql.gz | gunzip | pg_restore -d $DBHOST
This was helpful because I could roughly see how far along the restore was.
I recently moved the DB into a Docker container and want to run the same restore, but I have been struggling to get the command to work. I assume I have to redirect the gunzip output into the container somehow, but I haven't had any luck. Any help would be appreciated.
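One way to keep pv's progress bar while restoring into the container is to forward stdin with docker exec -i. A sketch, assuming a container named pg-container, the postgres user, and a target database mydb (all placeholders for your setup):

```shell
# pv reads the file on the host (so the progress bar still works),
# gunzip decompresses, and `docker exec -i` forwards the stream
# to pg_restore's stdin inside the container.
pv backup.sql.gz | gunzip | docker exec -i pg-container pg_restore -U postgres -d mydb
```

Without -i, docker exec does not attach stdin, and pg_restore would receive an empty stream.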

Magento, database dump

I am trying to get a DB dump with the command
docker exec container-name sh -c 'exec mysqldump --all-databases -uroot -p""' > db-backups/some-dump-name.sql
and I am getting
Got error: 2002: "Can't connect to local MySQL server through socket '/opt/bitnami/mysql/tmp/mysql.sock' (2)" when trying to connect
Magento runs on this image. Any ideas what could be wrong? I can provide more details if needed.
Bitnami Engineer here,
You also need to set the hostname of the database when backing up the databases. The Magento container doesn't include a database server, it uses an external one.
You probably specified that using the MARIADB_HOST env variable. If you used the docker-compose.yml file we provide, that hostname is mariadb.
exec mysqldump --all-databases -uroot -h HOSTNAME -p""
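Combined with the original command, a sketch (assuming the compose service name mariadb as the host, per the docker-compose.yml mentioned above):

```shell
# Pass the external DB host with -h; the Magento container has no local MySQL socket.
docker exec container-name sh -c 'exec mysqldump --all-databases -uroot -h mariadb -p""' > db-backups/some-dump-name.sql
```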

Unable to restore complete database from pg_dump

I ran the following command to backup my PostgreSQL database:
pg_dump -U postgres -h localhost -W -F t crewdb > /home/chris1/Documents/crewcut/crewdb/crewdb_bak.tar
This file was later saved to a USB.
After installing PostgreSQL on a new Ubuntu 18.04 system I ran the following command to restore the database from the USB:
psql -U postgres -d crewdb < /media/chh1/1818-305D/crewdb_bak.tar
The structure of the database (tables, views, etc.) has been recovered, but the actual data in the tables has not.
Does anyone have an idea why this is and how to solve it?
I do not know if the command you ran to restore your data is correct; in any case, try pg_restore, which the official documentation describes as the tool to "restore a PostgreSQL database from an archive file created by pg_dump". That is the correct way to do it.
In my case I use pg_dumpall -U user > backup.sql and then cat backup.sql | psql -U user database.
I recommend you check the flags you're using with pg_dump.
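Concretely: the dump was made with -F t (tar format), which psql cannot properly read. It likely executed whatever plain SQL it found in the stream (hence the schema appeared) but could not load the table data. A sketch of the matching restore, using the paths from the question:

```shell
# pg_restore understands the tar archive format produced by pg_dump -F t.
pg_restore -U postgres -h localhost -d crewdb /media/chh1/1818-305D/crewdb_bak.tar
```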

How to do Backup and Restore in PostgreSQL database on Linux server

I have a requirement to back up and restore a PostgreSQL database from a Dev server to a QA server. The database size is 1 TB.
Is there any approach to restore directly from the Dev server to the QA server without creating an intermediate file?
Replicate Dev server data on QA server using the pg_basebackup command.
pg_basebackup -h Dev_Server_host_name -D /QA_Server_Data_Directory/ -P -v

Export data from InfluxDB

Is there a way (plugin or tool) to export the data from the database (or the database itself)? I'm looking for this feature as I need to migrate a DB from the present host to another one.
Export data:
sudo service influxdb start (Or leave this step if service is already running)
influxd backup -database grpcdb /opt/data
grpcdb is name of DB and back up will be saved under /opt/data directory in this case.
Import Data:
sudo service influxdb stop (Service should not be running)
influxd restore -metadir /var/lib/influxdb/meta /opt/data
influxd restore -database grpcdb -datadir /var/lib/influxdb/data /opt/data
sudo service influxdb start
You could dump each table and load it through the REST interface:
curl "http://hosta:8086/db/dbname/series?u=root&p=root&q=select%20*%20from%20series_name%3B" > series_name.json
curl -XPOST -d @series_name.json "http://hostb:8086/db/dbname/series?u=root&p=root"
Or, maybe you want to add new host to cluster? It's easy and you'll get master-master replica for free. Cluster Setup
If I use curl, I get timeouts, and if I use influxd backup, it's not in a format I can read.
I'm getting fine results like this:
influx -host influxdb.mydomain.com -database primary -format csv -execute "select time,value from \"continuous\" where channel='ch123'" > outtest.csv
As ezotrank says, you can dump each table. There's a missing "-d" in ezotrank's answer though. It should be:
curl "http://hosta:8086/db/dbname/series?u=root&p=root&q=select%20*%20from%20series_name%3B" > series_name.json
curl -XPOST -d @series_name.json "http://hostb:8086/db/dbname/series?u=root&p=root"
(Ezotrank, sorry, I would've just posted a comment directly on your answer, but I don't have enough reputation points to do that yet.)
From 1.5 onwards, the InfluxDB OSS backup utility provides a newer option which is much more convenient:
-portable: Generates backup files in the newer InfluxDB Enterprise-compatible format. Highly recommended for all InfluxDB OSS users
Export
To back up everything:
influxd backup -portable <path-to-backup>
To back up only the myperf database:
influxd backup -portable -database myperf <path-to-backup>
Import
To restore all databases found within the backup directory:
influxd restore -portable <path-to-backup>
To restore only the myperf database (myperf database must not exist):
influxd restore -portable -db myperf <path-to-backup>
Additional options include specifying a timestamp, shard, etc. See all the other supported options here.
If you want to export in a readable format, the influx_inspect export command is preferable.
To export the database with the name HomeData the command is:
sudo influx_inspect export -waldir /var/lib/influxdb/wal -datadir /var/lib/influxdb -out "influx_backup.db" -database HomeData
The parameters for -waldir and -datadir can be found in /etc/influxdb/influxdb.conf.
To import this file again, the command is:
influx -import -path=influx_backup.db
If you have access to the machine running InfluxDB, I would say use the influx_inspect command. The command is simple and very fast, and it dumps your DB in line protocol. You can then import the dump using the influx -import command.
