I made some archive files with the GNOME archive GUI on Ubuntu, but when I try to extract them with
tar zxvf archive_name
I get the following error:
Cannot open: Not a directory
What is the problem?
Try extracting the archive in an empty directory; existing files or directories in the extraction target cause problems when their names overlap with entries in the archive (for example, a regular file sitting where tar needs to create a directory triggers exactly this "Not a directory" error).
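A minimal sketch of that approach, with the directory name as a placeholder:
$ mkdir extracted                      # fresh directory, nothing to collide with
$ tar -zxvf archive_name -C extracted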
I encountered the same issue (reported for each file within the archive) and solved it by appending ".tar.gz" to the archive filename, as I'd managed to download a PECL package without a file extension:
mv pecl_http pecl_http.tar.gz
I was then able to issue the following command to extract the contents of the archive:
tar -xzf pecl_http.tar.gz
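If you are unsure what an extension-less download actually contains, the file utility can usually identify it; for a gzipped tarball the output will look something like this:
$ file pecl_http
pecl_http: gzip compressed data, from Unix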
You might already have a file with the same name as a directory that tar is trying to extract. Try extracting in a different location, or unpack everything into a fresh directory:
tar zxvf tar_name.tgz --one-top-level=new_directory_name
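Note that --one-top-level is GNU tar only (added in tar 1.28, as far as I know). Without an argument it derives the directory name from the archive's base name:
$ tar zxvf tar_name.tgz --one-top-level    # extracts into ./tar_name/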
Try using tar -zxvf archive_name instead. I believe some tar implementations require the z (decompress), x (extract), v (verbose), and f (filename ...) parts as dashed switches rather than as old-style plain text, in which case the error comes from tar trying to do something with a file called zxvf, which of course does not exist.
I've never done this with the actual tar command before. On Windows it's easy: select all three parts and "extract here"; on Linux a GUI can probably do something similar. But I'm using a system that can't do that, so I need the command for extracting multiple split tars as one. I tried tar -xvf file.tar file.tar file.tar, but it didn't work.
Answer: concatenate the parts back into a single archive, then extract that:
$ cat filenamewithoutpartnumberorextension* > file.tar
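A round-trip sketch with hypothetical names, assuming the parts were produced by split (the shell's glob expands in the same lexicographic order that split uses for its suffixes):
$ split -b 100m big.tar big.tar.part    # produces big.tar.partaa, big.tar.partab, ...
$ cat big.tar.part* > big.tar           # stitch the pieces back together
$ tar -xvf big.tar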
I have a tar.gz backup file, and I need to delete some files inside this tar.gz file without extracting it.
Is there any solution (command line or software) on Windows?
It is not possible to remove a file from a compressed tar.gz directly (GNU tar's --delete works only on uncompressed archives), but you can exclude a file while extracting with the following command:
tar -zxvf file.tar.gz --exclude "file_to_exclude"
or take a backup first and then recreate the archive, letting tar remove the source files as they are added:
tar -cvf files.tar --remove-files my_directory
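Putting those together, a sketch of a repack that effectively deletes a file (all names here are illustrative):
$ mkdir repack && cd repack
$ tar -zxf ../file.tar.gz --exclude "file_to_exclude"    # unpack everything except the unwanted file
$ tar -zcf ../file.new.tar.gz .                          # repack the rest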
I have two systems that I'm splitting processing between, and I'm trying to find the most efficient way to move the data between the two. I've figured out how to tar and gzip to an archive on the first server ("serverA") and then use rsync to copy it to the remote host ("serverB"). However, when I untar/unzip the data there, the extracted files keep the full path name from the original server. So if on serverA my data is in:
/serverA/directory/a/lot/of/subdirs/myData/*
and, using this command:
tar -zcvf /serverA/directory/a/lot/of/subdirs/myData-archive.tar.gz /serverA/directory/a/lot/of/subdirs/myData/
Everything in .../myData is successfully tarred and gzipped into myData-archive.tar.gz.
However, after copying the archive, when I try to untar/unzip on the second host (I manually log in there to finish the processing, the first step of which is to untar/unzip) using this command:
tar -zxvf /serverB/current/directory/myData-archive.tar.gz
It untars everything into my current directory (/serverB/current/directory/), but it looks like this:
/serverB/current/directory/serverA/directory/a/lot/of/subdirs/myData/Data*ext
How should I formulate both tar commands so that my data ends up in a directory called /serverB/current/directory/dataHERE/?
I know I'll need the -C flag to untar into a different directory (in my case, /serverB/current/directory/dataHERE), but I still can't figure out how to keep the entire path from being included when the archive gets untarred. I've seen similar posts, but none that discussed how to do this when moving between two different hosts.
UPDATE: per one of the answers in this question, I changed my commands to:
tar/zip on serverA:
tar -zcvf /serverA/directory/a/lot/of/subdirs/myData-archive.tar.gz serverA/directory/a/lot/of/subdirs/myData/ -C /serverA/directory/a/lot/of/subdirs/ myData
and, untar/unzip:
tar -zxvf /serverB/current/directory/myData-archive.tar.gz -C /serverB/current/directory/dataHERE
And now, not only does it untar/unzip the data to:
/serverB/current/directory/dataHERE/
like I wanted, but it also puts another copy of the data here:
/serverB/current/directory/serverA/directory/a/lot/of/subdirs/myData/
which I don't want. How do I fix my commands so that the data ends up only in the first location?
On serverA do
( cd /serverA/directory/a/lot/of/subdirs; tar -zcvf myData-archive.tar.gz myData; )
tar records member names exactly as they appear on the command line, so archiving from inside the parent directory stores just myData/... instead of the full path.
After some more messing around, I figured out how to achieve what I wanted:
To tar on serverA:
tar -zcvf /serverA/directory/a/lot/of/subdirs/myData-archive.tar.gz -C /serverA/directory/a/lot/of/subdirs/ myData
Then to untar on serverB:
tar -zxvf /serverB/current/directory/myData-archive.tar.gz -C /serverB/current/directory/dataHERE
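To sanity-check the archive before shipping it, list the stored paths; they should start with myData/ rather than serverA/...:
$ tar -tzf myData-archive.tar.gz | head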
On server A, I created a tar file (backup.tar.gz) of the entire website /www. The tar file includes the top-level directory www.
On server B, I want to put those files into /public_html without including the top-level directory www.
Of course, tar -xzf backup.tar.gz places everything into /public_html/www.
How do I do this?
Thanks!
You can use the --transform option to change the beginning of the archived file names to something else. As an example, in my case I had installed owncloud in a directory named sscloud instead of owncloud, which caused problems when upgrading from the *.tar file. So I used the transform option like so:
tar xvf owncloud-10.3.2.tar.bz2 --transform='s/owncloud/sscloud/' --overwrite
The --transform option takes sed-like replacement expressions. The one above replaces the first occurrence of owncloud with sscloud in each archived name.
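You can preview the result without extracting anything; --show-transformed-names is another GNU tar option:
$ tar -tvf owncloud-10.3.2.tar.bz2 --transform='s/owncloud/sscloud/' --show-transformed-names | head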
The answer is:
tar --strip-components 1 -xvf backup.tar.gz
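Combined with -C, a sketch for this exact case (assuming www is the archive's only top-level entry):
$ tar --strip-components=1 -xzf backup.tar.gz -C /public_html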
After running make distcheck I get the message that I have successfully built the package and it is ready for distribution. If I untar the tar.gz with tar -zxvf hello-0.2.tar.gz, it successfully extracts all of its contents. However, when I try to extract it on different machines I get:
tar: This does not look like a tar archive
tar: Skipping to next header
tar: Exiting with failure status due to previous errors
The weird thing is that it was working before.
On the machine where I'm trying to build the package, I've updated automake 1.10.1, autoconf 2.61, and tar 1.20 to automake 1.11.1, autoconf 2.65, and tar 1.23, and I still have the same issue.
Any ideas what could be the problem?
The problem is not on the build machine; the problem is on the target machines.
Not all versions of tar automatically recognize which decompression to apply to a compressed tar file. Given that gunzip followed by tar does work, the tar on your target machine is one such version. The tar shipped with the mainstream Unix systems (AIX, HP-UX, Solaris) does not recognize compressed tar files automatically; the versions on Linux and Mac OS X do.
Note that you can use:
gzip -dc hello-0.2.tar.gz | tar -xf -
to avoid creating the intermediate uncompressed file.
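If you are unsure which tar a target machine has, GNU tar identifies itself with --version, while most vendor tars just reject the flag, so this makes a quick heuristic:
$ tar --version | head -n 1    # GNU tar prints something like: tar (GNU tar) 1.23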
Actually, this can happen when the server you download from applies another round of gzip and the client you used to download the file doesn't read/respect the HTTP Content-Encoding header, so it stores the HTTP payload exactly as it was on the wire.
Although the file appears to have only the extension .tar.gz, it is in fact .tar.gz.gz. After you run gunzip once, the file is left with just the .tar extension, yet running tar xf hello-0.2.tar still works: tar recognizes the gzip format and implicitly runs the file through gunzip one more time before extracting.
You can check this by running head hello-0.2.tar.gz and head hello-0.2.tar. gzip is a thoroughly binary format, whereas tar headers are largely human-readable. If the .tar file still looks like binary noise, you have a doubly compressed file on your hands.
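A quick way to confirm, using the file utility (output shown is illustrative):
$ file hello-0.2.tar.gz
hello-0.2.tar.gz: gzip compressed data
$ gunzip hello-0.2.tar.gz
$ file hello-0.2.tar
hello-0.2.tar: gzip compressed data    # still gzip, so it was compressed twice
A healthy tarball would instead report something like "POSIX tar archive (GNU)" after the first gunzip.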