Funny result with multiple backup tapes - tar

I'm testing an encrypted tar backup to LTO-6 tape.
I'm using a tape length of only 1G for the test:
tar cMpf - --tape-length=1G --blocking-factor 4096 -X /etc/file.exclude / | openssl enc -e -aes256 -salt -pass file:unixpass -out /dev/st0
The first tape works fine.
It asks me for the second; I insert it, press return and...
it displays the contents of a file!
<custom_item>
type : SQL_POLICY
description : "2.11 sqlnet.ora settings - 'Setting for the remote_os_authent parameter'"
...
This goes on for thousands of lines, like the cat command.
Using a file for testing, it is cat-ing /opt/nessus...
opt/nessus/var/nessus/audits/audit_warehouse.audit01402604000014563

Solution found: you have to enter the tape name; I thought it was generated automatically.
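For completeness, reading such a tape back should just be the reverse pipeline. A rough sketch for checking the first volume only (untested here; continuing across volumes brings up the multi-volume prompt again):
openssl enc -d -aes256 -pass file:unixpass -in /dev/st0 | tar tvf - --blocking-factor 4096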

How can I stop Git Bash from treating a string parameter as a file path?

Earlier today, I was trying to generate a certificate with a DNSName entry in the SubjectAltName extension:
$ openssl req -new -subj "/C=GB/CN=foo" -addext "subjectAltName = DNS:foo.co.uk" \
-addext "certificatePolicies = 1.2.3.4" -key ./private-key.pem -out ~/req.pem
This command led to the following error message:
name is expected to be in the format /type0=value0/type1=value1/type2=... where characters may be escaped by \. This name is not in that format: 'C:/Program Files/Git/C=GB/CN=foo'
problems making Certificate Request
How can I stop Git Bash from treating this string parameter as a filepath, or at least stop it from making this alteration?
The release notes for the Git for Windows 2.21.0 update released today mention this as a known issue. Fortunately, they also describe two solutions to the problem:
If you specify command-line options starting with a slash, POSIX-to-Windows path conversion will kick in converting e.g. "/usr/bin/bash.exe" to "C:\Program Files\Git\usr\bin\bash.exe". When that is not desired -- e.g. "--upload-pack=/opt/git/bin/git-upload-pack" or "-L/regex/" -- you need to set the environment variable MSYS_NO_PATHCONV temporarily, like so:
MSYS_NO_PATHCONV=1 git blame -L/pathconv/ msys2_path_conv.cc
Alternatively, you can double the first slash to avoid POSIX-to-Windows path conversion, e.g. "//usr/bin/bash.exe".
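Applied to the openssl command from the question, the first workaround would look something like this (a sketch; the key and output paths are the asker's):
MSYS_NO_PATHCONV=1 openssl req -new -subj "/C=GB/CN=foo" -addext "subjectAltName = DNS:foo.co.uk" \
-addext "certificatePolicies = 1.2.3.4" -key ./private-key.pem -out ~/req.pem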
Using MSYS_NO_PATHCONV=1 can be problematic if your script accesses files.
Prefixing with a double forward slash doesn't work for the specific case of OpenSSL, as it causes the first DN segment key to be read as "/C" instead of "C", so OpenSSL drops it, outputting:
req: Skipping unknown attribute "/C"
Instead, I used a function that detects if running on bash for Windows, and prefixes with a "dummy" segment if so:
# If running on bash for Windows, any argument starting with a forward slash is automatically
# interpreted as a drive path. To stop that, you can prefix with 2 forward slashes instead
# of 1 - but in the specific case of openssl, that causes the first CN segment key to be read as
# "/O" instead of "O", and is skipped. We work around that by prefixing with a spurious segment,
# which will be skipped by openssl
function fixup_cn_subject() {
local result="${1}"
case $OSTYPE in
msys|win32) result="//XX=x${result}"
esac
echo "$result"
}
# Usage example
MY_SUBJECT=$(fixup_cn_subject "/C=GB/CN=foo")
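The variable can then be passed straight to openssl, for example by reusing the command from the question (a sketch, not tested on every MSYS version):
openssl req -new -subj "$MY_SUBJECT" -addext "subjectAltName = DNS:foo.co.uk" \
-addext "certificatePolicies = 1.2.3.4" -key ./private-key.pem -out ~/req.pem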
Found a workaround by passing a dummy value as the first attribute, for example: -subj '//SKIP=skip/C=gb/CN=foo'
I had the same issue using bash, but running the exact same command in PowerShell worked for me. Hopefully this will help someone.

Grepping list of phpass hashes against a file

I'm trying to grep multiple strings which look like this (there are a few hundred of them) against a file whose lines are in the form string:data.
Example strings (no sensitive data is included; they have been modified):
$H$9a...DcuCqC/rMVmfiFNm2rqhK5vFW1
$H$9n...AHZAV.sTefg8ap8qI8U4A5fY91
$H$9o...Bi6Z3E04x6ev1ZCz0hItSh2JJ/
$H$9w...CFva1ddp8IRBkgwww3COVLf/K1
I've been researching how to grep a file of patterns against another file, and came across the following commands
grep -f strings.txt datastring.txt > output.txt
grep -Ff strings.txt datastring.txt > output.txt
But unfortunately, these commands do NOT work as expected and only print a handful of results to my output file. I think it may have something to do with the symbols contained in strings.txt, but I'm not sure. Any help/advice would be great.
To further mention, I'm using Cygwin on Windows (if this is relevant).
Here's an updated example:
strings.txt contains the following:
$H$9a...DcuCqC/rMVmfiFNm2rqhK5vFW1
$H$9n...AHZAV.sTefg8ap8qI8U4A5fY91
$H$9o...Bi6Z3E04x6ev1ZCz0hItSh2JJ/
$H$9w...CFva1ddp8IRBkgwww3COVLf/K1
datastring.txt contains the following:
$H$9a...DcuCqC/rMVmfiFNm2rqhK5vFW1:53491
$H$9n...AHZAV.sTefg8ap8qI8U4A5fY91:03221
$H$9o...Bi6Z3E04x6ev1ZCz0hItSh2JJ/:20521
$H$9w...CFva1ddp8IRBkgwww3COVLf/K1:30142
So technically, all lines should end up in the output file, but only this line is output:
$H$9w...CFva1ddp8IRBkgwww3COVLf/K1:30142
I just don't understand.
You have shown the output of cat -A strings.txt elsewhere, which includes ^M, representing a CR (carriage return) character, at the end of each line.
This indicates your file has Windows line endings (CR LF) instead of the Unix line endings (only LF) that grep would expect.
You can convert files with dos2unix strings.txt and back with unix2dos strings.txt.
Alternatively, if you don't have dos2unix installed in your Cygwin environment, you can also do that with sed.
sed -i 's/\r$//' strings.txt # dos2unix
sed -i 's/$/\r/' strings.txt # unix2dos
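If you'd rather not modify strings.txt at all, you can strip the carriage returns on the fly with a process substitution; a small sketch that should work in Cygwin's bash:
grep -Ff <(tr -d '\r' < strings.txt) datastring.txt > output.txt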

How to create tar file with 7zip

I'm trying to create a tar file on Windows using 7-Zip.
Most of the documents I found said to do something like this:
7z a -ttar -so dwt.tar dwt/
But when I tried to run it I got this error:
Command Line Error:
I won't write compressed data to a terminal
I'm currently using 7-Zip [64] 16.04
Any idea?
On Linux:
tar cf - <source folder> | 7z a -si <Destination archive>.tar.7z
from here
On Windows:
7za.exe a -ttar -so archive.tar source_files | 7za.exe a -si archive.tgz
from here.
I managed to do it simply, with 7-Zip installed:
Right-click on the folder you want to compress
Choose 7-Zip > "Add to archive..."
In the dialog that opens, under "Archive format", you can choose 7z/tar/wim/zip
Choose tar, and there you go :)
From the manpage:
-so Write data to stdout (e.g. 7z x -so directory.tar.7z | tar xf -)
It does what you told it to. 7z can guess archive format from the file extension so it's enough to use
7z a archive.tar input/
To further compress as gzip you can use a pipe and a combination of stdin and stdout flags like in Tu.Ma.'s answer.
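A sketch of that pipeline, reusing the archive and input names from above and spelling out the gzip type instead of relying on the file extension:
7z a -ttar -so archive.tar input/ | 7z a -si -tgzip archive.tar.gz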

How do I convert a base64 image?

I am trying to use the "convert" command-line tool from ImageMagick. I have a base64-encoded PNG file and I need to convert it to another format. I am looking at the documentation and a forum discussion, which suggest that I should be able to use this syntax:
convert inline:file.txt file.jpg
But when I do this, I get this error message:
convert: corrupt image `file.txt' # error/constitute.c/ReadInlineImage/910.
What am I doing wrong? How do I get convert to read a base64 image file?
Updated Answer - now that I understand it better myself :-)
Basically, you can base64 encode an image using openssl like this:
openssl enc -base64 -in image.png > image.b64
However, if you want ImageMagick to be able to read it, you need a small header at the start, to tell ImageMagick what follows. The header must contain:
data:image/png;base64,
followed by your base64 encoded data generated using the openssl command above. So, depending on what features your shell has, you could do it like this with a compound statement in bash:
{ echo "data:image/png;base64,"; openssl enc -base64 -in input.png; } > image.b64
or like this in Windows:
echo data:image/png;base64, > image.b64
openssl enc -base64 -in image.png >> image.b64
Once you have the image in that format, you can then proceed to process it with ImageMagick like this:
convert inline:image.b64 result.png
For those who use this in CSS, add the -A flag to get the output on one line:
openssl enc -base64 -A -in image.png > image.b64
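If you need the whole thing as a single-line data URI (e.g. to paste into a CSS url(...)), you can prepend the header in the same step; a small sketch, with image.png and image.uri as placeholder names:
{ printf 'data:image/png;base64,'; openssl enc -base64 -A -in image.png; echo; } > image.uri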
Original Answer
After MUCH experimenting, I can do it!!! :-)
Start with Eric's (@emcconville) setup:
# For example
convert rose: rose.png
# Create base64 file
openssl enc -base64 -in rose.png -out rose.txt
and now add this mess as the last line:
{ echo "data:image/png;base64,"; cat rose.txt; } | convert inline:- out.jpg
I guess the data:image/png;base64, header is not present in the base64 file created by openssl, so I create a compound statement that sends the header plus the file to ImageMagick's stdin.
Updated answer
From ImageMagick format docs...
The inline image looks similar to inline:data:;base64,/9j/4AAQSk...knrn//2Q==. If the inline image exceeds 5000 characters, reference it from a file (e.g. inline:inline.txt).
This hints at two "gotchas" when using the inline format. First, any standard base64 whitespace (Unix line breaks) should be removed so that all the information is on one line. Second, any data above 5000 characters should be read from a file rather than passed directly.
# Copy data to new file, stripping line breaks & adding the inline header. (Please advise a better sed/awk.)
cat file.txt | tr -d "\r\n" | awk '{print "data:image/png;base64,"$1}' > file.inline
# Read file as expected
convert inline:file.inline file.jpg
Original (not really correct) answer
The "corrupt image" message tells me that there may be whitespace in the base64 file. If so, the tr utility would work.
# For example
convert rose: rose.png
# Create base64 file
openssl enc -base64 -in rose.png -out rose.txt
# Read inline & data from stdin -- after stripping whitespace
cat rose.txt | tr -d "\r\n" | convert inline:data:- out.jpg

How could I untar all .tar files in a directory to folders based on filename of each .tar?

I could do this for .zip files in the folder using the command below:
for f in *.zip; do unzip -d "${f%*.zip}" "$f"; done
The above command extracts each .zip file in a given folder into a subfolder named after that .zip file and containing its contents.
But I couldn't find a command that would do the same for .tar files. Please help.
Btw, I am trying to do this on a remote server using WinSCP/putty. So, I cannot use a GUI software. I need a command, thus the question.
After a bit of fiddling I came up with the following, which appears to work as long as the file names do not contain any spaces:
for f in $(find -maxdepth 1 | grep .tar); do mkdir ${f%.tar}; tar -xaf $f -C ${f%.tar}; done
I assume you wanted the directory from foo.tar to be named foo (no file extension). If you want the directory to be named foo.tar (with the file extension), it cannot sit next to the foo.tar file itself, so create it somewhere else, for example:
for f in $(find -maxdepth 1 | grep .tar); do mkdir -p extracted/${f##*/}; tar -xaf $f -C extracted/${f##*/}; done
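If the file names may contain spaces, a glob-based variant of the same idea avoids the word-splitting problem (a sketch, assuming GNU tar):
for f in ./*.tar; do mkdir -p "${f%.tar}" && tar -xaf "$f" -C "${f%.tar}"; done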
IIRC, the remote access client Cyberduck can handle compressed files in a GUI - so you can try that if you're fine with a GUI solution.
