Simple way to order grep's results by time (reverse)?

I often have to look for specific strings in a big set of log files with grep, and I get lots of results that I have to scroll through.
Currently, grep lists its results in alphabetical order. I would like my grep results reverse-ordered by time, as ls -ltr would do.
I know I could take the result of ls -ltr and grep file by file. I do it like this:
ls -ltr ${my_log_dir}\* | awk '{print $9}' | xargs grep ${my_pattern}
But I wonder: Is there a simpler way?
PS: I'm using ksh on AIX.

The solution I found (thanks to Fedorqui) was to simply use ls -tr. It relies on the results being passed in the right order through the pipe to xargs, which then does the grep.
My misconception was that since ls normally displays its results as multiple columns rather than a single-column list, I assumed the output couldn't work as input for xargs.
Here is the simplest solution to date, since it avoids any awk parsing:
ls -tr ${my_log_dir}\* | xargs grep ${my_pattern}
I checked, and every result of ls -t is passed to xargs, even though the multi-column display made me expect otherwise:
srv:/papi/tata $ ls -t
addl ieet rrri
ooij lllr sss
srv:/papi/tata $ ls -t |xargs -I{} echo {}
addl
ieet
rrri
ooij
lllr
sss

This will work too, using the find command:
find -type f -print0 | xargs -r0 stat -c %y\ %n | sort -r | awk '{print $4}' | sed "s|^\./||"
-print0 in find preserves file names containing special characters (whitespace, tabs).
stat prints the file status with %y (time of last modification) and %n (file name); -c specifies the output format.
sort -r reverse-sorts the output of the previous command.
awk '{print $4}' prints only the file name (can be adjusted as needed).
sed removes the leading ./ from the file names.
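If GNU find is available (it is not stock on AIX, so treat this as a sketch rather than a drop-in answer), -printf can emit the modification time directly and skip the extra stat call; applied to the original question it becomes:
find "${my_log_dir}" -type f -printf '%T@ %p\n' | sort -n | cut -d' ' -f2- | xargs grep "${my_pattern}"
%T@ is the modification time in seconds since the epoch, sort -n orders oldest first (like ls -tr), and cut strips the timestamp again before xargs hands the file names to grep. The usual caveat about whitespace in file names applies, and unlike ls this recurses into subdirectories.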

Related

Linux: Search through sub-folders recursively for a file that contains a string and move it to another file

So far, I have this command in my terminal and it doesn't do anything.
Essentially it's to look for any file that contains the word bango and move it to another directory.
grep -r ".*bango.*" /Users/user/Desktop/drums | xargs mv /Users/user/Desktop/bango
Grep has an option to list only the names of matching files; you should use that to list the files.
Also, xargs can build commands that place the argument at a chosen position.
Try to use
grep -rlE ".*bango.*" /Users/user/Desktop/drums | xargs -I # mv # /Users/user/Desktop/bango
The option -E enables extended regular expressions.
However, a regular expression is not needed here; with -F you get grep's fast fixed-string matching instead:
grep -rlF "bango" /Users/user/Desktop/drums | xargs -I # mv # /Users/user/Desktop/bango
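If the file names may contain spaces, a NUL-separated variant is safer. This is a sketch: -Z/--null needs GNU grep or a reasonably recent BSD grep, and the trailing slash just makes the destination directory explicit:
grep -rlZF "bango" /Users/user/Desktop/drums | xargs -0 -I # mv # /Users/user/Desktop/bango/
-Z terminates each printed file name with \0, and -0 makes xargs split its input on \0 instead of whitespace.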

Docker caching for travis builds

Docker caching is not yet available on travis: https://github.com/travis-ci/travis-ci/issues/5358
I'm trying to write a workaround by doing:
docker save -o file.tar $(docker history -q image_name | grep -v missing)
docker load -i file.tar
Which works great, gives me all the image layers back. My only problem now is the saving takes a long time, and most of the time I'm actually changing one layer, so I don't need to rewrite all the rest. Is there a way of telling the docker save command to skip layers already in file.tar?
In the manifest.json file inside the tar you have the information you need.
tar -xOf file.tar manifest.json
Check the value of the Config keys. The first 12 characters are the image id. You can use the command above, extract the image ids that you already have, and exclude them in your docker save command.
I'm not very good with bash scripting, but this works on my Mac:
tar -xOf file.tar manifest.json | tr , '\n' | grep -o '"Config":".*"' | awk -F ':' '{print $2}' | awk '{print substr($0,2,12)}'
Combining it with docker history outputs everything together:
docker history -q IMAGE_HERE | grep -v missing && tar -xOf file.tar manifest.json | tr , '\n' | grep -o '"Config":".*"' | awk -F ':' '{print $2}' | awk '{print substr($0,2,12)}'
After this you only need to get the unique values. This could be done with sort and uniq -u, but for some reason sort doesn't work as expected. This command assumes the presence of file.tar, so take that into consideration too.
I couldn't find anything about appending in the docker save command. The above strategy could work with multiple tar files that are all different from each other.
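A rough bash sketch of that strategy (hypothetical file names; it assumes process substitution, GNU tools, and the older manifest format in which each Config value starts with the image id, as described above):
# ids already stored in file.tar (first 12 hex characters of each Config value)
have=$(tar -xOf file.tar manifest.json | tr , '\n' | grep -o '"Config":"[0-9a-f]*' | cut -d'"' -f4 | cut -c1-12)
# ids the image currently consists of
want=$(docker history -q image_name | grep -v missing)
# keep only the ids that are not in the tar yet, then save just those
new=$(printf '%s\n' $want | grep -vxFf <(printf '%s\n' $have))
[ -n "$new" ] && docker save -o new-layers.tar $new
grep -vxF -f compares whole lines as fixed strings, so an id is kept only when it is absent from the tar's manifest.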

How to grep for filenames found by find in other files?

How can I grep for the result of find within another pattern?
That's how I get all filenames with a certain pattern (in my case ending with "ext1")
find . -name *ext1 -printf "%f\n"
And then I want to grep for these filenames with another pattern (in my case ending on "ext2"):
grep -r '[filname]' *ext2
I tried with
find . -name *ext1 -printf "%f\n" | xargs grep -r *ext2
But this only makes grep tell me that it cannot find the files found by find.
You would tell grep that the patterns are in a file with the -f option, and use the "stdin filename" -:
find ... | grep -r -f - *ext2
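A concrete sketch of this, with the find pattern quoted so the shell does not expand it prematurely, and -F added so a dot in a file name matches literally instead of acting as a regex wildcard:
find . -name '*ext1' -printf '%f\n' | grep -Ff - *ext2
grep reads one pattern per line from stdin (-f -) and searches every *ext2 file for them.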

Delete a list of files with find and grep

I want to delete all files which have names containing a specific word, e.g. "car".
So far, I came up with this:
find|grep car
How do I pass the output to rm?
find . -name '*car*' -exec rm -f {} \;
or pass the output of your pipeline to xargs:
find | grep car | xargs rm -f
Note that these are very blunt tools, and you are likely to remove files that you did not intend to remove. Also, no effort is made here to deal with files that contain characters such as whitespace (including newlines) or leading dashes. Be warned.
To view what you are going to delete first, since rm -fr is such a dangerous command:
find /path/to/file/ | grep car | xargs ls -lh
Then, if the results are what you want, run the real command by replacing ls -lh with rm -fr:
find /path/to/file/ | grep car | xargs rm -fr
I like to use
rm -rf $(find . | grep car)
It does exactly what you ask, logically running rm -rf on what grep car returns from the output of find ., which is a list of every file and folder, recursively.
You can use ls and grep to find your files and rm -rf to delete the files.
rm -rf $(ls | grep car)
But it is not a good idea to use this command if there is a chance that directories or files you don't want to delete have names matching the pattern you give to grep.
You really want to use find with -print0 and rm with --:
find [dir] [options] -print0 | grep --null-data [pattern] | xargs -0 rm --
A concrete example (removing all files below the current directory containing car in their filename):
find . -print0 | grep --null-data car | xargs -0 rm --
Why is this necessary:
-print0, --null-data and -0 change the input/output handling from tokens separated by whitespace to tokens separated by the \0 character. This allows the handling of unusual filenames (see man find for details).
rm -- makes sure to actually remove files starting with a - instead of treating them as parameters to rm. If there is a file called -rf and you run find . -print0 | grep --null-data r | xargs -0 rm, the file -rf will possibly not be removed, but will instead alter the behaviour of rm on the other files.
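A quick illustration of the -- point (a throwaway sandbox; the file names are created only for the demo):
mkdir demo && cd demo
touch -- -rf precious.txt
rm *       # the shell expands this to: rm -rf precious.txt (precious.txt is gone, -rf remains)
rm -- *    # now -rf is treated as a file name and removed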
This finds files matching the pattern (*.xml), greps their contents for the string (exclude="1"), and deletes each file where a match is found.
find . -type f -name "*.xml" -exec grep exclude=\"1\" {} \; -exec rm {} \;
Most of the other solutions presented here have problems with handling file names with spaces in them. Here's a solution that handles spaces properly.
grep -lRZ car . | xargs -0 rm
Notes on arguments used:
-l tells grep to print only filenames
-R enables grep recursive search in subfolders
-Z tells grep to separate results by \0 instead of \n
-0 tells xargs to separate input arguments by \0 instead of whitespace
car is the regular expression to search for
. is the folder where to search
You can also use rm -f to force the removal (as usual).
A bit of necromancy, but you can also use find, grep, and xargs
find . -type f | grep -e "pattern1" -e "pattern2" | xargs rm -rf
The find command may need some attention to fit your needs, such as -type f, -mindepth, -maxdepth and any globbing.
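For instance, a sketch spelling out those options (hypothetical depths and glob):
find . -mindepth 2 -maxdepth 4 -type f -name '*.log' | grep -e "pattern1" -e "pattern2" | xargs rm -rf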
When find | grep car | xargs rm -f gets results like:
/path/to/car
/path/to/car copy
files whose names contain whitespace will not be removed.
So my answer is:
find | grep car | while read -r line ; do
rm -rf "${line}"
done
This way, files whose names contain whitespace can be removed too.
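A more robust variant of the same idea uses NUL separators, so even file names containing newlines survive (a sketch assuming GNU find and bash):
find . -name '*car*' -print0 | while IFS= read -r -d '' line ; do
rm -rf "${line}"
done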
find start_dir -iname \*car\* -exec rm -v {} \;
I use:
find . | grep "car" | while read -r i; do echo "$i"; rm -f "$i"; done
This works even if there are spaces in the filename, and it works recursively, matching directories as well.
Use rm with wildcard *
rm * will delete all files
rm *.ext will delete all files which have ext as extension
rm word* will delete all files which start with word.

How to grep and execute a command (for every match)

How can I grep in one file and execute a command for every match?
File:
foo
bar
42
foo
bar
I want to execute, for example, date for every match on foo.
The following try doesn't work:
grep file foo | date %s.%N
How to do that?
grep foo file | while read line ; do echo "$line" | date +%s.%N ; done
More readably in a script:
grep foo file | while read line
do
echo "$line" | date +%s.%N
done
For each line of input, read will put the value into the variable $line, and the while statement will execute the loop body between do and done. Since the value is now in a variable and not stdin, I've used echo to push it back into stdin, but you could just do date +%s.%N "$line", assuming date works that way.
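In fact date ignores its stdin entirely, so for this particular goal the loop body can call date directly; note the + that date requires before its format string (%N is a GNU date extension):
grep foo file | while read -r line; do
date +%s.%N    # runs once per matching line; $line itself is unused here
done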
Avoid using for line in `grep foo file`, which looks similar, because for always breaks on spaces and this becomes a nightmare for reading lists of files:
find . -iname "*blah*.dat" | while read filename; do ....
would fail with for.
What you really need is the xargs command: http://en.wikipedia.org/wiki/Xargs
grep foo file | xargs -I{} date +%s.%N
An example of matching some files and converting the matches to full Windows paths in a Cygwin environment:
$ find $(pwd) -type f -exec ls -1 {} \; | grep '\(_en\|_es\|_zh\)\.\(path\)$' | xargs cygpath -w
grep command_string file | sh -
There is an interesting command in Linux for that: xargs. It allows you to use the output of a previous command (grep, ls, find, etc.) as the input for a custom execution, with several options that even let you execute the custom command in parallel. Below are some examples.
Based on your question, here is how to print the date with format "%s.%N" for each "foo" match in file.txt:
grep "foo" file.txt | xargs -I {} date +%s.%N
A more interesting use is creating a file for each match, though in this case if matches are identical the file will be overwritten:
grep "foo" file.txt | xargs -I {} touch {}
If you want to concatenate a custom date to the created file's name:
grep "foo" file.txt | xargs -I {} touch "{}`date +%s.%N`"
Imagine the matches are file names and you want to make a backup of them:
grep "foo" file.txt | xargs -I {} cp {} "{}.backup"
And finally, using the custom date in the backup name:
grep "foo" file.txt | xargs -I {} cp {} "{}`date +%s.%N`"
For more info about options like parallel execution of xargs visit: https://en.wikipedia.org/wiki/Xargs and for date formats: https://www.thegeekstuff.com/2013/05/date-command-examples/
Extra: I have also found a plain for loop useful in these scenarios. It is simpler but less versatile; below are the equivalents of the above examples:
for i in `grep "foo" test.txt`; do date +%s.%N; done
for i in `grep "foo" test.txt`; do touch ${i}; done
for i in `grep "foo" test.txt`; do touch "${i}`date +%s.%N`"; done
for i in `grep "foo" test.txt`; do cp ${i} "${i}.backup2"; done
for i in `grep "foo" test.txt`; do cp ${i} "${i}.backup2`date +%s.%N`"; done
Have Fun!!!
grep may need the --line-buffered option to emit each matching line as soon as it matches; otherwise it buffers up to 4K bytes before printing matched lines, which defeats the goal here, e.g.
tail -f source | grep --line-buffered "expression" | xargs ...
grep search_string files_to_search | sh
