grep current directory only - grep

I would like to search all the files in the current directory only. I tried this:
grep foo *
but I get this error:
grep: bar: Is a directory
I also tried this:
grep -r foo
but this is searching subdirectories as well.

Depending on your version of grep, you may be able to write:
grep --directories=skip foo *
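If yours is GNU grep, the short form of the same option should also work:
grep -d skip foo *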

This is really a comment, but I don't have enough reputation to post it as one.
Your first attempt is actually correct: if bar is a directory inside the directory whose files you want to search, the files still get searched; you just get an error message for the directory.
If you simply want to do away with the error, redirect it:
e.g. grep foo * 2> /dev/null
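If you would rather be explicit about staying in the current directory, a find-based variant should also work (just a sketch, assuming your find supports -maxdepth, as GNU and BSD find do):
find . -maxdepth 1 -type f -exec grep foo {} +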

Related

Mingw64 shell's grep ignores -r option?

I'm trying to do a grep in Microsoft Windows, using the MINGW64 shell v4.4.23(1). (That's what the title bar says. I assume this means MingW-W64.)
I want to list all files in a specified directory tree that have a certain filename extension and do not contain a certain string.
With the current directory set to the top of the tree I entered
grep -r -L thestring *.theextension
It lists only files in the current directory, not the tree.
I tried some variations and determined that grep is simply ignoring the -r option. It ignores --recursive, too.
But when I enter grep --help, it lists both -r and --recursive as valid options, with the expected meaning.
Is this a bug in the shell, or am I doing something stupid?
With grep -r -L thestring *.theextension you are telling grep to search recursively in any file or folder matching *.theextension. If no folders match that pattern, you shouldn't expect it to descend into any other folders. The -L flag doesn't make grep look at anything that doesn't match *.theextension; maybe that's what was confusing you.
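If the MinGW grep is a GNU grep build, something along these lines should behave the way the question intended: start the recursive search at . and restrict it to the extension with --include (a sketch reusing the names from the question):
grep -r -L --include='*.theextension' thestring .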

Grep returns no such file or directory when using multiple flags

I am a beginner at bash scripting. I just started to write a script that checks whether the contents of b.txt can all be found in a.txt (line by line, preferably). My code is as follows:
grep -Ffw b.txt a.txt
As you can see, I want fixed-string matching instead of a regex; I want to take all the patterns from the b.txt file, because it contains several strings and I want to check whether all of them exist in a.txt; and of course I also want to match whole words only. Those are the requirements, but when I run this command it returns an error: grep: w: No such file or directory
I am thinking that maybe there are some limitations on combining flags in bash? Sorry, I am not really familiar with the language and didn't read much of the man page. If anyone could help me solve the puzzle it would be appreciated :) In addition, if possible I would also like to add a -q to suppress the output when there is a match; I didn't add it in the example since the command couldn't even make it through with three flags. So can anyone give me some hints here? Thanks in advance!
Here is some explanation from the man page:
OPTIONS
Generic Program Information
...
-F, --fixed-strings
Interpret PATTERNS as fixed strings, ...
-f FILE, --file=FILE
Obtain patterns from FILE, ...
-w, --word-regexp
Select only those lines ...
As you can see, the options -F and -w take no argument (hence the comma immediately after -F, and -w,), but the -f switch is followed by FILE, which means the two belong together.
If you want to preserve the order Ffw, that's possible, but then you need to write something like:
grep -Ff b.txt -w a.txt
As mentioned by @kvantour, the solution is simply to place the -f immediately before the b.txt file: grep -Fwf b.txt a.txt
I should have thought of it when grep said 'No such file or directory': that was a clear indication that whatever followed the -f in the flag cluster was already being treated as the file path.
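To make the failure mode concrete, here is a sketch with the file names from the question; in the cluster -Ffw, everything after the -f is consumed as its FILE argument, so grep goes looking for a pattern file literally named w:
grep -Ffw b.txt a.txt      # -f takes "w" as its FILE -> grep: w: No such file or directory
grep -Fwf b.txt a.txt      # -f comes last, so it takes b.txt as the pattern file
grep -F -w -f b.txt a.txt  # the same options written out separately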

Strange behavior grep -rnw

I am using grep (BSD grep) 2.5.1-FreeBSD in MacOS and I have found the following behavior.
I have two *.tex files. Each one of these contains the following lines
$k$-th bit of
$(i-m)$-th bit of
respectively. When I ran
grep --color -rnw . -e '\$-th bit of' --include="*.tex"
I got only the second file, i.e., $(i-m)$-th bit of, while I expected both lines. Could you please help me understand this behavior?
Never use -r or --include or any other grep option to find files. The GNU guys really screwed up by adding those options to grep when there's a perfectly good tool named find for finding files and now they've turned grep into a convoluted mush of finding files and Globally matching a Regular Expression within a file and Printing the result (G/RE/P).
Keep it simple - find the files with find, then g/re/p within them using grep:
find . -name '*.tex' -exec grep --color -n '\$-th bit of' {} +
As others pointed out, your g/re/p problem was the -w arg, so I've removed that above.
I have the same version of grep.
It is caused by your use of the -w option:
-w, --word-regexp
The expression is searched for as a word (as if surrounded by `[[:<:]]' and `[[:>:]]'; see re_format(7)).
The matched part of the string $k$-th bit of is bounded on the left-hand side by a word character (the k), so the match is treated as being inside a "word" and therefore can't satisfy the "searched for as a word" requirement.
Try without -w and it will work fine.
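A quick way to see this for yourself (a sketch; demo.tex is just a throwaway file name) is to put both lines from the question into one file and run grep with and without -w:
printf '$k$-th bit of\n$(i-m)$-th bit of\n' > demo.tex
grep -w '\$-th bit of' demo.tex   # only the $(i-m) line; the k before the match breaks the word boundary
grep '\$-th bit of' demo.tex      # without -w, both lines match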

grep alias search command not working

I am trying to make an alias called "file search" (fs for short) that takes one argument (the search term) and then searches down the directory tree for it using grep.
Example:
fs 'function my_function()'
alias fs='grep -R "$1" .'
What am I doing wrong?
What you tried would search the current directory itself, not the files in it, and certainly not the files in its subdirectories. You want something like this (from memory, I'm not at a Unix machine right now):
find . -type f | xargs grep "$1"
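A side note the answer above doesn't mention (so treat it as an extra observation, not part of that answer): a plain alias can't use a positional parameter like "$1", so a shell function is usually a better fit for this. A minimal sketch:
# a function instead of an alias, so "$1" really is the first argument
fs() {
    grep -R "$1" .
}
After defining it, fs 'function my_function()' searches the current directory tree as intended.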

"grep: line too long" error message

I used the following syntax in order to find IP addresses under /etc
(answered by Dennis Williamson on the Super User site),
but I get the message "grep: line too long".
Does anyone have an idea how to ignore this message, and why I get it?
grep -Er '\<([0-9]{1,3}\.){3}[0-9]{1,3}\>' /etc/
grep: line too long
The find/xargs solution didn't work for me, but resulted in the same error.
I solved this problem by using the -I grep option (ignore binary files). In my case there must have been a binary file, with no line breaks, in the list of files to search, so grep was trying to read in a gigantic line that was too big. That's my guess at what this error means.
I got the idea from: http://web.archiveorange.com/archive/v/am8x7wI0r0243prrmYd4
This might not work for you of course if there's a text file with a line that is too long.
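Combining -I with the command from the question would look something like this (same pattern, just skipping binary files):
grep -IEr '\<([0-9]{1,3}\.){3}[0-9]{1,3}\>' /etc/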
Use find to build a list of files to grep:
find /etc -type f -print0 | xargs -r0 grep -E '\<([0-9]{1,3}\.){3}[0-9]{1,3}\>'
In general find is a more flexible way of traversing the filesystem and building lists of files for other programs.
Perhaps your grep has a bug and accidentally scans a binary file with overly long lines (i.e. too many characters for grep to handle between two newlines). See this Red Hat page for more details (bug page).
