I haven't used cscope much. I would like to know the instructions to build (database) and use cscope with opencv.
Also, is it applicable only to C programs? How about C++?
I found this long dead question as I searched for the same topic. Here's what I've found.
According to the cscope home page
The fuzzy parser supports C, but is flexible enough to be useful for C++ and Java, and for use as a generalized 'grep database' (use it to browse large text documents!)
So, I went ahead and generated a cscope database for opencv2, more or less adapting from the cscope large projects tutorial
To generate the appropriate cscope.files for the OCV2 directory defined below, run the following find command, pruning out a lot of superfluous directories and files:
#!/bin/bash
OCV2=~/src/opencv/opencv
find $OCV2 \
  -path "$OCV2*/.git" -prune -o -path "$OCV2*/samples" -prune -o \
  -path "$OCV2*/cmake" -prune -o -path "$OCV2*/data" -prune -o \
  -path "$OCV2*/doc" -prune -o -path "$OCV2*/platforms" -prune -o \
  -path "$OCV2*/release" -prune -o \
  -iname "*.cpp" -print -o -iname "*.hpp" -print -o \
  -iname "*.c" -print -o -iname "*.h" -print > cscope.files
Now you'll want to generate the cscope database. Do so by running the following from within the same directory as cscope.files (-b builds the cross-reference only, -q adds an inverted index for faster symbol lookups, and -k sets kernel mode so /usr/include isn't indexed):
cscope -b -q -k
which will create the files:
cscope.in.out cscope.out cscope.po.out
If you have the environment variable $CSCOPE_DB set to point at cscope.out, then you'll be ready to go.
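For example (a minimal sketch, assuming the database was built in ~/src/opencv/opencv; use whichever directory you actually ran cscope -b in, and note that $CSCOPE_DB is mainly read by editor integrations such as Vim's cscope plugin):
# point editor plugins at the pre-built database, e.g. in ~/.bashrc
export CSCOPE_DB=~/src/opencv/opencv/cscope.out
# from the directory containing cscope.out, browse without rebuilding
cd ~/src/opencv/opencv && cscope -d
# non-interactive, line-oriented queries (the symbol cvResize is just an example)
cscope -d -L -1 cvResize    # find the global definition of cvResize
cscope -d -L -3 cvResize    # find functions calling cvResize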
Let me know if you have any other questions.
One of our shared hosting sites got moved recently. New server is Red Hat 4.8.5-36. The other binaries' versions are grep (GNU grep) 2.20 and find (GNU findutils) 4.5.11
This cron job had previously functioned fine for at least 6 years and gave us a list of updated files which did not match logs, cache etc.
find /home/example/example.com/public_html/ -mmin -12 \
| grep -v 'error_log|logs|cache'
After the move the -v seems to be ineffectual and we get results like
/home/example/example.com/public_html/products/cache/ssu/pc/d/5/c
The change in results occurred immediately after the move. Anyone have an idea why it is now broken? Additionally - how do I restore the filtered output?
If you'd like to exclude a group of words:
grep -v -e 'error_log' -e 'logs' -e 'cache' file
With awk you can do:
awk '!/error_log|logs|cache/' file
It will exclude all lines containing any of these words.
grep -v 'error_log|logs|cache'
only excludes lines that literally contain the string error_log|logs|cache. To use alternation, use extended regular expressions:
grep -Ev 'error_log|logs|cache'
GNU grep supports alternation as an extension to Basic Regular Expressions, but | needs to be escaped, so this might work as well:
grep -v 'error_log\|logs\|cache'
However, grep isn't required in the first place; we can use (GNU) find to do all the work:
find /home/example/example.com/public_html/ -mmin -12 \
-not \( -name '*error_log*' -or -name '*logs*' -or -name '*cache*' \)
or, POSIX compliant:
find /home/example/example.com/public_html/ -mmin -12 \
\! \( -name '*error_log*' -o -name '*logs*' -o -name '*cache*' \)
or, if your find supports -regex (both GNU and BSD find do):
find /home/example/example.com/public_html/ -mmin -12 \
-not -regex '.*\(error_log\|logs\|cache\).*'
I am trying to use the HERE Maps SDK, and I followed every step mentioned at the link below:
https://developer.here.com/news/20170208a#.WVn1zNN968p
I need help with this issue. Please help me if you have ever used the HERE Maps SDK on iOS with Swift 3.
I hope these steps help solve the error.
In Terminal, go to the project's root directory and execute these commands one by one:
find . -type f -name '*.jpeg' -exec xattr -c {} \;
find . -type f -name '*.png' -exec xattr -c {} \;
find . -type f -name '*.tif' -exec xattr -c {} \;
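If you prefer, those three passes can be collapsed into a single find invocation (just a sketch, assuming the same three image extensions are the problem):
find . -type f \( -name '*.jpeg' -o -name '*.png' -o -name '*.tif' \) -exec xattr -c {} \;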
Clean the Xcode project and rebuild. Done.
I have written a script that shows Xcode warnings, e.g. TODO warnings. This script runs on each build in Xcode. (I added it as a "Run Script" build phase.)
Now I want to collect and export all these warnings to text files. Is there any way to export all warnings or build errors to a text file?
(the first bit of this is what you've already done, or something like it)
Outputting TODO, etc, as warnings.
Select your project, click the Build Phases tab, and select 'Add Build Phase > Add Run Script Build Phase' from the 'Editor' menu.
In the script box use a script something like this:
KEYWORDS="TODO:|FIXME:|\?\?\?:|\!\!\!:"
find "${SRCROOT}" \( -name "*.h" -or -name "*.m" \) -print0 | xargs -0 egrep --with-filename --line-number --only-matching "($KEYWORDS).*\$" | perl -p -e "s/($KEYWORDS)/ warning: \$1/"
(courtesy of: http://deallocatedobjects.com/posts/show-todos-and-fixmes-as-warnings-in-xcode-4)
The KEYWORDS regular expression matches TODO:, FIXME:, ???: and !!!:, but could be adjusted to find whichever indicators you want.
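For example, a variant that also flags HACK: and NOTE: markers (purely illustrative; use whatever markers your team cares about):
KEYWORDS="TODO:|FIXME:|HACK:|NOTE:|\?\?\?:|\!\!\!:"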
Making this output to a file.
The script currently outputs to stdout, which is picked up by Xcode and parsed. To make it also log to a file, use tee as part of the script (see the end of line 2 for the change):
KEYWORDS="TODO:|FIXME:|\?\?\?:|\!\!\!:"
find "${SRCROOT}" \( -name "*.h" -or -name "*.m" \) -print0 | xargs -0 egrep --with-filename --line-number --only-matching "($KEYWORDS).*\$" | perl -p -e "s/($KEYWORDS)/ warning: \$1/" | tee "${SRCROOT}/NOTICES.txt"
This approach can be as complex as you like, of course; as well as teeing to a file, we can augment the script to do anything we choose:
KEYWORDS="TODO:|FIXME:|\?\?\?:|\!\!\!:"
find "${SRCROOT}" \( -name "*.h" -or -name "*.m" \) -print0 | xargs -0 egrep --with-filename --line-number --only-matching "($KEYWORDS).*\$" | perl -p -e "s/($KEYWORDS)/ warning: \$1/" | tee ${SRCROOT}/NOTICES.txt
mail -s NOTICES idmillington#example.com < ${SRCROOT}/NOTICES.txt
That emails it to me.
I've confirmed this works with Xcode 5.0.2, including emailing.
Note that this does not export all warnings from the build to a file, which is strictly what you asked. I can't find a way to automate this in Xcode 5.0.2, though you can do it with xcodebuild. From within the UI, the only option seems to be to copy the log text from the log navigator to the clipboard.
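As a rough sketch of the xcodebuild route (the scheme name MyApp and the file names are made up; run it from your project directory and adjust to taste):
# build from the command line and capture the complete log
xcodebuild -scheme MyApp build 2>&1 | tee build-full.log
# keep only the warning and error lines
grep -E 'warning:|error:' build-full.log > build-issues.txt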
OK, I'm very new to programming, but I'm getting better at conceptualizing and describing what I want and what I need to learn.
Right now I am working with a directory /Food and have .html pages that I've downloaded from several sites.
I'd like to create a script that searches the directory /Food, including all files in this folder and its sub-directories, and finds the files whose text contains the strings I input.
So something like:
commandforsearchingtextfiles [option for directory]/food *.[or command for all files following this directory path]
salt (string1)
sugar (string 2)
flour (string 3)
echo results
The results/output should be the files that contain the strings... and any extra ideas on how to organize the output would be welcome.
Again, if this is covered elsewhere, please just point me to the right places to learn about it, but if you have any quick advice or a quick script, that would be great too.
Are you on Linux? Or could you use Cygwin (if on Windows)?
... if so, the basic Linux commands would cope with this pretty well.
e.g. to search for all files containing salt:
find Food/ -type f -name "*.html" -print0 | xargs -0 grep salt
You can narrow/widen the search by adding more switches to the various commands, e.g. case-insensitive:
find Food/ -type f -name "*.html" -print0 | xargs -0 grep -i salt
or just the filenames (not the matched text)
find Food/ -type f -name "*.html" -print0 | xargs -0 grep -l salt
for more options, check man grep.
Multi-word phrases are possible:
find Food/ -type f -name "*.html" -print0 | xargs -0 grep -i "the quick brown fox"
But there is an added complication: HTML itself doesn't care about whitespace, so the phrase could be split over multiple lines, which means the whitespace in the documents could differ from your search. E.g. the above won't match
the quick
brown fox
but it's valid HTML. Use a regex to work around that...
find Food/ -type f -name "*.html" -print0 | xargs -0 grep -iE "the[[:space:]]+quick[[:space:]]+brown[[:space:]]+fox"
but it's starting to get messy.
You could put this in a .sh script so you don't have to type all of that.
e.g.
#!/usr/bin/sh
find Food/ -type f -name "*.html" -print0 | xargs -0 grep -i "$*"
which, when saved as a file (e.g. find_in_food) and made executable, can be run to perform a test search:
find_in_food salt
will display the matching lines, prefixed with their filenames.
(this is of course barely scratching the surface of what's possible with this!)
I'm grepping through a large pile of code managed by git, and whenever I do a grep, I see piles and piles of messages of the form:
> grep pattern * -R -n
whatever/.git/svn: No such file or directory
Is there any way I can make those lines go away?
You can use the -s or --no-messages flag to suppress errors.
-s, --no-messages suppress error messages
grep pattern * -s -R -n
If you are grepping through a git repository, I'd recommend you use git grep. You don't need to pass in -R or the path.
git grep pattern
That will show all matches from your current directory down.
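For example, you can add line numbers and limit the search to particular pathspecs (the file patterns here are just illustrative):
git grep -n pattern -- '*.c' '*.h'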
Errors like that are usually sent to the "standard error" stream, which you can redirect to a file or simply discard on most commands:
grep pattern * -R -n 2>/dev/null
I have seen that happen several times with broken links (symlinks that point to files that do not exist): grep tries to search the target file, which does not exist (hence the correct and accurate error message).
I normally don't bother while doing sysadmin tasks over the console, but from within scripts I do look for text files with "find", and then grep each one:
find /etc -type f -exec grep -nHi -e "widehat" {} \;
Instead of:
grep -nRHi -e "widehat" /etc
I usually don't let grep do the recursion itself. There are usually a few directories you want to skip (.git, .svn...)
You can create clever aliases with commands like that one:
find . \( -name .svn -o -name .git \) -prune -o -type f -exec grep -Hn pattern {} \;
It may seem overkill at first glance, but when you need to filter out some patterns it is quite handy.
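A quick sketch of wrapping that in a shell function (the name grepsrc is made up; a function works better than an alias here because it takes the pattern as an argument):
grepsrc() {
    # search regular files, skipping .git and .svn directories
    find . \( -name .git -o -name .svn \) -prune -o -type f -exec grep -Hn "$1" {} \;
}
# usage:
grepsrc pattern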
Have you tried the -0 option with xargs? It expects NUL-separated input, so pair it with find -print0. Something like this:
find . -type f -print0 | xargs -0 grep 'some text'
Use -I in grep.
Example: grep SEARCH_ME -Irs ~/logs.
I redirect stderr to stdout and then use grep's invert-match (-v) to exclude the warning/error string that I want to hide:
grep -r <pattern> * 2>&1 | grep -v "No such file or directory"
I was getting lots of these errors running "M-x rgrep" from Emacs on Windows with /Git/usr/bin in my PATH. Apparently in that case, M-x rgrep uses "NUL" (the Windows null device) rather than "/dev/null". I fixed the issue by adding this to .emacs:
;; Prevent issues with the Windows null device (NUL)
;; when using cygwin find with rgrep.
(defadvice grep-compute-defaults (around grep-compute-defaults-advice-null-device)
  "Use cygwin's /dev/null as the null-device."
  (let ((null-device "/dev/null"))
    ad-do-it))
(ad-activate 'grep-compute-defaults)
One easy way to make grep return zero status all the time is to use || true
→ echo "Hello" | grep "This won't be found" || true
→ echo $?
0
As you can see, the exit status here is 0 (success).
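This matters most in scripts run with set -e, where a non-matching grep would otherwise abort the whole script (a minimal sketch; the file names are made up):
#!/bin/bash
set -e
# grep exits with status 1 when nothing matches, which would stop the script
# under set -e; || true swallows that status so execution continues
grep "pattern" input.log > matches.txt || true
echo "still running, even if nothing matched"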