How do I check where a symlink is pointing in Ant

Basically what I've asked in the title: how can I use the Ant "symlink" task to find out where a symlink is pointing?
Thanks
Steve

I think the symlink task can only create and remove symlinks. You could execute something like ls -la ${file} | awk '{print $NF}' (the last field of the ls output is the link target) and capture the output with a redirector, but this looks like an ugly hack; there should be a better way.
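As a rough illustration of what that pipeline returns at a shell prompt (the symlink path here is made up; inside Ant the idea is to run it via exec and capture standard output with a redirector):
ln -s /etc/hosts /tmp/mylink                 # throwaway symlink to test with
ls -la /tmp/mylink | awk '{print $NF}'       # prints the link target: /etc/hosts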

Related

Grep returns no such file or directory when using multiple flags

I am a beginner at bash scripting. I just started to write a script that checks whether the contents of b.txt can all be found in a.txt (line by line, preferably). My code is as follows:
grep -Ffw b.txt a.txt
As you can see, I want to match fixed strings instead of regular expressions, I want to check every string in b.txt and see whether all of them exist in a.txt, and I also want to match whole words only. Those are the requirements; however, when I run this command it returns an error: grep: w: No such file or directory
I am thinking that maybe there are some limitations on combining flags in bash? Sorry, I am not really familiar with the language and didn't read much of the man page. If anyone could help me solve the puzzle it would be appreciated :) In addition, if possible I would like to add -q to suppress the output when there is a match; I didn't add it in the example since it couldn't even get through with the three flags. Can anyone give me some hints here? Thanks in advance!
Here is some explanation from the manpage:
OPTIONS
Generic Program Information
...
-F, --fixed-strings
Interpret PATTERNS as fixed strings, ...
-f FILE, --file=FILE
Obtain patterns from FILE, ...
-w, --word-regexp
Select only those lines ...
As you can see, the options -F and -w take no argument (hence the comma immediately after -F, and -w,), but the -f switch is followed by FILE, which means the two belong together: whatever follows -f is read as the file of patterns.
If you want to preserve the order Ffw, that's possible, but then you need to do something like:
grep -Ff b.txt -w a.txt
As mentioned by @kvantour, the solution is simply placing -f directly before the b.txt file: grep -Fwf b.txt a.txt
I should have thought of it when it said 'no such file or directory'; that was a clear indication that the flags after -f were already being treated as the file path.
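To make the failure mode concrete, here is a small demonstration with made-up file contents (a.txt and b.txt as in the question):
printf 'apple\nbanana\ncherry\n' > a.txt
printf 'banana\ncherry\n' > b.txt
grep -Fwf b.txt a.txt    # works: every pattern from b.txt is matched as a fixed string and a whole word
grep -Ffw b.txt a.txt    # fails with "grep: w: No such file or directory" because -f consumes the rest of the flag cluster ("w") as its FILE argument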

Why does grep make my terminal unreadable when searching for #?

I'm trying to grep my repo for the # sign to find some e-mail addresses because of this git issue.
I type grep -rnw '/my/path' -e '#' into the terminal, and the output (shown in the picture) fills the screen with unreadable characters.
Why does this happen?
P.S. I think there is no sensitive information in the picture, but please tell me if you think there is.
If you have issues only on files within a .git folder, you might consider excluding it from your recursive grep, as in this answer.
grep --exclude-dir=".git" -nrw ...
On CentOS, the regular grep might not include that option though.
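For example, applying the exclusion to the exact command from the question would look like this; as a fallback on a grep that lacks --exclude-dir, the -I flag (treat binary files as non-matching) should also keep the garbage out of the terminal:
grep --exclude-dir=".git" -rnw '/my/path' -e '#'
grep -I -rnw '/my/path' -e '#'    # skip binary matches instead of skipping the .git directory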

Remove a certain amount of files with a sequence of characters in a directory

I stopped a process to troubleshoot something. Now I would like to start the process where it left off, on CentOS 6.4.
The script I stopped runs a perl script in a loop to process all of the files in /dev/shm/split/. These files were split into many parts from a larger file. An example of how they are named is as follows:
file.txt.aa
file.txt.ab
file.txt.ac
...and so on.
I have identified that the script left off at file.txt.fy. So, I would like to remove all of the files in /dev/shm/split/ that are from file.txt.aa through file.txt.fy.
I tried to create a whitelist for the rm command by doing:
ls /dev/shm/split/ > whitelist
cat whitelist | egrep -v 'file.txt.[aa-fz]' | tee whitelist.tmp
This did not do what I had intended.
Please help me! Thank you!
The problem with your command is that a square-bracket pattern in bash matches only a single character, so you cannot match a two-character range like aa-fy with it. You should use something like this instead:
ls file.txt.[a-e]? file.txt.f[a-y]
Basically, decompose your range into two ranges: the first matches .aa to .ez, and the second .fa to .fy (inclusive).
Note that I have used the ls command here. I always find it a good idea to first echo or ls the files you're about to operate on when the operation is potentially destructive. When you're sure the pattern produces the right output, go on and use rm instead of ls.
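Put together, the dry run followed by the actual removal might look like this (run from the directory in the question):
cd /dev/shm/split/
ls file.txt.[a-e]? file.txt.f[a-y]    # dry run: confirm this lists exactly file.txt.aa through file.txt.fy
rm file.txt.[a-e]? file.txt.f[a-y]    # then remove those files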

grep current directory only

I would like to search all the files in the current directory only. I tried this
grep foo *
but I get this error
grep: bar: Is a directory
I also tried this
grep -r foo
but this is searching subdirectories as well.
Depending on your version of grep, you may be able to write:
grep --directories=skip foo *
This is actually a comment; I just don't have enough reputation to post it as one.
Your first attempt is actually correct: if bar is a directory inside the directory you want to search and you just don't want the error, the simplest fix is to discard it, e.g.:
grep foo * 2> /dev/null
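Side by side, the two suggestions look like this (foo and * as in the question; -d skip is the short form of --directories=skip in GNU grep):
grep --directories=skip foo *    # tell grep to silently skip directories
grep -d skip foo *               # same thing, short option
grep foo * 2> /dev/null          # hide the errors instead (note: this also hides unrelated errors such as permission problems)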

find a command on $PATH

I'm writing a script, and I need to look up a command on the user's $PATH and get the full path to the command. The problem is that I don't know what the user's login shell is, or what strange stuff might be in their dot files. I'm using the Bourne shell for my simple little script because it needs to run on some older Solaris platforms that might not have bash.
Some implementations of "which" and "whence" will source the user's dot files, and that isn't really portable to all users. I'd love a simple UNIX utility that would just do the basic job of scanning PATH for an executable and reporting the full path of the first match.
But I'll settle for any /bin/sh solution that is stable for all users.
I'm looking for a solution that is better than writing my own /bin/sh loop that chops up $PATH and searches it one entry at a time. It would seem that this is common enough that there should be a reusable way to do it.
My first approximation of the "long way" is this:
IFS=:
for i in $PATH; do
    if [ -x "$i/$cmd" ]; then
        echo "$i/$cmd"
    fi
done
Is there something simpler and portable?
The answer seems to be the 'type' built-in.
% /bin/sh
$ type ls
ls is /bin/ls
Maybe the whereis command will work for you?
whereis -b -B `echo $PATH | sed 's/:/ /g'` -f [commands]
e.g. on my computer, this works:
whereis -b -B `echo $PATH | sed 's/:/ /g'` -f find man fsc
And results in:
find: /usr/bin/find
man: /usr/bin/man
fsc: /opt/FSharp-2.0.0.0/bin/fsc.exe /opt/FSharp-2.0.0.0/bin/fsc
One caveat from the whereis man page:
Since whereis uses chdir(2V) to run faster, pathnames given
with the -M, -S, or -B must be full; that is, they must begin
with a `/'.
This question is answered in detail here: https://unix.stackexchange.com/questions/85249/why-not-use-which-what-to-use-then. Bottom line: use command -v ls.
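In a script, that might look like the following sketch (cmd here is a stand-in for whatever command you need to look up; command -v is specified by POSIX, though a truly ancient Bourne /bin/sh may not have it):
cmd=ls
path=`command -v "$cmd"` || { echo "$cmd: not found on PATH" >&2; exit 1; }
echo "$path"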
