xmllint not operator in the predicate not working - xml-parsing

My requirement is to get the alias of the ancestor InstanceDescription whose attribute is not equal to "A":
<Instance>
<InstanceDescription alias="A">
<Context>
<AttributDescription attrid="A"/>
</Context>
</InstanceDescription>
<InstanceDescription alias="B">
<Context>
<AttributDescription attrid="b"/>
</Context>
..
..
</InstanceDescription>
</Instance>
I tried the following command; however, it gives me a syntax error:
myval="cat //Instance/InstanceDescription/Context/AttributDescription[not(#attrId = \"'A'\")]/ancestor::InstanceDescription/#alias | xmllint --shell $filename"
I also tried not(#attrId = 'A'); still the same error.
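For reference, a sketch of one working form, assuming the fix is XPath's attribute syntax (@attrid, matching the lowercase attribute name in the sample, rather than #attrId) and using xmllint --xpath so the result lands directly in a variable:

```shell
# Sketch: write the corrected sample, then pull out the alias of the
# ancestor InstanceDescription whose AttributDescription attrid is not "A".
# Note @attrid (XPath attribute axis), not #attrId.
cat > sample.xml <<'EOF'
<Instance>
  <InstanceDescription alias="A">
    <Context><AttributDescription attrid="A"/></Context>
  </InstanceDescription>
  <InstanceDescription alias="B">
    <Context><AttributDescription attrid="b"/></Context>
  </InstanceDescription>
</Instance>
EOF
myval=$(xmllint --xpath \
  'string(//Instance/InstanceDescription/Context/AttributDescription[not(@attrid="A")]/ancestor::InstanceDescription/@alias)' \
  sample.xml)
echo "$myval"    # B
```

The string() wrapper makes xmllint print just the attribute value of the first matching node, which also answers how to pass the selected string into a variable: plain $() command substitution.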


how do i store the output of a command into a variable for use in a github-actions .yaml?
docker images --format='{{.ID}}' | select -first 1
gives me:
fc6e040841a1
I've seen stuff online about Select-Object, but I honestly have no idea; I'm just trying to push an image to a registry...
the following cmd doesn't work in powershell:
for /f "delims=" %a in ("docker images --format='{{.ID}}' | select -first 1") do #set "%_img%=%a"
the following cmd doesn't work in powershell:
for /f "delims=" %a in ("docker images --format='{{.ID}}' | select -first 1") do #set "%_img%=%a"
That's because this is Command Prompt syntax. Specifically, everything outside of the () would only work under cmd.exe. The PowerShell equivalent for assigning a command result to a variable is:
$variableName = COMMAND
To apply it to your use case:
$imageId = docker images --format='{{.ID}}' | Select-Object -First 1
Note that select is an alias of Select-Object and either can be used interchangeably.
Edit: While a for loop is not required for setting variables in PowerShell, unlike in the Command Prompt, for syntax is also different in PowerShell scripting. You can read up on PowerShell's for, foreach, and ForEach-Object constructs when you want to learn how they are used in PowerShell scripts, and watch for this gotcha when using the foreach "statement" as part of a pipeline.
While not part of the original scope of the question, since OP did ask and I answered in the comments, I will put the bash equivalent here for the sake of completeness and how I transposed this from the PowerShell method I used above:
imageId=$(docker images --format="{{.ID}}" | head -n 1)
This is similar to the PowerShell syntax with a few changes: remove the $ from the variable name on assignment, and Select-Object is replaced by head. You can't pad the = with whitespace, and you have to subshell the command with $().
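As a runnable illustration of that assignment pattern (printf stands in for docker here, since the actual image list is specific to OP's machine):

```shell
# Capture the first line of a command's output into a variable.
# printf is a stand-in for `docker images --format "{{.ID}}"`.
imageId=$(printf '%s\n' fc6e040841a1 9a1b2c3d4e5f | head -n 1)
echo "$imageId"    # fc6e040841a1
```

The same two rules apply whatever the command is: no whitespace around the =, and the pipeline wrapped in $().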

Parse the output of a grep to tag files

After a lengthy pipe which ends with a grep, I correctly end up with a set of matching absolute paths and match strings, one per line, separated by a comma delimiter. I want to tag each file with its match string. This is complicated by the fact that the paths contain spaces, although there is no space between the delimiter and the preceding and succeeding characters.
I need to be able to deal with an absolute path rather than just the filename within the directory. The match strings are space-free, but the filenames might not be.
So by way of example, the output of the pipe might look like:
pipe1 | pipe2 |
outputs
/Users/bloggs/Directory One/matched_file.doc,attributes_0001ABC
/Users/bloggs/Directory One/matched_file1.doc,attributeY_2
/Users/bloggs/Directory One/match_file_00x.doc,Attribute_00201
/Users/bloggs/Directory One/matching file 2.doc,attribute_0004
I want to tag each using something which will probably include:
tag --add "$attribute" "$file"
Where attribute refers to the match string eg "Attribute_00201"
Normally I'd just say eg:
tag --add Attribute_00201 /Users/bloggs/Directory\ One/match_file_00x.doc
At this point I am stuck on how to parse each line, ideally via another pipe, how to deal with the spaces correctly, and how to execute the tag command. Grateful for any help.
So I'm looking for a new pipe, pipe3 to execute or give me the correctly formatted tag command:
pipe1 | pipe2 | pipe3
delivers eg
tag --add Attribute_00201 /Users/bloggs/Directory\ One/match_file_00x.doc
etc
etc
This seems to work
| tee >(cut -f2 -d","| sed 's/^/tag --add /' > temp_out.txt) >(cut -d"," -f1 | sed -e 's/[[:space:]]/\\ /g' > temp_out1.txt) > /dev/null && paste -d' ' temp_out.txt temp_out1.txt > command.sh && chmod +x ./command.sh
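A simpler pipe3, sketched here with sample lines standing in for pipe1 | pipe2: let read split each line on the comma only, so the spaces in the path stay inside a single variable and no backslash escaping is needed:

```shell
# Split each "path,attribute" line on the comma only; with IFS set to a comma,
# spaces in the path remain part of $file. echo prints the command that would
# run; drop the echo to actually tag the files.
printf '%s\n' \
  '/Users/bloggs/Directory One/match_file_00x.doc,Attribute_00201' \
  '/Users/bloggs/Directory One/matching file 2.doc,attribute_0004' |
while IFS=, read -r file attribute; do
  echo tag --add "$attribute" "$file"
done
```

Because "$file" is quoted when passed to tag, the backslash-escaping step (and the temp files) become unnecessary.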

How do I look for ">" in a file using shell script?

I am trying to look for ">" character in a CSV using the grep command as follows:
grep ">" test_file.csv
grep \> test_file.csv
However, both these commands yield no results. I know for a fact that several instances of this character appear in this file. I am pretty sure this issue is coming up because ">" is also the shell's output-redirection character. How do I search for ">" and get results successfully?
You can try the following commands:
grep -F '>' application.log
grep -e '>' application.log
-F interprets the pattern as a fixed string.
-e specifies the string explicitly as a pattern (useful when a pattern could be mistaken for an option).
To print line numbers as well, you can use the -n flag.
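A quick self-contained check (the CSV content here is made up):

```shell
# Create a tiny file containing '>' and confirm grep finds it; quoting the
# pattern keeps the shell from treating > as redirection.
printf '%s\n' 'x > y' 'plain line' > test_file.csv
grep -nF '>' test_file.csv    # prints: 1:x > y
```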

How to grep for pointers to pointers (in C++ source)?

I have been using grep to perform recursive search of text inside a directory, i.e.
grep -Hrn "some-text" ./
However, I am running into some troubles when I need to search for pointers or pointers of pointers:
grep -Hrn "double**" ./
> grep: repetition-operator operand invalid
I have made some attempts to go around this, including some I found via Google searches:
grep -Hrn "double[**]" ./
grep -Hrn "double[*][*]" ./
but neither seems to work. Any pointers?
You have to escape * by using \. For example
$ echo "double***" | grep "double\*\*\*"
double***
If you don't escape *, it matches the character before the * zero or more times. One * would therefore match e.g. doubleeee, but the second * results in an error, since its operand (the character before the *) is again *, which is not valid. That's exactly what the error message tells you.
The version using [] should also work. As mentioned in the comments the issue might be that your variable declarations contain whitespace. The following regex matches these (now using the * operator):
$ echo "double **" | grep "double\s*\*\*"
double **
I usually use fixed strings, like:
grep -FHrn "double**" ./
(the -F)
-F, --fixed-strings
Interpret pattern as a set of fixed strings (i.e., force grep to
behave as fgrep).
https://www.freebsd.org/cgi/man.cgi?query=grep&sektion=&n=1 (also same in rg and GNU grep)
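A minimal check of the fixed-string approach, reading from stdin instead of recursing over a directory (the sample code lines are invented):

```shell
# -F disables regex interpretation, so ** is matched literally.
printf '%s\n' 'double** p;' 'double x;' | grep -F 'double**'
# prints: double** p;
```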

Illegal variable name error when using grep -v '^$' [duplicate]

I get an error Illegal variable name when I use this line of code:
set users = "` last | sort | tr -s '\t' ' ' | grep '[0,2][0-4]:[0-5][0-9] -' | grep -v '^$' | grep -v '[2][0-1]:[0-5][0-9] -' `"
But it works fine when I use this code:
set users = "` last | sort | tr -s '\t' ' ' | grep '[0,2][0-4]:[0-5][0-9] -' | grep -v '[2][0-1]:[0-5][0-9] -' `"
The code should store people who logged in between 22:00 and 05:00 (excluding 05:00) into a variable named users. It should also remove any empty lines which are in the output. This is what I'm trying to do in the first code, but it gives me the aforementioned error.
I don't know how to explain it, but it is one of those typical csh pitfalls.
A dollar sign ($) between double quotes ("), regardless of whether they are in between backticks (`) and single quotes ('), is always considered to start a variable name. So if the word following the dollar sign is not a valid variable name, csh starts to complain. Example:
$ grep "foo$" file.txt
Illegal variable name.
This is exactly what your problem is. You wrote something similar to
$ set var = "`grep -v '^$' file.txt`"
and even though the dollar sign is between single quotes, which are in between backticks for command substitution, which is again between double quotes to retain the blanks and tabs of the command substitution, it just does not matter! There is no hope! You used double quotes with all good intentions, but it just blew up in your face! Resistance is futile, your dollar sign will be assimilated to resemble a variable, even when it does not name one! csh just does not care! You just want to cry! You cannot even escape it!
If you make use of last from util-linux, you might be interested in the flags --since and --until (see here and here). Otherwise you might use the following command line:
set users="`last | awk '/(2[2-3]|0[0-4]):.. [-s]/'`"
This will match all lines where the user logged in between 22 and 05 (exclusive) and is potentially still logged in.
As a general note, I would suggest switching from CSH to BASH for many reasons. Some of them are mentioned here and here.
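For completeness, a bash version of the same filter can be sketched as follows; the sample lines stand in for real `last` output, whose exact columns vary by system:

```shell
# bash: plain $() command substitution replaces csh's set var = "`...`",
# and the unescapable-$ pitfall disappears. The regex keeps logins whose
# hour is 22-23 or 00-04 and which show the "HH:MM -" login column.
users=$(printf '%s\n' \
  'alice pts/0 host Mon 23:15 - 23:40' \
  'bob pts/1 host Mon 14:02 - 14:30' |
  awk '/(2[2-3]|0[0-4]):[0-5][0-9] -/')
echo "$users"    # alice pts/0 host Mon 23:15 - 23:40
```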
