Bazel macros can compose other Bazel rules.
For example:
def cpp_library(name, deps = []):
    explicit_cpp_file = name + ".cpp"
    explicit_hpp_file = name + ".hpp"
    native.cc_library(
        name = name,
        srcs = [explicit_cpp_file],
        hdrs = [explicit_hpp_file],
        deps = deps,
    )
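For context, a BUILD file might invoke the macro like so (the load path //tools:rules.bzl is hypothetical):

load("//tools:rules.bzl", "cpp_library")  # hypothetical .bzl location

cpp_library(
    name = "foo",  # expands to a cc_library "foo" built from foo.cpp and foo.hpp
    deps = ["//other:lib"],
)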
Here we see that cpp_library uses native.cc_library.
Is there a way to use a bazel query to print this relationship information?
As a project grows and many rules become available, it becomes difficult to know which rules are used by which other rules.
You can get this information through bazel query with the --output flag: --output build will show the rules after macros have been expanded, and --output graph will show the relationships between rules, e.g.
bazel query //some/package --output graph | xdot
(or e.g. | dot -Tpng > /tmp/dep.png)
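For the cpp_library macro above, --output build would print the expanded cc_library, along these lines (illustrative; the package path //my/pkg is hypothetical):

bazel query //my/pkg:foo --output build

# /my/pkg/BUILD:1:1
cc_library(
  name = "foo",
  srcs = ["foo.cpp"],
  hdrs = ["foo.hpp"],
  deps = ["//other:lib"],
)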
See https://docs.bazel.build/versions/master/query.html#output-formats
I have a requirement to match two files (the first is a PS file and the second is a PDS file).
Bytes 1-8 of each PS file record are member names:
Example:
ABCD1234
DDFF2345
QWER3456
The PDS file has 100 members, and the 3 members given above are among them.
Example:
AAAA1234
ABCD1234
DDFF2345
QWER3456
SSSS2222
HHHH1212
My requirement is to create an output PDS file containing those 3 matching members (same content as in the input PDS file).
Result PDS file:
ABCD1234
DDFF2345
QWER3456
If anyone can guide me or give an idea, it would be very helpful.
Regards
Harry
Write a step to obtain the list of members in the PDS. There are various ways to do this: in-house utilities, third-party utilities, or your own CLIST or REXX wrapping the LISTDS TSO command. Use whatever method is commonly in use in your shop. If you don't know what's commonly in use in your shop, ask your coworkers. Save the output in a file.
Write a step to execute your shop's SORT utility to match-merge the list of members from the first step with the list in the PS file. Both DFSORT and Syncsort can do this; see the JOINKEYS statement in the documentation. Save the list of members that match in a file.
Here's the tricky part: write a step to read the list of members from the match-merge step and output IEBCOPY control statements to copy those members (sketched below). If you want to get fancy, you could combine this step with the previous step.
Write a step to execute IEBCOPY with the control statements written by the previous step.
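If your shop allows a scripting step for the tricky part (Python here purely for illustration; a REXX equivalent would be more traditional), the logic of the match and the control-statement generation is just a set intersection followed by some formatting. A minimal sketch, assuming the two member lists have been saved as members.ps and members.pds, one name per line (all file names are assumptions):

# Minimal sketch: intersect the two member lists and emit IEBCOPY
# SELECT statements. All file names here are illustrative assumptions.
with open("members.ps") as f:
    wanted = {line[:8].strip() for line in f if line.strip()}
with open("members.pds") as f:
    present = {line.strip() for line in f if line.strip()}

with open("iebcopy.ctl", "w") as out:
    # IEBCOPY control statements must not begin in column 1
    out.write(" COPY INDD=INLIB,OUTDD=OUTLIB\n")
    for member in sorted(wanted & present):
        out.write(" SELECT MEMBER=(%s)\n" % member)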
If you have access to USS you can do what you want by following this example:
The example uses the following dataset names, but you can replace
them with the actual dataset names in your case:
PS: hlq.MEMBER.LIST <--- the list of members you want
PDS: hlq.PDS.DATASET <--- the PDS dataset with members you want to copy
PDS: hlq.NEWPDS.DATASET <--- the PDS dataset to which you want to copy the members
Create the list of members from hlq.PDS.DATASET into a USS file pds.members
> tsocmd "listds 'hlq.PDS.DATASET' MEMBERS" >pds.members
Copy each member listed in hlq.MEMBER.LIST that is in hlq.PDS.DATASET into the hlq.NEWPDS.DATASET
> for fn in $(cat "//'hlq.MEMBER.LIST'"); do grep -q "$fn" pds.members && cp "//'hlq.PDS.DATASET($fn)'" "//'hlq.NEWPDS.DATASET'"; done
Write an edit macro to turn that list of members into IEBCOPY control statements to copy the members:
COPY INDD=INLIB,OUTDD=OUTLIB
SELECT MEMBER=(ABCD1234)
SELECT MEMBER=(DDFF2345)
SELECT MEMBER=(QWER3456)
and then use this edited data as input to an IEBCOPY job to copy those members to a new PDS:
//COPY EXEC PGM=IEBCOPY
//*
//INLIB DD DISP=SHR,DSN=<old.pds>
//OUTLIB DD DISP=SHR,DSN=<new.pds>
//*
//SYSUT3 DD UNIT=SYSDA,SPACE=(CYL,(50,50))
//SYSUT4 DD UNIT=SYSDA,SPACE=(CYL,(50,50))
//SYSPRINT DD SYSOUT=*
//SYSIN DD DISP=SHR,DSN=<new.control.statements>
If I run this command with grep, I can retrieve a list of dependencies that my project needs from a maven repository.
bazel query "deps(//my-project-server)" | grep "#maven"
I will get a list of output like the following per dependency:
@maven//:com_thoughtworks_paranamer_paranamer
@maven//:v1/https/repo.maven.apache.org/maven2/com/thoughtworks/paranamer/paranamer/2.8/paranamer-2.8.jar
@maven//:com_thoughtworks_paranamer_paranamer_2_8_extension
@maven//:v1/https/repo.maven.apache.org/maven2/com/thoughtworks/paranamer/paranamer/2.8/paranamer-2.8-sources.jar
What I actually want is the name of the package and the version.
In the example above I want to extract the pair com_thoughtworks_paranamer_paranamer and 2.8.
Is there a direct way I can ask bazel to produce such results for all the maven dependencies?
bazel query <expression> --output=build | grep tags could get you pretty far:
For example:
$ bazel query @maven//:all --output=build | grep tags
tags = ["maven_coordinates=org.hamcrest:hamcrest-core:2.1"],
tags = ["maven_coordinates=org.hamcrest:hamcrest:2.1"],
tags = ["maven_coordinates=com.google.guava:guava:27.0-jre"],
tags = ["maven_coordinates=org.codehaus.mojo:animal-sniffer-annotations:1.17"],
tags = ["maven_coordinates=org.checkerframework:checker-qual:2.5.2"],
tags = ["maven_coordinates=com.google.j2objc:j2objc-annotations:1.1"],
tags = ["maven_coordinates=com.google.guava:listenablefuture:9999.0-empty-to-avoid-conflict-with-guava"],
tags = ["maven_coordinates=com.google.guava:failureaccess:1.0"],
tags = ["maven_coordinates=com.google.errorprone:error_prone_annotations:2.2.0"],
tags = ["maven_coordinates=com.google.code.findbugs:jsr305:3.0.2"],
As part of automating a legacy deployment process, I'm trying to build a Jenkins pipeline that can, among other operations, execute parameterized builds.
One of those operations is the ability to execute several commands against a given list of services. Given a list of service names, e.g.
Mailer
Reporter
DbMigrator
...
etc, I'd like to run certain commands against some of those.
Using the Extended Choice Parameter plugin, I was able to load this list from a properties file and display it as a list of checkboxes. However, I am looking for a way to build a "matrix" parameter with multiple values. My goal is to do something like this:
| Service    | Opt1 | Opt2 |
|------------|------|------|
| Mailer     | [x]  | [ ]  |
| Reporter   | [x]  | [x]  |
| DbMigrator | [ ]  | [x]  |
| ...        |      |      |
So that I can apply several values (Opt1 and/or Opt2) to one parameter (e.g. Mailer).
Is there a way in Jenkins to do this?
or
Is there a better way of doing this?
You can try the Matrix Project plugin, which comes bundled with later versions of Jenkins.
You can add a couple of Groovy axes, which allows you to do this:
import groovy.json.JsonSlurper

def result = []
def inputFile = new File("/path/to/prop.json")
def inputJson = new JsonSlurper().parseText(inputFile.text)
// collect the axis values from the "parm1" array of the JSON shown below
inputJson.parm1.each { result << it }
return result
with this sort of JSON
{
    "parm1": [
        "Mailer",
        "Reporter",
        "DbMigrator"
    ],
    "parm2": [
        "opt1",
        "opt2"
    ],
    "filter": "parm1=='Mailer'"
}
There is also a filter option in the matrix to restrict it to various combinations. I was trying to make it evaluate the filter from the 'filter' property above (creating an environment variable using EnvInject and another Groovy script), so far with no success, but you can use a literal string such as:
parm1 == 'Mailer' || parm2 == 'opt2'
I'm trying to define a grammar for Ninja build files with Xtext.
There are three tricky points that I can't answer.
Indentation by tabs:
How to handle indentation. A rule in a Ninja build file might have several variable definitions with preceding tab indentation (similar to makefiles). This becomes a problem when the language has single-line comments, ignores whitespace, and delimits structure by tab indentation (Python, make, ...):
cflags = -g
rule cc
command = gcc $cflags -c $in -o $out
Cross-referencing a reserved set of variable names:
There is a set of reserved variables. Auto-complete should be able to reference both the reserved and the user-defined variables:
command = gcc $cflags -c $in -o $out
Auto-completing cross-referenced variable names that aren't separated by whitespace:
org.eclipse.xtext.common.Terminals hides WS tokens, and ID tokens are separated by whitespace. But in a Ninja script (similar to makefiles), parsing should match the longest variable name:
some_var = some_value
command = $some_var.h
Any ideas are appreciated. Thanks.
Check out the Xtext 2.8.0 release: https://www.eclipse.org/Xtext/releasenotes.html
The Whitespace-Aware Languages section states:
Xtext 2.8 supports languages in which whitespace is used to specify
the structure, e.g. using indentation to delimit code blocks as in
Python. This is done through synthetic tokens defined in the grammar:
terminal BEGIN: 'synthetic:BEGIN';
terminal END: 'synthetic:END';
These tokens can be used like other terminals in grammar rules:
WhitespaceAwareBlock:
BEGIN
...
END;
The new example language Home Automation available in the Eclipse examples (File → New → Example → Xtext Examples) demonstrates this concept. It allows code like the following:
Rule 'Report error' when Heater.error then
    var String report
    do
        Thread.sleep(500)
        report = HeaterDiagnostic.readError
    while (report == null)
    println(report)
More details are found in the documentation.
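For intuition, the synthetic-token approach is the same trick Python's own tokenizer uses: the lexer keeps an indentation stack and emits block-open/close tokens that the parser consumes like ordinary terminals. A minimal illustrative sketch in plain Python (not Xtext API):

# Emit synthetic BEGIN/END tokens from tab indentation, analogous to
# Xtext's 'synthetic:BEGIN' / 'synthetic:END' terminals.
def tokenize(lines):
    indents = [0]                       # stack of active indentation depths
    for line in lines:
        if not line.strip():
            continue                    # blank lines don't affect structure
        depth = len(line) - len(line.lstrip("\t"))  # count leading tabs
        if depth > indents[-1]:
            indents.append(depth)
            yield ("BEGIN", None)
        while depth < indents[-1]:
            indents.pop()
            yield ("END", None)
        yield ("LINE", line.strip())
    while len(indents) > 1:             # close blocks still open at EOF
        indents.pop()
        yield ("END", None)

# Example: a Ninja-style rule whose body is tab-indented
for token in tokenize(["rule cc", "\tcommand = gcc $cflags -c $in -o $out"]):
    print(token)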
Does anyone have a handy powershell script that gets a set of files from TFS based on a modification date? I'd like to say "give me all the files in this folder (or subfolder) that were modified after X/Y/ZZZZ" and dump those files to a folder other than the folder they would normally go to. I know enough powershell to hack about and get this done, eventually, but I'm hoping to avoid that.
Make sure you have the Team Foundation 2015 Power Tools installed. It comes with a PowerShell snapin. You can run the PowerShell console file right from its startup group or you can execute Add-PSSnapin Microsoft.TeamFoundation.PowerShell. Then cd to your workspace and execute:
Get-TfsItemProperty . -r | Where {$_.CheckinDate -gt (Get-Date).AddDays(-30)} |
Format-Table CheckinDate,TargetServerItem -auto
CheckinDate TargetServerItem
----------- ----------------
9/14/2009 1:29:23 PM $/Foo/Trunk/Bar.sln
9/29/2009 5:08:26 PM $/Foo/Trunk/Baz.sln
To dump that info to a dir:
Get-TfsItemProperty . -r | Where {$_.CheckinDate -gt (Get-Date).AddDays(-30)} |
Select TargetServerItem > c:\recentlyChangedFiles.txt
To copy those files to another dir (this assumes you have them pulled down locally into a workfolder):
Get-TfsItemProperty . -r | Where {$_.CheckinDate -gt (Get-Date).AddDays(-30)} |
    % { Copy-Item -Path $_.LocalItem -Destination C:\SomeDir -WhatIf }
Note this copies files into a flat folder structure. If you want to maintain the dir structure it is a bit more involved.
Using Get-TfsItemProperty as Keith does doesn't just require a workspace for the file copies. It's the wrapper for GetExtendedItems(), the server query for local info most commonly seen in Source Control Explorer. By relying on the version info it reports, you assume the files themselves were downloaded (more generally: synchronized, in the case of renames & deletes) in the last 30 days. If the workspace is not up to date, you'll miss some files / give them out-of-date names / etc. It's also quite expensive as informational commands go.
Some alternative examples:
# 1
Get-TfsChildItem $/FilesYouWant -R |
    ? { $_.CheckinDate -gt (Get-Date).AddDays(-30) } |
    % { $_.DownloadFile((Join-Path C:\SomeDir (Split-Path $_.ServerItem -Leaf))) }
# 2
Get-TfsItemHistory $/FilesYouWant -R -All -Version "D$((Get-Date).AddDays(-30).ToString('d'))~" |
Select-TfsItem |
Select -Unique -Expand Path |
Sort |
Out-File c:\RecentlyChanged.txt
The first is a straightforward adaptation of Keith's code, using a cheaper query and eliminating the workspace dependency. It's the best option if you know a high % of the items under that dir were modified recently.
The second option queries the changeset history directly. By letting the Where clause be computed in SQL instead of on the client, this can be an order of magnitude more efficient if a low % of the items were changed recently (as is often the case). However, it will lag the item-based queries if there are lots of large changesets returned, making the server's JOIN to grab the item properties expensive and forcing our client-side duplicate removal to do a lot of work.
[Yes, I know that having -Version require a string is not very Powershell-esque; mea culpa. You could create a DateVersionSpec with new-object and call its ToString(), but that's even more work.]
I didn't show every combination of API call + desired task. Goes without saying you can use #1 to generate the file list and #2 to (re)download by modifying the latter half of the pipeline. You can even combine that copy technique with the efficiency of Get-TfsItemHistory:
# 2b, with local-to-local copying
Get-TfsItemHistory $/FilesYouWant -R -All -Version "D$((Get-Date).AddDays(-30).ToString('d'))~" |
    Select-TfsItem |
    Select -Unique -Expand Path |
    Get-TfsItemProperty |
    % { Copy-Item $_.LocalItem -Destination C:\SomeDir }
It's true this makes a 2nd roundtrip to the server, but thanks to the initial query the GetExtendedItems() call will be scoped to the precise set of items we're interested in. And of course we remove any chance that download time becomes the bottleneck. This is likely the best solution of all when the # of changesets is small and the concerns I raised about Keith's workspace synchronization aren't relevant for whatever reason.
Can I just say that having to use PowerShell to do this seems absurd.
FWIW, I've been involved with TFS from inside & outside MS for 4.5 years and never seen this feature requested. If you could expand on the goal you're actually trying to accomplish, my guess is we could suggest a better way. Don't get me wrong, I wrote the PowerShell extensions precisely to handle oddball scenarios like this. But frequently it's a job for another tool entirely, e.g.: Annotate, MSBuild, DB schema compare...