FxCop: getting a custom dictionary to work from the command line - fxcopcmd

I can't find the answer here or on Google. I'm trying to do the simplest thing: I have CustomDictionary.xml in a solution subfolder (this is a requirement) and FxCop installed by copying it to the build server (also a requirement), and I need to run FxCopCmd analysis using the custom dictionary.
It works fine except that FxCopCmd just doesn't pick up the custom dictionary, no matter what I pass on the command line.

Whatever mechanism you're using for specifying the target assembly path(s) should work equally well for the dictionary path. Failing that, you can use the placeholder %fxcop% in a relative path to represent the directory from which you are running FxCopCmd.exe, e.g.:
"D:\BuildFolder\Tools\FxCopCmd.exe" ... /dictionary:"%fxcop%\..\CustomDictionary.xml"

Related

Is there a way to add native rules to Bazel?

I would like a set of rules from my_package.bzl to be accessible to all BUILD files of a workspace without having to load my_package.bzl in the BUILD files. Basically I want the rules in the package to look like native rules. How can I achieve this?
I was thinking maybe there's a line I could add to one of the .bazelrcs or to the project's WORKSPACE file.
This can be achieved by adding a prelude_bazel file at //tools/build_rules:prelude_bazel (this must be a package, so tools/build_rules must contain a BUILD file).
This will be loaded and prepended to all BUILD files loaded by Bazel.
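For illustration, a minimal sketch (the //tools/build_rules:prelude_bazel location is fixed by Bazel; the .bzl path and rule name being loaded are hypothetical):
# tools/build_rules/BUILD -- may be empty; it only needs to exist so that tools/build_rules is a package.
# tools/build_rules/prelude_bazel -- each load() here is prepended to every BUILD file:
load("//my_project:my_package.bzl", "my_rule")
With that in place, any BUILD file in the workspace can call my_rule(...) without its own load() line.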
However, there are a few things to consider before going this route. It's currently undocumented, and judging by how little information turns up when searching for it, it's unclear whether it will remain a part of Bazel.
It may also have performance / scaling problems. If the prelude were to change (or any of its dependencies), every BUILD file would have to be reloaded, and this may take some time depending on the size of the build graph.

Optional file dependencies in Bazel?

Is there a way to specify optional dependencies in Bazel?
I'd like to make a rule to somewhat mirror Kitware's ExternalData, but I would like to see if I can enable workflows where the developer edits the file in-tree, ideally without needing to modify the BUILD file.
Ideal Workflow
Define a rule, external_data, which can fetch a file from a given server given its SHA-512.
If the file already exists, check its SHA-512.
If it matches what was requested, symlink / copy the file (ensuring that no tests can modify the original file).
If it is different, print a warning but proceed as normal, to allow developers to quickly modify the large files as they need.
I would like to do this such that Bazel can switch between the file being present and not, and be robust to false positives in caching. An example scenario that I would like to avoid, if I were not to include it as an optional dependency:
In a prior run, the file was in the workspace, Bazel built the target, everything's fine and dandy.
Developer removes the file from the workspace after uploading, satisfied with their changes and wanting to test the download process.
When running the downstream target, Bazel doesn't care about the change in the workspace since it's not an explicit dependency; the symlink is now invalid, and the test crashes and burns.
To me, it seems like I'd run into this if I tried to implement a repository_rule which manually checks for the file's existence and conditionally executes (I'm not sure whether analysis would retrigger this rule being "evaluated" if Step 2 happens).
Workaround
My current thought for an alternative workflow is to have an explicit option for external_data, use_workspace: if False, it will download the file; if True, it will just mirror exports_files([]). The developer can then set this when modifying files.
(Ideally, I'd like to optionally include a file which indicates the SHA (${file}.sha512), but this seems to go back to the original ask.)
One workaround is to use Bazel's glob(...) function to effectively check for file existence.
If you have a file, say basic.bin.sha512, and you want a rule to switch modes based on that file's existence, you can use glob(["basic.bin.sha512"]), which will either match the package file exactly or return an empty list.
I had tinkered around with using this on larger sets of files, and it appears to work. However, for the time being, I've erred toward having a sort of explicit "development" mode for the target definition, to keep the Bazel build relatively consistent regardless of what files may be checked out.
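As a rough sketch of how that check can drive the mode switch (external_data and use_workspace are the hypothetical rule and attribute from the question above, and mapping the modes this way is just one possible choice):
# glob() only matches files that actually exist in the package,
# so an empty result means basic.bin.sha512 is not checked in.
sha_hits = glob(["basic.bin.sha512"])

external_data(
    name = "basic_bin",
    # e.g. fall back to the in-tree copy when no .sha512 file is present.
    use_workspace = (len(sha_hits) == 0),
)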
Here's an example usage:
https://github.com/EricCousineau-TRI/external_data_bazel/blob/4bf1dff/WORKFLOWS.md#edit-files-in-a-sha512-group

travis-lint: why does it complain about the java language field

I have this dead simple .travis.yml for a Java project. When I run 'travis-lint' against the file it complains:
[17:24:23#emeraldjava]$ travis-lint
/Users/pauloconnell/projects/emeraldjava/.travis.yml has issues:
Found an issue with the `language:` key:
Language must be valid
Any ideas?
My build actually works once it's deployed.
Until the problem with the command-line tool is resolved, and as @joshua-anderson proposed, use the web linter http://lint.travis-ci.org/.
Simply copy/paste your .travis.yml file's content there (or point it at your GitHub repository directly, like emeraldjava/emeraldjava) and hit validate.

Dynamically loading erlang header files

I know that you can dynamically load Erlang beam files in an Erlang node using "l(module_name).". My question is: is it possible to load ".hrl" files the same way, or in some similar fashion, without having to restart the Erlang node?
I am not sure this is possible, but just based on my understanding: when you define a macro in an .hrl file and later want to modify it, the compiler has already replaced the macros defined in the header during compilation of the Erlang source file.
Logically, you should rebuild your code and deploy it again. I don't see a reason why you would need .hrl files to be loaded dynamically when you have the option of replacing the entire code dynamically. IMHO, all you need to do is rebuild and upgrade, and this can also be done without restarting the Erlang node.
".hrl" files - used only by compiler on compile sources. It is not is runtime files.
You can use the popular auto-reloader from the Mochi team:
https://github.com/mochi/mochiweb/blob/master/src/reloader.erl
Put it in your src/ folder and add the -s reloader option to your erl command line.

How do I change the file extension for dependencies

I'm building a program that uses Delphi packages (BPLs) as plugins, but I'd like to use a custom extension to show that the files have a specific purpose instead of just being BPLs. That works well enough until one package ends up depending on another: the compiler then builds the binary with the dependency's BPL extension baked in.
This wouldn't be too hard to fix with a hex editor, but that's sort of an extreme solution. Is there any way I could make the compiler generate the packages with the right dependency names in the first place?
EDIT: The answers so far seem not to have understood the question.
I know exactly how to create the packages with my custom TEP extension instead of a BPL extension. But if I have package1.TEP and package2.TEP, and package2 depends on package1, and then I try to load package2, it gives an error because it can't find "package1.BPL". What I want is to find some simpler way to make package2 look for the correct filename, "package1.TEP," that doesn't involve editing the binary after it's been created. Is there any way to do that?
Use the {$E} directive.
The simplest solution would be to use a post-build event to rename your destination file from *.BPL to whatever specific extension you require.
EDIT:
You could write a separate patch program that searches for and patches the offending binaries, and run it as part of the post-build process. If the compiler is ever fixed, you can easily remove that step.
