A simple example illustrates my problem.
In essence, I am working on a large project that has code split across multiple repositories. In repo 1 there is an Avro schema "S1" defined in an .avdl file which gets compiled into its Avro generated class.
In repo 2, which pulls the compiled artifacts of repo 1 in as dependencies, I need to create a new schema that has multiple records and also needs to embed the "S1" schema of repo 1.
In repo 2 I have an IDL file that has all of the repo 2 schemas, but I can find no way to include the schema of repo 1 in this IDL. I cannot import it since in repo 2 I have no access to the schema file in repo 1. I do have access to the Avro generated class from repo 1, but IDL does not appear to support a way to reference that.
How can I do this? If this is impossible in IDL, how would I do this in JSON? -- Thanks!
Avro IDL's import statement first looks for the named file relative to the importing file; if that fails, it falls back to the Java classpath. So if the jar file from repo 1 includes an IDL file, an IDL file in repo 2 should be able to import it via its path within the jar.
https://github.com/apache/avro/blob/master/lang/java/compiler/src/main/javacc/org/apache/avro/compiler/idl/idl.jj#L153
This does not appear to be documented. Let me know if it works for you.
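To make that concrete, here is a sketch of what the repo 2 IDL could look like. The namespace, the path `org/example/s1.avdl` inside the jar, and the record names are all placeholders; the real path depends on where repo 1's build packages its .avdl file.

```
// repo2.avdl -- a sketch; assumes repo 1's jar packages its IDL
// at org/example/s1.avdl on the classpath (path is hypothetical)
@namespace("org.example.repo2")
protocol Repo2Protocol {
  // resolved from the classpath if not found next to this file
  import idl "org/example/s1.avdl";

  record WrapperRecord {
    string id;
    org.example.S1 embedded;  // record defined in repo 1's schema
  }
}
```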
I've been away from VS for some years and am now returning, and I'd like to make use of modern integrations, e.g., GitHub.
Given my own project, in a VC++ solution/project, how do I most effectively add single-file header libraries from public github repositories?
Is there a way to do it directly as a dependency given the url of the repo?
Do I clone it locally and add it as a dependency?
Do I clone it locally and add the header file to the solution/project?
vcpkg?
Some use of git's submodules (integrated with VS I hope...)?
Or?
I'd like it to show up in my project in such a way that I can do #include "best-json/best-json.hpp" for a file in the someuser\best-json repository on GitHub. (Or GitLab, or anywhere with a url.)
I'd like that to be the case even if, in the remote repository, the header file in question was in a subdirectory under the root called include, or maybe a subdirectory called best-json, or maybe even at top level.
By "effectively" I mean: Does the "right" thing as described above, even though not necessarily the absolute simplest way to do it.
Perhaps there's a VS extension that automates this?
[I searched the web for this but only found pages talking about how I'd export ("publish" I guess) my project to Github, or how to clone an entire project from Github, etc. etc.]
After downloading an archive through http_archive, I'd like to run a script that generates a BUILD file from the folder structure and CMake files inside it (I currently do this by hand, and it is easy enough to be scripted). I can't find anything on opening, reading, and writing files in the Starlark documentation. But since http_archive itself is loaded from a .bzl file (I haven't found the source of that file yet, though) and produces BUILD files (by unpacking them from archives), I assume it must be possible to write a wrapper around http_archive that also generates the BUILD file?
This is a perfect use case for a custom repository rule. That lets you run arbitrary commands to generate the files for the repository, along with some helpers for common operations like downloading a file over HTTP using the repository cache (if configured). A repository rule is conceptually similar to a normal rule, but with much less infrastructure, because it runs during the loading phase, when most of the Bazel infrastructure doesn't apply yet.
The Starlark implementation of http_archive is in http.bzl. The core of it is a single call to ctx.download_and_extract; your custom rule should do that too. http_archive then calls workspace_and_buildfile and patch from util.bzl, which do what they sound like. Instead of workspace_and_buildfile, you should call ctx.execute to run your command that generates the BUILD file. You could call patch as well, or skip that functionality if you're not going to use it.
The repository_ctx page in the documentation is the top-level reference for everything your repository rule's implementation function can do, if you want to extend it further.
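Putting those pieces together, a minimal sketch of such a rule might look like this. The rule name, attribute names, and the assumption that your generator script takes the repository root as its argument are all illustrative:

```
# defs.bzl -- minimal sketch of a custom repository rule; the rule
# name, attrs, and generator-script contract are placeholders
def _generated_archive_impl(ctx):
    # download and unpack the archive, like http_archive does
    ctx.download_and_extract(
        url = ctx.attr.url,
        sha256 = ctx.attr.sha256,
        stripPrefix = ctx.attr.strip_prefix,
    )
    # run the user-supplied script to produce a BUILD file from the
    # unpacked folder structure / CMake files
    result = ctx.execute([ctx.path(ctx.attr.generator), "."])
    if result.return_code != 0:
        fail("BUILD generation failed: " + result.stderr)

generated_archive = repository_rule(
    implementation = _generated_archive_impl,
    attrs = {
        "url": attr.string(mandatory = True),
        "sha256": attr.string(),
        "strip_prefix": attr.string(),
        "generator": attr.label(allow_single_file = True, mandatory = True),
    },
)
```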
When using http_archive, you can use the build_file argument to create a BUILD file. To generate it dynamically, I think you can use the patch_cmds argument to run external commands.
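For example, a WORKSPACE entry along these lines (the URL, name, and label are placeholders) supplies a checked-in BUILD file, while patch_cmds could instead run a generator script after extraction:

```
# WORKSPACE -- sketch; the URL, repository name, and labels are
# placeholders for your actual archive
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "somelib",
    url = "https://example.com/somelib-1.0.tar.gz",
    strip_prefix = "somelib-1.0",
    build_file = "//third_party:somelib.BUILD",  # checked-in BUILD file
    # alternatively, generate the file in place after extraction:
    # patch_cmds = ["./generate_build.sh > BUILD.bazel"],
)
```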
I'm developing a DSL using Xtext that produces different outputs (TypeScript and Java).
I want to put the description files into a separate project, and the generated output should go to two other projects. To know where the two output projects are located, I need some kind of configuration. Ideally, this configuration would live in a separate dedicated file, kept under version control together with the description files.
Is there a way to populate the org.eclipse.xtext.generator.OutputConfigurationProvider from a configuration file? Do you have a best practice for realizing that?
Thank you in advance,
Michael
Xtext already supports this through preferences in Eclipse. The configuration is stored in a DSL-specific .prefs file in the .settings folder of the project. So if you use this, it will work out of the box.
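For illustration, such a project-specific prefs file might look like the snippet below. The file name and key names follow the pattern Xtext uses, but treat them as an assumption: let Eclipse generate the file via the project's DSL-specific Compiler preference page, then put it under version control, rather than typing the keys by hand.

```
# .settings/org.example.mydsl.MyDsl.prefs -- file name and keys are
# illustrative; generate this file through the Eclipse preference
# page and version it, don't write it manually
BuilderConfiguration.is_project_specific=true
outlet.DEFAULT_OUTPUT.directory=./src-gen
eclipse.preferences.version=1
```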
In Eclipse, to work around the "Plugin execution not covered by lifecycle configuration..." problem, I chose to create my own lifecycle-mapping-metadata.xml rather than pollute my pom with IDE concerns.
I managed to write this file from examples, but I can't find an XSD or DTD for lifecycle-mapping-metadata.xml. Where is it?
A typical lifecycle-mapping-metadata.xml file in the eclipse/m2e-core repo comes without an XSD reference.
But in that same m2e repo there is also org.eclipse.m2e.core/mdo/lifecycle-mapping-metadata-model.xml, which helps to validate that XML file. It is used as a model by the modello-maven-plugin in the main pom.xml.
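For reference, the overall shape of the file follows the m2e lifecycle-mapping model; the plugin coordinates below are just an example, and `<ignore />` could equally be `<execute />` or another action:

```
<?xml version="1.0" encoding="UTF-8"?>
<!-- lifecycle-mapping-metadata.xml: the plugin coordinates here are
     an example; substitute the plugin/goals m2e complains about -->
<lifecycleMappingMetadata>
  <pluginExecutions>
    <pluginExecution>
      <pluginExecutionFilter>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>build-helper-maven-plugin</artifactId>
        <versionRange>[1.0,)</versionRange>
        <goals>
          <goal>add-source</goal>
        </goals>
      </pluginExecutionFilter>
      <action>
        <ignore />
      </action>
    </pluginExecution>
  </pluginExecutions>
</lifecycleMappingMetadata>
```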
I'm getting the following error when trying to build my app using Team Foundation Build:
C:\WINDOWS\Microsoft.NET\Framework\v3.5\Microsoft.Common.targets(1682,9): error MSB3554: Cannot write to the output file "obj\Release\Company.Redacted.BlahBlah.Localization.Subsystems.Startup_Shutdown_Processing.StartupShutdownProcessingMessages.de.resources". The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.
My project builds fine on my development machine as the source is only two folders deep, but TF Build seems to use a really deep directory that is causing it to break. How do I change the folders that are used?
Edit: I checked the .proj file for my build that is stored in source control and found the following:
<!-- BUILD DIRECTORY
This property is included only for backwards compatibility. The build directory used for a build
definition is now stored in the database, as the BuildDirectory property of the definition's
DefaultBuildAgent. For compatibility with V1 clients, keep this property in sync with the value
in the database.
-->
<BuildDirectoryPath>UNKNOWN</BuildDirectoryPath>
If this is stored in the database how do I change it?
Edit: I found the following blog post, which may be pointing me toward the solution. Now I just need to figure out how to change the setting in the Build Agent. http://blogs.msdn.com/jpricket/archive/2007/04/30/build-type-builddirectorypath-build-agent-working-directory.aspx
Currently my working directory is "$(Temp)\$(BuildDefinitionPath)" but now I don't know what wildcards are available to specify a different folder.
You need to edit the build working directory of your Build Agent so that the beginning of the path is a little shorter. To edit the build agent, right-click the "Builds" node and select "Manage Build Agents..."
I personally use something like c:\bw\$(BuildDefinitionId). $(BuildDefinitionId) translates into the id of the build definition (hence the name :-) ), which means you get a build path starting with something like c:\bw\36 rather than c:\Documents and Settings\tfsbuild\Local Settings\Temp\BuildDefinitionName
Good luck,
Martin.
You have to check out the build script file from Source Control Explorer and get your elbows dirty replacing the path.