Parse file in Xtext MWE2 workflow

I would like to parse a file within an MWE2 workflow, e.g. by giving the org.eclipse.emf.mwe.utils.Reader component a file written in my DSL rather than the XMI representation of it.

Alternatively, have a look at org.eclipse.xtext.mwe.UriBasedReader.

I have found the solution at http://www.eclipse.org/forums/index.php/m/831365/
Workflow {
    component = org.eclipse.xtext.mwe.Reader {
        register = org.xtext.example.mydsl.MyDslStandaloneSetup {}
        path = "modeldir"
        loadResource = {
            slot = "models"
        }
    }
}
Adjusted to Christian's answer, when using a single file it can be written as:
Workflow {
    component = org.eclipse.xtext.mwe.UriBasedReader {
        register = org.xtext.example.mydsl.MyDslStandaloneSetup {}
        uri = "model.file"
        loadResource = {
            slot = "model"
        }
    }
}
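For completeness: a later component in the same workflow can consume the slot the reader fills. A minimal sketch, assuming the classic org.eclipse.xtext.generator.GeneratorComponent is available (newer Xtext versions may prefer GeneratorComponent2):
component = org.eclipse.xtext.generator.GeneratorComponent {
    register = org.xtext.example.mydsl.MyDslStandaloneSetup {}
    // read the models loaded into the "model" slot by the reader above
    slot = "model"
    outlet = {
        path = "src-gen"
    }
}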


What is the simplest way to get GCR image digest through terraform?

The Terraform GCR provider has a data source called google_container_registry_image which has a digest attribute, but it will be null, since the documentation states that the data source works completely offline.
data "google_container_registry_image" "image" {
project = "foo"
name = "bar"
tag = "baz"
}
output "digest" {
value = data.google_container_registry_image.image.digest // this is null
}
The current workaround I use relies on the docker provider and looks like this:
terraform {
  required_providers {
    docker = {
      source  = "kreuzwerker/docker"
      version = "2.11.0"
    }
  }
}

provider "docker" {
  registry_auth {
    address     = "gcr.io"
    config_file = pathexpand("~/.docker/config.json")
  }
}

data "google_container_registry_image" "image" {
  project = "foo"
  name    = "bar"
  tag     = "baz"
}

data "docker_registry_image" "image" {
  name = data.google_container_registry_image.image.image_url
}

output "digest" {
  value = data.docker_registry_image.image.sha256_digest
}
Using two providers and additional docker credentials seems pretty complicated for such a simple use case. Is there an easier way to do it?

How can I override a package source in Nix?

So I want to replace pkgs.picom in my home-manager config with a newer fork. How can I do that?
I have a feeling it's something like:
let newPicom = pkgs.picom.override.src.url = "https://github.com/ibhagwan/picom";
in
services.picom.package = newPicom;
But knowing Nix, it's probably actually some really long incantation with self: super: and so on.
nixos.wiki has an example of overriding the source of a package.
You do need to provide a reproducible source. A GitHub repo URL is mutable, so you need to pin the revision.
{ pkgs, ... }:
let
  newPicom = pkgs.picom.overrideAttrs (old: {
    version = "git"; # usually harmless to omit
    src = /* put your source here; typically a local path or
             a fixed-output derivation produced by `fetchFromGitHub`.
             builtins.fetchGit is also an option. It doesn't run
             in parallel but does fetch private sources. */;
  });
in {
  services.picom.package = newPicom;
}
Overlays
let
  picom_overlay = (self: super: {
    picom = super.picom.overrideAttrs (prev: {
      version = "git";
      src = pkgs.fetchFromGitHub {
        owner = "yshui";
        repo = "picom";
        rev = "31e58712ec11b198340ae217d33a73d8ac73b7fe";
        sha256 = pkgs.lib.fakeSha256;
      };
    });
  });
in
  nixpkgs.overlays = [ picom_overlay ];
Of course, sha256 should be replaced with the relevant hash shown in the output error after building -- in this case:
sha256 = "sha256-VBnIzisg/7Xetd/AWVHlnaWXlxX+wqeYTpstO6+T5cE=";
picom-next
Note that there is also a picom-next package so one can alternatively do:
let
  picom_overlay = (self: super: {
    picom = super.picom.overrideAttrs (oldAttrs: rec {
      inherit (super.picom-next) pname version src;
    });
  });
in
  nixpkgs.overlays = [ picom_overlay ];
Or, more simply, with @RobertHensing's suggestion:
services.picom.package = pkgs.picom-next;

How to exclude fields from Swagger gRPC generated code

I'm trying to generate Swagger JSON files using https://github.com/pseudomuto/protoc-gen-doc, but I can't find a way to exclude some of the APIs of the gRPC service or some of the fields inside the messages.
I found the relevant styling in Swagger, but can't seem to find a way to add it in the protobuf file: http://watson-developer-cloud.github.io/api-guidelines/swagger-coding-style.html#excluding-operations-from-the-sdks
service MyService {
  rpc ExternalApi (ExternalApiRequest) returns (ExternalApiResponse) {
    option (google.api.http) = {
      post: "/my/externalApi"
    };
  }
  rpc InternalApi (InternalApiRequest) returns (InternalApiResponse) {
    option (google.api.http) = {
      post: "/my/internalApi"
    };
  }
}

message ExternalApiResponse {
  string prefix = 1;
  string id = 2; // field to exclude
}

// message to exclude
message Header { }
Is there a way to exclude actions/fields from the protocol buffer files?
You can add
string id = 2 [(grpc.gateway.protoc_gen_swagger.options.openapiv2_field).read_only = true];
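In context that would look roughly like the sketch below (assuming the grpc-gateway v1 protoc-gen-swagger option protos are on the proto include path):
import "protoc-gen-swagger/options/annotations.proto";

message ExternalApiResponse {
  string prefix = 1;
  // read_only marks the field as readOnly in the generated Swagger,
  // so it is omitted from request payloads
  string id = 2 [(grpc.gateway.protoc_gen_swagger.options.openapiv2_field).read_only = true];
}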

Xtext cross-referencing no longer working?

I have used Xtext for many years, and have always been able to cross-reference from one grammar to another grammar. But today, on Eclipse Photon, the usual method no longer works.
In the same workspace, I create two Xtext projects using default options: org.xtext.example.adsl.ADsl,
grammar org.xtext.example.adsl.ADsl with org.eclipse.xtext.common.Terminals

generate aDsl "http://www.xtext.org/example/adsl/ADsl"

AModel:
    agreetings+=AGreeting*;

AGreeting:
    'AHello' name=ID '!';
and org.xtext.example.bdsl.BDsl,
grammar org.xtext.example.bdsl.BDsl with org.eclipse.xtext.common.Terminals

generate bDsl "http://www.xtext.org/example/bdsl/BDsl"

//import "http://www.xtext.org/example/adsl/ADsl" as adsl

ModelB:
    bgreetings+=BGreeting*;

BGreeting:
    'BHello' name=ID '!';
where BDsl would like to import ADsl via the commented-out import statement import "http://www.xtext.org/example/adsl/ADsl" as adsl for use in cross-referencing.
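For illustration, a hypothetical cross-referencing rule in BDsl could then look like this (the projects above deliberately contain no such reference yet):
BGreeting:
    'BHello' name=ID 'to' greeted=[adsl::AGreeting] '!';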
In the past, before uncommenting that import, I would have to add a resource reference referencedResource = "../org.xtext.example.adsl/model/generated/ADsl.genmodel" to GenerateBDsl.mwe2.
module org.xtext.example.bdsl.GenerateBDsl

import org.eclipse.xtext.xtext.generator.*
import org.eclipse.xtext.xtext.generator.model.project.*

var rootPath = ".."

Workflow {
    component = XtextGenerator {
        configuration = {
            project = StandardProjectConfig {
                baseName = "org.xtext.example.bdsl"
                rootPath = rootPath
                runtimeTest = {
                    enabled = true
                }
                eclipsePlugin = {
                    enabled = true
                }
                eclipsePluginTest = {
                    enabled = true
                }
                createEclipseMetaData = true
            }
            code = {
                encoding = "UTF-8"
                lineDelimiter = "\n"
                fileHeader = "/*\n * generated by Xtext \${version}\n */"
            }
        }
        language = StandardLanguage {
            name = "org.xtext.example.bdsl.BDsl"
            referencedResource = "../org.xtext.example.adsl/model/generated/ADsl.genmodel"
            fileExtensions = "bdsl"
            serializer = {
                generateStub = false
            }
            validator = {
                // composedCheck = "org.eclipse.xtext.validation.NamesAreUniqueValidator"
            }
        }
    }
}
But when I generate the Xtext artifacts for BDsl, I now get the following error (with the import still commented out).
434 [main] ERROR xt.generator.XtextGeneratorLanguage - Error loading 'ADsl.ecore'
The genmodel is certainly being found, since a completely different error is generated if the file cannot be found.
What is going on?
Am I making some stupid error?
Is this related to this bug? If so, is there a work around?
[... I am aware that the example contains no actual cross-references. I have purposely induced the error in the simplest possible manner. ...]
Make sure you refer to the referenced genmodel in a way that it can actually be resolved. The usual reference looks like platform:/resource/project/model/Some.genmodel, so in your case: referencedResource = "platform:/resource/org.xtext.example.adsl/model/generated/ADsl.genmodel"
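In the GenerateBDsl.mwe2 shown above, that line replaces the relative path in the StandardLanguage block:
language = StandardLanguage {
    name = "org.xtext.example.bdsl.BDsl"
    referencedResource = "platform:/resource/org.xtext.example.adsl/model/generated/ADsl.genmodel"
    fileExtensions = "bdsl"
    // ...
}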

Groovy script - Logback configuration unexpected behaviour

I want to use Logback as my logging framework within Grails. I have therefore set up everything needed to make it work, but my implementation fails on the configuration file itself. The reason is, I guess, somewhere within the scoping of the Groovy script, but I'm not able to figure it out...
If I define the String properties that I want to use later without any modifier, I get a warning that they may not be accessible. For example:
LOG_DIR = 'c:/temp/myproject/logs/'
BACKUP_DIR = LOG_DIR + 'backup/'

appender('F_MAIN', RollingFileAppender) {
    file = LOG_DIR + 'test.log'
    rollingPolicy(FixedWindowRollingPolicy) {
        fileNamePattern = BACKUP_DIR + 'test.%d{yyyy-MM-dd}.%i.log.zip'
        // .... and so on
    }
}
I get the following error messages from Logback, which I'm pretty sure indicate that both LOG_DIR and BACKUP_DIR cannot be reached:
13:33:32,036 |-ERROR in ch.qos.logback.classic.gaffer.AppenderDelegate#6fd00b - Appender [F_MAIN] of type [ch.qos.logback.core.rolling.RollingFileAppender] has no appplicable [LOG_DIR] property
13:33:32,068 |-ERROR in ch.qos.logback.classic.gaffer.ComponentDelegate#788ac3 - Component of type [ch.qos.logback.core.rolling.FixedWindowRollingPolicy] has no appplicable [BACKUP_DIR] property
I also tried the following approach, declaring both variables with the @Field annotation, but it still does not work:
import groovy.transform.Field

@Field String LOG_DIR = 'c:/temp/myproject/logs/'
@Field String BACKUP_DIR = LOG_DIR + 'backup/'

appender('F_MAIN', RollingFileAppender) {
    file = LOG_DIR + 'test.log'
    rollingPolicy(FixedWindowRollingPolicy) {
        fileNamePattern = BACKUP_DIR + 'test.%d{yyyy-MM-dd}.%i.log.zip'
        // .... and so on
    }
}
What am I doing wrong here?
Oh my! After a lot of searching and trial and error I found the solution, and it was so close that it seems obvious now: I had to declare both variables with def, so that they are visible throughout the whole script ;)
For example, this is working code:
def LOG_DIR = 'c:/temp/myproject/logs/'
def BACKUP_DIR = LOG_DIR + 'backup/'

appender('F_MAIN', RollingFileAppender) {
    file = LOG_DIR + 'test.log'
    rollingPolicy(FixedWindowRollingPolicy) {
        fileNamePattern = BACKUP_DIR + 'test.%d{yyyy-MM-dd}.%i.log.zip'
        // .... and so on
    }
}
Now I'm also able to use a function like this within my script:
def createFilename(String directory, String name, boolean isBackupFile) {
    String filename = ''
    if (isBackupFile) {
        filename = "${directory}backup/MyProject-${name}.%d{yyyy-MM-dd}.%i.log.zip"
    } else {
        filename = "${directory}MyProject-${name}.log"
    }
    return filename
}
def fileAppenderLog = createFilename(LOG_DIR, 'output', false)
def fileAppenderLogBackup = createFilename(LOG_DIR, 'output', true)
appender('F_MAIN', RollingFileAppender) {
    file = fileAppenderLog
    rollingPolicy(FixedWindowRollingPolicy) {
        fileNamePattern = fileAppenderLogBackup
        // .... and so on
    }
}
This is pretty useful, I think :), especially if you want to declare a bunch of different log files, or even temporary log files that are created when Logback rescans this configuration file.
