setting sdk_container_image in flex template - google-cloud-dataflow

In order to provide hermetic builds and runtimes, we currently build custom flex template and worker images. As part of deployment, we build the flex template and can specify the custom flex template image, but not the custom worker image - we have to pass that to Dataflow separately when invoking the flex template. This creates hidden dependencies that we need to track elsewhere as part of the release process (and it seems to go against the flex template design of being self-contained). Is there a way to bake at least a default for the worker image (sdk_container_image) into the template?

In general, if you want to have a default for a template parameter, you can just use the @Default annotation on it and change the default value for every template version you build. In this particular case, though, the sdkContainerImage parameter is declared in DataflowPipelineOptions in the Java SDK (or WorkerOptions in the Python SDK), so you can't control its default from user code; instead, I'd try to set the parameter value programmatically in the template code.
I think something like this should work, but I haven't tested it:
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

DataflowPipelineOptions options =
    PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
if (options.getSdkContainerImage() == null || options.getSdkContainerImage().isEmpty()) {
  // Set the default if not already set by the template runner.
  options.setSdkContainerImage("...");
}
Pipeline pipeline = Pipeline.create(options);
// ...
This is for Java, but you can do a similar thing with Python SDK.
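For options you declare yourself (unlike the SDK-owned sdkContainerImage), the @Default approach mentioned above is enough. A minimal, untested sketch; the interface, option name, and image path below are illustrative only:
import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;

public interface MyTemplateOptions extends PipelineOptions {
  // Hypothetical option; the default gets baked into every template built from this code.
  @Description("Custom container image used by this template")
  @Default.String("gcr.io/my-project/my-image:latest")
  String getCustomImage();
  void setCustomImage(String value);
}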

Related

Can I define new mustache template variables in swagger-codegen?

I have developed a REST API client (in Java) customised to the needs of my product. I want to generate tests for my REST API client using swagger-codegen modules, based on a YAML file.
I have already extended DefaultCodegenConfig and even tried implementing the CodegenConfig interface to build my custom jar. I have customized the api.mustache and api_test.mustache files and pass them in the constructor and processOpts() method of my CustomCodeGen class that extends DefaultCodegenConfig.
However, I want to use the custom/new mustache template variables that I have added in my customised api.mustache.
For example, if I refer to the standard api.mustache, the template variables it typically uses are:
- {{classname}}
- {{#operation}}
- {{#contents}}
- {{#parameters}}
etc.
Now I want to introduce a new template variable, say {{custom_param}}, but I am not clear how to integrate this new variable with the implementation.
From the Mustache-Template-Variables list published here, it looks like swagger-codegen does not allow adding new template variables, and perhaps we are restricted to only the variables mentioned on that page.
So, is there some way to make new template variables work?
Some time ago I added the uniqueItems parameter for bean validation as it was not getting processed by the engine even though it was a part of the implemented JSR.
So I believe the codebase needs to be updated to use your own variable, which is only possible if you fork the code.
In case it helps, these two were the PRs:
For query parameters: https://github.com/swagger-api/swagger-codegen/pull/10154.
For body parameters: https://github.com/swagger-api/swagger-codegen/pull/10490.
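If the new variable is global (the same value everywhere in api.mustache), one thing that may be worth trying from the existing CustomCodeGen class before forking is to put the value into additionalProperties, which swagger-codegen exposes to the templates. A rough, untested sketch, assuming the DefaultCodegenConfig base class the question already extends; the variable name and value are illustrative:
// Inside the existing CustomCodeGen class that extends DefaultCodegenConfig:
@Override
public void processOpts() {
    super.processOpts();
    // Makes {{custom_param}} available to the mustache templates as a global value.
    additionalProperties.put("custom_param", "some-value");
}
Per-parameter or per-operation variables, like the uniqueItems case above, still require changing the generator code itself, as in the linked PRs.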

Dataflow: Consuming runtime parameters in template

I am trying to create a template for a Dataflow job.
Is there any way to generate a template with runtime parameters?
So far the template only uses the parameter values that were supplied at the time the template was created; when I try passing different values for the variables, it does not pick up the runtime values.
If any additional details are needed, I will provide them.
You can use value providers in your pipeline options to have runtime arguments in a pipeline.
But I'm afraid this is limited in terms of where you can use these parameters (mostly inside a DoFn).
This behaviour is expected from a Dataflow template, as it is a representation of a pipeline rather than the code itself.
Please bear in mind that you cannot create a Dataflow template with dynamic processing steps based on the values passed.
The steps are hard-coded into the template and cannot be changed unless the code to generate the template is executed again.
A parameter needs to be wrapped inside of a ValueProvider object in order for the template pipeline to access the runtime value of that parameter. All of the example templates provided here demonstrate how ValueProvider can be used to parameterize a template pipeline.
Take a look at the WordCount pipeline as an example.
As you can see, the pipeline uses a ValueProvider (instead of a simple String) to read the path of the file on which a WordCount needs to be executed:
@Description("Path of the file to read from")
ValueProvider<String> getInputFile();
void setInputFile(ValueProvider<String> value);
Since the value of inputFile is unknown until runtime (when the template is actually executed with valid inputs), the transform using the ValueProvider will defer reading the value of the parameter until runtime (e.g. inside a DoFn).
The native TextIO.Read Beam transform provides support for reading from a ValueProvider in addition to reading from String.
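To make the deferred read concrete, here is a minimal, untested sketch of wiring such a ValueProvider-backed option into TextIO; the class and option names just mirror the snippet above and are illustrative:
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.options.ValueProvider;

public class TemplateReadExample {
  public interface ReadOptions extends PipelineOptions {
    @Description("Path of the file to read from")
    ValueProvider<String> getInputFile();
    void setInputFile(ValueProvider<String> value);
  }

  public static void main(String[] args) {
    ReadOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(ReadOptions.class);
    Pipeline p = Pipeline.create(options);
    // TextIO accepts the ValueProvider directly, so the actual file path is only
    // resolved when the template is executed, not when the template is created.
    p.apply("ReadLines", TextIO.read().from(options.getInputFile()));
    p.run();
  }
}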

TFS Build Template Customization (MSBuildArguments)

I have to customize the TFS DefaultTemplate.xaml with the MSBuildArguments parameter.
When we create a new XAML build definition and select the default template, I want the MSBuildArguments field in the Advanced settings to default to /p:DebugType=pdbOnly;Configuration=Release once the template is loaded.
As of now we have to enter these arguments manually whenever we create a new build definition; I want to customize the template so that is no longer necessary.
Specifying MSBuild arguments when creating a build definition or queuing a build is more convenient.
However, if you want to bind the arguments to the XAML build template, then you need to customize the template.
In your scenario, you can simply replace the value of CommandLineArguments with the string below. (Note that the arguments you add here will be hidden: you cannot see them in the build definition, but they are set as the default arguments.)
String.Format("/p:SkipInvalidConfigurations=true {0}", "/p:DebugType=pdbOnly;Configuration=Release").
But at the same time, the MSBuildArguments you specify in the build definition will not be available any more.
You can reference the articles below on customizing the build template to replace the MSBuildArguments:
Pass Relative Path Arguments to MSBuild in TFS2010 Team Build
Properly incorporate MsBuild arguments into your build process

Can a workflow step access environment variables provided by an EnvironmentContributingAction?

A custom plugin we wrote for an older version of Jenkins uses an EnvironmentContributingAction to provide environment variables to the execution so they could be used in future build steps and passed as parameters to downstream jobs.
While attempting to convert our build to workflow, I'm having trouble accessing these variables:
node {
// this step queries an API and puts the results in
// environment variables called FE1|BE1_INTERNAL_ADDRESS
step([$class: 'SomeClass', parameter: foo])
// this ends up echoing 'null and null'
echo "${env.FE1_INTERNAL_ADDRESS} and ${env.BE1_INTERNAL_ADDRESS}"
}
Is there a way to access the environment variable that was injected? Do I have to convert this functionality to a build wrapper instead?
EnvironmentContributingAction is currently limited to AbstractBuilds, which WorkflowRuns are not, so pending JENKINS-29537 which I just filed, your plugin would need to be modified somehow. Options include:
Have the builder add a plain Action instead, then register an EnvironmentContributor whose buildEnvironmentFor(Run, …) checks for its presence using Run.getAction(Class) (see the sketch after this list).
Switch to a SimpleBuildWrapper which defines the environment variables within a scope, then invoke it from Workflow using the wrap step.
Depend on workflow-step-api and define a custom Workflow Step with comparable functionality but directly returning a List<String> or whatever makes sense in your context. (code sample)
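For the first option, here is a rough, untested sketch of what the contributor could look like; the Action class and its fields are hypothetical stand-ins for whatever your builder currently publishes:
import hudson.EnvVars;
import hudson.Extension;
import hudson.model.EnvironmentContributor;
import hudson.model.InvisibleAction;
import hudson.model.Run;
import hudson.model.TaskListener;

// Hypothetical plain Action the builder attaches to the run instead of an
// EnvironmentContributingAction.
class AddressesAction extends InvisibleAction {
    private final String frontend;
    private final String backend;

    AddressesAction(String frontend, String backend) {
        this.frontend = frontend;
        this.backend = backend;
    }

    String getFrontend() { return frontend; }
    String getBackend() { return backend; }
}

// Looks the Action up on the Run and exposes its values as environment variables,
// so env.FE1_INTERNAL_ADDRESS and env.BE1_INTERNAL_ADDRESS resolve in Workflow.
@Extension
public class AddressesEnvironmentContributor extends EnvironmentContributor {
    @Override
    public void buildEnvironmentFor(Run r, EnvVars envs, TaskListener listener) {
        AddressesAction action = r.getAction(AddressesAction.class);
        if (action != null) {
            envs.put("FE1_INTERNAL_ADDRESS", action.getFrontend());
            envs.put("BE1_INTERNAL_ADDRESS", action.getBackend());
        }
    }
}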
Since PR-2975 was merged, you are able to use the new interface:
void buildEnvVars(@Nonnull Run<?, ?> run, @Nonnull EnvVars env, @CheckForNull Node node)
It will be used by the old types of builds as well.

How to store last value of parameter in parameterized job as a default value for next build in Jenkins?

I have been using Jenkins for a few weeks and I have one small problem. I can't find any plugin or solution for storing the last value of a parameter in a parametrized job as a default value for the next build.
For example:
My parameter takes a build version (1.0.0.01) in the first build. In the next build it will be changed to 1.0.0.02, but I want to have 1.0.0.01 in the default value field as a hint.
Does anybody have a solution or advice?
The Persistent Parameter Plugin is exactly what you are looking for!
You just need to download it from the official Jenkins repository and install it, no need for any additional setup.
Then on your job, you just need to add a "Persistent Parameter" in order to have default values used and saved between builds.
You can add a System Groovy build step to your job (or maybe a post-build Groovy step) that uses the Jenkins API to directly modify the project, setting the default parameter value.
Here is some code that may be useful to get you started:
import hudson.model.*

// Find the parameter definitions on the project that owns this build and
// update the default value of the parameter named 'FOO'.
paramsDef = build.getParent().getProperty(ParametersDefinitionProperty.class)
if (paramsDef) {
  paramsDef.parameterDefinitions.each { param ->
    if (param.name == 'FOO') {
      println("Changing parameter ${param.name} default value from '${param.defaultValue}' to '${param.defaultValue} BAR'")
      param.defaultValue = "${param.defaultValue} BAR"
    }
  }
}
Have a look at the class ParameterDefinition in the Jenkins model.
You probably need to modify the default param value based on the current build executing. Some code to get that would look like this:
def thisBuildParamValue = build.buildVariableResolver.resolve('FOO')
The Extended Choice Parameter plugin provides this capability by using default parameter values from a properties file. A default parameter value can be selected from a specified property key, and this key can be programmatically modified in your current build. I would then use a Groovy script in the current build to set the value of the default property key for the next build.
As an example you would have an Extended Choice Parameter whose default value is defined by a properties file version.properties with keys as follows:
versions=1.0.0.02, 1.0.0.01, 1.0.0.00
default.version=1.0.0.02
The parameter definition would include:
Property File=version.properties
Property Key=versions
Default Property File=version.properties
Default Property Key=default.version
The GUI for your parameter in the next build would show a selection list with 1.0.0.02 selected by default. This feature is also very useful for pipeline builds where you would want the parameters of a downstream build stage to be set by an earlier build.
The only drawback to this approach might be that the parameter UI will be a drop-down selection. You may opt to have a single value in the versions property key so as not to confuse your users.
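As a concrete illustration of programmatically modifying that key, a small, untested utility along these lines could be run by the current build to rewrite the default for the next one; the file name and key come from the example above, and how you invoke it is up to your job:
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.Properties;

// Rewrites default.version in version.properties so the next build's parameter
// defaults to the value produced by this build.
public class UpdateDefaultVersion {
  public static void main(String[] args) throws IOException {
    String newVersion = args[0]; // e.g. "1.0.0.02"
    Properties props = new Properties();
    try (FileInputStream in = new FileInputStream("version.properties")) {
      props.load(in);
    }
    props.setProperty("default.version", newVersion);
    try (FileOutputStream out = new FileOutputStream("version.properties")) {
      props.store(out, "updated by the current build for the next build's default");
    }
  }
}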
Similar to thiagolr's answer, but for those of you using pipelines! It appears the persistent-parameter-plugin doesn't work for those using pipeline 2.0. But there is a patched version at https://github.com/ashu16815/persistent-parameter-plugin which seems to work for me.
Clone it locally:
git clone https://github.com/ashu16815/persistent-parameter-plugin.git
Build it:
mvn clean install
Install it in Jenkins:
1) Navigate to Jenkins > Manage Jenkins > Manage Plugins
2) Click Advanced tab
3) Scroll down to Upload Plugin
4) Click Choose file and select the persistent-parameter.hpi in the target directory of the maven build above
Now it should persist.
