How to create a key binding to set XML syntax?

I'd like to switch to XML syntax in Sublime Text 2 using some key binding, for example Ctrl+Shift+X.
There is a command for that; I can successfully execute it from the console:
view.set_syntax_file("Packages/XML/XML.tmLanguage")
I tried this binding, but it doesn't work:
{ "keys": ["ctrl+shift+x"], "command": "set_syntax_file", "args" : {"syntax_file" : "Packages/XML/XML.tmLanguage" }}
The API reference for the set_syntax_file command can be found here.
Any ideas?

Try this:
{ "keys": ["ctrl+shift+x"], "command": "set_file_type", "args" : {"syntax" : "Packages/XML/XML.tmLanguage" } }

set_syntax_file is an API command, so to use it I created a simple plug-in.
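For reference, here is a minimal sketch of such a plug-in (the file name, class name, and derived command name are illustrative). Save it as something like Packages/User/set_xml_syntax.py:
import sublime_plugin

class SetXmlSyntaxCommand(sublime_plugin.TextCommand):
    # Sublime Text derives the command name "set_xml_syntax" from this class name
    def run(self, edit):
        # switch the current view to the XML syntax definition
        self.view.set_syntax_file("Packages/XML/XML.tmLanguage")
Then the key binding simply references the derived command:
{ "keys": ["ctrl+shift+x"], "command": "set_xml_syntax" }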

Related

How to write appwrite.json for Dart with a function that has execution permissions and env variables?

I need an example of appwrite.json that uses Dart as the runtime and has a function with execution permissions and env variables.
I tried the Appwrite CLI to create an env variable, but it does not show up in the appwrite.json file.
It should look something like:
{
  "$id": "123asdf",
  "runtime": "dart-2.17",
  // ... other stuff
  "execute": ["users"],
  "variables": {
    "FIRST_VAR": "value",
    "SECOND_VAR": "value"
  }
}
For more info, refer to the appwrite.json docs.

Set date as artifactory directory name in jenkins pipeline

I have a DSL script that creates a pipeline to push to JFrog Artifactory. I want to create a target directory in Artifactory with the current date as the directory name.
import java.text.SimpleDateFormat
env.buildDateString="\${new SimpleDateFormat('yyMMdd').format(new Date())}-\${env.BUILD_NUMBER}"
...
...
//artifactory step
{
  "pattern": "*abc*.zip",
  "target": "myrepo/application/\${env.buildDateString}\\n/artifacts/"
}
The above script produces the snippet below:
{
  "pattern": "*abc*.zip",
  "target": "myrepo/application/${env.buildDateString}\n/artifacts/"
}
I want the directory to be created using the date. How do I refer to buildDateString in the "target" section of the Artifactory spec so that I get output like this:
"target": "myrepo/application/220328/artifacts/"
Why the backslashes?
The backslash (\) is used to escape special characters in every Groovy string type except the dollar-slashy string, where we must use the dollar sign ($) to escape instead.
In your case, just do as below:
env.buildDateString="${new SimpleDateFormat('yyMMdd').format(new Date())}-${env.BUILD_NUMBER}"
{
  "pattern": "*abc*.zip",
  "target": "myrepo/application/${env.buildDateString}/artifacts/"
}
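Putting it together, a sketch of the resulting scripted-pipeline upload step might look like this (it assumes the Jenkins Artifactory plugin and a server ID of 'my-artifactory', both of which are illustrative):
import java.text.SimpleDateFormat

// build the date-based directory name once, then interpolate it into the upload spec
env.buildDateString = "${new SimpleDateFormat('yyMMdd').format(new Date())}-${env.BUILD_NUMBER}"

def server = Artifactory.server 'my-artifactory' // server ID is an assumption
def uploadSpec = """{
  "files": [{
    "pattern": "*abc*.zip",
    "target": "myrepo/application/${env.buildDateString}/artifacts/"
  }]
}"""
server.upload spec: uploadSpec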

AWS CDK StateMachine BatchSubmitJob with dynamic environment variables

I'm trying to create a statemachine with a BatchSubmitJob in AWS CDK with dynamic environment variables in the BatchContainerOverrides. I was thinking about something like this:
container_overrides = sfn_tasks.BatchContainerOverrides(
    environment={
        "TEST.$": "$.dynamic_from_payload"
    }
)
return sfn_tasks.BatchSubmitJob(self.scope,
    id="id",
    job_name="name",
    job_definition_arn="arn",
    job_queue_arn="arn",
    container_overrides=container_overrides,
    payload=sfn.TaskInput.from_object({
        "dynamic_from_payload.$": "$.input.some_variable"
    }))
However, upon deployment, CDK will add "Name" and "Value" to the statemachine definition, but Value is now static. This is part of the statemachine definition as seen in the console:
"Environment": [
{
"Name": "TEST.$",
"Value": "$.dynamic_from_payload"
}
]
But I need to have it like this:
"Environment": [
{
"Name": "TEST",
"Value.$": "$.dynamic_from_payload"
}
]
I also tried using "Ref::", as done here for the command parameters: AWS Step and Batch Dynamic Command. But this doesn't work either.
I also looked into escape hatches, overwriting the CloudFormation template. But I don't think that is applicable here, since the generated statemachine definition string is basically one large string.
I can think of two solutions, neither of which makes me happy: override the statemachine definition string with escape hatches, using a copy in which "Value" is replaced under certain conditions (probably with regex), OR put a lambda in the statemachine that creates and triggers the batch job, plus a lambda that polls until the job is finished.
Long story short: Does anyone have an idea of how to use dynamic environment variables with a BatchSubmitJob in CDK?
You can use the aws_cdk.aws_stepfunctions.JsonPath class:
container_overrides = sfn_tasks.BatchContainerOverrides(
    environment={
        "TEST": sfn.JsonPath.string_at("$.dynamic_from_payload")
    }
)
Solved thanks to K. Galens!
I ended up with a Pass state with intrinsic functions to format the value and the aws_cdk.aws_stepfunctions.JsonPath for the BatchSubmitJob.
So something like this:
sfn.Pass(scope,
    id="id",
    result_path="$.result",
    parameters={"dynamic_from_payload.$": "States.Format('static_sub_part/{}', $.dynamic_sub_part)"})
...
container_overrides = sfn_tasks.BatchContainerOverrides(
    environment={
        "TEST": sfn.JsonPath.string_at("$.result.dynamic_from_payload")
    }
)

How do I parse a text file that has JSON and get an array element?

I am trying to parse a text file with JSON and get one of the elements from the JSON array. Below is the JSON I am trying to parse:
[
  {
    "ContainerConfig": {
      "Labels": {
        "commit-id": "abcdef123d",
        "author": "Jon"
      }
    }
  }
]
Below is my Groovy implementation in the Jenkinsfile:
def jsonStr=readFile('temp.txt').trim()
// here temp.txt consists of the above JSON
JsonSlurper slurper = new JsonSlurper()
def parsedJson=slurper.parseText(jsonStr)
def commitId=parsedJson[0].ContainerConfig.Labels.commit-id
I am getting this error message:
java.lang.ClassCastException: org.jenkinsci.plugins.workflow.steps.EchoStep.message expects class java.lang.String but received class java.util.ArrayList
Using JsonSlurper is not really best practice and can cause problems with CPS; use readJSON instead (which is also easier to use, IMO).
I also suspect that the - in commit-id causes the error; you should use the ["commit-id"] syntax instead.
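A minimal sketch of that approach, assuming the Pipeline Utility Steps plugin (which provides the readJSON step) is installed:
// readJSON comes from the Pipeline Utility Steps plugin
def parsedJson = readJSON file: 'temp.txt'
// bracket syntax avoids Groovy parsing commit-id as "commit minus id"
def commitId = parsedJson[0].ContainerConfig.Labels['commit-id']
echo "commit id: ${commitId}"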

Jenkins: identify trigger type

Is there a way I can identify the trigger for the current build during execution? What I want is to identify whether the trigger was an SCM change, a cron trigger, or a user trigger. I have multiple triggers defined for a job and want to use the trigger type as a parameter in the shell execution script.
You can use the REST API to get this info; here's an example:
http://jenkins.yourdomain.com/job/job_name/build_number/api/json?tree=actions[causes[shortDescription]]&pretty=true
returns
{
  "actions" : [
    {
      "causes" : [
        {
          "shortDescription" : "Started by an SCM change"
        }
      ]
    },
    { },
    { },
    { },
    { },
    { },
    { }
  ]
}
One solution is to use the Run Condition Plugin which can run a different shell script depending on the trigger type. It is not the perfect solution, but it will do what you want.
You can also do it with a Groovy script. Check out my answer to Jenkins Groovy: What triggered the build.
You can get the Cause object and then check which subtype of Cause it is:
http://javadoc.jenkins-ci.org/hudson/model/Cause.html
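A minimal sketch of that idea in a pipeline script (it assumes access to currentBuild.rawBuild, which may need script approval in a sandboxed pipeline; the cause classes are the standard Jenkins ones):
def causes = currentBuild.rawBuild.getCauses()
def triggerType = 'unknown'
if (causes.any { it instanceof hudson.triggers.SCMTrigger.SCMTriggerCause }) {
    triggerType = 'scm'
} else if (causes.any { it instanceof hudson.triggers.TimerTrigger.TimerTriggerCause }) {
    triggerType = 'cron'
} else if (causes.any { it instanceof hudson.model.Cause.UserIdCause }) {
    triggerType = 'user'
}
// pass the trigger type to the shell step (build.sh is a placeholder)
sh "./build.sh ${triggerType}"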
At http(s)://(your-jenkins-server)/jenkins/job/(job-name)/(job-number), under the "Build Artifacts" and "Changes" sections (if you have them), you should see a small icon; the text next to it states what caused the build.
