Create an Artifactory directory with the current date - Jenkins

How do I create an Artifactory directory with the current date in Jenkins? That is, the target path should contain a directory whose name is the current date.
rtUpload (
    serverId: 'Artifactory-1',
    spec: '''{
        "files": [
            {
                "pattern": "bazinga/*froggy*.zip",
                "target": "bazinga-repo/froggy-files/<CurrentDate>"
            }
        ]
    }'''
)
Every time the pipeline is triggered, the target path should get a directory named for that particular date. For example, if the pipeline runs on 2022-03-29, then:
"target": "bazinga-repo/froggy-files/220329/"

This is one option:
def current_date = new java.text.SimpleDateFormat('yyMMdd').format(new Date())
pipeline {
    // do stuff, generate files...
    rtUpload (
        serverId: 'Artifactory-1',
        spec: """{
            "files": [
                {
                    "pattern": "bazinga/*froggy*.zip",
                    "target": "bazinga-repo/froggy-files/${current_date}"
                }
            ]
        }"""
    )
    // more stuff
} // end pipeline
This makes use of Groovy string interpolation and a scripted-pipeline variable defined outside the main pipeline block.
When the pipeline runs, current_date is assigned first; by the time execution reaches the rtUpload call, the spec parameter has already been evaluated, and because it uses triple-double-quote notation, the ${current_date} part is replaced by the value of the Groovy variable before the string is passed to the step.
This approach does not rely on rtUpload spawning a shell and evaluating the shell environment in order to provide the date value to the spec definition.
Groovy strings
https://groovy-lang.org/syntax.html#all-strings
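The distinction between interpolating and non-interpolating Groovy strings that the answer relies on can be sketched like this:

```groovy
import java.text.SimpleDateFormat

def current_date = new SimpleDateFormat('yyMMdd').format(new Date())

// Double- and triple-double-quoted strings are GStrings: ${...} interpolates.
def interpolated = """target: bazinga-repo/froggy-files/${current_date}"""
// Single- and triple-single-quoted strings are plain: ${...} passes through verbatim.
def literal = '''target: bazinga-repo/froggy-files/${current_date}'''

assert interpolated == 'target: bazinga-repo/froggy-files/' + current_date
assert literal.contains('${current_date}')
```

This is why the answer uses `"""..."""` for the spec: interpolation happens in Groovy, before the plugin ever sees the string.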

You can also do it like this. Define an environment section:
environment {
    CURRENT_DATE = new java.text.SimpleDateFormat('yyMMdd').format(new Date())
}
rtUpload (
    serverId: 'Artifactory-1',
    spec: '''{
        "files": [
            {
                "pattern": "bazinga/*froggy*.zip",
                "target": "bazinga-repo/froggy-files/$CURRENT_DATE"
            }
        ]
    }'''
)
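Putting the environment-variable approach together, a minimal declarative pipeline might look like the sketch below. It assumes the Artifactory plugin resolves environment variables such as $CURRENT_DATE inside a single-quoted file spec, which is what the snippet above relies on:

```groovy
pipeline {
    agent any
    environment {
        // yyMMdd, e.g. 220329 for a run on 2022-03-29
        CURRENT_DATE = new java.text.SimpleDateFormat('yyMMdd').format(new Date())
    }
    stages {
        stage('Upload') {
            steps {
                rtUpload (
                    serverId: 'Artifactory-1',
                    spec: '''{
                        "files": [
                            {
                                "pattern": "bazinga/*froggy*.zip",
                                "target": "bazinga-repo/froggy-files/${CURRENT_DATE}/"
                            }
                        ]
                    }'''
                )
            }
        }
    }
}
```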

Related

How do you extract the Artifactory path from the spec to use it later in the pipeline script?

I have the code below, which gets me to the required file in Artifactory using a wildcard pattern. But I want to print out the complete path at a later stage of the pipeline script execution. Can somebody tell me how I can access the complete path that Artifactory is getting the file from?
spec: """{
    "files": [
        {
            "pattern": "${ARTIFACTORY_PATH}/*/some_file.txt",
            "target": "./",
            "flat": "true",
            "sortBy": ["created"],
            "sortOrder": "desc",
            "limit": 1
        }
    ]
}"""

Groovy Script - Active Choices Jenkins Plugin - variable reference from inside "script:" section

I'm using the Active Choices Jenkins Plugin to build a pipeline.
I've defined my parameters:
[$class: 'DynamicReferenceParameter',
    choiceType: 'ET_FORMATTED_HTML',
    description: '_description',
    name: '_name',
    referencedParameters: '**customer**',
    script:
        [$class: 'GroovyScript',
            script: [
                sandbox: true,
                script: **MUST INSERT SCRIPT**
            ]
        ]
]
Now I have a pretty big script to be processed, which needs to evaluate:
A static variable defined before execution:
def customer_dictionary = [
"customer1": ["a", "b", "c"],
"customer2": ["a", "b", "c", "d"],
]
A runtime variable passed through referencedParameters ("customer"):
customer.each { service ->
    html_to_be_rendered += "<tr><td>"
    configuration = configuration_dictionary[service]
}
I have to pass a string to "script:". But even if I use string interpolation ( """ script """ ), I can't get it to resolve the reference ( configuration_dictionary[service] ).
Can someone help me?
Thanks
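One approach worth trying (a sketch, not verified against the Active Choices plugin): build the script string as a GString so that definition-time values such as the dictionary are interpolated now, and escape the dollar signs that must survive into the generated script so the plugin resolves them at runtime:

```groovy
def customer_dictionary = [
    "customer1": ["a", "b", "c"],
    "customer2": ["a", "b", "c", "d"],
]

// customer_dictionary.inspect() emits a Groovy map literal, so the
// dictionary is baked into the script text at definition time.
// \${...} leaves a literal ${...} in the generated script, to be
// resolved when the Active Choices script actually runs.
def script_text = """
def configuration_dictionary = ${customer_dictionary.inspect()}
def html_to_be_rendered = '<table>'
customer.split(',').each { service ->
    def configuration = configuration_dictionary[service]
    html_to_be_rendered += "<tr><td>\${configuration}</td></tr>"
}
return html_to_be_rendered + '</table>'
"""
```

The generated script then only references names (like the customer referenced parameter) that exist in the plugin's runtime scope.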

How do I download an Artifactory artifact that contains parenthesis in the name?

Using a Jenkins declarative pipeline and an Artifactory file spec, how do I download an Artifactory artifact whose name contains parentheses? Is there a way to escape the parentheses?
For example, I have two artifacts in my Artifactory repository:
default-generic-local/one/two/aaabbbccc(1234).txt
default-generic-local/one/two/aaabbbccc1234.txt
When I run the pipeline defined below, it downloads aaabbbccc1234.txt. I would expect it to download aaabbbccc(1234).txt instead.
Here's an example of the pipeline script and file spec I'm using with my pipeline job:
pipeline {
    agent any
    stages {
        stage('Download') {
            steps {
                rtServer(
                    id: 'my-art-server',
                    url: 'https://my.artifactory.url',
                    credentialsId: 'my-artifactory-creds')
                rtDownload(
                    serverId: 'my-art-server',
                    spec: '''
                    {
                        "files": [
                            {
                                "pattern": "default-generic-local/one/two/aaabbbccc(1234).txt",
                                "target": "output/",
                                "flat": "true"
                            }
                        ]
                    }''',
                    failNoOp: true)
            }
        }
    }
    post {
        always {
            cleanWs()
        }
    }
}
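One possible workaround, not from the original post and so only an assumption: in Artifactory file specs, parentheses in a pattern are treated as placeholder capture groups, which would explain why the parenthesized name is not matched literally. An "aql"-based spec matches the name field literally, so a sketch like this might download the right file:

```groovy
rtDownload(
    serverId: 'my-art-server',
    // AQL matches repo/path/name fields literally, so the parentheses
    // in the artifact name are not interpreted as placeholders.
    spec: '''
    {
        "files": [
            {
                "aql": {
                    "items.find": {
                        "repo": "default-generic-local",
                        "path": "one/two",
                        "name": "aaabbbccc(1234).txt"
                    }
                },
                "target": "output/",
                "flat": "true"
            }
        ]
    }''',
    failNoOp: true)
```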

How to properly use target of artifactory rtDownload in jenkins declarative pipeline

This snippet
stage('get iter number') {
    steps {
        rtDownload (
            serverId: 'MAIN-ARTIFACTORY',
            spec: '''{ "files": [{"pattern": "p1/p2/p3/${BUILD_ID}/n_iter.txt", "target": "./n_iter.txt"}] }'''
        )
    }
}
where BUILD_ID = 'a/b'
downloads the file to $WORKSPACE/p2/p3/a/b/n_iter.txt rather than the expected $WORKSPACE/n_iter.txt.
Also, very strange - why is p1 not in the downloaded path?
By default, artifacts are downloaded to the target path in the file system while maintaining their hierarchy in the source repository (not including the repository name - hence p1 is missing in your example).
To download an artifact while ignoring the hierarchy, set "flat": "true" in your file spec.
For a more advanced control of the resulting hierarchy, you may want to use Placeholders.
See more information in the File Specs documentation.
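Applied to the snippet above, a spec with "flat": "true" would drop the source hierarchy and place the file directly in the workspace (a sketch based on the answer's advice):

```groovy
rtDownload (
    serverId: 'MAIN-ARTIFACTORY',
    // "flat": "true" ignores the repository hierarchy, so the file lands
    // at $WORKSPACE/n_iter.txt instead of $WORKSPACE/p2/p3/a/b/n_iter.txt
    spec: '''{ "files": [{"pattern": "p1/p2/p3/${BUILD_ID}/n_iter.txt", "target": "./", "flat": "true"}] }'''
)
```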
Please try the snippet below, which downloads all the files under the com/my-files/ Artifactory repository path into the my-folder directory on the Jenkins agent file system. For more details, please refer to our Declarative Pipeline Syntax wiki page here.
rtDownload (
    serverId: 'Artifactory-1',
    spec: '''{
        "files": [
            {
                "pattern": "com/my-files/",
                "target": "my-folder/"
            }
        ]
    }'''
)
In addition to the above, you can refer to the rtDownload example on our GitHub page here.

How to send referencedParameters value in Jenkins pipeline for CascadeChoiceParameter to ScriptlerScript

I'm declaring 2 parameters in properties section of a Jenkins pipeline:
a data center (can have multiple environment types)
an environment type
The data center is of type ChoiceParameter; its list is retrieved from a database. When the data center changes in the drop-down list, the environment types should populate accordingly, also from the database, through a ScriptlerScript.
The problem is that when changing selection on data center, nothing happens for environment types list, which is a CascadeChoiceParameter with referencedParameters: 'DataCenter'.
How should I link the referenced parameter to the Scriptler script I'm using - what do I have to send?
The issue is with [name:'DataCenter', value: '$DataCenter'] for the second parameter - the value is not sent to the ScriptlerScript when the first drop-down's value changes.
If I define the 2 parameters from the Jenkins interface - so not through the DSL pipeline - under the Configure section, everything works as expected.
Using something other than the properties section doesn't work for me - I've tried using activeChoiceParameter inside the pipeline section, but I get the error 'Build parameters definitions cannot have blocks # line (...)', which is a known issue (see the first example link below).
Examples I've used:
Jenkinsfile Active Choice Parameter
Active Choices Reactive Reference Parameter in jenkins pipeline
properties([
    parameters([
        [
            $class: 'ChoiceParameter',
            choiceType: 'PT_SINGLE_SELECT',
            name: 'DataCenter',
            script: [
                $class: 'ScriptlerScript',
                scriptlerScriptId: 'getdatacenters.groovy',
                parameters: [
                    [name: 'StatusId', value: '']
                ]
            ]
        ],
        [
            $class: 'CascadeChoiceParameter',
            choiceType: 'PT_SINGLE_SELECT',
            name: 'EnvironmentType',
            script: [
                $class: 'ScriptlerScript',
                scriptlerScriptId: 'getenvtypesbydatacenter.groovy',
                referencedParameters: 'DataCenter',
                parameters: [
                    [name: 'DataCenter', value: '$DataCenter']
                ]
            ]
        ]
    ])
])
pipeline {
...
Expected result: Second drop-down list populates when data center changes
Actual result: Nothing happens when data center changes
A pipeline with the params configured in the UI behaves ok (environment types load on data center change).
One thing to keep in mind: Scriptler scripts are not secure and you should not use them: https://wiki.jenkins.io/display/JENKINS/Scriptler+Plugin
That being said, if you still want to go on and use Scriptler plugin and CascadeChoiceParameter, the code might look like this:
properties([
    parameters([
        [
            $class: 'ChoiceParameter',
            choiceType: 'PT_SINGLE_SELECT',
            name: 'DataCenter',
            randomName: 'datacenter-choice-parameter-102102304304506506',
            script: [
                $class: 'ScriptlerScript',
                scriptlerScriptId: 'getdatacenters.groovy',
                fallbackScript: [classpath: [], script: 'return ["N/A"]']
            ]
        ],
        [
            $class: 'CascadeChoiceParameter',
            choiceType: 'PT_SINGLE_SELECT',
            name: 'EnvironmentType',
            randomName: 'envtype-choice-parameter-101012020230303404',
            referencedParameters: 'DataCenter',
            script: [
                $class: 'ScriptlerScript',
                scriptlerScriptId: 'getenvtypesbydatacenter.groovy',
                fallbackScript: [classpath: [], script: 'return ["N/A"]']
            ]
        ]
    ])
])
The Groovy code for getdatacenters.groovy, for demo purposes, is (though the list might alternatively be retrieved from a DB):
return ["Dev", "Prod"]
The groovy code for getenvtypesbydatacenter.groovy might look like this:
import groovy.sql.Sql
import jenkins.model.*
nodes = Jenkins.instance.globalNodeProperties
nodes.getAll(hudson.slaves.EnvironmentVariablesNodeProperty.class)
sql = Sql.newInstance("jdbc:sqlserver://SQLServerHere;connectionDataHere", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
envTypes = sql.rows("exec [DbHere].[schema].[GetEnvTypes] #DataCenter = $DataCenter").collect({ query -> query.EnvTypeName})
envTypes.add(0,'')
return envTypes
The most important thing to note here is that referencedParameters: 'DataCenter' was not inside the script block, but at the "root" level. If you need more parameters, you can separate them with comma.
Because DataCenter is a referenced parameter and is automatically transmitted to the Scriptler script, the $DataCenter variable inside the SQL query will be mapped to its value. As a note, DataCenter should be added as a parameter of the Scriptler script, in the UI Parameters section.
Credits for the solution go to CloudBees.
