I have tried the command below, but it does not retrieve anything; instead it says 'Asset search returned no results':
https://<nexus_url>/nexus/service/rest/v1/search/assets/download?sort=version&repository=<my_repo>&group=<grp_id>&name=<name>&maven.baseVersion=<version>
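For comparison, this is roughly how the same request is usually issued with curl. Note that the full URL has to be quoted, otherwise the shell treats everything after the first & as separate commands and Nexus only sees the sort parameter; the credentials and the extra maven.extension=jar filter below are assumptions, and maven.baseVersion should be the base version (e.g. 1.2.3 or 1.2.3-SNAPSHOT, not a timestamped snapshot):

    # quote the whole URL; -L follows the redirect the endpoint returns, -f fails on HTTP errors
    curl -fsSL -u "<user>:<password>" -o artifact.jar \
      "https://<nexus_url>/nexus/service/rest/v1/search/assets/download?sort=version&repository=<my_repo>&group=<grp_id>&name=<name>&maven.baseVersion=<version>&maven.extension=jar"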
I have some Terraform code that uses a map of tfvars to deploy multiple Lambdas to AWS. It all works fine, except that I want to run the script in a Jenkins pipeline, which would first need to download the jars from Nexus for each Lambda. Is there a way to read the tfvars file in the Jenkins pipeline to get the names of the jars to download from Nexus, copy them into the working directory on Jenkins, and then upload them using Terraform?
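One possible shape for this, as a sketch only: keep the lambda map in a JSON tfvars file so the pipeline can parse it, pull each jar from Nexus into the workspace, then run Terraform against the same file. The file name lambdas.tfvars.json, the map layout, the 'nexus-creds' credential id and the repository path are all assumptions, and readJSON comes from the Pipeline Utility Steps plugin:

    // minimal sketch, assuming lambdas.tfvars.json looks like
    // {"lambdas": {"my-fn": {"jar": "my-fn-1.2.3.jar"}}}
    def tfvars = readJSON file: 'lambdas.tfvars.json'
    def names  = tfvars.lambdas.keySet() as List   // avoid iterating the map directly (CPS-friendly)
    withCredentials([usernamePassword(credentialsId: 'nexus-creds',
                                      usernameVariable: 'NEXUS_USER',
                                      passwordVariable: 'NEXUS_PASS')]) {
        for (name in names) {
            def jar = tfvars.lambdas[name].jar
            // hypothetical repository layout; adjust the path to your Nexus structure
            sh "curl -sSf -u \$NEXUS_USER:\$NEXUS_PASS -o ${jar} " +
               "'https://<nexus_url>/repository/<my_repo>/com/example/${name}/${jar}'"
        }
    }
    // Terraform accepts .tfvars.json files directly with -var-file
    sh 'terraform init && terraform apply -auto-approve -var-file=lambdas.tfvars.json'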
I'd like the files generated by a Jenkins script to be automatically copied into a given directory on the local network.
Is there a plugin or script for doing that?
In my case it worked by using scp in an 'Execute shell' build step. The remote server needs to be accessible via SSH.
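For example (the host, user and target path are placeholders, and key-based authentication for the Jenkins user is assumed to be set up):

    # create a target folder for this build, then copy the generated files over
    ssh deploy@fileserver.local "mkdir -p /srv/shared/builds/$BUILD_NUMBER"
    scp -r "$WORKSPACE"/build/output/* deploy@fileserver.local:/srv/shared/builds/"$BUILD_NUMBER"/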
If your artifacts are the result of a Maven build, maybe Nexus Repository Manager is what you are looking for. A simple mvn deploy would do the job.
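A minimal sketch of that build step, assuming the pom's <distributionManagement> section points at the Nexus repository and the matching <server> credentials exist in the Jenkins user's settings.xml (the repository id and URL below are placeholders):

    # publishes the build's artifacts to the repository configured in the pom
    mvn clean deploy
    # or, without touching the pom (maven-deploy-plugin 2.x syntax):
    mvn clean deploy -DaltDeploymentRepository=nexus::default::https://<nexus_url>/repository/<my_repo>/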
I have configured Jenkins from its Docker image to scale and deployed it on a Kubernetes cluster (Minikube) using the Kubernetes plugin, and I can generate the slaves dynamically. But I am not able to run Groovy scripts by passing the file path of the Groovy file on the slave. I have tried using SSH and the scp command, but I am not able to run the scripts on the slave node. Any other ideas?
To test this, I created a sample Groovy file on the slave node, gave its path to the Groovy plugin, and building the job works. Can we create a file on our local system and make it run on the slave node?
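As an illustration of one way to do this without scp: let the pipeline put the script into the agent's workspace and run it there. The label, file name and the presence of a Groovy installation on the agent are assumptions:

    // runs on a dynamically provisioned Kubernetes agent with the label 'k8s-agent'
    node('k8s-agent') {
        // materialise a script that only exists in the pipeline/on the controller
        writeFile file: 'hello.groovy', text: 'println "hello from the agent workspace"'
        // run it with a Groovy installation available on the agent...
        sh 'groovy hello.groovy'
        // ...or evaluate it as pipeline code instead
        load 'hello.groovy'
    }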
I have been struggling to get Publish over FTP to successfully transfer my .NET Core build artifact from my Jenkins Linux slave to my Azure Pack server. This plugin worked for the same host and target server in the conventional Jenkins job.
I would like to know why it says the transfer was successful when 0 files were transferred.
I would also like to know why no files are transferred, even though I list the source directory before and after the ftpPublish call and all of the files are listed.
I would also like to know what should go in the paramPublish variable.
Below is my groovy code:
[screenshot: Publish over FTP stage in JPaC]
Below is my console output from Jenkins:
[screenshot: Jenkins Console Output]
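For reference, a Publish over FTP pipeline step generally has the shape sketched below; the server name, paths and exact parameter names are assumptions and should be confirmed with the Pipeline Syntax snippet generator for ftpPublisher. paramPublish is only used by the plugin's parameterized-publishing feature (choosing servers by a label matching a build parameter) and can normally be left empty; sourceFiles is an Ant-style pattern resolved relative to the workspace, which is the usual reason for a zero-file transfer, and setting failOnError helps surface such a run as a failure instead of a success:

    // sketch only - generate the real structure with the snippet generator
    ftpPublisher(
        alwaysPublishFromMaster: false,
        continueOnError: false,
        failOnError: true,              // fail the build instead of reporting success with 0 files
        masterNodeName: '',
        paramPublish: null,             // only needed for parameterized publishing
        publishers: [[
            configName: 'AzurePackServer',   // must match a server defined under Publish over FTP settings
            verbose: true,
            transfers: [[
                sourceFiles: 'app/bin/Release/publish/**',   // Ant-style, relative to the workspace
                removePrefix: 'app/bin/Release/publish',
                remoteDirectory: 'site/wwwroot'
            ]]
        ]]
    )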
I have a Jenkins job that successfully builds my project. Now I am trying to push the build to a remote Amazon EC2 server using the MSDeploy command as follows:
"C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync -allowUntrusted -source:contentPath="%WORKSPACE%\dist" -dest:auto,computerName="https://ec2-xx-xxx-xx-xxx.compute-1.amazonaws.com:8172/MSDeploy.axd?site=Default Web Site",username="administrator",password="xxxxxxxxxxx",authtype="Basic",includeAcls="False"
This command works from my local machine's command prompt as well as from the Jenkins server's command prompt, but it gives ERROR_USER_UNAUTHORIZED when run as an 'Execute Windows batch command' build step in the Jenkins job. Below are the error details:
When I copy the password from the formatted MSDeploy command in the Jenkins console output above and try to remote into the respective server, it doesn't log me in, whereas if I copy the password from elsewhere (say Notepad or Word) and try to remote into the server, it works. It looks like the problem is with the way Jenkins is formatting the MSDeploy command. Do I have to format it in a specific way, or is there a specific order in which the arguments in the command have to be arranged?
I know there are other options, such as the Publish Over SSH plugin in Jenkins, but I am wondering why this can't be achieved with a simple MSDeploy command.
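If the root cause is cmd.exe mangling special characters in the password (%, ^, & and " are expanded or treated as metacharacters inside a batch step), one way around it is to store the credentials in Jenkins and reference them as environment variables, so the literal password never appears in the script that cmd.exe has to parse. Below is a pipeline-style sketch assuming a username/password credential with id 'ec2-msdeploy'; in a freestyle job the equivalent is the Credentials Binding "Use secret text(s) or file(s)" option on the build environment:

    // keep the password out of the command line cmd.exe parses
    withCredentials([usernamePassword(credentialsId: 'ec2-msdeploy',
                                      usernameVariable: 'MSDEPLOY_USER',
                                      passwordVariable: 'MSDEPLOY_PASS')]) {
        bat '''
        "C:\\Program Files\\IIS\\Microsoft Web Deploy V3\\msdeploy.exe" -verb:sync -allowUntrusted ^
          -source:contentPath="%WORKSPACE%\\dist" ^
          -dest:auto,computerName="https://ec2-xx-xxx-xx-xxx.compute-1.amazonaws.com:8172/MSDeploy.axd?site=Default Web Site",username="%MSDEPLOY_USER%",password="%MSDEPLOY_PASS%",authtype="Basic",includeAcls="False"
        '''
    }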