With Jenkins it's possible to run a pipeline with steps: https://www.jenkins.io/doc/pipeline/tour/running-multiple-steps/
Is it possible to mark different steps from within a subprocess?
I have multiple automations written in Python and would love to trigger a step from inside a Python script without having to rewrite the logic as a Jenkins Pipeline.
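As far as I know there is no supported way for a subprocess to declare Jenkins stages on its own, but a common workaround is to keep all logic in Python and give the script a step selector, then map each pipeline stage onto one invocation. The script name and `--step` flag below are hypothetical, just a sketch:

```groovy
// Sketch: each Jenkins stage calls the same Python script with a
// (hypothetical) --step argument, so stage boundaries appear in the
// Jenkins stage view without porting any logic to Groovy.
pipeline {
    agent any
    stages {
        stage('Prepare') {
            steps { sh 'python3 automation.py --step prepare' }
        }
        stage('Build') {
            steps { sh 'python3 automation.py --step build' }
        }
        stage('Deploy') {
            steps { sh 'python3 automation.py --step deploy' }
        }
    }
}
```

If a step fails, the script's non-zero exit code fails that stage, so you also get per-step status for free.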
I am using Cypress version 8.5.0 to create my UI automation scripts. The automation repository is hosted in GitLab. These scripts are triggered through a Jenkins pipeline, for which we have created a Jenkinsfile that uses the Docker image
image: cypress/included:8.5.0
and runs the tests in a Linux container. All infrastructure, Jenkins included, is hosted in the cloud. At the moment the scripts run successfully, sequentially, in Electron using the following command in the Jenkinsfile:
sh 'npx cypress run'
I have two queries -
(a) We wanted to specify the folder path for execution, and the browser as well, with the following command, but it is failing:
sh 'npx cypress run --spec "folder path" --browser=chrome'
(b) We wanted to reduce our execution time through parallel execution. The Cypress Dashboard is not an option for us due to budget constraints.
I saw some folks mention sorry-cypress as an alternative and I explored it, but I am not able to figure out the changes required in the Jenkinsfile to make this work.
Thanks in advance for sharing your valuable suggestions/work.
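For what it's worth, a hedged sketch covering both points follows. For (a), `--spec` expects a quoted glob pattern rather than a bare folder name. For (b), as I understand the sorry-cypress docs, you point `CYPRESS_API_URL` at a self-hosted director service and run Cypress through the `cy2` wrapper so that the variable is honored. The spec path, director URL, and stage names are all placeholders:

```groovy
pipeline {
    agent none
    environment {
        // Placeholder: URL of your own sorry-cypress director deployment
        CYPRESS_API_URL = 'http://sorry-cypress-director:1234/'
    }
    stages {
        stage('UI tests') {
            parallel {
                stage('runner-1') {
                    agent { docker { image 'cypress/included:8.5.0' } }
                    steps {
                        // (a) a quoted glob, not a bare folder name; cy2 must be
                        // available (e.g. npm install -g cy2 in the image)
                        sh 'npx cy2 run --spec "cypress/integration/smoke/**/*" --browser chrome ' +
                           '--record --key anything --parallel --ci-build-id "$BUILD_ID"'
                    }
                }
                stage('runner-2') {
                    agent { docker { image 'cypress/included:8.5.0' } }
                    steps {
                        sh 'npx cy2 run --spec "cypress/integration/smoke/**/*" --browser chrome ' +
                           '--record --key anything --parallel --ci-build-id "$BUILD_ID"'
                    }
                }
            }
        }
    }
}
```

The shared `--ci-build-id` is what lets the director group the two runners into one parallel run; the `--key` value is arbitrary when you host the director yourself.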
I am using a Jenkins pipeline and my Jenkinsfile has several stages and jobs. Is there any way to run a specific job outside of the Jenkins pipeline?
Example: let's say one of the stages is to "scp build artifacts to a remote location". For some reason this failed, and I want to run the rest of the jobs manually, outside of the Jenkins pipeline. How can I do that?
I would rather not trigger a whole new build. So can we run the remaining jobs after a failure manually, outside of the Jenkins pipeline?
You may be able to do it by writing unit test cases for your Jenkinsfile and testing them as a Maven project. This may or may not solve your problem (it's hard to say without seeing the whole thing), but if you can reorganize your logic to achieve 100% test coverage, it is doable. You can find more information about writing test cases for Jenkins pipelines here.
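To illustrate the idea: the JenkinsPipelineUnit library (`com.lesfurets:jenkins-pipeline-unit`) lets you load a Jenkinsfile in a plain JUnit test and stub out pipeline steps, so individual pieces can be exercised outside Jenkins. This is only a minimal sketch; which steps you need to register depends on your Jenkinsfile:

```groovy
// Minimal JenkinsPipelineUnit sketch: load a Jenkinsfile outside Jenkins.
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

class JenkinsfileTest extends BasePipelineTest {
    @Before
    void setUp() {
        super.setUp()
        // Stub the steps the Jenkinsfile uses so it can run locally;
        // here sh just echoes the command instead of executing it.
        helper.registerAllowedMethod('sh', [String]) { cmd -> println "sh: $cmd" }
    }

    @Test
    void jenkinsfileRunsWithoutErrors() {
        runScript('Jenkinsfile')
        assertJobStatusSuccess()
    }
}
```

Because every `sh` call is stubbed, you can also replace the stub with real logic for just the stages you want to re-run manually.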
I have a Python project; locally I have set up tox to automate pep8, Bandit scans, pytest, etc.
Now I'm being asked to move to an existing CI/CD setup, and they have given me a Jenkinsfile.
I need to add this tox functionality to that Jenkinsfile. Can I run these tox commands directly in Jenkins, or do I need to look for similar functionality using Jenkins plugins?
There is literally documentation on the tox website for using it in conjunction with Jenkins - https://tox.readthedocs.io/en/latest/example/jenkins.html.
Try to use Stack Overflow to get help with a solution that you came up with yourself.
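In case it helps, running tox directly from a pipeline stage needs no plugin; a minimal sketch (assuming Python is available on the agent, and with a JUnit report path that is only an example) could look like:

```groovy
pipeline {
    agent any
    stages {
        stage('Lint & test') {
            steps {
                // Install tox for the build user, then run all tox environments
                sh 'python3 -m pip install --user tox'
                sh 'python3 -m tox'
            }
        }
    }
    post {
        always {
            // Example path: only works if your pytest env writes JUnit XML,
            // e.g. pytest --junitxml=reports/junit-results.xml in tox.ini
            junit allowEmptyResults: true, testResults: 'reports/junit-*.xml'
        }
    }
}
```

So yes, you can call the same tox commands you use locally; plugins are only needed if you want Jenkins to parse and display the resulting reports.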
I am in the process of writing a declarative Jenkinsfile to build a pipeline project. Some of the steps within a few of the stages will require remote commands to be run. The remote SSH sites have been configured in the main Jenkins configuration.
How can I declare these steps within my Jenkinsfile? I know I can run shell commands locally using sh, but it's running commands on the remote servers that I need to know about.
You need to install the SSH Agent plugin; then you will be able to execute remote commands using ssh. Check this for examples.
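For reference, a minimal sketch of the plugin's `sshagent` step; the credentials ID, user, host, and remote command below are placeholders:

```groovy
// 'deploy-key' is a placeholder for an SSH credentials ID stored in Jenkins.
// sshagent loads that key into an agent for the duration of the block,
// so plain ssh/scp commands inside it authenticate with it.
sshagent(credentials: ['deploy-key']) {
    sh 'ssh -o StrictHostKeyChecking=no user@remote-host "systemctl restart myapp"'
    sh 'scp build/artifact.tar.gz user@remote-host:/opt/releases/'
}
```

In a declarative pipeline this block goes inside a stage's `steps { script { ... } }` (or directly in `steps`, since `sshagent` is itself a step).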
I'm rewriting what is essentially a custom CI server into Jenkins pipeline jobs that heavily rely on a global pipeline library.
So far, Jenkins pipeline has been an awesome tool, but I'm confused about what the preferred method for writing steps is: are you better off writing the content of the steps in Groovy, or shelling out with sh steps? The one major downside I see to using the sh step and shelling out a lot is that it's more difficult to handle errors. For example, if copying a directory of files errors out because the source directory doesn't exist, I want to handle that differently than if the target location is out of disk space.
What is the preferred approach?
I'd recommend using Groovy as much as you can; that way you can easily adapt the script when moving between platforms.
Example:
Say you use shell steps and then need to support a Windows OS: you might have to rewrite the steps for CMD or use PowerShell, whereas using Groovy makes this question obsolete.
Therefore I'd recommend Groovy, even if there is a learning curve associated with it.
Use sh steps. Anything you write in Groovy will only run on the master server, while sh steps will run on both the master and the slave nodes. Pipeline is really only for flow control, with some steps built in to perform actions in workspaces.
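Regarding the error-handling concern raised above, a possible middle ground is to keep using sh but check preconditions and exit codes yourself, so different failure modes can be distinguished. The helper name is made up; `fileExists`, `error`, and `sh(returnStatus:)` are standard pipeline steps:

```groovy
// Hypothetical helper: copy a directory with differentiated failures.
def copyDir(String src, String dst) {
    // Failure mode 1: missing source -- detected before shelling out
    if (!fileExists(src)) {
        error "Source directory ${src} does not exist"
    }
    // returnStatus: true keeps the stage alive so we can inspect the exit code
    def rc = sh(script: "cp -r '${src}' '${dst}'", returnStatus: true)
    if (rc != 0) {
        // Failure mode 2: the copy itself failed (e.g. target out of disk space)
        error "Copy of ${src} failed with exit code ${rc} -- check space on the target"
    }
}
```

This keeps the portability cost of shelling out, but gives you the per-cause handling that a bare `sh 'cp -r ...'` cannot.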