Jenkins - Multiple terminal output

I have a Python script that, when run, opens several additional terminal windows besides the one it was started from:
subprocess.call(["gnome-terminal", "-e", "..."])
Each of these terminals runs the same program with different parameters.
In Jenkins, when I run the same script from the "Execute shell" build step of a "Freestyle project", the result is not what I was expecting.
./python_file.py -p $MY_PARAMETER
The main console output works fine, but the other terminal windows that were supposed to open never execute. I want to be able to see the output of those terminals in the Jenkins console (or somewhere else?).
Should I use another kind of project? Or just add a plugin? Is there an option in the project that I should have checked? I don't want to run the project on multiple nodes; I just need to see the output of multiple terminals.
This is the error text:
Failed to parse arguments: Cannot open display:
It is not a common problem, I suppose, but thanks for any input!

I am not sure that multiple-window output exists in Jenkins, but I think you can bypass this issue.
Instead of running one project in multiple consoles, I will modify my Python script so that multiple projects each run in a single console. That way it is easier to control which parameters go to each project and what the output is for each of them.
There are a couple of ways to do that (a "multi-configuration project", or multiple "freestyle projects").
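If you would rather keep everything in one job, a rough alternative sketch is to drop gnome-terminal entirely (so nothing needs a display) and launch the child runs as headless subprocesses whose output is piped back into the Jenkins console. The script path and parameter values below are placeholders:

# Hedged sketch: run the same script with different parameters as headless
# child processes and print their output to stdout (i.e. the Jenkins console).
import subprocess

params = ["a", "b", "c"]  # example parameter values
procs = [
    subprocess.Popen(
        ["./python_file.py", "-p", p],
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        text=True,
    )
    for p in params
]
for p, proc in zip(params, procs):
    out, _ = proc.communicate()
    print(f"--- output for -p {p} ---")
    print(out)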

Related

How to interactively pass values between Ansible and Jenkins while the Job is Running

My Jenkins UI takes as a parameter a list of files/folders (with wildcards) that users wish to delete.
Jenkins then passes the files to be deleted to all hosts by invoking ansible-playbook.
My requirement is to prompt the user on Jenkins for confirmation before deleting a file/folder.
Just as rm -rfi /tmp/moht /var/log*/data.dat interactively asks the user for confirmation before deleting the files, I wish Jenkins to prompt in the same way for each host.
For the above, I expect Jenkins to prompt like this:
Are you sure you want to delete
/tmp/moht
/var/log14Mar/data.dat
I'm aware of the input function in Jenkins for prompting.
I'm also aware of the Ansible command module, which can be used to fire the rm -rfi command.
I'm also aware of how to time out or terminate the Jenkins job upon user input. However, in this case I would like the user's Yes/No input to be sent back to the target host via Ansible so that the action is performed accordingly.
I understand that this may be too much to ask, but other feasible solutions or suggestions are also appreciated.
Can you please suggest how I can achieve this requirement?
I strongly feel that using Jenkins as an interactive tool is a bad idea. But if you really want to implement bad ideas with unsuitable tools, you can try a matrix job [1] based on the files you want to remove.
Jenkins will create one job per file you'd like to remove. You ask for confirmation and then execute an Ansible playbook to remove the file.
[1] https://plugins.jenkins.io/matrix-project/
It's horribly inefficient and you should not do it like this.
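If you do go that route, one way to wire the confirmation through is to collect it as a build parameter and forward it to the playbook. A minimal sketch, assuming a hypothetical CONFIRM_DELETE parameter and a hypothetical delete.yml playbook that guards its removal task with a when: condition on the passed variable (--extra-vars is a standard ansible-playbook flag):

# Hedged sketch: forward a Jenkins confirmation parameter to ansible-playbook.
# CONFIRM_DELETE, FILES_TO_DELETE and delete.yml are hypothetical names.
import json
import os
import subprocess
import sys

confirm = os.environ.get("CONFIRM_DELETE", "no").lower()  # set by a Jenkins parameter / input step
files = os.environ.get("FILES_TO_DELETE", "")             # space-separated list from the UI

if confirm != "yes":
    print("Deletion not confirmed; skipping ansible-playbook run.")
    sys.exit(0)

subprocess.check_call([
    "ansible-playbook", "delete.yml",
    "--extra-vars", json.dumps({"confirm": confirm, "files_to_delete": files}),
])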

combine allure reports from several machines into one without retry

I would like to ask for advice on the following question about Allure. I use Jenkins + pytest to run the tests. The same tests run on several virtual machines, which differ in operating system (different Linux distributions) and test environment. After running the tests, I want to combine the results from all the machines into one report. Here the question arises: if I put all the results in one directory and generate a report, the results from different machines are treated as reruns of the same test and combined into one. How can I get around this, so that they are not combined and it is possible to tell which result came from which machine? Thanks.
I solved this by overriding the names of the tests/suites.
That means some code implementation: work with the "before" listeners, where you can get the current test name and override it. Set the test name to OS + browser or something else unique.
When you combine reports, they will be unique and properly displayed.
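For pytest with allure-pytest, a minimal sketch of the same idea is an autouse fixture that prefixes each reported title with the hostname. This assumes your Allure setup deduplicates results by the reported name; verify against your version, and fall back to adding the hostname as a parameter or label if titles alone are not enough:

# conftest.py - hedged sketch for allure-pytest
import socket

import allure
import pytest

@pytest.fixture(autouse=True)
def unique_allure_title(request):
    # Prefix the Allure title with the hostname so each machine's run stays distinct.
    allure.dynamic.title(f"[{socket.gethostname()}] {request.node.name}")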
I ran into a similar issue with behave where Allure was treating each parallel build as a retry of the first build. I realize this isn't the same as pytest, but perhaps it'll help.
I was inspired by the previous answer and started experimenting. By changing the scenario name(s) within the feature, I was able to make Allure recognize each parallel build as separate tests. I accomplished this by adding a before_feature method to my environment.py file that simply added the hostname to each scenario name within that feature:
import socket

def before_feature(context, feature):
    # prefix each scenario name with the host so Allure treats runs as distinct tests
    for scenario in feature.scenarios:
        scenario.name = f'[{socket.gethostname()}] {scenario.name}'
Originally, I tried to directly change scenario.name in before_scenario but that seemed to have no effect in Allure.

configure grid extra (groupon) with jenkins in order to run cucumber tests

I've been struggling with something for quite a while and can't find a solution.
I have a test project (Cucumber, Maven). I configured Jenkins to pull the project from GitHub, build it and execute the code (Selenium test scripts) on a Jenkins slave, and that works perfectly. I added a few more slaves, tagged them, and I'm able to execute the same job in parallel (the same test cases on different machines).
My next step is to use Grid Extras (https://github.com/groupon/Selenium-Grid-Extras) in order to use some cool features like video recording, browser updating, Selenium updates, etc.
Now, I know that in order to use the grid I need to address it from my code and also define the desired capabilities (browser, OS, etc.).
Currently, when I run the same job twice, my second request is queued until the first one ends; if I run the same code from two developers' machines, it runs on two different nodes and the grid can handle both requests.
I'm not sure what is wrong with my Jenkins configuration or my grid hub configuration; I checked them again and again and everything looks good :-)
So I guess I'm missing something.
Any advice/direction/idea will be highly appreciated.
Thanks
Ronen
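For reference, addressing the grid hub with desired capabilities could look roughly like the sketch below (shown with the Python Selenium bindings purely for illustration; the hub URL and browser choice are placeholders, and a Cucumber/Maven project would do the equivalent with Java's RemoteWebDriver):

# Hedged sketch: point the driver at the grid hub instead of a local browser.
from selenium import webdriver

options = webdriver.ChromeOptions()  # example browser; set capabilities as needed
driver = webdriver.Remote(
    command_executor="http://my-grid-hub:4444/wd/hub",  # placeholder hub URL
    options=options,
)
driver.get("https://example.com")
driver.quit()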

Jenkins - Running concurrent jobs with "circular" parameter

I'd like to run several builds concurrently in Jenkins for the same job, at most 3 at a time. I want each build to run with a parameter that must be unique, drawn from a pool of parameters. For instance, with pool=[1, 2, 3]: the 1st build picks "1", the 2nd picks "2" and the 3rd picks "3".
I must ensure that different builds can't pick the same parameter.
After building, the parameter is available again.
How can I do it?
Alternative: How can I count the number of builds running in this project and pass it as parameter?
First, select the checkbox named "Execute concurrent builds if necessary" so that the same job can build concurrently. It is worth reading the inline help carefully before doing so.
Because different builds run in isolated environments, data cannot be shared between them in a simple way.
One solution is to trigger the buildWithParameters endpoint via the Jenkins REST API and control the pool in a program of your own:
Add a string parameter in the job's config.
POST the string parameter to http://$JENKINS_SERVER_URL/job/$JOB_NAME/buildWithParameters
This may be the most convenient way if no suitable plugin is available.
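A minimal sketch of that approach (the server URL, job name, credentials, parameter name and pool-state file below are all hypothetical placeholders; a real version also needs to release the slot when the build finishes and to lock the state file against concurrent callers):

# Hedged sketch: pick a free value from a small pool and trigger the job via
# Jenkins' buildWithParameters endpoint.
import json
import requests

JENKINS = "http://jenkins.example.com"   # placeholder
JOB = "my-concurrent-job"                # placeholder
AUTH = ("user", "api-token")             # placeholder credentials
POOL_FILE = "pool_state.json"            # hypothetical external state for the pool

def pick_free_slot(pool=(1, 2, 3)):
    try:
        with open(POOL_FILE) as f:
            in_use = set(json.load(f))
    except FileNotFoundError:
        in_use = set()
    for slot in pool:
        if slot not in in_use:
            in_use.add(slot)
            with open(POOL_FILE, "w") as f:
                json.dump(sorted(in_use), f)
            return slot
    raise RuntimeError("no free slot in pool")

slot = pick_free_slot()
requests.post(
    f"{JENKINS}/job/{JOB}/buildWithParameters",
    params={"POOL_ID": slot},   # POOL_ID = the string parameter added in the job config
    auth=AUTH,
)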
I found a plugin on GitHub and asked the author to publish it. It works well and solves my problem:
Jenkins Parameter Pool Plugin

Get result of a build step in Hudson/Jenkins to re-use it in another one

My question may be silly but I've been trying several ways and I still can't do what I want, i.e.:
use the scp target of Ant to reach a remote machine and execute a script there
this script creates a dynamic list of files
get this list of files (only their names) back into Hudson to use it in the next build step (another scp from Ant)
I tried to use environment variables but they are interpreted by Hudson so I'm stuck here...
Overall, my question is: how do I get a result back from an Ant build step?
Thanks for your ideas,
Emmanuel
You may find the File parameter useful. It allows you to create an input file and pass it to the build. You may need to write a script/Ant script to process the file, though.
In the long term you may evaluate a Hudson farm. This will allow you to create tasks that span multiple machines and pass results around. (https://wiki.jenkins-ci.org/display/JENKINS/Plugins)
You can get the ID(s) of the job that triggered your job via the API and fetch their status.
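As a rough illustration of the API approach (server URL, credentials and job names are placeholders; for builds triggered by another job, the upstream cause carries the triggering job's name and build number):

# Hedged sketch: look up the upstream build that triggered this one and fetch its status.
import requests

JENKINS = "http://jenkins.example.com"   # placeholder
AUTH = ("user", "api-token")             # placeholder credentials

build = requests.get(f"{JENKINS}/job/downstream-job/lastBuild/api/json", auth=AUTH).json()
for action in build.get("actions", []):
    for cause in action.get("causes", []):
        if "upstreamProject" in cause:
            up_job, up_num = cause["upstreamProject"], cause["upstreamBuild"]
            upstream = requests.get(
                f"{JENKINS}/job/{up_job}/{up_num}/api/json", auth=AUTH
            ).json()
            print(up_job, up_num, upstream.get("result"))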
