Jobs vs script: What is the difference in Travis CI?

It seems that I can put commands, such as echo "helloworld", either in the script section or in the jobs section of .travis.yml. What is the difference?

They are completely different pieces of functionality in .travis.yml:
script: is a build/job phase in which you run the commands for that specific step. [1]
jobs: is a section in which you can define multiple jobs in a single .travis.yml file; each job runs as a separate build and can define its own script inside it. [2]
[1] https://docs.travis-ci.com/user/job-lifecycle/#the-job-lifecycle
[2] https://docs.travis-ci.com/user/build-matrix/#listing-individual-jobs
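A minimal sketch showing the two side by side (job names and commands are illustrative): a top-level script defines the default commands, while jobs/include defines separate jobs, each of which may override it.

language: node_js
script: echo "helloworld"    # default script phase, inherited by each job

jobs:
  include:
    - name: 'Job 1'          # runs the default script above
    - name: 'Job 2'
      script: echo "goodbye" # overrides the script phase for this job only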

Related

Optimized alternative to stashing files in Jenkins

The Jenkins pipeline currently does a build and deploy, stashes some files, unstashes them, and runs end-to-end tests on those files, like so:
// build and deploy code
stash([
    name: 'end-to-end-tests',
    includes: <a bunch of files>
])
unstash('end-to-end-tests')
// code to run tests using npm run test:end-to-end-tests
In the interest of speeding up this pipeline, is there a way to get around the stash? I need end-to-end-tests in order to run my tests with the appropriate npm command later on, but how can I use this without stashing (if possible)?
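For reference, stash/unstash is mainly needed when files must cross agents or nodes; if the tests run on the same agent and workspace as the build, one option is simply to drop the stash/unstash pair. A minimal sketch, assuming everything runs on a single agent (commands are illustrative):

node {
    // build and deploy code (outputs stay in this workspace)
    sh 'npm run build'
    // no stash/unstash needed: the workspace is shared within this node block
    sh 'npm run test:end-to-end-tests'
}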

CircleCI CLI - Cannot find a job named `build` to run in the `jobs:` section of your configuration file

I'm using the circleCI CLI locally to test my .circleci/config.yml. This is what it looks like:
version: 2.1
jobs:
  test:
    docker:
      - image: circleci/node:4.8.2
    steps:
      - checkout
      - run: echo 'test step'
workflows:
  version: 2
  workflow:
    jobs:
      - test
This fails with the following error:
* Cannot find a job named build to run in the jobs: section of your configuration file.
If you expected a workflow to run, check your config contains a top-level key called 'workflows:'
The 'hello world' workflow from the CLI docs works fine.
What am I missing here?
The same CircleCI CLI documentation mentioned above notes, in its 'Limitations' section:
The CLI tool does not provide support for running workflows. By nature, workflows leverage running jobs concurrently on multiple machines allowing you to achieve faster, more complex builds. Because the CLI is only running on your machine, it can only run single jobs (which make up parts of a workflow).
So I guess running workflows with orbs works (as in the 'hello world' example), but running workflows with your own jobs does not work with the CLI.
Testing Jobs Locally
If you're looking to test your config locally like I was, you can still execute your individual jobs locally. In the same documentation linked above, under the heading 'Running a Job': when using a config with version 2.1+, you first process the config, then call one of your jobs explicitly, like so:
circleci config process .circleci/config.yml > process.yml
circleci local execute -c process.yml --job JOB_NAME
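With the config above, for example, the only job you could run this way is test:

circleci local execute -c process.yml --job test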

Do Travis CI tests in the same stage happen in the same instance?

The Travis docs state that build stages are a way to group jobs: the jobs within a stage run in parallel, while the stages themselves run sequentially, one after another.
I know that all jobs in a stage run in parallel, but do these tests run in the same instance, i.e. do they share the same env variables?
Say I have 3 tests under a stage.
- stage: 'Tests'
  name: 'Test1'
  script: ./dotest1
- name: 'Test2'
  script: ./dotest2
- name: 'Test3'
  script: ./dotest3
If I set export bleh_credential=$some_credential in test1, does it get carried over to test2? It seems like it shouldn't, as they run in parallel, correct? If that's the case, can I set a stage-wide env variable, or should I set the variables every time I run a new test?
No, each job runs in a fresh container, so nothing from one job's process can be shared with another. If you need persistence between jobs, Travis requires you to use an external storage system like S3. Read more about it here: https://docs.travis-ci.com/user/build-stages/#data-persistence-between-stages-and-jobs
I would set the env vars for each job, perhaps using YAML anchors for the defaults: https://gist.github.com/bowsersenior/979804
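A minimal sketch of the anchor approach (the custom top-level key and variable names are illustrative, and Travis may warn about unknown top-level keys):

_env_defaults: &env_defaults
  env:
    - bleh_credential=$some_credential

jobs:
  include:
    - stage: 'Tests'
      name: 'Test1'
      script: ./dotest1
      <<: *env_defaults
    - name: 'Test2'
      script: ./dotest2
      <<: *env_defaults
    - name: 'Test3'
      script: ./dotest3
      <<: *env_defaults

The variable is still set separately in each job's environment, but it is defined in one place.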

Automate Jenkins Pipeline to run multiple times

I am using a Jenkins pipeline to process files for a UAT.
Stages are like:
Get the files
Put them in the right directory
Run bash script
Run another bash script
Run another bash script
Send results
I would like to be able to run this pipeline one or more times, depending on the number of EODs I want to run.
So if EODs are:
20180720, 20180721, 20180722
I want the pipeline to run 3 times.
So, roughly:
for eod in 20180720 20180721 20180722; do
    run_pipeline "$eod"   # pseudocode: trigger the pipeline for this EOD
done
I know you can launch it with the URL and parameters, but I am looking for a simpler solution, if there is one.
How can I do this?
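For reference, a sketch of the URL-with-parameters approach (host, job name, parameter name, and credentials are all illustrative), which queues one parameterized build per EOD via Jenkins' buildWithParameters endpoint:

#!/usr/bin/env bash
for eod in 20180720 20180721 20180722; do
    # each POST queues one build of the parameterized pipeline job
    curl -X POST "https://jenkins.example.com/job/uat-pipeline/buildWithParameters" \
        --user "user:api_token" \
        --data "EOD=${eod}"
done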

Is it possible to build 5 sub-modules in one single job, each with a different script to execute?

I want to know how we can parameterize a build in Jenkins. I need to create a Jenkins job that will contain 5 sub-jobs. I need to create a drop-down, select any one of the modules, and build it. But the script used is different for every sub-build; can anyone guide me on whether this is possible?
String parameters in Jenkins result in environment variables of the same name.
So you could write a wrapper script in bash that checks, in a series of if-elif statements, which value was set as a result of the parameterized build (i.e. which of your 5 sub-jobs was selected), and invokes the necessary build script from there.
The build script that you would have Jenkins run would be the wrapper script, as sketched below.
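A minimal sketch of such a wrapper, assuming a choice parameter named MODULE and one build script per module (all names are illustrative):

#!/usr/bin/env bash
# Jenkins exposes the MODULE build parameter as an environment variable.
if [ "$MODULE" = "module1" ]; then
    ./build_module1.sh
elif [ "$MODULE" = "module2" ]; then
    ./build_module2.sh
elif [ "$MODULE" = "module3" ]; then
    ./build_module3.sh
elif [ "$MODULE" = "module4" ]; then
    ./build_module4.sh
elif [ "$MODULE" = "module5" ]; then
    ./build_module5.sh
else
    echo "Unknown module: $MODULE" >&2
    exit 1
fi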
