Installed new GitLab and getting a variables error - Docker

I run a new GitLab instance and get an error:
There was an error fetching the variables
when going to Admin Area > CI/CD.
I have tested different Docker image versions (gitlab/gitlab-ce:15.0.0-ce.0 and the latest), but that did not fix the problem.
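One frequent cause of this particular error (hedged: a common report for fresh Docker installs, not confirmed from the question) is that /etc/gitlab was not persisted across container recreation, so the encryption keys in gitlab-secrets.json changed and existing CI/CD variables can no longer be decrypted. A sketch of running the image with the documented persistent volumes (hostname and host paths are placeholders):

```shell
# Persist config (including gitlab-secrets.json), logs, and data
# so keys survive container upgrades and restarts
docker run --detach \
  --hostname gitlab.example.com \
  --publish 443:443 --publish 80:80 --publish 22:22 \
  --volume /srv/gitlab/config:/etc/gitlab \
  --volume /srv/gitlab/logs:/var/log/gitlab \
  --volume /srv/gitlab/data:/var/opt/gitlab \
  gitlab/gitlab-ce:latest
```

If variables were already created under a lost key, they typically have to be deleted and re-created after the volumes are in place.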

Related

CloudBuild fastlane task failing with JAVA_HOME is not set and no 'java' command could be found in your PATH

I am attempting to run a fastlane build inside Google Cloud Build, but I'm getting the following error:
ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
My first step installs the Android SDK, which works fine, and then I run the fastlane command. However, every single time, no matter what I put in the name before id: fastlane, I get the same Java error. I've downloaded both the Android and fastlane images from the Cloud Builders and Community Cloud Builders GitHub repos and pushed them to our GCP project registry, so each is accessible in the name fields.
steps:
# Android SDK
- name: 'gcr.io/$PROJECT_ID/android:29'
  id: android
  args: ["./gradlew", "assembleDebug"]
# run fastlane
- name: 'gcr.io/$PROJECT_ID/fastlane'
  id: fastlane
  args: ['distribute_staging', 'signingPassword:${_PASSWORD}', 'firebaseToken:${_TOKEN}']
Keep in mind that each step is a container. They are loaded one by one, the task is performed, and the container is unloaded. Only the /workspace directory is kept from one step to the next.
Therefore, in your first step you load an Android builder, and you can build Java stuff because Java is installed in that container. Then it is unloaded.
The second step is fastlane. Look at its Dockerfile: it's a Ruby image with no Java inside, so your process can't work. You have to build a custom builder:
Either from the fastlane base image, installing Java on it,
Or from a Java image (Android?), installing Ruby and fastlane on it.
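A minimal sketch of the first option, assuming the fastlane builder is based on a Debian/Ubuntu Ruby image (the image name and JDK paths below are assumptions; package names differ on Alpine):

```Dockerfile
# Hypothetical custom builder: the fastlane base image plus a headless JDK,
# so Gradle invocations from fastlane can find `java`.
FROM gcr.io/my-project/fastlane

RUN apt-get update \
 && apt-get install -y --no-install-recommends openjdk-11-jdk-headless \
 && rm -rf /var/lib/apt/lists/*

# Path is the Debian/Ubuntu default for this package; verify in your image
ENV JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
ENV PATH="$JAVA_HOME/bin:$PATH"
```

Build and push this image to your project registry, then reference it in the name of the fastlane step instead of the plain fastlane builder.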

Module not found error in Jenkins when trying to run wdio from a Docker image

I am using WebdriverIO. My package.json has all my dependencies. I created a Docker image for my project through Jenkins using a Dockerfile. Now I am trying to run the scripts from Jenkins, and it fails saying module not found. For example, in the config file I use var json = require('cjson'). The same module has been installed in the Docker image, but when I run through Jenkins it fails saying module cjson not found.
Build the container locally, try to run it from there, and check:
1. Is your dependency installation script actually running?
2. Are there any errors for the specific module that's failing?
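Concretely, those two checks might look like this (the image tag is a placeholder for your own; this assumes the image's working directory contains the project):

```shell
# Build the image locally from the same Dockerfile Jenkins uses
docker build -t wdio-tests .

# Confirm the module actually resolves inside the container
docker run --rm wdio-tests node -e "require('cjson'); console.log('cjson ok')"

# Inspect what was installed; a missing entry means the install step
# ran in the wrong directory or was skipped by a cached layer
docker run --rm wdio-tests sh -c "npm ls cjson"
```

A common culprit is a .dockerignore or volume mount that hides node_modules, or Jenkins running the tests from a directory other than the one where npm install ran.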

GitLab default autodevops CI: invalid tag name

I am running the latest GitLab 12.4.0-ee in a Docker container on Windows. The docker-runner is also configured to run in docker executor mode.
For testing purposes I created a test project with only a default master branch. I committed some dummy code to the repo, and CI ran automatically. Worth noticing: the project has no gitlab-ci.yml in it!
That means the pipeline that ran is the default Auto DevOps CI (I actually have no idea where the template or its source code is located). This is the default setting in GitLab.
The build failed with the following error:
invalid argument "/master:9456d6a27179d3cf28b7a2670d2a53204fc327aa" for "-t, --tag" flag: invalid
As you can see in the error message, a malformed tag is passed to the docker build command. Is there anything I can do to make the default Auto DevOps pipeline just do its job without throwing this error? I have not created any scripts, so I expect this process to work out of the box, without any tweaks.
I have found some similar issues on the internet, but unfortunately the solutions are not applicable to me, as I am using the all-default settings provided by the GitLab devs.
Here is what is supposed to be a fix, but I can't use it:
The tag should be formatted like name:version, and you're giving it
/master:9fa2d4358e6c426b882e2251aa5a49880013614b, which is not a valid
tag. You could try to delete the / before master.
Any thoughts on how to make CI work?
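The /master:sha shape suggests the repository part of the tag is empty, i.e. $CI_REGISTRY_IMAGE is blank (for example because the container registry is not enabled on the instance). One hedged workaround, a sketch rather than a confirmed fix, is to keep the Auto DevOps template but pin the image name yourself in a minimal .gitlab-ci.yml; CI_APPLICATION_REPOSITORY and CI_APPLICATION_TAG are the variables the Auto DevOps build job reads, while the registry host below is a placeholder:

```yaml
include:
  - template: Auto-DevOps.gitlab-ci.yml

variables:
  # Placeholder registry path; point this at a registry your runner can push to
  CI_APPLICATION_REPOSITORY: registry.example.com/mygroup/test-project
  CI_APPLICATION_TAG: $CI_COMMIT_SHA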

Aurelia build using CLI produces different results on different platforms

I'm trying to set up a Jenkins job to automate builds of our Aurelia app. When I build the app locally it works 100%, but when I build it on Jenkins using the same process, the files in the script directory are different.
I'm running Debian testing on my laptop. Jenkins is a Docker image running on Rancher.
When I test the build with jenkins, I get the following error:
http://localhost/src/main.js Failed to load resource: the server responded with a status of 404 (Not Found)
vendor-bundle.js:2 Error: Script error for "main"(…)r # vendor-bundle.js:2
Both my local environment and jenkins have the following tools and versions:
node: 6.9.1
npm: 3.10.8
aurelia: 0.22.0
The app is written in TypeScript.
We also use aurelia-cli to build the app with: au build --env prod
The process we use to build the app:
npm install aurelia-cli
npm install
typings install
au build --env prod
Any help appreciated
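A common cause of "same process, different output" between machines is unpinned transitive dependencies (npm 3.x has no lockfile by default). A hedged way to rule that out, assuming the versions listed above, is to pin the CLI exactly and commit a shrinkwrap so Jenkins installs the same tree:

```shell
# Pin the CLI at the exact version used locally (0.22.0 per the question)
npm install aurelia-cli@0.22.0

# Record the fully resolved dependency tree on the working machine,
# then commit the generated npm-shrinkwrap.json
npm shrinkwrap

# On Jenkins, npm install now reproduces the recorded tree
npm install
typings install
au build --env prod
```

If the outputs still differ after that, compare the node/npm binaries actually used inside the Jenkins container (node -v, npm -v) rather than on the host, since the Docker image may ship its own.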

Unable to get Maven version from /usr/share/maven/bin/mvn: Could not find system variable

I want to run a Bamboo remote agent using Docker.
When I run the image atlassian/bamboo-java-agent on my machine, it produces the following errors:
Unable to get Maven version from /usr/share/maven/bin/mvn Could not find system variable
If you wish to use perforce please set the location as a capability.
How do I solve these two errors?
I later updated Maven to a higher version (3.3), but that is not reflected when I run the image, as shown in the attached screenshot.
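Both messages are about agent capabilities. One approach (hedged: the capability keys and the properties-file mechanism are how Bamboo agents declare capabilities, but the exact paths in your image tag may differ) is to declare Maven and Perforce explicitly in a bamboo-capabilities.properties file baked into or mounted into the container:

```properties
# bamboo-capabilities.properties (hypothetical values; adjust to your image)
# Key format is system.builder.<type>.<label>; spaces in labels are escaped
system.builder.mvn3.Maven\ 3.3=/usr/share/maven
# Declare Perforce only if you use it; otherwise the warning is harmless
system.builder.perforce.Perforce=/usr/bin/p4
```

Also note that updating Maven on the host does not affect the container: the agent sees only what is installed inside the image, so a newer Maven must be installed in a derived image (or mounted in) and the capability pointed at that path.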
