How do I install Syndesis on OpenShift when I don't have cluster-admin rights? Is this even possible? - syndesis

I've built Syndesis from source on my local machine and I now want to deploy it to a remote OpenShift cluster in a shiny new namespace.
I've tried:
$ syndesis install -p kurt --local
ERROR: No CRD Syndesis installed or not enough permissions to read them. Please run --setup and/or --grant as cluster-admin. See 'syndesis install --help' for more information.
$ install --setup --grant developer
Installing Syndesis CRD
ERROR: Cannot install CRD 'Syndesis'. You have to be a cluster admin to do this.
I don't have cluster:admin.
Can it be done w/o having cluster:admin rights?

It should be possible, for the most part, by using the oc commands from the Minishift installation guide. The "most part" caveat is the permissions needed to run Camel K or use Knative: if those haven't been set up by the cluster administrator, oc new-app will error out at the end with permission issues, though the bits needed to run Syndesis without Camel K or Knative should still be installed and set up.
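As a rough illustration of that route, a sketch (assumptions: the project name, template file name, and template name below are placeholders, use the actual ones from the Minishift installation guide):

```shell
# Hypothetical sketch of the template-based install from the Minishift guide,
# run as a regular user (no cluster-admin). Resource names are illustrative.
OC_BIN="$(command -v oc || true)"
if [ -z "$OC_BIN" ]; then
  echo "oc CLI not found - install the OpenShift client tools first"
else
  oc new-project syndesis-dev            # any namespace you are allowed to create
  oc create -f syndesis.yml              # template file from the installation guide
  oc new-app --template=syndesis         # only the Camel K / Knative bits need extra permissions
fi
```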
I think there's an ongoing effort to replace all installation methods with the Syndesis operator; installing the Operator Lifecycle Manager and having the cluster administrator set it up should make this frictionless for non-cluster-admin users.


visual-studio-code : failed to connect. is docker installed?

My environment:
macOS, M1 chip
VS Code version 1.66.2 (arm64)
Locally installed Docker version: 20.10.22
Docker is not working in VS Code. I already have Docker installed locally, but when I try to connect to Docker in VS Code, it repeatedly asks me to install the Docker extension (even though I already have Docker). And if I reinstall by following VS Code's prompt, my Docker installation gets broken (replaced by the Intel-chip build of Docker).
Does anybody know what's wrong?
Docker extensions for VS Code have nothing to do with the Docker engine itself. They are an additional layer of tools and commands over the installed Docker: e.g. they provide IntelliSense for editing Docker-related files, you can run Docker commands from the F1 drop-down, etc. But you should be able to do all the required tasks even without the Docker extensions, e.g. from the Terminal in VS Code; for that, the path to the Docker CLI (command-line interface) has to be added to the PATH environment variable.
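For example, to make the Docker CLI visible from VS Code's integrated terminal, you can append its install directory to PATH. A sketch; the path below is the usual Docker Desktop location on macOS, so verify it on your machine:

```shell
# Assumed Docker Desktop CLI location on macOS - adjust if yours differs.
DOCKER_CLI_DIR="/Applications/Docker.app/Contents/Resources/bin"
case ":$PATH:" in
  *":$DOCKER_CLI_DIR:"*) ;;                   # already on PATH, nothing to do
  *) export PATH="$PATH:$DOCKER_CLI_DIR" ;;   # add it for this shell session
esac
# Put the export line into ~/.zshrc to make it permanent.
command -v docker >/dev/null 2>&1 \
  && echo "docker CLI found" \
  || echo "docker CLI still not found - check the install location"
```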
If you are getting a failed to connect error, then maybe the Docker engine is not running. Please refer to https://docs.docker.com/desktop/install/mac-install/ and https://docs.docker.com/desktop/troubleshoot/overview/ for how to check whether the engine is running and how to troubleshoot issues.
If that doesn't help, please provide the specific error and the steps that led to it, and we'll try to figure it out.
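One quick way to tell the two failure modes apart (a generic sketch, not specific to VS Code): docker --version only exercises the client binary, while docker info actually contacts the daemon, so it reproduces "failed to connect" errors.

```shell
# `docker --version` only runs the client; `docker info` talks to the daemon,
# so it is the right probe for "failed to connect" errors.
if docker info >/dev/null 2>&1; then
  DOCKER_DAEMON=up
else
  DOCKER_DAEMON=down    # engine not running (or CLI missing): start Docker Desktop first
fi
echo "daemon: $DOCKER_DAEMON"
```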

how to set up GitLab CI for iOS project

Is there a recent and sufficiently detailed tutorial on how to set up CI for an iOS project using GitLab?
I found many tutorials (see the list below), but it seems that things have changed at GitLab since they were written, or they are simply not detailed and well enough explained for me as a beginner. So I wonder: is there an accurate step-by-step explanation of how to set up GitLab CI for iOS on a Mac?
In particular, I am looking for a Gitlab-CI step-by-step tutorial for an iOS project using Fastlane and having Cocoapods Dependencies.
Below you find the list of tutorials and pages that all say things about GitLab CI setup for iOS projects.
(I've followed all of them, but none is detailed enough for me as a beginner in CI, or they are simply no longer accurate for 2019 and what GitLab is today.)
For all of the tutorials, I end up with Gitlab Pipeline errors.
Here is a list of my open Stack Overflow questions, each with its own GitLab CI attempt:
gitlab-runner register without sudo: I end up with "permission denied" (Stack Overflow question no. 1)
gitlab-runner register with docker: I end up with "root error" (Stack Overflow question no. 2)
gitlab-runner register with sudo: I end up with "root error" (Stack Overflow question no. 3)
Concrete questions:
Can you use "docker" with GitLab for iOS? (Or must it be "shell"?)
Do you need to use sudo for your gitlab-runner registration or not? (Why or why not?)
How do you set permissions on your Mac so that GitLab CI works once you leave all sudo out?
Here is the list of tutorials I've found that explain Gitlab CI with iOS projects:
Setting up GitLab CI for iOS projects
How to set up GitLab Continuous Integration for iOS projects without a hassle
How to set up GitLab CI for iOS in three relatively simple steps
iOS Project (CI/CD): Integrating GitLab-CI, Fastlane, HockeyApp, and AppCenter
Here is the best tutorial I have found on how to set up GitLab CI for an iOS project.
Here are the findings that led to a successful GitLab CI setup for an iOS project:
It is especially important to recognise that for an iOS project, your GitLab runner must be registered with the "shell" executor (not "docker"). Also, you are not allowed to use sudo at all when dealing with the gitlab-runner, since Apple wants you to have a user-mode connection with GitLab (and not a system-mode one as in Docker or elsewhere).
The steps are as follows:
gitlab-runner stop (optional; only needed after previous attempts)
gitlab-runner uninstall (optional; only needed after previous attempts)
gitlab-runner register \
--non-interactive \
--url "https://gitlab.com/" \
--registration-token "ABCDEFG21sadfSAEGEAERE" \
--description "MyApp runner with shell" \
--tag-list ios \
--executor "shell"
(Feel free to use different tags. The token can be found in your GitLab project under Settings --> CI/CD --> Runners (expand).)
gitlab-runner install
gitlab-runner start
Furthermore, it turned out that the "permission denied" error in my GitLab pipeline had nothing to do with GitLab itself but was due to a Ruby version mismatch on the Mac that I connected as the gitlab-runner.
I was able to update my Ruby version with the help of this post (i.e. using chruby). There are other ways to update Ruby on your Mac.
It is important to understand that GitLab CI requires your Mac to have a stable Ruby environment.
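To tie this together, a pipeline definition for such a shell-executor runner might look like the sketch below. The lane name beta and the use of Bundler are assumptions; adapt them to your own Fastfile and Gemfile:

```yaml
# .gitlab-ci.yml sketch for a Mac runner registered with the "shell" executor;
# job and lane names are illustrative.
stages:
  - build

build_ios:
  stage: build
  tags:
    - ios                       # matches the --tag-list used when registering the runner
  script:
    - bundle install            # installs fastlane and cocoapods from the Gemfile
    - bundle exec pod install   # resolves the CocoaPods dependencies
    - bundle exec fastlane beta
```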

First run of Docker -- Running makeitopen.com's F8 App

I'm reading through makeitopen.com and want to run the F8 app.
The instructions say to install the following dependencies:
Yarn
Watchman
Docker
Docker Compose
I've run brew install for all of these, and none appeared to have been installed already. I have not done any configuration or setup on any of these new packages.
The next step is to run yarn server and here's what I got from that:
$ docker-compose up
ERROR: Couldn't connect to Docker daemon at http+docker://localhost - is it running?
If it's at a non-standard location, specify the URL with the DOCKER_HOST environment variable.
error Command failed with exit code 1.
Not having any experience with any of these packages, I don't really know what to do (googling brings up so many different scenarios). What do I do now?
PS: Usually when I work with React Native, I run npm start to start the Expo-ready app, but the F8 project doesn't respond to npm start.
UPDATE (sort of):
I ran docker-compose up, which appeared to run all the Docker tasks, and I'm assuming the server is running (although I haven't tried yarn server again).
I continued with the instructions, installing dependencies with yarn (which did appear to throw quite a few errors, actually, but also reported a lot of successes).
I then ran yarn ios, and after I put the Facebook SDK in the right folder on my computer, the XCode project opened.
The Xcode build failed. Surprise, right? It did make it through a lot of the tasks, but it can't find FBSDKShareKit/FBSDKShareKit.h (although that file does appear to exist in FBSDKShareKit/Headers/).
Any thoughts? Is there any way in the world I can just run this in expo?
If docker and docker-compose are installed properly, you either need root privileges or you need to add yourself to the docker group:
usermod -aG docker your-username
Keep in mind that all members of the docker group de facto have root access on the host system. It's recommended to add only trusted users and to take precautionary measures to avoid abuse, but that's another topic.
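A quick way to check whether you are already in the group (a sketch using standard tools; remember you need to log out and back in after usermod for the new group to take effect):

```shell
# Check whether the current user is already a member of the docker group.
USER_NAME="${USER:-$(id -un)}"
if id -nG "$USER_NAME" | tr ' ' '\n' | grep -qx docker; then
  IN_DOCKER_GROUP=yes
else
  IN_DOCKER_GROUP=no   # add yourself with: sudo usermod -aG docker "$USER_NAME"
fi
echo "in docker group: $IN_DOCKER_GROUP"
```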
When Docker is not working properly, check whether its daemon is running and maybe restart the service:
# systemctl status docker
● docker.service - Docker Application Container Engine
Loaded: loaded (/lib/systemd/system/docker.service; enabled)
Active: active (running) since Thu 2019-02-28 19:41:47 CET; 3 weeks 3 days ago
Then create the container again using docker-compose up.
Why a simple npm start doesn't work
The package.json file shows that the start script exists, but running it alone isn't enough: looking at the docker-compose.yml file, we see that it creates 5 containers, for its mongo database as well as GraphQL and the frontend/backend. Without Docker, it wouldn't be possible to set up that many services so quickly; you'd need to install and configure them all manually.
In the end, your system can get messed up with software when you play around with different stacks or develop for multiple open source projects. Docker is a great way to deploy modern applications while keeping them flexible and separated. It's worth getting started with this technology.
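To make that concrete, a docker-compose.yml in this style looks roughly like the fragment below. This is not the actual F8 file (which defines five services); the service names, image, and paths here are assumptions for illustration:

```yaml
# Illustrative fragment only - not the actual F8 docker-compose.yml.
version: "3"
services:
  mongo:
    image: mongo           # the app's database
    ports:
      - "27017:27017"
  graphql:
    build: ./server        # path is a placeholder
    depends_on:
      - mongo              # mongo is started automatically before graphql
```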

Can I Install Docker Over cPanel?

Can I install Docker on a server with cPanel and CentOS 7 pre-installed? Since I'm not familiar with Docker, I'm not completely sure whether it will interfere with cPanel. I already have a server with CentOS 7 and cPanel configured, and I want to know whether I can install Docker on top of this configuration without messing anything up.
Yes, you can install Docker on cPanel/WHM just like on any other CentOS server/virtual machine.
Just follow these simple steps (as root):
1) yum install -y yum-utils device-mapper-persistent-data lvm2 (these should already be installed...)
2) yum-config-manager --add-repo https://download.docker.com/linux/centos/docker-ce.repo
3) yum install docker-ce
4) systemctl enable docker (enable Docker at boot)
5) systemctl start docker (start the Docker service)
The guide above is for CentOS 7.x. Don't expect to find any Docker-related references or options in the WHM interface; you will be able to control Docker from the command line in an SSH shell.
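A minimal sanity check from that SSH shell might look like this sketch (run after the steps above):

```shell
# Verify the install from an SSH shell (run as root on the cPanel server).
DOCKER_BIN="$(command -v docker || true)"
if [ -n "$DOCKER_BIN" ]; then
  systemctl is-active docker 2>/dev/null \
    || echo "docker service not active - run: systemctl start docker"
else
  echo "docker CLI not on PATH yet - re-check step 3"
fi
```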
I have some docker containers already running on my cPanel/WHM server and I have no issues with them. I basically use them for caching, proxying and other similar stuff.
And as long as you follow these instructions, you won't mess up any of your cPanel/WHM services/settings or existing cPanel accounts/settings/sites/emails, etc.
Not sure why you haven't tried this already!
I've been doing research and working on getting Docker working on cPanel. It's not just about getting it to work on a CentOS 7 box, but rather about making it palatable to the cPanel crowd in the form of a plugin. So far I can confirm that it's absolutely doable. Here's what I've accomplished and how:
Integrate Docker Compose with cPanel (which is somewhat a step further than WHM)
Leverage the user-namespace kernel feature in Linux so Docker services can't escalate their privileges (see userns-remap)
Leverage Docker Compose so users can build complex services and start ready apps from the store with a click
Make sure services started via Docker run on a non-public IP on the server; everything gets routed via ProxyPass
cPanel has been gracious to provide a Slack channel for people to discuss this upcoming plugin. I'd be more than happy to invite you if you'd like to be kept updated or to contribute. Let me know!
FYI, there's more info at https://www.unixy.net/docker if you're interested. Please note that this plugin is in private beta, but I'm more than happy to let people use it!
Yes you could; in fact, someone else has already done it: https://github.com/mirhosting/cPanel-docker

How to install docker-engine using docker binary without internet connection

I have downloaded Docker binary version 1.8.2 and copied it to my backup server (a CentOS server) which doesn't have internet connectivity. I marked it as executable and started the Docker daemon as described in https://docs.docker.com/engine/installation/binaries/. But it doesn't seem to be installed as a docker service: for every command, I have to run sudo ./docker-1.8.2 {command}. Is there a way to install docker-engine as a service? Currently sudo docker version shows command not found. I'm a newbie to Docker setup. Please advise.
Why not download the rpm package (there are also CentOS 6 packages), copy it to a USB stick and then to your server, and simply install it with the rpm command? That way you'd get the same installation as if you had run yum.
Of course you may have some missing dependencies, but you could download all of those as well.
Firstly, if you're downloading bare binaries onto an enterprise Linux system, you're probably doing things in a very bad way: you immediately break updates and consistency, and leave your system in a risky, messy state.
Try using yumdownloader --resolve to get the Docker package and anything it needs.
A better option may be to mirror the installation artifacts and grab them from the local mirror, but that's beyond scope if you're not doing this already.
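On a connected CentOS machine of the same release, that could look like the sketch below. The package name docker-engine and the destination directory are assumptions; the actual name depends on which repo you use:

```shell
# Run on an internet-connected CentOS box of the same release.
YUMDL_BIN="$(command -v yumdownloader || true)"   # provided by the yum-utils package
if [ -z "$YUMDL_BIN" ]; then
  echo "yumdownloader not found - run: yum install yum-utils"
else
  # --resolve also downloads the dependencies; package name may differ per repo
  yumdownloader --resolve --destdir=/tmp/docker-rpms docker-engine
  # Copy /tmp/docker-rpms to the offline server (USB stick), then install there:
  # rpm -ivh /tmp/docker-rpms/*.rpm
fi
```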
