Using Telnet in Jenkins

We are currently running several projects on several servers. For our workflow we installed Telnet to communicate with these servers. More than this, we run the build/make mechanism (an in-house PHP development) on these servers.
Now we would like to introduce a CI mechanism. We have therefore installed a central Jenkins application from which we would like to trigger all these builds. Our standard way is to open a telnet connection and trigger the build command on the shell. Now we would like to adopt this mechanism and bring it to Jenkins.
Is there a possibility to open a Telnet connection via Jenkins?
Installing a Jenkins slave on the remote servers is not our preferred option, because we don't want to install more software on the servers than necessary.

Yes, it is possible. One possibility is to use a pipeline that includes a Groovy program issuing telnet commands; another is to use ssh commands in a freestyle job.
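For illustration only, the socket-level idea behind such a telnet step can be sketched in Python: open a connection, send the build command, read the reply. Everything here is an assumption for a self-contained demo; in particular, a local stub server stands in for the real telnet daemon, and the host, port, and `php build.php` command are placeholders.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 4023  # placeholders for the real build server

# Local stub that stands in for the remote telnet daemon in this demo.
server = socket.create_server((HOST, PORT))

def stub_server():
    conn, _ = server.accept()
    with conn:
        cmd = conn.recv(1024).decode().strip()
        conn.sendall(f"BUILD OK: {cmd}\n".encode())

threading.Thread(target=stub_server, daemon=True).start()

def trigger_build(command: str) -> str:
    """Open a telnet-style connection, send one command, return the reply."""
    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        sock.sendall(command.encode() + b"\n")
        return sock.recv(1024).decode().strip()

reply = trigger_build("php build.php")
print(reply)
```

The same send/receive loop could be written as a Groovy script inside a pipeline step; the protocol logic is identical.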

Related

Functionality to start and stop jobs

I am not sure something like Jenkins is built for this functionality, but I am curious if it is possible.
Say I have some sort of code that I want to run from 7am until 7pm. Typically Jenkins jobs are done when whatever process is complete, like a Python script exiting.
My goal would be to have a script that will run indefinitely and be terminated by Jenkins at a certain time. Doing this would still allow me to see the nice web UI, remotely start it, easily add hooks, etc.
Is this possible in Jenkins, or is there another platform like Jenkins that supports this type of functionality? Basically, instead of using Jenkins for 'builds', you would be using it to control services.
I have completely replaced Linux Cron with Jenkins, so yeah you can do what you're wanting to do. Only limited by your imagination :)
I have all of my Linux servers configured as nodes (via ssh) within Jenkins, and the connecting account on each of them has sudo privileges, so I can essentially have Jenkins do anything I want it to, at the OS level.
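If the stop time should be enforced from inside the job rather than by Jenkins killing it, the build step can watch the clock itself. A minimal sketch, assuming the work done per tick is a placeholder; the demo uses a deadline a couple of seconds away instead of 19:00 so it finishes quickly:

```python
import datetime
import time

def run_until(deadline: datetime.datetime, poll_s: float = 1.0) -> int:
    """Loop doing the service's work until the wall clock passes `deadline`."""
    ticks = 0
    while datetime.datetime.now() < deadline:
        # ... real service work would go here ...
        ticks += 1
        time.sleep(poll_s)
    return ticks

# A real job would compute e.g. today at 19:00; the demo stops after ~2 s.
deadline = datetime.datetime.now() + datetime.timedelta(seconds=2)
ticks = run_until(deadline, poll_s=0.5)
print(f"service loop ran {ticks} ticks before the deadline")
```

Either approach works; enforcing the deadline in the script keeps the job self-terminating even if the Jenkins-side timeout is misconfigured.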

How can I deploy to Google Compute Engine via CI/CD

I have a jar and a Docker image that I wish to deploy to my Compute Engine instance, then run docker compose down/up once they are there. I can use git on the instance if that helps.
I want to do this using CI/CD tools, something like Google cloud build, gitlab, bitbucket pipelines. Ideally something that has a free tier.
I am aware this might be a bit vague, so am willing to add more details if necessary
In your case you can try Jenkins and use an SSH plugin to execute commands on your remote instance and send the files. There are some considerations that you might want to take into account before doing that:
1.- Add your ssh keys in the metadata for that instance.
2.- Make sure your firewall rules allow incoming traffic on port 22.
Once your instance allows incoming traffic on port 22 and you've installed the SSH plugin, you just have to type the commands (docker-compose up/down) in the SSH section added by the plugin.
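The SSH section of such a job essentially boils down to one remote command string. A hypothetical helper that composes, but deliberately does not execute, that invocation; the user, host, and compose directory are made-up placeholders:

```python
import shlex

def ssh_deploy_cmd(user: str, host: str, compose_dir: str) -> list[str]:
    """Compose the ssh invocation that restarts the stack on the instance."""
    remote = (f"cd {shlex.quote(compose_dir)} && "
              "docker-compose down && docker-compose up -d")
    return ["ssh", f"{user}@{host}", remote]

cmd = ssh_deploy_cmd("deployer", "my-instance", "/opt/app")
print(cmd[1])  # deployer@my-instance
# subprocess.run(cmd, check=True) would actually run it over port 22.
```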

Deploying code on multiple servers with Jenkins

I'm new to Jenkins, and I would like to know if it is possible to have one Jenkins server deploy / update code on multiple web servers.
Currently, I have two web servers, which are using python Fabric for deployment.
Any good tutorials will be greatly welcomed.
One solution could be to declare your web servers as slave nodes.
First, give Jenkins credentials for your servers (login/password, or ssh login plus private key or certificate). This can be configured in the "Manage credentials" menu.
Then configure the slave nodes. Read the doc.
Then create a multi-configuration job. First you have to install the matrix-project plugin. This will allow you to send the same deployment instructions to both your servers at once.
Since you are already using Fabric for deployment, I would suggest installing Fabric on the Jenkins master and having Jenkins kick off the Fabric commands to deploy to the remote servers. You could set up the hostnames or IPs of the remote servers as parameters to the build and just have shell commands that iterate over them and run the Fabric commands. You can take this a step further and have the same job deploy to dev/test/prod just by using a different set of hosts.
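The iterate-over-hosts idea above can be sketched like this; the hostnames and the Fabric task name `deploy` are assumptions, and with `dry_run` the commands are printed rather than executed:

```python
import subprocess

def fabric_deploy(hosts: list[str], task: str = "deploy",
                  dry_run: bool = True) -> list[str]:
    """Run one Fabric invocation per host, as a Jenkins shell step would."""
    cmds = [f"fab -H {host} {task}" for host in hosts]
    for cmd in cmds:
        if dry_run:
            print(cmd)                                # show what would run
        else:
            subprocess.run(cmd.split(), check=True)   # actually deploy
    return cmds

# The host list would arrive as a Jenkins build parameter, e.g. "web1 web2":
cmds = fabric_deploy("web1.example.com web2.example.com".split())
```

Switching the parameter value to a different host set is all it takes to point the same job at dev, test, or prod.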
I would not make the web servers slave nodes. Reserve slave nodes for build jobs. For example, if you need to build a Windows application, you will need a Windows Jenkins slave. If you have a problem with installing Fabric on your Jenkins master, you could create a slave node that is responsible for running Fabric deploys and force anything that runs a Fabric command to use that slave. I feel like this is overly complex, but if you have a ton of builds on your master, you might want to go this route.

How can I run a script on my server using gcloud compute?

I'm deploying my Rails apps on Compute Engine, and my code is hosted at Github. I want to push changes to my master branch, and then execute a gcloud compute command to tell my instances to pull the master repository and restart nginx.
If I can't execute a script from SSH, what's the best way to tell my instances to update to the latest git commit and restart, so my apps are running on the latest codebase?
I've tried using the Release Pipeline, but it doesn't seem to work for Rails.
You can use a server automation system for something like this. For example:
Salt Stack allows remote command invocation as well as a thousand other useful server management features.
Ansible, which is built on top of SSH, is great for running commands remotely.
Most other server automation systems (chef, puppet) also provide some way to run a command remotely.
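For the gcloud route specifically, a one-shot update can be expressed as a single `gcloud compute ssh --command` invocation. A sketch that only composes the argument list; the instance name, zone, and repository path are placeholders:

```python
def gcloud_update_cmd(instance: str, zone: str) -> list[str]:
    """Compose a gcloud one-liner that pulls master and restarts nginx."""
    remote = ("cd /srv/app && git pull origin master && "
              "sudo service nginx restart")
    return ["gcloud", "compute", "ssh", instance,
            f"--zone={zone}", f"--command={remote}"]

cmd = gcloud_update_cmd("rails-app-1", "us-central1-a")
print(" ".join(cmd[:4]))  # gcloud compute ssh rails-app-1
# subprocess.run(cmd, check=True) would execute it against the instance.
```

A CI job (or a GitHub webhook handler) could run this after every push to master, which is essentially what the automation tools above generalize.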

Jenkins (Publish over SSH plugin)

I want to build my Maven project in Jenkins and copy all the jar files to a remote Unix machine.
I also want to connect to an LDAP data store, start the services, and test that they are up and running.
Basically, I want to do the following tasks after my project is successfully built in Jenkins:
1) Copy the current version of my project to the designated machine and location.
2) Copy the configuration to connect to a designated integration test DS.
3) Start the services in my project.
4) Test that they are running.
Can I achieve this with the Publish over SSH plugin provided in Jenkins?
Or shall I create some scripts to automate the above tasks? The reason I am asking is that I am not very familiar with Jenkins and Unix scripting.
Is there a good approach to this task?
Thanks in advance.
Ansia
The Publish over SSH plugin will allow you to copy files to a remote server and execute arbitrary commands on it.
The question is - do you know how you would achieve the following on the remote server?
2) Copy the configuration to connect to a designated integration test DS
3) Start the services in my project
4) Test that they are running
If yes, just enter those commands into the Publish over SSH configuration, or provide a script to be executed.
If you don't know how to achieve that, then that's a separate question.
Yes, you can use the Publish over SSH plugin to copy the jars and execute a script which launches your services. Take a look here to see how to launch a script "in the background" so it does not get killed when the session ends, or to avoid blocking the Jenkins build by making it wait for the script to finish executing.
I can't say much about LDAP as I haven't used it, but depending on your needs I guess you could create a basic helper jar with spring-ldap or any other similar library.
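The "launch in the background" pattern mentioned above, so the service survives the SSH session ending and does not block the Jenkins build, can be sketched as a detached subprocess. The command (`sleep 1` as a stand-in service) and the log path are placeholders:

```python
import os
import subprocess
import tempfile

def launch_detached(cmd: list[str], logfile: str) -> int:
    """Start a long-running service detached from our session (nohup-style),
    so it is not killed on hangup and the caller does not wait for it."""
    with open(logfile, "ab") as log:
        proc = subprocess.Popen(cmd, stdout=log, stderr=log,
                                start_new_session=True)  # own session: survives hangup
    return proc.pid

fd, log_path = tempfile.mkstemp(suffix=".log")
os.close(fd)
pid = launch_detached(["sleep", "1"], log_path)  # stand-in for the real service
print(f"started detached service, pid {pid}")
```

In a plain shell script the equivalent is `nohup ./start-services.sh >svc.log 2>&1 &`.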
