Cannot get authentication to work with svn:// protocol in Docker

I want to set up an SVN server in a Docker container. I am using https://github.com/MarkusH1975/svnserver.svn.mh.
I went through all the steps that are explained in the readme.md.
But I can only check out when anon-access in volume/svnrepo/myRepo1/conf/svnserve.conf is set to read:
[general]
anon-access=read
auth-acces=write
But then I cannot commit. When I try to commit, I always get the error:
svn: E170001: Authentication error from server: Internal server error in authentication
Only when I set anon-access=write can I commit.
I would love to be able to set anon-access=none, but then I can't even check out. What also puzzles me is that I do not get a password prompt when I do not specify a password:
> svn checkout --username myUsername1 svn://192.168.XXX.XXX/myRepo1
svn: E170013: Unable to connect to a repository at URL 'svn://192.168.XXX.XXX/myRepo1'
svn: E170001: Authentication error from server: Internal server error in authentication
Shouldn't there be a password prompt?
Background:
I want to set up the SVN server on my Synology NAS with DSM 7, so there may be something specific to that environment.
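For reference, here is a minimal sketch of the two files involved when authentication is enforced over svn://. The layout follows stock svnserve conventions; the realm name and the credentials are placeholders, not values taken from the linked image:

# volume/svnrepo/myRepo1/conf/svnserve.conf (sketch)
[general]
# no anonymous access at all; authenticated users may read and write
anon-access = none
auth-access = write
# user database, relative to this conf/ directory
password-db = passwd
# placeholder realm name
realm = myRepo1

# volume/svnrepo/myRepo1/conf/passwd (sketch)
# placeholder credentials for the user from the checkout command
[users]
myUsername1 = myPassword1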

Related

Jenkins: stderr: Permission denied (publickey) error not resolving

I have set up an SSH key on a CentOS 7 server and have also added the SSH key in my Bitbucket personal settings, but I am still getting a Jenkins error:
I have set up the SSH key on the CentOS 7 server where Jenkins is installed. When running ssh -v git@bitbucket.org in the terminal, I get a number of lines and it seems that the connection is established successfully. The terminal output was:
But I am still getting an authentication error.
I have tried other solutions like "Jenkins Shared Library: Permission denied (publickey)" and "Jenkins: stderr: Permission denied (publickey). fatal: The remote end hung up unexpectedly", but none of them worked. Please help me.
I think your repository URL is not correct. If you have set up an SSH connection in Bitbucket, then you should use the corresponding SSH URL.
Go to Bitbucket and click the Clone button on your repository.
Select SSH at the top right of the dialog that is displayed.
Copy the URL written after the git clone keyword. It will look like:
git@.....
That should work.
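For illustration, the two URL forms typically look like the examples below, and ssh -T is a quick way to confirm that the key Jenkins uses is accepted by Bitbucket; the workspace and repository names are hypothetical:

# HTTPS clone URL (does not use your SSH key):
#   https://bitbucket.org/yourworkspace/yourrepo.git
# SSH clone URL (what the Clone dialog shows when SSH is selected):
#   git@bitbucket.org:yourworkspace/yourrepo.git

# Run this as the user Jenkins runs under to verify the key is accepted:
ssh -T git@bitbucket.org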

Webhook execution failed: execution expired

I am trying to trigger a Jenkins build whenever there is a push to GitLab.
I am referring to https://github.com/jenkinsci/gitlab-plugin.
When I test the connection for the webhook, it shows "execution expired".
I am using:
Jenkins ver. 2.60.1
GitLab version 9.4.0-rc2-ee
GitLab plugin 1.4.6
The exact error message when clicking "Test setting" in GitLab:
We tried to send a request to the provided URL but an error occurred: execution expired
As mentioned in issue 128:
This looks and sounds like a configuration or network error.
Maybe your machine is not publicly available on the webhook address (firewall etc).
For instance, on a DigitalOcean server, you would need to open up the port (mentioned in git-auto-deploy.conf.json) in the firewall:
sudo ufw allow 8866/tcp
Double-check, though, what you put in Manage Jenkins > Configure in terms of GitLab information (connection name, host URL, credentials), as mentioned in jenkinsci/gitlab-plugin issue 391.
See "GitLab Integration Jenkins: Configure the Jenkins server".
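As a quick sanity check of that reachability assumption, you can probe the webhook URL from the GitLab host (or any machine outside your network); the host, port, and project name below are placeholders:

# A timeout here matches the "execution expired" symptom; any HTTP response
# (even 403) means the endpoint is reachable and the problem lies elsewhere.
curl -sv --max-time 10 http://your-jenkins-host:8080/project/your_jenkins_project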
It means there are issues between the Jenkins server and the GitLab or GitHub server.
Here is what I did:
I had set my local-IP:port/project/jenkins_project_name,
http://192.168.1.21:8080/project/jenkins_project_name
and put that URL in the GitLab webhook. It shouldn't work, right?
Because it's an IP that's private and not routable.
So later I realized this, set the public IP, and then the hook worked:
http://public_IP:8080/project/jenkins_project_name
Note: for the public IP to be routable, you have to expose the port in your router (e.g. 8080 in my case, or whatever port you use).
Hope this works.
I have faced the same issue.
In my case Jenkins is running on an AWS EC2 instance. I resolved the issue by whitelisting the public IP addresses of GitLab on port 443 in the instance's security group.
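If the security group is managed with the AWS CLI, the corresponding rule looks roughly like this; the group ID and the GitLab address range are placeholders to replace with your own values:

# Allow inbound HTTPS from a GitLab address range to the Jenkins instance
aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789abcdef0 \
    --protocol tcp \
    --port 443 \
    --cidr 192.0.2.0/24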

How to clone Docker images from a local JFrog Artifactory repo to a remote one over HTTPS

Please excuse me if the answer is obvious, since I am a newbie at configuring JFrog Artifactory.
I have read that I can configure replication from my local repository to a remote one, to keep Docker images synchronized in both repositories.
The configuration of this replication seems quite straightforward, but the problem is that the remote repository is accessible only over HTTPS and I have to authenticate to get access to it.
So I am setting a valid username and password in the replication config, but I constantly receive the message "Unable to identify target URL as an Artifactory instance: HTTP/1.1 401 Unauthorized". What am I doing wrong?
Thank you in advance!
This is how I am trying to configure the replication.
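One way to narrow down a 401 like this is to test the same credentials against the target Artifactory's REST API directly; the hostname below is a placeholder, and api/system/ping is a standard Artifactory endpoint:

# Expect "OK" if the URL points at an Artifactory instance; a 401 here with
# the same username/password suggests the credentials (or a required API key
# or access token) are the problem rather than the replication setup itself.
curl -u myuser:mypassword https://remote.example.com/artifactory/api/system/ping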

Host key verification failed using GitLab and Jenkins

I get a "Host key verification failed" error whenever I try to put my GitLab Git address into Jenkins.
I've tried:
- using multiple different SSH paths, including removing : and replacing it with /. I also tried HTTP.
- SSHing in and running the command in the terminal; when prompted with y/n, I pressed Y.
- It works with GitHub.
- Going to my jenkins/.ssh/ida_pub and adding my keys.
Failed to connect to repository : Command "/usr/local/git/bin/git ls-remote -h git#:/.git HEAD" returned status code 128:
stdout:
stderr: Host key verification failed.
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
If you make the repo you are trying to connect to public in GitLab (Settings -> Edit Project -> Public mode), you should be able to connect using HTTP (but only HTTP).
If the repo is not public, you will need to install an SSH key on Jenkins that has permission to access the repo. My understanding is that the Jenkins Git plugin does not currently use the SSH credentials already stored in Jenkins, so you will need to install the key on the master and the slaves that will run this build. How you do this will depend on your OS, but I find it easiest to use an SSH config file on Linux.
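As a rough sketch of that approach, the key can be installed for the user Jenkins runs as, and the GitLab host key pre-populated so verification succeeds non-interactively; the Jenkins home, host name, and key path below are assumptions:

# Add the GitLab host key once on the master and on each slave that runs the build:
ssh-keyscan gitlab.example.com >> /var/lib/jenkins/.ssh/known_hosts

# /var/lib/jenkins/.ssh/config -- point SSH at the key that has access to the repo:
Host gitlab.example.com
    User git
    IdentityFile /var/lib/jenkins/.ssh/id_rsa
    IdentitiesOnly yes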

BitBucket trigger Jenkins build via POST not working when Windows auth enabled

In BitBucket, I have a POST service set up in hopes of remotely triggering a build on my Jenkins CI server, but when I push to my BitBucket account, it does not appear to work.
I have Windows authentication enabled on the server that hosts Jenkins.
I tried supplying the POST service with this URL:
http://username:password@CiBuilderServer.com/job/MyProject/build
This URL works in my browser. I tried curl and received this error:
curl: (47) Maximum (50) redirects followed
I'm guessing however BitBucket makes the request, the credentials are not sent correctly.
IIS Log:
x.x.x.x POST /job/MyProj/build - 80 - x.x.x.x Bitbucket.org 401 2 5 62
Any ideas or workarounds?
Enabling both Basic auth and Windows auth did the trick. I know Bitbucket is written in Python. Maybe Python and Windows auth don't play well together?
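Assuming Basic auth is now accepted, one way to test the trigger from the command line is to pass the credentials with -u instead of embedding them in the URL; the credentials are placeholders, and depending on the Jenkins configuration a CSRF crumb or an API token may also be required:

# Trigger the job over HTTP Basic auth (an API token can replace the password):
curl -u myuser:my_api_token -X POST "http://CiBuilderServer.com/job/MyProject/build"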
