I am a beginner with ArangoDB and I am trying to deploy my first project using it. The website is PHP-based. What I did is create an ArangoDB Docker container on DigitalOcean so that I can access it from the browser via the provided IPv4 address. Public access to port 8529 is enabled. Locally, I can modify the .config file to point to the corresponding IP, and I can painlessly retrieve data.
My hosting provider is one.com. When I upload the same files that run locally to my own domain, I get the following error:
["_database":"ArangoDBClient\Connection":private]=> string(7) "_system" } ArangoDBClient\ConnectException: cannot connect to endpoint 'tcp://xxx.xx.xxx.xxx:8529/': Connection timed out
I want to mention that I have also tried ArangoDB Oasis. No luck with it - I get the same error. I have been at this for quite a few weeks, and I could very much use some guidance, even just on what to try next, as I am out of ideas and documentation to read.
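In case it helps with diagnosing the timeout: a quick way to check whether the endpoint is reachable at all from a given machine is something like the following (the IP is the same placeholder as in the error, and the credentials are placeholders too):

    # Check raw TCP reachability of the ArangoDB port
    nc -zv xxx.xx.xxx.xxx 8529

    # Ask the ArangoDB HTTP API for its version; a JSON response means
    # the server is fine and the problem is on the client/network side
    curl --user root:mypassword http://xxx.xx.xxx.xxx:8529/_api/version

If this times out from one host while working from another, the traffic is most likely being blocked somewhere in between; many shared hosts only allow outbound connections on ports 80/443.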
I want to set up a local IMAP server within my home network for archiving emails. The server does not need to be accessible via the Internet, so I can do without SSL-secured access (if that makes things easier). I want to integrate the server into my current Docker setup, so it has to run within a Docker container.
I already tried the following containers:
https://hub.docker.com/r/blackflysolutions/dovecot
https://hub.docker.com/r/dovecot/dovecot
https://hub.docker.com/r/mailu/dovecot
https://hub.docker.com/r/mailcow/dovecot
https://hub.docker.com/r/eilandert/dovecot
But I could not get any of them to run. At the same time, none of them has a forum or anything similar where I could ask a question. Two of them (mailu/dovecot and mailcow/dovecot) are part of a bigger mail server package, which I do not need; I only want an IMAP server to store some email locally. But I tried them anyway.
Does anyone know how to get any of these to run? Or can you suggest another stable Docker container solution?
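For reference, my attempts with the plain dovecot/dovecot image looked roughly like this (a sketch, not a working setup: /etc/dovecot/dovecot.conf is the standard Dovecot config path, but the port mapping and volume locations are my own assumptions):

    # Run the official image with a custom config and a persistent
    # volume for the archived mail
    docker run -d --name imap-archive \
      -p 143:143 \
      -v "$PWD/dovecot.conf":/etc/dovecot/dovecot.conf:ro \
      -v "$PWD/maildata":/srv/mail \
      dovecot/dovecot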
Looking for a little help here. I am trying to bootstrap a small side business, and I have never been the DevOps guy. I use the web-hosted version of GitLab to store my codebase, but I am unable to use it as a registry for the Docker images I build from that code: the images are quite large, and pushing them back to the registry from the group gitlab-runner installed on my personal machine takes longer than the token expiration allows.

I have an extra machine sitting around, so I installed gitlab-ee on it and exposed it through a dynamic DNS service (No-IP). I then mirrored the repositories that I want to build images for on my locally hosted GitLab instance. At first I tried to use a runner on the same machine as the GitLab instance, but the jobs always failed because all available memory was consumed, which locked up the machine; the GitLab docs pretty much say not to run the runner and the instance on the same machine. So I went back to the runner I originally used with the web-hosted instance, but now I am having issues pushing to my local instance. When pushing to my repository (through the DDNS URL), I end up getting a lot of this:
e4fdbd3bf512: Retrying in X seconds
It eventually times out due to the job time limit or the token time limit. I am guessing this is because my connectivity is not great. What I would like to do is have the locally installed runner push to the local IP on my network, but I am unsure how to do this with the SSL setup. When trying to log in and push in my pipeline, I get the following error:
Error response from daemon: Get "https://xxx.xxx.xxx.xxx:xxxx/v2/": x509: cannot validate certificate for xxx.xxx.xxx.xxx because it doesn't contain any IP SANs
How do I correct this without affecting the https:// SSL that is already set up for accessing the instance through the DDNS name? I appreciate any help you can give me.
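For reference, the login/push step in my pipeline is essentially the following (a sketch using GitLab's standard predefined CI variables):

    # Log in to the instance's container registry, then build and push
    docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    docker build -t "$CI_REGISTRY_IMAGE:latest" .
    docker push "$CI_REGISTRY_IMAGE:latest"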
I abandoned my attempts at getting this to work. I ran through a bunch of scenarios of creating my own CA, creating certificates for the IP address, and sharing them with the other machine; ultimately, GitLab obscures some of this with Let's Encrypt. Funnily enough, it turned out to be just a connectivity issue where I was getting timeouts. I ended up hard-wiring both machines and getting better throughput, and I am now able to push ~6 GB Docker images up through the URL.
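For anyone curious, the certificate experiments looked roughly like this (an untested sketch; -addext needs OpenSSL 1.1.1 or newer, and the IP is the same placeholder as in the error above):

    # Self-signed certificate whose subjectAltName covers the bare IP,
    # which is exactly what the x509 error complains is missing
    openssl req -x509 -newkey rsa:4096 -nodes -days 365 \
      -keyout registry.key -out registry.crt \
      -subj "/CN=xxx.xxx.xxx.xxx" \
      -addext "subjectAltName = IP:xxx.xxx.xxx.xxx"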
I'm trying to access my Redis database via Grafana Cloud on my laptop. The database is a Redis container working as a cache on a different device (a Pi). Accessing the Redis database via a Python script on the remote device is no problem, but trying to connect to it via Grafana (using the Redis Data Source plugin) doesn't work as intended and throws a connection error. Unfortunately, the documentation leaves me kind of clueless about the specific cause (missing plugin dependencies?), so I'm thankful for every hint.
To be able to access the Redis server from Grafana Cloud, it has to be exposed to the Internet, as Jan mentioned.
If you run Grafana in a Docker container, it should be started in host network mode (https://docs.docker.com/network/host/) so that it can access Redis running on other devices.
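For example (a minimal sketch; the container name is illustrative):

    # --network host shares the host's network stack with the container,
    # so Grafana sees the LAN exactly as the host machine does
    docker run -d --name grafana --network host grafana/grafana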
If something is lacking or not clear in the Redis plugins documentation, please open an issue and we will update it: https://github.com/RedisGrafana/RedisGrafana/issues
I am using DDEV and Docker on Windows 10 Pro to set up a localhost install of Drupal 8.8 using Composer. I have set up and configured the local Drupal installation (it is a fresh install) and it appears to be running correctly, but in the admin section of the Drupal site I receive a warning to change the write permissions of sites/default/settings.php.
I tried to change the permissions using FileZilla, but it appears that local files in FileZilla do not expose write permissions; when I right-click the file in FileZilla, no permissions option appears.
Following troubleshooting tips from DDEV, I tried to access phpMyAdmin at https://mysitename.ddev.site:8036
Instead of loading phpMyAdmin, I got the following error message:
Secure Connection Failed
An error occurred during a connection to dmckimep.ddev.site:8036. SSL received a record that exceeded the maximum permissible length.
Error code: SSL_ERROR_RX_RECORD_TOO_LONG
The page you are trying to view cannot be shown because the authenticity of the received data could not be verified.
Please contact the website owners to inform them of this problem.
I've been searching around for a couple of hours now and have not found a solution. I ran ddev describe and all seems fine with the installation. The Drupal site in the container seems to run okay, and there are no port conflicts so far as I can tell, so I am not sure why I cannot access phpMyAdmin.
I am a relative newbie in terms of skills, but I have successfully maintained Drupal 4-7 on localhost with XAMPP and my web host. Now I am wrestling with the move to Drupal 8/Composer/Docker/DDEV. Any suggestions would be much appreciated.
Thank you!
Update 2022-09-14: DDEV has had https support for PHPMyAdmin and MailHog for years now; ddev describe will show you the URLs.
(Original answer) DDEV's PHPMyAdmin connection doesn't support https, just http. You can find the links for both PHPMyAdmin and MailHog using ddev describe; both are http-only, so the URL in your example should be http://mysitename.ddev.site:8036. It would be possible to provide https URLs for PHPMyAdmin and MailHog, but nobody has ever asked for them, and there's no security reason to do so.
Note that the key reason for https on the actual project URL is that real projects run behind https, and people need to catch problems like mixed content during development. There's no such need for PHPMyAdmin. That said, I'm sure that if people ever want https there, we'll do it; it's not hard to do.
Just as a general add-on: after ddev start you can run ddev launch -p to open PHPMyAdmin for the current project's database in the browser.
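Putting those commands together, the whole flow is:

    ddev start       # start the project's containers
    ddev describe    # lists the project's URLs, including PHPMyAdmin
    ddev launch -p   # open PHPMyAdmin for the current project in the browser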
I have set up a Node.js server and run it in a Docker container.
I am hosting the MySQL database on my computer and have the server connected to it.
I have deployed my website on Netlify, and it uses a REST API to send data to and retrieve data from the server.
Currently, the API URL is 'http://localhost:4941/api/v1...'. When I access the website on the same computer the database is hosted on, I can see the data retrieved from the server.
However, when I access the website on another computer (and from a different network), I am not able to see the data obtained from the server.
I have tried using 127.0.0.1 as well as the Docker container's IP address of 172.17.0.2, but neither solved the problem.
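For context, I run the container roughly like this (a sketch; the image name is a placeholder):

    # Publish the API port on the host so it is reachable from outside
    docker run -d -p 4941:4941 my-node-api

    # As far as I understand, both addresses I tried are machine-local:
    # 127.0.0.1 is the loopback of whichever machine runs the browser,
    # and 172.17.0.2 is Docker's internal bridge address on the host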
I expect to be able to use my website from any computer in the world that has Internet access, and to send and retrieve data from the server.
So, does the problem lie in the API URL using localhost? If so, what address should I use instead?