I'm using Magento 2 with Docker. It works fine, but my .less files are not compiled by Grunt. Could anyone share a tutorial link or suggest how to configure Grunt for Magento 2 inside the Docker setup?
Thanks in advance.
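For reference, the standard Magento 2 Grunt workflow, if run inside the PHP container, would look roughly like this (a sketch; the docker-compose service name phpfpm and the theme key blank are only examples, adjust them to your own stack and theme):

# Enter the PHP container of the Magento stack ("phpfpm" is an assumed service name).
docker-compose exec phpfpm bash

# From the Magento root: activate the sample Grunt/npm configs that ship with Magento 2.
cp Gruntfile.js.sample Gruntfile.js
cp package.json.sample package.json
cp grunt-config.json.sample grunt-config.json   # if present in your Magento version
npm install

# Clean and regenerate static files for a theme registered in
# dev/tools/grunt/configs/themes.js ("blank" is the default example key).
grunt clean:blank
grunt exec:blank
grunt less:blank

# Optionally keep watching .less files and recompile on change.
grunt watch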
My PHP project is stored in WSL, accessed by PhpStorm installed on Windows, and run with Docker Desktop installed on Windows.
The project itself works fine, but running tests is not possible because PhpStorm cannot find the vendor autoload or phpunit.phar in the Test Framework configuration.
Setup:
Windows 10 with WSL2 Ubuntu 20.04 LTS
PhpStorm on Windows
Docker Desktop on Windows, Docker Compose files in WSL
Code in home folder in WSL (see following screens)
I read in some older threads that Docker Compose v2 needs to be enabled in Docker Desktop. It is:
Docker is configured inside PhpStorm and shows that the connection is successful (I know this works because things like Xdebug are working without any issues):
Notice that I configured a path mapping here for the project root.
in WSL: \\wsl$\Ubuntu\home\USERNAME\workspace\PROJECTNAME-web-docker
in Docker: /var/www/PROJECTNAME-web
I can see that those paths are correct by either logging into the Docker container or by checking the Service Tab of PhpStorm and inspecting files:
This is my CLI Interpreter using the docker-compose configuration:
It does not matter whether I use the existing container or let it start a new one.
The PHP version is always detected.
And finally the error inside of Test Framework:
Here I tried different things:
use composer autoloader or phpunit.phar
it doesn't matter if I use a full path /var/www... or just vendor/...
tried different path mappings here
clicking on refresh shows this error in a small popup
Cannot parse PHPUnit version output: Could not open input file: /var/www/PROJECTNAME-web/SUBTOPIC/vendor/phpunit/phpunit/phpunit
autoload.php is definitely available and correct, phpunit is installed and available.
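A quick way to double-check this from a fresh container of the same compose service (a sketch; the service name php is an assumption, adjust it to the service used by the CLI interpreter):

# Verify the files PhpStorm is looking for actually exist at the mapped paths inside the container.
docker compose run --rm php ls -l /var/www/PROJECTNAME-web/SUBTOPIC/vendor/autoload.php
docker compose run --rm php ls -l /var/www/PROJECTNAME-web/SUBTOPIC/vendor/phpunit/phpunit/phpunit

# PHPUnit itself should also report its version from inside the container.
docker compose run --rm php php /var/www/PROJECTNAME-web/SUBTOPIC/vendor/phpunit/phpunit/phpunit --version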
Maybe someone has a hint what is missing or wrong? Thanks!
EDIT:
How do I know that autoload is available or path mapping is correct?
I have Xdebug configured and running. When Xdebug stops in my code, I know that the path mapping is correct. The output of Debug -> Console for example shows stuff like this:
PHP Deprecated: YAML mapping driver is deprecated and will be removed in Doctrine ORM 3.0, please migrate to annotation or XML driver. in /var/www/PROJECTNAME-web/SUBTOPIC/vendor/doctrine/orm/lib/Doctrine/ORM/Mapping/Driver/YamlDriver.php on line 62
So I know the path mapping for Xdebug works, but it seems the Test Framework configuration does not like it.
So I'm trying to create a proof of concept: get the Jenkins API definition from Swagger, import it into Mockoon, and then dockerize it to run as a service in a Kubernetes cluster.
I downloaded the JSON from Swagger and was able to import it into the Mockoon desktop application. I then exported it to a file. I downloaded the Mockoon CLI and tried to run dockerize, but I get the following error:
mockoon-cli dockerize -d jenkins-mockoon-api.json -o ./Dockerfile
Error: This export file is too old and cannot be run with the CLI
Please re-export the data using a more recent version of the application
I tried re-exporting, but no luck. I'm starting to think I'm overcomplicating it. I just want to be able to deploy a pod running Mockoon with that API.
Thanks for the help!
I found the issue. When exporting, you have to specifically export as a Mockoon environment JSON, not as Swagger or anything else. Not a very good error message, though!
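Roughly, the flow that works is (a sketch; the file name, image tag, and port 3000 are just examples):

# Export from the Mockoon desktop app as a Mockoon environment JSON
# (not the original Swagger/OpenAPI file), e.g. jenkins-mock-env.json.

# Generate a Dockerfile from that environment file.
mockoon-cli dockerize -d ./jenkins-mock-env.json -o ./Dockerfile

# Build and run the image (the tag is arbitrary; 3000 assumes the mock environment listens on that port).
docker build -t jenkins-mock .
docker run -d -p 3000:3000 jenkins-mock

From there the image can be pushed to a registry and referenced in a Kubernetes Deployment like any other container.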
I would like to install Apache Airflow and Apache Beam together using either Docker or Docker Compose.
What I'm trying to accomplish: I currently have Apache Airflow DAGs with different tasks, and in one of those tasks I want to integrate an Apache Beam data pipeline. To run this workflow successfully I need to install the Apache Beam dependencies, for example pip install apache-beam and pip install apache-beam[gcp].
Currently, I'm using the Puckel Docker image for Apache Airflow. My preferences are:
1. https://github.com/apache/airflow
2. https://github.com/puckel/docker-airflow
I want to go with the first link, as it is the official image. If anyone has a solution with the Puckel Docker image instead, please help me understand how I can get the latest changes from the repository every time.
I'm new to Docker, Apache Airflow, and Apache Beam. I'm learning and growing slowly, so your help is much appreciated. Thank you so much! :)
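For reference, a minimal sketch of what extending the official image could look like (the Airflow tag 2.3.0 and the [gcp] extra are assumptions; pin whatever versions you actually need):

# Write a small Dockerfile that extends the official Airflow image with the Beam dependencies.
cat > Dockerfile <<'EOF'
FROM apache/airflow:2.3.0
RUN pip install --no-cache-dir apache-beam 'apache-beam[gcp]'
EOF

# --pull fetches the latest base image for that tag on every build,
# which also covers picking up the latest upstream changes each time.
docker build --pull -t my-airflow-with-beam .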
Technology used:
- Windows 10
- Docker for Windows
- DevilBox
- Drupal 8.6.4
(Optional tech: Cygwin to simulate Linux commands.)
When attempting to add a new module via URL or file upload in Drupal 8, the site asks me for FTP credentials, and I have no idea where to find or set them.
I have a basic install of DevilBox running a brand-new installation of Drupal 8. (DevilBox is a Dockerized PHP stack.)
To solve my problem I bypassed finding the FTP credentials.
I will change the accepted answer to the first correct answer that is not mine and is not a bypass.
First step: stop using Cygwin and use PowerShell instead.
Next step: navigate to the site's installation within DevilBox:
/devilbox/data/www/<yoursite>/htdocs
Then run the command: composer self-update
Followed by:
composer require drupal/<drupal module to add>
Magically, the module shows up on the Modules page in Drupal 8.
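Putting the steps together (a sketch; pathauto is only an example module name):

# From the DevilBox shell / PowerShell, inside the site's htdocs (adjust <yoursite>).
cd /devilbox/data/www/<yoursite>/htdocs

composer self-update

# Replace "pathauto" with whatever module you want to add.
composer require drupal/pathauto

# The module then appears on the Extend page (admin/modules), where it can be enabled.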
I have a Grails project to which I have added Elasticsearch dependencies.
Now I want to install the head plugin.
According to the head plugin documentation, the command for this is:
elasticsearch/bin/plugin -install mobz/elasticsearch-head
But I am not able to find the bin directory of ES.
So where is Elasticsearch installed in a Grails project?
Do:
cd /usr/share
and
elasticsearch/bin/plugin -install mobz/elasticsearch-head
You have a car with satellite radio in it. If there is a request to add a new channel to the satellite radio service, that has to be done at the broadcasting station rather than by installing any kind of component/tool in your car. :)
Similarly, when you say you added Elasticsearch dependencies to a Grails project, that does not mean you have an Elasticsearch server running inside the same application.
The documentation for the head plugin (here "plugin" means an add-on to the Elasticsearch server, not a Grails plugin) refers to the Elasticsearch server installation, which is where you can find a bin directory.
Your best approach would be running head as a standalone webapp if you have an Elasticsearch server running on localhost during development, or installing it as a plugin into the Elasticsearch installation wherever it actually lives. I hope I was able to convey the idea. :)
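As a rough example of the standalone route (the version 1.7.2 and the download URL are assumptions; match whatever version your Grails Elasticsearch plugin expects):

# Download and unpack a standalone Elasticsearch.
wget https://download.elastic.co/elasticsearch/elasticsearch/elasticsearch-1.7.2.tar.gz
tar -xzf elasticsearch-1.7.2.tar.gz
cd elasticsearch-1.7.2

# Install the head plugin into this installation
# (1.x syntax; on 2.x the dash is dropped: bin/plugin install mobz/elasticsearch-head).
bin/plugin -install mobz/elasticsearch-head

# Start the server and open the head UI at http://localhost:9200/_plugin/head/
bin/elasticsearch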
Do:
cd /usr/share
and
elasticsearch/bin/plugin install mobz/elasticsearch-head