I'm using a docker-yeoman image to launch a container and then running the Yeoman gulp-webapp generator to generate a project skeleton.
Once generated, I'm running it in the same container using:
docker run -it -p 9000:9000 -p 35729:35729 --rm -v $(pwd):/src cthulhu666/yeoman:gulp gulp serve
This bit is all working fine. I can view the site by visiting 192.168.99.100:9000 (I'm using Docker Toolbox so it's a VM rather than localhost).
But something strange is happening:
If I add body { background: red; } to /app/styles/main.scss, the change is detected by Gulp, the tasks are run and the background of the page goes red. So far so good.
However, if I then create /app/styles/partials/_test.scss and move the body selector into that file (importing it in main.scss with @import "partials/test";), no changes are detected by Gulp when I save the file. The only way I can get it to "notice" the changes is to re-save main.scss, at which point Gulp kicks in and the changes to _test.scss are also applied.
I tested this by installing the generator locally and re-running the same experiment. Sure enough, the changes are applied just fine whether they are made in main.scss or _test.scss, as expected.
Can anyone shed any light on why this isn't the case in the Docker container?
It seems to have something to do with the underscore in the filename. SCSS allows you to use a variety of naming conventions:
Filename: /partials/_test.scss
SCSS: @import "partials/_test";
Filename: /partials/_test.scss
SCSS: @import "partials/test";
Filename: /partials/test.scss
SCSS: @import "partials/test";
All three of the above import the partial correctly. However, only the last allows Gulp to detect changes, and only inside the Docker container; all three work fine locally.
I'd still be interested to know why this is, if anyone has any ideas :)
VirtualBox shared folders don't work very well with file watching.
https://www.virtualbox.org/ticket/10660
https://github.com/mitchellh/vagrant/issues/707
(I'm sure you can find many others)
VirtualBox is what Docker Toolbox/Docker Machine uses to provide the VM.
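As a possible workaround, and only a sketch: if the watch task is backed by chokidar (gulp 4's watcher is; gulp 3 uses gaze and would need polling enabled in the gulpfile instead), you can force it to poll the filesystem rather than rely on inotify events, which don't cross the VirtualBox shared-folder boundary:
# The gulp serve command from the question, with polling forced via chokidar's
# environment variables (only honoured if the watcher is actually chokidar-based).
docker run -it -p 9000:9000 -p 35729:35729 --rm -v $(pwd):/src \
  -e CHOKIDAR_USEPOLLING=true -e CHOKIDAR_INTERVAL=300 \
  cthulhu666/yeoman:gulp gulp serve
The interval is a trade-off: a shorter value picks up changes faster but burns more CPU inside the container.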
Related
I received a Docker image as a .tar file from a friend, who told me it should contain all the dependencies for a program I've been struggling to get working, and that all I need to do is "run" the Docker file. The .tar is around 3.1 GB. The program it was set up to run is called OpenSimRT. The GitHub link to the project is as follows:
https://github.com/mitkof6/OpenSimRT
The Google Drive link to the .tar file is as follows:
https://drive.google.com/file/d/1M-5RnnBKGzaoSB4MCktzsceU4tWCCr3j/view?usp=sharing
This program has many dependencies; the big ones to note are that it runs on Ubuntu 18.04 and OpenSim 4.1.
I'm not a computer scientist by any means, so I've been struggling even with Docker basics like loading and running an image. However, I desperately need this program to work. If you have any steps or advice on how to run this .tar, I'd greatly appreciate it. Alternatively, if you can find a way to get OpenSimRT up and running and post those steps, I'd be more than happy with that solution as well.
I've tried the commands "docker run" and "docker load" followed by their respective tags, file paths, args, etc. However, even when I fix various issues, I always get stuck on a missing /var/lib/docker/tmp/docker-import-... (random numbers) file. The numbers change every so often as I try to solve the issue, but eventually I always end up with some variation of this error: Error response from daemon: open /var/lib/docker/tmp/docker-import-3640220538/bin/json: no such file or directory.
PS: I have already extracted the .tar, and there is no install guide, instructions, .exe, or installer application inside. As a result I'm not sure how to get the program installed and running.
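For reference, a minimal sketch of the usual load-and-run sequence for an archive produced by docker save (the filename and image name here are placeholders, not something read from your archive):
docker load -i opensimrt.tar      # register the image with the local Docker daemon
docker images                     # note the REPOSITORY and TAG of the newly listed image
docker run -it REPOSITORY:TAG     # run it using the name/tag shown by the previous command
Note that docker load expects the original, unmodified .tar exactly as docker save produced it; loading an archive that has been re-packed or produced some other way can fail with errors like the one above.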
I'm trying to do something a bit slick, and of course running into issues.
Specifically, I am using a Docker-based environment build system (Lando). I'm feeding it a Dockerfile that it then builds a cluster around, using docker-compose. Locally, this works fine. I also have it working great inside a GitHub Action to spin up a local-dev-identical environment for testing. Quite nice.
However, now I'm trying to expand the Dockerfile using the Dockerfile Plus extension. My Dockerfile looks like this:
# syntax = edrevo/dockerfile-plus
INCLUDE+ ./docker/prod/Dockerfile
COPY docker/dev/php.ini /usr/local/etc/php/conf.d/zzz-docker-custom.ini
# Other stuff here.
This works great for me locally, and I get the contents of docker/prod/Dockerfile included into my docker build.
When I run the same configuration inside a GitHub Actions workflow, I get a syntax error on the INCLUDE+ line, indicating that the extension is not being loaded. The extension relies on BuildKit, which (according to the project page) should be enabled by default on any recent Docker version. Yet whatever Docker is running on GitHub is apparently not using BuildKit. I've tried enabling it by setting the environment variables explicitly (as specified on the Dockerfile+ project page), but it still doesn't seem to work.
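For reference, the variables in question are the standard Docker ones; what I can't tell is whether the docker-compose invocation that Lando generates actually sees them:
export DOCKER_BUILDKIT=1            # have `docker build` use BuildKit
export COMPOSE_DOCKER_CLI_BUILD=1   # have docker-compose delegate builds to the docker CLI (and thus BuildKit)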
How can I get Dockerfile+ working in GitHub Actions? Of note, I do not run the docker build command myself (it's run by docker-compose, using files generated on the fly by Lando), so I cannot modify that specific command. But I didn't need to locally anyway, so I don't know what the issue is.
For years I've been running a Docker container on my local machine and using it as a remote Python interpreter via SSH in PyCharm. This works great for running my code (though 2022.2.1 brought a lot of new bugs that have slowly been ironed out). I'm now on 2022.2.3.
However, I'm having issues running unit tests. In the past (i.e. before version 2022.2.1), I could simply right-click my tests directory (a direct child of my main project directory), click Run Python tests in test..., and it would all work as expected.
Now, though, when I click this, I receive an error message about "No such file or directory."
I've tried everything I can think of: I've set up my path mappings in the Python test run config to exactly match those shown in my Python run config, and have tried every combination of directory and subdirectory in the mappings and working directory, but I always receive an error either about having an empty test suite (no tests found) or that the directory "must be in the project."
It seems like no matter what I do, PyCharm is trying to create a temp directory somewhere, or is trying to read from some temp directory that I never specified, because I see errors like this:
AssertionError: /tmp/pycharm_project_405/docker/tests: No such file or directory
Yet I never created, specified, or requested a temp directory of any sort, let alone one named /tmp/pycharm_project_405/; this is a mystery to me.
PyCharm with an SSH interpreter is rapidly becoming unusable for me and my team because we cannot figure out how to set this up. Can anybody please offer some guidance on what we need to do?
Thank you all so very much!
I tried:
Changing run config for Python tests to match the working directory and path mapping of Python run configs (which work)
Directly specifying the path to the tests from the container's perspective
Setting up run config templates
Specifying one directory up/down from the actual tests
Expected:
Unit tests to be found and run as they were in previous versions of PyCharm
Answer
Create a run config for testing
In the testing run config, set Target: to Custom
Set the correct remote interpreter
Set Working directory to the test folder
Set TWO path mappings: 1) map the code directory (in my case, the parent directory of the tests folder), and 2) map the tests directory itself (see the sketch after these steps)
Voila!!!
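As a rough illustration, with hypothetical paths rather than the real project layout, the two mappings look something like this (local path on the left, path inside the container on the right):
/home/me/myproject        ->  /opt/project          # 1) the code (parent) directory
/home/me/myproject/tests  ->  /opt/project/tests    # 2) the tests directory itself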
This one is quite strange.
I am running a very typical Docker container that holds a Rails API. Inside this API, I have an endpoint which takes an upload of a CSV and does some things and stuff.
Here is the exact flow:
vim spec/fixtures/bid_update.csv
# fill it with some data
# now we call the spec that uses this fixture
docker-compose run --rm web bundle exec rspec spec/requests/bids_spec.rb
# and now the csv is loaded and I can see it as plaintext
However, after creating this, I decided to change the content of the CSV, adding a column and a corresponding value to each row.
Now, however, when we run the spec again after saving, it has the old version of the CSV: the one originally used at the breakpoint in the spec.
cat-ing the CSV shows that it clearly has the new content.
Restarting the VM does nothing. The only solution I've found is to docker-machine rm dev and build a new machine (my main one for this is called dev).
I am entirely perplexed as to what could cause this, or what a simple fix might be (rebuilding the machine with all those images takes a while).
Ideas? Inform me I'm an idiot and I just had to press 0 for an operator and they would have fixed it?
Any help appreciated :)
I think it could be an issue with how VirtualBox shares folders with your environment. More information here: https://github.com/mitchellh/vagrant/issues/351#issuecomment-1339640
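One way to narrow it down (just a sketch; the shared-folder path inside the VM is hypothetical and depends on your setup) is to compare what the VM and the container actually see against what the host shows:
docker-machine ssh dev cat /path/to/shared/project/spec/fixtures/bid_update.csv   # what the boot2docker VM sees through the shared folder
docker-compose run --rm web cat spec/fixtures/bid_update.csv                      # what the container sees through the volume mount
If the VM already shows the stale content, the problem sits in the VirtualBox shared folder rather than in Docker itself.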
I have a project which for its tests runs:
./node_modules/.bin/mocha tests/**/*.coffee -r coffee-script/register -c
tests/ looks like this:
_helper.coffee
database-tests.coffee
routers/
index-router-tests.coffee
team-router-tests.coffee
On my Windows dev machine it works fine, running _helper.coffee first and then the rest of the files.
On my CI server running Debian it only tries to run routers/*, skipping anything in the root of the tests folder.
I am assuming that tests/**/*.coffee isn't right for Unix?
Moving my comments to an answer for others, since it appears to have fixed your problem.
I have had the same problem on Windows, where the glob does not return the files in the same order you see them listed on the drive. I have therefore used both tests/*.coffee and tests/**/*.coffee.
I found that Windows retrieves the files in the order they were likely written to the disk, while a directory listing shows them sorted for display. This seemed to be the problem I was encountering.
The parent directory ('tests') does not seem to be included when using tests/**, which seems to mean directories under the tests folder, not the tests folder itself.
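Applied to the command from the question, that looks something like this; quoting the globs makes mocha expand them itself instead of leaving it to the shell, whose ** handling differs between platforms:
./node_modules/.bin/mocha "tests/*.coffee" "tests/**/*.coffee" -r coffee-script/register -c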