Airflow cannot access local files in Python code

I have a folder structure:
root_folder
- file.conf
- models
  - __init__.py
  - model1.py
- airflow
  - dags
    - dag1.py
In the layout above, dag1 imports model1. The import breaks because models/__init__.py loads file.conf. I tried adding the root folder to sys.path via sys.path.append, but that doesn't solve the problem. I also tried relative paths from __init__.py, but it still cannot locate the file. What is a good way to bundle your own code with Airflow code?
After trying various ways to make it work, what seems to work for me is building an absolute path to the file, e.g. os.path.abspath(os.path.join(__file__, "../..") + '/file.conf'). If you know a better way, please answer below. Thanks :)
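A minimal sketch of that approach, assuming the config loading happens in models/__init__.py (the file the question says reads file.conf); the variable names are illustrative:

# models/__init__.py -- illustrative sketch, not the original code
import os

# Build an absolute path to file.conf relative to this file, so the result
# does not depend on whatever working directory Airflow happens to use.
CONF_PATH = os.path.abspath(
    os.path.join(os.path.dirname(__file__), os.pardir, "file.conf")
)

with open(CONF_PATH) as conf_file:
    raw_conf = conf_file.read()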

You can add an __init__.py file inside the root_folder:
root_folder
- __init__.py
- file.conf
- models
  - __init__.py
  - model1.py
- airflow
  - dags
    - dag1.py
Then add the path to your root_folder to $PYTHONPATH.
Finally, in dag1.py you can import the package modules, for example:
import models.model1
Note that file.conf itself is not a Python module, so it cannot be imported; instead, build its path relative to the package, as in the sketch below.
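A sketch of what dag1.py could then look like (only models.model1 comes from the question; the config handling and variable names are assumed):

# dags/dag1.py -- sketch assuming root_folder is on PYTHONPATH
import os

import models            # package defined in root_folder/models
import models.model1     # the module dag1 actually needs

# file.conf is plain configuration, not a Python module, so locate it on
# disk relative to the models package instead of importing it.
CONF_PATH = os.path.join(os.path.dirname(models.__file__), os.pardir, "file.conf")

with open(CONF_PATH) as conf_file:
    raw_conf = conf_file.read()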

Related

Importing module from package raises ModuleNotFoundError for subpackages

I'm trying to create a package (that includes several subpackages) for reuse and distribution. The plan is to provide a CLI entry point to allow easy launch. After building the package and installing it in a virtualenv, I get a ModuleNotFoundError for imports from the subpackages included in the main package.
I think this has something to do with setting the right paths in __init__.py, but having read multiple examples on the web, I'm still rather confused as to why anything should go in __init__.py and what that something is.
The package is built so that the package name (and thus folder created in site-packages) is the same as the root in the directory structure below.
The directory structure is (simplified and with names changed):
mypackage
|- __init__.py
|- entrypoint.py
|- subpackage1
|-- __init__.py
|-- module1.py
|- subpackage2
|-- __init__.py
|-- module2.py
Note that all the __init__.py files are empty.
And entrypoint.py is:
from subpackage1.module1 import foo
from subpackage2.module2 import baz
if __name__ == "__main__":
    pass
In my pyproject.toml, I define:
[project.scripts]
mypackage-cli = "mypackage:entrypoint"
After installing with pip, I run (in a virtualenv where I pip installed the package):
(myvenv) me@mymachine ~ % mypackage-cli
But I get:
ModuleNotFoundError: No module named subpackage1
Two things to note:
- When running the source locally, I have no issues.
- If I edit the files in site-packages to use from mypackage.subpackage1.module1 import foo, I no longer get the error when running the installed package; but when I try to run with the same modified imports (i.e. changing to import mypackage.subpackage1.module1) locally in my dev env, I get a ModuleNotFoundError.
What's the correct way to get the imports to work when packaged and when running locally in my dev env?
Thanks!
Your "top-level importable package" seems to be mypackage so all your import statements should start from there. For example from mypackage.subpackage1.module1 import foo.
In order to avoid confusion between "local" and "installed" (in site-packages), I recommend you to use the so-called "src-layout" for this project's directory structure. In combination with the "editable" installation, it is very convenient to work with.
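As a sketch of the suggested fix (module and function names taken from the question), entrypoint.py would use absolute imports rooted at the top-level package:

# mypackage/entrypoint.py -- absolute imports rooted at the top-level package
from mypackage.subpackage1.module1 import foo
from mypackage.subpackage2.module2 import baz

if __name__ == "__main__":
    pass

With the src layout, the same files would live under src/mypackage/, and an editable install (pip install -e .) makes the local checkout resolve imports exactly like the installed package, so these absolute imports work both locally and after installation.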

Prevent the .env file being copied to a Docker container

I'm a newbie in Docker and I'm trying to make a "starter boilerplate" with Express and MongoDB, but I don't want the .env file in my Docker image/containers. I've tried a .dockerignore file, but the .env file still ends up inside the container on every docker-compose build.
My .dockerignore file:
node_modules
.env
Link of the repo to reproduce the case:
https://github.com/josuerodcat90/expressjs-server-starter/tree/docker
Any ideas or suggestions? Thanks in advance.
I found a solution for cases like this, where you need to keep your Docker volumes working without your .env file clashing with your docker-compose rules (note that .dockerignore only filters the build context, so a bind-mounted volume can still bring the .env file into the container).
First, I declared two new variables in my .env file:
# can switch between DOCKER and LOCAL
ENV=DOCKER
# for example
DOCKER_PORT=3000
Then I made a small change in my index.js and database.js files: when running in Docker, the environment variables need to differ from my local ones, and that solves the problem.
Thanks to everyone for your help and interest in my issue.
Create a .dockerignore file if you don't already have one and add **/*.env to it. This prevents any .env file, in any folder, from being copied into the image during the build.
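For reference, the resulting .dockerignore, combining the entries from the question with the broader pattern, would look like:

node_modules
**/*.env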

Dart cannot import files

I'm trying to create a simple web app with Dart, but I cannot import anything that is outside the current directory or in a subdirectory. I'm using VS Code.
My project structure:
/chatapp
  /web
    /main.dart
  /bin
    /server.dart
  /views
    /view.dart
    /chatsignin.dart
Now I can import view.dart in chatsignin.dart, but I cannot import view.dart into main.dart or bin/server.dart, or import any other file into a file in a different directory.
I tried importing with the package name:
import 'package:chatapp/';
But VS Code shows no subdirectories or files under it.
I tried pub get and restarted VS Code multiple times, but it doesn't work.
After reading the docs I found that all the files and folders should be inside the web directory.
I'm embarrassed to have made such a silly mistake.

Create docker directory structure based on local directory structure

I'm trying to write a Dockerfile. From a directory structure such as:
something/Keep1
something/Keep2
something/... # lots of other files
otherthing/Keep1
otherthing/Keep2
otherthing/...
I would like my Dockerfile to copy in the Keep1 and Keep2 files, without the rest of the files and folders.
Ideally this would be a simple COPY or ADD command, but I can't seem to get that to work - e.g. COPY */Keep1 ./ produces only a single Keep1 in the current directory, presumably the last one copied.
I could copy the whole folder structure and delete anything that isn't Keep1 or Keep2, but I don't know how that would affect storage space - presumably all the files would still exist in the earlier layer?
(Probably not important, but for context: what I'm actually trying to do is copy in Gemfile and Gemfile.lock files so that I can run bundler from the project folders during Docker's build phase and save spin-up time. The Dockerfile bridges several Ruby projects that need to coexist within the same container, but I don't want to hardcode those folder names.)

Importing a SASS file from outside of a rails app

I'm trying to @import an external SASS file into my Rails app, with no luck.
The folder structure is as follows
- /
  - External_dir
    - sass
      - main.sass <-- file I need
  - Rails_App
    - app
      - assets
        - stylesheets
          - app.sass <-- file where it should be imported
I've tried using relative paths (../../../../External_dir/sass/main), absolute paths (/External_dir/sass/main) and symlinks, but nothing works. Does anyone have any ideas? I can't continue without these other styles, and I don't want to have to copy them over. Thanks in advance.
As cimmanon said in the comments, this is a duplicate of SASS: Import a file from a different directory?
The issue was that the @imports inside the external SASS files used relative paths that couldn't be resolved from the current directory. After updating those @imports to paths accessible from the current working directory, the SASS imported correctly.
