How do I embed the version from pyproject.toml so my package can use it?

The version in my project is stored in pyproject.toml:
[tool.poetry]
name = "package_name"
version = "1.2.3"
# ...
I now want to have a __version__ in package_name/__init__.py as well and the general suggestion seems to be:
import importlib_metadata
__version__ = importlib_metadata.version('package_name')
But that doesn't work for me. The moment I run my unit tests I get this error:
importlib_metadata.PackageNotFoundError: No package metadata was found for package_name
How can I make this work during development?

You can try adding this to __init__.py:
from ._version import version as __version__
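If that pattern isn't available, a common workaround during development is to ask importlib metadata for the version and fall back to a placeholder when the package metadata isn't installed. A minimal sketch (the fallback string is just an assumption, pick whatever suits your project):

# package_name/__init__.py
try:
    # Standard library since Python 3.8
    import importlib.metadata as importlib_metadata
except ImportError:
    # Backport for older Pythons
    import importlib_metadata

try:
    # Works once the package is installed (pip install -e . is enough)
    __version__ = importlib_metadata.version("package_name")
except importlib_metadata.PackageNotFoundError:
    # Not installed, e.g. running tests straight from a source checkout
    __version__ = "0.0.0.dev0"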

Related

Nim cannot open 'opencv/highgui'

I'm trying to use this library, https://github.com/dom96/nim-opencv, a wrapper for OpenCV.
After running nimble install opencv
I can see the package in ~/.nimble/pkgs/opencv-0.1.0/
But when attempting to use it, I get a compile error:
Error: cannot open 'opencv/highgui'
import opencv/highgui
const fileName = "g.jpg"
let img_colr = loadImage(fileName, 1)
echo img_colr.width
echo img_colr.height
Not sure what the problem is; something to do with Nim import paths?
Are you compiling with nimble or nim? If the former then make sure you've got requires "opencv" in your .nimble file.
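For illustration, a minimal .nimble file with that dependency might look like this (package name, version, author and description are placeholders):

# mypackage.nimble
version       = "0.1.0"
author        = "Your Name"
description   = "Example project using the OpenCV wrapper"
license       = "MIT"

requires "nim >= 1.0.0"
requires "opencv"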

Where to set the path for protoc to import the standard Protocol Buffers

Where do I need to set the path so that protoc can find the standard Protocol Buffers (protobuf) imports, like empty.proto and timestamp.proto, on Windows with Dart?
When protoc is run:
protoc --dart_out=grpc:lib/src/protos/generated -Iprotos
protos/organization.proto
--plugin=protoc-gen-dart=D:\Users\Samuel\AppData\Roaming\Pub\Cache\bin\protoc-gen-dart.bat
The following error is presented:
google/protobuf/empty.proto: File not found.
organization.proto: Import "google/protobuf/empty.proto" was not found or had errors.
organization.proto:14:27: "google.protobuf.Empty" is not defined.
In IntelliJ Settings, the Protobuf Support plugin is configured with the path where the standard protos (*.proto) are located.
Additionally, this path is defined in IntelliJ under Project Structure \ Global Libraries.
The organization.proto code imports google/protobuf/empty.proto to use the Empty class:
syntax = "proto3";
package auge.protobuf;
import "google/protobuf/empty.proto";
service OrganizationService {
rpc GetOrganizations (google.protobuf.Empty) returns (OrganizationsResponse) {}
}
The IntelliJ analyzer recognizes the import "google/protobuf/empty.proto" and the Empty class in the IDE, but protoc cannot find it.
The environment is:
OS: Windows 7 x64
protoc: libprotoc 3.6.1
Dart: 2.2.0-edge
Say you have /some/path/to/google/protobuf/empty.proto, you need to pass --proto_path=/some/path/to/ so protoc can locate it.
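On Windows the standard imports (empty.proto, timestamp.proto, ...) ship in the include directory of the protoc release archive, so one option is to pass that directory as an extra import path. A sketch, assuming protoc was unpacked to C:\protoc-3.6.1 (adjust to your installation):

protoc --dart_out=grpc:lib/src/protos/generated -Iprotos -IC:\protoc-3.6.1\include --plugin=protoc-gen-dart=D:\Users\Samuel\AppData\Roaming\Pub\Cache\bin\protoc-gen-dart.bat protos/organization.proto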
Use two dots before the protobuf folder path:
<Protobuf Include="..\Proto\protobuf.proto" ProtoRoot="Proto" />

PyLint not recognizing cv2 members

I am running pylint on an OpenCV project and I am getting many pylint errors in VS Code about members not being present.
Example code:
import cv2
cv2.imshow(....)
The errors obtained say that the cv2 members are not present.
However, the code runs correctly without any errors.
Versions: pylint 1.8.1, astroid 1.6.0
This is from pylint. You can generate a pylint config file with this command (keeping a .pylintrc in the root of your project rather than your home directory is helpful if you work in a team or on different computers from the same repo):
pylint --generate-rcfile > ~/.pylintrc
At the beginning of the generated .pylintrc file you will see
# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loading into the active Python interpreter and may
# run arbitrary code.
extension-pkg-whitelist=
Add cv2 so you end up with
# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loading into the active Python interpreter and may
# run arbitrary code.
extension-pkg-whitelist=cv2
Save the file.
The lint errors should disappear.
On VS Code: Ctrl + Shift + P
Choose "Preferences: Open Settings (JSON)"
Add this line to the JSON file:
"python.linting.pylintArgs": ["--generated-members=cv2.*"]
Done, it works for me.
Note: Make sure you choose "Preferences: Open Settings (JSON)", not "Preferences: Open Default Settings (JSON)"
The settings file would look like:
{
    "workbench.iconTheme": "vscode-icons",
    "python.dataScience.sendSelectionToInteractiveWindow": true,
    "kite.showWelcomeNotificationOnStartup": false,
    "python.dataScience.askForKernelRestart": false,
    "python.dataScience.jupyterServerURI": "local",
    "python.pythonPath": "/usr/bin/python3",
    "workbench.colorTheme": "Monokai",
    "vsicons.dontShowNewVersionMessage": true,
    "python.linting.pylintArgs": ["--generated-members=cv2.*"]
}
Try importing cv2 like this:
from cv2 import cv2
This happens because cv2 is a C extension, which pylint does not analyze by default.
Set this: extension-pkg-whitelist=cv2 and you're good to go.
However, it may still not detect the functions or modules implemented in cv2.
Here is the code snippet for the settings.json file in VS Code:
"python.linting.pylintArgs":["--extension-pkg-whitelist=cv2"]
I didn't have to change anything in the pylint config file like most of the answers here. My solution was to change the import statement to the form below:
from cv2 import cv2
After that, the cv2 members can be used!
In VS Code, edit the settings JSON (Ctrl+Shift+P, "Preferences: Open Settings (JSON)").
Then, paste the following into the JSON:
"python.linting.pylintArgs": [
... // previous arguments
"--generated-members=cv2.*"
]
Don't know why, but other solutions (allowlist, etc.) weren't working for me, and I didn't want to create the .pylintrc file.
I used the config settings below in VS Code's settings.json; it helped me avoid the unessential flags from pylint and also got IntelliSense for cv2 working.
If it doesn't work, try uninstalling and deleting the cv2 packages from the C:\Anaconda3\envs\demo1\Lib\site-packages folder and reinstalling the opencv-python package.
{
"python.linting.pylintEnabled": true,
"python.linting.enabled": true,
"python.linting.pylintArgs": [
"--extension-pkg-whitelist=cv2"
]
}

CocoaPods project depend on a local static C library

I have a locally built C library (.h and .a files) that I want to include in a Swift-based CocoaPods pod. How do I configure the podspec to depend on the .a files and the module.map? With a normal non-CocoaPods Xcode project, I simply drag in the directory that contains include and lib and then add a module.map. With CocoaPods I can't do this because pod install will overwrite the Xcode project file. s.library won't work because the static library isn't hosted anywhere. I tried s.vendored_libraries but module.map still remains unknown to Xcode, the end result being that import foo from my Swift files is an error.
Edit: I tried using preserve_paths, vendored_libraries and xcconfig as answered here. The issue is still how to import the module from Swift.
Edit 2: I also tried using module_map to point to my module.map file as documented here, but sadly CocoaPods 1.1.1 crashes ([!] Oh no, an error occurred.).
I got it working. In my case I'm depending on the libtiff C library, which is prebuilt for iOS (x86 and arm) using https://github.com/ashtons/libtiff-ios.
I used a subspec as outlined here. Here's the podspec subspec snippet, assuming the static library lives at libtiff off the root of the pod module.
s.subspec 'libtiff' do |libtiff|
libtiff.source_files = 'libtiff/include/*.h'
libtiff.public_header_files = 'libtiff/include/*.h'
libtiff.preserve_paths = 'libtiff/include/*.h'
libtiff.vendored_libraries = 'libtiff/lib/libjpeg.a', 'libtiff/lib/libpng.a', 'libtiff/lib/libtiff.a', 'libtiff/lib/libtiffxx.a'
libtiff.xcconfig = { 'HEADER_SEARCH_PATHS' => "${PODS_ROOT}/#{s.name}/libtiff/include/**" }
# you can't specify "libz" here, must specify "z", see https://github.com/CocoaPods/CocoaPods/issues/3232
libtiff.library = 'z'
end

Why do I get errors when installing from LuaRocks?

I am the developer of this small library: https://github.com/martin-damien/babel and I have a problem with Luarocks releases.
From source
When I install from source with Luarocks I have no problem:
$ luarocks make --local rockspecs/babel-1.2-2.rockspec
From the internet
But when deployed (tag master, add a new rockspec release, and publish to LuaRocks), I can't install using
$ luarocks install --local babel
Because I encounter the following error:
Installing https://luarocks.org/babel-1.2-2.src.rock...
Using https://luarocks.org/babel-1.2-2.src.rock... switching to 'build' mode
stat: cannot stat 'locales/zh-HK.lua': No such file or directory
Error: Build error: Failed installing locales/zh-HK.lua in /home/damien/.luarocks/lib/luarocks/rocks/babel/1.2-2/lua/locales/zh-HK.lua: locales/zh-HK.lua: No such file or directory
As you can see in https://github.com/martin-damien/babel/issues/14, the error occurs on different files (but so far only with locale files, not with the babel.lua file).
I have no idea why it randomly crashes like this, so if someone knows why or has an idea where it could come from...
Thanks in advance,
Damien
The location of the files in the build.modules table is (from the docs on the rockspec format):
relative to source.dir
Where source.dir is
source.dir (string) - the name of the directory created when the source archive is unpacked. Can be omitted if it can be inferred from the source.file field. Example: "luasocket-2.0.1"
and source.file is
source.file (string) - the filename of the source archive. Can be omitted if it can be inferred from the source.url field. Example: "luasocket-2.0.1.tar.gz"
You don't specify source.dir or source.file in your rockspec but you do set source.url (because you have to).
So you have source.url = https://github.com/martin-damien/babel/archive/v1.2-2.zip which (presumably) ends up with source.file = v1.2-2.zip and then source.dir = v1.2-2 but your zip file extracts into a babel-1.2 so luarocks can't find your source files. (The screenshot in the linked issue seems to indicate that luarocks uses source.file = v1.2.zip and the archive extracts to babel-1.2 but I'm not sure how that's possible.)
Add dir = "babel-1.2" to your rockspec's source table and I expect it will work.
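For reference, the resulting source table in the rockspec would look something like this (URL copied from the rockspec discussed above):

source = {
   url = "https://github.com/martin-damien/babel/archive/v1.2-2.zip",
   dir = "babel-1.2"
}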
