If I have a tarball, helloworld.tar.gz in a local directory, say /home/user/tarballs/, how can I make my bitbake recipe fetch from that directory?
My helloworld.bb is:
SECTION = "examples"
LICENSE = "Proprietary"
LIC_FILES_CHKSUM = "file://COPYING; md5=1b1b8016e15e07a2fec59623ebf12345"
SRC_URI = "file://helloworld.tar.gz"
but when I bitbake, I get the below warning message:
WARNING: Unable to get checksum for helloworld SRC_URI entry helloworld.tar.gz: file could not be found
I read that FILES and FILESEXTRAPATHS can influence the download path, but I'm not sure where/how to set them.
I did a bitbake -c show FILESEXTRAPATHS but got an error message:
ERROR: Nothing PROVIDES 'FILESEXTRAPATHS'
Well, if you want to fetch from a local directory, use e.g.:
SRC_URI = "file:///home/user/tarballs/helloworld.tar.gz"
The FILES and FILESEXTRAPATHS variables tell bitbake where to find files which are referenced as:
SRC_URI = "file://helloworld.tar.gz"
These files are searched for in the locations specified by those two variables. (More precisely, the directories listed in FILESEXTRAPATHS are searched, together with certain subdirectories of them named after the expanded values of DISTRO, MACHINE, ARCH, etc.)
FILES (and FILESEXTRAPATHS) are used to find files stored together with the metadata, i.e. under paths like meta-*/recipes-*/<name>/XXX.
See http://www.yoctoproject.org/docs/1.7/mega-manual/mega-manual.html#var-FILES and http://www.yoctoproject.org/docs/1.7/mega-manual/mega-manual.html#var-FILESEXTRAPATHS
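For example, a bare file:// entry is typically found in a subdirectory next to the recipe, such as files/ or one named after the recipe. A hypothetical layout (the layer and directory names here are made up for illustration):

meta-example/recipes-examples/helloworld/
|
+-helloworld.bb
|
+-files/
   |
   +-helloworld.tar.gz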
Better, you can keep your files in the directory where the .bb file is present and add the lines below to your .bb file.
FILESEXTRAPATHS_prepend := "${THISDIR}:"
SRC_URI = "file://helloworld.tar.gz"
FILESEXTRAPATHS_prepend tells bitbake that the files are present in the directory where the .bb file is.
Also, you can edit the .bb file as follows:
FILESEXTRAPATHS_prepend := "path_to_home_folder_of_source_folders:"
SRC_URI = "file://Source_folder/*"
For example:
FILESEXTRAPATHS_prepend := "/home/username/:"
SRC_URI = "file://tarballs/*"
Related
Trying to write a config.lua for lvim that will be split into different files, which will be included in config.lua with require('<package>'). Everything works if I try it in the .config/lvim/ directory, but I get the message below when I run lvim in a different directory.
21:43:43 [WARN ] lvim: "Invalid configuration: /home/axr/.config/lvim/config.lua:6: module 'base/search' not found:\n\tno field package.preload['base/search']\n\tno file './base/search.lua'\n\tno file '/usr/share/luajit-2.1.0-beta3/base/search.lua'\n\tno file '/usr/local/share/lua/5.1/base/search.lua'\n\tno file '/usr/local/share/lua/5.1/base/search/init.lua'\n\tno file '/usr/share/lua/5.1/base/search.lua'\n\tno file '/usr/share/lua/5.1/base/search/init.lua'\n\tno file './base/search.so'\n\tno file '/usr/local/lib/lua/5.1/base/search.so'\n\tno file '/usr/lib/lua/5.1/base/search.so'\n\tno file '/usr/local/lib/lua/5.1/loadall.so'" file="init.lua", line=49
I tried replacing / with ., nothing changed.
I checked runtimepath, and .config/lvim/ was there.
I tried replacing the relative path in require(<path>) with a full path.
GitHub repository with files and comments: https://github.com/SATANalexander666/lvim-config
Don't use / or \ in require().
Use only the . for entering a folder.
When using nvim, the .config/nvim/lua folder has to be created manually.
After that, requiring Lua files is easy.
Example
.config/nvim/init.vim # file
.config/nvim/lua/config.lua # file
.config/nvim/lua/base # folder
.config/nvim/lua/keys # folder
.config/nvim/lua/plugins/core # folder
.config/nvim/lua/plugins/packer # folder
Content of init.vim
lua require("config")
This resolves to: lua/config.lua
Refer to nvim's help: :help lua-package-path
Content of config.lua
-- base
require('base.search') -- Search configs
require('base.indents') -- Indentation configs
require('base.visual') -- GUI configs
require('base.other')
-- keys
require('keys.alias') -- Shortcuts and incapsulation
require('keys.main') -- Keys for built-in features
require('keys.plugins') -- Keys for plugged features
-- plugins
require('plugins.core.use') -- Built-in plugins that are being used
require('plugins.core.config') -- Configs for built-in plugins
require('plugins.packer.use') -- Packer plugins that are being used
require('plugins.packer.config') -- Configs for packer plugins
The dot is used to enter the folder(s) (Linux & Windows).
Refer to nvim's help: :help lua-require
In ~/.config/lvim a folder named lua must be created, and all folders required in the main config.lua must be moved into this folder; however, the paths in the require calls shouldn't be changed. Example: require('base.search'), while the actual path is ~/.config/lvim/lua/base/search.lua.
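As a sketch of that layout (the file names come from the question and the answer above):

-- Layout:
--   ~/.config/lvim/config.lua
--   ~/.config/lvim/lua/base/search.lua
--   ~/.config/lvim/lua/keys/main.lua
-- In config.lua the require paths stay dotted:
require('base.search')  -- resolves to ~/.config/lvim/lua/base/search.lua
require('keys.main')    -- resolves to ~/.config/lvim/lua/keys/main.lua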
I need the path to an external (or internal) dependency so I can pass it as an argument to a function inside. We need the location of the folder, not of specific files. Also, sometimes we need the path to the folder containing a shared library generated by cc_library.
Python file
import cppyy
cppyy.add_include_path('path/to/external/dependency/1')
cppyy.add_library_path('path/to/another/external/dependency/2')
cppyy.add_include_path('path/to/another/internal/dependency')
cppyy.include('file/in/external/dependency')
BUILD file
py_binary(
    name = "sample",
    srcs = ["sample.py"],
    deps = [
        "@cppyy_archive//:cppyy",
    ],
    data = [
        "@external-dependency//location:target",
        "//internal-dependency/location:target2",
    ],
)
From https://docs.bazel.build/versions/master/external.html#layout:
You can see the external directory by running:
ls $(bazel info output_base)/external
What the paths under external actually look like depends on the rule used for the archive.
For example, if it's declared using an http_file in the WORKSPACE file:
load("#bazel_tools//tools/build_defs/repo:http.bzl", "http_file")
http_file(
name = "fenix",
urls = ["https://github.com/mozilla-mobile/fenix/archive/v76.0.0-beta.2.tar.gz"],
sha256 = "94050c664e5ec5b66cd2ca9f6a8b898987ab63d9602090533217df1a3f2dc5a9"
)
You will find that v76.0.0-beta.2.tar.gz file at external/fenix/file/downloaded:
user@host:~$ file $(bazel info output_base)/external/fenix/file/downloaded
/home/user/.cache/bazel/_bazel_user/761044447e04744e746cd54d0b4b5056/external/fenix/file/downloaded: gzip compressed data, from Unix, original size modulo 2^32 15759360
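If you need the path at run time inside sample.py rather than just for inspection, one option is the runfiles helper from rules_python. The sketch below is an assumption-laden illustration: it requires adding "@rules_python//python/runfiles" to the py_binary deps, and "external-dependency/location/somefile.h" is a placeholder for a real file in your data dependency:

import os

import cppyy
from rules_python.python.runfiles import runfiles  # assumed extra dependency

r = runfiles.Create()
# Resolve a known file from the data dependency, then use its directory
# as the include path for cppyy.
header = r.Rlocation("external-dependency/location/somefile.h")
cppyy.add_include_path(os.path.dirname(header))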
I'm using container_pull in my WORKSPACE file. (It's part of bazel docker rules)
Here is what it looks like:
container_pull(
    name = "base-image",
    registry = "registry:9999",
    repository = "base-image",
    digest = "sha256:e6f44554a270025c578c0f91160d809735c2589baae80bafcdeebefb0c0b04b6",
    tag = "1.1.0",
)
However, there is a file containing the version of base-image, and I want the tag to be read from that file instead of being hardcoded in the WORKSPACE.
How can I read a file content in WORKSPACE?
There's no direct way to read a file from the workspace file. The container_pull rule would have to add support for reading from a file.
A workaround is to put the file that contains the information into .bzl format, and load that from the workspace file.
Something like this:
versions.bzl:
BASE_IMAGE_VERSION = "1.1.0"
WORKSPACE:
load("//:versions.bzl", "BASE_IMAGE_VERSION")
container_pull(
    name = "base-image",
    registry = "registry:9999",
    repository = "base-image",
    digest = "sha256:e6f44554a270025c578c0f91160d809735c2589baae80bafcdeebefb0c0b04b6",
    tag = BASE_IMAGE_VERSION,
)
Bazel does a similar thing in its own workspace file:
https://github.com/bazelbuild/bazel/blob/669a1a2634bdf267f890cf88833c9712d4e75016/WORKSPACE#L589
How to read a data file at a package path on Lua 5.1?
What I'm looking for is something like io.read, but relative to the package directory instead of the working directory (arg[0]), and without using hardcoded absolute paths. That would be something like what dofile does, but without running the code, only reading it as a string.
Example:
I have a test folder, the current working directory of the test.lua script.
There's a package luapackage in another folder, somewhere specified in the LUA_PATH environment variable.
luapackage can:
require("luapackage.other_module");
dofile("other_module.lua").
But luapackage can't do:
io.read("data.txt")
io.read("luapackage/data.txt")
Sample structure:
+-test/
|
+-test.lua
+-luamodule/
|
+-data.txt
|
+-luamodule.lua
|
+-other_module.lua
For this example, test.lua only requires luamodule:
-- test.lua
local luamodule = require("luamodule")
And luamodule needs to read its modules and data files:
-- luamodule.lua
local other_module = dofile("other_module.lua") -- works
-- local other_module = require("luamodule.other_module") -- also works
local data = io.open("data.txt") -- fails
-- local data = io.open("luamodule/data.txt") -- also fails
It doesn't work because it searches for the file at the working directory (test) and not the package directory.
If I place a copy of the package in the running script's folder, io.read("luapackage/data.txt") is possible. But every script would have to carry its own local copy of luapackage.
Note: I'm looking for a Lua solution, avoiding binary packages that could compromise cross-compatibility.
You can use debug.getinfo(1,"S").source to get the location of the current (module) file. Replace luamodule.lua with data.txt, remove the leading @, and you should get the path you need.
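A minimal sketch of that approach inside luamodule.lua; the pattern used to strip the file name is just one way to do it, not something prescribed by the answer:

-- luamodule.lua
local source = debug.getinfo(1, "S").source      -- e.g. "@/some/path/luamodule/luamodule.lua"
local path = source:sub(2)                       -- drop the leading "@"
local dir = path:match("^(.*[/\\])") or "./"     -- keep everything up to the last separator
local f = assert(io.open(dir .. "data.txt", "r"))
local data = f:read("*a")                        -- Lua 5.1 syntax for "read all"
f:close()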
I've grouped a lot of projects in a project group. All the info is in the project.bpg. Now I'd like to automatically build them all.
How do I build all the projects using the command line?
I'm still using Delphi 7.
I've never tried it myself, but here is a German article describing that you can use make -f ProjectGroup.bpg, because *.bpg files are essentially makefiles.
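For instance, a minimal sketch; the project-group path is a placeholder, and make.exe is assumed to be the one shipped in the Delphi 7 Bin directory:

cd C:\path\to\your\project_group
"C:\Program Files\Borland\Delphi7\Bin\make.exe" -f project.bpg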
You can also run Delphi from the command line or a batch file, passing the .bpg file name as a parameter.
Edit: Example (for D2007, but can be adjusted for D7):
=== rebuild.cmd (excerpt) ===
@echo off
set DelphiPath=C:\Program Files\CodeGear\RAD Studio\5.0\bin
set DelphiExe=bds.exe
set LibPath=V:\Library
set LibBpg=Library.groupproj
set LibErr=Library.err
set RegSubkey=BDSClean
:buildlib
echo Rebuilding %LibBpg%...
if exist "%LibPath%\%LibErr%" del /q "%LibPath%\%LibErr%"
"%DelphiPath%\%DelphiExe%" -pDelphi -r%RegSubkey% -b "%LibPath%\%LibBpg%"
if %errorlevel% == 0 goto buildlibok
As I said in a comment to Ulrich Gerhardt's answer, make project_group.bpg is useless if your projects are in subdirectories: make won't use relative paths and the projects won't compile correctly.
I've made a Python script to compile all the DPRs in every subdirectory. This is what I really wanted to do, but I'll leave the above answer marked as accepted. Although it didn't work for me, it really answered my question.
Here is my script, compile_all.py. I believe it may help somebody:
# -*- coding: utf-8 -*-
import os.path
import subprocess
import sys

# put this file in your root dir
BASE_PATH = os.path.dirname(os.path.realpath(__file__))
os.chdir(BASE_PATH)

# your Delphi compiler path (note the os.pathsep separator when appending to PATH)
os.environ['PATH'] += os.pathsep + "C:\\Program Files\\Borland\\Delphi7\\Bin"
DELPHI = "DCC32.exe"
DELPHI_PARAMS = ['-B', '-Q', '-$D+', '-$L+']

for root, dirs, files in os.walk(BASE_PATH):
    projects = [project for project in files if project.lower().endswith('.dpr')]
    if 'FastMM' in root:  # put here projects you don't want to compile
        continue
    os.chdir(os.path.join(BASE_PATH, root))
    for project in projects:
        print
        print '*** Compiling:', os.path.join(root, project)
        result = subprocess.call([DELPHI] + DELPHI_PARAMS + [project])
        if result != 0:
            print 'Failed for', project, result
            sys.exit(result)
Another advantage of this approach is that you don't need to add new projects to your .bpg file: if a project is in a subdirectory, it will be compiled.
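A possible way to run it, assuming a Python 2 interpreter (the script uses print statements) and that the script sits in your root directory as its comment says; the path is a placeholder:

cd C:\path\to\your\root_dir
python compile_all.py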