This question is specific to avro-c, but the solution may be generalized to other packages in the OpenEmbedded BitBake system.
How do I create a do_populate_sdk task for avro-c?
I want to generate a Yocto SDK which includes avro-c. The avro-c recipe directory in meta-openembedded is very small:
avro
├── avro-c
│ └── 0001-avro-c-Fix-build-with-clang-compiler.patch
└── avro-c_1.8.1.bb
The avro-c_1.8.1.bb recipe is only 20 lines:
SUMMARY = "Apache Avro data serialization system."
HOMEPAGE = "http://apr.apache.org/"
SECTION = "libs"
LICENSE = "Apache-2.0"
LIC_FILES_CHKSUM = "file://LICENSE;md5=73bdf70f268f0b3b9c5a83dd7a6f3324"
DEPENDS = "jansson zlib xz"
PV .= "+git${SRCPV}"
SRCREV = "4b3677c32b879e0e7f717eb95f9135ac654da760"
SRC_URI = "git://github.com/apache/avro \
file://0001-avro-c-Fix-build-with-clang-compiler.patch;patchdir=../../ \
"
S = "${WORKDIR}/git/lang/c"
LDFLAGS_append_libc-uclibc = " -lm"
inherit cmake
A target image which includes avro-c builds successfully, and ls /usr/bin/avro* lists the Avro utilities.
However, avro-c is not included in the host SDK build. One way to troubleshoot this is to try the two commands:
$ bitbake avro-c
$ bitbake avro-c -c populate_sdk
The first command completes successfully. The second command fails with the following error messages:
ERROR: Task do_populate_sdk does not exist for target avro-c (/home/rdepew/workspace/clean1/build/../layers/meta-sporian/recipes-support/avro/avro-c_1.8.1.bb:do_populate_sdk). Close matches:
do_populate_lic
do_populate_sysroot
ERROR: Command execution failed: 1
I looked for clues in the other layers in my build system. It appeared that creating the file avro-c_%.bbappend, containing the single line
inherit nativesdk
might do the trick, but that generated two more BitBake error messages:
ERROR: Nothing PROVIDES 'virtual/x86_64-pokysdk-linux-compilerlibs' (but /home/rdepew/workspace/clean1/build/../layers/meta-sporian/recipes-support/avro/avro-c_1.8.1.bb DEPENDS on or otherwise requires it). Close matches:
virtual/nativesdk-x86_64-pokysdk-linux-compilerlibs
virtual/x86_64-pokysdk-linux-go-crosssdk
virtual/x86_64-pokysdk-linux-gcc-crosssdk
ERROR: Required build target 'avro-c' has no buildable providers.
Missing or unbuildable dependency chain was: ['avro-c', 'virtual/x86_64-pokysdk-linux-compilerlibs']
... and that's where I'm stuck. I'm not sure where to go from here.
Online places that I have researched:
I don't know if it's appropriate to list the URLS of places where I have looked for the answer. They include the GitHub repository for Avro, the Yocto Project ADT manual, and four related questions on StackOverflow. If it's appropriate, I will edit this question to include the URLs.
The right way to add something to the SDK (or eSDK, the extended SDK) is via the image of your choice. The steps are:
Add a package to the image:
IMAGE_INSTALL_append = " avro-c"
Create Yocto SDK for an image of your choice:
bitbake core-image-full-cmdline -c populate_sdk
Create Yocto eSDK for an image of your choice:
bitbake core-image-full-cmdline -c populate_sdk_ext
Have fun! :-)
You need the following line in your recipe
BBCLASSEXTEND = "nativesdk"
This extends the same recipe to also build for the SDK. See here for more details.
EDIT:
do_populate_sdk: This task applies only to image recipes. It handles two operations:
Target part: compiles and installs the headers and libraries for the target platform.
Host part: installs the host side of the libraries and headers, based on SDKMACHINE.
During these operations, it finds the list of packages needed for the SDK, builds the nativesdk-<recipe_name> variants of recipes that declare it in BBCLASSEXTEND, and combines them all into the SDK.
So it is the image recipe that has the do_populate_sdk task, and that task bundles the packages together.
See the Yocto manual here for more details.
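If you would rather not touch the upstream recipe, the same effect can come from a bbappend in your own layer; here is a minimal sketch (the bbappend filename and the extra SDK package line are assumptions on my part, not taken from the answer above):
# avro-c_%.bbappend in your own layer (hypothetical filename)
# Build a nativesdk variant of the recipe in addition to the target one
BBCLASSEXTEND += "nativesdk"

# conf/local.conf (assumption): pull the target library and headers into the SDK's target sysroot
TOOLCHAIN_TARGET_TASK_append = " avro-c-dev"
After that, bitbake <your-image> -c populate_sdk should produce an SDK whose target sysroot contains the avro-c headers and libraries.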
Related
Use case: I have a dependency that falls back to a subproject:
./
./subprojects/
./subprojects/mylib.wrap
src/meson.build contains:
mylib_dep = dependency('mylib') # Searches for mylib with pkg-config, then falls back to mylib.wrap.
myexec_exe = executable ('myexec', 'myexec.c', dependencies : mylib_dep)
Dependency mylib_dep provides libraries, which, if not installed on the system, make the main executable of my project unusable:
$ meson build && cd build && meson compile src/my_exec
...snip'd...
$ src/my_exec
src/my_exec: error while loading shared libraries: libmylib.so: cannot open shared object file: No such file or directory
My testing script build/tests/mytests.sh is generated with configure_file() from tests/mytests.sh.in so that it knows the location of myexec, and I'd also like to pass it the library paths, so that it can adjust LD_LIBRARY_PATH and run the executable. For instance, in tests/meson.build:
conf_data = configuration_data ()
conf_data.set_quoted ('MYEXEC_PATH', myexec_exe.full_path ())
conf_data.set_quoted ('MYLIB_PATH', mylib_dep.??????)
mytest_exe = configure_file (input : 'mytests.sh.in', output : 'mytests.sh', configuration : conf_data)
and in tests/mytests.sh.in:
MYEXEC_PATH=#MYEXEC_PATH#
MYLIB_PATH=#MYLIB_PATH#
export LD_LIBRARY_PATH=$(dirname "$MYLIB_PATH"):$LD_LIBRARY_PATH
$MYEXEC_PATH
Question: What should go at the ?????? above? In other words, given a dependency object, how can I extract the libraries within it, and get their full paths?
Usually in meson you wouldn't configure_file this, you'd pass the library/executable(s) to the script as arguments in the test command:
test(
  'mytest',
  find_program('mytest.sh'),
  args : [executable_target, library_target, ...],
)
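The script then reads the paths from its positional arguments instead of configure_file substitutions; a minimal sketch of mytest.sh under that assumption (the argument order matches the args list above):
#!/bin/sh
# $1 = path to the executable target, $2 = path to the library target (passed via args above)
MYEXEC_PATH="$1"
MYLIB_PATH="$2"
# Make the freshly built library visible to the dynamic loader, then run the executable
export LD_LIBRARY_PATH="$(dirname "$MYLIB_PATH"):$LD_LIBRARY_PATH"
exec "$MYEXEC_PATH"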
It can be frustrating trying to get this sort of info out of Meson. Fortunately, if Meson used CMake to find the dependency, you may be able to get the library path from the underlying CMake variables, which are available in the Meson dependency object. E.g., something along the following lines worked for me:
mylib_dep = dependency('mylib')
if mylib_dep.found()
  mylib_path = mylib_dep.get_variable(default_value : '', cmake : 'PACKAGE_LIBRARIES')
  message('Library path is:', mylib_path)
endif
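If that lookup works for you, the result can be fed straight into the configure_file() setup from the question; a sketch reusing the question's names (and assuming PACKAGE_LIBRARIES resolved to a single path):
conf_data = configuration_data()
conf_data.set_quoted('MYEXEC_PATH', myexec_exe.full_path())
# mylib_path comes from the get_variable() call above; empty if the CMake variable was not set
conf_data.set_quoted('MYLIB_PATH', mylib_path)
configure_file(input : 'mytests.sh.in', output : 'mytests.sh', configuration : conf_data)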
I am including opencv with custom build parameters in my Yocto image. For that I have an opencv_4.1.0.bbappend recipe, in which I set custom options, specifically FFMPEG. The recipe goes something like this:
DEPENDS += "ffmpeg libpng"
EXTRA_OECMAKE_append += "-DWITH_FFMPEG=ON -DWITH_GTK=OFF" # and some other options
During configure I get CMake errors and can't seem to figure out how to satisfy the header dependencies. The errors look like this (I assume they are the reason do_configure fails):
CheckIncludeFile.c:1:10: fatal error: /home/janos/dev/yocto/build/tmp/work/core2-64-poky-linux/opencv/4.1.0-r0/recipe-sysroot/usr/include/libpng/png.h: No such file or directory
1 | #include </home/janos/dev/yocto/build/tmp/work/core2-64-poky-linux/opencv/4.1.0-r0/recipe-sysroot/usr/include/libpng/png.h>
CheckIncludeFile.c:1:10: fatal error: sys/videoio.h: No such file or directory
1 | #include <sys/videoio.h>
Focusing on the missing png.h header first, I am tempted to add a dependency on libpng-dev, just as I would apt install it. But there is no recipe by that name.
When I run oe-pkgdata-util list-pkg-files -p libpng, I can find the header in the libpng-dev package:
...
libpng-dev:
/usr/bin/libpng-config
/usr/bin/libpng16-config
/usr/include/libpng16/png.h
/usr/include/libpng16/pngconf.h
/usr/include/libpng16/pnglibconf.h
/usr/include/png.h
...
...
I can also find it in the libpng-src and ffmpeg-src packages (oe-pkgdata-util find-path "*png.h" was my friend). But I cannot list any of these -dev and -src packages in DEPENDS.
How can I make my recipe see those headers?
Target machine is raspberrypi4-64, on which the recipe is configuring and compiling well - it fails when I build for qemux86-64, which I use for testing. Namely, my test command is MACHINE="qemux86-64" bitbake opencv.
This doesn't really answer what I thought was the question, but this is how the opencv recipe is easily configured:
PACKAGECONFIG = "python3 libav libv4l v4l"
Looking into the opencv 4.1.0 recipe (opencv_4.1.0.bb), I could see that FFMPEG gets enabled with the libav configurable option.
As a result of depending on FFmpeg, I had to whitelist "commercial" licenses in my local.conf file:
LICENSE_FLAGS_WHITELIST = "commercial"
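Putting the two pieces together, this is roughly what the changes look like; the opencv_%.bbappend filename and its placement in your own layer are assumptions on my part:
# opencv_%.bbappend in your own layer (hypothetical filename)
PACKAGECONFIG = "python3 libav libv4l v4l"

# conf/local.conf - required because the libav/ffmpeg dependency carries a commercial license flag
LICENSE_FLAGS_WHITELIST = "commercial"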
Looking into ./build/tmp/work/aarch64-poky-linux/opencv/4.1.0-r0/temp/log.do_configure shows that opencv is correctly configured without GUI, with v4l/v4l2, FFMPEG, python3, etc.
And then, in python3 on the resulting image, the build can be verified with:
import cv2
print(cv2.getBuildInformation())
We are trying to create a Scala project which also uses Spark, but we are facing this issue: Encountered error while reading extension file 'intellij_info_bundled.bzl': no such package '@intellij_aspect//': No WORKSPACE file found in C:/users//_bazel_user/i45wuf6d/external/intellij_aspect. Is something missing in IntelliJ?
Scala file
package src.main.scala
object HelloWorld extends App {
  def main(args: Array[String]) {
    println("Hello, world!")
  }
}
Build file
package(default_visibility = ["//visibility:public"])
load("#io_bazel_rules_scala//scala:scala.bzl", "scala_library", "scala_test")
scala_library(
name = "hello-world",
srcs = glob(["src/main/scala/*.scala"]),
)
scala_test(
name = "Hello_test",
srcs = glob(["src/main/scala/*.scala"]),
size = "small", # Expect this test to run quickly
)
WORKSPACE file
workspace(name = "scala_example")
rules_scala_version="7522c866450cf7810eda443e91ff44d2a2286ba1" # update this as needed
http_archive(
name = "io_bazel_rules_scala",
url = "https://github.com/bazelbuild/rules_scala/archive/%s.zip"%rules_scala_version,
type = "zip",
strip_prefix= "rules_scala-%s" % rules_scala_version
)
load("#io_bazel_rules_scala//scala:scala.bzl", "scala_repositories")
scala_repositories()`enter code here`
# register default scala toolchain
load("#io_bazel_rules_scala//scala:toolchains.bzl", "scala_register_toolchains")
scala_register_toolchains()
Command and Error from console
Command: C:\ProgramData\chocolatey\bin\bazel.exe build --tool_tag=ijwb:IDEA:community --keep_going --curses=no --color=yes --experimental_ui=no --progress_in_terminal_title=no --aspects=@intellij_aspect//:intellij_info_bundled.bzl%intellij_info_aspect --override_repository=intellij_aspect=C:\Users\ADMIN.IdeaIC2017.3\config\plugins\ijwb\aspect --output_groups=intellij-compile-java,intellij-compile-py -- //...:all
INFO: Loading complete. Analyzing...
ERROR: Encountered error while reading extension file 'intellij_info_bundled.bzl': no such package '@intellij_aspect//': No WORKSPACE file found in C:/users/admin/appdata/local/temp/_bazel_sandhya/criyrv6d/external/intellij_aspect.
INFO: Found 3 targets...
WARNING: failed to create one or more convenience symlinks for prefix 'bazel-':
cannot create symbolic link bazel-out -> C:/users/admin/appdata/local/temp/_bazel_sandhya/criyrv6d/execroot/scala_example/bazel-out: Cannot create junction (name=C:\users\admin\scalaprojects\example1\bazel-out, target=C:\users\admin\appdata\local\temp_bazel_sandhya\criyrv6d\execroot\scala_example\bazel-out): ERROR: src/main/native/windows/file-jni.cc(86): nativeCreateJunction(C:\users\admin\scalaprojects\example1\bazel-out, C:\users\admin\appdata\local\temp_bazel_sandhya\criyrv6d\execroot\scala_example\bazel-out): ERROR: src/main/native/windows/file.cc(128): CreateJunction(\?\C:\users\admin\scalaprojects\example1\bazel-out): Cannot create a file when that file already exists.
.
INFO: Building...
ERROR: command succeeded, but not all targets were analyzed.
INFO: Elapsed time: 18.108s, Critical Path: 0.05s
Make failed
This is only a sample HelloWorld program.
In general, like @Ittai, I would suggest you open an issue in the IntelliJ plugin GitHub repo.
Unfortunately, your version of the plugin is no longer supported. I, too, previously ran into an issue with an older version of the plugin and was recommended to upgrade to the latest version. Which resolved the specific issue I was facing.
When reporting the issue make sure to include the following bits of information:
intellij build number
plugin version number
rules_scala version
operating system (it seems you're using Windows, while most users use Unix-based systems)
bazel release number
how you have opened the intellij project (BUILD file, WORKSPACE, .blazeproject)
Additionally, to verify this is in fact an issue with the plugin, I would also suggest you try to reproduce it on a Unix-based system. It seems you are building with IntelliJ on Windows, and this may be a Windows-specific issue with aspects not being recognized.
When attempting to reproduce, make sure to clone your repository into a separate directory, close the IntelliJ project, and reopen the project.
I am trying to find the Boost libraries with CMake inside the Yocto extended SDK environment on krogoth.
The default CMake FindBoost module is invoked with:
find_package(Boost REQUIRED)
The standard error message
Unable to find the requested Boost libraries.
Unable to find the Boost header files. Please set BOOST_ROOT to the root
directory containing Boost or BOOST_INCLUDEDIR to the directory containing
Boost's headers.
Call Stack (most recent call first):
CMakeLists.txt:3 (find_package)
The following is a snippet from my conf/local.conf
IMAGE_INSTALL_append = " boost-dev"
IMAGE_INSTALL_append = " boost"
IMAGE_INSTALL_append = " kernel-devsrc"
MACHINE_ESSENTIAL_EXTRA_RRECOMMENDS += "kernel-module-hello"
KERNEL_MODULE_AUTOLOAD += "hello-md"
TOOLCHAIN_HOST_TASK_append = "${SDK_EXTRA_TOOLS}"
SDK_EXTRA_TOOLS = " nativesdk-cmake"
I am using the native cmake
auke#xenialxerus:~/workspace/beaglebone-dev/build$ which cmake
/home/auke/workspace/beaglebone-dev/poky-sdk/tmp/sysroots/x86_64-linux/usr/bin/
since I:
source environment-setup-cortexa8hf-neon-poky-linux-gnueabi
looking for the usual headers in:
find ./tmp/sysroots/beaglebone/usr/include/boost/
..
./tmp/sysroots/beaglebone/usr/include/boost/vmd/list/to_seq.hpp
./tmp/sysroots/beaglebone/usr/include/boost/vmd/list/to_tuple.hpp
./tmp/sysroots/beaglebone/usr/include/boost/vmd/to_list.hpp
./tmp/sysroots/beaglebone/usr/include/boost/vmd/empty.hpp
./tmp/sysroots/beaglebone/usr/include/boost/vmd/is_list.hpp
./tmp/sysroots/beaglebone/usr/include/boost/vmd/size.hpp
./tmp/sysroots/beaglebone/usr/include/boost/vmd/get_type.hpp
./tmp/sysroots/beaglebone/usr/include/boost/vmd/assert_is_identifier.hpp
./tmp/sysroots/beaglebone/usr/include/boost/vmd/is_number.hpp
..
as are the library binaries:
./tmp/sysroots/beaglebone/usr/lib/libboost_system-mt.a
./tmp/sysroots/beaglebone/usr/lib/libboost_iostreams.so.1.60.0
./tmp/sysroots/beaglebone/usr/lib/libboost_serialization-mt.a
./tmp/sysroots/beaglebone/usr/lib/libboost_date_time-mt.a
./tmp/sysroots/beaglebone/usr/lib/libboost_date_time.a
./tmp/sysroots/beaglebone/usr/lib/libboost_thread.so
./tmp/sysroots/beaglebone/usr/lib/libboost_signals-mt.a
./tmp/sysroots/beaglebone/usr/lib/libboost_date_time-mt.so
./tmp/sysroots/beaglebone/usr/lib/libboost_graph-mt.a
./tmp/sysroots/beaglebone/usr/lib/libboost_iostreams.so
./tmp/sysroots/beaglebone/usr/lib/libboost_regex.so
./tmp/sysroots/beaglebone/usr/lib/libboost_wserialization.so.1
Is there something that I might have overlooked?
Regards, Auke
You should use
bitbake -c populate_sdk <image_name> to generate the SDK based on your image;
As an alternative to locating and downloading a toolchain installer,
you can build the toolchain installer one of two ways if you have a
Build Directory:
*Use bitbake meta-toolchain. This method requires you to still install
the target sysroot by installing and extracting it separately. For
information on how to install the sysroot, see the "Extracting the
Root Filesystem" section.
*Use bitbake -c populate_sdk. This method has significant
advantages over the previous method because it results in a toolchain
installer that contains the sysroot that matches your target root
filesystem.
Also, use the variable TOOLCHAIN_HOST_TASK to add more packages.
http://www.yoctoproject.org/docs/1.8/ref-manual/ref-manual.html
This variable lists packages the OpenEmbedded build system uses when
building an SDK, which contains a cross-development environment. The
packages specified by this variable are part of the toolchain set that
runs on the SDKMACHINE, and each package should usually have the
prefix "nativesdk-". When building an SDK using bitbake -c
populate_sdk, a default list of packages is set in this
variable, but you can add additional packages to the list.
e.g.
TOOLCHAIN_HOST_TASK += "nativesdk-libqt5core-dev"
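For the Boost case specifically, a sketch of what this could look like in conf/local.conf (the package names are assumptions; TOOLCHAIN_TARGET_TASK feeds the SDK's target sysroot, TOOLCHAIN_HOST_TASK its host side):
# conf/local.conf (sketch)
# Boost headers and libraries for the SDK's target sysroot
TOOLCHAIN_TARGET_TASK_append = " boost boost-dev"
# extra host tools for the SDK, e.g. cmake
TOOLCHAIN_HOST_TASK_append = " nativesdk-cmake"
Then regenerate the SDK with bitbake <image_name> -c populate_sdk and re-source the environment setup script.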
I am the developer of this small library: https://github.com/martin-damien/babel and I have a problem with Luarocks releases.
From source
When I install from source with Luarocks I have no problem:
$ luarocks make --local rockspecs/babel-1.2-2.rockspec
From the internet
But when it is deployed (by tagging master, adding a new rockspec release, and publishing to LuaRocks), I can't install it using
$ luarocks install --local babel
Because I encounter the following error:
Installing https://luarocks.org/babel-1.2-2.src.rock...
Using https://luarocks.org/babel-1.2-2.src.rock... switching to 'build' mode
stat: cannot stat 'locales/zh-HK.lua': No such file or directory
Error: Build error: Failed installing locales/zh-HK.lua in /home/damien/.luarocks/lib/luarocks/rocks/babel/1.2-2/lua/locales/zh-HK.lua: locales/zh-HK.lua: No such file or directory
As you can see in https://github.com/martin-damien/babel/issues/14, the error occurs on different files (but so far only with locale files, never with the babel.lua file).
I have no idea why it randomly fails like this, so if someone knows why, or has an idea where it could come from, I'd appreciate the help.
Thanks in advance,
Damien
The location of the files in the build.modules table is (from the docs on the rockspec format):
relative to source.dir
Where source.dir is
source.dir (string) - the name of the directory created when the source archive is unpacked. Can be omitted if it can be inferred from the source.file field. Example: "luasocket-2.0.1"
and source.file is
source.file (string) - the filename of the source archive. Can be omitted if it can be inferred from the source.url field. Example: "luasocket-2.0.1.tar.gz"
You don't specify source.dir or source.file in your rockspec but you do set source.url (because you have to).
So you have source.url = https://github.com/martin-damien/babel/archive/v1.2-2.zip, which (presumably) ends up with source.file = v1.2-2.zip and then source.dir = v1.2-2, but your zip file extracts into babel-1.2, so LuaRocks can't find your source files. (The screenshot in the linked issue seems to indicate that LuaRocks uses source.file = v1.2.zip and that the archive extracts to babel-1.2, but I'm not sure how that's possible.)
Add dir = "babel-1.2" to your rockspec's source table and I expect it will work.
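A sketch of the resulting source table (the url is the one quoted above; the dir value assumes the archive really does unpack into babel-1.2):
-- rockspec excerpt
source = {
   url = "https://github.com/martin-damien/babel/archive/v1.2-2.zip",
   dir = "babel-1.2"
}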