How to start triton server after building the Windows 10 "Min" Image? - nvidia

I have followed the steps mentioned here.
I am able to build the win10-py3-min image.
After that, I am trying to build Triton Server as mentioned here.
Command:
python build.py -v --no-container-pull --image=gpu-base,win10-py3-min --enable-logging --enable-stats --enable-tracing --enable-gpu --endpoint=grpc --endpoint=http --repo-tag=common:r22.10 --repo-tag=core:r22.10 --repo-tag=backend:r22.10 --repo-tag=thirdparty:r22.10 --backend=ensemble --backend=tensorrt:r22.10
I am getting error as below.
cmake : The term 'cmake' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At C:\workspace\build\cmake_build.ps1:20 char:1
+ cmake "-DTRT_VERSION=${env:TRT_VERSION}" "-DCMAKE_TOOLCHAIN_FILE=${en ...
+ CategoryInfo : ObjectNotFound: (cmake:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
DEBUG: 86+ >>>> ExitWithCode 1;
DEBUG: 6+ function ExitWithCode($exitcode) >>>>
DEBUG: 7+ >>>> $host.SetShouldExit($exitcode)
DEBUG: 8+ >>>> exit $exitcode
DEBUG: 33+ if ( >>>> $LASTEXITCODE -ne 0)
DEBUG: 34+ >>>> Write-Output "exited with status code $LASTEXITCODE"; exited with status code 1
DEBUG: 35+ >>>> ExitWithCode 1;
DEBUG: 6+ function ExitWithCode($exitcode) >>>>
DEBUG: 7+ >>>> $host.SetShouldExit($exitcode)
DEBUG: 8+ >>>> exit $exitcode
error: build failed
and for below command
python build.py -v --no-container-pull --image=base,win10-py3-min --enable-logging --enable-stats --enable-tracing --enable-gpu --endpoint=grpc --endpoint=http --repo-tag=common:r22.10 --repo-tag=core:r22.10 --repo-tag=backend:r22.10 --repo-tag=thirdparty:r22.10 --backend=ensemble --backend=tensorrt:r22.10
getting error as below.
"C:\tmp\tritonbuild\tritonserver\build\install.vcxproj" (default target) (1) ->
"C:\tmp\tritonbuild\tritonserver\build\ALL_BUILO.vcxproj" (default target) (3) ->
"C:\tmp\tritonbuild\tritonserver\build\_deps\repo-core-build\triton-core.vcxproj" (default target) (5) ->
(CustomBuild target) -> C:\tmp\tritonbuild\tritonserver\build\_deps\repo-core-build\triton-core\_deps\repo-common-src\include\triton/common/triton_json.h(641,35): error C2039: 'GetObjectA': is not a member of 'rapidjson::GenericValue<rapidjson::UTF8<char>,rapidjson::MemoryPoolAllocator<rapidjson::CrtAllocator>>' [C:\tmp\tritonbuild\tritonserver\build\_deps\repo-core-build\triton-core\triton-core.vcxproj] [C:\tmp\tritonbuild\tritonserver\build\_deps\repo-core-build\triton-core.vcxproj]
C:\tmp\tritonbuild\tritonserver\build\_deps\repo-core-build\triton-core\_deps\repo-common-src\include\triton/common/triton_json.h(641,1): error C2530: 'm': references must be initialized [C:\tmp\tritonbuild\tritonserver\build\_deps\repo-core-build\triton-core\triton-core.vcxproj] [C:\tmp\tritonbuild\tritonserver\build\_deps\repo-core-build\triton-core.vcxproj]
C:\tmp\tritonbuild\tritonserver\build\_deps\repo-core-build\triton-core\_deps\repo-common-src\include\triton/common/triton_json.h(641,1): error C3531: 'm': a symbol whose type contains 'auto' must have an initializer [C:\tmp\tritonbuild\tritonserver\build\_deps\repo-core-build\triton-core\triton-core.vcxproj] [C:\tmp\tritonbuild\tritonserver\build\_deps\repo-core-build\triton-core.vcxproj]
C:\tmp\tritonbuild\tritonserver\build\_deps\repo-core-build\triton-core\_deps\repo-common-src\include\triton/common/triton_json.h(641,26): error C2143: syntax error: missing ';' before ':' [C:\tmp\tritonbuild\tritonserver\build\_deps\repo-core-build\triton-core\triton-core.vcxproj] [C:\tmp\tritonbuild\tritonserver\build\_deps\repo-core-build\triton-core.vcxproj]
C:\tmp\tritonbuild\tritonserver\build\_deps\repo-core-build\triton-core\_deps\repo-common-src\include\triton/common/triton_json.h(641,46): error C2143: syntax error: missing ';' before ')' [C:\tmp\tritonbuild\tritonserver\build\_deps\repo-core-build\triton-core\triton-core.vcxproj] [C:\tmp\tritonbuild\tritonserver\build\_deps\repo-core-build\triton-core.vcxproj]
C:\BuildTools\MSBuild\Microsoft\VC\v160\Microsoft.CppCommon.targets(238,5): error MSB8066: Custom build for 'C:\tmp\tritonbuild\tritonserver\build\CMakeFiles\6f6d31a7577427f4fd89bcde8fd28163\triton-core-mkdir.rule;C:\tmp\tritonbuild\tritonserver\build\CMakeFiles\6f6d31a7577427f4fd89bcde8fd28163\triton-core-download.rule;C:\tmp\tritonbuild\tritonserver\build\CMakeFiles\6f6d31a7577427f4fd89bcde8fd28163\triton-core-update.rule;C:\tmp\tritonbuild\tritonserver\build\CMakeFiles\6f6d31a7577427f4fd89bcde8fd28163\triton-core-patch.rule;C:\tmp\tritonbuild\tritonserver\build\CMakeFiles\6f6d31a7577427f4fd89bcde8fd28163\triton-core-configure.rule;C:\tmp\tritonbuild\tritonserver\build\CMakeFiles\6f6d31a7577427f4fd89bcde8fd28163\triton-core-build.rule;C:\tmp\tritonbuild\tritonserver\build\CMakeFiles\6f6d31a7577427f4fd89bcde8fd28163\triton-core-install.rule;C:\tmp\tritonbuild\tritonserver\build\CMakeFiles\e0e8eabd6ebcadfabbd7ced13e471b12\triton-core-complete.rule;C:\tmp\tritonbuild\tritonserver\build\CMakeFiles\d677bfcd41cd12f160cbc1390c778655\triton-core.rule' exited with code 1. [C:\tmp\tritonbuild\tritonserver\build\_deps\repo-core-build\triton-core.vcxproj]
2021 Warning(s)
6 Error(s)

Run the Visual Studio Installer and ensure the "C++ CMake tools for Windows" component is installed.
Open the x64 Native Tools Command Prompt for VS 2019.
Run in the command prompt:
python build.py -v --no-container-pull --image=gpu-base,win10-py3-min --enable-logging --enable-stats --enable-tracing --enable-gpu --endpoint=grpc --endpoint=http --repo-tag=common:r22.10 --repo-tag=core:r22.10 --repo-tag=backend:r22.10 --repo-tag=thirdparty:r22.10 --backend=ensemble --backend=tensorrt:r22.10
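The first failure above is simply that cmake is missing from the PATH of the build environment. As a sketch (not part of build.py; the helper name is made up), a pre-flight check like this can catch the problem before a long build starts:

```python
import shutil

# Hypothetical pre-flight check: the generated cmake_build.ps1 fails with
# "cmake is not recognized" when cmake cannot be found on PATH, so verify
# the required executables resolve before kicking off a long build.
def missing_tools(tools=("cmake", "python")):
    """Return the subset of `tools` that cannot be found on PATH."""
    return [t for t in tools if shutil.which(t) is None]

# Example: report anything missing instead of failing an hour into the build.
for tool in missing_tools():
    print(f"not on PATH: {tool}")
```

Running this from the same prompt you use for build.py tells you immediately whether the x64 Native Tools environment was set up correctly.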

Related

Unable to build electron using manual method

I am trying to build electron (master) using the appended script on Ubuntu 22.04. It's throwing the following error (the build doesn't report this error). I am using the latest depot_tools, gn, and node.js. Please help:
root@acs-x86-node1-ghatwala-rhel:/electron/src# gn gen out/Release --args="import(\"//electron/build/args/release.gn\")"
ERROR at //electron/BUILD.gn:110:20: Script returned non-zero exit code.
electron_version = exec_script("script/print-version.py",
^----------
Current dir: /electron/src/out/Release/
Command: python3 /electron/src/electron/script/print-version.py
Returned 1 and printed out: 0a>\n/electron/src/electron/script/lib/get-version.js:19\n throw new Error('Failed to get current electron version');\n ^\n\nError: Failed to get current electron version\n at module.exports.getElectronVersion (/electron/src/electron/script/lib/get-version.js:19:11)\n at [eval]:1:37\n at Script.runInThisContext (node:vm:129:12)\n at Object.runInThisContext (node:vm:307:38)\n at node:internal/process/execution:83:21\n at [eval]-wrapper:6:24\n at runScript (node:internal/process/execution:82:62)\n at evalScript (node:internal/process/execution:104:10)\n at node:internal/main/eval_string:50:3\n\nNode.js v19.3.0\n"
File "/usr/lib/python3.8/subprocess.py", line 516, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['node', '-p', 'require("./script/lib/get-version").getElectronVersion()']' returned non-zero exit status 1.
See //electron/build/args/all.gn:2:21: which caused the file to be included.
root_extra_deps = [ "//electron" ]
^-----------
mkdir electron && cd electron
gclient config --name "src/electron" --unmanaged https://github.com/electron/electron
gclient sync --with_branch_heads --with_tags --no-history
cd src
export CHROMIUM_BUILDTOOLS_PATH=`pwd`/buildtools
gn gen out/Release --args="import(\"//electron/build/args/release.gn\")"
ninja -C out/Release electron
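The gn failure above is subprocess.run in print-version.py surfacing node's non-zero exit. A minimal sketch of that pattern (the function name is hypothetical; the command comes from the traceback) shows why the node stack trace ends up inside the Python CalledProcessError text:

```python
import subprocess

# Sketch of what print-version.py effectively does: run the node one-liner
# and fail loudly on a non-zero exit. In the real script the failing command
# is ['node', '-p', 'require("./script/lib/get-version").getElectronVersion()'].
def run_version_command(cmd):
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        # stderr carries the node stack trace seen in the gn output above.
        raise RuntimeError(f"version command failed: {result.stderr.strip()}")
    return result.stdout.strip()
```

So the underlying problem is that get-version.js cannot determine the version in this checkout; running the node one-liner by hand from /electron/src/electron is the quickest way to debug it.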

drake visualizer not displaying anything

OS: Ubuntu 16.04, NVIDIA Driver
I followed the drake installation procedure as described on the drake website (I have also installed the nvidia driver). After installation, as per the instructions, when I run:
$ xhost +local:root; nvidia-docker run -i --rm -e DISPLAY -e QT_X11_NO_MITSHM=1 -v /tmp/.X11-unix:/tmp/.X11-unix --privileged -t drake; xhost -local:root
I am getting the following error (the build is successful, but the simulation is not displayed):
non-network local connections being added to access control list
+ [[ 0 -eq 0 ]]
+ bazel build //tools:drake_visualizer //examples/acrobot:run_passive
Starting local Bazel server and connecting to it...
INFO: Analysed 2 targets (95 packages loaded, 18023 targets configured).
INFO: Found 2 targets...
INFO: Elapsed time: 89.206s, Critical Path: 1.58s
INFO: 0 processes.
INFO: Build completed successfully, 1 total action
+ sleep 2
+ ./bazel-bin/tools/drake_visualizer
+ bazel run //examples/acrobot:run_passive
INFO: Analysed target //examples/acrobot:run_passive (0 packages loaded, 0 targets configured).
INFO: Found 1 target...
Target //examples/acrobot:run_passive up-to-date:
bazel-bin/examples/acrobot/run_passive
INFO: Elapsed time: 1.031s, Critical Path: 0.01s
INFO: 0 processes.
INFO: Build completed successfully, 1 total action
INFO: Build completed successfully, 1 total action
process 297: D-Bus library appears to be incorrectly set up; failed to read machine uuid: UUID file '/etc/machine-id' should contain a hex string of length 32, not length 0, with no other text
See the manual page for dbus-uuidgen to correct this issue.
libGL error: No matching fbConfigs or visuals found
libGL error: failed to load driver: swrast
Could not initialize OpenGL for RasterGLSurface, reverting to RasterSurface.
Could not initialize OpenGL for RasterGLSurface, reverting to RasterSurface.
Could not initialize OpenGL for RasterGLSurface, reverting to RasterSurface.
Could not initialize GLX
./setup/ubuntu/docker/entrypoint.sh: line 15: 297 Aborted (core dumped) ./bazel-bin/tools/drake_visualizer
non-network local connections being removed from access control list
We are in the process of updating the instructions to use nvidia-docker 2.0. Please check the drake repo again later this week for an update. In the meantime, you may wish to try the open-source driver instructions on the same page.

cordova build Android error:Execution failed for task ':compileDebugJava'

I got an error when I built the Android project, after I added the cordova Camera plugin. Can anybody give me a hand? Thanks a lot.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':compileDebugJava'.
> Compilation failed; see the compiler error output for details.
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
D:\Angular4\myApp\platforms\android\cordova\node_modules\q\q.js:126
throw e;
^
Error code 1 for command: cmd with args: /s /c "D:\Angular4\myApp\platforms\android\gradlew cdvBuildDebug -b D:\Angular4\myApp\platforms\android\build.gradle -Dorg.gradle.daemon=true"
Command finished with error code 1: cmd /s /c "D:\Angular4\myApp\platforms\android\cordova\build.bat"
ERROR building one of the platforms: Error: cmd: Command failed with exit code 1 Error output:
D:\Angular4\myApp\platforms\android\src\org\apache\cordova\camera\CordovaUri.java:78: error: cannot find symbol
if(Build.VERSION.SDK_INT >= Build.VERSION_CODES.M)
^
symbol: variable M
location: class VERSION_CODES
Note: D:\Angular4\myApp\platforms\android\src\org\apache\cordova\splashscreen\SplashScreen.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
1 error
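The "cannot find symbol" error points at Build.VERSION_CODES.M, a constant introduced in API level 23; compiling against an older android.jar cannot resolve it. One commonly suggested direction, sketched here as an assumption to verify against your installed SDK platforms, is to install the API 23 platform via the SDK manager and raise the level in config.xml:

```xml
<!-- config.xml sketch: VERSION_CODES.M requires compiling against API 23+.
     The value below is an assumption; match it to the SDK platforms you
     actually have installed. -->
<preference name="android-targetSdkVersion" value="23" />
```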

YOCTO - First build for BBB

I am trying to use for the first time the Yocto tool for my BeagleBoneBlack.
First I run this bash file to install Yocto:
#!/bin/bash
WKDIR=/work
mkdir -p $WKDIR/beaglebone-black/yocto/sources
mkdir -p $WKDIR/beaglebone-black/yocto/builds
cd $WKDIR/beaglebone-black/yocto/sources
git clone -b morty git://git.yoctoproject.org/poky.git poky-morty
cd $WKDIR/beaglebone-black/yocto/
source sources/poky-morty/oe-init-build-env builds/build-bbb-morty
Then I edited the file local.conf in the "build-bbb-morty/conf" directory:
MACHINE ?= "beaglebone"
and added
DL_DIR ?= "${TOPDIR}/../dl"
IMAGE_INSTALL_append = " kernel-modules kernel-devicetree"
Then I ran: bitbake core-image-minimal
After about 8 hours on my fifth-generation Core i7, I got the following terminal output, and I have no idea what I need to do to fix it:
bitbake core-image-minimal
Parsing recipes: 100% |########################################################################################################| Time: 0:02:55
Parsing of 864 .bb files complete (0 cached, 864 parsed). 1318 targets, 67 skipped, 0 masked, 0 errors.
NOTE: Resolving any missing task queue dependencies
Build Configuration:
BB_VERSION = "1.32.0"
BUILD_SYS = "x86_64-linux"
NATIVELSBSTRING = "Ubuntu-16.04"
TARGET_SYS = "arm-poky-linux-gnueabi"
MACHINE = "beaglebone"
DISTRO = "poky"
DISTRO_VERSION = "2.2.1"
TUNE_FEATURES = "arm armv7a vfp neon callconvention-hard cortexa8"
TARGET_FPU = "hard"
meta
meta-poky
meta-yocto-bsp = "morty:a3fa5ce87619e81d7acfa43340dd18d8f2b2d7dc"
NOTE: Fetching uninative binary shim from http://downloads.yoctoproject.org/releases/uninative/1.4/x86_64-nativesdk-libc.tar.bz2;sha256sum=101ff8f2580c193488db9e76f9646fb6ed38b65fb76f403acb0e2178ce7127ca
--2017-01-18 15:51:09-- http://downloads.yoctoproject.org/releases/uninative/1.4/x86_64-nativesdk-libc.tar.bz2
Resolving downloads.yoctoproject.org (downloads.yoctoproject.org)... 198.145.20.127
Connecting to downloads.yoctoproject.org (downloads.yoctoproject.org)|198.145.20.127|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 2473216 (2.4M) [application/octet-stream]
Saving to: ‘/work/beaglebone-black/yocto/builds/build-bbb-morty/../dl/uninative/101ff8f2580c193488db9e76f9646fb6ed38b65fb76f403acb0e2178ce7127ca/x86_64-nativesdk-libc.tar.bz2’
2017-01-18 15:51:18 (297 KB/s) - ‘/work/beaglebone-black/yocto/builds/build-bbb-morty/../dl/uninative/101ff8f2580c193488db9e76f9646fb6ed38b65fb76f403acb0e2178ce7127ca/x86_64-nativesdk-libc.tar.bz2’ saved [2473216/2473216]
Initialising tasks: 100% |#####################################################################################################| Time: 0:00:14
NOTE: Executing SetScene Tasks
NOTE: Executing RunQueue Tasks
WARNING: attr-native-2.4.47-r0 do_fetch: Failed to fetch URL http://download.savannah.gnu.org/releases/attr/attr-2.4.47.src.tar.gz, attempting MIRRORS if available
WARNING: libpng-native-1.6.24-r0 do_fetch: Failed to fetch URL http://distfiles.gentoo.org/distfiles/libpng-1.6.24.tar.xz, attempting MIRRORS if available
ERROR: core-image-minimal-1.0-r0 do_image_wic: Function failed: do_image_wic (log file is located at /work/beaglebone-black/yocto/builds/build-bbb-morty/tmp/work/beaglebone-poky-linux-gnueabi/core-image-minimal/1.0-r0/temp/log.do_image_wic.23788)
ERROR: Logfile of failure stored in: /work/beaglebone-black/yocto/builds/build-bbb-morty/tmp/work/beaglebone-poky-linux-gnueabi/core-image-minimal/1.0-r0/temp/log.do_image_wic.23788
Log data follows:
| DEBUG: Executing python function set_image_size
| DEBUG: Python function set_image_size finished
| DEBUG: Executing shell function do_image_wic
| Checking basic build environment...
| Done.
|
| Build artifacts not found, exiting.
| (Please check that the build artifacts for the machine
| selected in local.conf actually exist and that they
| are the correct artifacts for the image (.wks file))
|
| The artifact that couldn't be found was kernel-dir:
| /work/beaglebone-black/yocto/builds/build-bbb-morty/tmp/deploy/images/beaglebone
| WARNING: exit code 1 from a shell command.
| ERROR: Function failed: do_image_wic (log file is located at /work/beaglebone-black/yocto/builds/build-bbb-morty/tmp/work/beaglebone-poky-linux-gnueabi/core-image-minimal/1.0-r0/temp/log.do_image_wic.23788)
ERROR: Task (/work/beaglebone-black/yocto/sources/poky-morty/meta/recipes-core/images/core-image-minimal.bb:do_image_wic) failed with exit code '1'
NOTE: Tasks Summary: Attempted 1771 tasks of which 6 didn't need to be rerun and 1 failed.
Summary: 1 task failed:
/work/beaglebone-black/yocto/sources/poky-morty/meta/recipes-core/images/core-image-minimal.bb:do_image_wic
Summary: There were 2 WARNING messages shown.
Summary: There was 1 ERROR message shown, returning a non-zero exit code.
While I'm not sure this is the cause of the problem, the preferred method to add packages to the image from local.conf is the CORE_IMAGE_EXTRA_INSTALL variable.
Therefore change:
IMAGE_INSTALL_append = " kernel-modules kernel-devicetree"
to
CORE_IMAGE_EXTRA_INSTALL += "kernel-modules kernel-devicetree"
I think there is no problem with your working method; it looks like a build environment problem, and the error log should confirm it.
Your log is located at "/work/beaglebone-black/yocto/builds/build-bbb-morty/tmp/work/beaglebone-poky-linux-gnueabi/core-image-minimal/1.0-r0/temp/log.do_image_wic.23788".
The warnings indicate that fetching some source URLs failed. You can try tunneling through a proxy, or simply run bitbake again, since fetches can fail intermittently due to network conditions.
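Putting the suggestion together, the relevant local.conf fragment would look like this (a sketch based on the values in the question, not a verified configuration):

```
# conf/local.conf (sketch, poky morty)
MACHINE ?= "beaglebone"
DL_DIR ?= "${TOPDIR}/../dl"
# Preferred over IMAGE_INSTALL_append when editing local.conf:
CORE_IMAGE_EXTRA_INSTALL += "kernel-modules kernel-devicetree"
```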

Opencv 3.0.0 installation with contrib-modules

I have a Windows 8.1 PC, a 64-bit machine. I had already installed OpenCV 3.0.0 from source without opencv_contrib. Following the answer by berak to
"Nonfree module is missing in OpenCV 3.0", I downloaded the contrib modules and tried building OpenCV from source again, this time with the extra-modules option (OPENCV_EXTRA_MODULES_PATH) turned on.
During this build process, however, I got strange errors from the VS compiler. They all came from the file
\modules\line_descriptor\src\binary_descriptor.cpp
The errors were the following, at the line numbers shown:
error C2143: syntax error : missing ';' before '=' E:\opencv\opencv-master\opencv_contrib-master\modules\line_descriptor\src\binary_descriptor.cpp line 833
error C2059: syntax error: '>' line 836
error C2143: syntax error : missing ';' before '{' E:\opencv\opencv-master\opencv_contrib-master\modules\line_descriptor\src\binary_descriptor.cpp line 837
error LNK1104: cannot open file '....\lib\Debug\opencv_line_descriptor300d.lib' E:\opencv\opencv-master\build\modules\line_descriptor\LINK opencv_test_line_descriptor
Error 7 error MSB3073: The command "setlocal
"C:\Program Files (x86)\CMake\bin\cmake.exe" -DBUILD_TYPE=Debug -P cmake_install.cmake
if %errorlevel% neq 0 goto :cmEnd
:cmEnd
endlocal & call :cmErrorLevel %errorlevel% & goto :cmDone
:cmErrorLevel
exit /b %1
:cmDone
if %errorlevel% neq 0 goto :VCEnd
:VCEnd" exited with code 1. C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V120\Microsoft.CppCommon.targets 132 5 INSTALL
I had the same problem some weeks ago. I solved it this way:
Edit \modules\line_descriptor\src\binary_descriptor.cpp
and put this line after the defines:
#undef near
(On Windows, near is defined as a macro in the Windows headers, which clashes with an identifier of the same name in that file.)
Then run cmake again (with -DBUILD_opencv_line_descriptor=ON).
What I disabled was: _cvv and _world.
