ROS launch file results in error with roslaunch

I have a ROS project that I'm trying out. I have the following folder structure:
catkin_ws
- src
  - beginner_tutorials
    - src
      - listener.cpp
      - talker.cpp
    - config
      - config.conf
    - launch
      - my.launch
    - srv
      - ModifyText.srv
    - CMakeLists.txt
Assume that the remaining folders, like devel, also exist in the catkin_ws.
Here is what the CMakeLists.txt looks like:
cmake_minimum_required(VERSION 2.8.3)
project(beginner_tutorials)
## Find catkin and any catkin packages
find_package(catkin REQUIRED COMPONENTS roscpp rospy std_msgs genmsg)
## Declare ROS messages and services
#add_message_files(FILES Num.msg)
add_service_files(FILES modifyText.srv)
## Generate added messages and services
generate_messages(DEPENDENCIES std_msgs)
## Declare a catkin package
catkin_package()
## Build talker and listener
include_directories(include ${catkin_INCLUDE_DIRS})
add_executable(talker src/talker.cpp)
target_link_libraries(talker ${catkin_LIBRARIES})
add_dependencies(talker beginner_tutorials_generate_messages_cpp)
add_executable(listener src/listener.cpp)
target_link_libraries(listener ${catkin_LIBRARIES})
add_dependencies(listener beginner_tutorials_generate_messages_cpp)
Now, when I try to launch the listener and talker using the launch file shown further below, I get this error:
joesan@joesan-InfinityBook-S-14-v5:~/Projects/ros-projects/catkin_ws$ roslaunch beginner_tutorials my.launch
... logging to /home/joesan/.ros/log/4df614ca-ebaa-11ea-b08a-3fbc143c4e1a/roslaunch-joesan-InfinityBook-S-14-v5-7071.log
Checking log directory for disk usage. This may take a while.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.
started roslaunch server http://joesan-InfinityBook-S-14-v5:37169/
SUMMARY
========
PARAMETERS
* /rosdistro: noetic
* /rosversion: 1.15.8
NODES
/
listener (beginner_tutorials/listener)
talker (beginner_tutorials/talker)
ROS_MASTER_URI=http://localhost:11311
process[talker-1]: started with pid [7085]
RLException: Roslaunch got a 'No such file or directory' error while attempting to run:
gnome-terminal -e /home/joesan/Projects/ros-projects/catkin_ws/devel/lib/beginner_tutorials/listener __name:=listener __log:=/home/joesan/.ros/log/4df614ca-ebaa-11ea-b08a-3fbc143c4e1a/listener-2.log
Please make sure that all the executables in this command exist and have
executable permission. This is often caused by a bad launch-prefix.
The traceback for the exception was written to the log file
[ INFO] [1598893286.274250713]: hello world 0
[talker-1] killing on exit
[listener-2] killing on exit
Here are the contents of my launch file my.launch:
<launch>
  <env name="ROSCONSOLE_CONFIG_FILE"
       value="$(find beginner_tutorials)/config/config.conf"/>
  <arg name="frequency" default="10"/>
  <node pkg="beginner_tutorials"
        type="talker"
        name="talker"
        output="screen"
        args="$(arg frequency)"/>
  <node pkg="beginner_tutorials"
        type="listener"
        name="listener"
        output="screen"
        launch-prefix="gnome-terminal -e"/>
</launch>
I can see that the binaries are located inside the devel folder as below:
joesan@joesan-InfinityBook-S-14-v5:~/Projects/ros-projects/catkin_ws/devel/lib$ cd beginner_tutorials/
drwxrwxr-x joesan joesan 4 KB Mon Aug 31 18:44:39 2020  .
drwxrwxr-x joesan joesan 4 KB Mon Aug 31 18:44:37 2020  ..
.rwxrwxr-x joesan joesan 327.3 KB Mon Aug 31 18:44:39 2020  listener
.rwxrwxr-x joesan joesan 190.1 KB Mon Aug 31 18:44:39 2020  talker
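The post itself contains no accepted fix, but the RLException points at the launch-prefix, and gnome-terminal's -e option is deprecated in recent releases. As a hedged sketch (not from the original question): make sure gnome-terminal is actually installed and on roslaunch's PATH, and try the newer -- separator, which passes the node command through as trailing arguments:

<!-- Hypothetical workaround, assuming a recent gnome-terminal:
     "--" replaces the deprecated "-e <command>" form. -->
<node pkg="beginner_tutorials"
      type="listener"
      name="listener"
      output="screen"
      launch-prefix="gnome-terminal --"/>

If gnome-terminal is unavailable, xterm -e is a commonly suggested alternative prefix, since xterm also accepts the command as trailing arguments.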

Related

Liquibase "Cannot find the declaration of element 'databaseChangeLog'" in xml changeLog file

Following this guide from Liquibase's official website, I've created my own changelog-master.xml:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<databaseChangeLog xmlns="https://www.liquibase.org/xml/ns/dbchangelog"
                   xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance"
                   xsi:schemaLocation="https://www.liquibase.org/xml/ns/dbchangelog https://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.4.xsd">
    <includeAll path="/home/username/liquibase/examples/sqlite-test" filter="sql"/>
</databaseChangeLog>
I've then created the liquibase.properties file in the same folder:
# Enter the path for your changelog file.
changeLogFile=changelog-master.xml
#### Enter the Target database 'url' information ####
url=jdbc:sqlite://home/username/liquibase/examples/sqlite-test/testdb
This is correct, because if I run a normal .sql changelog it runs correctly and updates my DB.
I've then created a changelog file in SQL, 000020-changelog.sql, which has to be executed automatically when launching liquibase update:
--liquibase formatted sql
--changeset daniel:3
create table phonebook(name TEXT, surname TEXT, type TEXT, phone_number TEXT);
But when I go and launch liquibase update I get an error from the XML parser:
Unexpected error running Liquibase: cvc-elt.1.a: Cannot find the declaration of element 'databaseChangeLog'.
And I can't understand what the problem is. I've checked multiple times whether the changelog-master.xml file is correct, and it looks like it is. From what I can find, cvc-elt.1.a is an XML parsing error, as if databaseChangeLog were not declared in the XML schema.
I'm doing this so that in the future I can create as many changelogs as I want and have them executed one after the other automatically.
I've been looking for some solutions for this problem but I can't find anything. I've found a link to the official forums but it's now a dead link.
Extra info:
Liquibase version 4.0.0
Installed JRE: openjdk 11.0.8 2020-07-14
OS: Debian 10
SQLite version 3.27.2
08/09/2020 edit:
as asked in the comments this is the project's structure:
root@dev-machine:/home/username/liquibase/examples/sqlite-test# ls -la
total 68
drwxr-xr-x 2 root root 4096 Sep 4 17:28 .
drwxr-xr-x 5 username username 4096 Sep 4 17:02 ..
-rw-r--r-- 1 root root 139 Sep 4 13:35 000020-changelog.sql
-rw-r--r-- 1 root root 118 Sep 4 13:36 000030-changelog.sql
-rw-r--r-- 1 root root 201 Sep 4 17:05 000040-changelog.sql
-rw-r--r-- 1 root root 240 Sep 4 17:28 000050-changelog.sql
-rw-r--r-- 1 root root 456 Sep 4 14:22 changelog-master.xml
-rw-r--r-- 1 root root 2637 Sep 4 16:36 liquibase.properties
-rw-r--r-- 1 root root 32768 Sep 4 17:28 testdb
testdb is the SQLite database I'm using to test Liquibase. The .sql files are the consecutive changelogs that must be run to update the DB.
What was causing my issue was that my changelog-master.xml had an outdated XSD version, which was breaking the XML parser. To fix it, I changed it to the following:
<?xml version="1.1" encoding="UTF-8" standalone="no"?>
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:ext="http://www.liquibase.org/xml/ns/dbchangelog-ext"
    xmlns:pro="http://www.liquibase.org/xml/ns/pro"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog-ext
                        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-ext.xsd
                        http://www.liquibase.org/xml/ns/pro
                        http://www.liquibase.org/xml/ns/pro/liquibase-pro-4.0.xsd
                        http://www.liquibase.org/xml/ns/dbchangelog
                        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-4.0.xsd">
    <includeAll path="schema/"/>
</databaseChangeLog>
and moved all my *.sql files into the schema/ folder. Other than that, I renamed all my .sql files using the *.databasetype.sql name scheme, in my case 000020-changelog.sqlite.sql. I had to do this because otherwise the files that used --liquibase formatted sql wouldn't be executed.
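To illustrate (the layout is inferred from the post, not shown verbatim in it), the project would then look roughly like:

sqlite-test/
- changelog-master.xml
- liquibase.properties
- testdb
- schema/
  - 000020-changelog.sqlite.sql
  - 000030-changelog.sqlite.sql
  - 000040-changelog.sqlite.sql
  - 000050-changelog.sqlite.sql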
For me, this issue was more related to http vs. https. The following shows the successful config:
<?xml version="1.0" encoding="utf-8"?>
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"             <-- NOT HTTPS!
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"           <-- NOT HTTPS!
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog <-- NOT HTTPS!
                        https://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-4.4.xsd">
    <includeAll path="changelogs/" relativeToChangelogFile="true"/>
</databaseChangeLog>
If http in any one of the three lines is changed to https, it will throw the exception:
liquibase.exception.ChangeLogParseException: Error parsing line 6 column 70
of classpath:db/main.xml: cvc-elt.1.a: Cannot find the declaration of
element 'databaseChangeLog'.

Microsoft Word crashes when invoking its COM inside Docker Container

I am building a document conversion service which needs to:
Support Office documents as input.
Be pixel accurate (i.e. Openoffice and friends are not an acceptable alternative).
The service works well in the Windows host (it uses Office Interop from C#), but I would like to containerize it to simplify CI.
I am aware that Microsoft explicitly discourages using Office server-side but that's not a valid answer due to the reasons above.
Here is a minimal, simplified reproduction of the problem.
I am creating a Docker image with Microsoft Office using this Dockerfile:
FROM microsoft/windowsservercore:10.0.14393.953
# Install Office deployment tool
ADD https://download.microsoft.com/download/2/7/A/27AF1BE6-DD20-4CB4-B154-EBAB8A7D4A7E/officedeploymenttool_8008-3601.exe C:/deploymenttool_autoextract.exe
RUN C:/deploymenttool_autoextract.exe /quiet /passive /extract:C:
# Install Office
RUN C:/setup.exe /configure configuration.xml
ENTRYPOINT powershell
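Note that the Dockerfile never copies a configuration.xml into the image; it relies on the sample configuration that the deployment tool extracts next to setup.exe. The post doesn't show its contents, but a minimal Office Deployment Tool configuration (the product ID and settings here are assumptions, not taken from the post) typically looks like:

<!-- Hypothetical configuration.xml for an unattended 32-bit install -->
<Configuration>
  <Add OfficeClientEdition="32">
    <Product ID="O365ProPlusRetail">
      <Language ID="en-us" />
    </Product>
  </Add>
  <!-- No UI and auto-accepted EULA, required for a non-interactive docker build -->
  <Display Level="None" AcceptEULA="TRUE" />
</Configuration>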
Building the image works just fine (it takes a while but everything seems to be installed properly):
PS C:\> docker build -t office2016 .
Sending build context to Docker daemon 9.513 MB
Step 1/5 : FROM microsoft/windowsservercore:10.0.14393.953
---> b4713e4d8bab
Step 2/5 : ADD https://download.microsoft.com/download/2/7/A/27AF1BE6-DD20-4CB4-B154-EBAB8A7D4A7E/officedeploymenttool_8008-3601.exe C:/deploymenttool_autoextract.exe
Downloading [==================================================>] 2.373 MB/2.373 MB
---> Using cache
---> 44646838dccb
Step 3/5 : RUN C:/deploymenttool_autoextract.exe /quiet /passive /extract:C:
---> Using cache
---> 34c29ed81a48
Step 4/5 : RUN C:/setup.exe /configure configuration.xml
---> Using cache
---> ffaff2d4c553
Step 5/5 : ENTRYPOINT powershell
---> Using cache
---> 3c0905d88e16
Successfully built 3c0905d88e16
PS C:\>
But using Office (Word, specifically) fails:
PS C:\> docker run -ti office2016
Windows PowerShell
Copyright (C) 2016 Microsoft Corporation. All rights reserved.
PS C:\> $Word = New-Object -ComObject Word.Application
New-Object : Retrieving the COM class factory for component with CLSID {000209FF-0000-0000-C000-000000000046} failed due to the following error: 80080005
Server execution failed (Exception from HRESULT: 0x80080005 (CO_E_SERVER_EXEC_FAILURE)).
At line:1 char:9
+ $Word = New-Object -ComObject Word.Application
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ResourceUnavailable: (:) [New-Object], COMException
+ FullyQualifiedErrorId : NoCOMClassIdentified,Microsoft.PowerShell.Commands.NewObjectCommand
PS C:\>
I found this KERNELBASE.dll crash in the event logs:
PS C:\> Get-EventLog -LogName Application -After (Get-Date).AddMinutes(-1) | Format-Table -Wrap -AutoSize
Index Time EntryType Source InstanceID Message
----- ---- --------- ------ ---------- -------
264 Mar 29 06:35 Information Windows Error Reporting 1001 Fault bucket 108819405901, type 1
Event Name: APPCRASH
Response: Not available
Cab Id: 0
Problem signature:
P1: WINWORD.EXE
P2: 16.0.7870.2031
P3: 58d7f952
P4: KERNELBASE.dll
P5: 10.0.14393.953
P6: 58ba586d
P7: c06d007e
P8: 000da882
P9:
P10:
Attached files:
\\?\C:\Windows\Temp\WER1F8D.tmp.csv
\\?\C:\Windows\Temp\WER1F8E.tmp.txt
These files may be available here:
C:\ProgramData\Microsoft\Windows\WER\ReportArchive\AppCrash_WINWORD.EXE_31f9c9
d9099ee9f1dcb30a4d4769b8bed7fbe78_7d4a7f4a_08071f9d
Analysis symbol:
Rechecking for solution: 0
Report Id: 39ac8782-7405-4d3d-acf6-401aeeedcfe7
Report Status: 4104
Hashed bucket: d26bd2875cf81eb545bb270d4c18672b
263 Mar 29 06:35 Information Windows Error Reporting 1001 Fault bucket , type 0
Event Name: APPCRASH
Response: Not available
Cab Id: 0
Problem signature:
P1: WINWORD.EXE
P2: 16.0.7870.2031
P3: 58d7f952
P4: KERNELBASE.dll
P5: 10.0.14393.953
P6: 58ba586d
P7: c06d007e
P8: 000da882
P9:
P10:
Attached files:
These files may be available here:
C:\ProgramData\Microsoft\Windows\WER\ReportQueue\AppCrash_WINWORD.EXE_31f9c9d9
099ee9f1dcb30a4d4769b8bed7fbe78_7d4a7f4a_182f1a5d
Analysis symbol:
Rechecking for solution: 0
Report Id: 39ac8782-7405-4d3d-acf6-401aeeedcfe7
Report Status: 4100
Hashed bucket:
262 Mar 29 06:35 Error Application Error 1000 Faulting application name: WINWORD.EXE, version: 16.0.7870.2031, time stamp:
0x58d7f952
Faulting module name: KERNELBASE.dll, version: 10.0.14393.953, time stamp:
0x58ba586d
Exception code: 0xc06d007e
Fault offset: 0x000da882
Faulting process id: 0x1714
Faulting application start time: 0x01d2a8915664ac21
Faulting application path: C:\Program Files (x86)\Microsoft
Office\Root\Office16\WINWORD.EXE
Faulting module path: C:\Windows\System32\KERNELBASE.dll
Report Id: 39ac8782-7405-4d3d-acf6-401aeeedcfe7
Faulting package full name:
Faulting package-relative application ID:
261 Mar 29 06:35 0 Software Protection Platform Service 1073742727 The Software Protection service has stopped.
260 Mar 29 06:35 Information Software Protection Platform Service 1073758208 Successfully scheduled Software Protection service for re-start at
2017-03-30T13:15:26Z. Reason: RulesEngine.
PS C:\>
I welcome suggestions on how to fix this or any other way to convert Office documents in a pixel-accurate manner, preferably inside a Docker container.
Extra information
I am using Docker 1.13:
PS C:\build> docker version
Client:
Version: 1.13.0
API version: 1.25
Go version: go1.7.3
Git commit: 49bf474
Built: Wed Jan 18 16:20:26 2017
OS/Arch: windows/amd64
Server:
Version: 1.13.0
API version: 1.25 (minimum version 1.24)
Go version: go1.7.3
Git commit: 49bf474
Built: Wed Jan 18 16:20:26 2017
OS/Arch: windows/amd64
Experimental: false
Here is the Report.wer file from the crash:
Version=1
EventType=APPCRASH
EventTime=131352685691694832
ReportType=2
Consent=1
UploadTime=131352685700338868
ReportIdentifier=989281b5-1485-11e7-a710-eaddffe2c349
IntegratorReportIdentifier=269019fe-dcbd-435a-adc2-2420ed62b0ba
WOW64=1
NsAppName=WINWORD.EXE
AppSessionGuid=00001934-0018-000e-c921-465a92a8d201
TargetAppId=W:0000da39a3ee5e6b4b0d3255bfef95601890afd80709!0000da39a3ee5e6b4b0d3255bfef95601890afd80709!WINWORD.EXE
TargetAppVer=2017//03//26:17:24:34!1df35e!WINWORD.EXE
BootId=4294967295
Response.BucketId=d26bd2875cf81eb545bb270d4c18672b
Response.BucketTable=1
Response.LegacyBucketId=108819405901
Response.type=4
Response.CabId=107979656579
Sig[0].Name=Application Name
Sig[0].Value=WINWORD.EXE
Sig[1].Name=Application Version
Sig[1].Value=16.0.7870.2031
Sig[2].Name=Application Timestamp
Sig[2].Value=58d7f952
Sig[3].Name=Fault Module Name
Sig[3].Value=KERNELBASE.dll
Sig[4].Name=Fault Module Version
Sig[4].Value=10.0.14393.953
Sig[5].Name=Fault Module Timestamp
Sig[5].Value=58ba586d
Sig[6].Name=Exception Code
Sig[6].Value=c06d007e
Sig[7].Name=Exception Offset
Sig[7].Value=000da882
DynamicSig[1].Name=OS Version
DynamicSig[1].Value=10.0.14393.2.0.0.400.8
DynamicSig[2].Name=Locale ID
DynamicSig[2].Value=1033
DynamicSig[22].Name=Additional Information 1
DynamicSig[22].Value=2beb
DynamicSig[23].Name=Additional Information 2
DynamicSig[23].Value=2beba6fb4680d73a8c78ca7c24ccdb46
DynamicSig[24].Name=Additional Information 3
DynamicSig[24].Value=9cd2
DynamicSig[25].Name=Additional Information 4
DynamicSig[25].Value=9cd2e275ccb4247c4efe4926a99dfc92
UI[2]=C:\Program Files (x86)\Microsoft Office\Root\Office16\WINWORD.EXE
UI[5]=Check online for a solution (recommended)
UI[6]=Check for a solution later (recommended)
UI[7]=Close
UI[8]=Microsoft Word stopped working and was closed
UI[9]=A problem caused the application to stop working correctly. Windows will notify you if a solution is available.
UI[10]=&Close
LoadedModule[0]=C:\Program Files (x86)\Microsoft Office\Root\Office16\WINWORD.EXE
LoadedModule[1]=C:\Windows\SYSTEM32\ntdll.dll
LoadedModule[2]=C:\Windows\System32\KERNEL32.DLL
LoadedModule[3]=C:\Windows\System32\KERNELBASE.dll
LoadedModule[4]=C:\Windows\System32\ADVAPI32.dll
LoadedModule[5]=C:\Windows\System32\msvcrt.dll
LoadedModule[6]=C:\Windows\System32\sechost.dll
LoadedModule[7]=C:\Windows\System32\RPCRT4.dll
LoadedModule[8]=C:\Windows\System32\SspiCli.dll
LoadedModule[9]=C:\Windows\System32\CRYPTBASE.dll
LoadedModule[10]=C:\Windows\System32\bcryptPrimitives.dll
LoadedModule[11]=C:\Windows\System32\ucrtbase.dll
LoadedModule[12]=C:\Program Files (x86)\Microsoft Office\Root\Office16\AppVIsvSubsystems32.dll
LoadedModule[13]=C:\Windows\System32\USER32.dll
LoadedModule[14]=C:\Windows\System32\win32u.dll
LoadedModule[15]=C:\Windows\System32\GDI32.dll
LoadedModule[16]=C:\Windows\System32\gdi32full.dll
LoadedModule[17]=C:\Windows\System32\SHELL32.dll
LoadedModule[18]=C:\Windows\System32\cfgmgr32.dll
LoadedModule[19]=C:\Windows\System32\windows.storage.dll
LoadedModule[20]=C:\Windows\System32\combase.dll
LoadedModule[21]=C:\Windows\System32\powrprof.dll
LoadedModule[22]=C:\Windows\System32\shlwapi.dll
LoadedModule[23]=C:\Windows\System32\kernel.appcore.dll
LoadedModule[24]=C:\Windows\System32\shcore.dll
LoadedModule[25]=C:\Windows\System32\profapi.dll
LoadedModule[26]=C:\Windows\System32\ole32.dll
LoadedModule[27]=C:\Program Files (x86)\Microsoft Office\Root\Office16\VCRUNTIME140.dll
LoadedModule[28]=C:\Program Files (x86)\Microsoft Office\Root\Office16\AppVIsvStream32.dll
LoadedModule[29]=C:\Windows\SYSTEM32\USERENV.dll
LoadedModule[30]=C:\Program Files (x86)\Microsoft Office\Root\Office16\c2r32.dll
LoadedModule[31]=C:\Windows\System32\OLEAUT32.dll
LoadedModule[32]=C:\Windows\System32\msvcp_win.dll
LoadedModule[33]=C:\Program Files (x86)\Common Files\Microsoft Shared\Office16\mso20win32client.dll
LoadedModule[34]=C:\Program Files (x86)\Microsoft Office\Root\Office16\MSVCP140.dll
LoadedModule[35]=C:\Program Files (x86)\Microsoft Office\Root\Office16\wwlib.dll
LoadedModule[36]=C:\Windows\SYSTEM32\d3d11.dll
LoadedModule[37]=C:\Windows\WinSxS\x86_microsoft.windows.gdiplus_6595b64144ccf1df_1.1.14393.953_none_baad48403594ab3f\gdiplus.dll
LoadedModule[38]=C:\Program Files (x86)\Microsoft Office\Root\Office16\oart.dll
LoadedModule[39]=C:\Windows\SYSTEM32\dxgi.dll
LoadedModule[40]=C:\Windows\WinSxS\x86_microsoft.windows.common-controls_6595b64144ccf1df_6.0.14393.953_none_89c2555adb023171\COMCTL32.dll
LoadedModule[41]=C:\Program Files (x86)\Common Files\Microsoft Shared\Office16\mso30win32client.dll
LoadedModule[42]=C:\Program Files (x86)\Common Files\Microsoft Shared\Office16\mso40uiwin32client.dll
LoadedModule[43]=C:\Program Files (x86)\Common Files\Microsoft Shared\Office16\mso50win32client.dll
LoadedModule[44]=C:\Program Files (x86)\Common Files\Microsoft Shared\Office16\mso98win32client.dll
LoadedModule[45]=C:\Windows\SYSTEM32\WTSAPI32.dll
LoadedModule[46]=C:\Program Files (x86)\Common Files\Microsoft Shared\Office16\mso99Lwin32client.dll
LoadedModule[47]=C:\Windows\SYSTEM32\MSIMG32.dll
LoadedModule[48]=C:\Program Files (x86)\Common Files\Microsoft Shared\Office16\mso.dll
LoadedModule[49]=C:\Windows\SYSTEM32\msi.dll
LoadedModule[50]=C:\Windows\SYSTEM32\bcrypt.dll
LoadedModule[51]=C:\Windows\SYSTEM32\d2d1.dll
LoadedModule[52]=C:\Windows\SYSTEM32\CRYPT32.dll
LoadedModule[53]=C:\Windows\SYSTEM32\MSASN1.dll
LoadedModule[54]=C:\Windows\SYSTEM32\WINSTA.dll
LoadedModule[55]=C:\Windows\SYSTEM32\VERSION.dll
LoadedModule[56]=C:\Program Files (x86)\Common Files\Microsoft Shared\Office16\MSPTLS.DLL
LoadedModule[57]=C:\Windows\SYSTEM32\d3d10warp.dll
State[0].Key=Transport.DoneStage1
State[0].Value=1
File[0].CabName=WERInternalMetadata.xml
File[0].Path=WERC595.tmp.WERInternalMetadata.xml
File[0].Flags=851970
File[0].Type=5
File[0].Original.Path=\\?\C:\ProgramData\Microsoft\Windows\WER\Temp\WERC595.tmp.WERInternalMetadata.xml
File[1].CabName=triagedump.dmp
File[1].Path=triagedump.dmp
File[1].Flags=808255490
File[1].Type=6
File[2].CabName=WERGenerationLog.txt
File[2].Flags=851970
File[2].Type=5
File[2].Buffer=FFFE53006E0061007000730068006F0074002000640075006D007000650072002000640065006100630074006900760061007400650064002E000D000A002D002000200053006E0061007000730068006F007400200061007600610069006C00610062006C0065003A00200030002E000D000A002D002000200053006E0061007000730068006F00740073002000640069007300610062006C00650064003A002000300030002E000D000A002D002000200020002000200053006E0061007000730068006F00740020007300740061007400750073003A002000430030003000300030003000360031002E000D000A002D002000200020002000200020002000440075006D0070006500720020007300740061007400750073003A002000300030003000300030003000300031002E000D000A002D00200020002000500072006F0063006500730073002000570045005200200066006C006100670073003A002000300030003000300030003000300030002E000D000A002D00200057006100740073006F006E00200072006500710075006500730074002000640075006D0070003A002000300030003100300030003100410034002E000D000A000D000A0053006E0061007000730068006F0074002000640075006D007000650072002000640065006100630074006900760061007400650064002E000D000A002D002000200053006E0061007000730068006F007400200061007600610069006C00610062006C0065003A00200030002E000D000A002D002000200053006E0061007000730068006F00740073002000640069007300610062006C00650064003A002000300030002E000D000A002D002000200020002000200053006E0061007000730068006F00740020007300740061007400750073003A002000430030003000300030003000360031002E000D000A002D002000200020002000200020002000440075006D0070006500720020007300740061007400750073003A002000300030003000300030003000300031002E000D000A002D00200020002000500072006F0063006500730073002000570045005200200066006C006100670073003A002000300030003000300030003000300030002E000D000A002D00200057006100740073006F006E00200072006500710075006500730074002000640075006D0070003A002000300030003100300030003100410034002E000D000A000D000A00
File[3].CabName=memory.csv
File[3].Path=WERCE00.tmp.csv
File[3].Flags=851971
File[3].Type=5
File[3].Original.Path=\\?\C:\Windows\Temp\WERCE00.tmp.csv
File[4].CabName=sysinfo.txt
File[4].Path=WERCE30.tmp.txt
File[4].Flags=851971
File[4].Type=5
File[4].Original.Path=\\?\C:\Windows\Temp\WERCE30.tmp.txt
File[5].CabName=Report.cab
File[5].Path=Report.cab
File[5].Flags=196608
File[5].Type=11
File[5].Original.Path=\\?\C:\Windows\system32\Report.cab
FriendlyEventName=Stopped working
ConsentKey=APPCRASH
AppName=Microsoft Word
AppPath=C:\Program Files (x86)\Microsoft Office\Root\Office16\WINWORD.EXE
NsPartner=windows
NsGroup=windows8
ApplicationIdentity=51EB66DF67BACB90D7BBEDF0AB950AC2
MetadataHash=497883896
I am using Windows Server 2016 as the Docker host:
PS C:\> [System.Environment]::OSVersion
Platform ServicePack Version VersionString
-------- ----------- ------- -------------
Win32NT 10.0.14393.0 Microsoft Windows NT 10.0.14393.0
Specifically, I am running this vagrant image: https://atlas.hashicorp.com/StefanScherer/boxes/windows_2016_docker
Finally, it seems that at least one other person has tried to do this with Office 2013, also without success. See:
https://forums.docker.com/t/appcrash-kernelbase-dll-error-when-i-try-to-use-microsoft-office-in-docker-container/25706
https://answers.microsoft.com/en-us/msoffice/forum/msoffice_install-mso_winother/appcrash-kernelbasedll-error-when-i-try-to-use/c40e9fb8-fdbc-4245-b0fd-06c035e37c7b
https://forums.docker.com/t/appcrash-kernelbase-dll-error-when-i-try-to-use-microsoft-office-in-docker-container/25706/10 was updated with a working solution.

Adding Local Files in Beeline (Hive)

I'm trying to add local files via the Beeline client, however I keep running into an issue where it tells me the file does not exist.
[test@test-001 tmp]$ touch /tmp/m.py
[test@test-001 tmp]$ stat /tmp/m.py
File: ‘/tmp/m.py’
Size: 0 Blocks: 0 IO Block: 4096 regular empty file
Device: 801h/2049d Inode: 34091464 Links: 1
Access: (0664/-rw-rw-r--) Uid: ( 1036/ test) Gid: ( 1037/ test)
Context: unconfined_u:object_r:user_tmp_t:s0
Access: 2017-02-27 22:04:06.527970709 +0000
Modify: 2017-02-27 22:04:06.527970709 +0000
Change: 2017-02-27 22:04:06.527970709 +0000
Birth: -
[test@test-001 tmp]$ beeline -u jdbc:hive2://hs2-test:10000/default -n r-zubis
Connecting to jdbc:hive2://hs2-test:10000/default
Connected to: Apache Hive (version 1.2.1.2.3.0.0-2557)
Driver: Hive JDBC (version 1.2.1)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 1.2.1 by Apache Hive
0: jdbc:hive2://hs2-test:10000/def> ADD FILE '/tmp/m.py';
Error: Error while processing statement: '/tmp/m.py' does not exist (state=,code=1)
0: jdbc:hive2://hs2-test:10000/def>
What's the issue?
You can only add files that are on the box HiveServer2 is running on (and I needed to remove the quotes). I found it via a blog comment on Cloudera. Not sure why this isn't in the Beeline docs.
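For completeness, the working form of the statement (assuming the file exists on the HiveServer2 host itself) would be:

0: jdbc:hive2://hs2-test:10000/def> ADD FILE /tmp/m.py;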
If, like me, you are stuck in the position where HiveServer2 is running remotely, Beeline will let you load the files from HDFS:
hdfs dfs -put /tmp/m.py
then
beeline> add file hdfs:/user/homedir/m.py;

Dataflow stock WordCount example with DataflowPipelineRunner fails when run outside of Maven

I am able to successfully run the WordCount example using DataflowPipelineRunner with the maven exec:java command shown in the docs.
However, when I attempt to run it in my own Java 1.8 VM, it doesn't work. I am using these args (on Windows):
--project=highfive-metrics-service \
--stagingLocation=gs://highfive-dataflow-test/staging \
--runner=BlockingDataflowPipelineRunner \
--gCloudPath=C:/Progra~1/Google/CloudS~1/google-cloud-sdk/bin/gcloud.cmd
I get the following error:
2014-12-24T04:53:34.849Z: (5eada047929dcead): Workflow failed. Causes: (5eada047929dce2e): There was a problem creating the GCE VMs or starting Dataflow on the VMs so no data was processed. Possible causes:
1. A failure in user code in the worker.
2. A failure in the Dataflow code.
Next Steps:
1. Check the GCE serial console for possible errors in the logs.
2. Look for similar issues on http://stackoverflow.com/questions/tagged/google-cloud-dataflow.
Prior to the subsequent cleanup, I observed three harness instances on GCE as expected. Looking at the serial console for the first one, wordcount-jroy-1224043800-12232038-8cfa-harness-0, I see "normal" (comparing to what I see when running with Maven) looking output that ends with:
Dec 24 04:38:45 [ 16.443484] IPv6: ADDRCONF(NETDEV_CHANGE): docker0: link becomes ready
wordcount-jroy-1224043800-12232038-8cfa-harness-0 kernel: [ 16.438005] IPv6: ADDRCONF(NETDEV_CHANGE): veth30b3796: link becomes ready
Dec 24 04:38:45 wordcount-jroy-1224043800-12232038-8cfa-harness-0 kernel: [ 16.439395] docker0: port 1(veth30b3796) entered forwarding state
Dec 24 04:38:45 wordcount-jroy-1224043800-12232038-8cfa-harness-0 kernel: [ 16.440262] docker0: port 1(veth30b3796) entered forwarding state
Dec 24 04:38:45 wordcount-jroy-1224043800-12232038-8cfa-harness-0 kernel: [ 16.443484] IPv6: ADDRCONF(NETDEV_CHANGE): docker0: link becomes ready
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 12898 100 12898 0 0 2009k 0 --:--:-- --:--:-- --:--:-- 3148k
Dec 24 04:38:45 wordcount-jroy-1224043800-12232038-8cfa-harness-0 google: {"attributes":{"config":"{\"alsologtostderr\":true,\"base_task_dir\":\"/tmp/tasks/\",\"commandlines_file_name\":\"commandlines.txt\",\"continue_on_exception\":true,\"dataflow_api_endpoint\":\"https://www.googleapis.com/\",\"dataflow_api_version\":\"v1beta1\",\"log_dir\":\"/dataflow/logs/taskrunner/harness\",\"log_to_gcs\":true,\"log_to_serialconsole\":true,\"parallel_worker_flags\":{\"job_id\":\"2014-12-23_20_38_16.593375-08_10.48.106.68_-469744588\",\"project_id\":\"highfive-metrics-service\",\"reporting_enabled\":true,\"root_url\":\"https://www.googleapis.com/\",\"service_path\":\"dataflow/v1b3/projects/\",\"temp_gcs_directory\":\"gs://highfive-dataflow-test/staging\",\"worker_id\":\"wordcount-jroy-1224043800-12232038-8cfa-harness-0\"},\"project_id\":\"highfive-metrics-service\",\"python_harness_cmd\":\"python_harness_main\",\"scopes\":[\"https://www.googleapis.com/auth/devstorage.full_control\",\"https://www.googleapis.com/auth/cloud-platform\"],\"task_group\":\"nogroup\",\"task_user\":\"nobody\",\"temp_g
Dec 24 04:38:45 wordcount-jroy-1224043800-12232038-8cfa-harness-0 goo[ 16.494163] device veth29b6136 entered promiscuous mode
gle: cs_directory\":\"gs://highfive-dataflow-test/staging\",\"vm_id\":\"wordcoun[ 16.505311] IPv6: ADDRCONF(NETDEV_UP): veth29b6136: link is not ready
[ 16.507623] docker0: port 2(veth29b6136) entered forwarding state
t-jroy-122404380[ 16.507633] docker0: port 2(veth29b6136) entered forwarding state
0-12232038-8cfa-harness-0\"}","google-container-manifest":"\ncontainers:\n-\n env:\n -\n name: GCS_BUCKET\n value: dataflow-docker-images\n image: google/docker-registry\n imagePullPolicy: PullNever\n name: repository\n ports:\n -\n containerPort: 5000\n hostPort: 5000\n name: registry\n-\n image: localhost:5000/dataflow/taskrunner:20141217-rc00 \n imagePullPolicy: PullIfNotPresent\n name: taskrunner\n volumeMounts:\n -\n mountPath: /dataflow/logs/taskrunner/harness\n name: dataflowlogs-harness\n-\n env:\n -\n name: LOG_DIR\n value: /dataflow/logs\n image: localhost:5000/dataflow/shuffle:20141217-rc00 \n imagePullPolicy: PullIfNotPresent\n name: shuffle\n ports:\n -\n containerPort: 12345\n hostPort: 12345\n name: shuffle1\n -\n containerPort: 22349\n hostPort: 22349\n name: shuffle2\n volumeMounts:\n -\n mountPath: /var/shuffle\n name: dataflow-shuffle\n -\n mountPath: /dataflow/logs\n name: dataflow-logs\nversion: v1
Dec 24 04:38:45 wordcount-jroy-1224043800-12232038-8cfa-harness-0 google: beta2\nvolumes:\n-\n name: dataflowlogs-harness\n source:\n hostDir:\n path: /var/log/dataflow/taskrunner/harness\n-\n name: dataflow-shuffle\n source:\n hostDir:\n path: /dataflow/shuffle\n-\n name: dataflow-logs\n source:\n hostDir:\n path: /var/log/dataflow/shuffle\n","job_id":"2014-12-23_20_38_16.593375-08_10.48.106.68_-469744588","packages":"gs://dataflow-releases-prod/worker_packages/NOTICES.shuffle|NOTICES.shuffler|gs://highfive-dataflow-test/staging/access-bridge-64-fE-vq3Wgxy5FvnwmA5YdzQ.jar|access-bridge-64-fE-vq3Wgxy5FvnwmA5YdzQ.jar|gs://highfive-dataflow-test/staging/avro-1.7.7-dTlef6huetK-4IFERNhcqA.jar|avro-1.7.7-dTlef6huetK-4IFERNhcqA.jar|gs://highfive-dataflow-test/staging/charsets-7HC8Y2_U4k8yfkY6e4lxnw.jar|charsets-7HC8Y2_U4k8yfkY6e4lxnw.jar|gs://highfive-dataflow-test/staging/cldrdata-A4PVsm4mesLVUWOTKV5dhQ.jar|cldrdata-A4PVsm4mesLVUWOTKV5dhQ.jar|gs://highfive-dataflow-test/staging/commons-codec-1.3-2I5AW2KkklMQs3emwoFU5Q.jar|commons-codec-1.3-2I5AW2KkklMQs3emwoFU5Q.jar|gs://highfive-dataf
Dec 24 04:38:45 wordcount-jroy-1224043800-12232038-8cfa-harness-0 google: low-test/staging/commons-compress-1.4.1-uyvcB16Wfp4wnt8X1Uqi4w.jar|commons-compress-1.4.1-uyvcB16Wfp4wnt8X1Uqi4w.jar|gs://highfive-dataflow-test/staging/commons-logging-1.1.1-blBISC6STJhwBOT8Ksr3NQ.jar|commons-logging-1.1.1-blBISC6STJhwBOT8Ksr3NQ.jar|gs://highfive-dataflow-test/staging/dataflow-test-YIJKUxARCp14MLdWzNdBdQ.zip|dataflow-test-YIJKUxARCp14MLdWzNdBdQ.zip|gs://highfive-dataflow-test/staging/deploy-eLnif2izXW_mrleXudK0Eg.jar|deploy-eLnif2izXW_mrleXudK0Eg.jar|gs://highfive-dataflow-test/staging/dnsns-hmxeUSrhtJou0Wo-UoCjTw.jar|dnsns-hmxeUSrhtJou0Wo-UoCjTw.jar|gs://highfive-dataflow-test/staging/google-api-client-1.19.0-YgeHY_Y9dPd2PwGBWwvmmw.jar|google-api-client-1.19.0-YgeHY_Y9dPd2PwGBWwvmmw.jar|gs://highfive-dataflow-test/staging/google-api-services-bigquery-v2-rev167-1.19.0-mNojB6wqlFqAd2G9Zo7o5w.jar|google-api-services-bigquery-v2-rev167-1.19.0-mNojB6wqlFqAd2G9Zo7o5w.jar|gs://highfive-dataflow-test/staging/google-api-services-compute-v1-rev34-1.19.0-yR5ItN9uOowLPyMiTckyCA.jar|google-api-services
Dec 24 04:38:45 wordcount-jroy-1224043800-12232038-8cfa-harness-0 google: -compute-v1-rev34-1.19.0-yR5ItN9uOowLPyMiTckyCA.jar|gs://highfive-dataflow-test/staging/google-api-services-dataflow-v1beta3-rev1-1.19.0-Cg8Pyd4F0t7yqSE4E7v7Rg.jar|google-api-services-dataflow-v1beta3-rev1-1.19.0-Cg8Pyd4F0t7yqSE4E7v7Rg.jar|gs://highfive-dataflow-test/staging/google-api-services-datastore-protobuf-v1beta2-rev1-2.1.0-UxLefoYWxF5K1EpQjKMJ4w.jar|google-api-services-datastore-protobuf-v1beta2-rev1-2.1.0-UxLefoYWxF5K1EpQjKMJ4w.jar|gs://highfive-dataflow-test/staging/google-api-services-pubsub-v1beta1-rev9-1.19.0-7E1jg5ZyfaqZBCHY18fPkQ.jar|google-api-services-pubsub-v1beta1-rev9-1.19.0-7E1jg5ZyfaqZBCHY18fPkQ.jar|gs://highfive-dataflow-test/staging/google-api-services-storage-v1-rev11-1.19.0-8roIrNilTlO2ZqfGfOaqkg.jar|google-api-services-storage-v1-rev11-1.19.0-8roIrNilTlO2ZqfGfOaqkg.jar|gs://highfive-dataflow-test/staging/google-cloud-dataflow-java-examples-all-manual_build-A9j6W_hzOlq6PBrg1oSIAQ.jar|google-cloud-dataflow-java-examples-all-manual_build-A9j6W_hzOlq6PBrg1oSIAQ.jar|gs://highfive-dataf
Dec 24 04:38:45 wordcount-jroy-1224043800-12232038-8cfa-harness-0 google: low-test/staging/google-cloud-dataflow-java-examples-all-manual_build-tests-iIdI-AhKWiVKTuJzU5JxcQ.jar|google-cloud-dataflow-java-examples-all-manual_build-tests-iIdI-AhKWiVKTuJzU5JxcQ.jar|gs://highfive-dataflow-test/staging/google-cloud-dataflow-java-sdk-all-alpha-PqdZNVZwhs6ixh6de6vM7A.jar|google-cloud-dataflow-java-sdk-all-alpha-PqdZNVZwhs6ixh6de6vM7A.jar|gs://highfive-dataflow-test/staging/google-http-client-1.19.0-1Vc3U5mogjNLbpTK7NVwDg.jar|google-http-client-1.19.0-1Vc3U5mogjNLbpTK7NVwDg.jar|gs://highfive-dataflow-test/staging/google-http-client-jackson-1.15.0-rc-oW6nFU6Gme53SYGJ9KlNbA.jar|google-http-client-jackson-1.15.0-rc-oW6nFU6Gme53SYGJ9KlNbA.jar|gs://highfive-dataflow-test/staging/google-http-client-jackson2-1.19.0-AOUP2FfuHtACTs_0sul54A.jar|google-http-client-jackson2-1.19.0-AOUP2FfuHtACTs_0sul54A.jar|gs://highfive-dataflow-test/staging/google-http-client-protobuf-1.15.0-rc-xYoprQdNcvzuQGZXvJ3ZaQ.jar|google-http-client-protobuf-1.15.0-rc-xYoprQdNcvzuQGZXvJ3ZaQ.jar|gs://highfive-dataflow-test/st
Dec 24 04:38:45 wordcount-jroy-1224043800-12232038-8cfa-harness-0 google: aging/google-oauth-client-1.19.0-b3S5WqgD7iWrwg38pfg3Xg.jar|google-oauth-client-1.19.0-b3S5WqgD7iWrwg38pfg3Xg.jar|gs://highfive-dataflow-test/staging/google-oauth-client-java6-1.19.0-cP8xzICJnsNlhTfaS0egcg.jar|google-oauth-client-java6-1.19.0-cP8xzICJnsNlhTfaS0egcg.jar|gs://highfive-dataflow-test/staging/guava-18.0-HtxcCcuUqPt4QL79yZSvag.jar|guava-18.0-HtxcCcuUqPt4QL79yZSvag.jar|gs://highfive-dataflow-test/staging/hamcrest-all-1.3-n3_QBeS4s5a8ffbBPQIpFQ.jar|hamcrest-all-1.3-n3_QBeS4s5a8ffbBPQIpFQ.jar|gs://highfive-dataflow-test/staging/hamcrest-core-1.3-DvCZoZPq_3EWA4TcZlVL6g.jar|hamcrest-core-1.3-DvCZoZPq_3EWA4TcZlVL6g.jar|gs://highfive-dataflow-test/staging/httpclient-4.0.1-sfocsPjEBE7ppkUpSIJZkA.jar|httpclient-4.0.1-sfocsPjEBE7ppkUpSIJZkA.jar|gs://highfive-dataflow-test/staging/httpcore-4.0.1-_SGEPUOMREqA8u_h7qy9_w.jar|httpcore-4.0.1-_SGEPUOMREqA8u_h7qy9_w.jar|gs://highfive-dataflow-test/staging/idea_rt-6II88e1BKUeCOQqcrZht-w.jar|idea_rt-6II88e1BKUeCOQqcrZht-w.jar|gs://highfive-dataflow-test/staging/jacce
Dec 24 04:38:45 wordcount-jroy-1224043800-12232038-8cfa-harness-0 google: ss-laKenN34W6jKKivkBUzVcA.jar|jaccess-laKenN34W6jKKivkBUzVcA.jar|gs://highfive-dataflow-test/staging/jackson-annotations-2.4.2-7cAfM1zz0nmoSOC_NlRIcw.jar|jackson-annotations-2.4.2-7cAfM1zz0nmoSOC_NlRIcw.jar|gs://highfive-dataflow-test/staging/jackson-core-2.4.2-3CV4j5-qI7Y-1EADAiakmw.jar|jackson-core-2.4.2-3CV4j5-qI7Y-1EADAiakmw.jar|gs://highfive-dataflow-test/staging/jackson-core-asl-1.9.13-Ht2i1DaJ57v29KlMROpA4Q.jar|jackson-core-asl-1.9.13-Ht2i1DaJ57v29KlMROpA4Q.jar|gs://highfive-dataflow-test/staging/jackson-databind-2.4.2-M7rkZKQCfOO3vWkOyf9BKg.jar|jackson-databind-2.4.2-M7rkZKQCfOO3vWkOyf9BKg.jar|gs://highfive-dataflow-test/staging/jackson-mapper-asl-1.9.13-eoeZFbovPzo033HQKy6x_Q.jar|jackson-mapper-asl-1.9.13-eoeZFbovPzo033HQKy6x_Q.jar|gs://highfive-dataflow-test/staging/javaws-O8JqID6BpsXsCSRRkhii3w.jar|javaws-O8JqID6BpsXsCSRRkhii3w.jar|gs://highfive-dataflow-test/staging/jce-eMjjWzdqQh30yNZ9HMuXMA.jar|jce-eMjjWzdqQh30yNZ9HMuXMA.jar|gs://highfive-dataflow-test/staging/jfr-xDzacRGMQeIR4SdPe69o1A.jar|jfr
Dec 24 04:38:45 wordcount-jroy-1224043800-12232038-8cfa-harness-0 google: -xDzacRGMQeIR4SdPe69o1A.jar|gs://highfive-dataflow-test/staging/jfxrt-5aSYnU7M458Xy_hx5zXF8w.jar|jfxrt-5aSYnU7M458Xy_hx5zXF8w.jar|gs://highfive-dataflow-test/staging/jfxswt-X8I_DFy9gs_6LMLp6_LFPA.jar|jfxswt-X8I_DFy9gs_6LMLp6_LFPA.jar|gs://highfive-dataflow-test/staging/joda-time-2.4-EIO48_0LMn2_imYqUT5jxA.jar|joda-time-2.4-EIO48_0LMn2_imYqUT5jxA.jar|gs://highfive-dataflow-test/staging/jsr305-1.3.9-ntb9Wy3-_ccJ7t2jV2Tb3g.jar|jsr305-1.3.9-ntb9Wy3-_ccJ7t2jV2Tb3g.jar|gs://highfive-dataflow-test/staging/jsse-HOItnWzBlT4hG5HPmlF56w.jar|jsse-HOItnWzBlT4hG5HPmlF56w.jar|gs://highfive-dataflow-test/staging/junit-4.11-lCgz3FeSwzD13Q_KNW4MuQ.jar|junit-4.11-lCgz3FeSwzD13Q_KNW4MuQ.jar|gs://highfive-dataflow-test/staging/localedata-R9ei3T8qar8cibFNN0X7Qg.jar|localedata-R9ei3T8qar8cibFNN0X7Qg.jar|gs://highfive-dataflow-test/staging/management-agent-kiuGeHiVpYKGCDNexcQPIg.jar|management-agent-kiuGeHiVpYKGCDNexcQPIg.jar|gs://highfive-dataflow-test/staging/mockito-all-1.9.5-_T4jPTp05rc7PhcOO34Saw.jar|mockito-all-1.9.5-_T4jPTp0
Dec 24 04:38:45 wordcount-jroy-1224043800-12232038-8cfa-harness-0 google: 5rc7PhcOO34Saw.jar|gs://highfive-dataflow-test/staging/nashorn-x8si6abt-U04QaVUHvl_bg.jar|nashorn-x8si6abt-U04QaVUHvl_bg.jar|gs://highfive-dataflow-test/staging/paranamer-2.3-rdmhSrp7GRPVm0JexWjzzg.jar|paranamer-2.3-rdmhSrp7GRPVm0JexWjzzg.jar|gs://highfive-dataflow-test/staging/plugin-TG6U30mOzKi8yMGKYd7ong.jar|plugin-TG6U30mOzKi8yMGKYd7ong.jar|gs://highfive-dataflow-test/staging/protobuf-java-2.5.0-g0LcHblB4cg-bZEbNj3log.jar|protobuf-java-2.5.0-g0LcHblB4cg-bZEbNj3log.jar|gs://highfive-dataflow-test/staging/resources-RavNZwakZf55HEtrC9KyCw.jar|resources-RavNZwakZf55HEtrC9KyCw.jar|gs://highfive-dataflow-test/staging/rt-Z2kDZdIt-eG8CCtFIinW1g.jar|rt-Z2kDZdIt-eG8CCtFIinW1g.jar|gs://highfive-dataflow-test/staging/slf4j-api-1.7.7-M8fOZEWF4TcHiUbfZmJY7A.jar|slf4j-api-1.7.7-M8fOZEWF4TcHiUbfZmJY7A.jar|gs://highfive-dataflow-test/staging/slf4j-jdk14-1.7.7-hDm19oG8Vzi6jVY9pLtr_g.jar|slf4j-jdk14-1.7.7-hDm19oG8Vzi6jVY9pLtr_g.jar|gs://highfive-dataflow-test/staging/snappy-java-1.0.5-WxwEQNTeXiDmEGBuY9O3Og.jar|snappy-java
Dec 24 04:38:45 wordcount-jroy-1224043800-12232038-8cfa-harness-0 google: -1.0.5-WxwEQNTeXiDmEGBuY9O3Og.jar|gs://highfive-dataflow-test/staging/sunec-ffsdkJzKsC8XbuZa-XHp3Q.jar|sunec-ffsdkJzKsC8XbuZa-XHp3Q.jar|gs://highfive-dataflow-test/staging/sunjce_provider-4x9-ynTri_pg6Hhk2Zj9Ow.jar|sunjce_provider-4x9-ynTri_pg6Hhk2Zj9Ow.jar|gs://highfive-dataflow-test/staging/sunmscapi-5TwnMDAci3Hf47yMZYmN1g.jar|sunmscapi-5TwnMDAci3Hf47yMZYmN1g.jar|gs://highfive-dataflow-test/staging/sunpkcs11-vCiFLLKN99XBpHW2JTkOBw.jar|sunpkcs11-vCiFLLKN99XBpHW2JTkOBw.jar|gs://highfive-dataflow-test/staging/xz-1.0-6m1HjeacPsPpniZtMte8kw.jar|xz-1.0-6m1HjeacPsPpniZtMte8kw.jar|gs://highfive-dataflow-test/staging/zipfs-SIKQJJIhpGOgSa4tT6nStA.jar|zipfs-SIKQJJIhpGOgSa4tT6nStA.jar"},"description":"GCE Instance created for Dataflow","disks":[{"deviceName":"persistent-disk-0","index":0,"mode":"READ_WRITE","type":"PERSISTENT"}],"hostname":"wordcount-jroy-1224043800-12232038-8cfa-harness-0.c.highfive-metrics-service.internal","id":8960015560553137779,"image":"","machineType":"projects/537312487774/machineTypes/n1-stan
Dec 24 04:38:45 wordcount-jroy-1224043800-12232038-8cfa-harness-0 google: dard-4","maintenanceEvent":"NONE","networkInterfaces":[{"accessConfigs":[{"externalIp":"130.211.184.44","type":"ONE_TO_ONE_NAT"}],"forwardedIps":[],"ip":"10.240.173.213","network":"projects/537312487774/networks/default"}],"scheduling":{"automaticRestart":"TRUE","onHostMaintenance":"MIGRATE"},"serviceAccounts":{"537312487774#developer.gserviceaccount.com":{"aliases":["default"],"email":"537312487774#developer.gserviceaccount.com","scopes":["https://www.googleapis.com/auth/any-api","https://www.googleapis.com/auth/bigquery","https://www.googleapis.com/auth/cloud-platform","https://www.googleapis.com/auth/compute","https://www.googleapis.com/auth/datastore","https://www.googleapis.com/auth/devstorage.full_control","https://www.googleapis.com/auth/logging.write","https://www.googleapis.com/auth/ndev.cloudman","https://www.googleapis.com/auth/pubsub","https://www.googleapis.com/auth/userinfo.email"]},"default":{"aliases":["default"],"email":"537312487774#developer.gserviceaccount.com","scopes":["https://www.goog
Dec 24 04:38:45 wordcount-jroy-1224043800-12232038-8cfa-harness-0 google: leapis.com/auth/any-api","https://www.googleapis.com/auth/bigquery","https://www.googleapis.com/auth/cloud-platform","https://www.googleapis.com/auth/compute","https://www.googleapis.com/auth/datastore","https://www.googleapis.com/auth/devstorage.full_control","https://www.googleapis.com/auth/logging.write","https://www.googleapis.com/auth/ndev.cloudman","https://www.googleapis.com/auth/pubsub","https://www.googleapis.com/auth/userinfo.email"]}},"tags":["dataflow"],"zone":"projects/537312487774/zones/us-central1-a"}
Dec 24 04:38:45 wordcount-jroy-1224043800-12232038-8cfa-harness-0 google: No startup script found in metadata.
Not sure what I should be looking for, but this seems to reliably fail for me in this manner. I see the same problem when I try to run a custom pipeline of my own (i.e. not WordCount), and also when I run the WordCount example on Linux.
I saved off a file where I recorded:
The complete output from the WordCount main class
The metadata field values set on the GCE instance
The complete serial console output
It is available here.
Things I've tried so far, without success:
Forcing the language level of the compiled classes to 1.7 (I am using a 1.8 JRE)
Modifying DataflowPipelineRunner::detectClassPathResourcesToStage to not emit JRE jar files (this is a difference I noticed in the log compared to Maven; when running under Maven the JRE jars are not staged).
EDIT: Attempting to set the classpath to EXACTLY the same as what Maven ends up using (removing all of our projects' dependencies). This seemed to change the behavior a bit and I got to a java.lang.ClassNotFoundException: com.google.cloud.dataflow.examples.WordCount$ExtractWordsFn in the worker output.
Strongly suspicious that the problem lies with the staged classpath, but without more specific error messages, I'm shooting in the dark. Would appreciate ideas of where to look next or other things to try.
When running pipelines using [Blocking]DataflowPipelineRunner from the Cloud Dataflow Java SDK, the runner automatically copies everything from your local Java class path to a staging location in Google Cloud Storage, which is accessed by workers on demand.
ClassNotFoundException in the Cloud Dataflow worker environment is an indication that required dependencies for your pipeline are not properly staged in a Google Cloud Storage bucket. This likely root cause can be confirmed by looking at the contents of your staging bucket in Google Developers Console and the console output of BlockingDataflowPipelineRunner.
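For example, a quick way to inspect the staged files (a hedged sketch; gsutil is assumed to be installed, and the bucket path is taken from the question) is:

gsutil ls gs://highfive-dataflow-test/staging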
Now, the problem can be fixed by bundling all dependencies into a single, monolithic jar. In Maven, the following command can be used to create such a jar as long as the bundle plugin is properly configured to embed all transitive dependencies:
mvn bundle:bundle
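The answer doesn't show the plugin configuration; as a hedged sketch, one way to configure the Apache Felix maven-bundle-plugin to embed transitive dependencies is:

<!-- Hypothetical POM fragment; not from the original answer -->
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <configuration>
    <instructions>
      <!-- Embed compile/runtime dependencies, including transitive ones -->
      <Embed-Dependency>*;scope=compile|runtime</Embed-Dependency>
      <Embed-Transitive>true</Embed-Transitive>
    </instructions>
  </configuration>
</plugin>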
Then, the bundled jar can be executed normally, such as:
java -cp <bundled jar> <main class> --project=<project> ...
Alternatively, the problem can be fixed by manually adding dependencies to your local class path. For example, the following command may be helpful when running an unbundled jar:
java -cp <unbundled jar>:<dep1>:<dep2>:...:<depN> <main class> --project=<project> ...
where dep1 to depN are all the dependencies needed for execution of the program. This is clearly error prone, and we don't endorse it. Our documentation recommends using mvn exec:java because that sets the execution class path automatically from the dependencies listed in the POM file. Specifically, to run the WordCount example, use:
mvn exec:java -pl examples \
-Dexec.mainClass=com.google.cloud.dataflow.examples.WordCount \
-Dexec.args="--project=<YOUR GCP PROJECT NAME> --stagingLocation=<YOUR GCS LOCATION> --runner=BlockingDataflowPipelineRunner"
The main difference between the bundled and unbundled versions is in the upload activity before pipeline submission. The unbundled version has the advantage that it can automatically reuse unchanged dependencies that may have been uploaded in previous submissions.
To summarize, use mvn exec:java when running an unbundled jar, or bundle the dependencies into a monolithic jar. We'll try to clarify this in the documentation.
There's a very high likelihood that this is an issue with staging dependencies.
There's a high probability that if you create a bundled jar, it will just work. You can create a bundled jar by running the command
mvn bundle:bundle
This will create a single jar that should pull in all dependencies transitively. You then just need to add that jar to your class path and Dataflow should automatically stage it, thereby ensuring that your code, as well as any dependencies, is available on the worker.
Most likely, the job worked with mvn exec:java because Maven automatically generates a class path with all dependencies from the POM. When running manually, that doesn't happen; that is, if you invoke java directly, e.g.
java -cp <JAR FILES> your.main.class --project=<YOUR PROJECT> ....
then you must add all dependencies to the class path so that they get staged. Creating a bundled jar as suggested above is usually the easiest way to do that.
My suggestion would be to look at the worker logs to see if we can find additional information about what's going on in the workers.
There are three ways to get this information. The first is via the Dataflow UI. Go to the Google Cloud Console and then select the Dataflow option in the left hand frame. You should see a list of your jobs. You can click on the job in question. This should show you a graph of your job. On the right side you should see a button "view logs". Please click that. You should then see a UI for navigating the logs and you can look for errors.
The second option is to look for the logs on GCS. The location to look for is:
gs://PATH TO YOUR STAGING DIRECTORY/logs/JOB-ID/VM-ID/LOG-FILE
You might see multiple log files. The one we are most interested in is the one that starts with "start_java_worker". If that log file doesn't exist, then the worker didn't make enough progress to actually upload the file, or there might have been a permission problem uploading the log file.
In that case the best thing to do is to try to ssh into one of the VMs before it gets torn down. You should have about 15 minutes before the job fails and the VMs are deleted.
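For example (a hedged sketch; the instance name and zone are taken from the serial console output earlier in the question):

gcloud compute ssh wordcount-jroy-1224043800-12232038-8cfa-harness-0 --zone us-central1-a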
Once you log in to the VM, you can find all the logs in
/var/log/dataflow/...
The log we care most about at this point is:
/var/log/dataflow/taskrunner/harness/start_java_worker-SOME ID.log
If there is a problem starting the code that runs on the VM that log should tell us. That log and the other logs should also tell us if there is a permission problem that prevents the code running on the worker from being able to access Dataflow.
Please take a look and let us know if you find anything.

grails emacs mode - "Cannot open load file" "project-mode"

I am trying to use Emacs for Grails development and tried grails-emacs-mode.
There are emacs and emacs23 on my ubuntu 12.04.
prayag@prayag:~$ ls -l /usr/share/emacs[tab][tab]
emacs/ emacs23/
As grails-emacs-mode suggests, I copied the grails-mode.el and primary.org files to my emacs23/site-lisp:
prayag@prayag:~$ ls -l /usr/share/emacs23/site-lisp/
total 32
-rw-r--r-- 1 root root 3013 Nov 17 00:39 debian-startup.elc
drwxr-xr-x 2 root root 4096 Nov 17 00:39 dictionaries-common
-rw-r--r-- 1 root root 18205 Feb 14 01:11 grails-mode.el
-rw-r--r-- 1 root root 0 Feb 14 01:11 primary.org
-rw-r--r-- 1 root root 106 Sep 22 01:16 subdirs.el
Then I created init.el inside .emacs.d, as there is no .emacs file in my home directory. The init.el contains:
(require 'grails-mode)
(setq grails-mode t)
(setq project-mode t)
(add-to-list 'auto-mode-alist '("\\.gsp$" . nxml-mode)) ; Use whatever mode you want for views.
(project-load-all) ; Loads all saved projects. Recommended, but not required.
Now, opening emacs23 doesn't show any Grails menu in the menubar.
I also tried
M-x load-file .emacs.d/init.el
which throws
Warning (initialization): An error occurred while loading `/home/prayag/.emacs.d/init.el':
File error: Cannot open load file, project-mode
To ensure normal operation, you should investigate and remove the
cause of the error in your initialization file. Start Emacs with
the `--debug-init' option to view a complete error backtrace.
On starting emacs23 --debug-init, following error is thrown.
Debugger entered--Lisp error: (file-error "Cannot open load file" "project-mode")
require(project-mode)
eval-buffer(#<buffer *load*<2>> nil "/usr/share/emacs/23.3/site-lisp/grails-mode.el" nil t) ; Reading at buffer position 422
load-with-code-conversion("/usr/share/emacs/23.3/site-lisp/grails-mode.el" "/usr/share/emacs/23.3/site-lisp/grails-mode.el" nil t)
require(grails-mode)
eval-buffer(#<buffer *load*> nil "/home/prayag/.emacs.d/init.el" nil t) ; Reading at buffer position 23
load-with-code-conversion("/home/prayag/.emacs.d/init.el" "/home/prayag/.emacs.d/init.el" t t)
load("/home/prayag/.emacs.d/init" t t)
#[nil "\205\264
grails-mode requires project-mode as mentioned on the emacs-grails-mode page. So, you'll also need to install project-mode.
Also grab the remaining Groovy packages (all but grails-mode) from here.
emacs-grails-mode-ext is a modest contribution to grails-mode allowing you to run Grails commands directly from Emacs. For a given project (project-mode), you can run Grails commands such as create-domain-class, create-service, etc.
I also use the function ido-find-file-in-tag-files from here; I bind it to C-x C-M-f.
Simple guide with emacs-grails-mode:
Create a project from the command line or eshell -> grails create-app yourapp
Using dired, go to your Grails project folder
M-x project-new -> to create a new project (project-mode)
M-x project-save -> Save the project
M-x project-load-and-select -> Your project-name as argument
There's also a Grails menu if you use the menubar
You could also use my current emacs setup here, if you have emacs24 installed. I believe that it's available for Ubuntu 12.04, but I'm not sure. I usually build emacs from source on OSX or I use emacs-snapshot in Ubuntu.
Hope this helps.
I don't know grails-mode at all. I've just clicked your link, and they state that project-mode is a dependency:
Dependencies:
project-mode is the only dependency.
As a consequence you will also have to install it. Link to the Emacs project code.
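As a hedged sketch (the site-lisp path is an assumption based on the question's setup), the init.el should then load project-mode before grails-mode:

;; Hypothetical init.el fragment: project-mode.el must be on the
;; load-path, because grails-mode (require)s it at load time.
(add-to-list 'load-path "/usr/share/emacs23/site-lisp/")
(require 'project-mode)
(require 'grails-mode)
(setq grails-mode t)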
I hope, for your sake, that this project does not have additional dependencies...
As a side note: later, try to install the latest Emacs (v24), which embeds a very convenient way to handle package installation and package dependencies. I've just checked it out; it is present on an alternative (but well-known) repository: Marmalade-repo.org.
