GraalVM: Access to native code is not allowed by the host environment

I recently set up a CentOS 7 VM to play around with GraalVM. I downloaded graalvm-1.0.0-rc1, installed NetBeans 8.2, and downloaded the FastR extension (via gu). I then wrote a simple Java program to test some of the supported languages. Below is the code I wrote:
package javatest;

import org.graalvm.polyglot.*;
import java.io.PrintStream;
import java.util.Set;

public class JavaTest {
    public static void main(String[] args) {
        PrintStream output = System.out;
        Context context = Context.create();
        Set<String> languages = context.getEngine().getLanguages().keySet();
        output.println("Current Languages available in GraalVM: " + languages);
        // TODO code application logic here
        System.out.println("Java: Hello World");
        context.eval("js", "print('JavaScript: Hello World')");
        context.eval("R", "print('R: Hello World');");
    }
}
Output is as follows:
run:
Current Languages available in GraalVM: [R, js, llvm]
Java: Hello World
JavaScript: Hello World
FastR unexpected failure: error loading libR from: /usr/local/graalvm-1.0.0-rc1/jre/languages/R/lib/libR.so.
If running on NFI backend, did you provide location of libtrufflenfi.so as value of system property 'truffle.nfi.library'?
The current value is '/usr/local/graalvm-1.0.0-rc1/jre/lib/amd64/libtrufflenfi.so'.
Details: Access to native code is not allowed by the host environment.
Exception in thread "main" org.graalvm.polyglot.PolyglotException
at org.graalvm.polyglot.Context.eval(Context.java:336)
at javatest.JavaTest.main(JavaTest.java:32)
As you can see from the initial call that lists the supported languages, GraalVM recognizes that R is installed, but as soon as I call eval for that language it bails out. The libtrufflenfi.so file is there and readable, and I have defined it as a run parameter (even though I shouldn't need to).
I can find nothing on why "access to native code is not allowed by the host environment" is being reported, and I'm at a loss. Any ideas on what I'm doing wrong? Note: I also tried the same test with Python and Ruby and got the same result, but I removed them to keep this the simplest possible test case.

This is a security feature of polyglot contexts created with the GraalVM polyglot API. By default every language is isolated from the host environment, so it is not allowed to access Java classes, native code, or files in your filesystem. As of GraalVM 1.0.0-RC1, the languages Ruby and R need native access to boot their environments, while JavaScript and Python do not.
If you want to create a context with all access enabled, you can build it like this:
Context.newBuilder().allowAllAccess(true).build();
You can also just selectively allow access to native code:
Context.newBuilder().allowNativeAccess(true).build();
Here is your example fixed:
package javatest;

import org.graalvm.polyglot.*;
import java.io.PrintStream;
import java.util.Set;

public class JavaTest {
    public static void main(String[] args) {
        PrintStream output = System.out;
        Context context = Context.newBuilder().allowAllAccess(true).build();
        Set<String> languages = context.getEngine().getLanguages().keySet();
        output.println("Current Languages available in GraalVM: " + languages);
        // TODO code application logic here
        System.out.println("Java: Hello World");
        context.eval("js", "print('JavaScript: Hello World')");
        context.eval("R", "print('R: Hello World');");
    }
}
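If you prefer not to grant every permission, a minimal variant of the same program can enable only native access. This is just a sketch, assuming the same GraalVM 1.0.0-RC1 install and that Context implements AutoCloseable (as in the GraalVM SDK), so try-with-resources closes the context when the work is done:

import org.graalvm.polyglot.Context;

public class JavaTestNativeOnly {
    public static void main(String[] args) {
        // Allow only native access; host class access, file access, etc. stay disabled.
        try (Context context = Context.newBuilder().allowNativeAccess(true).build()) {
            context.eval("js", "print('JavaScript: Hello World')"); // JS needs no extra permissions
            context.eval("R", "print('R: Hello World');");          // R needs native access to load libR
        } // the context is closed automatically here
    }
}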
Here are some more examples that use all access for Ruby and R:
http://www.graalvm.org/docs/graalvm-as-a-platform/embed/

Related

analyzer_plugin generates Legacy protocol of Dart analyzer server instead of LSP

I am trying to write some custom lint rules. To achieve this, I used the analyzer_plugin package and set up my project as it should be. Here is a simplified excerpt of the main class:
class LintAnalyzerPlugin extends ServerPlugin {
  @override
  Future<void> analyzeFile({required AnalysisContext analysisContext, required String path}) async {
    channel.sendNotification(
      AnalysisErrorsParams(path, [getAnalysisError(path)]).toNotification(),
    );
  }
}
channel.sendNotification is called, but no message is displayed in the VS Code Problems panel.
After some investigation, I found out that the JSON generated for the sent notification uses the Dart analyzer server's legacy protocol, whereas the analyzer server run by the Dart Code extension expects LSP (the Microsoft Language Server Protocol).
Fortunately the extension offers a setting to start the server with the legacy protocol:
"dart.useLegacyAnalyzerProtocol": true
And with that, the VS Code Problems panel shows the sent notifications.
Unfortunately the Dart Code extension advises using LSP, because the legacy protocol will eventually be removed some day.
Is it possible to generate LSP messages instead? Or did I miss something?
If anyone has any suggestions, I'm all ears.

Dart Functions Framework usage

I'm new to the Dart functions framework. My goal is to use this package to create several functions and deploy them to Cloud Run (in combination with Firebase, but I guess that's irrelevant to this question).
I've run the quick starts and I've read all of the contents in the docs.
The quick starts cover just one function at a time (e.g. Hello World, Cloud Events, etc.), like this:
import 'package:functions_framework/functions_framework.dart';
import 'package:shelf/shelf.dart';

@CloudFunction()
Response function(Request request) {
  return Response.ok('Hello, World!');
}
But as you can see in the quick starts, only one function is handled per project at a time. What if I want to deploy several functions? Should I:
write several functions in the same project/file, so that the functions framework builds the server.dart by itself,
OR
create a separate functions_framework project for each function?
Let me be more specific. Should I do the following (option 1 - which makes more sense to me):
import 'dart:math';

import 'package:functions_framework/functions_framework.dart';
import 'package:shelf/shelf.dart';

@CloudFunction()
Response function(Request request) {
  return Response.ok('Hello, World!');
}

@CloudFunction()
Response function2(Request request) {
  if (Random().nextBool()) {
    return Response.ok('Hello, World!');
  } else {
    return Response.internalServerError();
  }
}
Or should I build a different folder by running a build_runner for each function I need in my project?
Is there a difference and/or a best practice?
Thanks in advance.
EDIT. This question is related to the deployment on Cloud Run itself, and not just testing on my own PC. To test my own functions I did the following:
Run dart run build_runner build, so that it updates the server.dart file correctly (I can see that the framework does a lot behind the scenes and that the _nameToFunctionTarget is basically a router);
Run the server in two different terminals, like this: dart run bin/server.dart --port MYPORT --target MYFUNCTION (where MYPORT and MYFUNCTION are either 8080/8081 or function/function2 respectively).
I guess I'm just confused on how to correctly manage this framework once deployed on Cloud Run.
EDIT 2. I just gave up on using Dart as a serverless language or even a backend language. There's just too much jargon even for the basic things. Every backend framework is either dead or maintained by a single enthusiast (props to him!). The language has not yet received enough love from the Google team or the community, and at this moment in time it is basically not possible to go full-stack on Dart alone. It's a dream, but it can't be realized yet. Furthermore, Dart still lacks proper SDKs for Firestore and the like, so Firebase isn't really an option. I find it easier to just learn Node.js and use the Firebase support for functions written in Node.js, and I'll wait to see whether more Dart support arrives in the future, if it ever does.
The documentation is a bit sparse right now (and I'm new to it also! I couldn't find any good examples, so here goes...)
You can only have a single function that is served. It should be named function (the type and name can be overridden; see the cloudevent example, dartfn generate cloudevent).
You could have many of these deployed so that each one does a specific thing, such as processing cloud events as above, but most people want something more REST-like (see next).
You need to attach a Router() so that the single entry point (function) can dispatch to specific logic in your code.
Example for REST:
Add shelf_router: ^1.1.2 to pubspec.yaml (under dependencies:).
Delegate the @CloudFunction to the Router():
functions.dart
import 'package:functions_framework/functions_framework.dart';
import 'package:shelf/shelf.dart';
import 'package:shelf_router/shelf_router.dart';

Router app = Router()
  ..get('/health', (Request request) {
    return Response.ok('healthy');
  })
  ..get('/user/<user>', (Request request, String user) {
    // fetch the user... (probably return as json)
    return Response.ok('hello $user');
  })
  ..post('/user', (Request request) {
    // convert request body to json and persist... (probably return as json)
    return Response.ok('saved the user');
  });

@CloudFunction()
Future<Response> function(Request request) => app.call(request);

Generate Swagger 2.0 Specification from JAX-RS 1.0 annotation via Java Main class

I have a legacy project that uses JAX-RS 1.x and Ant builds.
I would like to generate a Swagger specification by scanning the annotations, but I don't want to require people to have a running instance of my web application. Instead, I would like to do it via an Ant task that (perhaps) just invokes a Java main method, which runs the scanner and writes the specification to an output directory.
I have found lots of documentation on how to generate a Swagger spec in the context of a running web application, but not from a plain Java main application.
I do recognize that the URI of the endpoints is not well-defined outside the context of a running web app, but that doesn't concern us because we are mainly interested in generating documentation.
Here's some code that works, except that I can't seem to get the info section into the generated JSON:
final Info info = new Info();
info.setTitle(API_TITLE);
info.setDescription(API_DESCRIPTION);
info.setVersion(API_VERSION);
BeanConfig beanConfig = new BeanConfig();
// TODO Some of these do not seem to end up in the JSON file:
beanConfig.setTitle(API_TITLE);
beanConfig.setDescription(API_DESCRIPTION);
beanConfig.setVersion(API_VERSION);
beanConfig.setSchemes(new String[]{"https"});
beanConfig.setHost(HOST);
beanConfig.setBasePath(BASE_PATH);
beanConfig.setResourcePackage(RESOURCE_PACKAGE);
beanConfig.setScan(true);
beanConfig.setInfo(info); // TODO - This has no effect
Swagger swagger = beanConfig.getSwagger();
swagger.setInfo(info); // TODO - This has no effect
ObjectMapper objectMapper = new ObjectMapper();
File outputFile = new File(SWAGGER_OUTPUT_FILE_PATH + SWAGGER_OUTPUT_FILENAME);
// Force the file and directory to be created if they don't exist; otherwise an error
// will be thrown when we pass the file to ObjectMapper.
outputFile.getParentFile().mkdirs(); // force creation of the output directory if it doesn't exist
outputFile.createNewFile();
objectMapper.writerWithDefaultPrettyPrinter().writeValue(outputFile, swagger);
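For what it's worth, one variation I have been experimenting with (not sure it changes the missing-info behaviour, and assuming swagger-core 1.5.x where io.swagger.util.Json is available) is to serialize with swagger-core's own preconfigured Jackson mapper instead of a bare ObjectMapper:

// Hypothetical alternative serialization: swagger-core's Json helper registers Swagger's
// own Jackson serializers, whereas a plain ObjectMapper does not.
String json = io.swagger.util.Json.pretty(swagger);
java.nio.file.Files.write(outputFile.toPath(), json.getBytes(java.nio.charset.StandardCharsets.UTF_8));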

Open ucanaccess/jackcess database in wildfly using iso-8859-1

I have a connection to an MS Access 2000 database defined in WildFly 9.0.2, and it works fine. Using the command-line UCanAccess client, I run it with -Dfile.encoding=ISO-8859-1 in order to have national characters (Norwegian) displayed correctly on Ubuntu. On OS X the command line displays national characters correctly without any JRE option. However, the WildFly instance is also running on OS X and does not display national characters correctly (currently they are just written to the console in a simple test). Using the UCanAccess driver in any Java-based SQL client like DBeaver or SQuirreL SQL "just works" when it comes to the character set. However, when querying the database via JPA and WildFly, the national characters are replaced with '?'.
So, there is a way to specify a particular "opener" on the JDBC URL for Jackcess:
......mdb;jackcessOpener=ucaextension.JackcessWithCharsetISO88591Opener
where the "opener" looks like this:
package ucaextension;

// imports assume Jackcess 2.x and the UCanAccess driver on the classpath
import java.io.File;
import java.io.IOException;
import java.nio.charset.Charset;

import com.healthmarketscience.jackcess.Database;
import com.healthmarketscience.jackcess.DatabaseBuilder;
import net.ucanaccess.jdbc.JackcessOpenerInterface;

public class JackcessWithCharsetISO88591Opener implements JackcessOpenerInterface {
    public Database open(File f, String pwd) throws IOException {
        DatabaseBuilder db = new DatabaseBuilder(f);
        db.setCharset(Charset.forName("ISO-8859-1"));
        try {
            db.setReadOnly(false);
            return db.open();
        } catch (IOException e) {
            db.setReadOnly(true);
            return db.open();
        }
    }
}
(yes, the exception handling should at least issue a warning.)
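For reference, this is roughly how such an opener is wired up from plain JDBC (a sketch; the database path is hypothetical, and the jackcessOpener URL parameter names the class above):

import java.sql.Connection;
import java.sql.DriverManager;

public class UcaOpenerDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical .mdb path; jackcessOpener points at the custom opener class.
        String url = "jdbc:ucanaccess:///path/to/database.mdb"
                + ";jackcessOpener=ucaextension.JackcessWithCharsetISO88591Opener";
        try (Connection conn = DriverManager.getConnection(url)) {
            System.out.println("Connected: " + conn.getMetaData().getDatabaseProductName());
        }
    }
}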
So I packaged this as a jar file (with Maven) and removed the old connection, driver, and module definitions in WildFly. Then I added this jar file, along with the others for the ucanaccess module (UCanAccess itself, HSQLDB, etc.), recreated the driver and connection, now with the opener parameter, and redeployed the war using it. But WildFly complains:
Caused by: java.lang.ClassNotFoundException: ucaextension.JackcessWithCharsetISO88591Opener from [Module "com.ucanaccess:main" from local module loader #1060b431 (finder: local module finder #612679d6 (roots: /Users/jonmartinsolaas/Local/wildfly-9.0.2.Final/modules,/Users/jonmartinsolaas/Local/wildfly-9.0.2.Final/modules/system/layers/base))]
So clearly the URL parameter has been picked up, but the class is not found, even though it is deployed along with the other jars for the driver. The class is actually in the jar file. Do I need to reference it from a MANIFEST.MF Class-Path entry in one of the other jars, or something like that?
It turns out that the issue was simply that various consoles don't show the national characters. That, plus the fact that I actually had to specify the charset when running on the Ubuntu command line, led me to believe there was a problem; displaying the data in the browser rather than in the logging console confirmed that the data itself was fine. There is no need for a Jackcess "opener" for a specific character set.

Why Java Web Service Client (CXF, JAX-WS, JDK1.6) Exhibits Different Behavior in Grails app? A CLASSPATH fix?

BACKGROUND:
The current Grails application has to interact with a 'legacy' web service from a third-party vendor (Systinet). I used the Apache CXF wsdl2java tool to generate the complex types and service interfaces. Pretty standard stuff so far, and this works perfectly from Java.
After writing some test classes and main() methods to exercise the Java code, and providing a thin layer on top as a simplified interface, I wanted to call this code from the Grails app; specifically from Grails controllers, services, Quartz jobs, and the like. However, this is where things got interesting.
The first stack trace, from the Grails CXF plug-in, was a FileNotFoundException. Given that I shouldn't need to load a WSDL definition at all (I already ran CXF's wsdl2java successfully), it seems there is something I'm missing here. I tried substituting a file:/// URL for the WSDL and got another exception.
At the end of all this, after removing plug-ins of any sort, I reconfigured the project with the CXF dependencies by hand and now got a marshalling exception, essentially from the CXF-generated code, which by the way executes perfectly from a plain Java class.
I'm sure someone must have come across this issue in their Grails integrations. As always, your guidance is most appreciated!
1) Why, in the Grails application, does the runtime attempt to parse the WSDL? Note that the JDK versions are the same: java version "1.6.0_12".
2) Are there any CLASSPATH workarounds anyone can suggest? I guess an alternative approach is to rewrite the Java middle-layer calls with GroovyWS, but that would be quite an effort given the number of services and custom types the vendor has baked in.
static {
    URL url = null;
    try {
        url = new URL("http://mydevhost:9080/wasp/bmc-security/ctsa/person");
    } catch (MalformedURLException e) {
        System.err.println("Can not initialize the default wsdl from server");
        // e.printStackTrace();
    }
    WSDL_LOCATION = url;
}

/* static {
    URL url = null;
    try {
        url = new URL("file:///C:/Projects/beta/workspace/reqmgr3/wsdl/Person.wsdl");
        url.getPath();
    } catch (MalformedURLException e) {
        System.err.println("Can not initialize the default wsdl from file system");
        // e.printStackTrace();
    }
    WSDL_LOCATION = url;
} */
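A third variant I have been considering (a sketch; PersonService and the /wsdl/Person.wsdl resource path are hypothetical stand-ins for the generated service class and a copy of the WSDL bundled inside the artifact) is to resolve the WSDL from the classpath, so the client never needs a network or absolute file URL at startup:

static {
    // Hypothetical: Person.wsdl packaged on the classpath, e.g. under WEB-INF/classes/wsdl/
    URL url = PersonService.class.getResource("/wsdl/Person.wsdl");
    if (url == null) {
        System.err.println("Can not locate Person.wsdl on the classpath");
    }
    WSDL_LOCATION = url;
}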
Stack traces:
INFO: No Trust Decider configured for Conduit ...
Aug 11, 2010 6:26:16 PM org.apache.cxf.transport.http.HTTPConduit finalizeConfig
INFO: No Basic Auth Supplier configured for Conduit '...
Aug 11, 2010 6:26:16 PM org.apache.cxf.transport.http.HTTPConduit prepare
INFO: Chunking is set at 2048.
Aug 11, 2010 6:26:16 PM org.apache.cxf.phase.PhaseInterceptorChain doIntercept
INFO: Interceptor has thrown exception, unwinding now
org.apache.cxf.interceptor.Fault: Marshalling Error: com.systinet.wsdl.com.bmc.security.ess.webservice.holder.ArrayOfLoginPairHolder is not known to this context
at org.apache.cxf.jaxb.JAXBEncoderDecoder.marshall(JAXBEncoderDecoder.java:132)
at org.apache.cxf.jaxb.io.XMLStreamDataWriter.write(XMLStreamDataWriter.java:42)
at org.apache.cxf.jaxb.io.XMLStreamDataWriter.write(XMLStreamDataWriter.java:30)
at org.apache.cxf.interceptor.BareOutInterceptor.handleMessage(BareOutInterceptor.java:73)
at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:148)
at org.apache.cxf.endpoint.ClientImpl.invoke(ClientImpl.java:215)
at org.apache.cxf.frontend.ClientProxy.invokeSync(ClientProxy.java:73)
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:122)
at $Proxy44.login(Unknown Source)
...
... 2 more
UPDATE 15-Aug:
Decided, for both modularity and expediency, to put this code into a separate WAR project, which will offer its own limited services rather than expose the original vendor web services, which are too unwieldy.
This project will be pure Java and leverages the Metro 2.0.1 runtime, which is around 16 MB.
Calling the Java-based middleware services from Grails now becomes possible after clearing out the lib and src/java folders; basically I just installed the ws-client plugin and set up local services such as the following:
import groovyx.net.ws.WSClient
import org.grails.plugins.wsclient.service.WebService

class LocalPersonService {
    WebService webService
    groovyx.net.ws.WSClient _proxy
    static final String PERSON_WSDL_URL = "http://localhost:9090/pri/PersonServicePort?wsdl"
    def transactional = false

    def getPersonDetails( String customerId, User userAccount, String userCredential ) {
        // must cache the proxy
        if ( _proxy == null ) {
            print( "init proxy. Parsing wsdl..." )
            try {
                _proxy = webService.getClient(PERSON_WSDL_URL)
            }
            catch ( Throwable tr ) { println( tr.getMessage() ) }
        }
        // method shall return a (com.siventures.example.service.PersonDetails)
        return _proxy.getPersonDetails( customerId, userAccount, userCredential, ... )
    }
}
