Unable to find DEFAULT_INSTANCE when querying Datastore in Dataflow - google-cloud-dataflow

I am basically just following the word count example to pull data from Datastore in Dataflow, like:
import com.google.api.services.datastore.DatastoreV1;
import com.google.cloud.dataflow.sdk.io.DatastoreIO;
import com.google.cloud.dataflow.sdk.io.Read;
import com.google.cloud.dataflow.sdk.values.PCollection;

// Build a query for the target entity kind.
DatastoreV1.Query.Builder q = DatastoreV1.Query.newBuilder();
q.addKindBuilder().setName([entity kind]);
DatastoreV1.Query query = q.build();

// Configure the Datastore source and read it into a PCollection.
DatastoreIO.Source source = DatastoreIO.source()
    .withDataset([...])
    .withQuery(query)
    .withNamespace([...]);
PCollection<DatastoreV1.Entity> collection = pipeline.apply(Read.from(source));
But it keeps failing on:
java.lang.RuntimeException: Unable to find DEFAULT_INSTANCE in com.google.api.services.datastore.DatastoreV1$Query
    at com.google.protobuf.GeneratedMessageLite$SerializedForm.readResolve(GeneratedMessageLite.java:1065)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at ...
I couldn't find any relevant solution on the internet so far.
Could somebody suggest a general direction on what might be going wrong?

Protocol Buffers come with certain restrictions. Among others, you have to link in the protobuf Java runtime that matches the version of the protoc compiler the code was generated with, and you can (normally) have only one runtime present. These restrictions apply to all uses of Protocol Buffers; they aren't Dataflow-specific.
Dataflow SDK for Java, version 1.4.0 and older, depends on protobuf version 2.5 and links in a Datastore client library generated with the corresponding protoc compiler. The easiest solution is not to override any protobuf-java and google-api-services-datastore-protobuf dependencies and let them be brought into your project by the Dataflow SDK.
If you really have to upgrade to protobuf version 3 for an unrelated reason, you should also upgrade google-api-services-datastore-protobuf to version v1beta2-rev1-4.0.0, because that one was generated with the corresponding protoc compiler. Please note that this is a workaround for Datastore only -- I would expect other dependencies that require protobuf version 2 to break, unless they are upgraded too.
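As a concrete illustration, here is a minimal sketch of the Maven coordinates that workaround implies, assuming the usual Maven Central group IDs; the protobuf-java 3.x version shown is illustrative, not a specific recommendation:

<!-- Only if you must move to protobuf 3 for an unrelated reason. -->
<dependency>
  <groupId>com.google.protobuf</groupId>
  <artifactId>protobuf-java</artifactId>
  <!-- Illustrative protobuf 3 version. -->
  <version>3.0.0</version>
</dependency>
<!-- Datastore client generated with the matching protoc compiler. -->
<dependency>
  <groupId>com.google.apis</groupId>
  <artifactId>google-api-services-datastore-protobuf</artifactId>
  <version>v1beta2-rev1-4.0.0</version>
</dependency>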
Now, we are actively working on upgrading the Dataflow SDK to protobuf version 3. I'd expect this functionality in the next minor release, possibly 1.5.0. Since any version of the Dataflow SDK can support only one protobuf version at a time, support for version 2 will break at that point, unless a few dependencies are manually rolled back.

Related

I see strange errors in my Dataflow job that may be related to library versioning

Errors range from 404s to IOExceptions to encoding exceptions. They can be buried in the error stack, and occasionally suggest a versioning problem.
How can I prevent or address this class of errors?
The Dataflow service's SDKs and workers depend on common third-party components, which themselves import various dependencies. Version collisions can result in unexpected behavior in the service. If you are using any of these packages in your code, be aware that some libraries are not forward-compatible, and you may need to pin to the listed versions that will be in scope during execution. To determine whether your JAR has a conflicting version in use, inspect your project's dependency tree (for example with mvn dependency:tree). Consult the list of specifically pinned versions if you suspect a problem here, and avoid using "latest" for any of these libraries.
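For example, here is a minimal sketch of pinning one such dependency via Maven's dependencyManagement; protobuf-java 2.5.0 is used only because that is the version the Dataflow SDK 1.x series expects, and the artifact you actually need to pin depends on what your own dependency tree shows:

<dependencyManagement>
  <dependencies>
    <!-- Example pin: protobuf-java 2.5.0, matching the Dataflow SDK 1.x. -->
    <dependency>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
      <version>2.5.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>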

Compile errors finding symbols, including Pipeline, PCollection, PipelineOptions, etc.

As of today, I'm getting a build break for existing code that used to compile correctly, due to errors locating many key classes in the Dataflow SDK for Java. For example:
[ERROR] /tmp/first-dataflow/src/main/java/com/google/cloud/dataflow/examples/common/DataflowExampleUtils.java:[30,37] cannot find symbol
[ERROR] symbol: class Pipeline
[ERROR] location: package com.google.cloud.dataflow.sdk
Have the APIs changed?
Existing Maven projects that use the previously recommended version range [1.0.0, 2.0.0) for the Google Cloud Dataflow SDK for Java may soon pick up the new 2.0.0-beta1 version of that SDK. This new version has APIs that are not compatible with the 1.x releases, so using it with existing code will cause these kinds of breakages.
If you are impacted, update your Maven pom.xml to use a version range that precludes anything in the 2.x family, for example by using [1.0.0, 1.99), as follows:
<dependency>
  <groupId>com.google.cloud.dataflow</groupId>
  <artifactId>google-cloud-dataflow-java-sdk-all</artifactId>
  <version>[1.0.0,1.99)</version>
</dependency>
This should fix your compile issues and allow you to continue with the most recent release in the 1.x series (currently 1.9.0).
For more information and updates, you can follow this GitHub issue.
Separately, you can learn more about the 2.0.0-beta1 release, including what these incompatible API changes are, in its release notes. But be aware that it is an early preview and has corresponding caveats about API stability, support timelines, and documentation.

Fortify 4.2 SSC Error Processing

I'm using Fortify 4.2 (SCA version 6.2). I'm having a hard time uploading an FPR file to SSC. I always get "Error Processing" and I cannot tell what the problem is. I've opened the FPR file with AuditWorkbench and checked the certification: it is valid.
Furthermore, I have plenty of scan errors, the majority labeled with error code 1103 and the following description: Translator execution failed. Translator returned status 139.
Any leads?
PS: I'm scanning a large application which includes different modules written in C, C++, and Java.
Based on your comment about the two files:
myFile.h
myFile.H
I suspect that the database in the back end of SSC is case-insensitive.
Starting with Fortify 4.30 (or possibly 4.40), SSC started enforcing that the database collation be case-sensitive (before then it was stated that it should be, but this was not enforced).
You need to upgrade the version of Fortify that you are using to one of the newer versions (current latest is 16.10). The SSC install documentation will contain instructions on migrating your SSC database from case-insensitive to case-sensitive.
The 139 error usually means that you are using a newer version of C/C++ than Fortify supports. In our case we are using C++14 or higher and are still getting this error with Fortify 19.2.

How to configure SonarQube for Delphi?

I want to configure SonarQube so it can analyze Delphi projects too. When I searched online, I saw there used to be a Delphi plugin for SonarQube, but when I look at the plugins for the latest build, the Delphi plugin is not listed.
Is the plugin still available in another way?
Or is it possible to configure SonarQube for Delphi without the plugin?
As G. Ann's response says, the Delphi plugin for Sonar was indeed discontinued, but searching the internet I found that recently (3 days ago) the developer Fabricio Colombo made it happen!
We tested it, and it runs on current versions of Sonar:
Compatible with SonarQube 4.5.x and SonarQube 5.1.2
https://github.com/fabriciocolombo/sonar-delphi
Release: https://github.com/fabriciocolombo/sonar-delphi/releases
JAR: https://github.com/fabriciocolombo/sonar-delphi/releases/download/0.3.3-SNAPSHOT/sonar-delphi-plugin-0.3.3-SNAPSHOT.jar
PS: Translated from Portuguese to English by Google Translate.
To analyze the files of language X, you need a plugin for language X that recognizes X's structure, syntax, etc. Without that you can't derive metrics (LOC, complexity, etc.) or recognize bad code (i.e., raise issues for antipatterns). So, to answer your second question first: you won't be able to analyze Delphi code without some kind of Delphi plugin.
The Delphi plugin was deprecated quite a while ago because it seemed to suffer from a lack of interest all around and didn't evolve to maintain compatibility with the platform as it evolved.
If you look, you can find downloads of the old plugin, but to use it you'd have to fall back to a quite old version of the platform, and I don't recommend that. I'm not sure how far back you'd have to go - you could crack open the jar and get that from the pom - but it looks like the last mailing-list activity on this plugin was in February 2012. So again, I don't recommend going this route.

OpenAM source code fails to build using Ant?

We are using the OpenAM 9.5 RC1 branch source in our project: https://github.com/svn2github/openam.git
In order to fix some bugs, we have to modify the existing OpenAM amserver library. For this, we downloaded the source code from the above location and tried to compile it offline using Ant (as stated in the README), but we are not able to compile it (even after making the necessary changes, adding dependencies, etc.).
Is there any way to build the required library (amserver.jar) from this source code?
The OpenAM 9.5.x and 10.0.x versions are rather difficult to build, but from 11.0.0 on, the build process should be much simpler, since the project has been migrated to the Maven build system.
In any case, the version you are using (snapshot 9.5.1 RC1) is very much outdated and most likely has several critical issues (not to mention the security issues).
I would strongly advise against putting effort into backporting fixes to that ancient version. Instead, you should recognize that you are running a more than four-year-old version of a security component, and upgrade your system to a more recent version as soon as possible.
