The code below worked with the Dataflow 1.9 SDK; I am migrating to 2.x:
PCollection<TableRow> tableRow = ...
tableRow.apply(BigQueryIO.write()
        .to(String.format("%1$s:%2$s.%3$s", projectId, bqDataSet, bqTable))
        .withSchema(schema)
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));
I get:
The method apply(PTransform<? super PCollection<TableRow>,OutputT>) in the type PCollection<TableRow> is not applicable for the arguments (BigQueryIO.Write<Object>)
The release notes are not much help here, and the documentation for 2.x is nonexistent; it just redirects to the Beam API page.
Have you tried using BigQueryIO.writeTableRows()?
Apache Beam 2.1.0 BigQueryIO documentation:
https://beam.apache.org/documentation/sdks/javadoc/2.1.0/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.html
You can try providing the TableRow type explicitly (BigQueryIO.<TableRow>write()...) or use BigQueryIO.writeTableRows() as suggested above.
It looks like the interface was made generic in 2.x; the earlier version had TableRow hard-coded.
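For example, a minimal 2.x version of the snippet from the question could look like this (a sketch assuming the same projectId, bqDataSet, bqTable, and schema variables):

PCollection<TableRow> tableRow = ...
tableRow.apply(BigQueryIO.writeTableRows()
        .to(String.format("%1$s:%2$s.%3$s", projectId, bqDataSet, bqTable))
        .withSchema(schema)
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

Since writeTableRows() returns a BigQueryIO.Write<TableRow>, the apply() call now type-checks against PCollection<TableRow>.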
Related
I am using WriteToBigQuery in a Beam Python pipeline like this:
beam.io.gcp.bigquery.WriteToBigQuery(
    table_id,
    schema=table_schema,
    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
    write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY,
    method='STREAMING_INSERTS')
But when I run the pipeline, I see this error, which started after I fetched and rebased onto the Beam GitHub master branch today:
"ValueError: Write disposition WRITE_EMPTY is not supported for streaming inserts to BigQuery"
It looks like a recent change intentionally disabled this due to a bug, so write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY is no longer allowed with method='STREAMING_INSERTS'.
The fix is to use write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND, which should give the same behaviour:
beam.io.gcp.bigquery.WriteToBigQuery(
    table_id,
    schema=table_schema,
    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
    method='STREAMING_INSERTS')
In Couchbase SDK version 3, I can't find how to set scan consistency on insert and query operations, the way I did in the Java SDK 2.x:
N1qlParams adhoc2 = N1qlParams.build().consistency(ScanConsistency.STATEMENT_PLUS).adhoc(true);
Is that deprecated? And if it is deprecated, how can I fix this problem?
Regards.
Check out the options of cluster.query():
https://docs.couchbase.com/sdk-api/couchbase-node-client/Cluster.html#query
Check out the class QueryScanConsistency:
cluster.query(queryString, QueryOptions.queryOptions()
        .scanConsistency(QueryScanConsistency.REQUEST_PLUS))
    .rowsAs(MyEntity.class);
You should use REQUEST_PLUS instead of STATEMENT_PLUS.
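For reference, a more complete Java SDK 3 sketch (the connection details, the travel-sample query, and ScanConsistencyExample are placeholders):

import java.util.List;

import com.couchbase.client.java.Cluster;
import com.couchbase.client.java.json.JsonObject;
import com.couchbase.client.java.query.QueryOptions;
import com.couchbase.client.java.query.QueryResult;
import com.couchbase.client.java.query.QueryScanConsistency;

public class ScanConsistencyExample {
    public static void main(String[] args) {
        // Placeholder connection details.
        Cluster cluster = Cluster.connect("127.0.0.1", "Administrator", "password");

        // REQUEST_PLUS is the SDK 3 counterpart of the old STATEMENT_PLUS setting.
        QueryResult result = cluster.query(
                "SELECT t.* FROM `travel-sample` t LIMIT 10",
                QueryOptions.queryOptions()
                        .scanConsistency(QueryScanConsistency.REQUEST_PLUS));

        // rowsAsObject() returns generic JSON; rowsAs(MyEntity.class) maps rows to your own class.
        List<JsonObject> rows = result.rowsAsObject();
        rows.forEach(System.out::println);

        cluster.disconnect();
    }
}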
I am trying to use the attributeService.getEntityAttributes function to obtain some server attributes of my device. I was using the .getEntityAttributesValues function when working with the 2.x version of Thingsboard and it was working fine. With the current version I am using the following code:
var conf = {
    ignoreLoading: false,
    ignoreErrors: true,
    resendRequest: true
};
var myattr = attributeService.getEntityAttributes(entityID,'SERVER_SCOPE',["myattribute"],conf);
But I get no data or an error back. I was calling .getEntityAttributesValues with .then(), but that doesn't seem to work anymore; it says ".then is not a function".
What am I doing wrong? Please help and tell me how to use the new function properly. I am using TB v.3.1.1 CE.
Thingsboard 2.x UI was made with AngularJS.
Thingsboard 3.x UI now uses Angular.
One of the key differences between these frameworks, with regard to your problem, is the move from Promise-based services to Observable-based services.
Let's look at the source of the getEntityAttributes function:
https://github.com/thingsboard/thingsboard/blob/2488154d275bd8e6883baabba5519be78d6b088d/ui-ngx/src/app/core/http/attribute.service.ts
It's mostly a thin wrapper around a network call made with the http.get method from Angular.
Therefore, if we have a look at the official documentation: https://angular.io/guide/http#requesting-data-from-a-server, it is mentioned that we must subscribe to the observable in order to handle the response. So something along the lines of:
attributeService.getEntityAttributes(entityID, 'SERVER_SCOPE', ['myattribute'], conf)
    .subscribe((attributes) => {…})
I am trying to create my own composite LWM2M object using the objlink type.
For Leshan, the only source on how to write the spec file in JSON seems to be the official oma-objects-spec.json, which does not contain examples of objlinks.
Can anyone provide an example on how to create an objlink object?
If it is not possible in Leshan, have anyone tried other implementations?
Hope it's not too late.
As of now there is no support for OBJLNK in the Leshan API.
I also needed OBJLNK support in Leshan, so I made the changes and opened a pull request adding OBJLNK support.
If you want objlnk support, you can use my branch, which is forked from Leshan:
https://github.com/DevendraKurre/leshan
Leshan has since added support for this feature; I have tested it with version 0.1.11-M14.
Reading can be done as usual, and writing is done as follows:
WriteRequest writeReq = new WriteRequest(
        WriteRequest.Mode.UPDATE,
        9, 0,                               // object ID 9, instance 0
        LwM2mSingleResource.newObjectLinkResource(
                13,                         // resource ID that holds the objlink
                new ObjectLink(5566, 7788)  // link to object 5566, instance 7788
        )
);
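For completeness, reading the objlink resource back follows the usual request pattern. A sketch, assuming a running LeshanServer named server and an active client Registration named registration (both placeholders):

// Read resource 13 of object 9, instance 0 from the registered client.
ReadResponse readResp = server.send(registration, new ReadRequest(9, 0, 13));
if (readResp.isSuccess()) {
    LwM2mResource resource = (LwM2mResource) readResp.getContent();
    System.out.println("Object link value: " + resource.getValue());
}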
I would like to work with the Neo4j packages for Java.
I see that the function newEmbeddedDatabaseBuilder is deprecated.
What is the best way to work with Neo4j from Java code now?
Thanks.
In Neo4j 3.0, you'll use GraphDatabaseFactory (note that newEmbeddedDatabase now takes a java.io.File rather than a String path):
graphDb = new GraphDatabaseFactory().newEmbeddedDatabase( new File( DB_PATH ) );
The Neo4j Java manual is available here: http://neo4j.com/docs/java-reference/current/#tutorials-java-embedded
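For reference, a minimal embedded setup with a transaction could look like this (a sketch; the store path, property values, and EmbeddedNeo4jExample are placeholders):

import java.io.File;
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.Node;
import org.neo4j.graphdb.Transaction;
import org.neo4j.graphdb.factory.GraphDatabaseFactory;

public class EmbeddedNeo4jExample {
    public static void main(String[] args) {
        // In 3.0 newEmbeddedDatabase takes a File pointing at the store directory.
        GraphDatabaseService graphDb =
                new GraphDatabaseFactory().newEmbeddedDatabase(new File("target/neo4j-db"));

        // All operations against the database must run inside a transaction.
        try (Transaction tx = graphDb.beginTx()) {
            Node node = graphDb.createNode();
            node.setProperty("name", "example");
            tx.success(); // mark the transaction as successful so it commits on close
        }

        graphDb.shutdown();
    }
}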