How to insert into a stream programmatically? - ksqlDB

There are several messages from a producer that I receive, and I need to store them in the stream, but I don't want to do it manually from the ksqlDB console; I need to handle those insertions programmatically (Python/Java). Could anybody show me sample code or a reference?
Thanks!

You can use the REST API or the Java client to interact with ksqlDB. Here's the doc for the Java client, with a code snippet for inserts: https://docs.ksqldb.io/en/latest/developer-guide/ksqldb-clients/java-client/#stream-inserts.
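For reference, a minimal sketch along the lines of that doc page, using the ksqlDB Java client; the stream name MY_STREAM, its columns, and the host/port are placeholders for your own setup:

    import io.confluent.ksql.api.client.Client;
    import io.confluent.ksql.api.client.ClientOptions;
    import io.confluent.ksql.api.client.KsqlObject;

    public class InsertRows {
        public static void main(String[] args) throws Exception {
            // Connect to the ksqlDB server (host/port assumed; adjust to your deployment)
            ClientOptions options = ClientOptions.create()
                .setHost("localhost")
                .setPort(8088);
            Client client = Client.create(options);

            // Build one row; the column names must match your stream's schema
            KsqlObject row = new KsqlObject()
                .put("ID", 1)
                .put("MESSAGE", "hello from the Java client");

            // insertInto() returns a CompletableFuture; get() blocks until the server acks
            client.insertInto("MY_STREAM", row).get();

            client.close();
        }
    }

You would loop over the messages you receive from your producer and call insertInto() for each one, or switch to the streamInserts() variant described on the same doc page for higher throughput.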

Related

AWS redrive to source queue in Java

AWS recently added a feature that allows you to send messages from a DLQ back to the source queue via a click of a button ("redrive to source"). I wanted to know whether this is possible via an API call.
I know how to extract a message from the DLQ and resend it, but with this new feature I was hoping I wouldn't need to handle the messages myself, and could instead just call a method on the queue and, if it's configured, have it do the redelivery.
Does anyone know if this is possible? I've been searching the net without luck.
I believe this feature is currently only available via the management console UI and not as an API.
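Until it shows up in the API, the manual fallback the question describes looks roughly like this with the AWS SDK for Java v2; the queue URLs are placeholders, and this sketch ignores message attributes and batching:

    import software.amazon.awssdk.services.sqs.SqsClient;
    import software.amazon.awssdk.services.sqs.model.*;

    public class DlqRedrive {
        public static void main(String[] args) {
            // Placeholder queue URLs; replace with your own DLQ and source queue
            String dlqUrl = "https://sqs.eu-west-1.amazonaws.com/123456789012/my-dlq";
            String sourceUrl = "https://sqs.eu-west-1.amazonaws.com/123456789012/my-queue";

            try (SqsClient sqs = SqsClient.create()) {
                // Pull a batch of messages from the DLQ
                ReceiveMessageResponse resp = sqs.receiveMessage(ReceiveMessageRequest.builder()
                    .queueUrl(dlqUrl)
                    .maxNumberOfMessages(10)
                    .build());

                for (Message m : resp.messages()) {
                    // Re-send the body to the source queue, then delete it from the DLQ
                    sqs.sendMessage(SendMessageRequest.builder()
                        .queueUrl(sourceUrl)
                        .messageBody(m.body())
                        .build());
                    sqs.deleteMessage(DeleteMessageRequest.builder()
                        .queueUrl(dlqUrl)
                        .receiptHandle(m.receiptHandle())
                        .build());
                }
            }
        }
    }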

Is there any way to send a filter/parameter in Client OperationMode?

Hello Stackoverflow Community,
I developed an ABAP program that displays roles in a tree output and now want to create a UI5 application with the same functionality. For this, I created an OData service whose GET method simulates my ABAP program via a SUBMIT call and returns the output tree, which is then displayed using a TreeTable.
Now the problem: I am using OperationMode "Client" for my OData service, so filtering is done on the client side. My backend program needs parameters to function, though. Is there a way to pass arguments to my GET method while using OperationMode "Client"?
I have tried the bindRows() approach where you pass filters, but this only works in the "Server" OperationMode. Sadly, I can't use the "Server" OperationMode, because it would mean simulating my ABAP program every time the user expands a TreeTable node, rendering my program unusable because of performance issues.
Hoping someone can help me on this issue and looking forward to your answers!
Solved this issue by sending a request in "Server" mode, saving the OData response to a JSON model, and then binding that model to my TreeTable with the "Client" OperationMode.

How can I include a custom data file with each design automation job?

We have a drawing creation task that we'd like to offload to the Design Automation API. Every time the task runs, it will need as input a bunch of data that will affect what it creates in the DWG. What's the best way to make this job-specific data available to each job? In our case, if we could include a text file that might be 1 MB in size, that would work fine.
I have looked at the API documentation, and other than the zip package I don't see a way to accomplish this, short of having our automation make outbound web service calls when it runs, which I'm not sure would be allowed on the remote server.
You should be able to create a custom App package with a job that accepts your custom data as an input argument; take a look at these demos:
design.automation-.net-input.output.sample
design.automation-.net-custom.activity.sample
You can also have your own web service that returns the needed payload; it is possible to issue an HTTP call from an activity. See the following sample for a demo:
https://github.com/szilvaa/variadic
Hope that helps
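The linked samples are .NET, but submitting a job is just a REST call, so here is a hedged sketch in Java of passing the job-specific data file as a work item argument. It targets the current Design Automation (v3) endpoint; the activity id, argument names, and storage URLs are assumptions you would replace with your own:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class SubmitWorkItem {
        public static void main(String[] args) throws Exception {
            String accessToken = "<2-legged OAuth token>";

            // Hypothetical activity and storage URLs: "jobData" points at the ~1 MB text
            // file for this job, "result" is where the generated DWG gets uploaded.
            String body =
                "{\n" +
                "  \"activityId\": \"MyNickname.CreateDrawing+prod\",\n" +
                "  \"arguments\": {\n" +
                "    \"jobData\": { \"url\": \"https://my-storage/job-123-data.txt\" },\n" +
                "    \"result\":  { \"verb\": \"put\", \"url\": \"https://my-storage/job-123-result.dwg\" }\n" +
                "  }\n" +
                "}";

            HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://developer.api.autodesk.com/da/us-east/v3/workitems"))
                .header("Authorization", "Bearer " + accessToken)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());
        }
    }

The key point is the same as in the answer: the data file is declared as an input argument of the activity, and the engine downloads it before your job runs, so no outbound calls from the job itself are required.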

How to publish & consume Blob Stream message using Message Queue?

I'm trying to produce and consume blob (PDF) and stream (log) files from a message queue using Java (JMS). After googling, I found that ActiveMQ supports blob messages, so I tried to implement this with ActiveMQ, but there is no complete solution or example on the internet.
Could you please help me with sample code (including the proper broker URL), or explain how to do this? Also, is there any other MQ that supports stream and blob messages?
The default URL should look like this:
tcp://localhost:61616?jms.blobTransferPolicy.defaultUploadUrl=http://localhost:8161/fileserver/
If you need a complete code solution, please go through the link below:
http://kuntalganguly.blogspot.in/2014/07/publish-consume-blob-and-stream-message.html
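To make that concrete, here is a minimal sketch of producing and consuming a BlobMessage with the ActiveMQ JMS client, using the broker URL above. The queue name and file name are placeholders, and the broker's fileserver servlet must be enabled for the upload URL to work:

    import java.io.File;
    import java.io.InputStream;
    import javax.jms.Connection;
    import javax.jms.MessageConsumer;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import org.apache.activemq.ActiveMQConnectionFactory;
    import org.apache.activemq.ActiveMQSession;
    import org.apache.activemq.BlobMessage;

    public class BlobExample {
        public static void main(String[] args) throws Exception {
            // Broker URL with the blob transfer policy pointing at the broker's fileserver
            String url = "tcp://localhost:61616?jms.blobTransferPolicy.defaultUploadUrl=http://localhost:8161/fileserver/";
            ActiveMQConnectionFactory factory = new ActiveMQConnectionFactory(url);
            Connection connection = factory.createConnection();
            connection.start();

            // BlobMessage is ActiveMQ-specific, so the session is cast to ActiveMQSession
            ActiveMQSession session = (ActiveMQSession) connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            Queue queue = session.createQueue("PDF.QUEUE");

            // Producer: the file is uploaded to the fileserver; only a reference goes through the broker
            MessageProducer producer = session.createProducer(queue);
            BlobMessage blob = session.createBlobMessage(new File("report.pdf"));
            producer.send(blob);

            // Consumer: stream the blob back down
            MessageConsumer consumer = session.createConsumer(queue);
            BlobMessage received = (BlobMessage) consumer.receive(5000);
            if (received != null) {
                try (InputStream in = received.getInputStream()) {
                    // read/copy the PDF from 'in' here
                }
            }

            connection.close();
        }
    }

For the log-file case, the same pattern applies; alternatively, JMS StreamMessage (createStreamMessage on the session) can carry chunked primitive data through the broker itself, without the external fileserver.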

Any Ruby AMF clients out there?

I'm looking for a way to push/receive AMF0/AMF3 messages in Ruby (Rails).
From what I've read, RubyAMF can only act as a server.
What I need is a library that allows client access to FMS/Wowza. Any ideas?
As the developer of RocketAMF (http://github.com/warhammerkid/rocket-amf), I don't know of any AMF libraries that can act as clients out of the box. However, if you're interested, it shouldn't be that difficult to reverse the server code in RocketAMF to work as a client. You would just write a serializer for RocketAMF::Request that uses the standard message calling style (#<RocketAMF::Request:0x10167b658 @headers=[], @messages=[#<RocketAMF::Message:0x10167ae88 @response_uri="/1", @data=["session string", 42.0], @target_uri="App.helloWorld">], @amf_version=3>). Then you would write a deserializer for RocketAMF::Response.
I'll try to put together a new RocketAMF build in the next couple days that can communicate with FMS, but it's not a guarantee.
