Accessing Cloud Pub/Sub Message attributes in Cloud DataFlow - google-cloud-dataflow

According to what I've read about Dataflow, the Pub/Sub source only gives the message body to work with in the pipeline. We have a use case where we want to inspect the message attributes to make certain decisions. Is there any way of achieving this currently? I'm open to extending the Pub/Sub I/O to incorporate this if required.

Currently, there is no way to access the message attributes of your messages via the PubsubIO connector, but it would clearly be useful to do so. This is tracked in Apache Beam (incubating) as issue BEAM-404.
I recommend following that issue to keep abreast of new developments.
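Update for anyone finding this later: newer Beam Java SDKs expose attributes through PubsubIO.readMessagesWithAttributes(). A minimal sketch, assuming a recent Beam SDK; the subscription path and the "eventType" attribute name are made-up placeholders:

import java.nio.charset.StandardCharsets;

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubMessage;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;

public class ReadWithAttributes {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply("ReadFromPubsub",
            // readMessagesWithAttributes() yields PubsubMessage objects that
            // carry both the payload and the attribute map.
            PubsubIO.readMessagesWithAttributes()
                .fromSubscription("projects/my-project/subscriptions/my-sub"))
     .apply("InspectAttributes", ParDo.of(new DoFn<PubsubMessage, String>() {
        @ProcessElement
        public void processElement(ProcessContext c) {
          PubsubMessage msg = c.element();
          // Branch on an attribute; "eventType" is a hypothetical attribute name.
          String eventType = msg.getAttribute("eventType");
          if ("important".equals(eventType)) {
            c.output(new String(msg.getPayload(), StandardCharsets.UTF_8));
          }
        }
     }));

    p.run();
  }
}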

Related

Custom template creation in Google dataflow

Is it possible to create custom template in Google dataflow for live data streaming into cloud SQL?
No. There is no built-in I/O transform for Google Cloud SQL. See Built-in I/O Transforms.
Edit: You may be able to connect to Google Cloud SQL with Apache Beam's JdbcIO class. However, the class is annotated @Experimental. See the JdbcIO documentation.
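To make that concrete, here is a rough sketch of writing to a MySQL-flavored Cloud SQL instance with JdbcIO. The JDBC URL, table, and credentials are placeholders; in practice you would typically reach Cloud SQL through the Cloud SQL proxy or its JDBC socket factory rather than a raw IP:

import java.sql.PreparedStatement;

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.jdbc.JdbcIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;

public class WriteToCloudSql {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply(Create.of("alpha", "beta", "gamma"))
     .apply(JdbcIO.<String>write()
        // Placeholder connection details for a hypothetical database "mydb".
        .withDataSourceConfiguration(
            JdbcIO.DataSourceConfiguration.create(
                    "com.mysql.jdbc.Driver",
                    "jdbc:mysql://<CLOUD_SQL_IP>:3306/mydb")
                .withUsername("dbuser")
                .withPassword("dbpassword"))
        .withStatement("INSERT INTO words (word) VALUES (?)")
        .withPreparedStatementSetter(
            (String element, PreparedStatement stmt) -> stmt.setString(1, element)));

    p.run();
  }
}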

Google Cloud Dataflow streaming pipeline with PubSubIO resource setup failed

I already read this question, but it didn't solve my problem.
I read from a Pub/Sub topic in my Dataflow topology, but I keep getting a "resource setup failure" error, even though I have already enabled all of the Google Cloud APIs for the project.
Do you have any ideas? Could it be some issue with credentials?
Where can I get a more meaningful error message?
I needed to create the topics by hand.
Dataflow automatically creates subscriptions, but not topics.
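In other words, the topic has to exist before the pipeline starts. You can create it with gcloud pubsub topics create, or programmatically; here is a rough sketch with the google-cloud-pubsub Java client, where the project and topic IDs are placeholders:

import com.google.cloud.pubsub.v1.TopicAdminClient;
import com.google.pubsub.v1.TopicName;

public class CreateTopic {
  public static void main(String[] args) throws Exception {
    // Placeholder project and topic IDs; substitute your own.
    TopicName topic = TopicName.of("my-project", "my-topic");

    // TopicAdminClient picks up Application Default Credentials.
    try (TopicAdminClient client = TopicAdminClient.create()) {
      client.createTopic(topic);
      System.out.println("Created topic: " + topic);
    }
  }
}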

How to intercept sent / consumed RabbitMQ messages

I am developing a RabbitMQ token auth plugin, where the token needs to be included in an AMQP header so it can be validated on every sent or consumed message.
I am wondering how I can achieve this. So far I am only familiar with RabbitMQ auth plugins and do not know much about the other plugin mechanisms. After some quick research I found the rabbit_channel_interceptor behaviour, which sounds like it could do the job.
I have read RabbitMQ's source code for auth. In the source tree, pay attention to the files named rabbit_auth_mechanism.erl, rabbit_auth_backend, and rabbit_auth_backend_internal. In addition, there is the LDAP auth backend plugin.
Once you have read these carefully and know how to integrate and build the RabbitMQ plugin projects, you can start programming.

How can I order a job in Control-M using a message queue?

I am trying to find a way to order a Control-M job via a message from an external application. We are using Control-M v8. We are able to send messages to the queue, but we have been unsuccessful in receiving messages that perform some sort of action in Control-M.
Erick, look at the documentation for the Control-M Business Process Integration Suite. This suite provides the capability that you are looking for.
We have our application back end on Unix, and we use Control-M's built-in utilities to call jobs from Unix. The jobs should be created in Control-M Desktop and uploaded to the Control-M database without any specific schedule. A utility called 'ctmorder' can then be used to call these jobs as and when required.

How do I send durable messages with Grails?

Disclaimer: .NET guy trying to learn Grails.
I've gotten used to building services with a distributed and durable messaging layer for inter-service communication with NServiceBus and MSMQ.
For anyone unfamiliar, NServiceBus provides messaging simply by referencing the assembly and doing some quick dependency injection.
Then, to work with it, I can send a message by calling something like bus.Send("location", messageObject) for a command, or bus.Publish(messageObject) for a publish/subscribe situation. Then all I have to do is create a service that "listens" for my messageObject type, and I get the message.
It also provides something they call timeouts, which basically trigger an event handler after x amount of time (useful for sending reminders or doing something on a schedule).
I'm looking for something similar. I found an article that suggests using Grails itself as an ESB, but I don't see how Grails can provide reliable and durable messaging. What I mean by that: if service A sends a message to service B, and service B is down, service A will retry later. A more involved example would be a saga, where the client starts a saga, service A does something, and service B does something; both report to the saga when they're done processing, and then the saga sends a message to service C so it can do its thing, knowing that both service A and B have done their job.
PS: if this question is too broad, please let me know how I can refine it. I'm at the very beginning of learning Grails, so I'm not even sure where I need to start researching.
EDIT: I realized I forgot to add the article: http://jlorenzen.blogspot.com/2009/03/grails-create-app-esb
I'd probably use an AMQP broker that has a Grails plugin, such as RabbitMQ.
While this wouldn't give you all the features of NServiceBus on MSMQ, you would get the durable messaging behavior you want. Things you'd give up, or have to implement yourself, include some of the retry logic, sagas, and message idempotence.
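To make the durability point concrete: with RabbitMQ you get it by declaring the queue durable and publishing persistent messages, so both survive a broker restart and sit on disk until a consumer comes back. A rough sketch with the plain RabbitMQ Java client (the host, queue name, and payload are made up); the Grails RabbitMQ plugin wraps the same broker features:

import java.nio.charset.StandardCharsets;

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;
import com.rabbitmq.client.MessageProperties;

public class DurableSend {
  public static void main(String[] args) throws Exception {
    ConnectionFactory factory = new ConnectionFactory();
    factory.setHost("localhost"); // placeholder broker host

    try (Connection connection = factory.newConnection();
         Channel channel = connection.createChannel()) {
      // durable = true: the queue definition survives a broker restart.
      channel.queueDeclare("orders", true, false, false, null);

      // PERSISTENT_TEXT_PLAIN marks the message itself as persistent,
      // so it is written to disk and redelivered when a consumer returns.
      channel.basicPublish("", "orders",
          MessageProperties.PERSISTENT_TEXT_PLAIN,
          "order #42 placed".getBytes(StandardCharsets.UTF_8));
    }
  }
}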
