I am a beginner with Spring Cloud Data Flow and I am following the official documentation. But when I deploy the stream from the Spring Cloud Data Flow dashboard, it just gets stuck on loading and the stream is never deployed.
The DSL for the stream I want to deploy is:
http | log
I changed the ports for Skipper, but nothing worked.
I expect that when I click to deploy the stream, it should show the status 'deploying'; instead it just keeps loading forever.
When reporting issues like this, it would be great if you could share the versions in use and the logs for review.
Depending on the platform (Local, Cloud Foundry, or Kubernetes), I'd recommend reviewing the troubleshooting steps included in the SCDF Microsite.
If you have followed those steps and still see issues, please update the description of the post with the relevant details, and we can review it then.
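As a quick sanity check, it can also help to take the dashboard out of the picture and deploy the same definition from the SCDF shell; a minimal sketch (the stream name is arbitrary):

    dataflow:>stream create --name httptest --definition "http | log" --deploy
    dataflow:>stream list

If the shell-based deployment succeeds but the dashboard still hangs, that narrows the problem down to the UI or its connection to the server rather than the stream definition itself.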
Can't figure out what could possibly be wrong. I've deployed a service and set the trigger to require authentication.
Created a new service account for Cloud Scheduler: scheduler-invoker@<REDACTED>.iam.gserviceaccount.com
Went to the Cloud Run service's permissions and added that account as Cloud Run Invoker (although I had already set up that role during creation).
In Cloud Scheduler, I add this account as the service account, and the audience is set to the URL of the service.
But invocations are failing with a 403 error. I can't figure this one out; I followed every step outlined at https://cloud.google.com/run/docs/triggering/using-scheduler, and I'm pretty sure I've done this in the past with no issues.
Any ideas?
I saw a few posts here on SO, but even after reading them I'm still stuck in the same spot.
I missed the fact that I was on a project where Cloud Scheduler was activated before 2019. Adding service-[project-number]@gcp-sa-cloudscheduler.iam.gserviceaccount.com as the Cloud Scheduler service agent seems to fix it.
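For anyone hitting the same thing, the grant can be applied from the command line along these lines (a sketch; substitute your own project ID and project number):

    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member serviceAccount:service-PROJECT_NUMBER@gcp-sa-cloudscheduler.iam.gserviceaccount.com \
        --role roles/cloudscheduler.serviceAgent

This is only needed for projects that enabled the Cloud Scheduler API before March 19, 2019; newer projects get the service agent role automatically.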
I have a series of Jenkins pipeline jobs that move apps to Cloud Foundry. My client application needs to be able to listen to all the updates of a push, i.e. apart from getting text logs, I need other events such as: Git repo cloned, logged in to Cloud Foundry, app pushed.
One crude way of doing this is to submit POST requests to an event server from a shell script (curl). However, I think it is unlikely that such functionality does not already exist in Jenkins (either through a plugin or something similar).
I would like some advice from a best-practices point of view.
Thanks.
A few options:
1. As commented by mdabdullah. But this needs someone to set up Kibana or Splunk (I did not try this).
2. The Statistics Gatherer plugin: https://plugins.jenkins.io/statistics-gatherer/
3. The Jenkins Notification plugin: https://plugins.jenkins.io/notification/
Options 2 and 3 are plugins available in the Jenkins community. They need to be configured with server endpoints before use.
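If you need custom events the plugins do not emit (e.g. "Git repo cloned"), the crude approach from the question is still workable as a pipeline step; a minimal sketch, assuming a hypothetical collector endpoint and payload schema:

    # hypothetical endpoint; JOB_NAME and BUILD_NUMBER are standard Jenkins env vars
    curl -s -X POST "http://events.example.com/jenkins" \
        -H "Content-Type: application/json" \
        -d "{\"job\": \"$JOB_NAME\", \"build\": \"$BUILD_NUMBER\", \"event\": \"git-cloned\"}"

The plugin route is still preferable for the standard lifecycle events (started, completed, finalized), since you get those without touching every job.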
I am looking at the documentation for Spring Cloud Data Flow:
https://dataflow.spring.io/docs/recipes/rabbitmq/rabbit-source-sink/
This example, which uses RabbitMQ as source and sink, is built on the Spring Cloud Stream framework, which is fine. But it doesn't show how these three apps (source, processor, and sink) can be deployed to Spring Cloud Data Flow (SCDF); it simply runs three jars locally, and they talk to each other via RabbitMQ queues.
I am not sure how this shows the use of SCDF; there is no involvement of SCDF here at all. A proper example that shows how to deploy these jars as apps inside SCDF needs to be provided. Am I missing anything? I am hoping somebody else has tried this and can share their feedback about my concern.
The documentation here covers the SCDF side of how to manage those source, processor, and sink applications.
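In short, once the three jars are built and published, you register them with SCDF and compose them into a stream definition; a minimal sketch from the SCDF shell (the app names and Maven coordinates below are placeholders):

    dataflow:>app register --name rabbit-source --type source --uri maven://com.example:rabbit-source:0.0.1
    dataflow:>app register --name my-processor --type processor --uri maven://com.example:my-processor:0.0.1
    dataflow:>app register --name rabbit-sink --type sink --uri maven://com.example:rabbit-sink:0.0.1
    dataflow:>stream create --name rabbit-demo --definition "rabbit-source | my-processor | rabbit-sink" --deploy

From there, SCDF (via Skipper) handles deploying the apps to the target platform instead of you running the jars by hand.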
I already read this question, but it didn't solve my problem.
I read from a Pub/Sub topic in my Dataflow topology, but I always get a "resource setup failure" error, even though I have already enabled all of the Google Cloud APIs for the project.
Do you have any ideas? Could it be some issue with credentials?
Where can I get a more meaningful error message?
I needed to create the topics by hand.
Dataflow automatically creates the subscriptions, but not the topics.
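For reference, creating a topic ahead of time is a one-liner (the topic name is a placeholder):

    gcloud pubsub topics create my-input-topic

Once the topic exists, Dataflow can create its own subscription against it when the pipeline starts.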
I am developing a RabbitMQ token auth plugin, where the token needs to be included in an AMQP header so it can be validated on every sent/consumed message.
I am wondering how I can achieve this. So far I am only familiar with RabbitMQ auth plugins and do not know much about other plugin mechanisms. After some quick research I found the rabbit_channel_interceptor behaviour, which sounds like it could do the job.
I have read RabbitMQ's source code around auth. In the source tree, pay attention to the files named "rabbit_auth_mechanism.erl", "rabbit_auth_backend.erl", and "rabbit_auth_backend_internal.erl". In addition, there is a separate LDAP auth plugin (rabbitmq-auth-backend-ldap) that is a good reference.
After reading these carefully and learning how to build and integrate the RabbitMQ projects, you can start programming.
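Once your plugin is built and dropped into the broker's plugins directory, enabling it works like any other plugin (the plugin name here is a placeholder):

    rabbitmq-plugins enable my_token_auth

After that, rabbitmq-plugins list should show it as enabled.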