Connect Tableau Public to Firebase using Tableau Web Data Connector - firebase-realtime-database

I have some data hosted in Google Firebase and I need to do some analysis on it using Tableau Public (free version), and the Tableau data should be updated daily.
I read that a possible solution could be using a Tableau Web Data Connector, but I'm not sure about this. If I use a Tableau WDC, is there a way to schedule the data update? As far as I have understood, a Tableau WDC is an intermediate page that downloads the data, for example from a REST API, and then puts it into a Tableau workbook.
Is it the correct way to achieve my goal?
cheers

This is not achievable through Tableau Public. Even if you were using Tableau Desktop: currently, as of 05/16/2018, there is no official connector for Firebase, and I have not seen a third-party Tableau Web Data Connector for Firebase yet either.
A Tableau WDC is an HTML page with JavaScript, hosted on a web server; it grabs data for you over HTTP and returns it to the machine where you entered the URL for the specified WDC.
An alternative would be to import your Firebase data into BigQuery and then access it from there.
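If you go the BigQuery route, a minimal sketch of the load step with the BigQuery Python client might look like the following, assuming you have exported your Firebase data as newline-delimited JSON; the project, dataset, table, and file names are placeholders.

```python
# Sketch: load newline-delimited JSON exported from Firebase into BigQuery.
# Assumes google-cloud-bigquery is installed and credentials are configured.
from google.cloud import bigquery

client = bigquery.Client()

table_id = "my-project.firebase_export.my_table"  # placeholder destination table

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # let BigQuery infer the schema
)

with open("firebase_export.json", "rb") as f:  # placeholder export file
    job = client.load_table_from_file(f, table_id, job_config=job_config)

job.result()  # wait for the load job to finish
print(f"Loaded {job.output_rows} rows into {table_id}")
```

From there, Tableau Desktop's built-in Google BigQuery connector can query the table directly.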

Related

Google Cloud Pub/Sub to ingest data from API endpoint and publish as message

I have been trying to build a pipeline in Google Cloud Data Fusion where the data source is a third-party API endpoint. I have been unable to successfully use the HTTP Plugin, but it has been suggested that I use Pub/Sub for the data ingest.
I've been trying to follow this tutorial as a starting point, but it doesn't help me out with the very first step of the process: ingesting data from API endpoint.
Can anyone provide examples of using Pub/Sub -- or any other viable method -- to ingest data from an API endpoint and send that data down to Data Fusion for transformation and ultimately to BigQuery?
I will also need to be able to dynamically modify the URI (e.g., date filter parameters) in the GET request in this pipeline.
In order to achieve the first step in the tutorial you are following (Ingest CSV (comma-separated values) data to BigQuery using Cloud Data Fusion), you need to set up a functioning Pub/Sub system. This can be done via the command line, the console, or, in your case, best of all via one of the client libraries. If you follow this tutorial you should end up with a functioning Pub/Sub system.
At that point you should be able to follow the original tutorial.
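For illustration, a minimal publish call with the Pub/Sub Python client library might look like the sketch below; the project, topic, API URL, and date parameters are placeholders, and using `requests` to call the third-party endpoint is an assumption about how you obtain the payload.

```python
# Sketch: fetch a payload from a third-party API endpoint and publish it to Pub/Sub.
# Assumes google-cloud-pubsub and requests are installed and credentials are set up.
import requests
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "my-topic")  # placeholders

# Hypothetical endpoint; the query parameters show how the URI can be varied
# per run (e.g. date filters), as asked in the question.
response = requests.get(
    "https://api.example.com/data",
    params={"start_date": "2021-01-01", "end_date": "2021-01-02"},
)
response.raise_for_status()

future = publisher.publish(topic_path, data=response.content)  # data must be bytes
print(f"Published message ID: {future.result()}")
```

A Data Fusion pipeline with a Pub/Sub source can then consume these messages for transformation and loading into BigQuery.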

How to use API instead of using Google BigQuery Data Transfer Service?

I am trying to create a BigQuery Data Transfer config for Google AdWords through the API using a programming language (Python, Java). I looked at the documentation about the BigQuery Data Transfer API, but there is no proper process for that, or maybe I did not understand it properly. Can anyone help me understand how to use the API to get daily analytics data from YouTube instead of paying to use the YouTube BigQuery Data Transfer?
You need to get started with the AdWords API:
https://developers.google.com/adwords/api/docs/guides/first-api-call
Refer to the Getting Started section of the Python client library README file to download and install the AdWords API client library for Python.
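If you do want to create the transfer config itself programmatically, a minimal sketch with the google-cloud-bigquery-datatransfer Python client might look like the following; the project, dataset, and customer ID are placeholders, and the exact data_source_id and params keys are assumptions that should be checked against the data source documentation.

```python
# Sketch: create a BigQuery Data Transfer config for AdWords/Google Ads via the API.
# Assumes google-cloud-bigquery-datatransfer is installed and credentials are configured.
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()

project_id = "my-project"      # placeholder
dataset_id = "adwords_export"  # placeholder: existing BigQuery dataset

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id=dataset_id,
    display_name="Daily AdWords transfer",
    data_source_id="adwords",                # assumption: data source id
    params={"customer_id": "123-456-7890"},  # assumption: required parameter name
    schedule="every 24 hours",
)

created = client.create_transfer_config(
    parent=client.common_project_path(project_id),
    transfer_config=transfer_config,
)
print(f"Created transfer config: {created.name}")
```

The same pattern applies to other data sources, such as the YouTube report transfers, with a different data_source_id and params.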

Pushing data from iOS to Google Cloud BigQuery

I am new to Google Cloud Platform and not quite sure about the whole architecture, but what I am trying to achieve is to save some data to Google Cloud from an iOS application and do some analytics work on this data using Google Cloud products such as Dataproc and Datalab. From what I have read so far, I would need to create a dataset in Google Cloud BigQuery and create a table in it. I have done this using the Google Cloud web UI, but now I want to populate the table from my iOS app, and I can't seem to find how to do that.
The most painless route would be to wire up Firebase Analytics and then turn on its daily log export to BigQuery, as described by Google in the walkthrough Importing Firebase Analytics Data into BigQuery. Google maintains the entire analytics export stack for you then, seeing as they also maintain Firebase. The downside is that the analytics export happens only daily.
Alternatively, you'd be looking at using the BigQuery REST API to upload data, as documented by Google in their Loading Data with a POST Request how-to guide. The iOS tooling for that would be your usual NSURLSession and NSURLSessionDataTask APIs, or whatever abstraction you prefer that's built atop them.
Google does maintain a collection of iOS-native APIs, but unfortunately, BigQuery is not included amongst the supported APIs as of May 2017. There are native BigQuery clients for Go, C#, and Java, amongst others. So you could use your own API to upload to a server you control, and then use one of those client libraries server-side to implement the actual BigQuery integration, if you wished.
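For the server-side option, a minimal sketch of the insert with the BigQuery Python client might look like the following; the table name and row fields are placeholders, and the assumption is that the iOS app POSTs its payload to a server you control, which then runs something like this.

```python
# Sketch: server-side streaming insert into BigQuery with the Python client library.
# Assumes google-cloud-bigquery is installed and credentials are configured;
# the table and field names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

table_id = "my-project.my_dataset.events"  # placeholder: project.dataset.table

rows = [
    # Example payload forwarded from the iOS app.
    {"user_id": "abc123", "event": "app_open", "timestamp": "2017-05-01T12:00:00Z"},
]

errors = client.insert_rows_json(table_id, rows)  # streaming insert
if errors:
    print(f"Insert failed: {errors}")
else:
    print("Rows inserted.")
```

This uses streaming inserts rather than the POST-request load job described above, which tends to suit small, frequent writes from an app backend.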

Is JIRA Web Data Connector live or is it just an extract?

I am trying to create dashboards in Tableau from my JIRA data, and I was investigating ways to pull that data into Tableau. I found this online:
https://marketplace.atlassian.com/plugins/com.kaanha.jira.tableau/cloud/overview
I wanted to know if the WDC created would have a live connection with JIRA, or if it is just a one-time extract from JIRA?
Thank you!
With WDCs it is not possible to have a live connection.
You can refresh the data manually in Tableau Desktop, or, if you have access to Tableau Server, you can install the WDC on Tableau Server and let it refresh on a schedule to make the latest data available to users.

RTD Server : how to read RTD server data in C#

I have developed an application to read data from an Excel spreadsheet which gets its data from an RTD server, so whenever the spreadsheet gets updated I can write that data to a database using C#.
But I don't want to rely on that spreadsheet as an intermediate source of the RTD server's data. Is it possible to get data directly from the RTD server in my application, so that I can store it in the database?
That must be possible; I just need a hint on how to do it.
Thanks. DnyaN :)
