Google Data Catalog systems list not refreshing - google-data-catalog

I am using the connector suggested in some GCP documentation to sync metadata from Tableau to Google Data Catalog (https://github.com/GoogleCloudPlatform/datacatalog-connectors-bi). The sync process runs as expected and I am able to search and explore the data. However, the list of available systems has not been updated after several days. I am not sure whether there is something else I need to do to make this happen.

Related

Google Sheets Data Connector to BigQuery Disappeared

Existing, working connections to BigQuery from Google Sheets (using the new data connector) just stopped working yesterday, and they no longer show up in the "Data" menu as before.
Is anyone else experiencing this? And does anyone know how to mitigate it?
Support suggested making a new sheet in incognito mode, but the connector is still missing from the menu.
It appears the existing sheets no longer have "Refresh" as an option, even though the connected data remains in the sheet.
You are right. It was available for "Business", but now it is restricted to Enterprise+.
Below is the response I got from the Sheets team.
Thanks for contacting Google Cloud Support, my name is XXXXXXXX and I am with the Sheets Specialist Team. (...). I see that you have lost access to Bigquery in Sheets. I regret to inform you that the feature is no longer available for G Suite Business accounts, you would need to upgrade to G Suite Enterprise for the feature to be available again, you will find additional information in this Help Center article: https://support.google.com/a/answer/9604541. Please let me know if this answers your question or if additional support is required.
Best Regards.
Gabriel
It appears Google has released a big update to the data connector for BigQuery, which adds a lot of functionality (including pivots, calculated columns within the sheet, and a refresh schedule).
However, they have now limited which types of accounts can use the connector:
See https://support.google.com/docs/answer/9077536?co=GENIE.Platform%3DDesktop&hl=en
And this list (at least as of 7/18/2020) no longer includes the "Business" tier, which is disappointing to say the least.

Can we get records from Firebase in PDF or Excel

We have an iOS app that uses Firebase as its database. We have not configured a server for the application; we insert, update, and read data (records) directly from Firebase, which behaves like our server. My requirement is that I need to get one week's worth of records in Excel or PDF format and send the file over email. Can we automate this process once every week? Is there any way to execute scripts from the Firebase console to automate this?
Thanks in advance.
This question is asking multiple questions. In the future, try to stick to one per post as multiple questions can make answers very long and convoluted.
1) My requirement is i need to get one week records in excel or pdf format
No, Firebase does not have that as a direct option. Firebase is a JSON database, and exporting (from the Realtime Database) will give you JSON-formatted text.
It's pretty straightforward to get the data you want via a query against the Realtime Database or Firestore and then export it from your app in whatever format you like.
Note that Excel, Numbers, OpenOffice, and other spreadsheet apps can read comma-delimited (CSV) files easily, so that may be an option instead of creating a specific file type. There are also a number of JSON converters.
Lastly, Firestore exports are supported through its managed export operations, which would enable data to be exported to BigQuery, for example.
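As a rough illustration of the query-and-export idea, here is a minimal sketch using the Node.js Admin SDK (so it can run server-side or inside a Cloud Function). The collection name "records" and the "createdAt" timestamp field are assumptions made for the example; substitute your own schema.

```typescript
import * as admin from "firebase-admin";

admin.initializeApp();

// Pull the last 7 days of documents and flatten them into CSV text.
// "records" and "createdAt" are hypothetical names; use your own collection/field.
async function exportLastWeekAsCsv(): Promise<string> {
  const oneWeekAgo = admin.firestore.Timestamp.fromMillis(
    Date.now() - 7 * 24 * 60 * 60 * 1000
  );

  const snapshot = await admin
    .firestore()
    .collection("records")
    .where("createdAt", ">=", oneWeekAgo)
    .get();

  const header = ["id", "createdAt", "data"].join(",");
  const rows = snapshot.docs.map((doc) => {
    const d = doc.data();
    // Quote each field so commas inside values do not break the CSV.
    return [doc.id, d.createdAt.toDate().toISOString(), JSON.stringify(d)]
      .map((v) => `"${String(v).replace(/"/g, '""')}"`)
      .join(",");
  });

  return [header, ...rows].join("\n");
}

exportLastWeekAsCsv().then((csv) => console.log(csv));
```

From there you could attach the CSV to an email with whatever mail service you already use, or convert it to Excel/PDF with a third-party library.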
2) Can we automate this process once in every week.
Yes, you can automate tasks to run on a regular schedule via a cron-style trigger and Cloud Functions for Firebase.
There's a great Firebase blog post on Scheduling Cloud Functions for Firebase (cron), as well as additional reading in the Firebase guide on scheduled functions, and, for reference, here's some general information about cron.
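A minimal sketch of such a scheduled function is below, using the firebase-functions v1 API. The weekly schedule, the time zone, and the idea of dropping the report into Cloud Storage are all illustrative choices, not requirements, and the CSV placeholder stands in for your real export logic.

```typescript
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();

// Runs once a week; the schedule is managed by Cloud Scheduler under the hood.
export const weeklyExport = functions.pubsub
  .schedule("every monday 09:00") // App Engine cron syntax also works here
  .timeZone("America/Los_Angeles")
  .onRun(async () => {
    // Build the weekly report here (e.g. the CSV from the previous sketch),
    // then email it or drop it into Cloud Storage for download.
    const csv = "id,createdAt,data\n"; // placeholder for the real export
    // Assumes a default Storage bucket is configured for the project.
    await admin.storage().bucket().file("reports/last-week.csv").save(csv);
    return null;
  });
```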

Import data from another source into Adobe Analytics

I’m trying to tie data from another product with my data inside of Adobe Analytics.
We have Adobe Analytics javascript on our website collecting data and we use a third party tool to track how users interact with certain parts of the website. We’re trying to use the Adobe API to tie the data together.
So far we’ve gone down the path of using the Data Insertion API, but it wasn’t quite right as it’s meant to be used as a replacement for the JS, from what I can tell.
We also explored the Data Sources API. The documentation suggests you can use a transaction ID to tie offline data to the data collected by the JS; we tried this and it doesn't match the data up. We're now exploring using the visitor ID to tie the sessions together, but we're having problems uploading any rows with a visitor ID column: Adobe just returns the error “Column header: ‘visitorid’ is not a valid column header”. We've tried several variations of the column name, such as “visitor_id”, “visitor-id”, “visitor id”, etc., and still no luck.
The end goal is for us to be able to upload data to Adobe that will update/add eVars for already existing sessions earlier that day. How would I go about doing this? Is there something I'm missing or doing wrong?
Edit: I managed to solve this problem by using the Adobe SAINT API. When a user arrives at the site, we push an eVar for that user with a unique ID, and then the day after we use the SAINT API, keyed by the unique ID in the eVar we pushed previously, to add the additional data we needed.
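For anyone trying the same approach: a SAINT (classification) upload is essentially a tab-delimited table whose first column, Key, must match the value stored in the eVar. The sketch below only builds such a file; the column names Segment and Score, the file name, and the row data are made up for illustration, and you would still push the result through the usual SAINT/classification import workflow for your report suite.

```typescript
import { writeFileSync } from "fs";

// Hypothetical enrichment record keyed by the unique ID we pushed into the eVar.
interface EnrichmentRow {
  uniqueId: string;
  segment: string;
  score: number;
}

// Build a tab-delimited classification table: "Key" must equal the eVar value,
// and the remaining columns are whatever classifications you defined for that eVar.
function buildClassificationFile(rows: EnrichmentRow[]): string {
  const header = ["Key", "Segment", "Score"].join("\t");
  const lines = rows.map((r) =>
    [r.uniqueId, r.segment, String(r.score)].join("\t")
  );
  return [header, ...lines].join("\n");
}

const file = buildClassificationFile([
  { uniqueId: "a1b2c3d4", segment: "power-user", score: 42 },
]);
writeFileSync("evar1-classifications.tab", file);
```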
It could be a good idea to look back at the Data Insertion API and combine it with the visitor ID approach, where you tie existing/old visitor IDs to new eVars and use the timestamp to "update" the dataset.
Although this is experimental, it might be worth a try.
Best regards,

Get spreadsheet URLs

I have a problem with my reporting. Every day I create a Google Sheets tracker in which all the stakeholders in my department update their work progress, so I have plenty of spreadsheets to monitor, which is a hassle. Here is what I'm trying to do: I want to build one big master tracker that gives me access to the data entered in the individual spreadsheets. All I need is for the URLs of the spreadsheets in my Google Drive to be retrieved into this big tracker; with that, I'll be able to pull all the needed data from the individual trackers.
PS: I'm not good with Google Apps Script.
You can use the Drive Service to get a list of files with MIME type "application/vnd.google-apps.spreadsheet" using getFilesByType. This returns a FileIterator, which you can use to access each spreadsheet file individually. From there, just use getUrl() to get the URLs. The FileIterator documentation has examples of how to loop through all the matching files.
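A minimal sketch of that approach, to be pasted into the Apps Script editor bound to your master tracker (writing into the active sheet and the Name/URL column layout are arbitrary choices for the example):

```javascript
// Lists every Google Sheets file in your Drive and writes its name and URL
// into the active sheet of the master tracker.
function listSpreadsheetUrls() {
  const sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
  sheet.appendRow(["Name", "URL"]); // header row

  const files = DriveApp.getFilesByType(MimeType.GOOGLE_SHEETS);
  while (files.hasNext()) {
    const file = files.next();
    sheet.appendRow([file.getName(), file.getUrl()]);
  }
}
```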

How to retrieve the list of top 1000 games in the iOS App Store in real-time

I'd like to retrieve the list of top 1000 apps (specifically games) in the iOS App Store in real-time. This information is public (at least the top 300), but Apple doesn't offer any API or automated way to fetch these lists. Does anyone know a good service for this?
I've listed similar topics in forums below, as well as different useful resources. Most of these help you track your own applications, but I'm interested in following trends for ALL apps in the App Store.
Thanks!
Similar topics:
How can I use the App Store API to get the top 100 list?
What is the common architecture to build an App Store application website?
https://stackoverflow.com/questions/2689711/itunes-app-store-api
http://www.cocos2d-iphone.org/forum/topic/13167
Solutions:
http://www.appfigures.com
http://majicjungle.com
http://itunes.apple.com/rss/generator
There appears to be no documented public API, but you can (currently) still get at the data.
You can use Wireshark (or similar) to figure out the URL sequences and the user agent that iTunes on a Mac or PC uses to fetch the popularity-sorted pages it displays as you click through them. It's all (currently) done over plain HTTP. You can get hundreds of pages covering many thousands of apps this way. Then parse and decode the XML returned for those URLs to extract the app names shown on each page. A bunch of Perl scripts driving wget or curl might work.
Note that the URLs, the user agent, and the format of the XML returned often change when Apple updates iTunes, so you will need to periodically re-adapt your retrieval mechanism.
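If you only need the charts rather than every storefront page, the (undocumented) iTunes RSS feeds mentioned above are a lighter-weight starting point. The sketch below assumes the older feed URL shape, the Games genre ID 6014, and a JSON response with entries under feed.entry; all of those details have changed over time and may need adjusting.

```typescript
// Fetch the current top free games chart from the iTunes RSS feed (undocumented;
// URL shape, entry cap, and response structure are assumptions that change over time).
const FEED_URL =
  "https://itunes.apple.com/us/rss/topfreeapplications/limit=200/genre=6014/json";

async function fetchTopGames(): Promise<string[]> {
  const res = await fetch(FEED_URL);
  if (!res.ok) {
    throw new Error(`Feed request failed: HTTP ${res.status}`);
  }
  const body = await res.json();
  // In the old feed format each entry carries the app name under "im:name".label.
  const entries: any[] = body?.feed?.entry ?? [];
  return entries.map((e) => e?.["im:name"]?.label ?? "(unknown)");
}

fetchTopGames().then((names) =>
  names.forEach((name, i) => console.log(`${i + 1}. ${name}`))
);
```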
