Export ProcessMaker Collection Records to Excel

Please, how can I export collection records to a downloadable Excel file? I have written a script that collects the records and puts them in an array, but I need a way to write the array records to an Excel sheet that users can download.
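A minimal sketch of the export step, assuming a ProcessMaker 4 Python script with the openpyxl library available; the record shape and output path are hypothetical, and how the file is actually exposed for download depends on your ProcessMaker version:

    # Sketch: turn a list of record dicts into an .xlsx file.
    # Record shape, file path, and the download mechanism are assumptions.
    from openpyxl import Workbook

    records = [  # hypothetical shape of the collected records
        {"name": "Ada", "amount": 120},
        {"name": "Linus", "amount": 75},
    ]

    wb = Workbook()
    ws = wb.active
    ws.title = "Records"

    headers = list(records[0].keys())
    ws.append(headers)                            # header row
    for rec in records:
        ws.append([rec.get(h) for h in headers])  # one row per record

    wb.save("/tmp/collection_records.xlsx")       # serve this file to the user

From there the saved file can be attached to the request or offered through whatever file-download mechanism your ProcessMaker version provides.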

Related

How to create an external table from a Google Sheet in BigQuery using dbt?

I want to create an external table in BigQuery whose data source is a Google Sheet. Is it possible to do this using dbt? In the .yml file, where should I put the URI?
The main problem is that I don't have access to create it directly in BigQuery.
One way to handle a Google Sheet as a source is by creating a new table out of it in BigQuery via Connected Sheets.
Then, you create a new source in dbt that relies on that table, and start building your downstream models from there.
As far as I know, you cannot create a source directly from dbt, unless it is a seed file, which I would not recommend unless it is a rather static file (e.g. country names and ISO codes, which are not prone to change over time).
We have a similar situation where the data source is a Google Sheet.
The end user updates the Google Sheet on a periodic basis and we replicate it to our Snowflake datastore using Fivetran.
dbt can then pick up the data seamlessly.

What does the log of a Google Sheets-sourced table update look like in BigQuery?

I have several tables in BigQuery that are sourced from Google Sheets. When the Google Sheets table is updated, the table in BigQuery is automatically updated as well. I am trying to understand what the log of this event looks like in Operations Logging. My end goal is to create a sink for these logs in order to feed a Pub/Sub topic and run scheduled queries based on these events.
Thank you
When you use an external table (Google Sheets or otherwise), the data is never stored in BigQuery native storage; it is always external.
Therefore, when you update your Google Sheet, nothing happens in BigQuery. Only when you query the data does BigQuery read the sheet document again and get the latest data.
Therefore, there is no insert log you can track when you update the data in Google Sheets. The only log you get is when you perform a request in BigQuery to read the data (external or not), as mentioned by Sakshi.
When the external data source (Google Sheets or otherwise) is updated and the BigQuery table associated with it is queried, BigQuery initiates an insert job, which is visible in Cloud Logging.
You can find this log by filtering on the resource type BigQuery Project in the Cloud Logging console; you will see protoPayload.methodName set to google.cloud.bigquery.v2.JobService.InsertJob.
For more information on BigQuery logs, you can refer to this documentation.
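A minimal sketch of pulling those entries with the google-cloud-logging Python client; the project id is a placeholder, and the filter simply restates the resource type and method name mentioned above:

    # Sketch: list the InsertJob log entries described above.
    # "my-project" is a hypothetical project id.
    from google.cloud import logging as gcp_logging  # aliased to avoid shadowing stdlib logging

    client = gcp_logging.Client(project="my-project")
    filter_str = (
        'resource.type="bigquery_project" AND '
        'protoPayload.methodName="google.cloud.bigquery.v2.JobService.InsertJob"'
    )
    for entry in client.list_entries(filter_=filter_str):
        print(entry.timestamp, entry.log_name)

Entries matching this filter could then be routed to Pub/Sub with a log sink, per the original goal.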

How to automate splitting Google Sheets columns when Jira Cloud data is imported

I am attempting to fully automate a data pull from Jira into Google Sheets using the Jira Cloud add-on for Google Sheets.
I am having trouble because a Jira Cloud pull creates a column in the sheet for each "value" you pull out of Jira.
I want to:
place a formula on a particular column that will always hold the same kind of data, but often includes values separated by semicolons.
have the data in specific columns automatically split into two adjacent columns when the scheduled weekly job runs to pull the data.
I am unsure how to do this, as the pull overwrites the entire sheet (including removing the additional column, since the import didn't have that extra value/column when pulling the data).
You can have conditional formatting in place and it is not overwritten by the data import, but a formula typed into a specific cell is overwritten by the imported data. One workaround is to re-apply the split after each scheduled import, as sketched below.
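A minimal sketch of that post-import step using the gspread Python client; the spreadsheet key, the source column (C), the target columns (D and E), and the "value;value" layout are all assumptions:

    # Sketch: re-split a semicolon-separated column after the weekly import.
    # Spreadsheet key and column positions are hypothetical.
    import gspread

    client = gspread.service_account()                 # reads the default credentials file
    ws = client.open_by_key("SPREADSHEET_KEY").sheet1  # hypothetical spreadsheet key

    source = ws.col_values(3)                          # column C holds "left;right" values
    for row, value in enumerate(source[1:], start=2):  # skip the header row
        left, _, right = value.partition(";")
        ws.update_cell(row, 4, left.strip())           # write left part to column D
        ws.update_cell(row, 5, right.strip())          # write right part to column E

Scheduling this to run right after the weekly pull sidesteps the overwrite problem, since nothing needs to live in the sheet itself.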

Grails GORM vs MS ADO with Excel Datasource

Morning all! Microsoft has different data access objects that make it easy to connect to and access your database. The old one is called DAO; the newer (still old) one is called ADO. I think Grails' equivalent is GORM.
One great thing about Microsoft's is that it can connect to an Excel file just like a database: a sheet is a table, an Excel column is a database column, and a row is a row. You can use a SQL statement to query the Excel file, for example Select Sum(sheet1.column1) From sheet1 Where sheet1.column10 > 25 Group By sheet1.column5.
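For comparison, the same query style is available outside ADO as well; here is a minimal Python sketch with pandas, assuming a workbook whose "sheet1" tab has headers column1, column5, and column10 (all hypothetical names):

    # Sketch: the SQL above expressed with pandas.
    # Workbook file name and column headers are assumptions.
    import pandas as pd

    df = pd.read_excel("workbook.xlsx", sheet_name="sheet1")

    # Sum(column1) ... Where column10 > 25 ... Group By column5
    result = df[df["column10"] > 25].groupby("column5")["column1"].sum()
    print(result)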
I am learning to use POI to access the Excel file. So far so good; I can do it without problems. (I tried the Grails Excel Import plugin, but it keeps crashing.) Basically, though, I am looping through the rows and columns with a bunch of if-then-else statements to access the data in the Excel file.
Can I connect to an Excel file as a datasource and run dynamic finders to query the Excel file in Grails?
Thanks!
Since GORM is built on Hibernate, it should be possible to use an Excel file as a DataSource for Hibernate. Consult the Hibernate documentation on how to do so. This article may also help give you some ideas.

Google Cloud Dataprep - cannot import table from BigQuery (created from Google Sheets): "Nothing found"

I created a table in BigQuery from Google Sheets, but when I tried importing it into Cloud Dataprep it says there are no tables in the dataset.
I'm not sure whether it's an issue with the Google Sheets integration, because when I check the details of the table it says there are 0 rows, even though there are 3 rows (it's test data).
I already tried giving Cloud Dataprep viewer permission on both the dataset and the project, and nothing changes.
Screenshots: the Dataprep import screen, the BigQuery table info, and the BigQuery entries.
Apparently, when you create a table from Google Sheets it is not recognized as an actual (native) table, so I ran a query to replicate it into a second table, and that works!
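A minimal sketch of that replication step with the BigQuery Python client; the dataset and table names are assumptions:

    # Sketch: materialize the sheet-backed external table as a native table.
    # Dataset and table names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()
    sql = """
    CREATE OR REPLACE TABLE my_dataset.native_copy AS
    SELECT * FROM my_dataset.sheet_backed_table
    """
    client.query(sql).result()  # the native copy should then be visible to Dataprep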
