I'm a bit puzzled. I can't get Zapier to trigger when a new row is added to any sheet of a Google spreadsheet; it forces you to choose one specific sheet. Any way around it?
Cheers!
You will need to create a different "Zap" event for each sheet. To my knowledge, there is no way around that.
However, the fact that you think Zapier should be able to have a single Zap triggered by multiple sheets implies a couple of things:
1.) You're not yet sure what a "Zap" is.
2.) Your data in each separate sheet is very similar, which makes me wonder if your overall spreadsheet setup is less than ideal. Many people end up creating separate sheets for data collection, whereas in most cases data collection should happen in one sheet set up as a standard database, while other sheets present that data in different arrangements (i.e., reports).
You haven't given much detail or provided a link to the spreadsheet, but perhaps this will prompt you to dig into Zapier a bit more and to reconsider the layout of your spreadsheet(s).
I'm currently using a Google spreadsheet with many sheets and many FILTER functions. When a value is changed on one sheet, the FILTER conditions on the other sheets change and a recalculation occurs.
The bug seems to be that data from the previous FILTER result is sometimes left behind in the cells instead of being cleared automatically. I have made a quick script that clears the cells that could hold stale data, which works, but shouldn't be necessary.
Has anyone else ever had this issue?
Thanks and I hope this has been explained well enough.
I am fairly new to Power Apps, and am trying to make a batch data entry form.
I am prototyping this now, and while I think it should work in theory, I keep running into technical errors.
The data source I'm using is Google Sheets. For prototyping purposes, there are three columns: item_id, item, and recorded_value.
The app pulls a list of standard values into a gallery, where the input values can then be selected.
The approach I have taken is to create a gallery whose rows are added to a collection using the code below:
ClearCollect(
    collection,
    ForAll(
        // Filter(Gallery1.AllItems, true) is a no-op filter; AllItems alone is equivalent
        Gallery1.AllItems,
        {
            item: t_item.Text,
            item_id: t_item_id.Text,
            recorded_value: t_recorded_value.Text
        }
    )
)
This is then uploaded to Google Sheets. I have found "success" using the two methods below:
// One Patch call, i.e. one write, per record:
ForAll(collection, Patch(records, Defaults(records), {item: item, item_id: item_id, recorded_value: recorded_value}))
or
// A single Collect that appends the entire collection:
Collect(records, collection)
Overall, I am seeing two main issues in testing:
The initial collect sometimes fails to capture items. I don't know if it is cache related or what, but unless I scroll all the way to the bottom of the gallery, some fields are left blank (maybe not an issue in real use, but it seems odd).
Uploading the records can take excruciatingly long. Initially it was crashing outright because of issue 1, but now it will sometimes get to, say, item 85, sit there for a minute or so, and then work through the rest of the list. For just 99 items, the upload takes several minutes.
Ultimately, I am looking to know whether there is a better approach for what I am doing. I basically want to take at most 99 rows and paste them into the table, but the looping nature of the function feels really inefficient right now. I am not sure whether this is more of a Power Apps issue or a Google Sheets issue, but any advice would be appreciated.
From everything I could research, it seems that a batch upload of records like this is going to be time-consuming nearly any way you approach it.
However, I was able to come up with a workaround that more or less eliminates the problem.
Instead of uploading each record individually, I concatenate all records in the collection into a single cell via a variable, using delimiters to differentiate the rows and columns (set a variable with the Concat function, then Patch that variable to the data source), as sketched below.
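A minimal sketch of the idea, run from something like a button's OnSelect. It reuses the collection and column names from earlier in the thread; the delimiter characters, the varPayload name, and the single-text payload column are placeholders of mine, so pick delimiters that can never occur in your data:
// Flatten the collection into one delimited string:
// ";" separates columns, "|" separates rows
Set(
    varPayload,
    Concat(
        collection,
        item_id & ";" & item & ";" & recorded_value,
        "|"
    )
);
// One Patch, i.e. one network write, for the whole batch
Patch(
    records,
    Defaults(records),
    { payload: varPayload }    // 'payload' is a hypothetical plain-text column
)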
This method allows all of the data to be stored nearly instantaneously.
After that, I just perform some basic ETL in Python to transform the data back into a more standard format and load it into SQL Server, which is fairly trivial to do; a sketch follows.
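For anyone replicating that step, a minimal sketch of the ETL. It assumes the same ";" and "|" delimiters as the sketch above, that the payload cell has already been read out of the sheet into a string, and a placeholder table name and pyodbc connection string:
import pyodbc

def load_payload(payload: str, conn_str: str) -> None:
    # Split the single delimited cell back into (item_id, item, recorded_value) tuples
    rows = [tuple(rec.split(";")) for rec in payload.split("|") if rec]

    conn = pyodbc.connect(conn_str)
    try:
        cur = conn.cursor()
        cur.fast_executemany = True  # send the INSERTs to SQL Server as one batch
        cur.executemany(
            "INSERT INTO records (item_id, item, recorded_value) VALUES (?, ?, ?)",
            rows,
        )
        conn.commit()
    finally:
        conn.close()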
I recommend that others looking to take a 'batch insert' approach try something similar, as it now takes users essentially a second to load records rather than several minutes.
The goal is to hide the formulas in the cells so that users see only the data, in the very same spreadsheet.
Why? I'm currently showing some information to the teams, but the formulas contain information that these teams shouldn't be able to see.
To minimize the problem, I'm importing most of the data with formulas such as IMPORTRANGE or QUERY, but even those methods expose the URL of the original spreadsheet.
Using a script to copy the values isn't a viable solution due to the amount of data and the frequency with which some of those sheets receive updates.
Is it possible to tell Google Sheets not to recalculate a specific sheet or range of cells every time a modification is made?
I have a very heavy spreadsheet, and it is very slow every time I make a tiny modification, even when the change is on another sheet.
It seems this feature is not supported by Google Sheets. From a cursory search online, the easiest way to work around the limitation is to make all your formulas conditional on a trigger, i.e., =IF(Boolean_Trigger, Your_formula, ""). The trigger could be a cell, a named range, etc. For example, with a checkbox in A1, =IF($A$1, QUERY(Data!A:C, "select *"), "") skips the heavy QUERY whenever the box is unchecked. Regards,