Transferring merged cells to Excel BI Publisher

There is a problem: when generating a report, the last block with merged cells is transferred to Excel with page breaks in it. Please tell me how to avoid this.
What I have tried:
Forcing the block onto the next page
Reformatting the document

Related

VBA Receiving Compile error - Out of Memory

I have created a main UserForm that incorporates a MultiPage control with many fields and buttons, and those link to various other userforms, worksheets, and fields. I have reached a point where, when I press F5, I get a "Compile Error - Out of Memory".
I'm new to troubleshooting these kinds of issues, and granted, I did not have a plan when I started in terms of structuring the forms and modules or what this would grow into.
This specific issue came from a page with a scroll feature that looks at a worksheet and pulls records into various comboboxes based on a status of Open, Closed, Hold, etc. Each record retrieves approximately 7-8 fields, and each page can display approximately 50 records, except Closed, which has to have enough for all of them.
I have read a couple of things about setting objects to Nothing and enabling some advanced Windows settings to allow more memory allocation. I feel like maybe it's a combination of structure and not clearing memory when I move around the tool. Any advice, help, or resources you could point me towards?
Attached are the error, the VBA project tree, and a screenshot of one of the multipage items being pulled into the userform from the worksheet. (There will be multiple pages besides "Open" that could have up to 100 or more records.)
Thanks again,
Update: I was able to move past this. My issue was that I had a very bulky form that created a lot of textboxes and comboboxes upon initialization. This obviously required a lot of memory to render all these fields at once, hence the error.
Solution: I rethought the form and decided to use a ListBox; upon selecting a record from the list, the fields I need are populated in the area below the list. This allowed me to go from hundreds of boxes to 12. This was compounded by the fact that I had a MultiPage nested within a page. Sometimes you just need to take a step back, rethink, and restructure your plan.
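For illustration, a minimal sketch of that pattern, assuming a worksheet named "Records" with IDs in column A and fields in columns B-D, a ListBox named lstRecords, and textboxes named txtField1 through txtField3 (all of these names are hypothetical):

' Fill the ListBox once when the form opens, instead of creating
' hundreds of controls up front.
Private Sub UserForm_Initialize()
    Dim ws As Worksheet, lastRow As Long, r As Long
    Set ws = ThisWorkbook.Worksheets("Records")    ' hypothetical sheet name
    lastRow = ws.Cells(ws.Rows.Count, 1).End(xlUp).Row
    For r = 2 To lastRow
        lstRecords.AddItem ws.Cells(r, 1).Value    ' record ID from column A
    Next r
End Sub

' Populate a small, fixed set of boxes only for the selected record.
Private Sub lstRecords_Click()
    Dim r As Long
    r = lstRecords.ListIndex + 2                   ' +2 to skip the header row
    With ThisWorkbook.Worksheets("Records")
        txtField1.Value = .Cells(r, 2).Value
        txtField2.Value = .Cells(r, 3).Value
        txtField3.Value = .Cells(r, 4).Value
    End With
End Sub

Because only one record's fields exist as controls at a time, the form stays small no matter how many records the worksheet holds.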

How to locally cache GOOGLEFINANCE results?

I use GOOGLEFINANCE() to query the historical USD/GBP exchange rate at a set of fixed dates.
This works fine, except sometimes GOOGLEFINANCE returns #N/A, for whatever temporary upstream reason. When this happens, my spreadsheet becomes filled with #REF's, for all cells that depend on these exchange rates. The sheet is unreadable until the upstream data source is fixed. This can sometimes take hours.
This happens frequently and is especially annoying since I'm not using GOOGLEFINANCE to retrieve time-varying data. I'm just using it as a static dataset of historical exchange rates, so theoretically I have no reason to refresh the data at all.
Is there a way to locally cache the historical exchange rates in my sheet, and to fall back on those values if GOOGLEFINANCE returns #N/A?
(Bonus points for automatically updating the cached values if GOOGLEFINANCE changes its mind about the historical exchange rates.)
I know this is an old post and you probably don't care anymore, but I was having issues with my triggers updating my assets page every night - the totals would fail if any one stock price had an error.
I created a custom function which caches the GOOGLEFINANCE results, so it reverts to the last valid data point if GOOGLEFINANCE() fails.
However, this led to the custom function's Achilles heel, 'Loading...', which came up occasionally as well. So I then modified it to use triggers for updates, using my new custom function code, which never fails.
I made it an open-source project, with one file you need to add to your Apps Script.
Using it as a custom function would be something like:
=CACHEFINANCE(symbol, attribute, defaultValue)
For example:
=CACHEFINANCE("TSE:ZTL", "price", GOOGLEFINANCE("TSE:ZTL", "price"))
However, if you follow the instructions to create a trigger, it is far more reliable. It also has a built-in web screen scraper to track down info on stocks that GOOGLEFINANCE refuses to collect data for.
github cachefinance
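For reference, here is a minimal sketch of the caching idea in plain Apps Script. This is not the cachefinance code; the function name CACHEDVALUE and the key argument are invented for illustration, and the GOOGLEFINANCE call is wrapped in IFERROR so an upstream error arrives as a blank instead of aborting the custom function:

/**
 * Usage in a cell:
 *   =CACHEDVALUE("USDGBP", IFERROR(GOOGLEFINANCE("CURRENCY:USDGBP")))
 * Returns the live value when it is available (and remembers it),
 * otherwise falls back to the last good value that was stored.
 */
function CACHEDVALUE(key, liveValue) {
  var store = PropertiesService.getScriptProperties();
  if (liveValue !== '' && liveValue !== null && liveValue !== undefined) {
    store.setProperty(key, String(liveValue)); // remember the latest good value
    return liveValue;
  }
  var saved = store.getProperty(key);          // upstream failed: use the stored value
  return saved !== null ? Number(saved) : '';
}

The trigger-based setup the author describes refreshes the stored values on a schedule instead of at recalculation time, which is what avoids the 'Loading...' problem.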
Well, you are working with historical data, i.e. data that won't change no matter what, so you can fetch the values you need once and just hardcode them, i.e. get rid of GOOGLEFINANCE for good.
Another way would be to wrap anything that could produce #REF! in IFERROR, so when the blackout occurs you get a nice blank sheet instead of a sea of #REF! errors.
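For example, a single historical lookup wrapped that way might look like this (the currency pair and date are just an illustration; INDEX picks the close value out of the small table GOOGLEFINANCE returns for a dated query):

=IFERROR(INDEX(GOOGLEFINANCE("CURRENCY:USDGBP", "price", DATE(2020,1,2)), 2, 2), "")

Cells that depend on the rate can be wrapped the same way, so a temporary outage shows up as blanks rather than cascading through the sheet.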

How to announce to user when a Google Script is completed

On a Google Sheet, I have an extensive onOpen script (viewable here) to refresh data on several sheets within the workbook. It takes a while for all the script lines to execute, i.e. to import new raw data and then perform five different unique queries against that new data, thereby updating data on five sheets. When the Google Sheet is opened, the user sees two successive pop-up yellow "Working" boxes, then five subsequent progress bars (while each of the queries does its thing). This all takes quite a while.
I'd like to add something to the script routine to announce to the user that "all data is now refreshed and ready to view." A simple MessageBox ("Data now refreshed") placed at the end of the scripts seems to pop up before all script commands are actually completed. Thus the message box gives misleading information, and I think it also blocks some script lines from executing until "OK" is clicked. So MessageBox doesn't seem to work. I'm looking for a way to confirm that all script lines (and all unique queries) are, in fact, complete before informing the user that it's OK to start viewing the data. Thanks.
Place a SpreadsheetApp.flush() before the "finished" alert box; the flush() method:
Applies all pending Spreadsheet changes. Spreadsheet operations are sometimes bundled together to improve performance, such as when doing multiple calls to Range.getValue(). However, sometimes you may want to make sure that all pending changes are made right away, for instance to show users data as a script is executing.
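A minimal sketch of that ordering in Apps Script (the actual refresh steps are only hinted at in a comment):

function onOpen() {
  // ...existing refresh steps: import the new raw data, run the five unique queries...
  SpreadsheetApp.flush();  // apply every pending spreadsheet change before announcing
  SpreadsheetApp.getUi().alert('All data is now refreshed and ready to view.');
}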

Most strange issue with Crystal reports

This is very strange. I have a Crystal report that takes over 30 minutes to run. It uses 5 large tables and queries the server. I made a view on the server (which is IBM i) to gather the data there. For some reason it is not giving me data in the report past 08/12. When I query past that date on the server, it does have data, and even if I make a quick report in Crystal it will show all the data, including 2013.
The reason can possibly be this:
When I made the view, I mistakenly had a mix of databases in use, and one of the two databases was being used as part of a data purge, so it may not have had data past 8/12.
But since that point, I have also modified the view to add some new columns, and the report does pick these up and even shows them in the data that it does return (up to 8/12).
So this would tell me that the CR is fully using the new View.
So I could recreate the report, but this is rather tedious. Perhaps there is one thing I am not doing?
Crystal Reports is generally better at reporting than at processing queries. For a faster and easier way of debugging, it's often better to make a procedure in your database that joins together the data from the various sources. Once you have the data you want, then use Crystal to display that data.
In other words, try to avoid doing any more work in Crystal than you have to. Sure, the grouping and headers and pretty formatting will be done there. But all of the querying, joining, and sorting is better done in your database. If the query is slow there, then you can optimize there. If the wrong data is returned, you fix your procedure until it is returning what you want.
An additional benefit is when the report needs to change. If the data needs to come from a different location, you can modify the procedure and never touch Crystal. If the formatting needs to change, you can modify the Crystal and never touch the procedure. You're changing less and thus don't have to test everything.
Is the crystal report attached to a scratch server?
If you are using SQL Server, then you can modify the SQL that constitutes your view by qualifying the table names like this: databasename..tablename. I'm not certain how to do the equivalent in other DBMSs.
If you modify your view like that, so that it queries tables from the correct non-purged database, and you are still not getting data more recent than 8/12, then check whether there are constraints in the WHERE and/or HAVING clauses, or implicit/explicit constraints in the ON clauses of the JOINs.
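A sketch of what that qualification might look like on SQL Server; every database, table, and column name below is made up:

-- Point the view explicitly at the correct, non-purged database.
ALTER VIEW dbo.vSalesForReport AS
SELECT o.OrderId, o.OrderDate, c.CustomerName
FROM LiveDB..Orders AS o                  -- databasename..tablename qualification
JOIN LiveDB..Customers AS c
    ON c.CustomerId = o.CustomerId        -- check ON clauses for hidden date filters
WHERE o.OrderDate >= '2012-01-01';        -- and WHERE/HAVING for date constraints

If the report still stops at 8/12 once the view points at the right database, the remaining cut-off may be coming from one of those filter clauses or from a record-selection formula inside Crystal itself.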

UITableView + Large web data source

I'm using a UITableView which hooks into a REST API.
On first launch the app retrieves the data the UITableView will display and parses it into a Core Data database.
This works fine for small datasets, but when the dataset grows above 300-500 items it does not perform very well, taking minutes to finish downloading and parsing. The app isn't deadlocked during this time, but the user likely won't wait for the parsing to complete.
I then decided to use paging. So now, I only retrieve the latest 20 items, and then the user can click "Load more" to go back further. The data is cached.
This seems to work well except for one problem.
Because I'm not downloading all the data on each load, I cannot tell when an item has been deleted on the server and I cannot tell when an item has changed (say the title may have changed).
Can anyone provide me with any suggestions to resolve this?
Thanks.
We routinely request a similar number of items and display them in a table view. However, in our case the API returns JSON and we store it in model objects, not Core Data. Once the data is downloaded, it takes less than a second to go from JSON to appearing in the table. Core Data is a bad idea for anything that isn't actually a database, or that isn't preserved past a user session. But you need to identify which part of your transaction is actually taking the most time. In our case it's the backend behind the API, but once the data shows up, everything is quite fast.
Also, in our case the data is around 700K and we are going to GZIP it soon to minimize the network time even further.
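For illustration, a minimal sketch of that approach in Swift, decoding the JSON straight into plain model objects and reloading the table; the endpoint URL and the Item fields are invented, not taken from the question:

import UIKit

// Plain model object decoded directly from the API's JSON.
struct Item: Codable {
    let id: Int
    let title: String
}

final class ItemsViewController: UITableViewController {
    private var items: [Item] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        tableView.register(UITableViewCell.self, forCellReuseIdentifier: "Cell")
        // Hypothetical endpoint; the real app would page with a limit/offset.
        let url = URL(string: "https://api.example.com/items?limit=20")!
        URLSession.shared.dataTask(with: url) { data, _, _ in
            guard let data = data,
                  let decoded = try? JSONDecoder().decode([Item].self, from: data) else { return }
            DispatchQueue.main.async {
                self.items = decoded        // keep models in memory, no Core Data round trip
                self.tableView.reloadData()
            }
        }.resume()
    }

    override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        return items.count
    }

    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "Cell", for: indexPath)
        cell.textLabel?.text = items[indexPath.row].title
        return cell
    }
}

In the answer's experience, skipping the Core Data import removes most of the parsing cost; persistence is only worth adding back if the data genuinely needs to survive between sessions.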
