abas ERP: Limit for table rows in additional databases

Is there a limit on the number of table rows in additional databases in abas ERP?
If there is a limit: on which factor is the limit based, how can I calculate it, and what happens if I try to add more rows via GUI, FO, or EDP/EPI?
Is this documented in the abas online help? I haven't found it.

Yes, there is a limit, and unfortunately it is not customizable.
You can see a full list of known limitations under help/hd/html/49B.1.4.html.
In your specific case, the limit for rows in additional databases is 65535.
If you reach the limit, the abas core will show an error message and terminate your current FOP. You can (and should) check the current number of rows by evaluating the variable tzeilen (currTabRow).

In this case I'm also not aware of any limit other than the one you mentioned, but you can query ozeilen in a selection list (for master files only, not e.g. for sales and purchasing, because the rows there aren't physically 'rows'). tzeilen (currTabRow) is buffer-related.

Related

Updating a query for false positives

I work in a compliance role at a very small start-up and review a lot of information every day, for example bank transfers, direct deposits, and ACHs. A report is pulled from BigQuery, which is exported to Google Sheets.
My problem is that there are a lot of false positives (basically, "posting data" that repeats often), and I'm trying to eliminate them.
One idea was just to update the query with key words:
WHERE postingdata LIKE 'PersonName%'
But that's tiresome and time-consuming, and I feel there's a better way, perhaps 'filtering' the results and then feeding them back into the query. Any ideas, tips, or just general thoughts?
In this case you can use GROUP BY in your query. Here is how to use the clause. Consider this code:
SELECT account, TypeTransaction, amount, currency
FROM `tblBankTransaction`
The code returns this data, and some rows are repeated; for example, rows 1 and 7 share the account 894526972455, and both are deposits.
In this case, I will use the GROUP BY clause:
SELECT account, TypeTransaction, amount, currency
FROM `tblBankTransaction`
GROUP BY account, TypeTransaction, amount, currency
And it returns this data:
You can see in this example that the account 894526972455 with a deposit now returns only one row. The same account returns a second row, but it is a transfer; that's a different type of transaction. It depends on the information you have and which columns you want to group by.
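If you also want to see how often each combination repeats, a small variant of the same query can surface the frequent false positives (a sketch against the same table):
-- Count how often each combination occurs; HAVING keeps only the repeats
SELECT account, TypeTransaction, amount, currency, COUNT(*) AS occurrences
FROM `tblBankTransaction`
GROUP BY account, TypeTransaction, amount, currency
HAVING COUNT(*) > 1
ORDER BY occurrences DESC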
Within Google Sheets you can try UNIQUE, QUERY with a group by aggregation, or SORTN with mode 2 as its 3rd parameter.
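For example (a sketch; A2:D is a hypothetical placeholder for the exported columns):
=UNIQUE(A2:D)
=QUERY(A2:D, "select A, B, C, D, count(A) group by A, B, C, D", 0)
=SORTN(A2:D, 9^9, 2, 1, TRUE)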

Extract Last Value as Metric from Table Calculation in Tableau?

I have raw data in Tableau that looks like:
Month,Total
2021-08,17
2021-09,34
2021-10,41
2021-11,26
2021-12,6
And by using the following calculation
RUNNING_SUM(
COUNTD(IF [Inserted At]>=[Parameters].[Start Date]
AND [Inserted At]<=[End Date]
THEN [Id] ELSE NULL END
))
/
LOOKUP(RUNNING_SUM(
COUNTD(IF [Inserted At]>=[Parameters].[Start Date]
AND [Inserted At]<=[End Date]
THEN [Id] ELSE NULL END
)),-1)*100-100
I get
Month,My_Calc
2021-08,NULL
2021-09,200
2021-10,80.4
2021-11,28.3
2021-12,5.1
And all I really want is 5.1 (last monthly value) as one big metric (% Month-Over-Month Growth).
How can I accomplish this?
I'm relatively new to Tableau and don't know how to use calculated fields in conjunction with the date-grouping aspect to express that I want to calculate month-over-month growth. I've tried the native year-over-year growth running-total table calculation, but that didn't end with the same result, since I think my calculation method is different.
First a brief table calc intro, and then the answer at the end.
Most calculations in Tableau are actually performed by the data source (e.g. database server), and the results are then returned to Tableau (i.e. the client) for presentation. This separation of responsibilities allows high performance, even when facing very large data sets.
By contrast, table calculations operate on the table of query results that were returned from the server. They are executed late in the order-of-operations pipeline. That is why table calcs operate on aggregated data -- i.e. you have to ask for WINDOW_SUM(SUM([Sales])) and not WINDOW_SUM([Sales]).
Table calcs give you an opportunity to make final passes of calculations over the query results returned from the data source before presentation to the user. You can, for instance, calculate a running total, or make the visualization layout dynamically depend in part on the contents of the query results. This flexibility comes at a cost: the calculation itself is only one part of defining a table calc. You also have to specify how to apply the calculation to the table of summary results, known as partitioning and addressing. The Tableau online help has a useful definition of partitioning and addressing.
Essentially, table calcs are applied to blocks of summary data at a time, aka vectors or windows. Partitioning is how you tell Tableau how you wish to break up the summary query results into windows for purposes of applying your table calc. Addressing is how you specify the order in which you wish to traverse those partitions. Addressing is important for some table calcs, such as RUNNING_SUM, and unimportant for others, such as WINDOW_SUM.
Besides understanding partitioning and addressing very well, it is also helpful to learn about the functions INDEX(), SIZE(), FIRST(), LAST(), WINDOW_SUM(), LOOKUP() and (eventually) PREVIOUS_VALUE() to really understand table calcs. If you really understand them, you'll be able to implement all of these functions using just two of them as the fundamental ones.
Finally, to partially address your question:
You can use the boolean formula LAST() = 0 to tell if you are at the last value of your partition. If you use that formula as a filter, you can hide all the other values. You'll have to get partitioning and addressing specified correctly. You would essentially be fetching a batch of data from your server, using it in calculations on the client side, but only displaying part of it. This can be a bit brittle depending on which fields are on which shelves, but it can work.
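As a minimal sketch (assuming your month-over-month calc is saved in a hypothetical calculated field called [My_Calc]):
// Filter field: TRUE only for the last value in the partition; drag to Filters and keep True
LAST() = 0
// Or wrap the measure itself so only the final value is displayed
IF LAST() = 0 THEN [My_Calc] END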
Normally, it is more efficient to use a calculation that can be performed server-side, such as an LOD calc, if that allows you to avoid fetching data only for client-side calculations. But if the data is already fetched for another purpose, or if the calculation requires table calc features, such as the ability to depend on the order of the values, then table calcs are a good tool.
However you do it, the % month-over-month change from 2021-11 (a value of 26) to 2021-12 (a value of 6) is not 5.1%.
It's ((6 - 26) / 26) * 100 = -76.9%.
OK, starting from scratch, this works for me. (I don't know how to get exactly the table format I want without using Show Me and Flip, but it works. Anyone else?)
drag Date to the rows shelf, change it to the combined Month(Date)
drag Sales to the columns shelf
in Show Me, select text tables
flip rows for columns using the toolbar
that gets a table like the one you show above
Drag Sales to Color (this is a trick, simply to hold it for a minute),
click the down-arrow on the new SALES pill in the marks card,
select "Add a table calculation",
select Running Total, of SUM, compute using Table (down), but don't close this popup window yet.
click the Add Secondary Calculation checkbox at the bottom
select Percent Difference From
compute using Table (down)
relative to Previous
Accept your work by closing the popup (x).
NOW, change the new pill in the marks card from Color to Text
you can see the 5.1% at the bottom. Almost done.
Reformat again by clicking text tables in Show Me
and flipping the axes.
click the Sales column header and hide it
create a new calculated field
label 'rows-from-bottom'
formula = last()
close the popup
drag the new pill rows-from-bottom to the filters shelf
select range 0 to 0
close the popup.
Done.
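For reference, the two stacked table calculations above collapse into roughly one calculated field like this (a sketch, assuming the measure is SUM([Total]), computed using Table (down)):
// Percent difference of the running total from the previous month
(RUNNING_SUM(SUM([Total])) / LOOKUP(RUNNING_SUM(SUM([Total])), -1) - 1) * 100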
For the next two weeks you can see the finished workbook here
https://public.tableau.com/app/profile/wade.schuette/viz/month-to-month/hiderows?publish=yes

Is there any limit to the number of rows returned by API?

I am making a bulk call with 30 posts and daily data for all of them. Is there any limit to the number of rows that will be returned by the API?
I am having problems getting the results.
Can anyone please help?
YouTube doesn't return any rows ... it's not relational data. That may sound like a pedantic thing to point out, but it's crucial for the next point: the API will return 50 videos at a time, along with tokens to get more results based on the same query, up to a total of about 500. Because the data isn't relational, you can't just "select all rows" that match certain criteria. Rather, the API probabilistically determines relevance to your search parameters, and after about 500 results the algorithms don't have enough certainty to make additional results relevant.
So in your case, where you can change the date as needed (to allow the algorithms to be more specific), you'll want to do a series of calls; perhaps one at a time (since you have to paginate anyway to get more than 50 results, it's probably not that much more expensive in terms of network bandwidth).
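For example, a rough pagination sketch in Python (hypothetical: it assumes the YouTube Data API v3 search endpoint and that you have a valid API key):
import requests

API_KEY = "YOUR_API_KEY"  # assumption: a valid YouTube Data API v3 key
URL = "https://www.googleapis.com/youtube/v3/search"
params = {
    "part": "snippet",
    "q": "example query",  # narrow this (e.g. with publishedAfter/publishedBefore)
    "maxResults": 50,      # the per-page maximum
    "key": API_KEY,
}

items = []
while True:
    resp = requests.get(URL, params=params).json()
    items.extend(resp.get("items", []))
    token = resp.get("nextPageToken")
    if not token:          # no token means no further pages (~500 results max)
        break
    params["pageToken"] = token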

Is there a way to tell if an activerecord query hit its limit

given the following query:
Cars.where(color: "red").limit(5)
Is there a way to tell if the limit was hit? Say there are 6 red cars; do I have to do a separate query to count the total number of red cars?
Basically, I am trying to send a message to the user letting them know that their search was limited because it reached the maximum number of results allowed.
The only way to combine it into one query is to use limit(6): if the result size is 6, remove the last element and record that there are more results.
Alternatively, do a separate query, Cars.where(color: "red").count. Although this issues a separate SQL query, count queries are often very fast for databases.
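A minimal sketch of that limit(n + 1) trick, reusing the scope from the question:
# Fetch one extra row to detect whether the limit was hit, then trim it
limit = 5
cars = Cars.where(color: "red").limit(limit + 1).to_a
limit_hit = cars.size > limit   # true means there are more results than shown
cars = cars.first(limit)        # display only the first 5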
The direct answer to your question is that you cannot get this information back in one single query. You would have to run two queries: one to return the limited set, and one to find the total count.
The only time this wouldn't be necessary is when you are querying the first page of results (the offset is 0) and the number of returned results is less than the limited value (e.g., you set the limit to 5 but get 4 results back).
Sample code would look like this:
limit = 5
cars_scoped = Cars.where(color: "red")
cars = cars_scoped.limit(limit)
cars_count = cars.length < limit ? cars.length : cars_scoped.count

How do I handle large amounts of logfile data for display in dynamic charts?

I have a lot of logfile data that I want to display dynamic graphs from, for basically arbitrary time periods, optionally filtered or aggregated by different columns (that I could pregenerate). I'm wondering about the best way to store the data in a database and access it for displaying charts, when:
the time resolution should be variable from one second to a year
there are entries that span several 'time buckets', e.g. a connection might have been open for a few days and I want to count and display the user for every hour she was connected, not just in the hour 'slot' the connection was created or finished
Are there best practices, or tools/plugins for rails that help handle this kind and amount of data? Are there maybe database engines specifically tailored towards this, or having helpful functions (e.g. CouchDB indexes)?
EDIT: I'm looking for a scalable way to handle this data and access pattern. Things we considered: Run a query for each bucket, merge in app - probably way too slow. GROUP BY timestamp/granularity - does not count connections correctly. Preprocessing data into rows by smallest granularity and downsampling on query - probably the best way.
I think you can use MySQL timestamps for this.
The way I solved it in the end was to pre-process the data into per-minute buckets, so there's one row for every event and minute. That makes selecting easy and fast enough, and it yields correct results. To get a different granularity, you can do integer arithmetic on the timestamp columns: select floor(timestamp/factor)*factor and group by floor(timestamp/factor)*factor.
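As a sketch, assuming a hypothetical per-minute table connection_minutes(epoch_minute, user_id) and a factor of 60 (minutes per hourly bucket):
-- Snap each minute to the start of its hourly bucket and count distinct users
SELECT floor(epoch_minute / 60) * 60 AS bucket_start,
       COUNT(DISTINCT user_id) AS connected_users
FROM connection_minutes
GROUP BY bucket_start
ORDER BY bucket_start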
