Distribution chart not showing bars in AnyLogic - histogram

So I've set up a simulation to measure the time it would take for a ship to get from Houston to Sydney. I put a Time Measure Start and a Time Measure End around the moveTo block and made a distribution chart using a histogram. But when I run the simulation, the bars don't show up. The numbers are still there, and the rest of the histograms are working properly, but there are no bars on this particular one. Here is a screenshot of the simulation; the histogram in question is the second one.
#Emile Zankoul

Can't be 100% sure why.
Try copying and pasting one of the two histograms that work, then change the time measure name in the new histogram's properties.
I think the most likely cause is that you have "Show PDF" unchecked: make sure it is checked.
After reviewing the model, I realized that the total measured duration is over 1,000,000 seconds. There seems to be a limit on the maximum value the histogram can handle. To resolve this, change your model time units from seconds to something larger (e.g. minutes or hours).
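For a sense of scale (the 15-day voyage below is just an assumed figure for illustration, not something taken from the model), the arithmetic behind the fix looks like this in plain Java:

public class TimeUnitCheck {
    public static void main(String[] args) {
        double limitSeconds = 1_000_000;            // rough ceiling the histogram appears to hit
        System.out.println(limitSeconds / 3600);    // ~277.8 hours, i.e. about 11.6 days
        double tripSeconds = 15 * 24 * 3600.0;      // hypothetical 15-day voyage = 1,296,000 s
        System.out.println(tripSeconds / 3600);     // 360 hours -- comfortably below the limit
    }
}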

Related

Image similarity algorithm for charts

For automating tests of a legacy Windows application, I need to compare screenshots of charts.
Pixel comparison works fine as long as the Windows session opens with the same resolution, DPI, color depth, font size, font family, etc.; otherwise the screenshot taken during the test may differ slightly from the one recorded during the development of the test.
Therefore, I am looking for a method that allows slight variations and produces a score rather than a boolean.
I started by scaling the retrieved screenshot to match the size of the recorded one. Of course, pixel comparison then fails.
Then I tried using SSIM to get a similarity score (using https://github.com/rhys-e/structural-similarity). It definitely does not work for my case -- see the simplified experiment below.
Any suggestions?
Thanks in advance,
Adrian.
SSIM experiments
This is the reference picture:
This one contains a black line slightly above the reference --> getting 0.9447093986742424
This one, completely different --> getting 0.9516260505445076
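To frame the problem, here is a minimal Java sketch of the scale-then-compare baseline described above, producing a score in [0, 1] instead of a boolean. The file names are placeholders, and like any per-pixel metric it still penalises small shifts, which is exactly the weakness that motivated trying SSIM in the first place:

import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class ChartDiff {

    // Similarity in [0, 1]: 1 means identical pixels, 0 means maximal difference.
    static double similarity(BufferedImage recorded, BufferedImage retrieved) {
        // Scale the retrieved screenshot to the recorded one's size.
        BufferedImage scaled = new BufferedImage(
                recorded.getWidth(), recorded.getHeight(), BufferedImage.TYPE_INT_RGB);
        Graphics2D g = scaled.createGraphics();
        g.drawImage(retrieved, 0, 0, recorded.getWidth(), recorded.getHeight(), null);
        g.dispose();

        long diff = 0;
        for (int y = 0; y < recorded.getHeight(); y++) {
            for (int x = 0; x < recorded.getWidth(); x++) {
                int a = recorded.getRGB(x, y);
                int b = scaled.getRGB(x, y);
                diff += Math.abs(((a >> 16) & 0xFF) - ((b >> 16) & 0xFF))   // red
                      + Math.abs(((a >> 8) & 0xFF) - ((b >> 8) & 0xFF))     // green
                      + Math.abs((a & 0xFF) - (b & 0xFF));                  // blue
            }
        }
        double maxDiff = 255.0 * 3 * recorded.getWidth() * recorded.getHeight();
        return 1.0 - diff / maxDiff;
    }

    public static void main(String[] args) throws Exception {
        BufferedImage recorded = ImageIO.read(new File("recorded.png"));    // placeholder path
        BufferedImage retrieved = ImageIO.read(new File("retrieved.png"));  // placeholder path
        System.out.println("similarity = " + similarity(recorded, retrieved));
    }
}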

Why is Contour Series much slower when using filled mode

I created a TeeChart with two TContourSeries: one with Filled set to false and one with it set to true. Both get exactly the same data, and neither is active while the data is being fed to them.
When I activate the non-filled series, it takes less than a second to paint itself. Not so the filled series: it takes at least 10 times longer to draw itself.
Why is that? I would imagine that the filled series uses the same algorithm as the non-filled one, followed by some sort of flood fill. That should not take that long.
Is there a way to speed up painting of the series in filled mode? Data reduction is not an option here.
I'm copying the reply from here.
I'm afraid the filling of the contour series isn't as simple as one might think at first glance.
We are internally using a TIsoSurface to draw the cells, which makes the process slower.
Some references:
https://www.steema.com/support/viewtopic.php?f=3&t=12326
https://www.steema.com/support/viewtopic.php?f=3&t=10796
http://bugs.teechart.net/show_bug.cgi?id=1438

Gun Accuracy Leaderboard Calculation

I'm working on a leaderboard system for Garry's Mod and I have run into a slight problem to do with one of the statistics I'm tracking.
I am tracking a lot of statistics, including the number of bullets shot and the number of bullets that actually hit, and am using that information to work out the accuracy of the player like so:
(gunHits / gunShots) * 100
The problem with the way I'm doing it is that people can get to the top of the leaderboards by just logging on, shooting one bullet, and hitting someone with it, thereby getting an accuracy of 100%.
Is there any way I can get around this?
You could set a minimum number of shots fired required to be ranked. I think 100 would be a good number.
On a side note, you don't need any parentheses in that expression, since division and multiplication have the same precedence and they are already in the right order. Parentheses would only be necessary if you wrote it this way:
100 * (gunHits / gunShots)
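Here is a minimal sketch of that minimum-shots rule, written in Java just to illustrate the logic (a Garry's Mod addon would be in Lua, and the names and the 100-shot threshold are only the suggestion above):

public class Accuracy {
    static final int MIN_SHOTS = 100;   // minimum shots fired before a player is ranked

    // Returns the accuracy percentage, or -1 if the player hasn't fired enough shots yet.
    static double rankedAccuracy(int gunHits, int gunShots) {
        if (gunShots < MIN_SHOTS) {
            return -1;                      // not enough data: leave the player off the leaderboard
        }
        return 100.0 * gunHits / gunShots;  // 100.0 forces floating point division
    }

    public static void main(String[] args) {
        System.out.println(rankedAccuracy(1, 1));     // -1.0: one lucky hit doesn't count
        System.out.println(rankedAccuracy(63, 150));  // 42.0: ranked normally
    }
}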

ImageJ round method not working properly?

I've been testing this for about an hour and I don't understand what's going on.
In ImageJ, if I say:
i = 3.5;
print(round(i));
I get 4.
However, if I say:
print(65535/(27037-4777)*(26295-4777));
print(round(65535/(27037-4777)*(26295-4777)));
For some reason, I am getting:
63350.5
63350
Shouldn't it round up to 63351?
Taking a look at your comments, the number generated by that calculation is actually 63350.499999..., so when you try to round, it gets rounded down and you get 63350. One thing I can suggest is to add a small constant; it may seem like a hack, but it resolves situations like this. You want to make it small enough that it pushes a fractional part hovering just under 0.5 over the 0.5 mark so it rounds up, but doesn't interfere with how round works for other fractional parts.
The Java API has a function called Math.ulp that returns the distance from a particular floating point number to the next representable value. However, because ImageJ doesn't expose this, consider adding something small like 1e-5. This may seem like a foolish hack, but it will avoid situations like the one you're experiencing now. The constant you're adding should also not affect how round works in ImageJ for other values.
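Since the ImageJ macro language does its arithmetic with the same IEEE 754 doubles as Java, the effect and the workaround can be sketched in plain Java (the exact digits printed may differ slightly, but the value lands just below 63350.5):

public class RoundCheck {
    public static void main(String[] args) {
        // The division happens first, so the intermediate result is slightly inexact
        // and the final product ends up just under 63350.5.
        double v = 65535.0 / (27037 - 4777) * (26295 - 4777);
        System.out.println(v);                     // something like 63350.49999999999
        System.out.println(Math.round(v));         // 63350 -- rounds down
        System.out.println(Math.round(v + 1e-5));  // 63351 -- the small constant pushes it past .5
    }
}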
tl;dr: Add a small constant to your number (like 1e-5), then round. This shouldn't affect how round works overall; it simply pushes numbers whose fractional component is hovering just under the 0.5 mark to be truly over 0.5.
Good luck!

How to profile on iOS?

I'm using Instruments to profile the CPU activity of an iOS game.
The problem I'm having is that I'm not entirely sure what the data I'm looking at represents.
Here is the screen I see after running my game for a couple of minutes.
I can expand the call tree to see exactly which methods are using the most CPU time. I'm unsure whether this data represents CPU usage for the entire duration the profiler was running or just at that point in time.
I've tried running the slider along the timeline to see what effect that has on the numbers, and it doesn't seem to have any. That leads me to believe the data represents CPU usage for the whole duration the game was running.
If this is the case, is it possible to see CPU usage at a particular point in time? There are a few spikes along the timeline, and I would like to see exactly what was happening at those times to see whether there are any improvements I can make.
Thanks in advance for any responses.
To select a time range, use the "inspection range" buttons at the top of the window (to the left of the stopwatch).
First select the start of the range by clicking on the graph ruler, then press the leftmost button to set the left edge. Then select the end of the range on the graph ruler and press the rightmost button to set the right edge.
