I am trying to replicate the range bar chart shown here.
I've found this reference, but I don't fully grasp the code.
What I have is a series of tasks (sometimes accomplished over different periods):
let d = [("task1", DateTime.Parse("11/01/2014 08:30"), DateTime.Parse("12/01/2014 10:30"));
("task2", DateTime.Parse("15/01/2014 09:30"), DateTime.Parse("16/01/2014 10:30"));
("task3", DateTime.Parse("11/01/2014 08:30"), DateTime.Parse("16/01/2014 10:30"))]
let chart = d |> FSharp.Charting.Chart.RangeBar
chart.ShowChart()
I am struggling to understand the logic of the API.
I have also tried:
let chart = new Windows.Forms.DataVisualization.Charting.Chart(Dock = DockStyle.Fill)
let area = new ChartArea("Main")
chart.ChartAreas.Add(area)
let mainForm = new Form(Visible = true, TopMost = true, Width = 700, Height = 500)
mainForm.Controls.Add(chart)
let seriesColumns = new Series("NameOfTheSerie")
seriesColumns.ChartType <- SeriesChartType.RangeBar
type SupportToChart(serieVals: Series) =
    member this.addPointXY(lbl, [<ParamArray>] yVals: Object[]) =
        serieVals.Points.AddXY(lbl, yVals) |> ignore
let supporter = SupportToChart(seriesColumns)
supporter.addPointXY("AAA", DateTime.Parse("11/01/2014 08:30"), DateTime.Parse("12/01/2014 10:30") )
which results in
System.ArgumentOutOfRangeException: You can only set 1 Y values for
this data point.
Has something changed in the API since then?
I'm not entirely sure that F# Charting is currently powerful enough to reconstruct the above chart. However, one of the problems seems to be that it treats dates as float values (for some reason) and incorrectly guesses the ranges. You can at least see the chart if you use:
Chart.RangeBar(d)
|> Chart.WithYAxis(Min=41650.0, Max=41660.0)
Please submit this as an issue on GitHub. If you want to dig deeper into how F# Charting works and help us get this fixed, that would be amazing :-)
The trick is initializing the Series with
let serie = new Series("Range", yValues)
where yValues defines the max number of "Y-values".
Here is what I am doing:
chart.highlightValue(x: timeStampValue, dataSetIndex: totalCount)
In the above line:
timeStampValue is the x-axis value that I set while filling up the array.
totalCount is the total count of the data array I am displaying in the chart.
What I need to achieve: when the chart screen comes up, I need to display the marker by default, and for that I am using the chart's "highlightValue" method, which is not working.
Please let me know how to show the marker by default programmatically.
NOTE: I am using a custom marker UI, which works fine when I manually tap a point on the chart:
let marker = CustomMarkerView.viewFromXib()!
marker.chartView = chart
chart.marker = marker
chart.drawMarkers = true
Library used: https://github.com/danielgindi/Charts
Chart data set:
let data = CombinedChartData()
data.lineData = LineChartData(dataSets:[viewModel.lineChartDataSet, viewModel.emptylineChartDataSet])
data.lineData.highlightEnabled = true
viewModel.lineChartDataSet.highlightColor = AssetsColor.highlightedColor.color
viewModel.lineChartDataSet.drawHorizontalHighlightIndicatorEnabled = false
viewModel.lineChartDataSet.highlightLineDashPhase = 2
viewModel.lineChartDataSet.highlightLineDashLengths = [5, 2.5]
You are using the wrong value for the dataSetIndex parameter.
Based on your code, the data only contains two data sets:
data.lineData = LineChartData(dataSets:[viewModel.lineChartDataSet, viewModel.emptylineChartDataSet])
dataSetIndex is not the data count; in a line chart, a data set represents a line made up of many (x, y) entries, so dataSetIndex is more like "which line".
So your code should be something like this:
chart.highlightValue(x: timeStampValue, dataSetIndex: 0)
chart.highlightValue(x: timeStampValue, dataSetIndex: 0, dataIndex: 0)
When I added one more parameter, dataIndex, as 0, it worked.
Here, dataSetIndex is set to 0 because it is a CombinedChartView where I have merged two data sets.
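For reference, a minimal sketch of how the pieces fit together (the chart, data, marker, and timeStampValue objects are the ones shown above; the ordering of the calls is a suggestion, not something stated in the question):
// Sketch: set up the custom marker, assign the data, then highlight
// programmatically so the marker appears without a manual tap.
chart.marker = marker                 // CustomMarkerView from the question
chart.drawMarkers = true
chart.data = data                     // the CombinedChartData built above
// dataSetIndex: which line (0 = first data set); dataIndex: which data object
// inside the combined data (0 = the lineData here)
chart.highlightValue(x: timeStampValue, dataSetIndex: 0, dataIndex: 0)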
I have only just begun learning Swift and wanted to create a simple chart displaying some data.
I am creating a line chart using the AnyChart library and plotting a series of lines on the chart. I noticed that I am repeating pretty much all of the properties; the only thing that changes is the initial variable name.
How can I produce less code while keeping the variable names (series1, series2) intact?
let series1Mapping = set.mapAs(mapping: "{x: 'x', value: 'value'}")
let series2Mapping = set.mapAs(mapping: "{x: 'x', value: 'value2'}")
let series1 = chart.line(data: series1Mapping)
let series2 = chart.line(data: series2Mapping)
series1.name(name: data.seriesNames[0])
series1.hovered().markers().enabled(enabled: true)
series1.hovered().markers()
    .type(type: anychart.enums.MarkerType.CIRCLE)
    .size(size: 4)
series1.tooltip()
    .position(position: data.position)
    .anchor(anchor: anychart.enums.Anchor.LEFT_CENTER)
    .offsetX(offset: 5)
    .offsetY(offset: 5)
series2.name(name: data.seriesNames[1])
series2.hovered().markers().enabled(enabled: true)
series2.hovered().markers()
    .type(type: anychart.enums.MarkerType.CIRCLE)
    .size(size: 4)
series2.tooltip()
    .position(position: data.position)
    .anchor(anchor: anychart.enums.Anchor.LEFT_CENTER)
    .offsetX(offset: 5)
    .offsetY(offset: 5)
You can do it like this. It's just a rough idea; I don't know what the Series object and the mapping you are doing are, but you can have one function that returns a series and takes parameters to create that series:
func getSeries(number: Int, mapping: String) -> Series {
    let series = chart.line(data: set.mapAs(mapping: mapping))
    series.name(name: data.seriesNames[number])
    series.hovered().markers().enabled(enabled: true)
    series.hovered().markers()
        .type(type: anychart.enums.MarkerType.CIRCLE)
        .size(size: 4)
    series.tooltip()
        .position(position: data.position)
        .anchor(anchor: anychart.enums.Anchor.LEFT_CENTER)
        .offsetX(offset: 5)
        .offsetY(offset: 5)
    return series
}
And then create the series:
let series1 = getSeries(number: 0, mapping: "{x: 'x', value: 'value'}")
let series2 = getSeries(number: 1, mapping: "{x: 'x', value: 'value2'}")
If you want to make it even simpler, you can create the mapping string from the number as well; a rough sketch follows.
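For example (makeSeries is a hypothetical wrapper around the getSeries function above, and the field names "value" / "value2" come from the original mappings):
// Hypothetical wrapper: derive the mapping string from the series number
// ("value" for the first series, "value2" for the second, and so on).
func makeSeries(number: Int) -> Series {
    let field = number == 0 ? "value" : "value\(number + 1)"
    return getSeries(number: number, mapping: "{x: 'x', value: '\(field)'}")
}
let series1 = makeSeries(number: 0)
let series2 = makeSeries(number: 1)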
Thanks
Using Tiled, I generated a Lua file which contains a table. So I figured I'd write a for loop that cycles through the table, gets each tile id, checks whether collision is true, and adds collision if it is. But I've been unable to get the tile ids or check their properties; it returns an error saying that I tried to index a nil value (tileData).
Here is the map file:
return {
  version = "1.1",
  luaversion = "5.1",
  -- more misc. data
  tilesets = {
    {
      name = "Tileset1",
      firstgid = 1,
      tilewidth = 16,
      tileheight = 16,
      tiles = {
        {
          id = 0,
          properties = {
            ["Collision"] = false
          }
        },
      }
    }
  },
  layers = {
    {
      type = "tilelayer",
      name = "Tile Layer 1",
      data = {
        -- array of tile id's
      }
    }
  }
}
And here is the for loop I wrote to cycle through the table:
require("Protyping")
local map = love.filesystem.load("Protyping.lua")()
local tileset1 = map.tilesets
local tileData = tileset1.tiles
local colision_layer = map.layers[1].data
for y = 1, 16 do
    for x = 1, 16 do
        if tileData[colision_layer[x*y]].properties["Colision"] == true then
            world:add("collider "..x*y, x*map.tilewidth, y*tileheight, tilewidth, tileheight)
        end
    end
end
Try this:
tileset1 = map.tilesets[1]
instead of
tileset1 = map.tilesets
lhf's answer (map.tilesets[1] instead of map.tilesets) fixes the error you were getting, but there are at least two other things you'll need to fix for your code to work.
The first is consistent spelling: you have a Collision property in your map data and a Colision check in your code.
The second thing you'll need to fix is the way the individual tiles are being referenced. Tiled's layer data is 2-dimensional tile data laid out in a 1-dimensional array, left-to-right starting at the top, so for a map 4 tiles wide the index numbers run 1, 2, 3, 4 across the first row, 5, 6, 7, 8 across the second row, and so on.
You would think you could just do x * y to get the index, but if you look closely, you'll see that this doesn't work. Instead, you have to do x + (y - 1) * width.
Personally, I prefer 0-based x and y (though as I get more comfortable with Lua, that may change, since Lua has 1-based arrays). If you do go with 0-based x and y, the formula is x + 1 + y * width.
I happen to have just written a tutorial this morning that goes over the Tiled format and has some helper functions that do exactly this (using the 0-based formula). You may find it helpful: https://github.com/prust/sti-pg-example.
The tutorial uses Simple Tiled Implementation, which is a very nice library for working with Tiled Lua files. Since you're trying to do collision, I should mention that STI has plugins for both the bump collision library and the box2d (physics) collision library.
I'm fairly new to Swift, having only used Python and Pascal before. I was wondering if anyone could help with generating a floating point number in a range. I know that cannot be done directly, so this is what I've created. However, it doesn't seem to work.
func location() {
    // let DivisionConstant = UInt32(1000)
    let randomIntHeight = arc4random_uniform(1000000) + 12340000
    let randomIntWidth = arc4random_uniform(1000000) + 7500000
    XRandomFloat = Float(randomIntHeight / UInt32(10000))
    YRandomFloat = Float(randomIntWidth / UInt32(10000))
    randomXFloat = CGFloat(XRandomFloat)
    randomYFloat = CGFloat(YRandomFloat)
    self.Item.center = CGPointMake(randomXFloat, randomYFloat)
}
By the looks of it, when I run it, it is not dividing by the value of DivisionConstant, so I commented that out and replaced it with a raw value. However, self.Item still appears off screen. Any advice would be greatly appreciated.
This division probably isn't what you intended:
XRandomFloat = Float(randomIntHeight / UInt32(10000))
This performs integer division (discarding any remainder) and then converts the result to Float. What you probably meant was:
XRandomFloat = Float(randomIntHeight) / Float(10000)
This is a floating point number with a granularity of approximately 1/10000.
Your initial code:
let randomIntHeight = arc4random_uniform(1000000) + 12340000
generates a random number between 12340000 and (12340000 + 1000000 - 1). Given your final scaling, that means a range of 1234 to 1333. This seems odd given your final goal. I assume you really meant just arc4random_uniform(12340000), but I may be misunderstanding your goal.
Given your comments, I think you've over-complicated this. The following should give you a random point on the screen, assuming you want an integral (i.e. non-fractional) point, which is almost always what you'd want:
let bounds = UIScreen.mainScreen().bounds
let x = arc4random_uniform(UInt32(bounds.width))
let y = arc4random_uniform(UInt32(bounds.height))
let randomPoint = CGPoint(x: CGFloat(x), y: CGFloat(y))
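If a fractional value in a specific range is really what's needed, here is a minimal sketch along the same lines (the function name and the 1/10000 granularity are my own choices, not from the question):
import UIKit

// Sketch: a random CGFloat in [lower, upper) with roughly 1/10000 granularity,
// using the same arc4random_uniform approach as the question.
func randomCGFloat(lower: CGFloat, upper: CGFloat) -> CGFloat {
    let granularity: UInt32 = 10000
    let fraction = CGFloat(arc4random_uniform(granularity)) / CGFloat(granularity) // 0.0 ..< 1.0
    return lower + fraction * (upper - lower)
}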
Your problem is that you're adding the maximum value to your random value, so of course it's always going to be off-screen.
I'm not sure what numbers you're hoping to generate, but what you're getting are results like:
1317.0, 764.0
1237.0, 795.0
1320.0, 814.0
1275.0, 794.0
1314.0, 758.0
1300.0, 758.0
1260.0, 809.0
1279.0, 768.0
1315.0, 838.0
1284.0, 763.0
1273.0, 828.0
1263.0, 770.0
1252.0, 776.0
1255.0, 848.0
1277.0, 847.0
1236.0, 847.0
1320.0, 772.0
1268.0, 759.0
You're then using this as the center of a UI element. Unless it's very large, it's likely to be off-screen.
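One way to avoid that is to inset the random range by half the element's size so the whole frame stays on screen; a rough sketch, assuming self.Item is a UIView smaller than the screen (written in the same style as the question's code):
// Leave room for the item itself, then offset by half its size so the
// resulting center keeps the entire frame inside the screen bounds.
let bounds = UIScreen.mainScreen().bounds
let x = CGFloat(arc4random_uniform(UInt32(bounds.width - self.Item.bounds.width))) + self.Item.bounds.width / 2
let y = CGFloat(arc4random_uniform(UInt32(bounds.height - self.Item.bounds.height))) + self.Item.bounds.height / 2
self.Item.center = CGPoint(x: x, y: y)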
Here are the necessary code snippets.
X-axis label formatter:
NSDateFormatter dateFormatter = new NSDateFormatter();
dateFormatter.DateFormat = "dd/MM";
var timeFormatter = new CPTTimeFormatter(dateFormatter);
timeFormatter.ReferenceDate = NSDate.FromTimeIntervalSinceNow(0);
x.LabelFormatter = timeFormatter;
Delegate method for getting records:
public override NSNumber NumberForPlot(CPTPlot plot, CPTPlotField forFieldEnum, nuint index)
{
    if (forFieldEnum == CPTPlotField.ScatterPlotFieldX)
        return new NSNumber((index + 1) * 86400);
    Debug.WriteLine("{0}", Data[(int)index].Rate);
    return Data[(int)index].Rate;
}
See the attached screenshot for what the result looks like. You can see that the markers are not aligned to the X axis. The first data point should display at “01/01”, but it displays just before it. The same goes for all other points.
Let me know if anybody wishes to look at any other part of the code. I just need direction or a clue as to what could cause this shifting of records. I have already looked at the sample code provided with Core Plot but didn't find any clue about this.
Edit:
Ranges are as below:
plotSpace.XRange = new CPTPlotRange(NSNumber.FromDouble(0).NSDecimalValue, NSNumber.FromDouble(86400 * 9).NSDecimalValue);
plotSpace.YRange = new CPTPlotRange(NSNumber.FromDouble(-1).NSDecimalValue, NSNumber.FromDouble(9).NSDecimalValue);
Also tried:
var space = graph.DefaultPlotSpace as CPTXYPlotSpace;
space.ScaleToFitPlots(new CPTPlot [] { dataSourceLinePlot });
Edit: Graph setup code
The problem is with the ReferenceDate for the time formatter. It is being initialized with the current date and time, so the offset will vary throughout the day depending on when the setup code runs. There are several ways to make an NSDate object that corresponds to a fixed time of day. The most straightforward is by using NSDateComponents.
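For illustration, a minimal sketch of building such a fixed reference date from date components (shown in Swift; the question uses Xamarin C#, where the equivalent NSDateComponents/NSCalendar types are also available):
import Foundation

// Sketch: a reference date pinned to midnight on 1 Jan 2014 GMT (the specific
// date is an assumption; use whatever epoch your data expects) instead of
// NSDate.FromTimeIntervalSinceNow(0), which shifts every time the code runs.
var components = DateComponents()
components.year = 2014
components.month = 1
components.day = 1
var calendar = Calendar(identifier: .gregorian)
calendar.timeZone = TimeZone(secondsFromGMT: 0)!
let referenceDate = calendar.date(from: components)! as NSDate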
Several of the Core Plot example apps, including the "Date Plot" demo in the Plot Gallery app, use this technique to generate reference dates.
Also, the automatic axis labeling policy doesn't work well with dates. It's picking tick locations that fall on "nice" numbers of seconds between ticks, but that doesn't correspond to even numbers of days. You should use the fixed interval policy (the default) or one of the ones that let you provide the tick locations directly.