I've got a spreadsheet that's automatically being populated by a form. But when I try to make graphs out of the data I always get some kind of error...
The timeline diagram should support date/time format, but I've even tried it with just a date format, converting it to decimals using =DateValue(), other graph types, ...
This is a screenshot of the data and error
The data in the timestamp column is date/time and the data in the time column is a number.
Yet the chart isn't rendering...
Timestamp time
23-12-2020 9:31:44 0.16
23-12-2020 11:06:08 0.75
23-12-2020 11:55:27 0.24
23-12-2020 12:14:30 0.12
23-12-2020 15:18:25 0.73
23-12-2020 17:17:46 0.6
24-12-2020 13:33:49 0.16
24-12-2020 13:51:57 0.01
24-12-2020 15:28:08 1.21
24-12-2020 17:38:36 0.11
24-12-2020 23:40:46 0.15
25-12-2020 11:34:45 0.13
25-12-2020 15:51:53 0.16
25-12-2020 16:08:12 0.06
26-12-2020 11:01:35 0.75
26-12-2020 11:52:03 0.18
26-12-2020 12:24:22 0.15
Can anyone help me out here?
Copy of the sheet:
https://docs.google.com/spreadsheets/d/1qJrC55_EPcTZ7nMPsscU69MLPVclImjBq3Ij-nG_P7I/edit?usp=sharing
The issue is with your B column. You are using Netherlands locale settings, where:
0.16 > not a number
0,16 > a valid number
Now you have two options. You either:
change locale to United Kingdom or USA
delete the chart
select column B and format it as Number
select column A and format it as Date time
create a timeline chart with range A:B
or you can:
change the dot . to a comma , in column B (see the formula sketch after this list)
delete the chart
format B column as Numbers
format A column as Date time
create a timeline chart with range A:B
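If you prefer to do the dot-to-comma conversion with a formula instead of by hand, a minimal sketch (assuming the numbers sit in B2:B18, the locale stays on Netherlands, and the result goes into a helper column that you then chart instead of B):
=ARRAYFORMULA(IFERROR(VALUE(SUBSTITUTE(B2:B18, ".", ","))))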
This is happening because the locale is not properly defined. Go to File > Settings and change Locale to your language. This will ensure that Sheets understands your datetime format. You may need to re-enter the data for it to be detected properly. If you entered the data without formatting, you'll know that it was read properly because the datetime will be aligned to the right.
References
Set a spreadsheet’s location & calculation settings
Related
I have an array of dates and values and want to calculate, with a formula, at what date a certain value will be reached or exceeded.
Example:
1/1/2022 10
1/10/2022 13
1/20/2022 16
1/30/2022 19
At what date will 50 be reached?
GS has formulas to forecast the value for a date, but I know the value - I need the date.
Any help appreciated.
A4 is the first date, A4:A7 are the dates, B4:B7 are the values, 365 is how many days I want to plot out into the future, and 50 is the value you asked to find the date for.
=vlookup(50,arrayformula({Growth(B4:B7, A4:A7-$A$4, sequence(365)-1),sequence(365)+$A$4-1}),2)
50 will be reached on 10th May 2022...
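If the series is expected to grow linearly rather than exponentially, a hedged variant of the same construction swaps GROWTH for TREND (same arguments, linear fit):
=VLOOKUP(50, ARRAYFORMULA({TREND(B4:B7, A4:A7-$A$4, SEQUENCE(365)-1), SEQUENCE(365)+$A$4-1}), 2)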
try TREND function:
=TREND(A1:A4; B1:B4; 50)
or FORECAST:
=FORECAST(50, A1:A4, B1:B4)
or GROWTH:
=GROWTH(A1:A4, B1:B4, 50)
or LOGEST, LINEST based on your specific project needs
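Note that these formulas return a date serial number, which may display as a plain number; assuming dates in A1:A4 and values in B1:B4 as above, wrapping the result in TO_DATE should render it as a date, e.g.:
=TO_DATE(FORECAST(50, A1:A4, B1:B4))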
I have a list of dates in a Google Sheets file that comprises only weekdays. Finding MAX Date from this list is easy, as is finding Max Date - 1 Day.
The issue I need to solve is when Max date is a Monday, which results in no data for Max Date - 1 Day (i.e. a Sunday).
I suspect that IF/THEN statements can handle this occurrence, but does a more programmatic approach exist using MAX Date? There are solutions for this scenario in Python and SQL, but I found nothing for Google Sheets.
Use WORKDAY with an offset of -1.
Formula:
=WORKDAY(MAX(A1:A), -1)
Output: the last working day before the latest date (the preceding Friday when the latest date falls on a Monday).
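If the date list also skips holidays, WORKDAY accepts an optional holidays range as a third argument; a minimal sketch, assuming the holiday dates are kept in a hypothetical range C1:C:
=WORKDAY(MAX(A1:A), -1, C1:C)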
Given a time series of (electricity) market data with data points every hour, I want to show a bar graph with all-time / time-frame averages for every hour of the data, so that an analyst can easily compare actual prices to all-time averages (which hour of the day is most/least expensive).
We have CrateDB as the backend, which is used in Grafana just like a Postgres source.
SELECT
extract(HOUR from start_timestamp) as "time",
avg(marketprice) as value
FROM doc.el_marketprices
GROUP BY 1
ORDER BY 1
So my data basically looks like this
time value
23.00 23.19
22.00 25.38
21.00 29.93
20.00 31.45
19.00 34.19
18.00 41.59
17.00 39.38
16.00 35.07
15.00 30.61
14.00 26.14
13.00 25.20
12.00 24.91
11.00 26.98
10.00 28.02
9.00 28.73
8.00 29.57
7.00 31.46
6.00 30.50
5.00 27.75
4.00 20.88
3.00 19.07
2.00 18.07
1.00 19.43
0 21.91
After hours of fiddling around with bar graphs, histogram mode, the heatmap panel and much more, I am just not able to draw a simple hours-of-the-day histogram with this in Grafana. I would very much appreciate any advice on how to use any panel to get this accomplished.
Your query doesn't return correct time series data for Grafana - the time field is not a valid timestamp. So:
don't extract only the hour, but provide the full start_timestamp (I hope it is a timestamp data type and the value is in UTC)
add a WHERE time condition - use Grafana's macro $__timeFilter
use Grafana's macro $__timeGroupAlias for hourly grouping
SELECT
$__timeGroupAlias(start_timestamp,1h,0),
avg(marketprice) as value
FROM doc.el_marketprices
WHERE $__timeFilter(start_timestamp)
GROUP BY 1
ORDER BY 1
This will give you data for a historic graph with hourly avg values.
The required histogram may be tricky, but you can try to create a metric which has the extracted hour, e.g.
SELECT
$__timeGroupAlias(start_timestamp,1h,0),
extract(HOUR from start_timestamp) as "metric",
avg(marketprice) as value
FROM doc.el_marketprices
WHERE $__timeFilter(start_timestamp)
GROUP BY 1
ORDER BY 1
And then visualize it as a histogram. Remember that Grafana is designed for time series data, so you need a proper timestamp (not only extracted hours; eventually you can fake it), otherwise you will have a hard time visualizing non time series data in Grafana. This 2nd query may not work properly, but it at least gives you the idea.
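If CrateDB complains that the metric column must appear in the GROUP BY, a hedged variant of the same query (not tested against CrateDB) groups by both select positions:
SELECT
$__timeGroupAlias(start_timestamp,1h,0),
extract(HOUR from start_timestamp) as "metric",
avg(marketprice) as value
FROM doc.el_marketprices
WHERE $__timeFilter(start_timestamp)
GROUP BY 1, 2
ORDER BY 1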
I'm using Tableau Desktop, my data are like this:
KPI,date,monthValue
coffee break,01/06/2015,10.50
coffee break,01/07/2015,8.30
and I want to build a table like this
KPI, year(date), last value
coffee time, 2015, 8.30
How can I set a calculated field in order to show me the last value available in that year? I tried to do:
LOOKUP([MonthValue], LAST())
But it didn't work and told me 'cannot mix aggregate and non-aggregate', so I did:
LOOKUP(sum([MonthValue]), LAST())
But that didn't work either. How should I proceed?
If you are using Tableau 9 then you can do this with an LOD calc that looks for the max value in your date field and then checks if the current date value is the same as the max date value.
[Date] == {fixed: max([Date])}
When you use the calc as a filter you will only get the last row from your example above.
UPDATE: to get the values per year you can do something like:
Here I am using a table calculation to find the max date per year and then ranking those dates and filtering down to the latest date in each year (which will be the one that has a rank equal to 1).
!max date is WINDOW_MAX(ATTR(Date))
!rank is RANK(Date)
You need to make sure that the table calculations are computed in the correct way (in this case across the values of each year).
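An alternative sketch without table calculations, assuming field names as in the question ([date], [monthValue]) and that a per-year LOD filter is acceptable: create a boolean calc like the one below, filter it to True, and then show [monthValue] for the remaining rows.
[date] = {FIXED YEAR([date]) : MAX([date])}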
I'm trying to display a simple table view in iOS with data from SQLite. My database date is stored as a timestamp. I thought it was a unix timestamp, but when I try to use dateWithTimeIntervalSince1970 I get really strange results.
Examples of date rows stored:
1352208510267
1352208512266
1352208514266
1352208516266
1352208530266
1352208532265
Use a query like this
SELECT datetime(timestamp, 'unixepoch') from YOURTABLENAME
WHERE id = someId;
This should convert it to some readable value.
Have a look here
I found the answer here. I compared the results with the previous answers:
SELECT strftime('%Y-%m-%d %H:%M:%S', datetime(ZDATE+978307200, 'unixepoch', 'localtime')), datetime(ZDATE, 'unixepoch', 'localtime') FROM ZTABLE
The query with the adjustment for Apple's epoch (Jan 1 2001) gives me the correct date:
"2015-09-29 20:50:51", "1984-09-28 20:50:51"
"2015-09-29 21:03:10", "1984-09-28 21:03:10"
"2015-09-29 21:25:30", "1984-09-28 21:25:30"
Unix timestamps are defined as the number of seconds since Jan 1 1970.
Just now, this would be about 1365525702.
Your values are one thousand times larger, i.e., they are measured in milliseconds.
Decide whether you actually need the millisecond precision, and then add * 1000 or / 1000 at the appropriate places.
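For example, to read the stored millisecond values back as dates directly in SQLite (table and column names as in the earlier answer), something like this should work:
SELECT datetime(timestamp / 1000, 'unixepoch') FROM YOURTABLENAME;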