Query influxdb for a date - influxdb

I have a table in InfluxDB that has a column called 'expirydate'. In the column I have a few dates, e.g. "2016-07-14" or "2016-08-20". I want to select only the 2016-07-14 date, but I am unsure how.
My query is currently:
SELECT * FROM tablee where expirydate = '2016-07-14' limit 1000
But this does not work. Can someone please help me?

Assuming tablee is a valid measurement...
If you are looking to select all of the points for the day '2016-07-14', then your query should look something like this.
Query:
SELECT * FROM tablee where time >= '2016-07-14 00:00:00' and time < '2016-07-15 00:00:00'
You might also be interested in InfluxDB's date time strings for use in queries.
See:
https://docs.influxdata.com/influxdb/v0.9/query_language/data_exploration/#relative-time
Date time strings: Specify time with date time strings. Date time
strings can take two formats: YYYY-MM-DD HH:MM:SS.nnnnnnnnn and
YYYY-MM-DDTHH:MM:SS.nnnnnnnnnZ, where the second specification is
RFC3339. Nanoseconds (nnnnnnnnn) are optional in both formats.
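For example, both of these refer to the same instant (a sketch against the tablee measurement from the question; the second form is RFC3339, and the nanoseconds could be dropped):
SELECT * FROM tablee where time >= '2016-07-14 00:00:00.000000000'
SELECT * FROM tablee where time >= '2016-07-14T00:00:00.000000000Z'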
Note:
The LIMIT clause may be redundant in your original query, as it only restricts the query from returning more than 1,000 points.

I had to force InfluxDB to treat my 'string date' as a string. This works:
SELECT * FROM tablee where expirydate =~ /2016-07-14/ limit 1000;

Related

Between operator in influxDB

I want to select records between two date-time string values in InfluxDB. Example:
select value from series where time between start_time and end_time
I am querying like this:
"select value from series time >= start_time and time <=end_time"
Is this correct?
I have answered this question here.
Let me know if it solves your problem.
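For reference, InfluxQL has no BETWEEN operator, so a time-bounded query is usually written with where and two comparison operators. A minimal sketch, assuming series is a valid measurement and the bounds are RFC3339 strings:
select value from series where time >= '2016-07-14T00:00:00Z' and time <= '2016-07-15T00:00:00Z'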

Data retrieving from sqlite DB between two dates - using objective c

I am using the below query with date filtering, but I am getting the wrong result.
SELECT * FROM TRANSACTIONSHISTORY
WHERE DATE > "29-01-2015 12:00:00"
AND DATE < "30-01-2015 00:00:00" AND USERID=abc
I am getting rows with a date value of 29-Jan-2016 in the results. What am I missing here? Can anyone help me sort this out?
The date format in your SQL will not work because SQLite doesn't have a native datetime type, so dates are generally stored either as a string in YYYY-MM-DD HH:MM:SS.SSS format, or as a numeric value representing the number of seconds since 1970-01-01 00:00:00 UTC. See date and time types on SQLite.org. Note that with the string representation the sequence is year, month, day, so when sorting or querying this field the alphanumeric string sorts correctly by year first, then month, then day, which is critical for queries like yours.
If you really did store dates in the database as strings in DD-MM-YYYY HH:MM:SS format, you should consider converting the saved values into one of the accepted date formats. It will make date interactions with the database much, much easier and allow queries like the one you asked for (though, obviously, with DD-MM-YYYY replaced by the YYYY-MM-DD format).
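As a sketch of that conversion, assuming the stored values are exactly 'DD-MM-YYYY HH:MM:SS' strings and reusing the table and column names from the question (USERID is assumed to be text):
-- one-time rewrite of 'DD-MM-YYYY HH:MM:SS' into 'YYYY-MM-DD HH:MM:SS'
UPDATE TRANSACTIONSHISTORY
SET DATE = substr(DATE, 7, 4) || '-' || substr(DATE, 4, 2) || '-' || substr(DATE, 1, 2) || substr(DATE, 11);
-- afterwards, plain string comparison filters chronologically
SELECT * FROM TRANSACTIONSHISTORY
WHERE DATE > '2015-01-29 12:00:00'
  AND DATE < '2015-01-30 00:00:00'
  AND USERID = 'abc';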
You have to cast your string to a date (note that SQLite's datetime() only understands ISO-style, year-first strings):
SELECT * FROM TRANSACTIONSHISTORY WHERE DATE BETWEEN datetime('2015-01-29 12:00:00') AND datetime('2015-01-30 00:00:00') AND USERID = 'abc'
The first answer is exactly what you need. What you did in your code would be comparing strings using ASCII values.
I would recommend using Unix timestamps like 1453818208, which are easier to save and compare. In addition, they can always be translated back to human-readable dates like 29-01-2015 12:00:00.
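A sketch of that approach, assuming the DATE column holds integer Unix timestamps and reusing the table and column names from the question (USERID is assumed to be text, and the inserted row is hypothetical):
-- store the timestamp at insert time
INSERT INTO TRANSACTIONSHISTORY (USERID, DATE)
VALUES ('abc', CAST(strftime('%s', '2015-01-29 12:00:00') AS INTEGER));
-- query by comparing integers instead of strings
SELECT * FROM TRANSACTIONSHISTORY
WHERE DATE > CAST(strftime('%s', '2015-01-29 12:00:00') AS INTEGER)
  AND DATE < CAST(strftime('%s', '2015-01-30 00:00:00') AS INTEGER)
  AND USERID = 'abc';
-- translate back to a human-readable date when displaying
SELECT datetime(DATE, 'unixepoch') FROM TRANSACTIONSHISTORY;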
I hope this helps you :)
Try this: first try without the time, after that try both date and time. Hope it will work for you.
SELECT * FROM TRANSACTIONSHISTORY
WHERE DATE
BETWEEN '2010-11-15'
AND '2015-01-30'
-- you can try this one also
SELECT * FROM TRANSACTIONSHISTORY WHERE DATE BETWEEN "2011-01-11" AND "2011-08-11"

Earliest date after current date without associated record

I have a Rails model DailyAssignment with a date column, and would like to find the first date after today which does not have a DailyAssignment associated with it.
For instance, if I have an instance today, no instance tomorrow, and an instance the day after tomorrow, this method should return tomorrow.
If I were to do this in Ruby, it would be something like:
(Date.today..1.year.since.to_date).find do |date|
  DailyAssignment.where(date: date).empty?
end
This is sort of okay since it stops iterating once it finds a date with no record, but it has two issues:
Iterating through a collection in Ruby is slow.
Barring some sort of while construct, I need to specify an 'end' date.
Is there a nice, efficient way to do this in PostgreSQL?
If you can, you should use a custom query to search through your database (these kinds of searches are a lot faster within the DB).
If you search for a date within a time range, you can use the
generate_series(timestamp, timestamp, interval) function:
select s
from generate_series(?, ? + interval '1 year', interval '1 day') s
left join daily_assignment on s = "date"
where "date" is null
limit 1
If you have no real upper bound, you can use a self-join to get the next free date:
select coalesce(
(select c."date" + interval '1 day'
from daily_assignment c
left join daily_assignment n on n."date" = c."date" + interval '1 day'
where c."date" > ? - interval '1 day'
and n."date" is null
order by c."date"
limit 1),
? + interval '1 day'
)
The ? marks are placeholders for today's date (you may need casts, depending on your input); you could use now() instead, if you prefer.
P.S.: please, do not use date as a column name, it is a reserved word in SQL, and tells nothing about the column itself. Instead, you can use names like created_at, updated_at, happens_at, etc. or even at_date.
What I propose is to do one SELECT query between the two dates, then loop over the candidate dates and compare them with the selected results.
# select all daily assignments in the date range
results = DailyAssignment.where("date >= ? AND date <= ?", from_date, to_date)
not_found_dates = []
(Date.today..1.year.since.to_date).each do |date|
  found_assignment = results.detect { |instance| instance.date == date }
  not_found_dates << date if found_assignment.nil?
end
You can try it this way:
def first_date_without_assignment
  assignments = DailyAssignment.select('date').where('date > ?', Date.today).order('date')
  return Date.tomorrow if assignments.empty?
  assignment_dates = assignments.map(&:date)
  date_range = (Date.tomorrow..assignment_dates.last.advance(days: 1)).to_a
  (date_range - assignment_dates).first
end
I didn't test it so I could have mistyped something, but it should work. I also found this, which should work on Postgres: http://www.postgresql.org/message-id/4F96EC90.6070600#encs.concordia.ca, but it could be quite hard to write in Rails, or at least end up looking bad.

sqlite Date Sorting

I am parsing a file into a sqlite database that contains dates in the YYYY-MM-DD format. I want to store the entries into sqlite in such a way that I can sort the entries by date (strings are not cutting it). What is the normal protocol for storing and ordering dates in sqlite? Should I convert the dates into a number? Is there a way to convert YYYY-MM-DD dates into timestamps?
SQLite supports "DATE" in table creation. (More about that later.)
CREATE TABLE test (dt DATE PRIMARY KEY);
INSERT INTO "test" VALUES('2012-01-01');
INSERT INTO "test" VALUES('2012-01-02');
INSERT INTO "test" VALUES('2012-01-03');
SELECT dt FROM test ORDER BY dt;
2012-01-01
2012-01-02
2012-01-03
Values in the form yyyy-mm-dd sort correctly as either a string or a date. That's one reason yyyy-mm-dd is an international standard.
But SQLite doesn't use data types in the way most database workers expect. Data storage is based on storage classes instead. For example, SQLite allows this:
INSERT INTO test VALUES ('Oh, bugger.');
SELECT * FROM test ORDER BY dt;
2012-01-01
2012-01-02
2012-01-03
Oh, bugger.
It also allows different date "formats" (actually, values) in a single column. Its behavior is quite unlike standard SQL engines.
INSERT INTO test VALUES ('01/02/2012');
SELECT * FROM test ORDER BY dt;
01/02/2012
2012-01-01
2012-01-02
2012-01-03
Oh, bugger.
You don't have to do anything special to store a timestamp in a date column. (Although I'd rather see you declare the column as timestamp, myself.)
INSERT INTO test VALUES ('2012-01-01 11:00:00');
SELECT * FROM test ORDER BY dt;
2012-01-01
2012-01-01 11:00:00
2012-01-02
2012-01-03
Oh, bugger.
SQLite will try to do the Right Thing as long as you feed consistent data into it. And it will sort dates correctly if you use the standard format.
Instead of storing the date in "YYYY-MM-DD" format, store the timestamp of that date; that will let you sort the table.
If you want the current timestamp, use
SELECT strftime('%s','now');
If you want the timestamp for a YYYY-MM-DD date, use
SELECT strftime('%s','YYYY-MM-DD');
where %s = seconds since 1970-01-01
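For example (the value assumes the date is interpreted as midnight UTC):
SELECT strftime('%s','2012-01-03');  -- 1325548800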
I have the date field stored in DD/MM/YYYY format.
To sort by date (the date field is a string), I have to convert it before ordering:
select (substr(date, 7, 4) || '-' || substr(date, 4, 2) || '-' || substr(date, 1, 2)) as new_date from work_hour order by new_date desc

Is it possible to search for dates as strings in a database-agnostic way?

I have a Ruby on Rails application with a PostgreSQL database; several tables have created_at and updated_at timestamp attributes. When displayed, those dates are formatted in the user's locale; for example, the timestamp 2009-10-15 16:30:00.435 becomes the string 15.10.2009 - 16:30 (the date format for this example being dd.mm.yyyy - hh.mm).
The requirement is that the user must be able to search for records by date, as if they were strings formatted in the current locale. For example, searching for 15.10.2009 would return records with dates on October 15th 2009, searching for 15.10 would return records with dates on October 15th of any year, searching for 15 would return all dates that match 15 (be it day, month or year). Since the user can use any part of a date as a search term, it cannot be converted to a date/timestamp for comparison.
One (slow) way would be to retrieve all records, format the dates, and perform the search on that. This could be sped up by retrieving only the id and dates at first, performing the search, and then fetching the data for the matching records; but it could still be slow for large numbers of rows.
Another (not database-agnostic) way would be to cast/format the dates to the right format in the database with PostgreSQL functions or operators, and have the database do the matching (with the PostgreSQL regexp operators or whatnot).
Is there a way to do this efficiently (without fetching all rows) in a database-agnostic way? Or do you think I am going in the wrong direction and should approach the problem differently?
Building on the answer from Carlos, this should allow all of your searches without full table scans if you have indexes on all the date and date part fields. Function-based indexes would be better for the date part columns, but I'm not using them since this should not be database-specific.
CREATE TABLE mytable (
col1 varchar(10),
-- ...
inserted_at timestamp,
updated_at timestamp);
INSERT INTO mytable
VALUES
('a', '2010-01-02', NULL),
('b', '2009-01-02', '2010-01-03'),
('c', '2009-11-12', NULL),
('d', '2008-03-31', '2009-04-18');
ALTER TABLE mytable
ADD inserted_at_month integer,
ADD inserted_at_day integer,
ADD updated_at_month integer,
ADD updated_at_day integer;
-- you will have to find your own way to maintain these values (one option is sketched below)...
UPDATE mytable
SET
inserted_at_month = date_part('month', inserted_at),
inserted_at_day = date_part('day', inserted_at),
updated_at_month = date_part('month', updated_at),
updated_at_day = date_part('day', updated_at);
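One way to keep those derived columns in sync, if a PostgreSQL-specific piece is acceptable here, is a trigger. A sketch (the function and trigger names are made up):
CREATE FUNCTION mytable_set_date_parts() RETURNS trigger AS $$
BEGIN
  -- recompute the derived month/day columns from the timestamp columns
  NEW.inserted_at_month := date_part('month', NEW.inserted_at);
  NEW.inserted_at_day   := date_part('day', NEW.inserted_at);
  NEW.updated_at_month  := date_part('month', NEW.updated_at);
  NEW.updated_at_day    := date_part('day', NEW.updated_at);
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;
CREATE TRIGGER trg_mytable_date_parts
BEFORE INSERT OR UPDATE ON mytable
FOR EACH ROW EXECUTE PROCEDURE mytable_set_date_parts();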
If the user enters only Year use WHERE Date BETWEEN 'YYYY-01-01' AND 'YYYY-12-31'
SELECT *
FROM mytable
WHERE
inserted_at BETWEEN '2010-01-01' AND '2010-12-31'
OR updated_at BETWEEN '2010-01-01' AND '2010-12-31';
If the user enters Year and Month use WHERE Date BETWEEN 'YYYY-MM-01' AND 'YYYY-MM-31' (may need adjustment for 30/29/28)
SELECT *
FROM mytable
WHERE
inserted_at BETWEEN '2010-01-01' AND '2010-01-31'
OR updated_at BETWEEN '2010-01-01' AND '2010-01-31';
If the user enters the three values use SELECT .... WHERE Date = 'YYYY-MM-DD'
SELECT *
FROM mytable
WHERE
inserted_at = '2009-11-12'
OR updated_at = '2009-11-12';
If the user enters Month and Day
SELECT *
FROM mytable
WHERE
inserted_at_month = 3
OR inserted_at_day = 31
OR updated_at_month = 3
OR updated_at_day = 31;
If the user enters Month or Day (you could optimize to not check values > 12 as a month)
SELECT *
FROM mytable
WHERE
inserted_at_month = 12
OR inserted_at_day = 12
OR updated_at_month = 12
OR updated_at_day = 12;
"Database agnostic way" is usually a synonym for "slow way", so the solutions will unlikely be efficient.
Parsing all records on the client side would be the least efficient solution in any case.
You can process your locale string on the client side and form a correct condition for a LIKE, RLIKE or REGEXP_SUBSTR operator. The client side, of course, should be aware of which database the system uses.
Then you should apply the operator to a string formed according to the locale with a database-specific formatting function, like this (in Oracle):
SELECT *
FROM mytable
WHERE TO_CHAR(mydate, 'dd.mm.yyyy - hh24.mi') LIKE '15.10%'
A more efficient way (that works only in PostgreSQL, though) would be to create a GIN index on the individual date parts:
CREATE INDEX ix_dates_parts
ON dates
USING GIN
(
(ARRAY
[
DATE_PART('year', date)::INTEGER,
DATE_PART('month', date)::INTEGER,
DATE_PART('day', date)::INTEGER,
DATE_PART('hour', date)::INTEGER,
DATE_PART('minute', date)::INTEGER,
DATE_PART('second', date)::INTEGER
]
)
)
and use it in a query:
SELECT *
FROM dates
WHERE ARRAY[11, 19, 2010] <@ (ARRAY
[
DATE_PART('year', date)::INTEGER,
DATE_PART('month', date)::INTEGER,
DATE_PART('day', date)::INTEGER,
DATE_PART('hour', date)::INTEGER,
DATE_PART('minute', date)::INTEGER,
DATE_PART('second', date)::INTEGER
]
)
LIMIT 10
This will select records having all three numbers (11, 19 and 2010) in any of the date parts: for example, all records of November 19 2010, plus all records of 19:11 in 2010, etc.
Whatever the user enters, you should extract three values: year, month and day, using their locale as a guide. Some values may be empty.
If the user enters only Year use WHERE Date BETWEEN 'YYYY-01-01' AND 'YYYY-12-31'
If the user enters Year and Month use WHERE Date BETWEEN 'YYYY-MM-01' AND 'YYYY-MM-31' (may need adjustment for 30/29/28)
If the user enters the three values use SELECT .... WHERE Date = 'YYYY-MM-DD'
If the user enters Month and Day, you'll have to use the 'slow' way
IMHO, the short answer is No. But definitely avoid loading all rows.
A few notes:
if you had only simple queries for exact dates or ranges, I would recommend using the ISO format for DATE (YYYY-MM-DD, e.g. 2010-02-01) or DATETIME. But since you seem to need queries like "all years for October 15th", you need custom queries anyway.
I suggest you create a "parser" that takes your date query and gives you the corresponding part of the SQL WHERE clause. I am certain that you will end up with fewer than a dozen cases, so you can have an optimal WHERE for each of them. This way you will avoid loading all records.
you definitely do not want to do anything locale-specific in the SQL. Therefore convert the locale to some standard in the non-SQL code, then use it to perform your query (basically, separate localization/globalization from the query execution).
Then you can optimize. If you see that you have a lot of queries just for the year, you might create a COMPUTED COLUMN that contains only the YEAR and put an index on it.
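For instance, on PostgreSQL 12 or later that could be a stored generated column plus an index. A sketch reusing the mytable schema from the first answer (the column and index names are made up):
ALTER TABLE mytable
  ADD COLUMN inserted_at_year integer
  GENERATED ALWAYS AS (date_part('year', inserted_at)::integer) STORED;
CREATE INDEX ix_mytable_inserted_at_year ON mytable (inserted_at_year);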
