SQL Syntax: Inserting the time portion of DATETIME - google-fusion-tables

I'm attempting to periodically push data to a fusion table, and I have everything working except for the ability to insert the time portion of a DATETIME.
It works as expected if I omit the datetime field altogether, e.g.:
service.query().sql(sql='INSERT INTO <my table> (Humidity, Temperature, Luminosity) VALUES (40, 70, 100)')
However, I get a syntax error every time I attempt to write the date. If I separate the date with /'s, it errors on those, so I switched to -'s, which seems to appease it. Only now, it chokes on the ':' that separates hours from minutes:
service.query().sql(sql='INSERT INTO <my table> (Date, Humidity, Temperature, Luminosity) VALUES (04-12-2014 13:51, 40, 70, 100)')
yields:
HttpError: <HttpError 400 when requesting https://www.googleapis.com/fusiontables/v1/query?alt=json&sql=INSERT+INTO+(my table id)+%28Date%2C+Humidity%2C+Temperature%2C+Luminosity%29+VALUES+%2804-12-2014+13%3A51%2C+40%2C+70%2C+100%29 returned "Invalid query: Parse error near ':' (line 1, position 117).">
There's only one ':', so it's gotta be the time separator. Is there some other character I should be using to separate hours from minutes? Does it need to be escaped somehow? All of the examples I can find in Google's documentation don't interact with DATETIMEs.
When I tried without a ':' at all (e.g., 04-12-2014 1351), it didn't throw an error, but the date was interpreted in a wildly inaccurate way (09/01/1351).
Thanks!

Have you tried adding quotation marks around the datetime?
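For example, quoting the value so the parser sees a single DATETIME literal (a sketch of the same query from the question; whether Fusion Tables then reads 04-12-2014 as MM-DD-YYYY is an assumption worth verifying):
service.query().sql(sql="INSERT INTO <my table> (Date, Humidity, Temperature, Luminosity) VALUES ('04-12-2014 13:51', 40, 70, 100)")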

Related

Different date format for same column in Informatica IICS

I am getting data from a REST API in JSON format and have a scenario where a column can have multiple date formats. The current date format could be either 2011-02-12T01:00:00 or 2020-04-15T20:44:57.38, or could be null, or something else entirely.
I want to parse it through an expression and capture the full date string. The following expression seems to work fine; however, it truncates the millisecond part and returns the value only up to seconds.
iif(isnull(%date_fields%),'\N',
to_char(To_date(to_char(%date_fields%),'MM/DD/YYYY HH24:MI:SS'),'YYYY-MM-DD HH24:MI:SS'))
But when I tried it with milliseconds using the expression below:
iif(isnull(%date_fields%),'\N',
to_char(To_date(to_char(%date_fields%),'MM/DD/YYYY HH24:MI:SS.MS'),'YYYY-MM-DD HH24:MI:SS.MS'))
It throws an error:
TT_11132 Transformation [Expression3] had an error evaluating output column [JobStartDate_out].
Error message is [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
... t:TO_DATE(u:TO_CHAR(t:<02/12/2011 01:00:00>,u:'MM/DD/YYYY HH24:MI:SS'),u:'MM/DD/YYYY HH24:MI:SS.MS')].
I also tried a few options like the one below, but I get a parsing error.
DECODE (TRUE,
iff(isnull(%date_milli%),
'\N',
is_date(To_date(to_char(%date_milli%),'MM/DD/YYYY HH24:MI:SS'),'YYYY-MM-DD HH24:MI:SS'),
is_date(To_date(to_char(%date_milli%),'MM/DD/YYYY HH24:MI:SS.MS'),'YYYY-MM-DD HH24:MI:SS.MS'),
ERROR('NOT A VALID DATE')))
What could be the possible resolution to handle the multiple date formats in Informatica? Here the JSON date is a string, I am mapping it to a date/time type, and I am using Output Macro Fields to combine multiple similar columns together.
Why don't you check both options, with and without milliseconds?
You can use the iif condition below. Also, I think your expression has some issues.
I assumed date_milli is a character type; if it's a date, you can change the expressions below accordingly. Note that the innermost two-argument iif returns NULL when its condition is false, so a value matching neither format comes through as NULL.
iif(isnull(%date_milli%), null,
  iif(is_date(to_char(%date_milli%),'MM/DD/YYYY HH24:MI:SS'), to_date(to_char(%date_milli%),'MM/DD/YYYY HH24:MI:SS'),
  iif(is_date(to_char(%date_milli%),'MM/DD/YYYY HH24:MI:SS.MS'), to_date(to_char(%date_milli%),'MM/DD/YYYY HH24:MI:SS.MS')
  ))
)
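If you prefer the DECODE(TRUE, ...) route from the question, a corrected sketch of that attempt (under the same assumption that %date_milli% arrives as a string) would be:
DECODE(TRUE,
  is_date(to_char(%date_milli%),'MM/DD/YYYY HH24:MI:SS'), to_date(to_char(%date_milli%),'MM/DD/YYYY HH24:MI:SS'),
  is_date(to_char(%date_milli%),'MM/DD/YYYY HH24:MI:SS.MS'), to_date(to_char(%date_milli%),'MM/DD/YYYY HH24:MI:SS.MS'),
  ERROR('NOT A VALID DATE'))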

How do I pull the month from text strings in this Twilio format 2019-08-22 06:12:58 MDT?

I am using the Twilio log file to crunch some data and need to convert the Twilio date format into something that Google Sheets can recognize as a date, so I can then extract which month the date refers to. I think the text string first has to be converted to a better format, and then I can use one of Google Sheets' functions to extract the month. Currently, this is the format in the log file:
"2019-08-22 06:12:58 MDT"
I used the Google Sheets TIMEVALUE and TEXT functions:
=TIMEVALUE(I2)
and
=text(I2,"mmmm")
I get a "Formula Parse Error".
The timezone stamp is messing up the Google formulas for you.
So you may want to try getting rid of that with something like this:
=text(index(split(I2," "),,1),"mmmm")
The split function breaks up the logged time stamp into 2019-08-22 | 06:12:58 | MDT across three columns.
And the index function then gets just the first column, the date part.
And then the text function gets the month name out of the date.
You can use:
=TEXT(LEFT(A1, 10), "mmmm")
and in array it would be:
=ARRAYFORMULA(TEXT(LEFT(A1:A, 10), "mmmm"))
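And if you need the month as a number rather than a name, a small variant of the same idea should work (assuming, as above, that the dates live in column A):
=ARRAYFORMULA(MONTH(DATEVALUE(LEFT(A1:A, 10))))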

Converting DT_WSTR to DT_DATE

So I'm pretty new to this stuff but working through a few issues. What I am trying to do is pull source files from a Flat File Source, but the dates in all my source files are formatted as YYYYMMDD. So I inserted a Derived Column task and created an expression to reformat all the YYYYMMDD date columns to YYYY-MM-DD, which looks like this:
LEFT(ISSUE_DT,4) + "-" + SUBSTRING(ISSUE_DT,5,2) + "-" + RIGHT(ISSUE_DT,2)
All is good with that, except the result has the data type DT_WSTR, so I dropped in a Columns Conversion task to convert DT_WSTR to DT_DATE, but I keep getting the following error:
[Columns Conversion [1]] Error: Data conversion failed while converting
column "ISSUE_DT Formatted" (258) to column "Copy of ISSUE_DT Formatted"
(16). The conversion returned status value 2 and status text "The value
could not be converted because of a potential loss of data.".
I have tried opening the advanced editor, navigating to the Data Conversion Output Columns, and changing the DataType under Data Type Properties to DT_DATE, but I still get the same error.
What am I missing or doing wrong?
(Screenshots: Data Flow, Formatted Dates, Column Conversion, Column Conversion Advanced Editor)
You have some dates that are not in the format that your SSIS package is expecting. Perhaps single-digit months or days do not have a leading 0. That scenario would definitely cause this specific error.
Take today for example: if single-digit months or days did not have leading zeroes, you would have 2018918 instead of 20180918. Without seeing the data, I can't guarantee that this is the exact issue, but it is something like this. The error occurs when converting the string to the date. So, continuing with the example above, after your Derived Column, ISSUE_DT Formatted would have a value of '2018-91-18', which of course is not a valid date and causes the error.
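One quick way to confirm this is to scan the source file for ISSUE_DT values that are not exactly eight digits. A minimal sketch, assuming a pipe-delimited flat file named source.txt with ISSUE_DT in the first column (the file name and column position are placeholders for your actual layout):

import re

# Report every row whose ISSUE_DT is not exactly 8 digits (YYYYMMDD)
with open("source.txt") as f:
    for lineno, line in enumerate(f, start=1):
        issue_dt = line.rstrip("\n").split("|")[0]  # assumed first column
        if not re.fullmatch(r"\d{8}", issue_dt):
            print(f"line {lineno}: bad ISSUE_DT {issue_dt!r}")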

PostgreSQL Combine columns and convert to timestamp with local time zone

I'm creating a time slot table in Rails with PostgreSQL that contains columns like
slots
name | type
-----|-----
day | date
hour | int
min | int
hour would be like 11, 12, 13, 14 ...
min would be like 0, 5, 10, 15 ...
I'm trying to use these three columns to create a timestamp that I can compare against Time.now to pull records that are upcoming in the future.
Since PG's to_timestamp function creates a timestamp with UTC as the default time zone, I want the time built from the three columns to use the server's time zone. My attempt is below.
Slot.select("
  to_timestamp(
    concat_ws(
      ' ',
      day::text,
      concat_ws(
        ':',
        hour::text,
        min::text),
      '#{Time.now.zone}'),
    'YYYY-MM-DD HH24:MI (TZ)')
  AS t")
And it gives me the error:
PG::FeatureNotSupported: ERROR: "TZ"/"tz"/"OF" format patterns are not supported in to_date
Any suggestions or thoughts would be great.
Thanks
The to_timestamp() function returns a timestamp with time zone value. If you do not explicitly specify a time zone, then the time zone of the server is used. That seems to be all that you need, so you can safely forget about specifying anything beyond the simple date and time.
Seeing what you are trying to do, however, it would be much easier to use the make_time() function and add the resulting time to the day date to get the timestamp you need. This saves you lots of conversions to text and then back to a timestamp:
Slot.select("day + make_time(hour, min, 0.0::float) AS t");

How to write points into InfluxDB 0.8 with time in seconds

I would like to write points into an InfluxDB 0.8 database, with the time values given in seconds, over HTTP. Here's a sample point in JSON format:
[
  {
    "points": [
      [
        1435692857.0,
        897
      ]
    ],
    "name": "some_series",
    "columns": [
      "time",
      "value"
    ]
  }
]
The documentation is unclear about what format the time values should be in (nanoseconds or milliseconds?) and how to tell InfluxDB what to expect. Currently I'm using a query parameter: precision=s
That seems to work fine; the server returns HTTP status code 200 as expected. When querying the database through InfluxDB's admin interface using select * from some_series, the data points in the table are returned with the expected timestamps. On the graph, however, the time axis is indexed with fractions of seconds, and queries like select * from some_series where time > now() - 1h don't yield any results.
I assume that there is something wrong with the timestamps. I tried multiplying my value by 1000, but then nothing got inserted into the database, with no visible errors.
What's the problem?
By default, supplied timestamps are assumed to be in milliseconds. I think your writes are defaulting to milliseconds because the query string parameter should be time_precision=s, not precision=s.
See the details under "Time Precision on Written Data" on https://influxdb.com/docs/v0.8/api/reading_and_writing_data.html.
I also think the time value should be an integer rather than a float. I'm not sure how to explain the other behaviors, where the timestamp seems to be the right date and multiplying by 1000 doesn't solve the issue, but I wonder if it's related to writing floats.
Please contact the InfluxDB support team at support@influxdb.com for further assistance.
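Put together, a write in seconds might look like this (a sketch using the requests library; the host, database name, and credentials are placeholders):

import json
import requests

# One point with an integer epoch-second timestamp; time_precision=s tells
# InfluxDB 0.8 to interpret the "time" column as seconds.
payload = [{
    "name": "some_series",
    "columns": ["time", "value"],
    "points": [[1435692857, 897]],
}]
resp = requests.post(
    "http://localhost:8086/db/mydb/series",
    params={"u": "root", "p": "root", "time_precision": "s"},
    data=json.dumps(payload),
)
print(resp.status_code)  # expect 200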
I found the solution! The problem was only partly with the precision. Your answer was correct: the query parameter is called time_precision, and I should post integers instead of floats, which was probably the first thing I attempted, with no results...
However, due to some time zone problems, my time values were in the future relative to server time, and by default any select statement includes an implicit where time < now() clause. So values were in fact written into the database, but not displayed because of that hidden where clause. The solution was to tell the database to return "future" values, too:
select value from some_series where time < now() + 1h
