Decode GPS Data - No idea what the format is

I have a GPS tracker that a friend lent me. It's a Chinese model with sparse documentation.
It has a built-in GPS and a GPRS module (SIM), and it sends my data to a particular IP address.
I can't figure out what all the numbers mean. I got latitude and longitude thanks to the N and E, but I'm not sure about the rest.
Here's an extract from my log:
4/28/2011 6:48:01 PM (001__450BP00BP05000001__450BP00110428A2451.6491N06700.6385E000.013474342.72000000000L0001ADFE)
4/28/2011 6:48:18 PM (001__450BP00BP05000001__450BP00110428A2451.6491N06700.6385E000.013480942.72000000000L0001ADFE)
4/28/2011 6:49:23 PM (001__450BP00BP05000001__450BP00110428A2451.6491N06700.6385E000.013490942.72000000000L0001ADFE)
4/28/2011 6:50:33 PM (001__450BP00BP05000001__450BP00110428A2451.6362N06700.6297E000.0135016198.8300000000L0001ADFE)
4/28/2011 6:51:39 PM (001__450BP00BP05000001__450BP00110428A2451.5203N06700.5738E000.0135114135.3800000000L0001AEFF)
4/28/2011 6:51:42 PM (001__450BP00BR02110428V2451.4962N06700.5942E000.0135133143.7700000000L0001AF23)
Note: the exact string from the tracker is stored within the round brackets (...)
I gave the dates and times because they may help decode the data if the tracker reports UTC time or something. I didn't see anything matching the timestamp, though.

It would help if you posted some more information (any serial numbers or other text on the device).
However, the messages look like the GPS518 protocol.
I'm mostly guessing, but if I deconstruct the first line, I think this is the meaning:
Request
001 : ?
450 : deviceid
BP00 : handshake
BP05 : command
000001 : ?
Response
450 : device id
BP00 : command
110428 : date (format yymmdd)
A : ? (possibly a fix-validity flag as in NMEA, where A = valid and V = void; note the last log line carries a V)
2451.6491N : Latitude
06700.6385E : Longitude
000.0 : Speed (format nnn.n)
134743 : Time (format hhmmss, apparently UTC; your log shows 6:48 PM against 13:47, so you probably live in GMT+5)
42.720 : Heading/Bearing (?)
00000000L : Elevation
0001ADFE : ?
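If it helps, here is a rough Python sketch that parses the position reports according to this guessed layout. Every field name is an assumption (the trailing fields especially), and it only covers the BP05/BP00 shape; the last log line (BR02, with V) would need its own pattern:

import re

# A rough sketch based on the guessed layout above; every field name is an
# assumption, and the trailing fields in particular are unverified.
MSG_RE = re.compile(
    r"(?P<prefix>\d{3})__"
    r"(?P<device_id>\d+)BP00BP05"
    r"(?P<serial>\d{6})__"
    r"\d+BP00"
    r"(?P<date>\d{6})"                    # yymmdd
    r"(?P<valid>[AV])"                    # A/V, possibly an NMEA-style fix flag
    r"(?P<lat>\d{4}\.\d{4})(?P<ns>[NS])"
    r"(?P<lon>\d{5}\.\d{4})(?P<ew>[EW])"
    r"(?P<speed>\d{3}\.\d)"               # nnn.n
    r"(?P<time>\d{6})"                    # hhmmss, apparently UTC
    r"(?P<heading>\d+\.\d{3})"            # e.g. 42.720
    r"(?P<elevation>0+L)"                 # meaning unclear
    r"(?P<trailer>[0-9A-F]{8})"           # checksum or status word?
)

def parse(payload):
    m = MSG_RE.fullmatch(payload)
    if m is None:
        raise ValueError("not a BP05/BP00 position report")
    d = m.groupdict()
    # NMEA-style ddmm.mmmm / dddmm.mmmm -> decimal degrees
    d["lat_deg"] = int(d["lat"][:2]) + float(d["lat"][2:]) / 60
    d["lon_deg"] = int(d["lon"][:3]) + float(d["lon"][3:]) / 60
    return d

print(parse("001__450BP00BP05000001__450BP00110428A2451.6491N"
            "06700.6385E000.013474342.72000000000L0001ADFE"))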
There's a discussion here that might be of interest:
http://sourceforge.net/projects/opengts/forums/forum/579834/topic/3871481
After some googling, I found this. It seems to generate messages in roughly the same format as the ones that you are receiving:
http://kmmk.googlecode.com/svn/trunk/kmmk/src/com/gps/testmock/CommAdapterYD518.java

You can listen for the GPS data and parse it yourself.
Please check the following link for more information:
https://github.com/anupama513/Tk102-gps-data-parser-nodejs-server
This is a Node.js server that:
listens continuously for the GPS data
parses the GPRMC data
can store the data in a database
can also post the data to another web server/socket
The parsing logic might be slightly different, but most of the data matches.
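In case the repository link goes stale: here is a minimal Python sketch of the same GPRMC parsing idea. Field positions follow the standard NMEA RMC sentence; there is no checksum validation, and the sample sentence below is fabricated from the tracker data above:

from datetime import datetime, timezone

def parse_gprmc(sentence):
    # Split off the optional *hh checksum, then the comma-separated fields.
    fields = sentence.split("*")[0].split(",")
    if fields[0] not in ("$GPRMC", "GPRMC"):
        raise ValueError("not a GPRMC sentence")
    lat = int(fields[3][:2]) + float(fields[3][2:]) / 60
    lon = int(fields[5][:3]) + float(fields[5][3:]) / 60
    return {
        "time": datetime.strptime(fields[9] + fields[1].split(".")[0],
                                  "%d%m%y%H%M%S").replace(tzinfo=timezone.utc),
        "valid": fields[2] == "A",               # A = valid fix, V = void
        "lat": -lat if fields[4] == "S" else lat,
        "lon": -lon if fields[6] == "W" else lon,
        "speed_knots": float(fields[7]) if fields[7] else 0.0,
        "course": float(fields[8]) if fields[8] else None,
    }

print(parse_gprmc("$GPRMC,134743,A,2451.6491,N,06700.6385,E,000.0,042.7,280411,,"))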

Related

Converting unknown date to human format in SQLite Database

There's an iOS app that I'm looking at where the date column in the SQLite table shows up as the following:
ZRECORDDATE
423942471
423873218
423512431
423950419
423954082
423954975
424551647
At first I believed it was Unix time, but after using a converter, 1983 definitely isn't it.
I'm probably missing the obvious here but can anyone point me in the right direction for converting this to a human date/time?
Thanks in advance
Given your hint, it is probably a time interval in seconds since 1 Jan 2001, Apple's reference date (which is how Core Data stores Date attributes).
let sec: TimeInterval = 423942471
let date = Date(timeIntervalSinceReferenceDate: sec)
print("date \(date)")   // date 2014-06-08 17:47:51 +0000

What is the format of this hex timestamp from the Amazon SES message ID?

Amazon SES message IDs are in the following format:
01020170c41acd6e-89acae55-6245-4d89-86ca-0a177e59e737-000000
This seems to consist of 3 distinct parts
01020170c41acd6e appears to be some sort of hex timestamp. The difference between two timestamps is the time elapsed in milliseconds, but it doesn't seem to begin at the Unix epoch.
c2daf94a-f258-4d59-8fdb-a5512d4c7638 is clearly a standard version 4 UUID.
000000 remains the same for the first sending and, I assume, is incremented for redelivery attempts.
I need to generate a 'fake' message ID in some scenarios. It is trivial to fake parts 2 and 3 above; however, I cannot seem to deduce the format of the timestamp. Here are some further examples with corresponding approximate times:
01020170c450e280 - Mar 10, 2020 at 12:00:00.190
01020170c44c2e6a - Mar 10, 2020 at 11:54:51.987
01020170c0e30119 - Mar 09, 2020 at 20:01:07.407
What format is this timestamp?
Taking your first example of 01020170c450e280, the string can be split into 01020 and 170c450e280.
170c450e280 hex == 1583841600128 dec == 2020-03-10T12:00:00.128Z.
However, I'm afraid that the 01020 prefix remains a mystery to me.
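A quick Python sketch of that split, reusing the verified example above. Dropping the 5-character prefix is an assumption, and the message ID here is stitched together for illustration:

from datetime import datetime, timezone

message_id = "01020170c450e280-89acae55-6245-4d89-86ca-0a177e59e737-000000"

ts_hex = message_id.split("-")[0]   # "01020170c450e280"
millis = int(ts_hex[5:], 16)        # 0x170c450e280 == 1583841600128
print(datetime.fromtimestamp(millis / 1000, tz=timezone.utc))
# 2020-03-10 12:00:00.128000+00:00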

R - how to exclude penny stocks from the environment before calculating adjusted stock returns

Within my current research, I'm trying to find out how big the impact of ad-hoc sentiment on daily stock returns is.
The calculations worked quite well, and the results are also plausible.
The calculations so far, using the quantmod package and Yahoo Finance data, look like this:
getSymbols(c("^CDAXX",Symbols) , env = myenviron, src = "yahoo",
from = as.Date("2007-01-02"), to = as.Date("2016-12-30")
Returns <- eapply(myenviron, function(s) ROC(Ad(s), type="discrete"))
ReturnsDF <- as.data.table(do.call(merge.xts, Returns))
# adjust column names
colnames(ReturnsDF) <- gsub(".Adjusted", "", colnames(ReturnsDF), fixed = TRUE)
ReturnsDF <- as.data.table(ReturnsDF)
However, to make it more robust against the noisy influence of penny-stock data, I wonder how it's possible to exclude stocks that, at any point in the period, fall below a certain value x, let's say €1.
I guess the best thing would be to exclude them before calculating the returns and merging the xts results, or even better, before downloading them with the getSymbols command.
Does anybody have an idea how this could best work? Thanks in advance.
Try this:
build a price frame of the Adj. closing prices of your symbols
(I use the PF function of the quantmod add-on package qmao, which has lots of other useful functions for this type of analysis: install.packages("qmao", repos = "http://R-Forge.R-project.org").)
check by column if any price is below your minimum trigger price
select only columns which have no closings below the trigger price
To stay more flexible, I would suggest taking a sub-period, let's say no price below 5 during the last 21 trading days. The toy example below may illustrate my point.
I use AAPL, FB and MSFT as the symbol universe.
> symbols <- c('AAPL','MSFT','FB')
> getSymbols(symbols, from='2018-02-01')
[1] "AAPL" "MSFT" "FB"
> prices <- PF(symbols, silent = TRUE)
> prices
AAPL MSFT FB
2018-02-01 167.0987 93.81929 193.09
2018-02-02 159.8483 91.35088 190.28
2018-02-05 155.8546 87.58855 181.26
2018-02-06 162.3680 90.90299 185.31
2018-02-07 158.8922 89.19102 180.18
2018-02-08 154.5200 84.61253 171.58
2018-02-09 156.4100 87.76771 176.11
2018-02-12 162.7100 88.71327 176.41
2018-02-13 164.3400 89.41000 173.15
2018-02-14 167.3700 90.81000 179.52
2018-02-15 172.9900 92.66000 179.96
2018-02-16 172.4300 92.00000 177.36
2018-02-20 171.8500 92.72000 176.01
2018-02-21 171.0700 91.49000 177.91
2018-02-22 172.5000 91.73000 178.99
2018-02-23 175.5000 94.06000 183.29
2018-02-26 178.9700 95.42000 184.93
2018-02-27 178.3900 94.20000 181.46
2018-02-28 178.1200 93.77000 178.32
2018-03-01 175.0000 92.85000 175.94
2018-03-02 176.2100 93.05000 176.62
Let's assume you would like any instrument that traded below 175.40 during the last 6 trading days to be excluded from your analysis :-).
As you can see, that will exclude AAPL and MSFT.
apply and the base function any, applied(!) to a 6-day subset of prices, will give us exactly what we want. Showing the last 3 days of prices, excluding the instruments which did not meet our condition:
> tail(prices[,apply(tail(prices),2, function(x) any(x < 175.4)) == FALSE],3)
FB
2018-02-28 178.32
2018-03-01 175.94
2018-03-02 176.62

influxdb date format when entering/displaying data

I wrote a Python program to enter historical data into InfluxDB.
Everything seems to be OK, but I am not sure whether the time field is correct. The time is supposed to be YYYY, MM, DD, HH, MM as integers.
This is an example of the JSON that I am sending to InfluxDB:
[{'fields': {'High': 72.06, 'Close': 72.01, 'Volume': 6348, 'Open': 72.01, 'Low': 72.01}, 'tags': {'country': 'US', 'symbol': 'AAXJ', 'type': 'ETF', 'exchange': 'NASDAQ'}, 'time': datetime.datetime(2017, 9, 7, 15, 35), 'measurement': 'quote'}]
However, when I query the data, I get a strange number for the time like this:
time Close High Low Open Volume country exchange symbol type
---- ----- ---- --- ---- ------ ------- -------- ------ ----
1504798500000000000 144.46 144.47 144.06 144.1 112200 US NYSE IBM STOCK
It seems like either the JSON time format is wrong, or the number displayed by the query is an encoded date representation?
I found the answer here.
Format the output by entering the following command in the CLI:
precision rfc3339
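The long number is simply nanoseconds since the Unix epoch; precision rfc3339 only changes how the CLI displays it. As a sanity check, here is the queried value converted in Python:

from datetime import datetime, timezone

ns = 1504798500000000000   # the time value from the query above
print(datetime.fromtimestamp(ns / 1e9, tz=timezone.utc))
# 2017-09-07 15:35:00+00:00

That matches the datetime(2017, 9, 7, 15, 35) you wrote, so the data was stored correctly.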

Decoding the expiry date of a JSON Web Token (JWT)?

I am unable to understand the expiry date format of the JWT embedded in my application.
For example: 1473912000
What does this translate to? 1473912000 ms, some x date? Any help will be appreciated!
As James has pointed out, the number is the number of seconds since Jan 1, 1970.
It is converted into a Date object in a quite straightforward way (the *1000 is there because JavaScript's main time unit is the millisecond):
const expiryDate = new Date(1473912000*1000);
Then you can use any Date method you please.
Likewise, in Ruby you can use Time.at(1473912000) to create a new Time instance, as Maxim has shown.
The number is the number of seconds since Jan 1, 1970. It is commonly used on Unix systems to represent time. Your time is 2016-09-15 04:00 (UTC).
To convert, you can try a web-based converter: http://www.unixtimestamp.com/index.php
This is UNIX time in seconds:
➜ ~ irb
2.2.0 :001 > Time.at(1473912000)
=> 2016-09-15 07:00:00 +0300
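For completeness, the same conversion in Python:

from datetime import datetime, timezone

print(datetime.fromtimestamp(1473912000, tz=timezone.utc))
# 2016-09-15 04:00:00+00:00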
