IoT Central Custom Rules

I'm currently seeking assistance with the code from this tutorial: https://learn.microsoft.com/en-us/azure/iot-central/core/howto-create-custom-rules. When I use the Stream Analytics job whose query detects a disconnected device, the deviceId it returns is null. Please see the following: [all deviceIds for the disconnected device return null, but the timestamp is there][1]
This is the query I used:
```
with
LeftSide as
(
SELECT
-- Get the device ID from the message metadata and create a column
GetMetadataPropertyValue([centraltelemetry], '[EventHub].[IoTConnectionDeviceId]') as deviceid1,
EventEnqueuedUtcTime AS time1
FROM
-- Use the event enqueued time for time-based operations
[centraltelemetry] TIMESTAMP BY EventEnqueuedUtcTime
),
RightSide as
(
SELECT
-- Get the device ID from the message metadata and create a column
GetMetadataPropertyValue([centraltelemetry], '[EventHub].[IoTConnectionDeviceId]') as deviceid2,
EventEnqueuedUtcTime AS time2
FROM
-- Use the event enqueued time for time-based operations
[centraltelemetry] TIMESTAMP BY EventEnqueuedUtcTime
)
SELECT
LeftSide.deviceid1 as deviceid,
LeftSide.time1 as time
INTO
[emailnotification]
FROM
LeftSide
LEFT OUTER JOIN
RightSide
ON
LeftSide.deviceid1 = RightSide.deviceid2 AND DATEDIFF(second, LeftSide, RightSide) BETWEEN 1 AND 120
WHERE
-- Find records where a device didn't send a message within 120 seconds
RightSide.deviceid2 IS NULL
```

[enter image description here][2]
[1]: https://i.stack.imgur.com/CLOQv.png
[2]: https://i.stack.imgur.com/IU3SX.png

Since the Stream Analytics query uses the telemetry from the event hub as its input, could you let me know whether you followed the complete tutorial for the setup? If so, do you see the function's log messages with the device id?
Also, do you receive the email from SendGrid with the deviceid? To analyze this further, could you also confirm that your Stream Analytics job is receiving input from the event hub?
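One way to confirm that events are arriving and to see what the metadata extraction actually returns is to add a second, temporary output to the job and write the extracted values to it. Below is a minimal sketch, not the tutorial's own query: the output alias [debugoutput] is hypothetical, and the '[User]' form for dumping all custom properties is an assumption about the input's metadata that may need adjusting.
```
SELECT
    -- Same metadata extraction as in the main query
    GetMetadataPropertyValue([centraltelemetry], '[EventHub].[IoTConnectionDeviceId]') AS deviceid,
    -- Assumption: dump the custom (user) properties to see where the device id actually arrives
    GetMetadataPropertyValue([centraltelemetry], '[User]') AS userproperties,
    EventEnqueuedUtcTime AS time
INTO
    -- Hypothetical second output configured on the job
    [debugoutput]
FROM
    [centraltelemetry]
```
If deviceid is null here as well while events keep flowing, the problem is in the metadata property name rather than in the join logic.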

Related

The effect of use_sim_time in ROS

I am trying to understand what effect setting use_sim_time to true has, especially when recording and playing a rosbag, but unfortunately the available information is sparse and hard to understand.
I already know how to set it to true, so that is not the problem.
I have done some experiments with a rosbag file I have and I noticed:
When I do rosbag play file.bag, the topic /clock is not published
When I do rosbag play file.bag --clock, the topic /clock is published
I have also noticed that when I do rostopic echo /clock and play the bag, many of the published times are the same (what does this mean?)
And lastly, I have noticed that use_sim_time has no effect on any of these results.
So what effect does setting this parameter to true have?
In order for a ROS node to use simulation time according to the /clock topic, the /use_sim_time parameter must be set to true before the node is initialized. In other words,
The ROS API used to get the time, ros::Time time = ros::Time::now(), will retrieve time data from the /clock topic rather than using the system clock. If you turn use_sim_time off, then any time values published to /clock will be ignored.
If the /use_sim_time parameter is set, the ROS Time API will return time=0 until it has received a value from the /clock topic. Then, the time will only be updated on receipt of a message from the /clock topic and will stay constant between updates.
More information: [1], [2]

Nested rows using STRUCT are not supported in Dataflow SQL (GCP)

With Dataflow SQL I would like to read a Pub/Sub topic, enrich the message and write the message to a Pub/Sub topic.
Which Dataflow SQL query will create my desired output message?
Pub/Sub input message: {"event_timestamp":1619784049000, "device":{"ID":"some_id"}}
Desired Pub/Sub output message: {"event_timestamp":1619784049000, "device":{"ID":"some_id", "NAME":"some_name"}}
What I get is: {"event_timestamp":1619784049000, "device":{"ID":"some_id"}, "NAME":"some_name"}
but I need the NAME inside the "device" attribute.
SELECT message_table.device as device, devices.name as NAME
FROM pubsub.topic.project_id.`topic` as message_table
JOIN bigquery.table.project_id.dataflow_sql_dataset.devices as devices
ON devices.device_id = message_table.device.id
Unfortunately, Dataflow SQL does not currently support STRUCT/Sub queries, but we are working on it. Since there are some Apache Beam dependencies preventing its progress (Nested Rows Support, Upgrading Calcite), we cannot provide an ETA at the moment, but you can follow its progress on this issue tracker.
You need to create a struct in the projection (SELECT part)
SELECT STRUCT(message_table.device.ID as ID , devices.name as NAME) as device
FROM pubsub.topic.project_id.`topic` as message_table
JOIN bigquery.table.project_id.dataflow_sql_dataset.devices as devices
ON devices.device_id = message_table.device.id
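If the output message should also keep the original timestamp, as in the desired message above, the timestamp column can be projected alongside the rebuilt struct. A sketch along those lines, assuming event_timestamp is exposed as a top-level column of the Pub/Sub topic (as it appears in the input message):
```
SELECT
  message_table.event_timestamp,
  STRUCT(message_table.device.ID AS ID, devices.name AS NAME) AS device
FROM pubsub.topic.project_id.`topic` AS message_table
JOIN bigquery.table.project_id.dataflow_sql_dataset.devices AS devices
  ON devices.device_id = message_table.device.id
```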

How to add Job in Customer table with qodbc insert statement

I am trying to add a job to the QuickBooks Customer table using the following SQL statement. The Customer table has several custom fields (columns). When I run the statement without the custom field/column, it works. When I run it with the custom field/column, I get an error, and the row IS inserted but without the custom field data.
INSERT INTO Customer (Name, ParentRefFullName, Companyname, Billaddressaddr1, CustomFieldAddNumber) VALUES ('.10~Root Name 01', '.00~Root Name 01', 'Zen Enterprise', 'my Address one', '.10' )
The error occurs only when 'CustomFieldAddNumber' is set to '.10'.
What is the correct way to add a Job to the Customer table with custom fields/columns?
Thanks
I would suggest you download and install the latest QODBC version 320 from the link below and test again:
http://www.qodbc.com/qodbcDownload.htm
If you are still facing the issue, please raise a support ticket with the QODBC Technical Support department via the link below and provide the requested information:
http://support.flexquarters.com/esupport/index.php?/Tickets/Submit
We may need the following information; please attach the files listed below when replying to the ticket.
1) Screenshot of QODBC Setup Screen -- > About
2) Screenshot of the issue you’re facing.
Share the entire log files as attachments in text format from:
3) QODBC Setup Screen -- > Messages -- > Review QODBC Messages
4) QODBC Setup Screen -- > Messages -- > Review SDK Messages

Multiple times trigger generation in Zabbix

I am new to Zabbix. I have a basic requirement of monitoring the occurrence of different log messages using Zabbix. Say, when there is a log message "server starting", Zabbix should show that alert. The idea is that if the server (re)starts 10 times in the last 10 minutes, the Zabbix dashboard (or any other place) should display that 10 times.
I have done the following for that :
Created an item under template MyTemplate:
Type : Zabbix Agent (Active)
key : log[/opt/mylog/logs/abc.log,server starting]
Type of information : Log
Update Interval (in sec) : 30
Created a trigger with expression :
{MyTemplate:log[/opt/mylog/logs/abc.log,server starting].logeventid(1)}=0
With logeventid(1), I am seeing that the alert (trigger) is getting generated only once. It appears only once in the Dashboard --> Last 20 issues. If I go to Monitoring --> Trigger, I see the alert only once, although the log files have 10 entries of the message "server starting" (server restarted 10 times).
Then I set the trigger to the following:
{MyTemplate:log[/opt/mylog/logs/abc.log,server starting].nodata(300)}=0
Now, at Monitoring --> Trigger, I see the alert (trigger) 10 times, but, from the Dashboard --> Last 20 issues it vanishes just after 300 seconds.
My questions are :
Which trigger function should I use? I want to see 10 alerts in Zabbix if the same message appears 10 times in the log file within a period of time.
With nodata(300), why does the alert vanish after 300 sec?
Is it ok if I use 30 minutes instead of 300 seconds as an argument of nodata()?
Function logeventid() is normally used for Windows and VMware event logs. In this case, it should probably not be used and it is suspicious that it fires, which might indicate a bug in Zabbix.
Anyway, you can check the "Multiple PROBLEM events generation" box in the trigger configuration, and the trigger will generate a new PROBLEM event every time the condition is true, regardless of its previous value. Instead of logeventid(), you can try using a function that is always true, for instance strlen()>0.
If you wish the trigger to go into OK state after some time, say, 10 minutes, you can add nodata(10m). Then your trigger will look like this:
{MyTemplate:log[/opt/mylog/logs/abc.log,server starting].strlen()}>0 and
{MyTemplate:log[/opt/mylog/logs/abc.log,server starting].nodata(10m)}=0

What is the correct syntax to use when trying to create a Data Source View to a linked server?

I have tried several statements, but this one at least returns data. However, I get the error message: "Deferred prepare could not be prepared. Incorrect syntax near ')'. Incorrect syntax near the keyword 'DECLARE'." The following statement executed when creating the named query:
SELECT[vwStatistics].*
FROM
(
***THIS IS MY QUERY***
DECLARE @SQL1 VARCHAR(500)
SET @SQL1 = 'SELECT *
FROM OPENQUERY(PORTAL, ''SELECT DeviceID, Date, Count
FROM printer_stats.Statistics
GROUP BY DeviceID'')'
EXEC (@SQL1)
***END OF MY QUERY***
)
AS[vwStatistics] (Microsoft.AnalysisServices.Controls)
I am new to linked servers and to SSAS. This is our company's first cube from a linked server. My query does run in Management Studio and produces an SSRS report, but it is slow.
Any suggestions would be helpful. There is not much info on the syntax for this situation on the web. The only help I have found suggests making changes on the server (e.g. make sure OPENROWSET is enabled, or reinstall the OWC component), which I do not have the ability to do.
This is what we found to work:
SELECT DeviceID, CAST(statsdt AS CHAR) AS sdt, Count FROM OPENQUERY (
PORTAL, 'select * from (select DeviceID,CAST( Date AS CHAR) statsdt, Count from printer_stats.Statistics) as pstats')
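For the Data Source View named query itself, the OPENQUERY call can sit directly inside the derived table, with no DECLARE/EXEC wrapper (a DSV named query has to be a single SELECT statement, which is why the "Incorrect syntax near the keyword 'DECLARE'" error appears). A sketch along those lines, reusing the working query above:
```
SELECT [vwStatistics].*
FROM
(
    SELECT DeviceID, CAST(statsdt AS CHAR) AS sdt, [Count]
    FROM OPENQUERY(PORTAL,
        'select * from (select DeviceID, CAST(Date AS CHAR) statsdt, Count from printer_stats.Statistics) as pstats')
) AS [vwStatistics]
```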
