I'm looking for a way to group or window Esper events in a dynamic window, similar to what Apache Flink calls Session Windows (see below).
I'd like to create a Context per session, but so far I have been unable to find a way to capture sessions. Initial (not working) example:
CREATE SCHEMA EventX AS (sensorId string, timestamp long, value double);
CREATE SCHEMA SessionEvent AS (sensorId string, totalValue double, events EventX[]);
-- Unsure about Context definition
CREATE CONTEXT sensorSessionCtx
CONTEXT sensorCtx PARTITION BY sensorId FROM EventX,
CONTEXT sessionCtx INITIATED BY EventX TERMINATED BY pattern [every EventX -> (timer:interval(3 sec) and not EventX)];
CONTEXT sensorSessionCtx
INSERT INTO SessionEvent
SELECT sensorId
, SUM(value) AS totalValue
, window(*) AS events
FROM EventX#keepall
OUTPUT LAST WHEN TERMINATED;
#Name('Sessions') SELECT * FROM SessionEvent;
And some test data for Esper EPL Online:
EventX = {sensorId='A', timestamp=0, value=1.1}
t=t.plus(1 seconds)
EventX = {sensorId='A', timestamp=1000, value=3.2}
t=t.plus(1 seconds)
EventX = {sensorId='B', timestamp=2000, value=8.4}
t=t.plus(1 seconds)
EventX = {sensorId='A', timestamp=3000, value=2.7}
EventX = {sensorId='B', timestamp=3000, value=0.2}
t=t.plus(3 seconds)
EventX = {sensorId='A', timestamp=6000, value=3.1}
Expected output:
SessionEvent={sensorId='A', totalValue=7.0, events={[EventX={sensorId='A', timestamp=0, value=1.1},EventX={sensorId='A', timestamp=1000, value=3.2},EventX={sensorId='A', timestamp=3000, value=2.7}]}}
SessionEvent={sensorId='B', totalValue=8.6, events={[EventX={sensorId='B', timestamp=2000, value=8.4},EventX={sensorId='B', timestamp=3000, value=0.2}]}}
SessionEvent={sensorId='A', totalValue=3.1, events={[EventX={sensorId='A', timestamp=6000, value=3.1}]}}
How would I create (dynamic) session windows (or contexts) in Esper?
One could also do this:
CREATE CONTEXT sensorSessionCtx
initiated by distinct(sensorId) EventX as ex
terminated by pattern [every (timer:interval(3 sec) and not EventX(sensorId=ex.sensorId))];
CONTEXT sensorSessionCtx
SELECT sensorId
, SUM(value) AS totalValue
, window(*) AS events
FROM EventX(sensorId=context.ex.sensorId)#keepall
OUTPUT WHEN TERMINATED;
This forgets about "sensorId" values once their session ends, which is useful if your space of sensorId values is endless. Otherwise, with "partition by sensorId", the engine tracks all sensorIds at all times.
There is no need for "select * from SessionEvent", so I left it out.
If the output doesn't need to include the events, you can remove #keepall, i.e. CONTEXT sensorSessionCtx SELECT sensorId, sum(value) AS totalValue FROM EventX(sensorId=context.ex.sensorId) OUTPUT SNAPSHOT WHEN TERMINATED;
EDIT:
Should be "every (timer:interval(3 sec) and not EventX(sensorId=ex.sensorId))" in the termination. The previous suggestion was wrong since the initiating event never counts towards the terminating event/pattern (thus EventX -> every(...) was wrong).
To provide a session context in Esper, we need to create a nested context.
In this nested context, a Keyed Segmented context is defined first, to split the stream into separate streams per user (or per sensor, as in the example). We then define a second, non-overlapping context. The second context operates only within the first keyed context, so all events within it share the same sensorId.
CREATE CONTEXT sensorSessionCtx
CONTEXT sensorCtx
PARTITION BY sensorId FROM EventX
, CONTEXT sessionCtx
START EventX
END pattern [every (timer:interval(3 sec) AND NOT EventX)];
In the example, the output with this context definition would be:
At: 2001-01-01 08:00:06.000
Insert
SessionEvent={sensorId='A', totalValue=7.000000000000001, events={[EventX={sensorId='A', timestamp=0, value=1.1},EventX={sensorId='A', timestamp=1000, value=3.2},EventX={sensorId='A', timestamp=3000, value=2.7}]}}
Insert
SessionEvent={sensorId='B', totalValue=8.6, events={[EventX={sensorId='B', timestamp=2000, value=8.4},EventX={sensorId='B', timestamp=3000, value=0.2}]}}
At: 2001-01-01 08:00:09.000
Insert
SessionEvent={sensorId='A', totalValue=3.1, events={[EventX={sensorId='A', timestamp=6000, value=3.1}]}}
Note that, when the internal engine timer has been disabled, for the last window to ever terminate, an event (any kind will do) must be received with a timestamp greater than the last session event's timestamp plus the session gap. In the EPL Online tool this can be accomplished by adding t=t.plus(10 seconds) after the last defined input event. This is especially important should you want to create any kind of unit test:
val finalEventTimestamp = Long.MaxValue - 1 //Note: Long.MaxValue won't trigger final evaluation!
engine.getEPRuntime.sendEvent(new CurrentTimeEvent(finalEventTimestamp))
Related
Here are two events: AppStartEvent and AppCrashEvent.
I need to count the occurrences of both events over a period of time, and then calculate count(AppStartEvent)/count(AppCrashEvent).
My EPL is here:
create context ctx4NestInCR
context ctx4Time initiated #now and pattern [every timer:interval(1 minute)] terminated after 15 minutes,
context ctx4AppName partition by appName from AppStartEvent, appName from AppCrashEvent
context ctx4NestInCR select count(s),count(c) from AppStartEvent as s, AppCrashEvent as c output last when terminated
And it does not work:
Error starting statement: Joins require that at least one view is specified for each stream, no view was specified for s
Your post doesn't have the join? It only has the context and that wouldn't produce the message. I would suggest to correct the post.
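For reference, the error itself means that every stream taking part in a join must declare a data window. A minimal sketch that removes the error (the 15-minute window is an assumption matching the context's termination period):

```
context ctx4NestInCR
select count(s), count(c)
from AppStartEvent#time(15 minutes) as s, AppCrashEvent#time(15 minutes) as c
output last when terminated;
```

Note, though, that a join aggregates over joined rows rather than individual events, which is one more reason merging the two streams into one is the simpler approach here.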
You can also join streams by merging the two streams and treating them as one.
insert into AppEvent select 'crash' as condition from AppCrashEvent;
insert into AppEvent select 'start' as condition from AppStartEvent;
select sum(case when condition='crash' then 1.0 else 0 end) / sum(case when condition='start' then 1.0 else 0 end) from AppEvent;
(A plain count(condition='crash') would count every row, since the boolean expression is never null, so a conditional sum is used instead.)
I have a fairly simple problem to model and I don't have experience with Esper, so I may be headed the wrong way; I'd like some insight.
Here's the scenario: I have one stream of events, "ParkingEvent", with two types of events, "SpotTaken" and "SpotFree". So I have an Esper context partitioned by id and bounded by a starting event of type "SpotTaken" and an end event of type "SpotFree". The idea is to monitor a parking spot with a sensor and then aggregate the data to count the number of times the spot has been taken, as well as the occupation time.
That's it, no time window or anything, so it seems quite simple, but I'm struggling to aggregate the data. Here's the code I have so far:
create context ParkingSpotOccupation
context PartionBySource
partition by source from SmartParkingEvent,
context ContextBorders
initiated by SmartParkingEvent(
type = "SpotTaken") as startEvent
terminated by SmartParkingEvent(
type = "SpotFree") as endEvent;
#Name("measurement_occupation")
context ParkingSpotOccupation
insert into CreateMeasurement
select
e.source as source,
"ParkingSpotOccupation" as type,
{
"startDate", min(e.time),
"endDate", max(e.time),
"duration", dateDifferenceInSec(max(e.time), min(e.time))
} as fragments
from
SmartParkingEvent e
output
snapshot when terminated;
I get the same data for min and max, so I'm guessing I'm doing something wrong.
When I use context.ContextBorders.startEvent.time and context.ContextBorders.endEvent.time instead of min and max, the measurement_occupation statement is not triggered.
Given that measurements have already been computed by the EPL that you provided, this counts the number of times the spot has been taken (and freed) and totals up the duration:
select source, count(*), sum(duration) from CreateMeasurement group by source
I am looking for an EPL statement that fires an event each time a certain value has increased by a specified amount, with any number of events in between. For example:
Considering a stream, which continuously provides new prices.
I want to get a notification, e.g. if the price is greater than the first price + 100. Something like:
select * from pattern[a=StockTick -> every b=StockTick(b.price>=a.price+100)];
But how do I get the next event(s) when the increase is >= 200, >= 300, and so forth?
Diverse tests with contexts and windows have not been successful so far, so I'd appreciate any help. Thanks!
The contexts would be the right way to go.
You could start by defining a start event like this:
create schema StartEvent(threshold int);
And then have context that uses the start event:
create context ThresholdContext initiated by StartEvent as se
terminated after 5 years
context ThresholdContext select * from pattern[a=StockTick -> every b=StockTick(b.price>=context.se.threshold)];
You can generate the StartEvent using "insert into" from the same pattern (probably want to remove the "every") or have the listener send in a StartEvent or declare another pattern that fires just once for creating a StartEvent.
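As a sketch of the "insert into" option (assuming StockTick carries a price field compatible with the threshold type, and that the first tick should seed the threshold; names are illustrative):

```
insert into StartEvent
select a.price + 100 as threshold
from pattern [a=StockTick];
```

Without "every", the pattern fires only once, for the first StockTick, so exactly one StartEvent is generated.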
I am a novice at using the Esper event stream engine in Java.
I have two input streams, one with states (device, state) and another with measures (device, temperature).
Is it possible to create a context to segment both streams by device?
I found an example in the documentation for you. It is at [1], under "4.2.2.1. Multiple Stream Definitions".
[1] http://esper.codehaus.org/esper-5.0.0/doc/reference/en-US/html_single/index.html#context_def_keyed
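Following that documentation section, a keyed context over both streams could be sketched like this (stream and property names assumed from the question):

```
create context SegmentedByDevice
partition by device from States, device from Measures;

context SegmentedByDevice
select device, avg(temperature) from Measures;
```

Within each partition, statements then see only the events of one device.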
You can also create a window in which each device appears only once.
For example:
create window devicesDetail.std:unique(device) as (device string, temperature long, state string)
insert into devicesDetail select device, 0 as temperature, state from stream1
insert into devicesDetail select device, temperature, '' as state from stream2
and then you do a query into that window
select irstream * from devicesDetail
I have a non-real-time Esper configuration where I feed in a stream that I read from a file. I'm trying to create an expression that computes a statistic over the entire stream and outputs one value at the very end. Esper has semantics for forcing a view to output every X seconds, for instance, but is there a semantic for asking the view or the engine to "flush" the output when you know there are no more events to feed?
It turns out that at least one way to do this is to use the output clause with a variable trigger.
The expression would be:
select count(*) as totalCount from events output last when OutputSummary = true
The OutputSummary variable would be initialized like so:
epConfiguration.addVariable("OutputSummary", Boolean.class, "false");
When you're ready to flush, set the variable to true like so:
epRuntime.setVariableValue("OutputSummary", true);
long currentTime = epService.getEPRuntime().getCurrentTime();
epRuntime.sendEvent(new CurrentTimeEvent(currentTime));
It's necessary to send another time event to force the expression to evaluate.
When output is required every 60 seconds, the expression would be:
select employee_id from employees output snapshot every 60 sec
And when output is required every 10000 events, the expression would be:
select employee_id from employees output snapshot every 10000 events