I have a Sumo Logic query where I'm taking a numeric field and summing it, but that field's value is in milliseconds, so I want to divide the field by 1000 to get the number in seconds.
parse "DownloadDuration=*," as DownloadTime | sum(downloadtime / 1000) as TotalDownloadTime
but Sumo Logic gives me an error: Parse error: ')' expected but '/' found. when I try to do this (even though their help docs seem to suggest this is totally legit).
I had to add another stage to the query to alter the field's value before summing it:
parse "DownloadDuration=*," as DownloadTime |
(downloadtime / 1000) as DownloadTime |
sum(downloadtime) as TotalDownloadTime
Works perfectly!
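Since sum(x / 1000) equals sum(x) / 1000, you could also divide once after aggregating. A minimal sketch, assuming Sumo Logic accepts a math expression on an aggregate's output field:
parse "DownloadDuration=*," as DownloadTime
| sum(downloadtime) as TotalDownloadTime
| TotalDownloadTime / 1000 as TotalDownloadTime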
I want to use Sumo Logic to count how often different APIs are called. I want a table with the API call name and its count. My current query is like this:
_sourceCategory="my_category"
| parse regex "GET.+443 (?<getUserByUserId>/user/v1/)\d+" nodrop
| parse regex "GET.+443 (?<getUserByUserNumber>/user/v1/userNumber)\d+"
| count by getUserByUserId, getUserByUserNumber
This gets the correct values, but they go into different columns. With more variables, the table becomes very wide and hard to read.
I figured it out: I need to use the same group name for all regexes. Like this:
_sourceCategory="my_category"
| parse regex "GET.+443 (?<endpoint>/user/v1/)\d+" nodrop
| parse regex "GET.+443 (?<endpoint>/user/v1/userNumber)\d+"
| count by endpoint
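One caveat, as far as I can tell: without nodrop on the second parse, messages that match only the first pattern get dropped. A sketch that keeps both, assuming Sumo Logic's isBlank function to discard messages matching neither pattern:
_sourceCategory="my_category"
| parse regex "GET.+443 (?<endpoint>/user/v1/)\d+" nodrop
| parse regex "GET.+443 (?<endpoint>/user/v1/userNumber)\d+" nodrop
// keep only messages where one of the patterns populated endpoint
| where !isBlank(endpoint)
| count by endpoint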
I am searching for a list of regexes in a Splunk alert like this:
... | regex "regex1|regex2|...|regexn"
Can I modify this query to get a table of the regexes found along with their counts? The table shouldn't show rows with 0 counts. For example:
regex2 17
regexn 3
The regex command merely filters events. All we know is that each result passed the regular expression; there is no record or indication of why or how any event passed.
To do that, you'd have to extract a unique field or value from each regex and then test the resulting events to see which field or value was present. The regex command, however, does not extract anything. You'd need the rex command or the match function to do that.
It looks like the | regex line is not needed. This works for me. Notice the extra brackets:
| rex max_match=0 "(?P<countfields>((regex1)|(regex2)|..|(regexn)))"
| stats count by countfields
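A concrete sketch with two hypothetical stand-in patterns (error 500 and connection timeout are my placeholders, not from the original alert). Because stats count by skips events where countfields was never extracted, rows with 0 counts never appear:
| rex max_match=0 "(?P<countfields>((error 500)|(connection timeout)))"
| stats count by countfields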
How do I exclude the Time (_messagetime) metadata field from my result set?
I've tried:
fields -_messagetime
But it gives me the error:
Field _messagetime not found, please check the spelling and try again.
Using:
fields -time
does not remove the field either.
Currently I'm getting around this by using an aggregate (count) that has no effect on the data.
[EDIT]
Here's an example: removing the Message (_raw) field works, but removing the Time (_messagetime) doesn't.
These results are used as email alerts, so removing the Time field from the Display isn't really an option.
The easiest way is to just turn off the field in the field browser window on the left-hand side of the results.
The other option is to aggregate and then remove the aggregate field - even if you just aggregate on _raw (which is the raw message):
_sourceCategory=blah
| count by _raw
| fields -_count
If you're still having trouble, can you share the rest of your query?
Edit based on your new query:
*
| parse "Description=\"*\"" as Description
| parse "Date=\"*\"" as Date
| count by Description, Date, Action
| fields -_count
The Time field is there as a result of the timeslice operation, as far as I'm aware. The following should do the trick:
| fields - _timeslice
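For context, a minimal sketch of the timeslice case that answer assumes (the source category and interval are placeholders):
_sourceCategory=blah
| timeslice 1h
| count by _timeslice, _sourceHost
| fields - _timeslice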
I'm trying to do a Sumo Logic search for logs matching the following regular expression:
"Authorization \d+ for story is not voided. Story not removed"
That is, the \d+ consists of one or more digits, but it doesn't matter what they are exactly.
Based on the search examples cheat sheet (https://help.sumologic.com/05Search/Search-Cheat-Sheets/General-Search-Examples-Cheat-Sheet), I've tried to use a * | parse regex pattern for this, but that doesn't work: I get a 'No capture group found in regex' error. I'm not actually interested in capturing the digits, though, just in matching the regular expression in my search. How can I achieve this?
I managed to get it to work in two ways. Firstly, using the regular parse instead of parse regex:
* | parse "Authorization * for story is not voided. Story not removed" as id |
count by _sourceHost | sort by _count
or, when using a regular expression, the capture group needs to be named:
* | parse regex "Authorization (?<id>\d+) for story is not voided. Story not removed" |
count by _sourceHost | sort by _count
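If you don't need the capture at all, a third sketch that should work, assuming the where ... matches operator with glob-style wildcards:
*
| where _raw matches "*Authorization * for story is not voided. Story not removed*"
| count by _sourceHost
| sort by _count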
I have this query:
Model.group(:p_id).pluck("AVG(desired)")
=> [0.77666666666666667e1, 0.431666666666666667e2, ...]
but when I run the SQL directly:
SELECT AVG(desired) AS desired
FROM model
GROUP BY p_id
I get:
|     desired      |
|------------------|
| 7.76666666666667 |
| 43.1666666666667 |
| ...              |
What is the reason for this? Sure, I could just multiply, but I bet there's an explanation.
I found that
Model.group(:p_id).pluck("AVG(desired)").map{|a| a.to_f}
=> [7.76666666666667,43.1666666666667, ...]
Now I'm struggling with another task: I need additional attributes in the pluck, so my query is:
Model.group(:p_id).pluck("p_id, AVG(desired)")
How do I get the correct AVG value in this case?
0.77666666666666667e1 is (almost) 7.76666666666667; they're the same number in two different representations with slightly different precision. If you dump the first one into irb, you'll see:
> 0.77666666666666667e1
=> 7.766666666666667
When you perform an avg in the database, the result has type numeric which ActiveRecord represents using Ruby's BigDecimal. The BigDecimal values are being displayed in scientific notation but that shouldn't make any difference when you format your data for display.
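A quick irb sketch of the two display forms (to_s and to_s("F") are standard BigDecimal methods):
require "bigdecimal"
avg = BigDecimal("0.77666666666666667e1")
avg.to_s        # default scientific form
=> "0.77666666666666667e1"
avg.to_s("F")   # plain decimal form
=> "7.7666666666666667"
avg.to_f        # native Float
=> 7.766666666666667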
In any case, pluck isn't the right tool for this job, you want to use average:
Model.group(:p_id).average(:desired)
That will give you a Hash which maps p_id to averages. You'll still get the averages in BigDecimals but that really shouldn't be a problem.
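For example, a sketch (transform_values is plain Ruby, if you want Floats instead of BigDecimals):
Model.group(:p_id).average(:desired)
=> {1=>0.77666666666666667e1, ...}
Model.group(:p_id).average(:desired).transform_values(&:to_f)
=> {1=>7.766666666666667, ...}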
Finally, I found a solution:
Model.group(:p_id).pluck("p_id, AVG(Round(desired))")
=> [[1,7.76666666666667],[2,43.1666666666667], ...]
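One caution, though: AVG(Round(desired)) rounds each row before averaging, which can change the result. To get the exact average as a Float, mapping over the pluck (plain Ruby) avoids that:
Model.group(:p_id).pluck("p_id, AVG(desired)").map { |pid, avg| [pid, avg.to_f] }
=> [[1, 7.766666666666667], ...]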