JIRA Plugins SDK: How to find out changed data?

I am using the JIRA plugins SDK to work on changed issues.
I've implemented an IssueListener and I have access to the Issue itself and the IssueEvent.
How do I find out which property (summary, description, estimation ...) of my Issue has been changed?

The changelog contains what has been changed, and the IssueEvent object has a method to get it (getChangeLog), which returns a GenericValue object.
This post on the Atlassian Answers site gives some code related to an Atlassian tutorial on how to write JIRA Event Listeners.
The relevant code snippet is shown below:
if (eventTypeId.equals(EventType.ISSUE_UPDATED_ID)) {
    List<GenericValue> changeItems = null;
    try {
        GenericValue changeLog = issueEvent.getChangeLog();
        changeItems = changeLog.internalDelegator.findByAnd("ChangeItem",
                EasyMap.build("group", changeLog.get("id")));
    } catch (GenericEntityException e) {
        log.error(e.getMessage());
    }
    if (changeItems != null) {
        log.info("number of changes: {}", changeItems.size());
        for (GenericValue changeItem : changeItems) {
            String field = changeItem.getString("field");
            String oldstring = changeItem.getString("oldstring");
            String newstring = changeItem.getString("newstring");
            StringBuilder fullstring = new StringBuilder();
            fullstring.append("Issue ");
            fullstring.append(issue.getKey());
            fullstring.append(" field ");
            fullstring.append(field);
            fullstring.append(" has been updated from ");
            fullstring.append(oldstring);
            fullstring.append(" to ");
            fullstring.append(newstring);
            log.info("changes {}", fullstring.toString());
            /* Do something here if a particular field you are
               looking for has been changed. */
            // Compare string contents with equals(), not ==.
            if ("Component".equals(field)) {
                changeAssignee(changeItem, issue, user);
            }
        }
    }
}

Related

TFS Azure 2017 workitem change event handler - branch relation

We have a plugin in TFS that notifies us when any work item changes. We can get work item details such as the id and the changeset list, as shown below, but we also need the branch name related to the work item.
Could you please help me: is there any way to get the related branch name when a work item changes in the code below?
Thanks
public EventNotificationStatus ProcessEvent(
    IVssRequestContext requestContext,
    NotificationType notificationType,
    object notificationEventArgs,
    out int statusCode,
    out string statusMessage,
    out ExceptionPropertyCollection properties)
{
    statusCode = 0;
    properties = null;
    statusMessage = String.Empty;
    try
    {
        if (notificationType == NotificationType.Notification && notificationEventArgs is WorkItemChangedEvent)
        {
            WorkItemChangedEvent ev = notificationEventArgs as WorkItemChangedEvent;
            int workItemId = Convert.ToInt32(ev.CoreFields.IntegerFields[0].NewValue);
            WriteLog("1WorkItemChangedEventHandler WorkItem " + ev.WorkItemTitle + " - Id " + workItemId.ToString() + " was modified");
        }
    }
    catch (Exception)
    {
    }
    return EventNotificationStatus.ActionPermitted;
}
You can use workItemStore.GetWorkItem(workItemId) to get all work item info. Check the similar question here: How can I get a reference to the TFS WorkItem in a WorkItemChangedEvent?
If you use Git, you can try the following example to get the branch reference:
int workitemId = YOUR_ID;
WorkItemStore wiStore = new WorkItemStore("{YOUR_URI}");
WorkItem wi = wiStore.GetWorkItem(workitemId);
foreach (Link link in wi.Links)
{
    if (link.BaseType == BaseLinkType.ExternalLink)
    {
        var brList = ((ExternalLink)link).LinkedArtifactUri.Split(new string[] { "%2F" }, StringSplitOptions.RemoveEmptyEntries).ToList();
        brList.RemoveAt(0); // remove project and repo guids
        brList.RemoveAt(0);
        brList[0] = brList[0].Replace("GB", ""); // remove GB prefix
        Console.WriteLine("Branch: {0}", string.Join("/", brList));
    }
}

How to get date-wise email faster using the Gmail API in ASP.NET MVC with an OAuth token

Here is the code I have written for the Gmail API to fetch mail with a date filter.
I am able to fetch the MessageId and ThreadId using the first API call. I put each messageId into a List, and in a foreach loop I send each one to the second API call to fetch the email body. But this process is very slow.
public async Task<ActionResult> DisplayEmailWithFilter(string fromDate, string toDate)
{
    Message messageObj = new Message();
    Example exampleObj = new Example();
    List<GmailMessage> gmailMessagesList = new List<GmailMessage>();
    GmailMessage gmailMessage = new GmailMessage();
    var responseData = "";
    // dateFilter string parameter created from the date values
    string dateFilter = "in:Inbox after:" + fromDate + " before:" + toDate;
    try
    {
        // calling the Gmail API to get MessageID details filtered by date
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(scheme: "Bearer",
            parameter: Session["Token"].ToString());
        HttpResponseMessage responseMessage = await client.GetAsync("https://www.googleapis.com/gmail/v1/users/me/messages?q=" + dateFilter);
        try
        {
            responseData = await responseMessage.Content.ReadAsStringAsync();
            // convert this JSON data into a list object
            var msgList = JsonConvert.DeserializeObject<Root1>(responseData);
            // loop fetching the email message data by MessageID
            if (msgList.resultSizeEstimate != 0)
            {
                foreach (var msgItem in msgList.messages)
                {
                    messageObj.id = msgItem.id;
                    // calling the API with the message id in the path to fetch that message's data
                    HttpResponseMessage responseMessageList = await client.GetAsync("https://www.googleapis.com/gmail/v1/users/me/messages/" + messageObj.id + "?format=full");
                    if (responseMessageList.IsSuccessStatusCode)
                    {
                        var responseDataNew = await responseMessageList.Content.ReadAsStringAsync();
                        // converting the JSON string into an object
                        exampleObj = JsonConvert.DeserializeObject<Example>(responseDataNew);
                        gmailMessage.Body = exampleObj.snippet;
                        // fetching header values by comparing header names
                        for (int i = 0; i < exampleObj.payload.headers.Count; i++)
                        {
                            string name = exampleObj.payload.headers[i].name;
                            string value = exampleObj.payload.headers[i].value;
                            if (name == "Date")
                            {
                                gmailMessage.RecievedDate = value;
                            }
                            if (name == "Subject")
                            {
                                gmailMessage.Subject = value;
                            }
                            if (name == "Message-ID")
                            {
                                gmailMessage.SenderEmailID = value;
                            }
                            if (name == "From")
                            {
                                gmailMessage.SenderName = value;
                            }
                        }
                        // adding these values to the GmailMessage list
                        gmailMessagesList.Add(
                            new GmailMessage
                            {
                                Body = exampleObj.snippet,
                                SenderEmailID = gmailMessage.SenderEmailID,
                                RecievedDate = gmailMessage.RecievedDate,
                                SenderName = gmailMessage.SenderName,
                                Subject = gmailMessage.Subject,
                            });
                    }
                }
            }
        }
        catch (Exception e)
        {
            string errorMgs = e.Message;
            throw;
        }
    }
    catch (Exception e)
    {
        string errorMgs = e.Message;
        throw;
    }
    return View(gmailMessagesList);
}
I can fetch Gmail emails date-wise, but it takes a long time. How can I improve my code and make it faster?
The query seems like the most you can do. If you know more about those emails, such as specific subjects, or if they always come from the same sender, you can try to filter on that too, just as you would in the Gmail interface.
Otherwise you are mostly out of luck; you are limited by the messages retrieved from Users.messages.list.
If you need to escape the API limitations, retrieving the messages another way may be the correct way to go. Consider writing a small program that retrieves messages over the IMAP protocol. Several questions on this topic may help you:
Reading Gmail messages using Python IMAP
Reading Gmail Email in Python
How can I get an email message's text content using Python?
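As a rough illustration of the IMAP route, here is a minimal C# sketch assuming the MailKit library and a Gmail app password; both are assumptions (the linked questions show Python equivalents), so adapt it to your own auth setup:
using System;
using MailKit;
using MailKit.Net.Imap;
using MailKit.Search;

class ImapDateFilterExample
{
    static void Main()
    {
        using (var client = new ImapClient())
        {
            client.Connect("imap.gmail.com", 993, useSsl: true);
            // Placeholders: use your address and an app password
            // (or a SaslMechanismOAuth2 token for OAuth).
            client.Authenticate("user@gmail.com", "app-password");

            client.Inbox.Open(FolderAccess.ReadOnly);

            // Server-side date filter, roughly the after:/before: query.
            var query = SearchQuery.DeliveredAfter(new DateTime(2021, 1, 1))
                .And(SearchQuery.DeliveredBefore(new DateTime(2021, 2, 1)));

            foreach (var uid in client.Inbox.Search(query))
            {
                var message = client.Inbox.GetMessage(uid);
                Console.WriteLine("{0:d}  {1}", message.Date, message.Subject);
            }

            client.Disconnect(true);
        }
    }
}
Unlike the REST flow above, the headers and body arrive in one round trip per message, which is usually where the time goes.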

Large file upload to ASP.NET Core 3.0 Web API fails due to Request Body Too Large

I have an ASP.NET Core 3.0 Web API endpoint that I have set up to allow me to post large audio files. I have followed the following directions from MS docs to set up the endpoint.
https://learn.microsoft.com/en-us/aspnet/core/mvc/models/file-uploads?view=aspnetcore-3.0#kestrel-maximum-request-body-size
When an audio file is uploaded to the endpoint, it is streamed to an Azure Blob Storage container.
My code works as expected locally.
When I push it to my production server, an Azure App Service on Linux, the code does not work and errors with
Unhandled exception in request pipeline: System.Net.Http.HttpRequestException: An error occurred while sending the request. ---> Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException: Request body too large.
Per advice from the above article, I have incrementally updated Kestrel with the following configuration:
.ConfigureWebHostDefaults(webBuilder =>
{
    webBuilder.UseKestrel((ctx, options) =>
    {
        var config = ctx.Configuration;
        options.Limits.MaxRequestBodySize = 6000000000;
        options.Limits.MinRequestBodyDataRate =
            new MinDataRate(bytesPerSecond: 100,
                gracePeriod: TimeSpan.FromSeconds(10));
        options.Limits.MinResponseDataRate =
            new MinDataRate(bytesPerSecond: 100,
                gracePeriod: TimeSpan.FromSeconds(10));
        options.Limits.RequestHeadersTimeout =
            TimeSpan.FromMinutes(2);
    }).UseStartup<Startup>();
});
I also configured FormOptions to accept files up to 6,000,000,000 bytes:
services.Configure<FormOptions>(options =>
{
    options.MultipartBodyLengthLimit = 6000000000;
});
I also set up the API controller with the following attributes, per advice from the article:
[HttpPost("audio", Name="UploadAudio")]
[DisableFormValueModelBinding]
[GenerateAntiforgeryTokenCookie]
[RequestSizeLimit(6000000000)]
[RequestFormLimits(MultipartBodyLengthLimit = 6000000000)]
Finally, here is the action itself. This giant block of code is not indicative of how I want the code to be written, but I have merged it into one method as part of the debugging exercise.
public async Task<IActionResult> Audio()
{
    if (!MultipartRequestHelper.IsMultipartContentType(Request.ContentType))
    {
        throw new ArgumentException("The media file could not be processed.");
    }
    string mediaId = string.Empty;
    string instructorId = string.Empty;
    try
    {
        // process file first
        KeyValueAccumulator formAccumulator = new KeyValueAccumulator();
        var streamedFileContent = new byte[0];
        var boundary = MultipartRequestHelper.GetBoundary(
            MediaTypeHeaderValue.Parse(Request.ContentType),
            _defaultFormOptions.MultipartBoundaryLengthLimit
        );
        var reader = new MultipartReader(boundary, Request.Body);
        var section = await reader.ReadNextSectionAsync();
        while (section != null)
        {
            var hasContentDispositionHeader = ContentDispositionHeaderValue.TryParse(
                section.ContentDisposition, out var contentDisposition);
            if (hasContentDispositionHeader)
            {
                if (MultipartRequestHelper
                    .HasFileContentDisposition(contentDisposition))
                {
                    streamedFileContent =
                        await FileHelpers.ProcessStreamedFile(section, contentDisposition,
                            _permittedExtensions, _fileSizeLimit);
                }
                else if (MultipartRequestHelper
                    .HasFormDataContentDisposition(contentDisposition))
                {
                    var key = HeaderUtilities.RemoveQuotes(contentDisposition.Name).Value;
                    var encoding = FileHelpers.GetEncoding(section);
                    if (encoding == null)
                    {
                        return BadRequest($"The request could not be processed: Bad Encoding");
                    }
                    using (var streamReader = new StreamReader(
                        section.Body,
                        encoding,
                        detectEncodingFromByteOrderMarks: true,
                        bufferSize: 1024,
                        leaveOpen: true))
                    {
                        // The value length limit is enforced by
                        // MultipartBodyLengthLimit
                        var value = await streamReader.ReadToEndAsync();
                        if (string.Equals(value, "undefined",
                            StringComparison.OrdinalIgnoreCase))
                        {
                            value = string.Empty;
                        }
                        formAccumulator.Append(key, value);
                        if (formAccumulator.ValueCount >
                            _defaultFormOptions.ValueCountLimit)
                        {
                            return BadRequest($"The request could not be processed: Key Count limit exceeded.");
                        }
                    }
                }
            }
            // Drain any remaining section body that hasn't been consumed and
            // read the headers for the next section.
            section = await reader.ReadNextSectionAsync();
        }
        var form = formAccumulator;
        var file = streamedFileContent;
        var results = form.GetResults();
        instructorId = results["instructorId"];
        string title = results["title"];
        string firstName = results["firstName"];
        string lastName = results["lastName"];
        string durationInMinutes = results["durationInMinutes"];
        //mediaId = await AddInstructorAudioMedia(instructorId, firstName, lastName, title, Convert.ToInt32(duration), DateTime.UtcNow, DateTime.UtcNow, file);
        string fileExtension = "m4a";
        // Generate container name - instructor specific
        string containerName = $"{firstName[0].ToString().ToLower()}{lastName.ToLower()}-{instructorId}";
        string contentType = "audio/mp4";
        FileType fileType = FileType.audio;
        string authorName = $"{firstName} {lastName}";
        string authorShortName = $"{firstName[0]}{lastName}";
        string description = $"{authorShortName} - {title}";
        long duration = (Convert.ToInt32(durationInMinutes) * 60000);
        // Generate new filename
        string fileName = $"{firstName[0].ToString().ToLower()}{lastName.ToLower()}-{Guid.NewGuid()}";
        DateTime recordingDate = DateTime.UtcNow;
        DateTime uploadDate = DateTime.UtcNow;
        long blobSize = long.MinValue;
        try
        {
            // Update file properties in storage
            Dictionary<string, string> fileProperties = new Dictionary<string, string>();
            fileProperties.Add("ContentType", contentType);
            // update file metadata in storage
            Dictionary<string, string> metadata = new Dictionary<string, string>();
            metadata.Add("author", authorShortName);
            metadata.Add("title", title);
            metadata.Add("description", description);
            metadata.Add("duration", duration.ToString());
            metadata.Add("recordingDate", recordingDate.ToString());
            metadata.Add("uploadDate", uploadDate.ToString());
            var fileNameWExt = $"{fileName}.{fileExtension}";
            var blobContainer = await _cloudStorageService.CreateBlob(containerName, fileNameWExt, "audio");
            try
            {
                using (MemoryStream fileContent = new MemoryStream(streamedFileContent))
                {
                    fileContent.Position = 0;
                    await blobContainer.UploadFromStreamAsync(fileContent);
                }
            }
            catch (StorageException e)
            {
                return BadRequest(e.Message);
            }
            try
            {
                foreach (var key in metadata.Keys.ToList())
                {
                    blobContainer.Metadata.Add(key, metadata[key]);
                }
                await blobContainer.SetMetadataAsync();
            }
            catch (StorageException e)
            {
                return BadRequest(e.Message);
            }
            blobSize = await StorageUtils.GetBlobSize(blobContainer);
        }
        catch (StorageException e)
        {
            return BadRequest(e.Message);
        }
        Media media = Media.Create(string.Empty, instructorId, authorName, fileName, fileType, fileExtension, recordingDate, uploadDate, ContentDetails.Create(title, description, duration, blobSize, 0, new List<string>()), StateDetails.Create(StatusType.STAGED, DateTime.MinValue, DateTime.UtcNow, DateTime.MaxValue), Manifest.Create(new Dictionary<string, string>()));
        // upload to MongoDB
        if (media != null)
        {
            var mapper = new Mapper(_mapperConfiguration);
            var dao = mapper.Map<ContentDAO>(media);
            try
            {
                await _db.Content.InsertOneAsync(dao);
                mediaId = dao.Id.ToString();
            }
            catch (Exception)
            {
                mediaId = string.Empty;
            }
        }
        else
        {
            // metadata wasn't stored, remove blob
            await _cloudStorageService.DeleteBlob(containerName, fileName, "audio");
            return BadRequest($"An issue occurred during media upload: rolling back storage change");
        }
        if (string.IsNullOrEmpty(mediaId))
        {
            return BadRequest($"Could not add instructor media");
        }
    }
    catch (Exception ex)
    {
        return BadRequest(ex.Message);
    }
    var result = new { MediaId = mediaId, InstructorId = instructorId };
    return Ok(result);
}
I reiterate, this all works great locally. I do not run it in IIS Express; I run it as a console app.
I submit large audio files via my SPA app and Postman and it works perfectly.
I am deploying this code to an Azure App Service on Linux (as a Basic B1).
Since the code works in my local development environment, I am at a loss as to what my next steps are. I have refactored this code a few times, but I suspect the problem is environment related.
I cannot find anywhere that mentions that the level of App Service Plan is the culprit so before I go out spending more money I wanted to see if anyone here had encountered this challenge and could provide advice.
UPDATE: I attempted upgrading to a Production App Service Plan to see if there was an undocumented gate on incoming traffic. Upgrading didn't help either.
Thanks in advance.
-A
Currently, as of 11/2019, there is a limitation with the Azure App Service for Linux. Its CORS functionality is enabled by default and cannot be disabled, AND it has a file size limitation that doesn't appear to be overridden by any of the published Kestrel configurations. The solution is to move the Web API app to an Azure App Service for Windows, where it works as expected.
I am sure there is some way to get around it if you know the magic combination of configurations, server settings, and CLI commands, but I need to move on with development.

Is it possible to subscribe to a statement of an EPL module?

I have deployed an EPL module with this code:
InputStream inputFile = this.getClass().getClassLoader().getResourceAsStream("Temperature.epl");
if (inputFile == null) {
    inputFile = this.getClass().getClassLoader().getResourceAsStream("etc/Temperature.epl");
}
if (inputFile == null) {
    throw new RuntimeException("Failed to find file 'Temperature.epl' in classpath or relative to classpath");
}
try {
    epService.getEPAdministrator().getDeploymentAdmin().readDeploy(inputFile, null, null, null);
    // subscribers OK, tested before with epService.getEPAdministrator().createEPL()
    // statements OK, printed
    EPStatement statement;
    statement = epService.getEPAdministrator().getStatement("Monitor");
    System.out.println(statement.getText() + ";");
    statement.setSubscriber(new MonitorEventSubscriber());
    statement = epService.getEPAdministrator().getStatement("Warning");
    System.out.println(statement.getText() + ";");
    statement.setSubscriber(new WarningEventSubscriber());
    statement = epService.getEPAdministrator().getStatement("Error");
    System.out.println(statement.getText() + ";");
    statement.setSubscriber(new ErrorEventSubscriber());
} catch (Exception e) {
    throw new RuntimeException("Error deploying EPL from 'Temperature.epl': " + e.getMessage(), e);
}
I can get each statement's text via statement.getText(), but the subscribers are never invoked. What is wrong?
I'm working with Esper 5.0.0.
Seeing that your code uses the current classloader, you'd want to make sure the classloader is the same everywhere, else you can get different engine instances.
Also have your code actually send an event (via epService.getEPRuntime().sendEvent(...)) to see whether it matches, since this code only deploys the module and attaches subscribers but never sends any events.

OData Service not returning complete response

I am reading SharePoint list data (>20000 entries) using the OData RESTful service, as detailed here: http://blogs.msdn.com/b/ericwhite/archive/2010/12/09/getting-started-using-the-odata-rest-api-to-query-a-sharepoint-list.aspx
I am able to read data, but I only get the first 1000 records. I also checked that List View Throttling is set to 5000 on the SharePoint server. Kindly advise.
Update:
@Turker: Your answer is spot on!! Thank you very much. I was able to get the first 2000 records in the first iteration. However, I am getting the same records in each iteration of the while loop. My code is as follows:
...initial code...
int skipCount = 0;
while (((QueryOperationResponse)query).GetContinuation() != null)
{
    // query for the next partial set of customers
    query = dc.Execute<CATrackingItem>(
        ((QueryOperationResponse)query).GetContinuation().NextLinkUri
    );
    // Add the next set of customers to the full list
    caList.AddRange(query.ToList());
    var results = from d in caList.Skip(skipCount)
                  select new
                  {
                      Actionable = d.Actionable,
                      Created = d.Created,
                  };
    foreach (var res in results)
    {
        structListColumns.Actionable = res.Actionable;
        structListColumns.Created = res.Created;
    }
    skipCount = caList.Count;
} // end of while loop
Do you see a <link rel="next"> element at the end of the feed?
For example, if you look at
http://services.odata.org/Northwind/Northwind.svc/Customers/
you will see
<link rel="next" href="http://services.odata.org/Northwind/Northwind.svc/Customers/?$skiptoken='ERNSH'" />
at the end of the feed which means the service is implementing server side paging and you need to send the
http://services.odata.org/Northwind/Northwind.svc/Customers/?$skiptoken='ERNSH'
query to get the next set of results.
I don't see anything particularly wrong with your code. You can try to dump the URLs being requested (either from the code, or using something like Fiddler) to see if the client really sends the same queries (and thus gets the same responses).
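To dump the URLs from code, the classic WCF Data Services client raises an event for each outgoing request; a minimal sketch, assuming the same DataServiceContext (named ctx here) used below:
// Log every URL the client sends, to verify that the continuation
// ($skiptoken link) actually changes between iterations.
ctx.SendingRequest += (sender, e) =>
{
    Console.WriteLine("Requesting: {0}", e.Request.RequestUri);
};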
In any case, here is a sample code which does work (using the sample service):
DataServiceContext ctx = new DataServiceContext(new Uri("http://services.odata.org/Northwind/Northwind.svc"));
QueryOperationResponse<Customer> response = (QueryOperationResponse<Customer>)ctx.CreateQuery<Customer>("Customers").Execute();
do
{
    foreach (Customer c in response)
    {
        Console.WriteLine(c.CustomerID);
    }
    DataServiceQueryContinuation<Customer> continuation = response.GetContinuation();
    if (continuation != null)
    {
        response = ctx.Execute(continuation);
    }
    else
    {
        response = null;
    }
} while (response != null);
I had the same problem and wanted a generic solution, so I've extended DataServiceContext with a GetAlltems method.
public static List<T> GetAlltems<T>(this DataServiceContext context)
{
    return context.GetAlltems<T>(null);
}

public static List<T> GetAlltems<T>(this DataServiceContext context, IQueryable<T> queryable)
{
    List<T> allItems = new List<T>();
    DataServiceQueryContinuation<T> token = null;
    EntitySetAttribute attr = (EntitySetAttribute)typeof(T).GetCustomAttributes(typeof(EntitySetAttribute), false).First();
    // Execute the query for all customers and get the response object.
    DataServiceQuery<T> query = null;
    if (queryable == null)
    {
        query = context.CreateQuery<T>(attr.EntitySet);
    }
    else
    {
        query = (DataServiceQuery<T>)queryable;
    }
    QueryOperationResponse<T> response = query.Execute() as QueryOperationResponse<T>;
    // With a paged response from the service, use a do...while loop
    // to enumerate the results before getting the next link.
    do
    {
        // If nextLink is not null, then there is a new page to load.
        if (token != null)
        {
            // Load the new page from the next link URI.
            response = context.Execute<T>(token);
        }
        allItems.AddRange(response);
    }
    // Get the next link, and continue while there is a next link.
    while ((token = response.GetContinuation()) != null);
    return allItems;
}
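Usage then looks something like this (a sketch reusing the Northwind service and the Customer type from the earlier sample; both are assumptions about your setup):
// Fetches every page behind the scenes and returns the combined list.
DataServiceContext ctx = new DataServiceContext(
    new Uri("http://services.odata.org/Northwind/Northwind.svc"));
List<Customer> allCustomers = ctx.GetAlltems<Customer>();
Console.WriteLine("Total customers: {0}", allCustomers.Count);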
