I have a requirement to run an application through my MVC controller. To get the installation path I used the following link (I used the answer provided by Fredrik Mörk). It worked, and I was able to run the exe through a process. The problem occurred when I deployed this solution on IIS, where it did not create the process as it did in my local dev environment. Can anybody tell me how to create a Windows process from a solution that is hosted on IIS?
private string GetPathForExe(string fileName)
{
    // Registry key under which the application's install path is stored.
    const string keyBase = @"SOFTWARE\Wow6432Node\MyApplication";
    RegistryKey localMachine = Registry.LocalMachine;
    RegistryKey fileKey = localMachine.OpenSubKey(string.Format(@"{0}\{1}", keyBase, fileName));
    object result = null;
    if (fileKey != null)
    {
        result = fileKey.GetValue("InstallPath");
        fileKey.Close();
    }
    return (string)result;
}
public void StartMyApplication()
{
    Process[] pname = Process.GetProcessesByName("MyApplication");
    if (pname.Length == 0)
    {
        string appDirectory = GetPathForExe("MyApplication");
        Directory.SetCurrentDirectory(appDirectory);
        ProcessStartInfo procStartInfo = new ProcessStartInfo("MyApplication.exe");
        procStartInfo.WindowStyle = ProcessWindowStyle.Hidden;
        Process proc = new Process();
        proc.StartInfo = procStartInfo;
        proc.Start();
    }
}
I am creating an Azure Functions application to validate XML files using a zip folder of Schematron files.
I have run into a compatibility issue with how the URIs for the files are being created on macOS versus Windows.
The files are downloaded from a zip on Azure Blob Storage and then extracted to the function's local storage.
When a colleague runs the transform method of the Saxon CS API on a Windows machine, the method is able to run the first transformation and produce the stage1.out file; however, on the second transformation the transform method throws an exception stating that it cannot find the file, even though it is present in the temp directory.
On macOS the URI is /var/folders/6_/3x594vpn6z1fjclc0vx4v89m0000gn/T and on Windows it is trying to find it at file:///C:/Users/44741/AppData/Local/Temp/, but the library is unable to find the file on the Windows machine even if it is moved out of temp storage.
Unable to retrieve URI file:///C:/Users/44741/Desktop/files/stage1.out
The file is present at this location, but for some reason the library cannot pick it up on the Windows machine, while it works fine on my Mac. I am using Path.Combine to build the URI.
Has anyone else run into this issue before?
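For reference, here is a minimal illustration (not my actual code; the paths are placeholders) of building an absolute file:// URI from a combined path, which behaves the same on Windows and macOS:
// Hypothetical example: turn an OS-native combined path into an absolute file:// URI.
string tempDir = Path.GetTempPath();
string stage1Path = Path.Combine(tempDir, "stage1.out"); // "C:\...\Temp\stage1.out" on Windows, "/var/folders/.../stage1.out" on macOS
Uri stage1Uri = new Uri(Path.GetFullPath(stage1Path));   // yields a file:///... URI on both platforms
Console.WriteLine(stage1Uri.AbsoluteUri);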
The code being used for the transformations is below.
{
try
{
var transform = new Transform();
transform.doTransform(GetTransformArguments(arguments[Constants.InStage1File],
arguments[Constants.SourceDir] + "/" + schematronFile, arguments[Constants.Stage1Out]));
transform.doTransform(GetTransformArguments(arguments[Constants.InStage2File], arguments[Constants.Stage1Out],
arguments[Constants.Stage2Out]));
transform.doTransform(GetFinalTransformArguments(arguments[Constants.InStage3File], arguments[Constants.Stage2Out],
arguments[Constants.Stage3Out]));
Log.Information("Stage 3 out file written to : " + arguments[Constants.Stage3Out]);;
return true;
}
catch (FileNotFoundException ex)
{
Log.Warning("Cannot find files" + ex);
return false;
}
}
private static string[] GetTransformArguments(string xslFile, string inputFile, string outputFile)
{
return new[]
{
"-xsl:" + xslFile,
"-s:" + inputFile,
"-o:" + outputFile
};
}
private static string[] GetFinalTransformArguments(string xslFile, string inputFile, string outputFile)
{
return new[]
{
"-xsl:" + xslFile,
"-s:" + inputFile,
"-o:" + outputFile,
"allow-foreign=true",
"generate-fired-rule=true"
};
}```
So, assuming the intermediary results are not needed as files and you just want the final result (I assume that is the Schematron schema compiled to XSLT), you could try to run XSLT 3.0 through the SaxonCS API (Saxon.Api) by compiling and chaining your three stylesheets, e.g.:
using Saxon.Api;
string isoSchematronDir = @"C:\SomePath\SomeDir\iso-schematron-xslt2";
string[] isoSchematronXslts = { "iso_dsdl_include.xsl", "iso_abstract_expand.xsl", "iso_svrl_for_xslt2.xsl" };
Processor processor = new(true);
var xsltCompiler = processor.NewXsltCompiler();
var baseUri = new Uri(Path.Combine(isoSchematronDir, isoSchematronXslts[2]));
xsltCompiler.BaseUri = baseUri;
var isoSchematronStages = isoSchematronXslts.Select(xslt => xsltCompiler.Compile(new Uri(baseUri, xslt)).Load30()).ToList();
isoSchematronStages[2].SetStylesheetParameters(new Dictionary<QName, XdmValue>() { { new QName("allow-foreign"), new XdmAtomicValue(true) } });
using (var schematronIs = File.OpenRead("price.sch"))
{
using (var compiledOs = File.OpenWrite("price.sch.xsl"))
{
isoSchematronStages[0].ApplyTemplates(
schematronIs,
isoSchematronStages[1].AsDocumentDestination(
isoSchematronStages[2].AsDocumentDestination(processor.NewSerializer(compiledOs))
)
);
}
}
If you only need the compiled Schematron in order to validate an XML instance document against it, you could even store the compiled stylesheet in an XdmDestination and feed its XdmNode to the XsltCompiler, e.g.:
using Saxon.Api;
string isoSchematronDir = @"C:\SomePath\SomeDir\iso-schematron-xslt2";
string[] isoSchematronXslts = { "iso_dsdl_include.xsl", "iso_abstract_expand.xsl", "iso_svrl_for_xslt2.xsl" };
Processor processor = new(true);
var xsltCompiler = processor.NewXsltCompiler();
var baseUri = new Uri(Path.Combine(isoSchematronDir, isoSchematronXslts[2]));
xsltCompiler.BaseUri = baseUri;
var isoSchematronStages = isoSchematronXslts.Select(xslt => xsltCompiler.Compile(new Uri(baseUri, xslt)).Load30()).ToList();
isoSchematronStages[2].SetStylesheetParameters(new Dictionary<QName, XdmValue>() { { new QName("allow-foreign"), new XdmAtomicValue(true) } });
var compiledSchematronXslt = new XdmDestination();
using (var schematronIs = File.OpenRead("price.sch"))
{
isoSchematronStages[0].ApplyTemplates(
schematronIs,
isoSchematronStages[1].AsDocumentDestination(
isoSchematronStages[2].AsDocumentDestination(compiledSchematronXslt)
)
);
}
var schematronValidator = xsltCompiler.Compile(compiledSchematronXslt.XdmNode).Load30();
using (var sampleIs = File.OpenRead("books.xml"))
{
schematronValidator.ApplyTemplates(sampleIs, processor.NewSerializer(Console.Out));
}
The last example writes the XSLT/Schematron validation SVRL output to the console but could of course also write it to a file.
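If you want the SVRL report in a file instead, a small sketch (the output file name here is just an example) is to serialize to a stream:
using (var sampleIs = File.OpenRead("books.xml"))
using (var reportOs = File.Create("books-report.svrl"))
{
    // Serialize the SVRL validation report to the file stream instead of Console.Out.
    schematronValidator.ApplyTemplates(sampleIs, processor.NewSerializer(reportOs));
}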
I have an ASP.NET Core 3.0 Web API endpoint that I have set up to allow me to post large audio files. I followed these directions from the MS docs to set up the endpoint:
https://learn.microsoft.com/en-us/aspnet/core/mvc/models/file-uploads?view=aspnetcore-3.0#kestrel-maximum-request-body-size
When an audio file is uploaded to the endpoint, it is streamed to an Azure Blob Storage container.
My code works as expected locally.
When I push it to my production server in Azure App Service on Linux, the code does not work and errors with
Unhandled exception in request pipeline: System.Net.Http.HttpRequestException: An error occurred while sending the request. ---> Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException: Request body too large.
Per advice from the above article, I have incrementally updated the Kestrel configuration with the following:
.ConfigureWebHostDefaults(webBuilder =>
{
webBuilder.UseKestrel((ctx, options) =>
{
var config = ctx.Configuration;
options.Limits.MaxRequestBodySize = 6000000000;
options.Limits.MinRequestBodyDataRate =
new MinDataRate(bytesPerSecond: 100,
gracePeriod: TimeSpan.FromSeconds(10));
options.Limits.MinResponseDataRate =
new MinDataRate(bytesPerSecond: 100,
gracePeriod: TimeSpan.FromSeconds(10));
options.Limits.RequestHeadersTimeout =
TimeSpan.FromMinutes(2);
}).UseStartup<Startup>();
});
I also configured FormOptions to accept files up to 6000000000 bytes:
services.Configure<FormOptions>(options =>
{
options.MultipartBodyLengthLimit = 6000000000;
});
I also set up the API controller with the following attributes, per advice from the article:
[HttpPost("audio", Name="UploadAudio")]
[DisableFormValueModelBinding]
[GenerateAntiforgeryTokenCookie]
[RequestSizeLimit(6000000000)]
[RequestFormLimits(MultipartBodyLengthLimit = 6000000000)]
Finally, here is the action itself. This giant block of code is not indicative of how I want the code to be written but I have merged it into one method as part of the debugging exercise.
public async Task<IActionResult> Audio()
{
if (!MultipartRequestHelper.IsMultipartContentType(Request.ContentType))
{
throw new ArgumentException("The media file could not be processed.");
}
string mediaId = string.Empty;
string instructorId = string.Empty;
try
{
// process file first
KeyValueAccumulator formAccumulator = new KeyValueAccumulator();
var streamedFileContent = new byte[0];
var boundary = MultipartRequestHelper.GetBoundary(
MediaTypeHeaderValue.Parse(Request.ContentType),
_defaultFormOptions.MultipartBoundaryLengthLimit
);
var reader = new MultipartReader(boundary, Request.Body);
var section = await reader.ReadNextSectionAsync();
while (section != null)
{
var hasContentDispositionHeader = ContentDispositionHeaderValue.TryParse(
section.ContentDisposition, out var contentDisposition);
if (hasContentDispositionHeader)
{
if (MultipartRequestHelper
.HasFileContentDisposition(contentDisposition))
{
streamedFileContent =
await FileHelpers.ProcessStreamedFile(section, contentDisposition,
_permittedExtensions, _fileSizeLimit);
}
else if (MultipartRequestHelper
.HasFormDataContentDisposition(contentDisposition))
{
var key = HeaderUtilities.RemoveQuotes(contentDisposition.Name).Value;
var encoding = FileHelpers.GetEncoding(section);
if (encoding == null)
{
return BadRequest($"The request could not be processed: Bad Encoding");
}
using (var streamReader = new StreamReader(
section.Body,
encoding,
detectEncodingFromByteOrderMarks: true,
bufferSize: 1024,
leaveOpen: true))
{
// The value length limit is enforced by
// MultipartBodyLengthLimit
var value = await streamReader.ReadToEndAsync();
if (string.Equals(value, "undefined",
StringComparison.OrdinalIgnoreCase))
{
value = string.Empty;
}
formAccumulator.Append(key, value);
if (formAccumulator.ValueCount >
_defaultFormOptions.ValueCountLimit)
{
return BadRequest($"The request could not be processed: Key Count limit exceeded.");
}
}
}
}
// Drain any remaining section body that hasn't been consumed and
// read the headers for the next section.
section = await reader.ReadNextSectionAsync();
}
var form = formAccumulator;
var file = streamedFileContent;
var results = form.GetResults();
instructorId = results["instructorId"];
string title = results["title"];
string firstName = results["firstName"];
string lastName = results["lastName"];
string durationInMinutes = results["durationInMinutes"];
//mediaId = await AddInstructorAudioMedia(instructorId, firstName, lastName, title, Convert.ToInt32(duration), DateTime.UtcNow, DateTime.UtcNow, file);
string fileExtension = "m4a";
// Generate Container Name - InstructorSpecific
string containerName = $"{firstName[0].ToString().ToLower()}{lastName.ToLower()}-{instructorId}";
string contentType = "audio/mp4";
FileType fileType = FileType.audio;
string authorName = $"{firstName} {lastName}";
string authorShortName = $"{firstName[0]}{lastName}";
string description = $"{authorShortName} - {title}";
long duration = (Convert.ToInt32(durationInMinutes) * 60000);
// Generate new filename
string fileName = $"{firstName[0].ToString().ToLower()}{lastName.ToLower()}-{Guid.NewGuid()}";
DateTime recordingDate = DateTime.UtcNow;
DateTime uploadDate = DateTime.UtcNow;
long blobSize = long.MinValue;
try
{
// Update file properties in storage
Dictionary<string, string> fileProperties = new Dictionary<string, string>();
fileProperties.Add("ContentType", contentType);
// update file metadata in storage
Dictionary<string, string> metadata = new Dictionary<string, string>();
metadata.Add("author", authorShortName);
metadata.Add("tite", title);
metadata.Add("description", description);
metadata.Add("duration", duration.ToString());
metadata.Add("recordingDate", recordingDate.ToString());
metadata.Add("uploadDate", uploadDate.ToString());
var fileNameWExt = $"{fileName}.{fileExtension}";
var blobContainer = await _cloudStorageService.CreateBlob(containerName, fileNameWExt, "audio");
try
{
MemoryStream fileContent = new MemoryStream(streamedFileContent);
fileContent.Position = 0;
using (fileContent)
{
await blobContainer.UploadFromStreamAsync(fileContent);
}
}
catch (StorageException e)
{
if (e.RequestInformation.HttpStatusCode == 403)
{
return BadRequest(e.Message);
}
else
{
return BadRequest(e.Message);
}
}
try
{
foreach (var key in metadata.Keys.ToList())
{
blobContainer.Metadata.Add(key, metadata[key]);
}
await blobContainer.SetMetadataAsync();
}
catch (StorageException e)
{
return BadRequest(e.Message);
}
blobSize = await StorageUtils.GetBlobSize(blobContainer);
}
catch (StorageException e)
{
return BadRequest(e.Message);
}
Media media = Media.Create(string.Empty, instructorId, authorName, fileName, fileType, fileExtension,
    recordingDate, uploadDate,
    ContentDetails.Create(title, description, duration, blobSize, 0, new List<string>()),
    StateDetails.Create(StatusType.STAGED, DateTime.MinValue, DateTime.UtcNow, DateTime.MaxValue),
    Manifest.Create(new Dictionary<string, string>()));
// upload to MongoDB
if (media != null)
{
var mapper = new Mapper(_mapperConfiguration);
var dao = mapper.Map<ContentDAO>(media);
try
{
await _db.Content.InsertOneAsync(dao);
// Only set the media id once the insert has succeeded; the empty-string
// check further down then reports the failure.
mediaId = dao.Id.ToString();
}
catch (Exception)
{
mediaId = string.Empty;
}
}
else
{
// metadata wasn't stored, remove blob
await _cloudStorageService.DeleteBlob(containerName, fileName, "audio");
return BadRequest($"An issue occurred during media upload: rolling back storage change");
}
if (string.IsNullOrEmpty(mediaId))
{
return BadRequest($"Could not add instructor media");
}
}
catch (Exception ex)
{
return BadRequest(ex.Message);
}
var result = new { MediaId = mediaId, InstructorId = instructorId };
return Ok(result);
}
I reiterate: this all works great locally. I do not run it in IIS Express; I run it as a console app.
I submit large audio files via my SPA app and Postman and it works perfectly.
I am deploying this code to an Azure App Service on Linux (as a Basic B1).
Since the code works in my local development environment, I am at a loss as to what my next steps are. I have refactored this code a few times, but I suspect that it's environment-related.
I cannot find anything that mentions the level of App Service Plan as the culprit, so before I go out and spend more money I wanted to see if anyone here has encountered this challenge and could provide advice.
UPDATE: I attempted upgrading to a Production App Service Plan to see if there was an undocumented gate for incoming traffic. Upgrading didn't work either.
Thanks in advance.
-A
Currently, as of 11/2019, there is a limitation with Azure App Service for Linux. Its CORS functionality is enabled by default and cannot be disabled, and it has a file size limitation that doesn't appear to be overridden by any of the published Kestrel configurations. The solution is to move the Web API app to an Azure App Service for Windows, where it works as expected.
I am sure there is some way to get around it if you know the magic combination of configurations, server settings, and CLI commands, but I need to move on with development.
protected void Button1_Click(object sender, EventArgs e)
{
//Incident Service
IncidentService.ServiceNowSoapClient soapClient = new IncidentService.ServiceNowSoapClient();
soapClient.ClientCredentials.UserName.UserName = "username"; // username have SOAP role in SNow.
soapClient.ClientCredentials.UserName.Password = "Password1";
IncidentService.getRecords _getRecords = new IncidentService.getRecords();
_getRecords.active = true; // set the query filter before calling getRecords
IncidentService.getRecordsResponseGetRecordsResult[] getRecordsResponses;
// Note: Please enable SOAP/REST services in your SNow dev instance table(s), Also,
// Go to system web services --> properties -> enable the 3rd option from the bottom.(This property sets the elementFormDefault attribute of the embedded XML schema to the value of unqualified)
//ServiceNowSoapClient client = new ServiceNowSoapClient();
//client.ClientCredentials.UserName.UserName = "username"; // username have SOAP role in SNow.
//client.ClientCredentials.UserName.Password = "Password1";
//insert newRecord = new insert();
//insertResponse insertResponse = new insertResponse();
//newRecord.first_name = "Jackson";
//newRecord.last_name = "Chris";
//newRecord.phone_number = "911-911-9999";
//newRecord.number = "CUS3048232";
try
{
//insertResponse = client.insert(newRecord);
//TextBox1.Text = insertResponse.sys_id;
getRecordsResponses = soapClient.getRecords(_getRecords);
for (int i = 0; i < getRecordsResponses.Length; i++)
{
TextBox2.Text = getRecordsResponses[i].short_description;
TextBox3.Text = getRecordsResponses[i].category;
}
}
catch (Exception ex)
{
TextBox1.Text = ex.Message;
}
//finally { client.Close(); }
}
How do you leverage ServiceNow data that resides in enterprise ServiceNow dev and prod instances (CMDB, ITIL, various enterprise DBs, new DBs)
to create end-to-end automated applications with C# and .NET Core?
Our goal is to automate applications end to end with ServiceNow, .NET Core, C#, Docker containers, Ansible, and Automic.
I know you probably don't need this anymore, but maybe someone looking for the same thing will find this question.
I developed a library just for that:
https://emersonbottero.github.io/ServiceNow.Core/
I am having difficulty displaying images after deploying to Azure.
I am currently developing my application on ASP.NET MVC 5.
My methodology is for admin to upload a particular image, which would be accessible by users. The image is to be uploaded into a folder.
The issue is that I am able to get the image upload and display to work on my own localhost. However, it fails to work when I publish my app to Azure.
I have tried many of the suggestions found on Stack Overflow,
e.g. changing the permission settings on the folder, but to no avail.
I have attached my code for reference. I appreciate any help! =D
Controller (file upload):
if (file != null && file.ContentLength > 0)
try
{
string path = Path.Combine(Server.MapPath("~/ExerciseImagesDepository"),
Path.GetFileName(file.FileName));
file.SaveAs(path);
ExerciseImage eI = new ExerciseImage();
eI.ExerciseID = ex.ExerciseID;
eI.ImageURL = Path.GetFileName(file.FileName);
db.ExerciseImages.Add(eI);
db.SaveChanges();
}
catch (Exception exc)
{
ViewBag.FileUploadErrorMessage = "ERROR:" + exc.Message.ToString();
}
Controller (Details display)
public ActionResult Details(int? id)
{
if (id == null)
{
return new HttpStatusCodeResult(HttpStatusCode.BadRequest);
}
List<ExerciseViewModel> evmLIst = new List<ExerciseViewModel>();
int a = 0;
List<Exercise> exercise = db.Exercises.ToList();
foreach (var item in exercise)
{
ExerciseViewModel evm = new ExerciseViewModel();
a = a + 1;
evm.ExerciseViewModelID = a;
evm.ExerciseRegion = db.ExerciseRegions.Find(item.ExerciseRegionID);
evm.ExerciseDescription = item.Description;
evm.ExerciseType = db.ExerciseTypes.Find(item.ExerciseTypeID);
int ExerciseID = item.ExerciseID;
ExerciseVideo ev = db.ExerciseVideos.Where(m => m.ExerciseID == ExerciseID).SingleOrDefault();
evm.VideoURL = ev.VideoURL;
ExerciseImage ei = db.ExerciseImages.Where(m => m.ExerciseID == id).SingleOrDefault();
if (ei != null)
{
evm.ImageURL = ei.ImageURL;
}
else
{
evm.ImageURL = "No Image Is Available For This Exercise";
}
evm.Exercise = item.Name;
evmLIst.Add(evm);
}
View
<div>
<p> TESTING HERE @Html.Raw(Model.ImageURL) </p>
<img src="~/ExerciseImagesDepository/@Html.Raw(Model.ImageURL)"/>
Folder Structure:
I solved it! I figured that perhaps folders that are not part of the default MVC 5 framework structure might be causing the issue. So instead, I created a new folder within the Content directory and added a folder inside it. Next, I changed that particular folder's permission settings by adding the group or user name "IUSR" and allowing full control. I then published that particular folder specifically to Azure, and it works, although I have no idea why this fixed it.
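For anyone hitting the same thing, here is a rough sketch of the working upload path described above (the folder name ~/Content/ExerciseImages is only an example, not necessarily the exact one I used):
// Save uploads under ~/Content (a folder the default MVC 5 project already publishes)
// instead of a custom root-level folder. "ExerciseImages" is a hypothetical subfolder.
string uploadDir = Server.MapPath("~/Content/ExerciseImages");
Directory.CreateDirectory(uploadDir); // no-op if the folder already exists
string path = Path.Combine(uploadDir, Path.GetFileName(file.FileName));
file.SaveAs(path);
The view then references the image as ~/Content/ExerciseImages/@Html.Raw(Model.ImageURL).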
I get error 401 (or 403) when trying to connect to Project Online with CSOM in a console app. (This is not on-premises; it is Microsoft Project Online 2013.) Here is the code.
ProjectContext projContext = new ProjectContext(pwaPath);
projContext.Credentials = new NetworkCredential("myUserID", "mypwd", "xxx.onmicrosoft.com");
projContext.ExecutingWebRequest += new EventHandler<WebRequestEventArgs>(projContext_ExecutingWebRequest);
projContext.Load(projContext.Projects);
projContext.ExecuteQuery();
// Error 401 Unauthorized
static void projContext_ExecutingWebRequest(object sender, WebRequestEventArgs e)
{
e.WebRequestExecutor.WebRequest.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f");
}
And another try, without ExecutingWebRequest:
ProjectContext projContext = new ProjectContext(pwaPath);
projContext.Credentials = new NetworkCredential("myUserID", "mypwd", "xxx.onmicrosoft.com");
projContext.Load(projContext.Projects);
projContext.ExecuteQuery();
// Error 403 Forbidden
Q1: Are there any problems with the code?
Q2: Is there a setting in Project Online that I'm missing?
You can use:
new SharePointOnlineCredentials(username, secpassword);
instead of
new NetworkCredential("admin#myserver.onmicrosoft.com", "password");
First: Install required Client SDK
SharePoint Client SDK :
http://www.microsoft.com/en-au/download/details.aspx?id=35585
Project 2013 SDK:
http://www.microsoft.com/en-au/download/details.aspx?id=30435
Second: add the reference to your project
Microsoft.SharePoint.Client.dll
Microsoft.SharePoint.Client.Runtime.dll
Microsoft.ProjectServer.Client.dll
You can find the dlls in %programfiles%\Common Files\microsoft shared\Web Server Extensions\15\ISAPI
and %programfiles(x86)%\Microsoft SDKs\Project 2013\REDIST
Here is sample code:
using System;
using System.Security;
using Microsoft.ProjectServer.Client;
using Microsoft.SharePoint.Client;
public class Program
{
private const string pwaPath = "https://[yoursitehere].sharepoint.com/sites/pwa";
private const string username ="[username]";
private const string password = "[password]";
static void Main(string[] args)
{
SecureString secpassword = new SecureString();
foreach (char c in password.ToCharArray()) secpassword.AppendChar(c);
ProjectContext pc = new ProjectContext(pwaPath);
pc.Credentials = new SharePointOnlineCredentials(username, secpassword);
//now you can query
pc.Load(pc.Projects);
pc.ExecuteQuery();
foreach(var p in pc.Projects)
{
Console.WriteLine(p.Name);
}
//Or Create a new project
ProjectCreationInformation newProj = new ProjectCreationInformation() {
Id = Guid.NewGuid(),
Name = "[your project name]",
Start = DateTime.Today.Date
};
PublishedProject newPublishedProj = pc.Projects.Add(newProj);
QueueJob qJob = pc.Projects.Update();
JobState jobState = pc.WaitForQueue(qJob,/*timeout for wait*/ 10);
}
}
I already answered this question in another question:
How to authenticate to Project Online PSI services?