Is it possible to debug the Quartz.NET Windows service? - asp.net-mvc

I've got my app code in one solution and the Quartz.NET code in another. My app code connects to the service and registers a job, but no breakpoints are hit in the Quartz.NET solution, despite the Visual Studio instance for the Quartz.NET code being attached to the service process. (Out of interest, when registering the job from my app, I can step into the Quartz.NET source code; it loads the source into my app's instance of Visual Studio.)
If I attach the debugger to the service process from the Visual Studio instance that contains my app code, then a breakpoint in my custom job source code says that the symbols haven't been loaded and so won't be hit. The DLL that contains the custom job is not in the list of modules.
Any ideas? What I'm after is debugging my custom job, which is loaded by the service when it starts. I've copied the DLL that contains the custom job into the Quartz.NET bin directory, and it is definitely loaded OK because the job actually runs!
Cheers, Ian.

Have you also copied the custom job's .pdb file?
You could also change the build destination of the job project to the Quartz.NET folder, so there's no need to keep copying files around.
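For example, a Debug-only override in the custom job's .csproj might look like this (the Quartz.NET install path below is just a placeholder):

    <PropertyGroup Condition=" '$(Configuration)' == 'Debug' ">
      <!-- Hypothetical path: build straight into the Quartz.NET service's bin folder -->
      <OutputPath>C:\Program Files\Quartz.NET Server\bin\</OutputPath>
    </PropertyGroup>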

The way I do this is to add a bit of code at the start of the job, like:
if (System.Diagnostics.Debugger.IsAttached)
    System.Diagnostics.Debugger.Break();
Then I open Visual Studio and attach the debugger to the Quartz.Net service. When the job starts, it will break into the debugger and I can then add breakpoints at other places.
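For instance, a minimal sketch of a custom job with this check at the top (the job class name is hypothetical, and this assumes the Quartz.NET 2.x-style IJob signature):

using System.Diagnostics;
using Quartz;

public class MyCustomJob : IJob   // hypothetical job class
{
    public void Execute(IJobExecutionContext context)
    {
        // Break into the attached debugger as soon as the scheduler runs the job,
        // so breakpoints can be set in the rest of the job code from there.
        if (Debugger.IsAttached)
            Debugger.Break();

        // ... the actual job work goes here ...
    }
}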

What I would do is create an ASP.NET Core console application and then create a Windows service from its executable using sc.exe. Below is my Program class.
using Backup.Service.Extensions;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using System;
using System.Diagnostics;
using System.Linq;
using System.Threading.Tasks;

namespace Backup.Service
{
    public class Program
    {
        static async Task Main(string[] args)
        {
            // Run as a Windows service unless a debugger is attached
            // or the app was started with the --console switch.
            var isService = !(Debugger.IsAttached || args.Contains("--console"));

            var hostBuilder = new HostBuilder()
                .ConfigureServices((context, services) =>
                {
                    services.AddHostedService<BackupService>();
                });

            if (isService)
            {
                await hostBuilder.RunTheServiceAsync();
            }
            else
            {
                await hostBuilder.RunConsoleAsync();
            }
        }
    }
}
When the application runs locally for debugging (or with the --console switch), we call the built-in extension method RunConsoleAsync(); otherwise we call our own custom extension method RunTheServiceAsync().
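For reference, BackupService is just a hosted service; a minimal sketch of what it might look like (the real implementation is covered in the article linked below, so treat the body here as a placeholder):

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

namespace Backup.Service
{
    public class BackupService : BackgroundService
    {
        protected override async Task ExecuteAsync(CancellationToken stoppingToken)
        {
            // Placeholder loop: do the recurring work until the host shuts down.
            while (!stoppingToken.IsCancellationRequested)
            {
                // ... run the actual backup/processing logic here ...
                await Task.Delay(TimeSpan.FromHours(1), stoppingToken);
            }
        }
    }
}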
I have written a complete article on this topic, which you can read here. You can also see the complete source code on GitHub here.
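As an aside, on more recent Microsoft.Extensions.Hosting versions the custom RunTheServiceAsync() extension can be replaced by the Microsoft.Extensions.Hosting.WindowsServices package, which handles the console-versus-service switch for you; a rough sketch:

using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

namespace Backup.Service
{
    public class Program
    {
        public static Task Main(string[] args) =>
            Host.CreateDefaultBuilder(args)
                // Runs as a Windows service when started by the SCM, and falls back
                // to the console lifetime when launched from a debugger or shell.
                .UseWindowsService()
                .ConfigureServices(services => services.AddHostedService<BackupService>())
                .Build()
                .RunAsync();
    }
}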

Related

Visual Studio 2019 Logic Apps Designer removing code

I have Visual Studio 2019 16.10.4 and Azure Logic Apps Tools for Visual Studio 2019 2.24.2.
I have created a Logic App by first going to the portal and getting a simple skeleton trigger that listens on an Event Grid Topic and connects using a Managed Identity.
I then copy the json over to my Visual Studio project.
Once in Visual Studio, if I deploy the ARM template, everything works as I expect: the API Connection, Logic App, and Event Grid trigger all get created, so I am happy that the contents of the ARM template and parameters file work as intended. (Note that I deploy the ARM template using a simple PowerShell script rather than the one generated by the tool, but that should not matter.)
The issue I face is that when I open the Logic App in the Logic App designer the tool seems to remove the managed identity code from the json and then the tool spits out an error in the output window.
Let me try to explain.
This is the code in the raw json file before I open in the designer:
"$connections": {
"value": {
"azureeventgrid": {
"id": "[concat(subscription().id, '/providers/Microsoft.Web/locations/', parameters('LogicAppLocation'), '/managedApis/', 'azureeventgrid')]",
"connectionId": "[resourceId('Microsoft.Web/connections', parameters('azureeventgrid_1_Connection_Name'))]",
"connectionName": "[parameters('azureeventgrid_1_Connection_Name')]",
"connectionProperties": {
"authentication": {
"type": "ManagedServiceIdentity"
}
}
}
}
}
When I open in the designer I see this error message in the Output window:
The workflow connection parameter 'azureeventgrid' is not valid. The API connection 'azureeventgrid' is configured to support managed identity but the connection parameter is either missing 'authentication' property in connection properties or authentication type is not 'ManagedServiceIdentity'.
And then when I look at the raw json this is what now appears:
"$connections": {
"value": {
"azureeventgrid": {
"id": "[concat(subscription().id, '/providers/Microsoft.Web/locations/', parameters('LogicAppLocation'), '/managedApis/', 'azureeventgrid')]",
"connectionId": "[resourceId('Microsoft.Web/connections', parameters('azureeventgrid_1_Connection_Name'))]",
"connectionName": "[parameters('azureeventgrid_1_Connection_Name')]"
}
}
}
}
So it appears as if the tool is not happy with some part of my code and removes it.
I then do my work, close the designer, open the raw json, copy back in the removed connectionProperties and deploy. So I do have a workaround but it is a bit tedious to have to do this all the time.
Is this a known issue? For example, I can see that the designer does not seem to let me create a logic app with an Event Grid trigger that uses Managed Identity (hence why I started out by creating a skeleton in the portal and copying the code over).

Azure Web Job - Data Processing

In VS I've created an Azure WebJob. I see a boilerplate method:
static void Main()
{
    var host = new JobHost();
    // The following code ensures that the WebJob will be running continuously
    host.RunAndBlock();
}
Also a function method:
// This function will get triggered/executed when a new message is written
// on an Azure Queue called queue.
public static void ProcessQueueMessage([QueueTrigger("queue")] string message, TextWriter log)
{
    log.WriteLine(message);
}
Cool... however, I don't want to use Azure Queue or Blob storage. I don't need to pass in any data as arguments or trigger it.
I simply want a job that will run every hour and do some data processing. Specifically, hit a 3rd-party API and load some data into my Azure DB.
What am I missing here?
EDIT
Should I just be using a vanilla console app in this situation and publish it as an "Azure Web Job" ?
You should just use a vanilla console app and deploy that as an Azure Web Job. See the steps below:
Right-click the web project in Solution Explorer, and then click Add > Existing Project as Azure WebJob. The Add Azure WebJob dialog box appears.
In the Project name drop-down list, select the Console Application project to add as a WebJob.
Complete the Add Azure WebJob dialog, and then click OK.
The Publish Web wizard appears. If you don't want to publish immediately, close the wizard. The settings that you've entered are saved for when you do want to deploy the project.
Source with screenshots: https://azure.microsoft.com/nl-nl/documentation/articles/websites-dotnet-deploy-webjobs/#convert
You can find more information about this here: https://azure.microsoft.com/nl-nl/documentation/articles/websites-dotnet-deploy-webjobs/.
On this page you can also read that a console application can be used as an Azure Web Job by adding the Microsoft.Web.WebJobs.Publish NuGet package and a webjob-publish-settings.json.
Example webjob-publish-settings.json:
{
    "$schema": "http://schemastore.org/schemas/json/webjob-publish-settings.json",
    "webJobName": "WebJob1",
    "startTime": "2014-06-23T00:00:00-08:00",
    "endTime": "2014-06-27T00:00:00-08:00",
    "jobRecurrenceFrequency": "Minute",
    "interval": 5,
    "runMode": "Scheduled"
}
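For the hourly requirement in the question, the recurrence settings would presumably be (same schema, values untested):

    "jobRecurrenceFrequency": "Hour",
    "interval": 1,
    "runMode": "Scheduled"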
When you want to add this Azure Web Job to an existing Azure Web App (website) project you can link the webjob by adding a webjobs-list.json file to the website project.
Example webjobs-list.json:
{
    "$schema": "http://schemastore.org/schemas/json/webjobs-list.json",
    "WebJobs": [
        {
            "filePath": "../ConsoleApplication1/ConsoleApplication1.csproj"
        },
        {
            "filePath": "../WebJob1/WebJob1.csproj"
        }
    ]
}
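The console app body itself can stay trivial for this scenario; a minimal sketch with hypothetical helper names (no JobHost or queue trigger is needed for a scheduled WebJob):

class Program
{
    static void Main()
    {
        // Runs once per scheduled invocation: pull data from the 3rd-party API,
        // push it into the Azure SQL database, then exit.
        var data = FetchFromThirdPartyApi();   // hypothetical helper
        SaveToAzureDatabase(data);             // hypothetical helper
    }
}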

ExcelDNA Memory Space

I am writing an application that has multiple potential user interfaces, and I am using MEF to inject the appropriate implementation during startup. One implementation of IDisplay uses ExcelDNA (Excel is the interface). The code launches Excel as a process through:
var processInfo = new ProcessStartInfo
{
    FileName = PATH_TO_EXCEL,
    Arguments = PATH_TO_EXCELDNA_ADDIN
};
Process.Start(processInfo);
This works fine, except that Excel is now in a separate memory space, so UI callbacks (e.g. Ribbon button clicks) cannot get access to any injected or assigned properties.
One possible solution is to launch Excel first and then have ExcelDNA's AutoOpen() hook (which gets called once the add-in has loaded in Excel) call the bootstrapper class to configure MEF. However, I'm wondering whether it is possible to share memory between the C# and Excel processes. Would starting Excel via Excel.Application app = new Excel.Application { Visible = true } resolve this? I would try it, but I have not been able to find out how to specify the path of the ExcelDNA add-in for it to load (like above).
Excel will always run as a separate process. So you can't share memory between the Excel process and another process. However, C# code can run inside the Excel process - this is exactly how an Excel-DNA add-in works.
You can also communicate between the Excel process and some other process. One option for this is to use the COM Automation interop - this is what you're doing when you call new Excel.Application from your own executable. You are starting a separate Excel process (or connecting to an existing running process), and then getting back an inter-process communication proxy (the Application object).
If you then want to tell that Excel process to load an Excel-DNA add-in, you can call Application.RegisterXLL(path_to_add_in) to have it load the .xll. How you hook up the Excel-DNA add-in and the rest of your code is still to be figured out.
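A minimal sketch of that approach, assuming a COM interop reference to Microsoft.Office.Interop.Excel and that the add-in path points at the packed .xll (the helper class name is hypothetical):

using Excel = Microsoft.Office.Interop.Excel;

public static class ExcelLauncher   // hypothetical helper class
{
    public static void LaunchWithAddIn(string addInPath)
    {
        // Start a new Excel instance via COM automation and make it visible.
        var app = new Excel.Application { Visible = true };

        // Ask that Excel instance to load the packed Excel-DNA .xll add-in.
        bool loaded = app.RegisterXLL(addInPath);
    }
}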
You might still need some kind of cross-process communication, like .NET Remoting, WCF with named pipes or something like that.

Creating and deploying a windows service publishing pages in SDL Tridion

Our requirement is to schedule content publishing of a content page to run at recurring intervals in a Tridion CMS application. We are currently using the Tridion 2009 SP1 version.
As per the suggestion from the experts in: Tridion 2009 SP1: How to schedule a content page for a recurring publishing? we have created a simple C# console application that references the Tridion Interop DLLs as below:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Tridion.ContentManager.Interop.TDS;
using Tridion.ContentManager.Interop.TDSDefines;
using Tridion.ContentManager.Interop.msxml4;
using System.Configuration;

namespace SchedulePublish
{
    class Program
    {
        static void Main(string[] args)
        {
            // Please use your system's corresponding WebDAV URLs and TCM IDs wherever required. Below are just samples :)
            TDSE tdse = new TDSE();

            // Give some identity that has access rights in the Tridion UI
            string Identity = @"Domain Name\Username";
            tdse.Impersonate(Identity);
            tdse.Initialize();

            string targetTypeId = "tcm:0-1-65537";
            Publication Pub_Obj = (Publication)tdse.GetPublication("/webdav/30%20DIRECTV%20sites");

            XMLReadFilter Filter = new XMLReadFilter();
            Component CompObj = (Component)tdse.GetObject("/webdav/30%20DIRECTV%20sites/Home/System/xml/Knavigation.xml",
                EnumOpenMode.OpenModeView, Pub_Obj.ID, Filter);

            DateTime schedulePublishDate = Convert.ToDateTime(ConfigurationManager.AppSettings["SharedPath"].ToString());
            CompObj.Publish(targetTypeId, false, false, false, schedulePublishDate, DateTime.MinValue, DateTime.Now, true, EnumPublishPriority.High, false, 3);
        }
    }
}
As we are new to this, please provide pointers on how to implement the steps below:
1. Tridion CMS servers do not have Visual Studio installed, so please suggest a way to run this application and verify that we are able to publish the content as required.
2. Host this application on the Tridion CMS server and schedule it to run at the desired intervals every week.
You don't need Visual Studio to run your new console app; simply compile it and copy the files to the CMS server.
If you run the application, you should see items appearing in your Publish Queue. If you don't see your items added to the Publish Queue, I would recommend adding some logging calls to your application so you can see where the code is failing (consider using log4net if you have not done logging before).
Once you have validated that it works as desired, the easiest way to schedule it is to create a task using the Windows Task Scheduler. There is no way to run such a task from within the CMS. Alternatively you could convert your console app to a windows service, but I think this would be overkill in this case.
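For example, a hypothetical weekly schedule created with schtasks (the task name, path, and time are placeholders):

schtasks /Create /SC WEEKLY /D MON /ST 02:00 /TN "SchedulePublish" /TR "C:\Apps\SchedulePublish\SchedulePublish.exe"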

How to test a Service Layer in MVC4?

I have a web application with three layers: Web > Services > Core. I've just added a Services.Tests unit tests project and I'm trying to figure out how to get started.
I've added a reference from the Services.Tests project to the Services project. I then created an instance of the Services.service class in my Test class:
[TestClass]
public class CrudTests
{
    private readonly SetServices _setService = new SetServices();

    [TestMethod]
    public void TestMethod()
    {
    }
}
But I got an error about _setService needing a reference to my Core project.
First question: Why do I need to reference my core project if the Services project already references it?
I've added the reference and ran the empty test, which passed. I then tried to call one of the basic methods within my service class:
[TestMethod]
public void TestMethod()
{
    _setService.CreateSet("Test Set", "Test Set Details", null);
}
But I get a ProviderIncompatibleException.
Second question: What is the recommended way to create/use a dedicated test database?
Third question: How do I tell my tests project which database to use?
By parts,
Why do I need to reference my core project if the Services project already references it?
Project references aren't inheritable, as per this answer. The fact that Services has a dependency on Core does not imply that whoever consumes Services needs to know anything about Core, or for that matter uses any logic exclusively defined in Core.
What is the recommended way to create/use a dedicated test database?
It depends entirely on what database, what ORM, what framework, etc. There aren't fixed ways to do it.
How do I tell my tests project which database to use?
In the same way you tell the application to do it: through the configuration. Simply create a DatabaseStub and hard-code the test database information in it.
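For example, if the data layer is Entity Framework reading a named connection string, the test project can carry its own App.config that points the same name at a dedicated test database (all names below are hypothetical):

<configuration>
  <connectionStrings>
    <!-- Same connection string name the Services/Core layer resolves,
         but pointing at a throwaway test database -->
    <add name="DefaultConnection"
         connectionString="Data Source=(localdb)\MSSQLLocalDB;Initial Catalog=MyApp_Tests;Integrated Security=True"
         providerName="System.Data.SqlClient" />
  </connectionStrings>
</configuration>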
