I am looking to use iTextSharp to create PDFs in my MVC 3 application, and I will be hosting on Azure. What I would like to know is: will this create a problem, and does Azure support this?
You can run pretty much anything you want in Azure, so it won't be a problem from that perspective. The biggest thing to remember is that you probably won't want to perform such a heavy operation in a web role. A worker role is much better suited to this type of task. You could set up a system where the web role uses an Azure queue to request PDF jobs, and the worker role reads the queue and runs those jobs. You shouldn't have any problems with that approach. Check out the Azure Toolkit to get you started: http://azuretoolkit.codeplex.com
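A minimal sketch of that flow, assuming the classic Microsoft.WindowsAzure.StorageClient queue API together with iTextSharp; the queue name, connection string, and documentId are illustrative:

// Web role: request a PDF job by dropping a message on a queue
var account = CloudStorageAccount.Parse(storageConnectionString);
var queue = account.CreateCloudQueueClient().GetQueueReference("pdfjobs");
queue.CreateIfNotExist();
queue.AddMessage(new CloudQueueMessage(documentId.ToString()));

// Worker role: poll the queue and render the PDF with iTextSharp
var message = queue.GetMessage();
if (message != null)
{
    using (var stream = new MemoryStream())
    {
        var document = new Document();
        PdfWriter.GetInstance(document, stream);
        document.Open();
        document.Add(new Paragraph("Report for " + message.AsString));
        document.Close();
        // persist the PDF (e.g. to blob storage), then remove the job
    }
    queue.DeleteMessage(message);
}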
At my job we are building web apps that rely on a common enterprise class. This class has a method that sends a request to our server every time the Application_Start or Application_End event fires, so we can monitor the status remotely. But we now require that the web app report its status at least once a day, a bit like telemetry. I don't know how to accomplish this; so far I have found some options, but each has limitations:
Use Hangfire. I don't like this since it requires setting up a database (or adding more tables) and installing a new NuGet package in each project, but it could be my last resort.
Use a Windows service that reads the databases. This could be less work, but it can't access the web app's web.config settings.
Use a JavaScript task that sends an AJAX request. This requires a browser to be left open, which is a big risk.
I'm looking for a server-side approach that would let me trigger an event or function at 1 AM.
I would go with Hangfire.
It is dead easy to set up and very reliable.
You don't need to set up a database; you might want to check out memory storage:
https://github.com/perrich/Hangfire.MemoryStorage
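A minimal sketch, assuming the Hangfire and Hangfire.MemoryStorage packages (StatusReporter.Report is a hypothetical stand-in for your status call):

// In Application_Start: configure Hangfire against in-memory storage
GlobalConfiguration.Configuration.UseMemoryStorage();
var server = new BackgroundJobServer(); // in-process job server

// Report status every day at 1 AM
RecurringJob.AddOrUpdate("daily-status", () => StatusReporter.Report(), Cron.Daily(1));

// Note: with memory storage the schedule is lost when the process recycles,
// so it must be re-registered on startup (as above).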
Also check:
What is the equivalent to CRON jobs in ASP.NET? - C#
You can use FluentScheduler instead of Hangfire (it is more lightweight).
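A sketch with FluentScheduler, assuming its Registry API (StatusReporter.Report is again a hypothetical stand-in):

using FluentScheduler;

public class StatusRegistry : Registry
{
    public StatusRegistry()
    {
        // Report status every day at 1:00 AM
        Schedule(() => StatusReporter.Report()).ToRunEvery(1).Days().At(1, 0);
    }
}

// In Application_Start:
JobManager.Initialize(new StatusRegistry());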
Instead of a JavaScript task that sends an AJAX request, you can use an Azure WebJob or an Azure Function.
In the dev environment I am using the ASP.NET configuration tool in Visual Studio to create a few users for testing. As I move closer to QA and production, I'm wondering what the best way is to automate the creation of a large number (thousands) of users after application deployment.
I have a CSV with all the usernames, passwords, roles, etc., and I want to take advantage of the built-in encryption and password-salting security. I do not want to manually "Register" all these users.
I'm just not sure if this is something I can do (or instruct a db admin to perform for me).
Does anyone know of a way to achieve this?
Any assistance would be greatly appreciated.
Regards
The simplest solution would be to set up a "CSV Upload" form. The CSV would be processed by an MVC action calling Membership.CreateUser in a loop.
The performance of this will probably be good enough.
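A rough sketch of such an action, assuming the standard Membership/Roles providers and a simple username,password,email,role CSV layout (all names illustrative):

using System.IO;
using System.Web;
using System.Web.Mvc;
using System.Web.Security;

[HttpPost]
public ActionResult ImportUsers(HttpPostedFileBase csvFile)
{
    using (var reader = new StreamReader(csvFile.InputStream))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            var fields = line.Split(',');   // username,password,email,role
            MembershipCreateStatus status;
            // The membership provider applies its configured hashing and salting
            Membership.CreateUser(fields[0], fields[1], fields[2],
                null, null, true, out status);
            if (status == MembershipCreateStatus.Success)
                Roles.AddUserToRole(fields[0], fields[3]);
        }
    }
    return RedirectToAction("Index");
}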
There are a few ways I know of to approach a batch-processing problem on an ASP.NET site.
Because of the wonky way an ASP.NET site's application pool can get recycled, batch processing is usually done in an external process.
Windows service
One way is a separate Windows service which picks up the new Excel file, pumps that data in, and has a timer that keeps going around; see the sketch below. I've seen this used often, and it is quite a pain, because it takes extra work to make it easily deployable.
Update ASP.Net membership from windows service
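A minimal skeleton of such a service, assuming System.Timers and a hypothetical ImportPendingUsers step:

using System.ServiceProcess;
using System.Timers;

public class UserImportService : ServiceBase
{
    private Timer _timer;

    protected override void OnStart(string[] args)
    {
        _timer = new Timer(60 * 60 * 1000);               // poll every hour
        _timer.Elapsed += (s, e) => ImportPendingUsers();  // hypothetical import step
        _timer.Start();
    }

    protected override void OnStop()
    {
        _timer.Stop();
    }

    private void ImportPendingUsers()
    {
        // read the Excel/CSV file and push rows into the membership database
    }
}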
CacheItem
The second way is to use cache items and their expiration timers to do batch processing. You define a cache object with a long timer, and when it expires and the removed-callback gets called, you do your database work. This is good because it deploys with your ASP.NET site and keeps your code in one logical place.
https://blog.stackoverflow.com/2008/07/easy-background-tasks-in-aspnet/
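A minimal sketch of that cache-expiration trick from the linked post, placed in Global.asax (names illustrative):

using System;
using System.Web;
using System.Web.Caching;

private const string CacheKey = "BatchJobTrigger";

protected void Application_Start()
{
    ScheduleNextRun();
}

private void ScheduleNextRun()
{
    HttpRuntime.Cache.Insert(CacheKey, DateTime.UtcNow, null,
        DateTime.UtcNow.AddHours(1),        // fire roughly hourly
        Cache.NoSlidingExpiration,
        CacheItemPriority.NotRemovable,
        OnCacheItemRemoved);                // callback runs when the item expires
}

private void OnCacheItemRemoved(string key, object value, CacheItemRemovedReason reason)
{
    DoDatabaseWork();    // hypothetical batch step
    ScheduleNextRun();   // re-insert the item so the cycle continues
}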
Workflow Foundation
The third way is to make a Workflow Foundation service. Your ASP.NET site calls it, which instantiates a WF service that does some DB work with your Excel file and then goes into a while-loop with a month-long delay in it. This is good because it is not tied to the lifespan of your ASP.NET application pool: you get more control, and this logic can be separated out into a different IIS-hosted WCF service.
http://msdn.microsoft.com/en-us/library/dd489452.aspx
Integrating with data is always a pain, though. Remember that the solution that gives you the least work and the least chance of failure on deployment is the best one.
I am currently developing an MVC app using ASP.NET. My final aim is to deploy the SaaS on Azure.
But would it be feasible to do that at a later stage, or should I incorporate it into my development now?
When it comes to things like Azure authentication, I will need that because the app is multi-tenant.
Just wanted to know people's thoughts on this.
Cheers
It would be better if you could provide more information. Do you want to know how much effort it would take to deploy the application to Azure later if you ignore Azure for now? In general it would not take much effort, unless you want to use Azure services such as storage, ACS, and so on. Deploying an ASP.NET application to an Azure web site is just like deploying to a remote IIS server. Deploying to a web role requires you to create an additional cloud service project. Deploying to a virtual machine usually does not require any modifications to the project, but requires you to set up the entire environment yourself.
In addition, please note there are still some differences between Azure and a local environment. For example, we usually use Azure SQL instead of connecting to a local SQL Server.
Best Regards,
Ming Xu.
I'm doing something similar, but without developing on Azure right now. I have prepared for it, though, by making sure I use interfaces as much as possible. For instance, I don't write to the file system using File and Directory directly, but through IFile and IDirectory interfaces.
If you can avoid assuming anything based on your current localised, Windows Server environment then you can at least write implementations to satisfy requirements that do work in Azure. I'm planning to deploy to Azure and local Web servers and use Dependency Injection to satisfy the concrete implementation of the interfaces. I could just as easily use the same codebase entirely and have it detect the environment before injecting the implementations.
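As an illustration of that pattern (the interface and class names here are just examples):

// Abstract the file system behind an interface so the concrete
// implementation can be swapped per environment.
public interface IFile
{
    void WriteAllText(string path, string contents);
    string ReadAllText(string path);
}

// Local implementation delegates to System.IO.
public class LocalFile : IFile
{
    public void WriteAllText(string path, string contents)
    {
        System.IO.File.WriteAllText(path, contents);
    }

    public string ReadAllText(string path)
    {
        return System.IO.File.ReadAllText(path);
    }
}

// A hypothetical AzureBlobFile : IFile would write to blob storage instead;
// the DI container injects whichever implementation fits the environment.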
We run a number of ASP.NET MVC sites - one site per client instance with about 50 on a server. Each site has its own configuration/database/etc.
Each site might be running on a slightly different version of our application depending on where they are in our maintenance schedule.
At the moment, for background processing we are using Quartz.NET, which runs in the app domain of the website. This mostly works well but obviously suffers from issues such as not running after the app domain shuts down, e.g. following prolonged inactivity.
What are our options for creating a more robust solution?
Windows services are being discussed, but I don't know how we can achieve the same multi-site, multiple-version setup that we get within IIS.
I really want each IIS site to have its own background processing which always runs like a service but is isolated in the same way an IIS site is.
You could install a Windows service for each client. That seems like a maintenance headache, though.
You could also write a single service that understands all versions of your app, then processes each client one after the other.
You could also create SQL Server jobs to do this, and set up one job for each customer.
I assume that you have a separate database for each client, as you mentioned.
Create a custom config file with DB connection strings for each client.
e.g.
<?xml version="1.0" encoding="utf-8" ?>
<MyClients>
  <MyClient name="Client1">
    <Connection key="Client1" value="data source=<database>;initial catalog=client1DB;persist security info=True;user id=xxxx;password=xxxx;MultipleActiveResultSets=True;App=EntityFramework" />
  </MyClient>
  <MyClient name="Client2">
    <Connection key="Client2" value="..." />
  </MyClient>
</MyClients>
Write a Windows service which loads the above config file (client | connection string) into memory and iterates through each client's database config. I guess all clients have a similar job-queuing infrastructure, so you can execute the same logic against each database.
You can use a Mutex when kicking off the work for a client, to make sure two instances of the service never run for the same client at the same time.
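A sketch of that per-client loop inside the service's processing method, assuming a ClientConfig type loaded from the file above and a hypothetical ProcessJobQueue step:

using System;
using System.Data.SqlClient;
using System.Threading;

foreach (var client in clientConfigs)
{
    // Named, machine-wide mutex: at most one runner per client at a time
    using (var mutex = new Mutex(false, @"Global\ClientJob_" + client.Name))
    {
        if (!mutex.WaitOne(TimeSpan.Zero))
            continue; // another instance is already handling this client

        try
        {
            using (var connection = new SqlConnection(client.ConnectionString))
            {
                connection.Open();
                ProcessJobQueue(connection); // same job logic run against each database
            }
        }
        finally
        {
            mutex.ReleaseMutex();
        }
    }
}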
I'm working on an ASP.NET MVC project that will use MS Windows Workflow Foundation to manage client business workflows. These are long-running workflows spanning a year or two, so we've decided to use state machine workflows. A workflow instance will be persisted to a database when not being used (i.e. when idle).
I'm fairly new to MS WF and would like to find out the best practices for implementing workflows in an ASP.NET MVC application.
More specifically, where should I host the WF runtime: in ASP.NET MVC, or in a separate process such as a Windows service?
I would be most grateful to hear success stories of how MS WF has been implemented in ASP.NET MVC.
Any comments and ideas are welcome,
Thank you all,
Cullen
Are you referring to WF3 or WF4? They are completely different pieces of code.
With WF3 there is the central WorkflowRuntime, and that is usually hosted somewhere at the application level, or possibly at the session level.
Updated
Some of the things to watch out for:
IIS can recycle the AppDomain any time it wants when no incoming calls are being processed. An async workflow is NOT considered part of the request, as it runs on another thread.
To migrate workflows from the old to the new AppDomain you need a persistence service.
The new AppDomain might not be activated right away causing delay activities not to execute as soon as you would expect.
It is generally best to use the manual workflow scheduler, but that makes writing code somewhat more complex, as you have to schedule the work and then start execution yourself.
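A rough WF3 hosting sketch along those lines, assuming the System.Workflow.Runtime types (MyStateMachineWorkflow and connectionString are placeholders):

using System.Workflow.Runtime;
using System.Workflow.Runtime.Hosting;

var runtime = new WorkflowRuntime();

// Manual scheduler: workflow code runs synchronously on the calling thread
var scheduler = new ManualWorkflowSchedulerService();
runtime.AddService(scheduler);

// Persistence service so idle instances survive AppDomain recycles
runtime.AddService(new SqlWorkflowPersistenceService(connectionString));
runtime.StartRuntime();

var instance = runtime.CreateWorkflow(typeof(MyStateMachineWorkflow));
instance.Start();

// With the manual scheduler you must explicitly run the instance
scheduler.RunWorkflow(instance.InstanceId);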