I am developing a tuition fee management system using ASP.NET MVC for a university assignment. I am quite new to ASP.NET; I only learned it this April 2021. One of the requirements is that the system automatically sends an email every month to every user as a reminder about their outstanding balance. How do I start developing this requirement? I've been searching and only found tutorials where the email is sent manually, and only to a single user.
There is no built-in functionality in .NET that runs your code once a month, but there are several tools that can do this. If you are using Azure, AWS or GCP (or any other cloud platform), you might consider a serverless function. These functions can be triggered once a month by the cloud provider.
If you're not hosting it in the cloud (or want to avoid any provider-specific features), you can use, for instance, Quartz (https://www.quartz-scheduler.net) or Hangfire (https://www.hangfire.io). There are many libraries available, each with their pros and cons. Hangfire, for instance, has a built-in dashboard to monitor and debug any issues, but this also costs some server resources and might be overkill if you have a single job to run.
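For illustration, a minimal Hangfire sketch might look like this (the reminder method, connection string, and job id are placeholders, and a BackgroundJobServer must be running somewhere to actually process the job):

    using Hangfire;

    public class ReminderJobs
    {
        // Hypothetical method: query users with an outstanding balance
        // and email each of them a reminder.
        public static void SendMonthlyReminders() { /* ... */ }
    }

    public static class SchedulerConfig
    {
        public static void Configure()
        {
            // Hangfire persists its jobs, so a restart won't lose the schedule.
            GlobalConfiguration.Configuration.UseSqlServerStorage("YourConnectionString");

            // Fires on the first day of every month.
            RecurringJob.AddOrUpdate(
                "monthly-balance-reminders",
                () => ReminderJobs.SendMonthlyReminders(),
                Cron.Monthly());
        }
    }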
You should, however, take into account that communication with an SMTP server is quite time-consuming, so sending thousands of emails can take a long time. This is especially an issue with serverless functions, because of the execution time limits that apply to them. And when you run these as jobs in Quartz or Hangfire, you need to account for the job being aborted halfway through. Therefore, you would usually insert the mails into a queue (or a database table) and have a second process actually send them; maybe even via a specialized email delivery service.
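A minimal sketch of that split, assuming a hypothetical IEmailStore abstraction over the queue table and plain SmtpClient; because a row is only marked as sent after the SMTP call succeeds, an aborted run can simply be restarted:

    using System;
    using System.Net.Mail;

    // Hypothetical persistence abstraction over the queue table.
    public interface IEmailStore
    {
        PendingEmail DequeueNext();   // returns null when the backlog is empty
        void MarkSent(Guid id);
    }

    public class PendingEmail
    {
        public Guid Id;
        public string To, Subject, Body;
    }

    public class EmailQueueWorker
    {
        public void ProcessQueue(IEmailStore store)
        {
            using (var smtp = new SmtpClient("smtp.example.com"))
            {
                PendingEmail email;
                while ((email = store.DequeueNext()) != null)
                {
                    smtp.Send(new MailMessage("billing@example.com",
                                              email.To, email.Subject, email.Body));
                    store.MarkSent(email.Id);   // only after a successful send
                }
            }
        }
    }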
With CloudKit, you can focus on your client-side app development and let iCloud eliminate the need to write server-side application logic. CloudKit provides you with Authentication, private and public database, structured and asset storage services — all for free with very high limits.
You cannot upload any code to run on Apple's servers?
I've heard it being compared to Google App Engine and other cloud computing platforms, but without the ability to run your own code, isn't the whole thing pretty limited and not really comparable?
For example, if I want to build a news app which periodically pushes stories on topics the user is interested in, then this can't be done using CloudKit alone, because I would need scheduled jobs and data processing on the server.
Any thoughts?
Server-side
As you said, CloudKit doesn't allow server-side code.
But there are possibilities.
Crons
You don't want to connect to the iCloud Dashboard every day in order to perform the push by adding a record. One solution here is to write an app on a Mac server (I guess the Mac mini as a server will become more popular with CloudKit) that adds a new Daily CKRecord every day.
Subscriptions
The concept behind subscriptions is that the client registers for specific updates. You can create a record type called Daily, for instance, and have users subscribe to it. You should check the Apple documentation and the WWDC14 videos (even if subscriptions are not covered in detail, they are a good starting point).
The good thing is that push notifications are tied to the subscription concept. So basically you say: send me a notification for each new CKRecord of type Daily that is added.
BaaS party
What is the point of using CloudKit (vs. Parse and others)?
Price: CloudKit has really nice pricing
Ready to go: two clicks inside Xcode and you are ready to go
User consistency: you get free user login across all of a user's devices through their iCloud account, with a very good privacy system, and you can discover relationships between users through a smart system.
But:
You are stuck on the Apple platform. We don't even know whether the data can be exported.
Only data-centered for now (no server-side code)
The CloudKit dashboard is too limited
The future
CloudKit is still pretty new. At WWDC, some of the people behind it gave me to understand that they are still working heavily on it. My bet is that they are working on two important points:
Server side code execution through remote scheduled tasks
CloudKit for Analytics (Visualization side)
Edit: The Apple team is fully aware of, and concerned about, the lack of web access to the data. It means that one day it may be accessible from other platforms. I read in a comment that Apple probably would have bought Parse if CloudKit weren't better; AFAIK they did try to buy Parse (an acqui-hire, it's said, but we don't really know).
Update WWDC15
CloudKit is now available in JS, and some dashboards are now available too. Wait and see.
Update February 2016
CloudKit Now Supports Server-to-Server Web Service Requests
Web Services Reference
In some cases, we do not need server-side logic, and just storing static data can cover all the usage scenarios.
In those cases, it is very helpful to have free, accessible storage where you can keep your data. CloudKit provides exactly that, rather than a full service platform.
Yes, it is limited, but it can still be useful for some people. For example, your case can actually be supported by CloudKit. Though CloudKit is just static storage, it supports subscriptions, which monitor a set of conditions and push an event notification to the client. It's fortunate that the one background-job feature CloudKit supports is just what you need.
Anyway, if you need more, then you might need to consider a full-fledged server. Even simple web services with basic server-side code execution support tend to be limited.
You cannot upload any code to run on Apple's servers?
You can and you can't. You can't upload code or SOAP-based web services to the server; instead, you can upload/store observers on the server, called subscriptions.
whole thing pretty limited and not really comparable?
I would say that in CloudKit, and in MBaaS generally, the client communicates with the server through a narrower, more robust interface: you cannot upload an exotic web service to do XML parsing and database manipulation and trigger push notifications based on it, but the RESTful architecture allows you to perform the four basic operations on the data store, and with subscriptions the client can get notified about INSERT / UPDATE / DELETE operations performed on tables.
I think MBaaS is just the next step in the evolution of client-server architecture. At first it seems limiting, but you can do everything you could in the SOAP-based web services world. Development is extremely fast, scalable, and comfortable, and things like permissions, setup, server maintenance, and security need almost no effort.
Believe it or not, you can actually get REALLY far with this approach.
I've not used CloudKit, but I can describe for you my application stack:
AngularJS (or your favorite client-side HTML rendering framework): A single page will host a series of templates/controllers selected by the router and driven by users changing the anchor to select which page they're on.
Firebase.io (or your favorite cloud storage): Any dynamic data goes into the cloud document store. The controller needs to load the data and render the template on the client, and when the data changes, send the data back. This also provides authentication and authorization, since you can limit access to the data.
Now you need a place to serve the HTML/CSS/JS/images... which requires no 'server side code execution', just a web server where you can put the assets.
Using this technique you could store all the user's topics in the database for that user, and when the page loads, go and aggregate all the sources for those topics (also stored in the database) completely client side. There's nothing in your example application which actually requires server side execution that I can see, so long as you have cloud storage which will provide you with authentication and authorization services, and a 'dumb' web server for serving up static assets.
CloudKit isn't a full-fledged web hosting service. Instead, it's an SDK for iCloud. You shouldn't be putting a web site up there, just storing user data that you may want to use in multiple applications or platforms.
iCloud APIs enable your apps to store app data in iCloud, keeping your apps up to date automatically. Use iCloud to give your users a consistent and seamless experience across iCloud-enabled devices.
I'm developing an application that has various types of Notifications. Examples of notifications:
Message Created
Listing Submitted
Listing Approved
I'd like to tie all of these up to SignalR so that any connected clients get updates in real-time.
As far as architecture goes - right now the application is entirely within a single solution hosted on an Azure Website. The triggers for each of these notification types live within this application.
When a trigger is hit, I'd like to tell SignalR, "Hey, send this message to the following clients" along with a list of userIds. I'm assuming that it's possible to identify connected clients based on userId... and I'm assuming that the process of sending messages to clients should be executed outside of the web application, so as to not slow down the MVC app or risk losing data in a broken async call. First question - are these assumptions correct?
Assuming so, this means that I'll need something like a dedicated web/worker role to be sending messages to clients. I could pass messages from my web application directly to this process, but what happens if the process dies? The resiliency concerns lead me to believe that the proper way to pass messages would be via a queue of some sort. Second question - is this a valid train of thought?
Assuming so, I could use a good ol' Azure SQL database as a queue, but it seems like there are some specialized (and maybe cheaper) services to handle message queueing, such as this:
http://www.windowsazure.com/en-us/develop/net/how-to-guides/queue-service/
Third question: Should this be used as a queueing mechanism for signalR? I'm interested in using Redis for caching in the future... would Redis be better or worse than the queue service?
Final Question:
I've attempted to illustrate my proposed architecture here:
What I'm most unclear on here is how the MVC app will know when to queue, or how the SignalR processes will know when to broadcast. Should the MVC app queue blindly, without caring about connected clients? This seems to introduce a lot of wasted space on the queue, and wasted cycles in the worker roles, since a very small percentage of clients will ever be connected.
The only other approach I can think of is to somehow give the MVC app visibility into the SignalR processes to see if the client is connected... and if they are, then Enqueue. This makes me uncomfortable though because it means I have to hit that red line on the diagram for every trigger that gets hit, which - even if done async - gets me worrying about performance and reliability.
What is the recommended architecture for scalable, performant SignalR message broadcasting? Performance is top priority, followed closely by cost.
Bonus question:
What if some messages are of higher priority than others? Should two queues be used, one of which always gets checked before the other?
If you want to target specific users, you'll have to come up with a mechanism. Off the top of my head, one example: whenever a user hits a page, you can create a group for that page and push to all users in that group (i.e. on that page).
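A hedged sketch of that group approach (the hub name, group naming, and the client-side notify method are made up):

    using System.Threading.Tasks;
    using Microsoft.AspNet.SignalR;

    public class NotificationsHub : Hub
    {
        // The client calls this when it loads a page.
        public Task JoinPage(string pageName)
        {
            return Groups.Add(Context.ConnectionId, "page:" + pageName);
        }
    }

    public static class Notifier
    {
        // Call this from anywhere in the MVC app to reach everyone on a page.
        public static void PushToPage(string pageName, object payload)
        {
            var hub = GlobalHost.ConnectionManager.GetHubContext<NotificationsHub>();
            hub.Clients.Group("page:" + pageName).notify(payload);
        }
    }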
It's not clear to me why you need the queues. Usually users subscribe to some events when hitting a page, or through some action like joining a chat room, and the server pushes data using those events/functions when appropriate.
For scalability, you can run SignalR on multiple servers, in which case you should use SQL Server, Service Bus, or Redis as a backplane.
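Wiring up a backplane is then a one-liner at startup; for instance, with the Microsoft.AspNet.SignalR.Redis package (host, port, and event key are placeholders):

    using Microsoft.AspNet.SignalR;
    using Owin;

    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            // Every SignalR server publishes its messages through Redis,
            // so clients connected to any instance receive them.
            GlobalHost.DependencyResolver.UseRedis(
                "your-redis-host", 6379, "", "signalr");
            app.MapSignalR();
        }
    }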
Firstly, you need to create a SignalR server to which all the users can connect. This SignalR server can be created in either the web role or the worker role. If you have a huge user base, it's better to create the SignalR server in a separate role.
Then, wherever the trigger is hit and you want to send messages to users, you create a SignalR client (.NET or JavaScript) and connect to the SignalR server. You can then send the message to the SignalR server, which in turn broadcasts it to all the other connected users. After that you can disconnect from the SignalR server. This way you don't have to use queues to communicate with the SignalR role.
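For the .NET client, that connect-send-disconnect cycle might look roughly like this (the role URL, hub name, and hub method are hypothetical):

    using System.Threading.Tasks;
    using Microsoft.AspNet.SignalR.Client;

    public static class TriggerNotifier
    {
        public static async Task NotifyAsync(string message)
        {
            // Connect as a plain client, push one message, then disconnect.
            using (var connection = new HubConnection("http://your-signalr-role.example.com/"))
            {
                var proxy = connection.CreateHubProxy("NotificationsHub");
                await connection.Start();
                await proxy.Invoke("BroadcastMessage", message);
            }
        }
    }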
Also, to send messages to specific users, you can store their connection IDs along with their user IDs in a table (Azure Table Storage should do) when they connect to the SignalR server. Then, using the connection ID, you can send messages to a specific user.
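A sketch of that mapping; here an in-memory dictionary stands in for the Azure Table Storage table (so this version only works on a single server), and notify is a made-up client method:

    using System.Collections.Concurrent;
    using System.Collections.Generic;
    using System.Threading.Tasks;
    using Microsoft.AspNet.SignalR;

    public class UserHub : Hub
    {
        // userId -> set of active connection ids for that user.
        private static readonly ConcurrentDictionary<string, HashSet<string>> Connections =
            new ConcurrentDictionary<string, HashSet<string>>();

        public override Task OnConnected()
        {
            var set = Connections.GetOrAdd(Context.User.Identity.Name,
                                           _ => new HashSet<string>());
            lock (set) set.Add(Context.ConnectionId);
            return base.OnConnected();
        }

        public override Task OnDisconnected(bool stopCalled)
        {
            HashSet<string> set;
            if (Connections.TryGetValue(Context.User.Identity.Name, out set))
                lock (set) set.Remove(Context.ConnectionId);
            return base.OnDisconnected(stopCalled);
        }

        // Send to every connection belonging to one user.
        public static void SendToUser(string userId, object message)
        {
            HashSet<string> set;
            if (!Connections.TryGetValue(userId, out set)) return;
            var hub = GlobalHost.ConnectionManager.GetHubContext<UserHub>();
            lock (set)
                foreach (var connectionId in set)
                    hub.Clients.Client(connectionId).notify(message);
        }
    }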
I will develop and host an e-commerce website based on ASP.NET MVC4 (with several SQL Server jobs).
I am thinking of using Azure in order to stay in the Microsoft world and avoid dedicated server management.
The Web Site Shared package with 1 site / 5 GB SQL Server database / 200 GB bandwidth is very attractive at the 12-month price.
But I don't know whether this configuration is enough, especially the bandwidth.
What do you think? Have you used Azure for this type of application?
Regards,
Guillaume.
If you want to develop an e-commerce application, you will have to secure customers' sensitive data (credit cards, address details, etc.) via secure connections (HTTPS; in many countries this is a legal requirement). For that reason you will need SSL support.
Azure Web Sites do not support SSL for custom domains. However, they do support SSL for *.azurewebsites.net DNS names. So if your e-commerce application's DNS name will be, say, my-ecom-app.azurewebsites.net, then it's fine. Otherwise, I would not recommend the Azure Web Sites solution yet (I am sure SSL support for custom domains on Azure Web Sites will be implemented eventually).
Azure Cloud Services, on the other hand, have full support for SSL with custom domains.
One of the really good places to check Azure features and the development roadmap is ScottGu's blog.
Azure Web Sites do not support SSL, and I really don't know of any successful e-commerce site that does not use SSL for at least part of the site. If you really want to host your e-commerce site on Azure today, your only real choice is to run virtual machines for your web front-end servers, and either use them for your DB as well or use SQL Azure.
We developed a platform called Virto Commerce that does just that: an MVC4 website hosted on Azure. There was also a need for SQL jobs (indexing, payment processing, cart cleanups and so on), for which we used a worker role (instead of a web role). A worker role and a web role can actually be combined as part of a single deployment; however, it is better to use a separate instance for worker roles. In our case the worker role acted as a scheduler for multiple jobs defined in the database.
The challenge with worker roles, however, is making sure they scale well when new instances are added, so the workload needs to be distributed between multiple instances. This is done through the use of queues and blob locks, where each job is split into two parts: one that schedules and partitions the work, and a second that picks up the next partition and completes it (see the sketch below).
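A rough sketch of the consuming half, using the classic Azure storage queue client (the queue name and job logic are placeholders; the blob-lease election of the scheduling instance is omitted):

    using System;
    using System.Threading;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Queue;

    public class PartitionWorker
    {
        // Each worker instance pulls the next partition message and processes it,
        // so adding instances automatically spreads the backlog across them.
        public void Run(string connectionString, CancellationToken token)
        {
            var queue = CloudStorageAccount.Parse(connectionString)
                .CreateCloudQueueClient()
                .GetQueueReference("job-partitions");
            queue.CreateIfNotExists();

            while (!token.IsCancellationRequested)
            {
                // The message stays invisible for 5 minutes; if this instance
                // dies mid-job, it reappears and another instance picks it up.
                var message = queue.GetMessage(TimeSpan.FromMinutes(5));
                if (message == null) { Thread.Sleep(1000); continue; }

                ProcessPartition(message.AsString);   // hypothetical job logic
                queue.DeleteMessage(message);         // delete only after success
            }
        }

        private void ProcessPartition(string partition) { /* ... */ }
    }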
Hope this helps!
PS: Virto Commerce is now available as an open-source project on CodePlex; go to http://virtocommerce.codeplex.com
I have an existing complex website built using ASP.NET MVC, including a database backend, data layer, as well as the Web UI layer. Rebuilding this website in another language is not a feasible option.
There are some UI elements on some views (client side) which would benefit from live interactivity, involving both push and pull, so rather than implement some kind of custom long polling or WebSocket server in ASP.NET, I am looking to leverage Node.js for Windows and Socket.io.
My problem is that I need two-way communication between both applications. Each user should only be able to receive data once they are authorised on the ASP.NET website, so I first need communication for this. Secondly, once certain events occur on the ASP.NET website, I want to immediately push this data to the Node server, to be broadcast to specific users or groups of users. Thirdly, I would like any data sent to the Node.js server to be pushed to the ASP.NET website for processing, as this is where all our business logic lies. The sole reason for adding Node.js is to have the ability to push data directly to the client; I do not want to build any business logic into it (or as little as possible).
I would like to know the fastest method of two-way push communication between Node.js and ASP.NET. The only good option I'm aware of so far is to create a special listener on a specific port on the Node.js server and connect to that, but I was wondering if there's a more elegant or more efficient method. I also know that you could use a database in between, but surely this would need to be polled and would be less efficient? Both applications will be running on the same server under a Visual Studio project.
Many thanks for any help you can provide.
I'm not an ASP.NET expert, but I think there are multiple ways you can achieve this:
1) As you said, you could make Node listen on a specific port for data and then react based on the data received (TCP)
2) You can make POST requests to Node.js (HTTP) and also send an auth key in the process to be extra secure. As in 1), Node would react to the data you send.
3) Use something like Redis for pub-sub: send messages from ASP.NET (pub) and receive them on the Node.js side (sub). This is even better if you want to scale your app across multiple machines, etc.
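For 3), the C# half might look like this with StackExchange.Redis (channel names are arbitrary; the Node.js process would subscribe and publish on the same channels):

    using StackExchange.Redis;

    public static class NodeBridge
    {
        private static readonly ConnectionMultiplexer Redis =
            ConnectionMultiplexer.Connect("localhost:6379");

        // Publish an event for the Node.js subscriber to fan out via Socket.io.
        public static void PushToNode(string json)
        {
            Redis.GetSubscriber().Publish("aspnet-events", json);
        }

        // Receive messages Node.js publishes back on a second channel.
        public static void ListenForNodeMessages()
        {
            Redis.GetSubscriber().Subscribe("node-events", (channel, message) =>
            {
                // Hand the payload off to the ASP.NET business logic here.
            });
        }
    }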
The only good option I'm aware of so far is to create a special listener on a specific port on the node.js server and connect to that, but I was wondering if there's a more elegant or more efficient method?
You can look at the Redis pub/sub model, where the ASP.NET MVC application and Node.js communicate through separate channels in order to achieve full-duplex communication. Or you can also try CouchDB change notifications.
I also know that you could use a database in between but surely this would need to be polled and would be less efficient?
The former techniques do not require you to poll for changes; instead, they will notify you when a change happens or a channel message arrives.
I have an ASP.NET MVC web application which is hosted by an external provider, on IIS 7.
I wish to run a process every 15 minutes or so, which checks a backlog of emails that need to be sent, and actually sends them.
It seems that the normal way to do this is with Microsoft Message Queue, but since this is a hosted environment which I can't directly control, I won't be able to install or configure MSMQ.
So far I've decided to do it by appending rows to a table in my SQL Server database (same hosting).
So how should I implement the bit where I check the backlog and send the emails?
Should it be some kind of separate thread in my main web application, which restarts itself every 15 minutes?
Another option I considered was just opening an HTTP-POST interface which, when called with an appropriate admin password, runs an iteration of the email sender.
I could then create a small console app on my local PC which calls the interface every 15 minutes; a sketch of what I have in mind is below.
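Something like this minimal console client, perhaps (the URL and admin password are placeholders):

    using System;
    using System.Collections.Generic;
    using System.Net.Http;
    using System.Threading.Tasks;

    class EmailTrigger
    {
        static async Task Main()
        {
            using (var client = new HttpClient())
            {
                while (true)
                {
                    // Kick off one iteration of the email sender on the web app.
                    var response = await client.PostAsync(
                        "https://yoursite.example.com/admin/send-emails",
                        new FormUrlEncodedContent(new Dictionary<string, string>
                        {
                            ["password"] = "your-admin-password"
                        }));
                    Console.WriteLine("{0}: {1}", DateTime.Now, response.StatusCode);

                    await Task.Delay(TimeSpan.FromMinutes(15));
                }
            }
        }
    }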
The first option is simpler, but the second might be more robust.
Any ideas?
I would recommend taking a look at Quartz.NET. Also, an important thing you should be aware of is that the web server can unload the ASP.NET application from memory if it is not being used, meaning that all threads that have been spawned would simply die. That's one of the reasons why such tasks shouldn't be performed in ASP.NET applications but rather offloaded to Windows services.
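A minimal Quartz.NET sketch of the 15-minute job (Quartz.NET 2.x style; the job body is hypothetical, and per the point above the scheduler should live in a process that stays alive, such as a Windows service):

    using Quartz;
    using Quartz.Impl;

    public class EmailBacklogJob : IJob
    {
        public void Execute(IJobExecutionContext context)
        {
            // Check the backlog table and send any pending emails here.
        }
    }

    public static class SchedulerSetup
    {
        public static void Start()
        {
            var scheduler = StdSchedulerFactory.GetDefaultScheduler();
            scheduler.Start();

            var trigger = TriggerBuilder.Create()
                .StartNow()
                .WithSimpleSchedule(x => x.WithIntervalInMinutes(15).RepeatForever())
                .Build();

            scheduler.ScheduleJob(JobBuilder.Create<EmailBacklogJob>().Build(), trigger);
        }
    }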
Jeff Atwood did a post on how he originally implemented the badge system on Stack Overflow, using an expiring cache item to re-run the process periodically.
https://blog.stackoverflow.com/2008/07/easy-background-tasks-in-aspnet/
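The trick from that post, roughly: insert a cache item that expires every N minutes and do the work in its removal callback, then re-insert it. A sketch (call Start() from Application_Start; the interval and method names are placeholders):

    using System;
    using System.Web;
    using System.Web.Caching;

    public static class BackgroundTask
    {
        private const string Key = "email-task";

        public static void Start()
        {
            // NotRemovable keeps the item alive until the absolute expiration.
            HttpRuntime.Cache.Insert(Key, DateTime.Now, null,
                DateTime.Now.AddMinutes(15), Cache.NoSlidingExpiration,
                CacheItemPriority.NotRemovable, OnCacheRemoved);
        }

        private static void OnCacheRemoved(string key, object value,
                                           CacheItemRemovedReason reason)
        {
            SendPendingEmails();   // do the periodic work
            Start();               // re-insert so it fires again in 15 minutes
        }

        private static void SendPendingEmails() { /* ... */ }
    }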
I have done something similar in the past, sending emails out every day. The service was non-essential, and it didn't matter if the emails missed a day or two, as they would go out eventually anyway, but the system worked quite well. It's all ASP.NET, so it works fine in the hosting environments I use, without needing access to services on the server or a local trigger running from your desktop.