My initial website will not see heavy traffic during beta. But assuming success, as traffic builds I will need a plan for handling it, from recognizing the growth to actually dealing with it. I'd like to start studying that now.
There is an amazing wealth of information about this on the web, and I was hoping someone might help me cut through it by pointing me in the right direction with articles, walkthroughs, etc. that are more practical and less theoretical. And of course any direct guidance on the issue would certainly be appreciated.
I am currently using a hosting provider, not running my own IIS server.
It's very difficult to predict where your scaling bottlenecks will be. If you're missing a database index, for example, queries will run slowly, and load-balancing your web server won't help.
To start with, you should get comfortable with profiling your application. There are a lot of great tools for the backend, including the Visual Studio Profiler, ANTS Profiler and my favorite, dotTrace.
Next (or maybe first, it doesn't matter), you'll want to profile the client side. Chrome Developer Tools works great, or you can use the new Firefox Developer Edition. This will show you response times and how long it takes to load assets such as your CSS, JavaScript, images, etc.
After doing both, you should have a good idea where your problem is. But in general, the "easiest" ways to improve scaling are:
Bundle/minify/compress your asset files. You can remove hundreds of KB from page loads.
Use a CDN. Browsers limit the number of simultaneous connections per domain when fetching assets. With a CDN you can split the requests between domains, and popular libraries like jQuery are more likely to already be cached.
Cache data as appropriate. If there's some mostly static content that never changes, cache it instead of querying your database every time. Take advantage of things like Output Caching, where your entire rendered view is cached (the idea is sketched below).
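To make the caching point concrete, here is a minimal sketch of the idea, written in TypeScript rather than ASP.NET, so treat it as an analogy to Output Caching and not the actual API. The key name, TTL, and the loader function are assumptions for illustration.

```typescript
// A tiny TTL cache: serve repeated requests from memory instead of
// hitting the database every time. Illustrative only -- not ASP.NET's
// Output Caching, just the same idea sketched in TypeScript.
type CacheEntry<T> = { value: T; expiresAt: number };

const cache = new Map<string, CacheEntry<unknown>>();

async function cached<T>(
  key: string,
  ttlMs: number,
  load: () => Promise<T>
): Promise<T> {
  const hit = cache.get(key) as CacheEntry<T> | undefined;
  if (hit && hit.expiresAt > Date.now()) {
    return hit.value; // cache hit: no database round-trip
  }
  const value = await load(); // cache miss: query once, then reuse
  cache.set(key, { value, expiresAt: Date.now() + ttlMs });
  return value;
}

// Hypothetical usage: cache a mostly-static query for five minutes.
// const products = await cached("products", 5 * 60_000, () => db.query("..."));
```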
Check out some of the checklist items in this post.
Once you've taken all those steps, if you still have problems, then you can look at things like load balancing and better hardware. Otherwise, you risk throwing money away when it might not make a difference at all.
It's great to familiarize yourself with all aspects of a web application. However, load balancing is one of those things that can be tricky to set up and is nigh impossible to set up right without very comprehensive knowledge of, and experience with, server and networking architecture. Even the big boys like Twitter and Facebook struggle with handling scale. It's very much a learn-as-you-go process and extremely particular to individual circumstances: an absolutely perfect setup for one application may be completely useless for another.
If you're successful enough to need load balancing, you're likely also successful enough to hire an infrastructure expert to take care of it for you. Short of that, you can use services like Azure, which, while it has its own learning curve, provides load balancing nearly out of the box.
I'm considering using Node.js with a framework such as Express, Meteor, or Sails to build a directory site with social features such as sharing, messaging and uploading media. I don't have any features planned that explicitly require real-time functionality, so does it make sense to use Node.js anyway instead of Rails?
There's so much buzz around Node.js that I am tempted to use it just so that I don't get left behind.
As DHH wisely noted regarding Node vs. Rails, "everything can be used instead of everything else". That's true in the sense that, for example, a Rails site with properly set-up caching can be as fast as one written in Node.js.
Besides, Node is not necessarily about real-time. It's more about being able to handle many light requests (light in terms of the processing time they need). If you expect a high level of concurrency (really expect it, that is, not just dream of it) and every request is supposed to be relatively small, then you could consider using Node, because handling a bigger load (up to a point) will require less work with Node.
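As a rough illustration of what "many light requests" means in practice, here is a minimal Node HTTP server in TypeScript. The 10 ms delay is a stand-in for a quick database or cache lookup, and the port and payload are arbitrary assumptions.

```typescript
import { createServer } from "node:http";

// Single-threaded, event-driven server: each request registers a
// callback and yields, so many small I/O-bound requests can be in
// flight at once without one thread per connection.
const server = createServer(async (_req, res) => {
  // Stand-in for a quick async lookup (database, cache, etc.).
  await new Promise((resolve) => setTimeout(resolve, 10));
  res.writeHead(200, { "Content-Type": "application/json" });
  res.end(JSON.stringify({ ok: true }));
});

server.listen(3000, () => console.log("listening on :3000"));
```

The flip side is that the same event loop makes CPU-heavy requests expensive: one long computation blocks every other request, which is why the advice above stresses light requests.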
Bottom line: use what you are good at, unless you want to try something new. And Node.js is definitely worth trying.
Your question does lack quite a bit of context, and the answer depends entirely on that context.
If this is contract work, or something you want to make money with in the near future, and you're not sufficiently skilled with any of the mentioned Node.js frameworks, then I would recommend using whatever you're already good at.
If this is a private project for fun or any other non-serious purpose, then I would seriously recommend trying one of the mentioned Node.js frameworks.
In my opinion Node.js is currently the cutting-edge web technology, and as a developer you should always try to stay on the cutting edge. That way, once you learn what Node.js can be used for, you might find ways to apply those things in your professional environment.
I've lately been using Meteor a lot and I can highly recommend it. Once you get the hang of it, you can do truly amazing things that you could never even imagine doing (in a reasonable timespan) in a classic PHP project.
Also, according to some, Meteor will replace RoR altogether (blog).
Aside from real-time, the main plus I've seen is that the same team can develop the JavaScript client code and the server-side code for UI web applications. "Code sharing" between client and server seems a pipe dream to me, but the same language is really nice.
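For what it's worth, the kind of sharing that does tend to work in practice is small and mundane: one validation module imported by both the browser bundle and the server. A hypothetical sketch (file name and rules invented for illustration):

```typescript
// shared/validate.ts -- imported by both the client bundle and the
// Node server, so the two sets of rules cannot drift apart.
// (File name and rules are hypothetical.)
export interface SignupForm {
  email: string;
  password: string;
}

export function validateSignup(form: SignupForm): string[] {
  const errors: string[] = [];
  if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(form.email)) {
    errors.push("invalid email address");
  }
  if (form.password.length < 8) {
    errors.push("password must be at least 8 characters");
  }
  return errors;
}

// Client: call validateSignup(form) before submitting.
// Server: call validateSignup(req.body) again before touching the DB.
```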
Looking for some suggestions from the community for development stacks for collaborative environments. Could you share what you have and what has worked for you or your team?
The following is probably too verbose for some: just some rambling thoughts I've had about my particular scenario, as I'm working with a hatchling dev group. So if you read it, 1UP for you; otherwise, please feel free to just share your thoughts on the first question and what's worked for your team.
I have a situation where a couple of other developers and I are working together, and I'd like to set up the "best" dev environment possible for Ruby on Rails development. At the moment I use Git and some of the usually accepted best practices for development; however, the other guys are new and not terribly familiar with the shell, Git, etc. They come from more of a PHP, monolithic environment.
I do have a central Linux server that has hitherto been used for LAMP-based dev for them. I can retool it into anything I'd like it to be, as I'm quite adept and experienced at Unix system and network administration.
Could someone please suggest what may work well in this scenario? Again, ultimately we need to do collaborative development that has the lowest learning curve. I'll be the only one deploying to Heroku until I feel comfortable with their experience.
I would like to put something together that can get us all up to speed in a matter of a day rather than a longer learning curve, and then allow them to grow into the shell and so forth over the next couple of weeks.
What I was thinking of was a unified projects folder shared over SMB (we have mixed Windows and Mac workstations) and SFTP, with either an Apache virtual host for each project or Thin serving Rack. I'd continue to use my own methods, but this could give them the flexibility to grow into the workflow while still being able to restart httpd or Thin as needed.
Am I on the proverbial right track, or has someone seen a better alternative? A lot of things have crossed my mind, such as Gitorious (since we'll have a lot of small projects needing to be tracked, and an enormous GitHub account is not feasible), Heroku, OpenShift and a lot of other things, but I have enough uncertainty that I'd like to get some input from the community on the right mix for great collaborative agile development.
I have an answer, but I think you have conflicting requirements: lowest learning curve vs. low/free cost.
You say that GitHub is not feasible, but it offers unparalleled features for novice users. They can see commits on a website instead of on the command line, can even edit files right in the browser (since yesterday; it uses Ace) and gain insight into the branching/merging process.
Another paid option is http://cloud9ide.com/ which is also web-based.
I use my own development server as well, but only for experienced people who need no hand-holding. If I were to let everyone on there, the amount of support would consume my entire day.
It is my opinion that people doing Rails development should adopt the best practices in the field. See it like this: at least you won't burden them with learning Subversion or, eek, CVS. Just seeing the commits on GitHub and being able to discuss puzzling pieces of code right there is worth the money.
I've written a simple order application that consists of a one-page order form containing a set of input fields and a submit button. The order form is loaded by invoking a URL (with data being passed via POST):
http://localhost:8080/orderform.jsp
The input fields of the order form are populated with the POST data, and while the form is loading there is some server-side processing taking place, e.g. retrieving data from a database. Once the page has loaded and the submit button is clicked, further server-side processing takes place to process the order, and then a receipt page is displayed to the user.
I need a way to load test this simple process in order to ascertain the maximum throughput of the server.
Any tool suggestions would be greatly appreciated.
For a relatively simple load test like this, I would recommend JMeter.
While JMeter is OK, it's really weak in the analytics and parameterization areas. If you want something free but enterprise-class, I'd recommend CloudTest Lite from SOASTA. It gives you the fastest way on the market to create realistic scenarios, real-time analytics(!) and, best of all, it's free.
You can download it and give it a try here: http://www.soasta.com/cloudtest/lite/
Fred
This would be a trivial test using our product, Web Performance Load Tester. You didn't mention a budget, and our product is not free. However, the free version will run 10 simultaneous users, and if you reduce the think time you can push a lot of transactions through it.
If you need something free, JMeter, Grinder and OpenSTA are popular options. You'll spend a lot more time learning the tools and setting up the tests.
I wrote one very simple tool myself because I didn't want to use any browser-based tool. The biggest problem was that I also wanted to stress test locally, i.e. hitting localhost.
Maybe you will also find it useful:
https://github.com/georgekosmidis/WebStressTool
It uses the Apache HTTP server benchmarking tool (ab) v2.3.
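For readers curious what such a self-written tool boils down to, here is a minimal sketch. It is not the code from the linked repo: the URL, form body, and worker counts are assumptions, and it relies on the built-in fetch of Node 18+. It fires concurrent POSTs at the order form and reports throughput.

```typescript
// Minimal DIY load generator: N concurrent workers, each sending a
// fixed number of POSTs, followed by a requests-per-second summary.
// All values below are placeholders for illustration.
const url = "http://localhost:8080/orderform.jsp";
const concurrency = 20;
const requestsPerWorker = 100;

async function worker(): Promise<number> {
  let ok = 0;
  for (let i = 0; i < requestsPerWorker; i++) {
    const res = await fetch(url, {
      method: "POST",
      headers: { "Content-Type": "application/x-www-form-urlencoded" },
      body: "orderId=123&qty=1", // placeholder form data
    });
    if (res.ok) ok++;
  }
  return ok;
}

async function main() {
  const start = Date.now();
  const results = await Promise.all(
    Array.from({ length: concurrency }, () => worker())
  );
  const total = results.reduce((a, b) => a + b, 0);
  const seconds = (Date.now() - start) / 1000;
  console.log(
    `${total} ok in ${seconds.toFixed(1)}s = ${(total / seconds).toFixed(1)} req/s`
  );
}

main();
```

A purpose-built tool like JMeter adds ramp-up, think times, and percentile reporting, which a sketch like this deliberately omits.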
A friend has just pointed me to the uniGUI website. In a previous question I asked about a comparison between Raudus and ExtPascal.
Now, uniGUI seems to be an alternative to Raudus that moreover has the advantage of letting you compile a Win32 exe at the same time, from the same source code (provided, of course, you limit yourself to uniGUI-approved UI components).
I think this is amazing. Even if, at first sight, the idea will not make all the web-app purists happy, in my opinion having this kind of tool is great.
There are many (even small) applications that can benefit from this "code once, get a double UI" approach.
Anyway, what are your feelings about this? Do you think it has a future?
ADDITIONAL NOTE: In order not to start a general discussion, please try to answer with reference to uniGUI specifically, not just in general terms. Thanks.
I started developing uniGUI (or whatever name it may adopt in the future) around two years ago. Since then it has evolved a lot. The initial version was based on VCL for the Web. With the addition of ExtPascal and Ext JS, it has become a very advanced tool for developing Web apps based on Delphi.
uniGUI simply defines itself as a Web application development framework. The concept of a Web application has been controversial since its inception. Some people claim that the Web is stateless while applications are stateful, and that one should not mix the two. However, nowadays, with an increasing demand for web applications, such notions remain only a philosophical point of view.
More and more people want to access their desktop apps from the internet. Companies want their local accounting software to be accessible to other branches. A security company wants a web gateway for its access-control software. These are all examples of the increasing demand for web apps.
We can consider uniGUI an abstraction layer for Delphi VCL controls which extends them to the Web. Like all other abstraction layers, it helps the developer focus on application logic rather than on the development tool itself. It tries to fully integrate the RAD approach into Delphi-based Web development.
The dual nature of uniGUI is simply a plus. I'm referring to its ability to deploy the same application to both web and desktop from the same codebase. This feature may be useful for some developers but useless for others, and it can be completely ignored by those who focus on Web development only.
As for scalability, the best target for uniGUI and other similar tools seems to be the intranet, where the number of clients is predictable and connection speed is a non-issue.
That said, nothing prevents developers from developing web apps that target the internet. In the end it is all Ext JS on the client side and Delphi event handlers on the server side. It all depends on how smartly you design your app and how efficiently you manage your resources. If each of your sessions consumes 10 MB of memory, you're likely to run out of memory very soon: a thousand concurrent sessions at 10 MB each already need 10 GB of RAM.
In conclusion, this framework will have a group of users who find it best for their needs. There is no black or white here, only big gray areas. Like any other tool, it depends on the company, the particular project and the available deployment options whether it is the right tool for you or not.
Web applications are very different from GUI ones. Mixing the two approaches for anything more serious than a simple form or a few buttons is, I think, just wrong.
I think that the uniGUI idea is a great one, but Embarcadero should be the one offering it, as one more option for developers, rather than an independent party. Delphi developers have always wanted an easy way to create web applications, and frankly WebBroker is very poor.
Anyway, what are your feelings about this? Do you think it has a future?
The general idea definitely has a future, if only in the PT Barnum sense. This particular implementation doesn't seem to be anything special - there's nothing in it that grabs me as being a great solution to any of the problems I currently have to deal with. But then, I see thick client apps, especially traditional Delphi 2 tier apps, as quite different from web apps.
I'd be more interested if uniGUI worked the other way, and provided a solid MVC framework for Delphi, then extended that to the web. That way you could more easily have your data + business logic + GUI in three connected pieces, rather than the traditional Delphi/RAD problem that business logic gets all tangled up in the GUI, then the web application is a pain to develop because the layers "have to be" separated. This smells like "solving" that problem by letting you leave the business logic mixed into the GUI when you move to the web.
I am in the process of laying down the requirements for a photography community site. An important feature to investigate would be allowing more photos per account than rival sites in my country offer. What are the possibilities out there?
Should I go for something like Amazon S3, or is there anything that offers more image-related features? I am mostly interested in a low price per GB (for storage and transfer out).
I used to work for a social networking website that hosts billions of images, and we evaluated S3. The conclusion was that it is too expensive for heavy-traffic sites. The storage itself is pretty cheap, but the costs for accessing the content on S3 add up quickly. That makes S3 more suitable for applications like online backups. In my view, cost is the main con.
On the other hand, this is only a concern once your site gets large. The biggest advantages of S3 are that you don't have to worry about scalability and that it's pretty easy to set up and then forget about it because it just works. Many medium sized services use S3 with great success.
The solution we went for is an array of dedicated servers that host the images and also run web servers (don't use Apache; use web servers optimized for static content, such as lighttpd or nginx), and, in front of those, a CDN (content delivery network, such as Akamai or Panther Express). You will typically get high hit rates (depending on your site's access patterns), so end users will get most files directly from the CDN without causing any load on your servers (except the first time a file is accessed). Thus you might be fine with just one server and a mirror for a while. As you scale, the challenges become how to distribute your images across the farm, how to manage redundancy, and so on; one common approach to distribution is sketched below.
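One common way to handle the "distribute your images across the farm" part is deterministic hashing: derive the host from the image ID so every frontend computes the same URL with no lookup table. A minimal sketch, with placeholder host names:

```typescript
import { createHash } from "node:crypto";

// Pick a host deterministically from the image ID. Host names are
// placeholders; any stable hash function works here.
const imageHosts = ["img1.example.com", "img2.example.com", "img3.example.com"];

function hostFor(imageId: string): string {
  const digest = createHash("md5").update(imageId).digest();
  return imageHosts[digest.readUInt32BE(0) % imageHosts.length];
}

function imageUrl(imageId: string): string {
  return `https://${hostFor(imageId)}/photos/${imageId}.jpg`;
}

console.log(imageUrl("abc123"));
```

Note that plain modulo reshuffles most keys whenever a host is added or removed; consistent hashing avoids that at the cost of extra machinery, which is part of the redundancy challenge mentioned above.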
I assume that time-to-market also plays a role. In that respect, a good strategy might be to start with S3 to be up and running quickly; later on you can still migrate to a more sophisticated solution. In that case, make sure management keeps this in mind: non-tech people tend to believe that once a piece of functionality works, you never have to touch it again. And be aware that migrating a lot of data takes time. When we changed our photo architecture, the copy jobs ran for months.
How about Flickr/Picasa integration? Users could use their own Flickr/Picasa accounts to store their photos while using the features of your site. In that case you pay nothing for photo storage. :P
I myself would rather have a single photo-storage account than an individual account for each site.