I am building a web app with a ReactJS front-end and a Rails back-end API. I only need to display 4 images in total across the whole app; these 4 images are picked from a group of approximately 50 images, and that group isn't going to grow much (at most 10 more images per year). The 4 images are supposed to change every 3 to 7 days.
So, in terms of productivity, performance and price, which of the following is the best way to handle my images?
Create a local, static img folder in my React front-end with all the images, and import them in my components.
Use an image upload/storage service (e.g. Cloudinary, Imgix, AWS S3...) with my Rails back-end to serve the images.
Or is there an even better solution than these two?
Given the nature of the software you're describing, I'd suggest going with local, static images in your React front-end app.
The main reasons are:
You've mentioned the group won't grow by more than 10 images per year, so it's easy to update manually whenever you need to.
You won't depend on a third party for storage (unlike with AWS S3 or any other provider, which would be an unnecessary dependency).
The images will work independently of the back-end API server, so even if the back-end fails, the platform stays robust because it doesn't rely on the server to show these images.
This also reduces the bandwidth used between server and client: every image request is served by the client app, whose JS, CSS and image files the browser will have cached, so it's automatically optimized for better scaling.
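The question mentions the 4 images changing every 3 to 7 days; even with static files you can rotate the selection without redeploying by seeding it from the date. A sketch of the idea in Ruby (the file names and the 5-day window are made up):

```ruby
require "date"

# A date-seeded pick: every run inside the same rotation window samples
# the same 4 images, so no server state or database is needed.
IMAGES = (1..50).map { |i| "image_#{i}.jpg" } # hypothetical file names
ROTATION_DAYS = 5

def current_images(today = Date.today, count = 4)
  window = today.jd / ROTATION_DAYS                # which rotation window we are in
  IMAGES.sample(count, random: Random.new(window)) # same seed => same 4 images
end
```

The front-end could ship the same logic in JavaScript, or the Rails API could expose the 4 current file names from a tiny endpoint.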
I currently have a Rails app running on some dedicated servers, where I need to dynamically (per request) generate banner images by fetching product images from S3 and applying custom formatting on top of each image (combining it with a logo, product price, some text etc.). After the image has been generated, it can be cached in a CDN.
There are many, many product images, and the data/text/prices that need to go into each image change often, so I don't want to rely on pre-generating all the image combinations and storing them in S3. In my current setup, which uses ImageMagick under the hood to generate the images, a request takes around 750 ms, of which 2/3 is time spent in Ruby/ImageMagick and 1/3 is network.
I'm considering moving the job of generating the actual banner images onto Amazon (EC2 probably). This way the network path to S3 is shorter and I can scale up and down as requests come in. I can then fetch the data/text needed to generate each image via an API from my current app. However, I'm unsure which libraries fit this exact image-generation task and would perform well. And is there a better way than spawning EC2 servers (e.g. Lambda)?
You could look at ruby-vips. It's typically 3x to 4x faster than ImageMagick and needs a lot less memory. If you open an issue on the ruby-vips GitHub repo, I could help set up a benchmark.
It's part of Rails 6, so it might be simple to experiment with, depending on which Rails version you are using.
You could also deploy your app to Elastic Beanstalk and let Amazon take care of the infrastructure needed to scale when you see high load.
http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/Welcome.html
Or, as you pointed out, you could go serverless with Lambda.
We have 20 different websites on two servers on AWS. All of them use a web application called DesignMaker (an MVC application that uses ImageMagick for image composition and alterations) to do heavy image processing on users' images. Users can upload images to the application and start designing with them. You can assume that all the image processing is optimized in the code.
My concern is to take the load of heavy image processing off the CPUs of the main servers and put it on another server. The first thing that comes to mind is to separate that application and turn it into a web service running on other machines, moving the image-processing load there. Please tell me if I have missed something.
Is calling an API to do some image processing a good approach?
What are the alternatives?
You're right to move image processing off your web thread; doing it there is just bad practice.
If it were me (and I have done this in a few projects I've worked on), I would upload the image from the MVC app to an AWS S3 bucket, then fire off a message using SQS or some other queuing platform. Then have an Elastic Beanstalk instance listening for messages on the queue; when it gets a message, it picks up the image from S3 and processes it however you want.
(I'm an Azure guy, so forgive me if I've picked the wrong AWS services, but the pattern is the same.)
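The upload → queue → worker flow is the same whichever vendor you pick. A minimal in-process Ruby sketch of the pattern, using a Thread and a Queue where a real deployment would use SQS for the queue and S3 for the store (keys and the "processing" step are placeholders):

```ruby
jobs    = Queue.new  # stands in for SQS
store   = {}         # stands in for S3 (key => bytes)
results = Queue.new

# Worker thread: stands in for the instance polling the queue.
worker = Thread.new do
  while (key = jobs.pop)                        # a nil message shuts it down
    image = store[key]                          # "download" from the store
    store["processed/#{key}"] = image.reverse   # placeholder for real processing
    results << "processed/#{key}"
  end
end

# Web-app side: "upload" the image, then enqueue a message with its key.
store["uploads/photo1.jpg"] = "rawbytes"
jobs << "uploads/photo1.jpg"

puts results.pop  # => processed/uploads/photo1.jpg
jobs << nil
worker.join
```

The key property is that the web request only does the cheap upload-and-enqueue part; however slow the processing is, it never blocks a web thread.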
I have implemented a Rails application and deployed it on an Azure web server.
The issue I am seeing is that some of the images in the public folder take a long time to load, so the website's performance is very poor. Some of the images are as small as 20 KB and still take around 13 seconds to load.
My question: if I were to put the images from the public directory on a CDN (content delivery network) and load them from its cache, would that improve performance, or would it not affect overall performance?
Is it also possible to put all the images on a CDN for a Rails app in production?
Thanks.
Well, remember that a CDN is just another webserver. All you are doing when you use a CDN is hyperlinking to the resource on that other webserver.
<img src="http://www.quackit.com/pix/milford_sound/milford_sound_t.jpg" />
Now, will it speed up the load times of your app? Maybe. There are lots of factors affecting this, namely:
Why is your app loading slowly? Is it your connection? Are you on dial-up? A CDN can't help with that.
Why is your Azure server slow? Is there a lot of traffic? If so, a CDN will help.
Most large production applications use a CDN for all of their static assets, such as images, CSS and JavaScript. (They probably own the CDN, but still, it's a CDN.) So yes, every image on your site could be stored on a CDN. (Very easy if these are all static images.) However, CDNs that do this are not usually free.
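In Rails, pointing all asset-pipeline assets at a CDN is a one-line configuration change; the asset helpers then emit CDN URLs in your image/CSS/JS tags. A sketch, with a placeholder CloudFront hostname:

```ruby
# config/environments/production.rb
Rails.application.configure do
  # Serve all asset-pipeline assets (images, CSS, JS) from the CDN.
  # A pull-through CDN fetches the file from your app on a cache miss,
  # then serves it from its own edge cache afterwards.
  config.action_controller.asset_host = "https://dxxxxxxxx.cloudfront.net"
end
```

Note this only covers assets managed by the asset pipeline (assets/images/...), not arbitrary files dropped in /public.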
Why are you choosing Azure for a Rails app? It's possible, but it would be much easier to use something like Heroku or Engine Yard. You could even use a VPS service like DigitalOcean and set up your own small army of CDNs using VPS providers around the country. (If you're a cheapskate, like me.)
There usually aren't that many images in a production Rails app located in /public. Usually those images are in assets/images/...; the only things I would keep in public are a small front-end site and perhaps some 404/error pages.
Initially I wanted to host my application on Heroku, but since the filesystem on Heroku is read-only, I would need to store uploaded images on Amazon S3 or something similar.
The pictures are mostly mobile-phone-camera quality (I think somewhere between 500 KB and 1 MB). I would also like to create thumbnails of those pictures with Rails and save them.
Since I don't know how much traffic I will have, the whole system should be scalable.
Is there a better/cheaper alternative to the above (Heroku + S3), e.g. storing images in the database or using other hosts?
This really depends on whether you want to stay with a PaaS (i.e. Heroku, Azure, etc.) or go with an IaaS (i.e. AWS). Given that you mentioned Heroku, I will assume you want a PaaS. I'm not sure of the exact cost difference between services (but I can get this for you if needed), but combining Heroku + S3 + (Paperclip || CarrierWave) gives you an incredibly fast solution that scales. Later you can look into cutting costs, once you've proven your idea.
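For the thumbnail part, a CarrierWave uploader (one of the two gems mentioned) backed by S3 is only a few lines. This is a hedged sketch: the class name, dimensions and gem choices are illustrative, and the fog credentials are assumed to be configured elsewhere:

```ruby
# app/uploaders/picture_uploader.rb — assumes gem "carrierwave",
# gem "fog-aws" and gem "mini_magick" (ImageMagick) in the Gemfile.
class PictureUploader < CarrierWave::Uploader::Base
  include CarrierWave::MiniMagick

  storage :fog # configured in an initializer to point at your S3 bucket

  # A 200x200 thumbnail, generated at upload time and stored
  # alongside the original on S3.
  version :thumb do
    process resize_to_fit: [200, 200]
  end
end
```

Generating the thumbnail once at upload time fits the read-only Heroku filesystem well: the dyno only uses temp space during the request, and both files end up on S3.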
I am using PhantomJS to dynamically generate 10 large screenshots of websites per request. It is therefore important that I cache these images and check whether they are already cached so I can serve them directly. I've never cached images before, so I have no idea how to do this.
Some other information:
PhantomJS writes images to your local filesystem at the path you specify.
I want to cache these images but also need to balance that with updating the cache when the websites change.
I will be running these image-generation processes in parallel.
I'm thinking of using Amazon's Elastic MapReduce to take advantage of Hadoop and help with the load. I've never used it before, so any advice here would be appreciated.
I'm pretty much a complete noob at this, so in-depth explanations with examples would be really helpful.
What's your front-end web server? Since PhantomJS can write images to your local filesystem at any path you specify, point it at the document root of your web server so the images are served statically. That way Rails doesn't even have to be involved.