Grails Hosting on EC2 Amazon Linux Instance

I have successfully uploaded and deployed my Grails application on Amazon Elastic Beanstalk with Tomcat 8 and Java 8 on a Linux EC2 instance, and the web app is up and running. It works well when making REST API calls to and from the RDS database. I have an API for uploading files to the server from the mobile app and from the web app frontend. When I run this Grails app on localhost, this API works great and uploads files successfully to the user.home/{myapplicationDirectory}/somefile path on my Windows OS. But after deploying the app to Elastic Beanstalk, trying to upload an image from mobile fails with a FileNotFoundException:
FileNotFoundException occurred when processing request: [POST] /api/images/add
/usr/share/tomcat8/sdpl/images/260519011919.zip (No such file or directory)
Stacktrace follows:
java.io.FileNotFoundException: /usr/share/tomcat8/sdpl/images/260519011919.zip (No such file or directory)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
I have a service that returns the application's data storage directory with this method:
def String getApplicationPath() {
    return System.getProperty("user.home") + File.separator + "images" + File.separator;
}

Hi, as I don't see your full application I don't want to be too presumptuous, but since you're using AWS Elastic Beanstalk you should always consider local file storage to be temporary. Your server could be terminated and restarted by Beanstalk if it stops responding or fails any health checks, taking any locally stored files with it.
You have other options available. Again, I don't know whether you've considered them and have a good reason for using the local file system, so forgive me if that's the case. If not, you could use S3 to store the images: you then don't have to worry about disk space, and the images could automatically be served via AWS's CDN, CloudFront, which also reduces the load on your app.
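
To make that concrete, here is a minimal sketch of a Grails service that stores an uploaded file in S3 with the AWS SDK for Java. The service name, the bucket name "sdpl-images", and the setup are assumptions, not taken from the question; it presumes the aws-java-sdk-s3 dependency is on the classpath and that credentials come from the instance profile:

import com.amazonaws.services.s3.AmazonS3
import com.amazonaws.services.s3.AmazonS3ClientBuilder

class ImageStorageService {
    // Resolves credentials from the environment / instance profile; no keys in code.
    AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient()

    // "sdpl-images" is a hypothetical bucket name; create and configure your own.
    String store(File image) {
        String key = "images/" + image.name
        s3.putObject("sdpl-images", key, image)   // upload the file to S3
        return key
    }
}

The upload endpoint would then hand the received file to a service like this instead of writing under user.home, and the stored objects could be fronted by CloudFront for serving.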
Alternatively, if you really want to store these images on a filesystem, you can look at using EFS, the Elastic File System. Your Elastic Beanstalk instance could mount the filesystem on startup, so it will always be available whenever your instance(s) start.
I didn't suggest using a standard EBS volume, as a volume can only ever be attached to a single instance. With EFS you don't have to worry about space, and it can be mounted by multiple instances, so it's a little more flexible.
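
If you do mount EFS, the service method from the question only needs to point at the mount and ensure the directory exists before the first write; a missing parent directory is the usual reason new FileOutputStream(...) fails with the "No such file or directory" error shown above. A sketch, assuming a hypothetical /mnt/efs mount point:

def String getApplicationPath() {
    // /mnt/efs is an assumed mount point; use wherever the EFS volume is mounted.
    File dir = new File("/mnt/efs" + File.separator + "images")
    if (!dir.exists()) {
        dir.mkdirs()   // create the tree so FileOutputStream can open files inside it
    }
    return dir.absolutePath + File.separator
}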

Related

MLflow: Unable to store artifacts to S3

I'm running my MLflow tracking server in a Docker container on a remote server and trying to log MLflow runs from my local computer, with the eventual goal that anyone on my team can send their run data to the same tracking server. I've set the tracking URI to http://<ip of remote server>:<port on docker container>. I'm not explicitly setting any AWS credentials on the local machine, because I would like to just be able to train locally and log to the remote server (run data to RDS and artifacts to S3). I have no problem logging my runs to the RDS database, but I keep getting the following error when it gets to the point of logging artifacts: botocore.exceptions.NoCredentialsError: Unable to locate credentials.
Do I have to have the credentials available outside of the tracking server for this to work (i.e. on my local machine where the MLflow runs are taking place)? I know that all of my credentials are available in the Docker container that is hosting the tracking server; I've been able to upload files to my S3 bucket using the AWS CLI inside that container, so I know it has access. I'm confused by the fact that I can log to RDS but not to S3. I'm not sure what I'm doing wrong at this point. TIA.
Yes, apparently I do need to have the credentials available to the local client as well. Run metadata is routed through the tracking server to RDS, but the MLflow client uploads artifacts directly to S3 itself, so AWS credentials also have to be available on the machine where the run executes.

Why is my Rails app faster on Heroku than on my localhost?

When I was developing my Rails app I noticed that it got extremely slow as soon as I included some background file creation via Amazon S3.
When I uploaded my site to Heroku the load time dropped a lot.
On my local server a page load takes about 12s; on Heroku just about 1s.
Why does my app run that much slower on my local computer?
Does the Heroku server have a faster connection to the Amazon S3 servers?
To answer your last question: yes, Heroku almost certainly has a faster connection to the AWS servers. According to the Heroku support page:
Heroku’s physical infrastructure is hosted and managed within Amazon’s secure data centers and utilize the Amazon Web Service (AWS) technology
Since Heroku's infrastructure is physically near Amazon's, and probably uses the same datacenters, any uploading or downloading to Amazon servers will be fast.
EDIT:
And as @Stefan noted, running Rails in production mode speeds up a lot of things, including asset serving. You can check whether that's the issue by running your server locally in production mode:
$ rails s -e production

Rails - Do we have to store uploaded files on the same web server?

Do we have to store uploaded files on the web server that hosts the Rails app? Our Rails app is hosted on an Ubuntu server. However, we would like to store uploaded files on a file server running Windows. The Ubuntu server and the Windows server are on the same internal network. Is it possible to do this? If so, what is required?
I remember doing something like this: I used Samba on Ubuntu to share a folder.
Here are some tutorials I found:
http://rubyguide.blogspot.com.ar/2012/02/install-samba-server-in-ubuntu-1104.html
http://www.sitepoint.com/ubuntu-12-04-lts-precise-pangolin-file-sharing-with-samba/

When you use Heroku to host your RoR applications, where and how do you access your user-uploaded files?

I've deployed a test Ruby on Rails application on Heroku, and so far I'm a bit confused, coming from a Windows Server 2008 VPS background.
On Windows Server 2008, I had my application folder and used IIS to host the ASP.NET MVC application. I could then access folders in code and save files to disk with image.Save(path).
How do people handle user uploads on Heroku-hosted applications?
For example, I want to let people create an apartment listing on the site, and upload pictures of the apartment. How does this work in Heroku?
You don't. Heroku has almost no provision for "app-local" storage: the filesystem is read-only except for the /tmp directory, which is not guaranteed to persist and is in any case local to each dyno.
You will have to upload the files to some cloud storage service (S3, Cloud Files, etc.). There are usually gems that will manage this for you (CarrierWave is a fairly recent example).

Problem downloading large files with Rails, Nginx (with x-accel-redirect) and Apache (x-sendfile)

We have a big problem with downloads when the file size is over 1 GB.
We are using Rails 2.3.5 and Passenger 2.2.9 on an Amazon EC2 instance with 2 GB of RAM, running Fedora 10.
Files are stored in /mnt/files; the project is in /mnt/www/project.
We have tried sending files with Nginx and X-Accel-Redirect, and also with Apache and X-Sendfile.
We can only ever download 1.09 GB instead of 1.54 GB!
We can download files without problems when the size is less than 1 GB.
If we link the same file (which is not corrupted) in the Rails public dir, we can download it without any problem.
X-Accel-Redirect and X-Sendfile are configured correctly, tested and checked many times.
So:
Nginx with X-Accel-Redirect [fail]
Apache with X-Sendfile [fail]
send_file without X-Accel-Redirect or X-Sendfile, on Nginx or Apache [fail]
Linking the file in public and downloading directly [works]
Any suggestions?
Thanks!
If you're looking to restrict access to these downloads, have you tried the Access Key module?
