Playframework 2.2 update: where to store uploads

Starting with Play 2.2, the getFile root directory is target/universal/stage/ instead of the project's root folder. In our project we uploaded files to the <root>/uploads folder, but after updating to Play 2.2 these uploads are inaccessible.
Is it intended that uploads be stored in target/universal/stage? I believe not, since those folders are created at build time, so we can't be sure they remain in place.

Play specifically does not support this.
See this recent discussion from the mailing list: https://groups.google.com/forum/#!topic/play-framework/iHwp1FIjZbw
As noted by the Play maintainers in that discussion, a frontend server like nginx running in front of Play is the recommended approach.
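To make that concrete, here is a minimal, illustrative nginx sketch (the port, URL prefix, and /var/data/uploads path are assumptions, not details from the discussion): nginx serves uploaded files straight from a directory outside the Play distribution, so rebuilds can't wipe them, and proxies everything else to the Play application.
server {
    listen 80;

    # Uploads live at an absolute path outside target/universal/stage.
    location /uploads/ {
        alias /var/data/uploads/;
    }

    # Everything else is handled by the Play application.
    location / {
        proxy_pass http://127.0.0.1:9000;
    }
}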

Related

Extract ZIP and move all files to parent directory

I'm only just getting started with Yeoman, trying to create a Generator that downloads WordPress, unzips it, and then proceeds to download my own WordPress starter theme.
The problem I'm having is that when I extract the latest.zip from wordpress.org (using this.extract()) it contains a wordpress/ directory resulting in my directory structure being my-project/wordpress/ rather than my-project/.
I've tried moving, copying and deleting the wordpress/ directory with varying degrees of success; using this.fs.copy() I actually managed to get the files into the correct folder, but when trying to delete the original wordpress/ directory the user has to confirm deletion of every single file (not ideal). When I tried this.fs.move() I had to confirm each and every move instead.
I've found similar gulp/node.js questions on here, but I would prefer to use Yeoman's built in this.fs API.
Please note that I am aware of YEOPress, but this is mostly for learning purposes.
I ended up using the Node package fs-extra instead, as it deletes or moves without confirmation.
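For illustration, a minimal sketch of that approach as a Node script (the wordpress/ location and destination are assumptions based on the question; fs-extra's moveSync and removeSync act immediately, with no per-file confirmation):
import * as path from 'path';
import * as fse from 'fs-extra';

// Move everything out of the extracted wordpress/ folder into the project
// root, then remove the now-empty folder.
const extracted = path.join(process.cwd(), 'wordpress');

for (const entry of fse.readdirSync(extracted)) {
  fse.moveSync(path.join(extracted, entry), path.join(process.cwd(), entry), { overwrite: true });
}
fse.removeSync(extracted);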

iOS: How to check if a directory already exists on the FTP server while uploading?

I am working on an app in which I want to upload images and PDFs to an FTP server. I am using this reference: ref. All is working well; the images and PDFs are uploaded to the server with the proper names and sizes.
But now I want to check whether the directory already exists on the server, and I am not able to get this to work with this library.
So my question is: how do I check for a directory on the FTP server? If the directory is there, upload the files; if not, first create the directory and then upload the files into it.
Any ideas? Any help will be appreciated.
Different FTP servers will answer the LIST request in differing ways, so there is no single answer to this question. RFC959 says on the matter:
Since the information on a file may vary widely from system
to system, this information may be hard to use automatically
in a program, but may be quite useful to a human user.
Using the CWD request to change into the directory in question and checking for a successful response will detect the directory; however, it leaves you in that directory as a potentially unwanted side effect.
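As a rough, protocol-level sketch of that probe (not iOS code; the host, credentials, and directory name are placeholders, and a real client must also handle multi-line replies): a 250 reply to CWD means the directory exists, a 550 reply means it does not, and MKD creates it.
import * as net from 'net';

const DIR = 'uploads';                               // placeholder directory name
const socket = net.connect(21, 'ftp.example.com');   // placeholder host
const send = (cmd: string) => socket.write(cmd + '\r\n');
let probing = false;

socket.on('data', (buf) => {
  const code = buf.toString().slice(0, 3);           // FTP reply code
  if (code === '220') send('USER anonymous');        // greeting: log in
  else if (code === '331') send('PASS guest');
  else if (code === '230') { probing = true; send('CWD ' + DIR); }  // probe the directory
  else if (probing && code === '250') socket.end('QUIT\r\n');       // exists: safe to upload
  else if (probing && code === '550') send('MKD ' + DIR);           // missing: create it
  else if (probing && code === '257') socket.end('QUIT\r\n');       // created: safe to upload
});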
For these reasons, among others, you may find a more modern protocol such as SSH (which includes a file transfer feature, SFTP) to be more useful. The DLSFTPClient CocoaPod may be worth a look.
M.

How do I override the Umbraco built-in media library methods to use s3?

I'm currently looking to move my Umbraco installation over to a load-balanced setup. In order to do this, I need to move the media library over to a CDN like Amazon's S3. I tested a few plugins that allow uploading to S3, but they all list media files from the local file directory. This flat out will not work.
I was thinking I would write the code to browse the CDN, but how can I override the built-in media library code so that it uses my version instead? I didn't see a clear way to do this in the docs.
I am using this plugin for Amazon S3: http://our.umbraco.org/projects/website-utilities/amazon-s3-media. The source code is here: https://bitbucket.org/gibedigital/umbraco-amazons3provider. He recently updated the plugin, and it does not use the local file system. The developer was pretty responsive (and made a few updates for me when I asked).
However, I am adding to his project because his plugin did not allow saving within a predefined directory (Amazon's virtual directories). But his source code is a start.
Good luck,
Robin

iOS storing files rules - Storing audio files in an iOS app

I need to be sure I am doing the right thing:
In my app the user can download audio files from the server. I don't want those big files to be backed up, since the user can re-download them whenever needed.
My app targets iOS 4.0 and above.
So, as I understand it, I need to store the files in the Documents directory and set a flag on the directory so it is not backed up?
Am I correct?
Instead of putting them into the "Documents" directory (which gets backed up to the cloud), why not put your audio files into the "Caches" directory (specifically Library/Caches, which does not get backed up)?
Here's another question on Stack Overflow that may give a further answer to your question.
You can also prevent files from being backed up:
https://developer.apple.com/library/ios/qa/qa1719/_index.html

Sharing Uploaded Files between multiple Rails Applications

I have multiple applications (an admin application, a "public"/non-admin application and a web service application) that all share a single database.
I've gotten the applications to share models and other code where appropriate, so I don't have multiple copies of the same code in each. However, the one task that I've yet to configure is how to share uploaded files between the applications. I'm using Paperclip to successfully upload files, but each file is stored inside whichever application handled the upload.
Ideally, I'd like to be able to serve all the files from the web service. My idea was that I'd need some type of task executed every time a new file is uploaded to any of the applications to have the file created in the file structure of the web service.
I know I could easily accomplish serving files from a single application if I loaded the files into the database (which is how I accomplished this in a similar application suite), but I'm not sure if that's the best route to go for managing/serving the files. Another idea I had was storing the files in the database and having the web service manage "serving" them and having it create the file on the disk on the first request. After the first request for the file, the web service would serve the file from the disk rather than from the database.
Does anyone have any ideas on what the best way to accomplish this might be? Or any better ideas?
Thank you in advance for any feedback anyone might have on the subject.
I'd recommend putting them in a shared location that is served directly by your front-end web server (not Rails), if you have that kind of setup. In this example, the web server serves a location called files that points at a folder on disk. Then, in your Paperclip options, change the save location:
has_attached_file :image,
  :url => "/files/:basename.:extension",
  :path => "/var/htdocs/public/files/:basename.:extension"
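For illustration, the files location mentioned above might look something like this in nginx (the paths are assumptions matching the Paperclip options):
# Serve /files/... straight from the shared folder on disk.
location /files/ {
    alias /var/htdocs/public/files/;
}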
Are you running all apps on the same UNIX/Linux system? Have you tried creating symbolic links to share the folder that contains the images? The goal is to save all images to the same location, eliminating the need for complicated hooks on attachment creation.
Paperclip by default stores things at :rails_root/public/system/:attachment/:id/:style/:filename. If you're sharing a database, you won't have to worry about collisions; you just need a single system folder to be used by every app.
You can use one app's public/system folder as the master, or create an entirely new one. From this point on, all system folders other than the master will be referred to as slave folders. Once you've chosen your master, it's as simple as moving everything in each slave folder to the master folder, then deleting each slave folder and replacing it with a symbolic link to the master folder.
Here is a sample command set to migrate a slave folder and replace it with a symlink, given the Paperclip defaults (it's probably a good idea to stop the server before attempting this):
$ mv /path/to/slave/project/public/system/* /path/to/master/system
$ mv /path/to/slave/project/public/system /path/to/slave/project/public/system.bak
$ ln -s /path/to/master/system /path/to/slave/project/public/system
Once you're sure the migration is successful, you can remove the backup:
$ rm /path/to/slave/project/public/system.bak
