I want to use RoxyFileMan to manage uploaded images and files, but I need to save them on the server. As you know, RoxyFileMan uploads files and images to a folder named Uploads inside the fileman directory, and we can change FILES_ROOT to another local path to change the directory files get uploaded to.
But I want to upload files to the server and then read them back from the server after they've been uploaded, so that they can be edited in CKEditor.
Can anyone please provide advice or guidance on how to achieve this?
It's not entirely clear what you mean, but if I understand you correctly, you are running RoxyFileMan on a local server and want to upload files to a remote online server, right?
Roxy uploads files to a directory on the server it runs from. That means if it runs from localhost, it uses a directory on your localhost; if it runs from a server, it uses a directory on that server.
As far as I know, you cannot upload from localhost to an online server directly.
You could perhaps achieve that with a custom script that opens an FTP connection, but then you would also have to rework all the code that loads the images so it reads them from there... which would be rather silly.
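For completeness, the "custom script" idea could look roughly like the sketch below, assuming the remote server exposes FTP. The host, credentials, and paths are placeholders, and the helper takes any object that responds to `chdir`/`putbinaryfile`, so it is easy to exercise without a live server:

```ruby
require "net/ftp"

# Hypothetical sketch: push a locally saved upload to a remote server
# over FTP. All names and paths here are placeholders.
def push_upload(ftp, local_path, remote_dir)
  ftp.chdir(remote_dir)          # move into the remote upload folder
  ftp.putbinaryfile(local_path)  # send the file in binary mode
  File.basename(local_path)      # the name the file will have remotely
end

# Real usage would look something like:
#   Net::FTP.open("example.com", "user", "secret") do |ftp|
#     push_upload(ftp, "fileman/Uploads/photo.jpg", "/var/www/uploads")
#   end
```

But as said above, you would then also need to change the code that reads the images back, which is why this route is rarely worth it.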
I need help moving my images from Parse to S3 on AWS. I have viewed numerous supposed guides and GitHub projects, but everything stops short of giving you all the information. One even says you need a GCS bucket set up, but gives no details on how to set one up. I have the S3 file adapter all set up in my app's index.js, but none of the images are there; they are still hosted on Parse. Please, someone help me with this.
If you are referring to old images that were hosted on parse.com and that you want to move across to your own environment, it can be done with this utility tool:
Get all files across all classes in a Parse database. Print file URLs to console OR transfer to S3, GCS, or filesystem. Rename files so that Parse Server no longer detects that they are hosted by Parse. Update MongoDB with new file names.
https://github.com/parse-server-modules/parse-files-utils
Moving forward, if you have set up your S3 bucket correctly, all new images from your app will be stored there:
https://github.com/ParsePlatform/parse-server/wiki/Configuring-File-Adapters
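If it helps, a Parse Server JSON config that points the files adapter at S3 looks roughly like the fragment below. The adapter module name, keys, and bucket are placeholders from memory, so check the wiki page above for the authoritative option names:

```json
{
  "appId": "myAppId",
  "masterKey": "myMasterKey",
  "databaseURI": "mongodb://localhost:27017/dev",
  "filesAdapter": {
    "module": "parse-server-s3-adapter",
    "options": {
      "bucket": "my-bucket",
      "region": "us-east-1"
    }
  }
}
```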
I'm trying to upload a folder to an FTP server using the Overbyte ICS FTP component.
From what I understand, there is no built-in function to upload a folder containing files and sub-folders to an FTP server, so I have to use recursion in order to upload them all in one call.
What is the correct approach to this problem?
I'm thinking about doing this:
1. Scan the local folder that I want to upload and separate folders from files.
2. For each folder name, check whether it exists on the FTP server; if it doesn't, create it.
3. After creating all the folders on the FTP server, check whether each local file exists on the FTP server; if it doesn't, upload the file to the created directory.
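The traversal part of this plan can be sketched independently of the FTP component. A single depth-first walk of the local tree yields directories before the files inside them, which gives the "create folders first, then upload" order for free. This is a plain-Ruby sketch (not Delphi/ICS; the actual ICS calls would consume the resulting operation list):

```ruby
require "pathname"
require "find"

# Walk the local folder once and emit the ordered FTP operations.
# The traversal visits a directory before its contents, so every
# [:mkdir, dir] appears before the [:put, file] of any file inside it.
def ftp_operations(local_root)
  root = Pathname.new(local_root)
  ops = []
  root.find do |path|
    rel = path.relative_path_from(root).to_s
    next if rel == "."                        # skip the root itself
    ops << [path.directory? ? :mkdir : :put, rel]
  end
  ops
end
```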
Is this the proper way to do it?
Is there an easier approach to this task?
Thank you!
I'm trying to upload a file to my Rails app by sending the user's local path to the file via a parameter, so the input URL would look like this:
http://rails-app.herokuapp.com/element?file=C:\temp\data.txt
It's easy when I'm working on my local machine, since I can just use File.read(filename); however, this doesn't work on Heroku. Thanks for any help!
First of all, Heroku has a read-only file system, so you can't upload anything directly to Heroku.
Use something like Amazon S3 to keep files.
The second issue is your approach.
When the app runs locally, it has access to your C:\ drive.
But an app located on a remote server has no access to your computer's C:\ drive, so it can't fetch the file.
You should upload either through a browser file field or by passing an HTTP link that is accessible to anyone.
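To illustrate the browser-upload route: with a multipart form, the server never sees C:\temp\data.txt; it receives a tempfile-backed object (in Rails, typically `params[:file]`) and reads that instead. A minimal sketch with placeholder names:

```ruby
# Sketch, assuming a multipart form upload: +uploaded+ is any IO-like
# object, e.g. params[:file] in a Rails controller.
def read_upload(uploaded)
  uploaded.read
end

# The matching form field in the view would be along the lines of:
#   <%= form.file_field :file %>
#
# For the public-URL route mentioned above, something like this
# (network access required; URL is a placeholder):
#   require "open-uri"
#   data = URI.open(params[:url]).read
```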
I have Paperclip working just fine: I can upload files to my site, but whenever I make updates and push a new version of the site, all of the files I uploaded via Paperclip seem to disappear (all the information that was entered into the database remains, though).
I assume the problem is that I haven't pulled the files from the live version of the site, but whenever I do a git pull it tells me everything is up to date. Is there any way for me to download the files I've uploaded? (I would prefer not to use Amazon S3 to store the files for now.)
The files you have uploaded are stored in the public folder, and the public folder is not deployed with the code, so your files appear to have disappeared.
If you use Amazon S3, images will be stored on S3, which provides a URL for accessing each image, so you will be able to access your images properly.
You can also save images to Dropbox. The application below stores its images on Dropbox while running on Heroku; you may take it as a reference:
https://github.com/aman199002/Album-App # open-source app that stores albums on Dropbox
http://album-app.herokuapp.com # URL of the application running on Heroku
When you deploy your application to Heroku, your pushed code is compiled into what is called a slug: essentially an archive of everything needed to run your application. When you restart or scale your application, your original slug is copied and run. However, the slug is read-only, so when you upload files they exist only on the dyno that received them; if you had multiple dynos, the files wouldn't exist across all of them. They also do not persist when your application is restarted or when you push new code, and there is no way to retrieve them.
Your only way to persist files on Heroku is to use an external cloud storage solution like Amazon S3, Rackspace Files, etc. Fortunately, it's very simple to have Paperclip use S3 as its storage mechanism; there's a tutorial at https://devcenter.heroku.com/articles/paperclip-s3
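Following that tutorial, the switch is mostly configuration. A sketch of the relevant pieces, where the bucket name and environment variable names are placeholders (see the tutorial for the exact gem versions and options):

```ruby
# Gemfile
gem "paperclip"
gem "aws-sdk"

# config/environments/production.rb
config.paperclip_defaults = {
  storage: :s3,
  s3_credentials: {
    bucket: ENV["S3_BUCKET_NAME"],
    access_key_id: ENV["AWS_ACCESS_KEY_ID"],
    secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"]
  }
}
```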
I am not sure if this is the right place to ask this question.
I want a good FTP client that can upload files to the server whenever changes are made to a particular file in the directory it is watching.
Google 'windows ftp syncing' (I'm assuming Windows); there's a pile of them.
hth