I'm using Rails 4 and Paperclip.
Because I need to upload files to an FTP server, I'm using this great gem:
https://github.com/xing/paperclip-storage-ftp
Everything works perfectly locally, but on the FTP server I can't rename files using this code:
def rename_myfile
  if self.rename.present?
    path = self.myfile.path
    FileUtils.move(myfile.path, File.join(File.dirname(myfile.path), self.rename))
    self.myfile_file_name = self.rename
  end
end
I got an error:
No such file or directory @ sys_fail2 - (/myfiles/19/original/myfileOriginalName.jpg, /myfiles/19/original/myfileRenamedName.jpg)
How can I get FileUtils.move to work with files on the FTP storage?
Create and delete are working very well!
https://github.com/xing/paperclip-storage-ftp/issues/28
You have to build the full path to the file, not just the file's dirname and name. Change your FileUtils.move line to this:
orig_full_path = Rails.root.join "public", myfile.path # assuming they're saved in your public directory
new_full_path = Rails.root.join "public", File.dirname(myfile.path), self.rename
FileUtils.move orig_full_path, new_full_path
The idea here is to get the absolute path to your files. Before, you were giving FileUtils this path: /myfiles/19/original/myfileOriginalName.jpg, which means it will look for the file in a /myfiles folder at the root of your file system. But the files are actually inside your Rails project, so you should use Rails.root.join to get the true absolute path: /Users/me/my_rails_project/public/myfiles/19/original/myfileOriginalName.jpg.
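For context, the whole callback might then look something like this. It is only a sketch that assumes the files live under public/ on the local filesystem, as in the answer above; rename and myfile are taken from the question's model:
def rename_myfile
  return unless rename.present?

  # Build absolute paths under public/ (assumption carried over from the answer above)
  orig_full_path = Rails.root.join("public", myfile.path)
  new_full_path  = Rails.root.join("public", File.dirname(myfile.path), rename)

  FileUtils.move(orig_full_path, new_full_path)
  self.myfile_file_name = rename
end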
I'm trying to migrate my local Active Storage files to Google Cloud Storage. I tried to simply copy the files from /storage/* to my GCS bucket, but it seems that this does not work.
I get 404 Not Found errors because it is searching for files like:
[bucket]/variants/ptGtmNWuTE...
My local storage directory has a totally different folder structure, with folders like:
/storage/1R/3o/NWuT....
My method to retrieve the image is as follows:
variant = attachment.variant(resize: '100x100').processed
url_for(variant)
What am I missing here?
As it turns out, DiskService (i.e. local storage) uses a different folder structure than the cloud services. That's really weird.
DiskService uses the first characters of the key as nested folder names.
Cloud services just use the key itself and put all variants in a separate folder.
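To illustrate the difference, here is a small sketch (the key is a made-up example, not a real blob key):
key = "1R3oNWuTexample"                         # hypothetical ActiveStorage::Blob key
File.join("storage", key[0..1], key[2..3], key)
# => "storage/1R/3o/1R3oNWuTexample"            # DiskService layout
# A cloud service stores the same blob directly under the key, e.g. <bucket>/1R3oNWuTexample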
I created a rake task to copy the files over to the cloud service. Run it with rails active_storage:migrate_local_to_cloud storage_config=google, for example.
namespace :active_storage do
  desc "Migrates active storage local files to cloud"
  task migrate_local_to_cloud: :environment do
    raise 'Missing storage_config param' unless ENV.has_key?('storage_config')
    require 'yaml'
    require 'erb'
    require 'google/cloud/storage'

    config_file = Pathname.new(Rails.root.join('config/storage.yml'))
    configs = YAML.load(ERB.new(config_file.read).result) || {}
    config = configs[ENV['storage_config']]
    client = Google::Cloud.storage(config['project'], config['credentials'])
    bucket = client.bucket(config.fetch('bucket'))

    ActiveStorage::Blob.find_each do |blob|
      key = blob.key
      # DiskService stores blobs under storage/<key[0..1]>/<key[2..3]>/<key>
      folder = [key[0..1], key[2..3]].join('/')
      file_path = Rails.root.join('storage', folder, key)
      file = File.open(file_path, 'rb')
      md5 = Digest::MD5.base64digest(file.read)
      file.rewind # computing the checksum read the IO to EOF; rewind before uploading
      bucket.create_file(file, key, content_type: blob.content_type, md5: md5)
      file.close
      puts key
    end
  end
end
I have stored a file in AWS S3 storage. When trying to open the file to import data using the Roo gem, it raises the following error: Errno::ENOENT: No such file or directory @ rb_sysopen
def self.import(file, user_id)
  imported_file = ImportedFile.find(file)
  spreadsheet = Roo::Spreadsheet.open(open(imported_file.file_url), extension: :csv)
  spreadsheet = Roo::Spreadsheet.open(imported_file.file)
  header = spreadsheet.row(1) # raising error here
end
I also tried this:
spreadsheet = Roo::Spreadsheet.open(imported_file.file_url)
I'm getting the below error in the log:
Errno::ENOENT: No such file or directory @ rb_sysopen - /uploads/imported_files/7a6f0463-b3cd-48f8-a579-bc27951242fe/13c96e3e-d3f3-4ed8-8d9a-b9ea03c0cc8c.csv
To open URLs you should require the open-uri library first:
require 'open-uri'
See the example:
open('http://example.com/')
# throws Errno::ENOENT: No such file or directory @ rb_sysopen - http://example.com/
require 'open-uri'
open('http://example.com/')
# opens the website
Finally, the following code worked for me:
spreadsheet = Roo::Spreadsheet.open(open(imported_file.file_url), extension: File.extname(imported_file.file_url).gsub('.','').to_sym) rescue nil
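A slightly cleaner way to write the same thing is sketched below. It keeps the open-uri pattern from the answer but drops the rescue nil, which silently swallows any failure, and names the intermediate steps (imported_file.file_url comes from the question; everything else is just a rearrangement):
require 'open-uri'

extension = File.extname(imported_file.file_url).delete('.').to_sym   # e.g. :csv
spreadsheet = Roo::Spreadsheet.open(open(imported_file.file_url), extension: extension)
header = spreadsheet.row(1)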
I have an application which is deployed to Heroku. I have added functionality for uploading users through a CSV. For this, I have provided a CSV upload feature (using the Paperclip gem).
Here is my code for reading the file and creating new users:
def import(file)
  CSV.foreach(file.path, headers: true) do |row|
    row_hash = row.to_hash.values
    data = row_hash[0].split("\t")
    .
    .
    .
  end
Locally it works fine, but on Heroku it gives me the following error:
Errno::ENOENT: No such file or directory @ rb_sysopen - https://s3.amazonaws.com/..../..../sample_csv(2).csv
I referred to the following links:
Errno::ENOENT (No such file or directory) in amazon-s3
File reading from Amazon server, ruby on rails, no match route
but didn't have any success. For more debugging, I tried the same URL from my local Rails console and it gives me the same error.
2.2.2 :008 > cp = "https://s3.amazonaws.com/..../..../sample_csv(2).csv"
2.2.2 :008 > f = File.open(cp, "r")
Errno::ENOENT: No such file or directory @ rb_sysopen - https://s3.amazonaws.com
I also tried open-uri: http://ruby-doc.org/stdlib-2.1.0/libdoc/open-uri/rdoc/OpenURI.html.
I can download the same file from the browser.
Can anyone let me know how to resolve this error? Is there any bucket permission issue? (I have already provided open access to the bucket.)
Try this:
require 'open-uri'
require 'csv'
def import(file)
  CSV.new(open(file), headers: true).each do |row| # first open the file with open-uri's open
    row_hash = row.to_hash.values
    data = row_hash[0].split("\t")
    .
    .
    .
  end
For more info, you can refer to this link.
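The underlying issue is that File.open only understands local filesystem paths, while Kernel#open (once open-uri is required) also recognizes http/https URLs and downloads them to a temporary IO. A minimal sketch, using a hypothetical placeholder URL:
require 'open-uri'
require 'csv'

url = "https://s3.amazonaws.com/your-bucket/sample.csv" # hypothetical placeholder URL

# File.open(url) would raise Errno::ENOENT here, because it only handles local paths.
csv_io = open(url) # with open-uri required, open fetches the object and returns an IO

CSV.new(csv_io, headers: true).each do |row|
  puts row.to_hash
end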
I have the following structure in the lib directory:
/lib/dir_a/dir_b/dir_c/
Images are stored in dir_c.
I am trying to load these images and display them in views. I tried to set the path to the images in the views, but I got a 404 error.
So I did the following: I created a file in the initializers folder and put this into it:
Dir[Rails.root + 'lib/dir_a'].each do |file|
  require file
end
This was meant to load all the content stored in the dir_a directory (both subdirectories and files).
But when I restarted the server, I got this error:
...dependencies.rb:251:in `require': cannot load such file -- /Users/radek/rubydev/EDI/lib/brands (LoadError)
I also tried stuff like
Dir[Rails.root + 'lib/dir_a/'].each do |file|
or
Dir[Rails.root + 'lib/dir_a/**'].each do |file|
But none of those helped me.
Thus, is there any way to load content from the /lib directory and work with it in views?
# The recursive glob matches only Ruby files, so directories and non-Ruby files are skipped.
Dir[Rails.root + 'lib/**/*.rb'].each do |file|
  require file
end
In my project I upload audio files to GridFS using the CarrierWave gem. After uploading, the file is saved to GridFS properly, but in my application I am unable to get it back from GridFS with the mongofiles tool or with the GridFS-nginx module.
mongofiles get audiotracks/4dfb70d6bcd73f3488000002/data
This command leads to the following error:
assertion: 13325 couldn't open file: audiotracks/4dfb70d6bcd73f3488000002/data
The only way to get the file is to use the Rails console, and that works fine:
cc = Mongo::GridFileSystem.new(Mongo::Connection.new.db("test")).open('audiotracks/4dfb70d6bcd73f3488000002/data', 'r')
cc.read
So if you have encountered a problem like this or have some ideas, please let me know.
mongofiles get will try to write the file to disk with the same name and path as in GridFS.
Assertion 13325 happens when GridFS can't write the file like this.
You should check that the file path exists and that you have permission to write the file. Alternatively, you can just provide a file name with the --local parameter:
mongofiles --local mytrack.mp3 get audiotracks/4dfb70d6bcd73f3488000002/data
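If you would rather keep the original GridFS path, creating the matching local directory first should also work. This is only a sketch, reusing the same GridFS filename as above against a default local mongod:
mkdir -p audiotracks/4dfb70d6bcd73f3488000002
mongofiles get audiotracks/4dfb70d6bcd73f3488000002/data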