I have a task whose job is to output a CSV. I can confirm that the CSV data is in memory, but when I call the following, the file is never created:
File.open(File.join(Rails.root, 'tmp', file_name), 'w') do |file|
  file.write(csv_export)
end
I think it might be a permissions issue, because the task runs in the application context, but for the life of me I cannot work out which user name to add to the tmp folder's permissions.
You could check the current permissions on that tmp folder (readable?, writable?, executable?), check whether the requesting process is the folder's owner or group owner (owned?, grpowned?), and then grant the permission on the tmp folder with chmod before creating file_name:
tmp_folder = File.join(Rails.root, 'tmp')

# Creating a file inside a directory requires both write and execute
# (search) permission on the directory itself.
if File.owned?(tmp_folder) && !(File.writable?(tmp_folder) && File.executable?(tmp_folder))
  File.chmod(0700, tmp_folder)
end

File.open(File.join(tmp_folder, file_name), 'w') do |file|
  file.write(csv_export)
end
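If you also want to discover which user the task actually runs as (so you know whose access to grant on tmp), the effective user and group of the current process are available from Ruby's standard library. A quick check you could drop into the task; Etc and Process are stdlib, nothing else is assumed:

require 'etc'

# Print the user and group the current process is running as.
puts Etc.getpwuid(Process.euid).name
puts Etc.getgrgid(Process.egid).name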
I'm trying to migrate my local Active Storage files to Google Cloud Storage. I tried simply copying the files from /storage/* to my GCS bucket, but that does not work.
I get 404 Not Found errors because it is searching for files like:
[bucket]/variants/ptGtmNWuTE...
My local storage directory has a totally different folder structure, with nested folders like:
/storage/1R/3o/NWuT....
My method to retrieve the image is as follows:
variant = attachment.variant(resize: '100x100').processed
url_for(variant)
What am I missing here?
As it turns out, DiskService (i.e. local storage) uses a different folder structure than the cloud services. That's really weird.
DiskService uses the first characters of the blob key as nested folder names.
Cloud services just use the key, and put all variants in a separate folder.
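Concretely, the mapping DiskService applies can be reproduced in a couple of lines (the key value here is made up for illustration):

key = "1R3oNWuTEHVzXeqweyvzFuqZW9p"         # hypothetical blob key
folder = [key[0..1], key[2..3]].join("/")   # => "1R/3o"
File.join("storage", folder, key)           # => "storage/1R/3o/1R3oNWuTEHVzXeqweyvzFuqZW9p"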
I created a rake task to copy the files over to the cloud service. Run it with, for example, rails active_storage:migrate_local_to_cloud storage_config=google.
namespace :active_storage do
  desc "Migrates active storage local files to cloud"
  task migrate_local_to_cloud: :environment do
    raise 'Missing storage_config param' unless ENV.key?('storage_config')

    require 'yaml'
    require 'erb'
    require 'google/cloud/storage'

    # Read the target service definition from config/storage.yml.
    config_file = Rails.root.join('config/storage.yml')
    configs = YAML.load(ERB.new(config_file.read).result) || {}
    config = configs[ENV['storage_config']]

    client = Google::Cloud.storage(config['project'], config['credentials'])
    bucket = client.bucket(config.fetch('bucket'))

    ActiveStorage::Blob.find_each do |blob|
      key = blob.key
      # DiskService nests each file under folders named after the key's first characters.
      folder = [key[0..1], key[2..3]].join('/')
      file_path = Rails.root.join('storage', folder, key)

      File.open(file_path, 'rb') do |file|
        md5 = Digest::MD5.base64digest(file.read)
        file.rewind # the checksum read left the IO at EOF; rewind so the upload isn't empty
        bucket.create_file(file, key, content_type: blob.content_type, md5: md5)
      end

      puts key
    end
  end
end
I have an application deployed to Heroku. I have added functionality for uploading users through a CSV file, using the Paperclip gem for the upload.
Here is my code for reading the file and creating new users:
def import(file)
  CSV.foreach(file.path, headers: true) do |row|
    row_hash = row.to_hash.values
    data = row_hash[0].split("\t")
    .
    .
    .
end
It works fine locally, but on Heroku it gives me the following error:
Errno::ENOENT: No such file or directory @ rb_sysopen - https://s3.amazonaws.com/..../..../sample_csv(2).csv
I referred to the following links:
Errno::ENOENT (No such file or directory) in amazon-s3
File reading from Amazon server, ruby on rails, no match route
but didn't have any success. For more debugging, I tried the same URL from my local rails console and it gives me the same error:
2.2.2 :008 > cp = "https://s3.amazonaws.com/..../..../sample_csv(2).csv"
2.2.2 :008 > f = File.open(cp, "r")
Errno::ENOENT: No such file or directory @ rb_sysopen - https://s3.amazonaws.com
I also tried open-uri (http://ruby-doc.org/stdlib-2.1.0/libdoc/open-uri/rdoc/OpenURI.html).
I can download the same file from the browser.
Can anyone let me know how to resolve this error? Is there a bucket permission issue? (I have already provided open access to the bucket.)
Try this:
require 'open-uri'
require 'csv'
def import(file)
  # open the remote file with open-uri's open, then hand the resulting IO to CSV
  CSV.new(open(file), headers: true).each do |row|
    row_hash = row.to_hash.values
    data = row_hash[0].split("\t")
    .
    .
    .
end
For more info you can refer to the open-uri documentation linked above.
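If you're on a newer Ruby, note that Kernel#open no longer accepts URLs (the open-uri behavior was deprecated in Ruby 2.7 and removed in 3.0), so the same idea would be spelled with URI.open; a sketch, with the import signature taken from the question:

require 'open-uri'
require 'csv'

def import(file)
  # URI.open reads from a URL; the S3 location of a Paperclip attachment
  # is a URL rather than a local path, which is why File.open raised ENOENT.
  CSV.new(URI.open(file), headers: true).each do |row|
    row_hash = row.to_hash.values
    data = row_hash[0].split("\t")
    # ...
  end
end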
I'm using Rails 4 and Paperclip.
Because I need to upload files to an FTP server, I'm using this great gem:
https://github.com/xing/paperclip-storage-ftp
Everything works perfectly locally, but on FTP I can't rename files using this code:
def rename_myfile
  if self.rename.present?
    path = self.myfile.path
    FileUtils.move(myfile.path, File.join(File.dirname(myfile.path), self.rename))
    self.myfile_file_name = self.rename
  end
end
I got an error:
No such file or directory @ sys_fail2 - (/myfiles/19/original/myfileOriginalName.jpg, /myfiles/19/original/myfileRenamedName.jpg)
How can I make FileUtils.move work on the FTP server?
Create and delete are working very well!
https://github.com/xing/paperclip-storage-ftp/issues/28
You have to build the full path to the file, not just the file's dirname and name. Change your FileUtils.move lines to this:
# File.join is used here because Pathname#join (i.e. Rails.root.join) discards
# the earlier parts when a later segment starts with "/", as myfile.path does.
orig_full_path = File.join(Rails.root, "public", myfile.path) # assuming they're saved in your public directory
new_full_path = File.join(Rails.root, "public", File.dirname(myfile.path), self.rename)
FileUtils.move orig_full_path, new_full_path
The idea here is to get the absolute path to your files. Before, you were giving FileUtils just this path: /myfiles/19/original/myfileOriginalName.jpg, which means it would look for the file in a folder /myfiles at the root of your file system. But the files actually live inside your Rails project, so you need the true absolute path, e.g. /Users/me/my_rails_project/public/myfiles/19/original/myfileOriginalName.jpg.
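That fixes things when the files really are on the local disk. If the attachment lives on the remote FTP server, though, FileUtils cannot reach it at all; the rename has to go through an FTP client, and Ruby's standard library ships one. A sketch, with host, credentials and paths as placeholders:

require 'net/ftp'

Net::FTP.open('ftp.example.com', 'user', 'password') do |ftp|
  # Net::FTP#rename issues RNFR/RNTO, i.e. a server-side move.
  ftp.rename('/myfiles/19/original/myfileOriginalName.jpg',
             '/myfiles/19/original/myfileRenamedName.jpg')
end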
I would like a rake task that gathers some information and then writes it to the local filesystem.
I thought I had this working, but I'm not sure how:
namespace :arc do
  task :vday => :environment do
    locations = Location.with_current_mec "something"
    jt = {}
    jt[:locations] = {}
    locations.each do |location|
      jt[:locations][location.id] = location.to_small_hash
    end
    # want this to write to the local filesystem
    File.write('/Users/jt/output.json', JSON.pretty_generate(jt))
  end
end
and then call it like:
heroku run rake arc:vday
But this is not working and gives me an error when writing the file. Is there a workaround to make it write to my local filesystem and not to Heroku?
Can you try this:
File.open('/Users/jt/output.json', 'w') {|file| file.write(JSON.pretty_generate(jt))}
instead of
File.write('/Users/jt/output.json', JSON.pretty_generate(jt))
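Note that heroku run executes the task on a one-off Heroku dyno, so a path like /Users/jt/output.json does not exist there no matter how the file is opened. One workaround sketch, assuming the JSON is the only thing the task sends to stdout, is to print it and redirect the output locally:

# in the task, instead of writing a file:
puts JSON.pretty_generate(jt)

Then, on your local machine:

heroku run rake arc:vday > /Users/jt/output.json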
I have no experience with Ruby or rake or anything, but I am using Slate for API documentation, and it uses Ruby and rake to build the docs. What I do know is this: when I do a rake build, it updates a folder (slate/build). I then have to manually copy slate/build to ../app/docs after every single rake build. Is there something I can do that will copy that folder automatically on every rake build?
Add to your Rakefile:
ROOT = File.expand_path('..', __FILE__)

task :build_and_move => [:build] do
  cp_r(File.join(ROOT, 'slate/build'), File.join(ROOT, '../app/docs'))
  # or
  # mv(File.join(ROOT, 'slate/build'), File.join(ROOT, '../app/docs'))
end
and then run rake build_and_move.
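If you'd rather keep typing plain rake build, you could instead hook the copy onto the build task itself; Rake::Task#enhance appends an extra action that runs after the task's own actions. A sketch, assuming the build task is already defined at this point in the Rakefile:

Rake::Task[:build].enhance do
  cp_r(File.join(ROOT, 'slate/build'), File.join(ROOT, '../app/docs'))
end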
You can use FileUtils for this.
Docs: http://ruby-doc.org/stdlib-1.9.3/libdoc/fileutils/rdoc/FileUtils.html#method-c-copy
Example from the docs:
Copies src to dest. If src is a directory, this method copies all its contents recursively. If dest is a directory, copies src to dest/src.
FileUtils.cp 'eval.c', 'eval.c.org'
FileUtils.cp %w(cgi.rb complex.rb date.rb), '/usr/lib/ruby/1.6'
FileUtils.cp %w(cgi.rb complex.rb date.rb), '/usr/lib/ruby/1.6', :verbose => true
FileUtils.cp 'symlink', 'dest' # copy content, "dest" is not a symlink
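Since slate/build is a directory, the recursive variant cp_r is the one that applies here; applied to the question's paths, it would be something like:

require 'fileutils'

# The trailing "/." copies the contents of build into docs rather than
# nesting a build folder inside docs.
FileUtils.cp_r 'slate/build/.', '../app/docs'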