In my Rails app, how can I access data from an already uploaded CSV file? I have used Paperclip to upload the file successfully, but I am struggling to find any tutorials or references on how to parse the data. I have seen http://railscasts.com/episodes/396-importing-csv-and-excel but that episode focuses on importing into database columns.
From the API docs for the CSV class, something along the lines of:
CSV.foreach("path/to/file.csv") do |row|
  # use row here...
end
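To make this concrete, here is a minimal, self-contained sketch of the same pattern. It writes a small CSV to a temp file and reads it back by path; the only Paperclip-specific assumption is noted in a comment (a Paperclip attachment exposes its on-disk location via #path, so you would pass that to CSV.foreach):

```ruby
require 'csv'
require 'tempfile'

rows = []
Tempfile.create(['sample', '.csv']) do |f|
  f.write("name,age\nAlice,30\nBob,25\n")
  f.flush
  # With Paperclip, the equivalent on-disk path would be `record.attachment.path`
  CSV.foreach(f.path, headers: true) do |row|
    rows << row.to_h
  end
end
rows.first  # => {"name"=>"Alice", "age"=>"30"}
```

Passing `headers: true` makes each row addressable by column name instead of position.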
I'm using Active Storage to upload CSV files, which are then read to update values in the database. I have set up a basic upload file class to do this:
class VendorFile < ApplicationRecord
  has_one_attached :vendor_upload_file
  validate :acceptable_file
end
I open the CSV files using the Rails 6 open method like this:
self.vendor_upload_file.open do |file|
  CSV.foreach(file) do |row|
    # do some processing...
  end
end
This works great for processing the whole file. The issue is that before processing I'd like to open the file and read just the first line, to ensure the file is in the correct format. I cannot find a way to open the file and read just the first line when the file is stored in Active Storage. Does anyone know a way to do this?
Thanks!
How about:

headers = self.vendor_upload_file.open(&:first).parse_csv

This opens the attachment, reads only the first line (&:first), and splits that line into an array of header fields with String#parse_csv.
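The Active Storage part aside, the first-line-plus-parse_csv combination can be illustrated with plain Ruby. This sketch uses a StringIO as a stand-in for the downloaded file: #first reads only the first line of an IO, and String#parse_csv (defined by the csv library) turns it into fields:

```ruby
require 'csv'      # defines String#parse_csv
require 'stringio'

# Stand-in for the tempfile that Active Storage's open would yield.
io = StringIO.new("sku,name,price\n123,Widget,9.99\n")

headers = io.first.parse_csv
# headers => ["sku", "name", "price"]
```

You can then compare `headers` against the column names you expect before deciding to process the rest of the file.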
Ruby on Rails 3
I have a table that is not shown in my application. I want to export the table to Excel from the console.
Is there way to export a database table to an excel file from the console?
Thank you
Here is what I have so far:
require 'csv'

file = "#{Rails.root}/public/survey.csv"
userrefs = Survey.all

CSV.open(file, 'w') do |writer|
  userrefs.each do |ur|
    writer << ur.attributes.values_at(*column_names)
  end
end
When I enter require 'csv' it returns false. How do you make it true?
Also, the *column_names is undefined.
As mentioned in the comments, an easy approach is using a format.xls responder to render an Excel file. Ryan Bates's video covers Excel output extensively.
You can connect to the database table using the existing driver or, if you prefer a more high-level API, you can create an ActiveRecord model or use the Sequel gem.
Once connected, simply use the Ruby CSV library (it's in the standard library, so no additional gem is required) to dump the content into a CSV file.
CSV files can be easily read from Excel.
PS. Just to use the appropriate words, that is not a Rails table. It's a database table.
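Two notes on the snippet in the question: `require 'csv'` returning false just means the library was already loaded (false is not an error here), and `column_names` needs a receiver, i.e. `ur.attributes.values_at(*Survey.column_names)`. Here is a self-contained sketch of the same CSV-writing pattern, with plain hashes standing in for `Survey.all` and `Survey.column_names`:

```ruby
require 'csv'

# Stand-ins for Survey.column_names and Survey.all in a real console session.
column_names = %w[id title score]
surveys = [
  { "id" => 1, "title" => "Customer survey", "score" => 4 },
  { "id" => 2, "title" => "Exit survey", "score" => 3 },
]

csv_text = CSV.generate do |writer|
  writer << column_names                      # header row
  surveys.each do |attrs|
    writer << attrs.values_at(*column_names)  # same call shape as the question
  end
end
# csv_text.lines.first => "id,title,score\n"
```

In the console you would use CSV.open with a file path instead of CSV.generate, exactly as in the question's snippet.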
Another interesting approach could be using the activeadmin gem; it's easy to install and allows you to export your tables to CSV.
http://www.activeadmin.info/
I'm new to Rails and I'm currently trying to parse a file uploaded to Rails. However, after I "read" the file once I cannot read it again. From what I've read online, it appears that Rails immediately deletes the uploaded file. Is there a way to make the file persistent? My code is as follows:
file_param = params[:sequence]

file_param.read.each do |l|
  # do stuff
end

file_param.read.each do |l|
  # do stuff again. this is not being called.
end
I've thought of using paperclip or some other storage gem, but I don't need to store the files, simply read their contents. Thanks!
Read it into an array if you really need to go over it multiple times, or just save it.
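The file isn't deleted, by the way: an upload is just an IO object, and after the first read its position sits at end-of-file, so the second read returns nothing. A sketch of both fixes (a StringIO stands in for params[:sequence]):

```ruby
require 'stringio'

# Stand-in for params[:sequence] -- an uploaded file is just an IO object.
upload = StringIO.new("ACGT\nTTAA\n")

first_pass = upload.read
upload.rewind                 # reset the read position back to the start
second_pass = upload.read     # now returns the same content again

# Alternatively, read once into an array and iterate it as often as needed:
upload.rewind
lines = upload.readlines
```

Rewinding works on the real uploaded-file object too, since it responds to the usual IO methods.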
I am making an application in Ruby on Rails which will ask users multiple-choice questions. I want to upload questions to the database from an Excel file. How can I do it?
Save the Excel spreadsheet as a CSV file, then use a CSV parser, perhaps in a rake task.
In lib/tasks/import.rake:
require 'fastercsv'

namespace :import do
  task :questions => :environment do
    FasterCSV.foreach("path/to/file.csv") do |row|
      q = Question.create(:question => row[0], etc...)
      PossibleAnswer.create(:question => q, :answer => row[1], etc...) # assuming PossibleAnswer belongs_to Question
    end
  end
end
Then run "rake import:questions"
You could use the spreadsheet gem to read in the data from an excel file:
http://rubygems.org/gems/spreadsheet
This is most useful if you want to allow users to upload their own Excel documents to import questions, or if you want users to be able to download questions in Excel format.
If you just want to do a one-off import of some data, I would go with Yule's idea and just do it via CSV, which is much easier to get to grips with.
I'm using paperclip for attachments in my application. I'm writing an import script for a bunch of old data, but I don't know how to create paperclip objects from files on disk. My first guess is to create mock CGI multipart objects, but that seems like a bit of a crude solution, and my initial attempt failed, I think because I didn't get the to_tempfile method right.
Is there a Right Way to do this? It seems like something that should be fairly easy.
I know that I've done the same thing, and I believe that I just created a File object from the path to each file, and assigned it to the image attribute. Paperclip will run on that file:
thing.image = File.new("/path/to/file.png")
thing.save
This works great for local files, but it doesn't work as well for remote files. I have an app that uses Paperclip for uploading images, which are stored on Amazon S3. Anyway, I had some old data that I needed to import, so I tried the following:
thing.image = open('http://www.someurl.com/path/to/image.jpg')
thing.save
If the file is small (say, less than 10K), open-uri returns a StringIO object, and my file would get stored on S3 as stringio.txt.
If the file is larger than around 10K, open-uri returns a Tempfile object, and the filename on S3 ends up being unique but not really related to the original filename of image.jpg.
I was able to fix the problem by doing the following:
remote_photo = open('http://www.someurl.com/path/to/image.jpg')
def remote_photo.original_filename; base_uri.path.split('/').last; end
thing.image = remote_photo
thing.save
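The trick works because Paperclip asks the assigned object for #original_filename, which the open-uri return values (StringIO, Tempfile) don't provide, so it gets defined on just that one object. Here is an offline sketch of the same idea, with a StringIO and a parsed URI standing in for the real open-uri object (which exposes the URL via base_uri):

```ruby
require 'uri'
require 'stringio'

# Stand-in for the StringIO that open-uri returns for small responses.
remote_photo = StringIO.new("fake image bytes")
uri = URI.parse("http://www.someurl.com/path/to/image.jpg")

# Define original_filename on just this one object, as in the answer above.
remote_photo.define_singleton_method(:original_filename) do
  uri.path.split('/').last
end

remote_photo.original_filename  # => "image.jpg"
```

define_singleton_method is the block-friendly equivalent of the `def remote_photo.original_filename` syntax used above.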