How to refresh the page after send_data in Rails 6

Hello devs, I'm generating a CSV in my controller, but every time the user clicks on download, the page has to be refreshed manually. How can I avoid this?
@archivo_csv = CSV.generate(:encoding => 'windows-1252') do |csv|
  csv << ["indicator", "count"]
  @indicator[1..@indicator.size].each_with_index do |indicator, i|
    csv << ["#{indicator[0]}", indicator[1]]
    indicator[2].each do |date|
      csv << [" #{date[0]}", date[1]]
    end
    indicator[3].each do |status|
      csv << [" #{status[0]}", status[1]]
    end
  end
end
send_data @archivo_csv, :filename => "Indicadores Generales.csv"
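A common way to handle this (sketched below with assumed names: a ReportsController, an indicadores route, and a hypothetical build_indicadores_csv helper wrapping the CSV.generate code above) is to serve the CSV from its own action and point the download link at it with format: :csv. send_data sends the file as an attachment, so the browser downloads it and the page the user is on never has to reload:

# config/routes.rb -- assumed route
get 'indicadores', to: 'reports#indicadores', as: :indicadores

# app/controllers/reports_controller.rb -- assumed controller;
# build_indicadores_csv is a hypothetical helper containing the CSV.generate code above
class ReportsController < ApplicationController
  def indicadores
    respond_to do |format|
      format.csv do
        send_data build_indicadores_csv,
                  filename: 'Indicadores Generales.csv',
                  type: 'text/csv'
      end
    end
  end
end

# in a view: clicking the link downloads the file; the current page stays put
<%= link_to 'Descargar CSV', indicadores_path(format: :csv) %>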

Related

Is there any optimized way to export a CSV of more than 100K records using Rails?

I have 200k locations in my database and I want to export all of them to CSV. At the moment the download takes far too long. What is the best way to optimize this code in Rails?
In controller:
def index
  all_locations = Location.all
  respond_to do |format|
    format.csv { send_data all_locations.to_csv, filename: "locations-#{Date.today}.csv" }
  end
end
In model:
def self.to_csv
  attributes = %w{id city address}
  CSV.generate(headers: true) do |csv|
    csv << ['Id', 'City', 'Address']
    all.each do |location|
      csv << attributes.map { |attr| location.send(attr) }
    end
  end
end
I ran your code with some adjustments against my own data. I made the following changes, and benchmarking showed roughly a 7x speedup.
Your model:
def self.to_csv
  attributes = %w{id city address}
  CSV.generate(headers: true) do |csv|
    csv << ['Id', 'City', 'Address']
    all.pluck(*attributes).each { |data| csv << data }
  end
end
By using pluck you only get the data you want, and then you push all that data into the csv array.
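If even the plucked result set is too big to build in one go, a further refinement (a sketch only, not benchmarked here; in_batches requires Rails 5+) is to pluck in batches so only a slice of the table is in memory at a time:

def self.to_csv
  attributes = %w{id city address}
  CSV.generate(headers: true) do |csv|
    csv << ['Id', 'City', 'Address']
    # walk the table in batches and pluck only the wanted columns per batch
    all.in_batches(of: 10_000) do |relation|
      relation.pluck(*attributes).each { |row| csv << row }
    end
  end
end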
If you are using PostgreSQL, then you can use this in application_record.rb:
def self.to_csv_copy(attrs = "*", header = [])
  rc = connection.raw_connection
  rv = header.empty? ? [] : ["#{header.join(',')}\n"]
  sql = self.all.select(attrs).to_sql
  rc.copy_data("copy (#{sql}) to stdout with csv") do
    # rubocop:disable AssignmentInCondition
    while line = rc.get_copy_data
      rv << line
    end
  end
  rv.join
end
and then do
Location.to_csv_copy(%w{id city address}, ['Id', 'City', 'Address'])
It is even faster than the above solution.
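For completeness, this is roughly how it could be wired into the controller from the question (a sketch; to_csv_copy as defined above):

def index
  respond_to do |format|
    format.csv do
      send_data Location.to_csv_copy(%w{id city address}, ['Id', 'City', 'Address']),
                filename: "locations-#{Date.today}.csv", type: 'text/csv'
    end
  end
end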

Do a diff between csv column and ActiveRecord object

I have a simple CSV (a list of emails) that I want to upload to my Rails backend API, which looks like this:
abd@gmail.com,cool@hotmail.com
What I want is to upload that file, check in the user table whether there are matching rows (in terms of the email address), and then return a newly downloadable CSV with 2 columns: the email and whether or not the email matched an existing user (boolean true/false).
I'd like to stream the output since the file can be very large. This is what I have so far:
controller
def import_csv
  send_data FileIngestion.process_csv(
    params[:file]
  ), filename: 'processed_emails.csv', type: 'text/csv'
end
file_ingestion.rb
require 'csv'

class FileIngestion
  def self.process_csv(file)
    emails = []
    CSV.foreach(file.path, headers: true) do |row|
      emails << row[0]
    end
    users = User.where("email IN (?)", emails)
  end
end
Thanks!
Why not just pluck all the emails from the users table and do something like this? This example keeps it simple, but you get the idea. If we can assume your input file is just a string of comma-separated emails, then this should work:
emails = File.read('emails.csv').split(',')

def process_csv(emails)
  user_emails = User.where.not(email: [nil, '']).pluck(:email)
  CSV.open('emails_processed.csv', 'w') do |row|
    row << ['email', 'present']
    emails.each do |email|
      row << [email, user_emails.include?(email) ? 'true' : 'false']
    end
  end
end

process_csv(emails)
UPDATED to match your code design:
def import_csv
  send_data FileIngestion.process_csv(params[:file]),
            filename: 'processed_emails.csv', type: 'text/csv'
end

require 'csv'

class FileIngestion
  def self.process_csv(file)
    emails = File.read(file.path).split(',')
    user_emails = User.where.not(email: [nil, '']).pluck(:email)
    CSV.open('emails_processed.csv', 'w') do |row|
      row << ['email', 'present']
      emails.each do |email|
        row << [email, user_emails.include?(email) ? 'true' : 'false']
      end
    end
    File.read('emails_processed.csv')
  end
end
Basically what you want to do is collect the incoming CSV data into batches - use each batch to query the database and write a diff to a tempfile.
You would then stream the tempfile to the client.
require 'csv'
require 'tempfile'

class FileIngestion
  BATCH_SIZE = 1000

  def self.process_csv(file)
    csv_tempfile = CSV.new(Tempfile.new('foo'))
    CSV.read(file, headers: false).lazy.drop(1).each_slice(BATCH_SIZE) do |batch|
      emails = batch.flatten
      users = User.where(email: emails).pluck(:email)
      emails.each do |e|
        csv_tempfile << [e, users.include?(e)]
      end
    end
    csv_tempfile
  end
end
CSV.read(file, headers: false).lazy.drop(1).each_slice(BATCH_SIZE) uses a lazy enumerator to access the CSV file in batches. .drop(1) gets rid of the header row.
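The answer above returns the CSV object itself; to actually stream the result back to the client (the part only described in prose), one option is to have process_csv return the underlying Tempfile and hand its path to send_file. A sketch, not part of the original answer:

# Assumes process_csv is adjusted to keep a handle on the Tempfile, e.g.
#   tempfile = Tempfile.new('foo')
#   csv_tempfile = CSV.new(tempfile)
# and to return tempfile instead of csv_tempfile.
def import_csv
  tempfile = FileIngestion.process_csv(params[:file].tempfile.path) # CSV.read expects a path
  tempfile.flush
  send_file tempfile.path, filename: 'processed_emails.csv', type: 'text/csv'
end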
OK, so this is what I came up with: a solution that basically prevents users from uploading a file with more than 10,000 data points. It might not be the best solution (I prefer @Max's), but in any case I wanted to share what I did:
def emails_exist
  raise 'Missing file parameter' unless params[:file]

  csv_path = params[:file].tempfile.path
  send_data csv_of_emails_matching_users(csv_path), filename: 'emails.csv', type: 'text/csv'
end

private

def csv_of_emails_matching_users(input_csv_path)
  total = 0
  CSV.generate(headers: true) do |result|
    result << %w{email exists}
    emails = []
    CSV.foreach(input_csv_path) do |row|
      total += 1
      if total > 10001
        raise 'User Validation limited to 10000 emails'
      end
      emails.push(row[0])
      if emails.count > 99
        append_to_csv_info_for_emails(result, emails)
      end
    end
    if emails.count > 0
      append_to_csv_info_for_emails(result, emails)
    end
  end
end

def append_to_csv_info_for_emails(csv, emails)
  user_emails = User.where(email: emails).pluck(:email).to_set
  emails.each do |email|
    csv << [email, user_emails.include?(email)]
  end
  emails.clear
end
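For reference, a minimal upload form that would post the file to this action might look like the following (assuming a route named emails_exist_path pointing at the action above):

<%= form_with url: emails_exist_path, method: :post, local: true, multipart: true do |f| %>
  <%= f.file_field :file %>
  <%= f.submit 'Check emails' %>
<% end %>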

Active admin CSV export custom query scoped

I am using Active Admin's Export CSV option. It returns all the rows of the particular table.
I want the reports only for a particular month.
Can anyone help?
You can write your own CSV exporter:
collection_action :download_report, method: :get do
  users = User.where('created_at >= ?', Date.today - 1.month)
  csv_data = CSV.generate(encoding: 'Windows-1251') do |csv|
    # add headers
    csv << ['Some header'] # your header columns
    # add data
    users.each do |user|
      csv << [user.created_at]
    end
  end
  # send file to user
  send_data csv_data.encode('Windows-1251'),
            type: 'text/csv; charset=windows-1251; header=present',
            disposition: "attachment; filename=report.csv"
end

action_item only: :index do
  link_to 'csv report', params.merge(action: :download_report)
end

index download_links: false do
  # turns off the standard download link
end
This is just an example; your code may look different.
To generate a CSV file, use this code wherever you need it:
# generate csv file of photo
def self.generate_csv
  header = []
  csv_fname = "#{CSV_FILE_PATH}/images.csv"
  options = { headers: :first_row }
  photo_columns = column_names - ["id", "updated_at"]
  photo_columns.each { |col| header << (col == "created_at" ? "ScrapeDate" : col.classify) }
  CSV.open(csv_fname, "w", options) do |csv|
    csv << header if File.exist?(csv_fname) && File.size(csv_fname) == 0
    find_each(batch_size: 5000) do |photo|
      csv << photo.attributes.values_at(*photo_columns)
    end
  end
end
In the code above, subtract the columns you don't want from the actual column list. For example, in column_names - ["id", "updated_at"], column_names returns the array of the model's actual columns, and we subtract the ones we don't need.
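For example, in a Rails console (column names here are purely illustrative):

Photo.column_names
# => ["id", "url", "caption", "created_at", "updated_at"]
Photo.column_names - ["id", "updated_at"]
# => ["url", "caption", "created_at"]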

ArgumentError on CSV output

I'm getting the following error when trying to generate a CSV:
ArgumentError in ProductsController#schedulecsv
wrong number of arguments (0 for 1)
My Products controller is set up as follows:
def schedulecsv
  products = Product.find(:all)
  filename = "schedule_#{Date.today.strftime('%d%b%y')}"
  csv_data = CSV.generate do |csv|
    csv << Product.csv_header
    products.each do |p|
      csv << p.to_csv
    end
  end
  send_data csv_data,
            :type => 'text/csv; charset=iso-8859-1; header=present',
            :disposition => "attachment; filename=#{filename}.csv"
end
Does anyone have any pointers here? Driving me bonkers!
Thanks!
From the source of csv.rb, located at /usr/lib/ruby/(your Ruby version)/csv.rb on my machine, here is the source code of the CSV class's generate method:
def CSV.generate(path, fs = nil, rs = nil, &block)
  open_writer(path, 'w', fs, rs, &block)
end
The generate method requires a filename as a parameter: it creates a file with the given name, but you are calling CSV.generate with the filename missing. So you have to pass the name of the file in the generate call:
filename ="schedule_#{Date.today.strftime('%d%b%y')}"
CSV.generate filename do |csv|
csv << Product.csv_header
products.each do |p|
csv << p.to_csv
end
end
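Note that this only applies to the pre-1.9 CSV library. From Ruby 1.9 onwards the standard CSV class is FasterCSV merged into the standard library, and CSV.generate takes no filename: it yields a CSV object backed by a string and returns that string, so code like the controller above works unchanged there. For example:

require 'csv'

csv_data = CSV.generate do |csv|
  csv << ['id', 'name']
  csv << [1, 'example']
end
csv_data # => "id,name\n1,example\n"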

Rails 3.1 ActiveRecord query to an array of arrays for CSV export via FasterCSV

I'm attempting to DRY up a method I've been using for a few months:
def export(imagery_requests)
  csv_string = FasterCSV.generate do |csv|
    imagery_requests.each do |ir|
      csv << [ir.id, ir.service_name, ir.description, ir.first_name, ir.last_name, ir.email,
              ir.phone_contact, ir.region, ir.imagery_type, ir.file_type, ir.pixel_type,
              ir.total_images, ir.tile_size, ir.progress, ir.expected_date, ir.high_priority,
              ir.priority_justification, ir.raw_data_location, ir.service_overviews,
              ir.is_def, ir.isc_def, ir.special_instructions, ir.navigational_path,
              ir.fyqueue, ir.created_at, ir.updated_at]
    end
  end
  # send it to the browser with proper headers
  send_data csv_string,
            :type => 'text/csv; charset=iso-8859-1; header=present',
            :disposition => "attachment; filename=requests_as_of-#{Time.now.strftime("%Y%m%d")}.csv"
end
I figured it would be a LOT better if instead of specifying EVERY column manually, I did something like this:
def export(imagery_requests)
  csv_string = FasterCSV.generate do |csv|
    line = []
    imagery_requests.each do |ir|
      csv << ir.attributes.values.each do |i|
        line << i
      end
    end
  end
  # send it to the browser with proper headers
  send_data csv_string,
            :type => 'text/csv; charset=iso-8859-1; header=present',
            :disposition => "attachment; filename=requests_as_of-#{Time.now.strftime("%Y%m%d")}.csv"
end
That should be creating an array of arrays. It works just fine in the Rails console. But in the production environment, it just produces garbage output. I'd much rather make this method extensible so I can add more fields to the ImageryRequest model at a later time. Am I going about this all wrong?
I'm guessing that it probably works in the console when you do it for just one imagery_request, yes?
But when you do multiple it fails?
Again I'm guessing that's because you never reset line to be an empty array again. So you're continually filling a single array.
Try the simple way first, to check that it works, and only then start going all << on it:
csv_string = FasterCSV.generate do |csv|
  imagery_requests.each do |ir|
    csv << ir.attributes.values.clone
  end
end
PS - in the past I've even used clone on my line-by-line array, just to be sure I wasn't doing anything untoward with persisted stuff...
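If the goal is to keep the export extensible, one further tweak (a sketch, not from the original answer) is to drive the header row and the value order from the model's column_names, so new columns show up automatically:

csv_string = FasterCSV.generate do |csv|
  columns = ImageryRequest.column_names
  csv << columns
  imagery_requests.each do |ir|
    # values_at keeps each row's order in sync with the header row
    csv << ir.attributes.values_at(*columns)
  end
end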
