Save .csv file to database not recognizing to_hash method - ruby-on-rails

As per this example, the following method:
require "csv"
def import_vault_data(filename)
fn = "#{RAILS_ROOT}/public/data/#{filename}"
CSV.foreach(fn, :headers => true) do |row|
House.create!(row.to_hash)
end
end
is producing this error:
undefined method `to_hash' for #<Array:0x104cc07b8>
Any clue as to what is missing?
I am using Rails 2.3.9.

May be a bit late with the answer, but you need:
CSV.foreach(file.path, headers: true) do |row|
since you can't call to_hash on the row unless the CSV is parsed with headers.
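For reference, on Ruby 1.9+ (or with FasterCSV) the headers: true option makes each yielded row a CSV::Row rather than a plain Array, so to_hash is available. A quick check; the column names shown are just placeholders:
CSV.foreach(file.path, headers: true) do |row|
  puts row.class            # => CSV::Row, not Array
  puts row.to_hash.inspect  # e.g. {"street" => "...", "price" => "..."}
end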

Try this for Ruby 1.8.7:
House.create!(row.hash)
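That said, on Ruby 1.8.7 the standard library CSV yields plain Arrays and has no row objects, so another option is to build the hash yourself. A minimal sketch, assuming the first line of the file holds column names that match House's attributes:
require "csv"
def import_vault_data(filename)
  fn = "#{RAILS_ROOT}/public/data/#{filename}"
  headers = nil
  CSV.foreach(fn) do |row|
    if headers.nil?
      headers = row                               # first row: remember the column names
    else
      House.create!(Hash[*headers.zip(row).flatten])
    end
  end
end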

You can also do:
House.create!(Hash[row])

Related

CSV won't import by key in hash (Rails)

I'm having problems importing this CSV:
municipality,province,province abbrev,country,region
Vancouver,British Columbia,BC,Canada,Metro Vancouver - North
Specifically, Vancouver is not being returned when I look for its value by its key:
municipality_name = row["municipality"]
Here's the code:
def self.import_csv(file)
  CSV.foreach(file, headers: true,
              skip_blanks: true,
              skip_lines: /^(?:,\s*)+$/,
              col_sep: ",") do |row|
    municipality_name = row["municipality"]
    puts row.to_h
    puts "municipality_name: #{municipality_name}"
    puts "row['municipality']: #{row['municipality']}"
    puts "row[0]: #{row[0]}"
  end
end
Here's the output:
irb(main):052:0> Importers::Municipalities.import_csv('tmp/municipalities.csv')
{"municipality"=>"Vancouver", "province"=>"British Columbia", "province abbrev"=>"BC", "country"=>"Canada", "region"=>"Metro Vancouver - North"}
municipality_name:
row['municipality']:
row[0]: Vancouver
Seems like I'm missing something obvious. I thought maybe there was a hidden character in the CSV, but I turned on hidden characters in Sublime and no dice.
Thanks in advance.
You need to call to_h on the row if you want to access it by its keys. Otherwise, it is an array-like object, accessible by indices.
def self.import_csv(file)
  CSV.foreach(file, headers: true,
              skip_blanks: true,
              skip_lines: /^(?:,\s*)+$/,
              col_sep: ",") do |row|
    row = row.to_h
    municipality_name = row["municipality"]
    puts "municipality_name: #{municipality_name}"
  end
end
Seems like it was a problem with the CSV itself and the code works fine: I created a new CSV, typed in the same content, and it worked. Maybe an invisible character that Sublime wasn't showing? I can't verify, as I wiped the original CSV that was causing the issue.
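For what it's worth, one common cause of exactly this symptom is a UTF-8 byte order mark at the start of the file: it becomes part of the first header name, so row["municipality"] silently misses while row[0] still works. A hedged sketch that strips a BOM if one is present, keeping the rest of the method as in the question:
def self.import_csv(file)
  File.open(file, 'r:bom|utf-8') do |io|          # 'bom|utf-8' drops a leading BOM if there is one
    CSV.new(io, headers: true, skip_blanks: true).each do |row|
      municipality_name = row["municipality"]
      puts "municipality_name: #{municipality_name}"
    end
  end
end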

Rails consider multiple col_sep values for CSV import

I want to use comma OR semicolon as :col_sep when importing CSV data in rails:
CSV.foreach(file.path, :col_sep => (";"), headers: true) do |row|
  user_hash = row.to_hash
  User.create!(user_hash)
end
works.
But passing multiple col_sep values inline won't work:
CSV.foreach(file.path, :col_sep => (";",","), headers: true) do |row|
Is it even possible? I haven't found anything in the docs nor here on Stack Overflow.
That isn't possible: :col_sep only accepts a single string. There are workarounds for this, mentioned here and here.
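If the files really do arrive with either separator, one workaround is to sniff the separator from the first line before parsing. A sketch only: detect_col_sep is a made-up helper, and it assumes each file uses one separator consistently:
def detect_col_sep(path)
  first_line = File.open(path, &:readline)        # read just the header line
  first_line.count(";") > first_line.count(",") ? ";" : ","
end

CSV.foreach(file.path, col_sep: detect_col_sep(file.path), headers: true) do |row|
  user_hash = row.to_hash
  User.create!(user_hash)
end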

Import CSV with long numbers in Rails

I am going to keep this question simple.
I am trying to import a CSV file into my application.
The file has long numbers in it, such as "9405510200830754182150",
but when the file is imported the data looks like this: "9.40551e+21".
Does anyone know how to get around this?
Here is the code I am using:
CSV.foreach(file.path, headers: true) do |row|
  puts "row: #{row.inspect}"
end
UPDATE
Thank you for the comments. I am not sure why CSV is converting that number into a float; I need to keep it as a string.
I should clarify that I am using Rails 3.2.18 for this project
If you want to reproduce my code:
1. Create a CSV with 9405510200830754182150 in it.
2. Run this code in the terminal:
file = File.join(Rails.root, 'tracking.csv')
CSV.foreach(file, headers: true) do |row|
  puts "row: #{row.inspect}"
end
I need to be able to keep "9405510200830754182150" as a string, since this is the tracking number of an order and needs to be stored in the database.
Are you sure that "9.40551e+21" is not a visual approximation? Try this:
CSV.foreach(file.path, headers: true) do |row|
  puts row['my_numeric_header']
end
CSV is supposed to treat everything in a file as a string by default. You could try the converters: :numeric option:
CSV.foreach(file.path, headers: true, converters: :numeric) do |row|
  puts "row: #{row.inspect}"
end
which will interpret numbers as the appropriate types. Otherwise you might have to debug the CSV module code to figure out what's going on.
The :float converter converted your number to a Float for you. Unfortunately, a Float cannot represent such a large number exactly; see the comments...
[14] pry(main)> val = CSV::Converters[:float].('9405510200830754182150')
=> 9.405510200830755e+21
[15] pry(main)> val.class
=> Float
[16] pry(main)> "%d" % val
=> "9405510200830754553856"
[17] pry(main)> "%f" % val
=> "9405510200830754553856.000000"

how to get headers from a CSV file in ruby

I need to validate headers in a CSV file before parsing data in it.
# convert the data into an array of hashes
CSV::Converters[:blank_to_nil] = lambda do |field|
  field && field.empty? ? nil : field
end
csv = CSV.new(file, :headers => true, :header_converters => :symbol, :converters => [:all, :blank_to_nil])
csv_data = csv.to_a.map {|row| row.to_hash }
I know I can use the headers method to get the headers:
headers = csv.headers
But the problem with headers method is it "Returns nil if headers will not be used, true if they will but have not yet been read, or the actual headers after they have been read."
So if I put headers = csv.headers above the csv_data = csv.to_a.map {|row| row.to_hash } line, headers is true, and if I put it after reading the data, headers contains the header row as an array. That imposes an ordering of instructions on my method, which is very hard to test and is bad programming.
Is there a way to read headers row without imposing order in this scenario? I'm using ruby 2.0.
CSV.open(file_path, &:readline)
This opens the file and returns just the first parsed row, i.e. the header fields, without reading the rest of the file.
I get the problem! I'm having the same one. Calling read seems to do what you want (populates the headers variable):
data = CSV.new(file, **flags)
data.headers # => true
data = CSV.new(file, **flags).read
data.headers # => ['field1', 'field2']
There might be other side effects I'm not aware of, but this works for me and doesn't smell too bad.
I don't quite get the problem. If you use one of the iterator methods, it's quite easy to do some validation on the headers:
CSV.foreach('tmp.txt', headers: true) do |csv|
  return if csv.headers[0] == 'xyz'
end
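For the original concern, checking the headers without committing to a particular order of reads, one option is to read only the header row through a separate CSV object and rewind the IO afterwards. A sketch, assuming file is an IO as in the question; the EXPECTED list and the method name are placeholders:
EXPECTED = [:field1, :field2]   # whatever headers the import requires

def valid_headers?(file)
  csv = CSV.new(file, headers: true, return_headers: true, header_converters: :symbol)
  header_row = csv.shift        # with return_headers: true, the first row returned is the header row itself
  file.rewind                   # put the IO back so the normal parsing can start from the top
  header_row.headers == EXPECTED
end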

Collecting hashes into OpenStruct creates "table" entry

Why does this (evaluated in a Rails console)
[{:a => :b}].collect {|x| OpenStruct.new(x)}.to_json
adds a "table" record in there?
"[{\"table\":{\"a\":\"b\"}}]
I want just this:
"[{\"a\":\"b\"}]
Does it mean that Rails' to_json method handles OpenStruct in a different way? When I try it in irb, it's not there:
require 'ostruct'
[{:a => :b}].collect {|x| OpenStruct.new(x)}.inspect
Because @table is an instance variable of OpenStruct and Object#as_json returns a Hash of the instance variables.
In my project, I implemented OpenStruct#as_json to override the behaviour.
require "ostruct"
class OpenStruct
  def as_json(options = nil)
    @table.as_json(options)
  end
end
Use marshal_dump, although this somewhat defeats the purpose of converting it to an OpenStruct beforehand:
[{:a => :b}].collect {|x| OpenStruct.new(x).marshal_dump }.to_json
=> "[{\"a\":\"b\"}]"
The shorter way would be:
[{:a => :b}].to_json
"[{\"a\":\"b\"}]"
Alternatively you could monkey patch OpenStruct#as_json as shown in hiroshi's answer:
require "ostruct"
class OpenStruct
  def as_json(options = nil)
    @table.as_json(options)
  end
end
I get around the problem by subclassing OpenStruct like so:
class DataStruct < OpenStruct
  def as_json(*args)
    super.as_json['table']
  end
end
then you can easily convert to JSON like so:
o = DataStruct.new(a:1, b:DataStruct.new(c:3))
o.to_json
# => "{\"a\":1,\"b\":{\"c\":3}}"
Neat huh? So in answer to your question, you'd write this instead:
[{:a => :b}].collect {|x| DataStruct.new(x)}.to_json
giving you:
=> "[{\"a\":\"b\"}]"
UPDATE FOR RUBY 2.7 (Feb 5, 2021)
require 'json'
require 'ostruct'

class OpenStruct
  def to_json
    to_hash.to_json
  end

  def to_hash
    to_h.map { |k, v|
      v.respond_to?(:to_hash) ? [k, v.to_hash] : [k, v]
    }.to_h
  end
end
o = OpenStruct.new(a:1, b:OpenStruct.new(c:3))
p o.to_json
I found the other responses a tad confusing, having landed here just to figure out how to turn my OpenStruct into a Hash or JSON. To clarify, you can just call marshal_dump on your OpenStruct:
$ OpenStruct.new(hello: :world).to_json
=> "{\"table\":{\"hello\":\"world\"}}"
$ OpenStruct.new(hello: :world).marshal_dump
=> {:hello=>:world}
$ OpenStruct.new(hello: :world).marshal_dump.to_json
=> "{\"hello\":\"world\"}"
I personally would be hesitant to monkey-patch OpenStruct unless you're doing it on a subclass, as it may have unintended consequences.
With Ruby 2.1.2 you can use the following to get JSON without the table root element:
[{:a => :b}].collect {|x| OpenStruct.new(x).to_h}.to_json
=> "[{\"a\":\"b\"}]"
openstruct_array.map(&:to_h).as_json
The issue here is that internally it's doing an as_json, which creates a Hash with the table key (because as_json serializes all of the object's instance variables too, and @table is an instance variable of OpenStruct), and then it's doing a to_json on that, which stringifies it.
So the easiest way is to first just use to_h (which doesn't serialize the instance variables) and then to_json on that:
OpenStruct.new(x).to_h.to_json, or in your case openstruct_array.map(&:to_h).to_json
