So, I've got a script, say program.rb, and in it I want to output the table_print version of an array of arrays
['value1','value2','value3']
['value4','value5','value6']
so that it looks like this in the .txt file I output
col1 | col2 | col3
-------------------------------------------------------------------------
value1 | value2 | value3
.
.
.
I already have table_print installed, but this is all I have so far as a working model:
require 'table_print'
TABLEPRINT STUFF?
open('table_print_output.txt', 'a') { |g|
  g.puts TABLEPRINT?
}
I guess I'm just not getting how to do the Ruby equivalent of MySQL's CREATE TABLE:
CREATE TABLE MyGuests (
  id INT(6) UNSIGNED AUTO_INCREMENT PRIMARY KEY,
  firstname VARCHAR(30) NOT NULL,
  lastname VARCHAR(30) NOT NULL,
  email VARCHAR(50),
  reg_date TIMESTAMP
)
and INSERT INTO:
INSERT INTO table_name (column1, column2, column3,...)
VALUES (value1, value2, value3,...)
And I don't want a temporary/latent database just sitting around for no reason. It's like I need a database as a variable or something; that is, I create it, I populate it, I print it, I destroy it.
table_print can't print nested arrays like this:
arrays = [
  ['value1', 'value2', 'value3'],
  ['value4', 'value5', 'value6']
]
because it flattens the input. You have to convert the inner arrays to another object.
Hash would work:
hashes = arrays.map { |values| %w(col1 col2 col3).zip(values).to_h }
#=> [
# {"col1"=>"value1", "col2"=>"value2", "col3"=>"value3"},
# {"col1"=>"value4", "col2"=>"value5", "col3"=>"value6"}
# ]
tp hashes
# COL1 | COL2 | COL3
# -------|--------|-------
# value1 | value2 | value3
# value4 | value5 | value6
Struct would work as well:
Row = Struct.new(:col1, :col2, :col3)
rows = arrays.map { |values| Row.new(*values) }
#=> [
# #<struct Row col1="value1", col2="value2", col3="value3">,
# #<struct Row col1="value4", col2="value5", col3="value6">
# ]
tp rows
# COL1 | COL2 | COL3
# -------|--------|-------
# value1 | value2 | value3
# value4 | value5 | value6
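To get that table into the .txt file from the question: tp writes to standard output, so one approach (a minimal sketch using only core Ruby, not a table_print-specific API) is to temporarily swap $stdout for a StringIO, capture what tp prints, and append it to the file:

require 'table_print'
require 'stringio'

hashes = [
  { "col1" => "value1", "col2" => "value2", "col3" => "value3" },
  { "col1" => "value4", "col2" => "value5", "col3" => "value6" }
]

buffer = StringIO.new
$stdout = buffer     # capture everything tp prints
tp hashes
$stdout = STDOUT     # restore normal printing

open('table_print_output.txt', 'a') { |g| g.puts buffer.string }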
Looks like you need String#ljust:
rows = [
  ['value1', 'value2', 'value3'],
  ['value4', 'value5', 'value6']
]

rows.each do |row|
  puts "#{row[0].ljust(30)}|#{row[1].ljust(30)}|#{row[2].ljust(30)}"
end
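If you also want the header row and dashed separator from the question, a sketch along the same lines (the column names and the width of 30 are assumptions):

headers = %w[col1 col2 col3]
width = 30

puts headers.map { |h| h.ljust(width) }.join('|')
puts '-' * (headers.size * width + headers.size - 1)

rows.each do |row|
  puts row.map { |value| value.ljust(width) }.join('|')
end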
There is an old table with a column of type JSON, but only arrays are stored in this column.
Even though I am storing arrays, I am not able to query this field using the ANY keyword (which works on array-type columns in Postgres, as in this post).
E.g., say ['Apple', 'Orange', 'Banana'] is stored as JSON in the fruits column; I want to query like Market.where(":name = ANY(fruits)", name: "Orange") and get all the markets with Oranges available.
Can anyone help me write a migration to change the existing column (type: json) to array type?
One example assuming a json field:
\d json_test
              Table "public.json_test"
  Column   |  Type   | Collation | Nullable | Default
-----------+---------+-----------+----------+---------
 id        | integer |           |          |
 fld_json  | json    |           |          |
 fld_jsonb | jsonb   |           |          |
 fruits    | json    |           |          |
insert into json_test (id, fruits) values (1, '["Apple", "Orange", "Banana"]');
insert into json_test (id, fruits) values (2, '["Pear", "Orange", "Banana"]');
insert into json_test (id, fruits) values (3, '["Pear", "Apple", "Banana"]');
WITH fruits AS (
  SELECT id, json_array_elements_text(fruits) AS fruit
  FROM json_test
)
SELECT id
FROM fruits
WHERE fruit = 'Orange';

 id
----
  1
  2
UPDATE: Method to convert a JSON array into a Postgres array:
SELECT array_agg(fruit)
FROM (
  SELECT id, json_array_elements_text(fruits) AS fruit
  FROM json_test
) AS elements
GROUP BY id;
array_agg
-----------------------
{Pear,Apple,Banana}
{Pear,Orange,Banana}
{Apple,Orange,Banana}
This assumes the JSON array has homogeneous elements, as that is a requirement for Postgres arrays.
A simpler method of finding rows that have 'Orange' in the JSON field:
SELECT id, fruits
FROM json_test
WHERE fruits::jsonb ? 'Orange';
id | fruits
----+--------------------------------
1 | ["Apple", "Orange", "Banana"]
2 | ["Pear", "Orange", "Banana"]
class AddArrayFruitsToMarkets < ActiveRecord::Migration[6.0]
  def up
    rename_column :markets, :fruits, :old_fruits
    add_column :markets, :fruits, :string, array: true

    # json_array_elements is a set-returning function, so it can't be assigned
    # to a column directly; build the array with ARRAY(SELECT ...) instead.
    Market.update_all('fruits = ARRAY(SELECT json_array_elements_text(old_fruits))')
  end
end
class RemoveJsonFruitsFromMarkets < ActiveRecord::Migration[6.0]
  def up
    remove_column :markets, :old_fruits
  end
end
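With fruits now a real array column, the ANY query from the question should work; a quick sketch ("Orange" is just an example value):

Market.where(":name = ANY(fruits)", name: "Orange")
# or with a positional bind:
Market.where("? = ANY(fruits)", "Orange")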
But really, if you're going to change things anyway, why not create proper tables instead? Swapping JSON for an array isn't much of an improvement.
class Fruit < ApplicationRecord
  validates :name, presence: true

  has_many :market_fruits
  has_many :markets, through: :market_fruits
end

class MarketFruit < ApplicationRecord
  belongs_to :market
  belongs_to :fruit
end

class Market < ApplicationRecord
  has_many :market_fruits
  has_many :fruits, through: :market_fruits

  def self.with_fruit(name)
    joins(:fruits).where(fruits: { name: name })
  end

  def self.with_fruits(*names)
    left_joins(:fruits)
      .group(:id)
      .where(fruits: { name: names })
      .having('COUNT(fruits.*) >= ?', names.length)
  end
end
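Usage would then look like this (fruit names are example values):

Market.with_fruit("Orange")             # markets that have oranges
Market.with_fruits("Orange", "Banana")  # markets that have both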
Say I had a record in my database like
+----+-----------+----------+
| id | firstname | lastname |
+----+-----------+----------+
| 1 | 'Bill' | nil |
+----+-----------+----------+
(note the last name is nil)
Is there any way I can retrieve the above record using the following hash structure as search parameters:
vals = {firstname: "Bill", lastname: "test"}
Table.where(vals)
(i.e. find the closest match, ignoring the nil column value in the table)
(I'm thinking of checking each key in the hash individually and stopping when a match is found, but I'm wondering if there is a more efficient way, especially for larger tables)
You could write a custom search:
def self.optional_where(params)
  query_params = params.keys.map do |k|
    "(#{k} = ? OR #{k} IS NULL)"
  end.join(" AND ")

  where(query_params, *params.values)
end
Then you would use it like:
Table.optional_where(vals)
This will produce the following query:
SELECT "tables".* FROM "tables" WHERE ((firstname = 'Bill' OR first_name IS NULL) AND (lastname = 'test' OR last_name IS NULL))
Let's make a custom search like this:
scope :custom_search, ->(params) {
  params.each do |k, v|
    params[k] = if v.is_a?(Array)
                  (v << nil).uniq
                else
                  [v, nil]
                end
  end
  where(params)
}
Then we use it like:
search_params = {firstname: "Bill", lastname: "test"}
Table.custom_search(search_params)
The generated SQL will be roughly:
SELECT * FROM tables WHERE (firstname = 'Bill' OR firstname IS NULL) AND (lastname = 'test' OR lastname IS NULL)
This means you don't care whether one or more of the fields is nil.
I am creating a Ruby on Rails application. While importing dates into the available_on column, the date is not inserted into the database correctly; it changes when I import data using a CSV file.
def import
  require 'csv'
  file = params[:file]
  CSV.foreach(file.path, headers: true) do |row|
    @prod = Spree::Product.new
    @prod.name = row["name"]
    @prod.shipping_category_id = row["shipping_category_id"]
    @prod.description = row["description"]
    @prod.available_on = row["available_on"]
    @prod.meta_description = row["meta_description"]
    @prod.meta_keywords = row["meta_keywords"]
    @prod.tax_category_id = row["tax_category_id"]
    @prod.promotionable = row["promotionable"]
    @prod.meta_title = row["meta_title"]
    @prod.featured = row["featured"]
    @prod.supplier_id = row["supplier_id"]
    @prod.master.price = row["master_price"]
    @prod.master.cost_price = row["cost_price"]
    @prod.master.depth = row["depth"]
    @prod.master.height = row["height"]
    @prod.master.width = row["width"]
    @prod.master.weight = row["weight"]
    @prod.master.sku = row["sku"]
    @prod.master.tax_category_id = row["tax_category_id"]
    @prod.save!
  end
end
My database table looks like this:
| Field                | Type         |
|----------------------|--------------|
| id                   | int(11)      |
| name                 | varchar(255) |
| description          | text         |
| available_on         | datetime     |
| deleted_at           | datetime     |
| slug                 | varchar(255) |
| meta_description     | text         |
| meta_keywords        | varchar(255) |
| tax_category_id      | int(11)      |
| shipping_category_id | int(11)      |
| created_at           | datetime     |
| updated_at           | datetime     |
| promotionable        | tinyint(1)   |
| meta_title           | varchar(255) |
| featured             | tinyint(1)   |
| supplier_id          | int(11)      |
I know this is due to the datatype of available_on.
The date I insert into available_on looks like '2015-10-10'.
Can anyone tell me how to remove this inconsistency when importing the CSV file?
Use the :converters option to tell Ruby to automatically convert date fields:
require 'csv'

def import
  file = params[:file]
  CSV.foreach(file.path, headers: true, converters: :date) do |row|
    # ...
  end
end
P.S. Might I suggest cleaning this up a bit?
PRODUCT_ATTR_NAMES = %w[ name shipping_category_id description
available_on meta_description meta_keywords
tax_category_id promotionable meta_title
featured supplier_id ]
MASTER_ATTR_NAMES = %w[ master_price cost_price depth height
width weight sku tax_category_id ]
def import
  require 'csv'
  file = params[:file]
  CSV.foreach(file.path, headers: true, converters: :date) do |row|
    row = row.to_hash
    product_attrs = row.slice(*PRODUCT_ATTR_NAMES)
    master_attrs = row.slice(*MASTER_ATTR_NAMES)
    # The variant attribute is `price`, so rename the CSV's `master_price` key
    master_attrs["price"] = master_attrs.delete("master_price")

    @prod = Spree::Product.create!(product_attrs) do |product|
      product.master.assign_attributes(master_attrs)
    end
  end
end
You should parse the value into an instance of class Time:
require 'time'
@prod.available_on = Time.parse(row["available_on"])
You would most likely want to do some type conversion on other columns as well. Those that should contain integer values, for example:
@prod.tax_category_id = Integer(row["tax_category_id"])
Likewise for the other non-string values.
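Since CSV cells can be empty, and both Time.parse and Integer() raise on nil, it may be worth guarding each conversion; a minimal sketch:

# Guard the conversions: skip cells that are missing from the CSV row
if (available_on = row["available_on"])
  @prod.available_on = Time.parse(available_on)
end
if (tax_category_id = row["tax_category_id"])
  @prod.tax_category_id = Integer(tax_category_id)
end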
I am using the elasticsearch-rails gem to retrieve data from Elasticsearch in a dynamic way, meaning the result can have no aggregations or several, depending on the user's choices.
Imagine a response like this:
(...)
"aggregations"=>
{"agg_insignia_id"=>
{"buckets"=>
[{"key"=>1,
"key_as_string"=>"1",
"doc_count"=>32156,
"agg_chain_id"=>
{"buckets"=>
[{"key"=>9,
"key_as_string"=>"9",
"doc_count"=>23079,
"agg_store_id"=>
{"buckets"=>
[{"key"=>450,
"key_as_string"=>"450",
"doc_count"=>145,
"agg_value"=>{"value"=>1785.13}},
{"key"=>349,
"key_as_string"=>"349",
"doc_count"=>143,
"agg_value"=>{"value"=>1690.37}},
How can I transform that data into tabular data like this?
| insignia_id | chain_id | store_id | value |
| 1 | 9 | 450 | 1785.13 |
| 1 | 9 | 349 | 1690.37 |
(...)
EDIT: To be clear about the response I am looking for, there are two choices here: an array (simple) or an array of hashes.
Array style: [[insignia_id, chain_id, store_id, value], [1,9,450,1785.13], [1,9,349,1690.37],...]
Array of hashes style: [{insignia_id => 1, chain_id => 9, store_id => 450, value => 1785.13}, {insignia_id => 1, chain_id => 9, store_id => 349, value => 1690.37}]
The latter is more like an ActiveRecord style...
OK, so I came up with a solution for the array response.
First, I added a helper for what comes ahead:
class Hash
  def deep_find(key, object = self, found = nil)
    if object.respond_to?(:key?) && object.key?(key)
      return object[key]
    elsif object.is_a? Enumerable
      object.find { |*a| found = deep_find(key, a.last) }
      return found
    end
  end
end
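To illustrate what the helper does (a toy hash, not the real Elasticsearch response): it walks nested hashes and returns the first value found under the given key:

nested = { "a" => { "b" => { "c" => 42 } } }
nested.deep_find("c") #=> 42
nested.deep_find("x") #=> nil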
Now for the array algorithm (added in a concern):
def self.to_table_array(data, aggs, final_table = nil, row = [])
  final_table = [aggs.keys] if final_table.nil?
  hash_tree = data.deep_find(aggs.keys.first)

  if aggs.values.uniq.length == 1 && aggs.values.uniq == [:data]
    aggs.keys.each do |agg|
      row << data[agg]["value"]
    end
    final_table << row
  else
    hash_tree["buckets"].each_with_index do |h, index|
      row.pop if index > 0
      aggs.shift if index == 0
      row << h["key_as_string"]
      final_table = to_table_array(h, aggs.clone, final_table, row.clone)
    end
  end

  final_table
end
The call for this method could be made like this:
_fields = { "insignia_id" => :row, "chain_id" => :row, "store_id" => :row, "value" => :data }
# res.response is the Elasticsearch response
result = to_table_array(res.response, _fields)
There are some things quite specific to this case, as you can see with the _fields variable. Also, I'm assuming each aggregation is named after the term itself. The rest is much the same for every possible case.
Producing an array of hashes instead is pretty simple from here, just by replacing a few lines.
I put a lot of effort into this. Hope it helps someone other than me.
I'm new to Ruby and I want to try to access a MySQL database:
require 'rubygems'
require 'dbi'

class DBConnection
  attr_accessor :dbh

  # Connect to db
  def connect?(driver_url, user, pass)
    @dbh = DBI.connect(driver_url, user, pass)
    true
  rescue DBI::DatabaseError => e
    puts "Error message: #{e.errstr}"
    @dbh.rollback if @dbh
    false
  end

  def execute_customize(query, params)
    stm = @dbh.prepare(query)
    if params && !params.empty?
      stm.execute(params)
    else
      stm.execute
    end

    header = false
    stm.fetch do |row|
      unless header
        puts("ID Name")
        header = true
      end
      puts("#{row[0]} #{row[1]}")
    end
  end
end
db = DBConnection.new
db.connect?("DBI:Mysql:test:localhost", "root", "123456")
db.execute_customize("SELECT * FROM test.employee WHERE name = ? OR name = ? ",*["John","Terry"])
But the above returns the following error:
in `execute_customize': wrong number of arguments (3 for 2) (ArgumentError)
But the execution is successful with:
db.execute_customize("SELECT * FROM test.employee WHERE name = ?", *["John"])
What am I doing wrong?
Demo data from the employee table:
+------+-------+
| id | name |
+------+-------+
| 1 | John |
| 2 | Terry |
| 3 | Vidal |
| 4 | CR7 |
| 5 | M10 |
| 6 | R10 |
| 7 | F4 |
+------+-------+
Update: Your comment suggested using IN in the query, but with a different query like:
SELECT * FROM test.employee WHERE name = ? AND id > ?
I still need a way to pass a separate parameter for every "?" placeholder.
You're passing three arguments instead of two.
The splat operator * expands the array, so its elements are treated as separate arguments.
Try
dbh.execute("SELECT * FROM test.employee WHERE name IN (?)", names)
where names is a comma-separated list of strings.
That should work, but you may not need to use execute for this.
If you're using Rails, you can just use
Employee.where(name: ["John","Terry"])
and ActiveRecord will understand what you mean.
See http://guides.rubyonrails.org/active_record_querying.html
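As for the update in the question (queries like name = ? AND id > ?): the ArgumentError comes from splatting at the call site while execute_customize only accepts two parameters. A sketch of a fix, splatting inside the method instead, so each array element binds one "?" in order (DBI's execute accepts one argument per placeholder):

def execute_customize(query, params = [])
  stm = @dbh.prepare(query)
  params && !params.empty? ? stm.execute(*params) : stm.execute
  stm.fetch { |row| puts row.to_a.join("  ") }
end

db.execute_customize("SELECT * FROM test.employee WHERE name = ? OR name = ?", ["John", "Terry"])
db.execute_customize("SELECT * FROM test.employee WHERE name = ? AND id > ?", ["Terry", 1])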