I'm running a SQL query in a model to get some related data.
I have an array called newScoring containing ["_646_maturity", "_660_maturity", "_651_maturity", "_652_maturity", "_641_maturity"]:
newScoring.each do |e|
  numero = e.from(1).to(-10) # strip the surrounding text to keep only the number: "646"
  sql = "SELECT * FROM pratiques WHERE numero LIKE '%" + numero + "%'" # fetch the pratique whose numero matches the extracted number
  records_array = ActiveRecord::Base.connection.execute(sql)
  Rails.logger.debug "SQL : " + records_array.inspect
end
When I log records_array.inspect, I got
[{"id"=>1, "numero"=>646, "titre"=>"Acquérir en priorité des équipements reconditionnés", "ponderation"=>3, "texte_kpi"=>"% du parc reconditionné", "section"=>"achats-responsables", "created_at"=>"2019-06-03 14:10:14.228234", "updated_at"=>"2019-06-03 14:10:14.228234"}]
I want to access the ponderation value, but I couldn't find any related docs. I tried different things but got error messages about string conversion.
Thanks for the help!
Since records_array is an Array of Hashes you need to first select the row (Hash) you want and then access the desired column:
records_array.first['ponderation'] # or records_array[0]['ponderation']
=> 3
Btw, if you are on Rails, why don't you generate a model for pratiques and use ActiveRecord?
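For illustration, a minimal sketch of that approach, assuming Rails 5+ and a Pratique model backed by the existing pratiques table (the model name and the exact-match lookup are my assumptions, not code from the question):
# app/models/pratique.rb -- hypothetical model over the existing pratiques table
class Pratique < ApplicationRecord
end

newScoring.each do |e|
  numero = e.from(1).to(-10)                  # "646"
  pratique = Pratique.find_by(numero: numero) # parameterized lookup, no hand-built SQL
  Rails.logger.debug "ponderation: #{pratique.ponderation}" if pratique
end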
Related
I am new to Rails, working with some JSON, and not sure how to get to the data in the examples below:
1) If I were to use JSON.parse(response)['Response']['test']['data']['123456'], I would need to parse another response for 123457. Is there a better way to loop through all the objects in data?
2) Based on the membershipId, identify the top-level object under data, i.e. its key.
"test": {
"data": {
"123456": {
"membershipId": "321321312",
"membershipType": a,
},
"123457": {
"membershipId": "321321312",
"membershipType": a,
},
}
JSON.parse(response)['Response']['test']['data'].each do |key, object|
  puts key
  puts object['membershipId']
  # ...
end
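If you only need the values rather than printing them, a map variation (a sketch, not from the original answer) collects every membershipId in one pass:
membership_ids = JSON.parse(response)['Response']['test']['data'].map do |_key, object|
  object['membershipId']
end
# => ["321321312", "321321312"]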
To select the data record associated with a particular membership:
match_membership = '321321312'
member = JSON.parse(response)['Response']['test']['data'].select do |_key, object|
  object['membershipId'] == match_membership
end
puts member.keys.first
# => 123456
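Since select returns a Hash of every matching entry, an alternative (my sketch, not part of the original answer) is Enumerable#find, which returns only the first matching [key, value] pair:
key, object = JSON.parse(response)['Response']['test']['data'].find do |_key, obj|
  obj['membershipId'] == match_membership
end
key # => "123456"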
For 1:
Assumption: by saying you "need to parse another response", I take it you were doing something like below:
# bad code: because you are parsing `response` multiple times
JSON.parse(response)['Response']['test']['data']['123456']
JSON.parse(response)['Response']['test']['data']['123457']
then simply:
Solution 1:
If you are only going to access values 2+ levels deep a couple of times, parse once and reuse the parsed hash:
response_hash = JSON.parse(response)
response_hash['Response']['test']['data']['123456']
response_hash['Response']['test']['data']['123457']
Solution 2:
If you are going to access values 2+ levels deep many times:
response_hash = JSON.parse(response)
response_hash_response_test_data = response_hash['Response']['test']['data']
response_hash_response_test_data['123456']
response_hash_response_test_data['123457']
response_hash_response_test_data['123458']
response_hash_response_test_data['123459']
response_hash_response_test_data['123460']
# ...
Solution 2 is better than Solution 1 because it avoids the repeated Hash#[] "getter" calls made each time you chain ['Response'], then ['test'], then ['data']: the nested level of the hash is stored once in a variable (which does not duplicate the values in memory). It is also more readable this way.
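As a side note (not part of the original answer), Ruby 2.3+ also provides Hash#dig, which walks the nested keys in a single call and returns nil instead of raising when an intermediate key is missing:
response_hash = JSON.parse(response)
response_hash.dig('Response', 'test', 'data', '123456') # => the inner hash, or nil if any key is absent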
I am having a hard time coming up with the syntax for updating a map using cqerl. I have tried the following so far and it doesn't work:
statement = "UPDATE keyspace SET data[?] = :data_value WHERE scope = ?;",
values = [{data, "Key Value"}, {data_value, "Data Value"}, {scope, "Scope Value"}]
What am I doing wrong here?
Also, setting a TTL does not work:
statement = "INSERT INTO data(scope)
VALUES(?) USING ttl ?",
values = [{scope, "Scope Value"},{[ttl], 3650}]
Anyone, any idea?
Please note that you are using single quotes around the values, which in Erlang syntax indicates you are using atoms. Based on the cqerl documentation, it doesn't expect atoms there (see cqerl data types).
For example try:
statement = "INSERT INTO data(scope)
VALUES(?) USING ttl ?",
values = [{scope, "Scope Value"},{[ttl], 3650}]
Based on a reply from the contributor on GitHub, it takes an atom, so '[ttl]' is the right way:
https://github.com/matehat/cqerl/issues/122
For updating a map, the correct way is to use atoms in the values part:
statement = "UPDATE keyspace SET data[?] = ? WHERE scope = ?;",
values = [{'key(data)', "Key Value"}, {'value(data)', "Data Value"}, {scope, "Scope Value"}]
I have scraped data from a website and entered it into an array using the code below:
def process_course_details(course_details)
  details_array = []
  details_link = true
  entry_link = true
  details_info = {}
  # Sets all data in hash
  details_info[:url] = clean_link(course_details.search('div.coursedetails_programmeurl a'))
  details_array.push(details_info)
  print_details_info(details_info)
  entry_link = course_details.search('ul.details_tabs').first
end
The code above stores the element being pulled as such:
<a href="http://www.abdn.ac.uk/study/courses/undergraduate/C8R1/">View course details on provider's website</a>
But I'd like to clean the above down to this:
http://www.abdn.ac.uk/study/courses/undergraduate/C8R1/
or, failing that, remove the apostrophe and have this:
View course details on providers website
You can extract the href with Nokogiri like this:
html = Nokogiri::HTML('<a href="http://www.abdn.ac.uk/study/courses/undergraduate/C8R1/">View course details on provider\'s website</a>')
html.xpath("//a/@href").to_s # => "http://www.abdn.ac.uk/study/courses/undergraduate/C8R1/"
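If you instead want the link text with the apostrophe stripped (the fallback mentioned in the question), a sketch along the same lines, reusing the selector from the question's code:
link = course_details.search('div.coursedetails_programmeurl a').first
url  = link['href']          # => "http://www.abdn.ac.uk/study/courses/undergraduate/C8R1/"
text = link.text.delete("'") # => "View course details on providers website"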
Based on your comment:
When storing other data I've scraped into the database the apostrophe
provided errors and stopped it. Once I had cleaned the apostrophe and
it no longer was part of the array the code worked and the table was
created.
db = SQLite3::Database.open('ahhh.sqlite3')
db.execute "INSERT INTO aahah (uname, cname, duration, qualification, url, entry) VALUES ('#{#uni_name}', #{#course_name}', '#{#course_duration}', '#{#course_qual}', '#{#details_entry}', '#{#requirements}')"
You are inserting the values via string interpolation:
db.execute("INSERT INTO table_name (foo, bar) VALUES ('#{#foo}', '#{#bar}')")
If an interpolated string contains an apostrophe, the resulting SQL string becomes invalid. Even worse, this code is prone to SQL injection.
Instead you should use parameter markers and let the SQLite gem handle the escaping:
db.execute("INSERT INTO table_name (foo, bar) VALUES (?, ?)", [#foo, #bar])
This allows you to safely insert apostrophes and other special characters.
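Applied to the INSERT from the question (a sketch reusing the table and column names shown above):
db.execute(
  "INSERT INTO aahah (uname, cname, duration, qualification, url, entry) VALUES (?, ?, ?, ?, ?, ?)",
  [@uni_name, @course_name, @course_duration, @course_qual, @details_entry, @requirements]
)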
Right now I am in the middle of migrating from SQLite to PostgreSQL and I came across this problem. The following prepared statement works with SQLite:
id = 5
st = ActiveRecord::Base.connection.raw_connection.prepare("DELETE FROM my_table WHERE id = ?")
st.execute(id)
st.close
Unfortunately it does not work with PostgreSQL - it throws an exception at line 2.
I was looking for solutions and came across this:
id = 5
require 'pg'
conn = PG::Connection.open(:dbname => 'my_db_development')
conn.prepare('statement1', 'DELETE FROM my_table WHERE id = $1')
conn.exec_prepared('statement1', [ id ])
This one fails at line 3. When I print the exception like this
rescue => ex
ex contains this
{"connection":{}}
Executing the SQL in a command line works. Any idea what I am doing wrong?
Thanks in advance!
If you want to use prepare like that then you'll need to make a couple of changes:
The PostgreSQL driver wants to see numbered placeholders ($1, $2, ...) not question marks and you need to give your prepared statement a name:
ActiveRecord::Base.connection.raw_connection.prepare('some_name', "DELETE FROM my_table WHERE id = $1")
The calling sequence is prepare followed by exec_prepared:
connection = ActiveRecord::Base.connection.raw_connection
connection.prepare('some_name', "DELETE FROM my_table WHERE id = $1")
st = connection.exec_prepared('some_name', [ id ])
The above approach works for me with ActiveRecord and PostgreSQL; your PG::Connection.open version should also work if you're connecting properly.
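For completeness, exec_prepared returns a PG::Result, so the st from the snippet above can tell you how many rows the DELETE affected (a small usage note, not from the original answer):
st.cmd_tuples # => number of rows deleted by the prepared statement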
Another way is to do the quoting yourself:
conn = ActiveRecord::Base.connection
conn.execute(%Q{
delete from my_table
where id = #{conn.quote(id)}
})
That's the sort of thing that ActiveRecord is usually doing behind your back.
Directly interacting with the database tends to be a bit of a mess with Rails since the Rails people don't think you should ever do it.
If you really are just trying to delete a row without interference, you could use delete:
delete()
[...]
The row is simply removed with an SQL DELETE statement on the record’s primary key, and no callbacks are executed.
So you can just say this:
MyTable.delete(id)
and you'll send a simple delete from my_tables where id = ... into the database.
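If you need to delete by a condition other than the primary key, the relation-based delete_all is the equivalent (it also skips callbacks); a sketch, assuming the same model:
MyTable.where(id: id).delete_all # issues a single DELETE ... WHERE, no callbacks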
Hi, I'm new to Rails and developing an application that pulls results from a database in preparation for charting. I have the following code in my controller:
@statistic = OutstandingWorkIndex.find_by_sql ["SELECT Result_Set.Set_Code, Request.Specimen_Number,
  DATEDIFF('hh', Result_Set.Date_Time_Booked_In, current_timestamp) as HrsIn FROM iLabTP.Outstanding_Work_Index, iLabTP.Result_Set Result_Set, iLabTP.Request
  WHERE Outstanding_Work_Index.Request_Row_ID = Result_Set.Request_Row_ID and Outstanding_Work_Index.Request_Row_ID = Request.Request_Row_ID and Result_Set.Set_code=?
  order by Result_Set.Date_Time_Booked_In DESC", params[:set_code].upcase]
What I'd like to do is count the number of records returned, in addition to the object above, which I then use to create an XML stream of paired values or feed the Google Charts JavaScript API in the view.
Do I need to issue commands like:
@statistic = OutstandingWorkIndex.find_by_sql(["SELECT Result_Set.Set_Code, Request.Specimen_Number,
  DATEDIFF('hh', Result_Set.Date_Time_Booked_In, current_timestamp) as HrsIn
  FROM iLabTP.Outstanding_Work_Index, iLabTP.Result_Set Result_Set, iLabTP.Request
  WHERE Outstanding_Work_Index.Request_Row_ID = Result_Set.Request_Row_ID and Outstanding_Work_Index.Request_Row_ID = Request.Request_Row_ID and Result_Set.Set_code=?
  order by Result_Set.Date_Time_Booked_In DESC", params[:set_code].upcase]).count
And if so, does this result in the query being reissued?
Thanks
You should do:
@size = @statistic.size
It's well explained here.
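Note that find_by_sql returns a plain Array, not a lazy relation, so calling size just counts the records already loaded in memory and does not reissue the query:
@size = @statistic.size # Array#size; no additional SQL is sent to the database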