Rails - Understanding the Querying - ruby-on-rails

I am new to Rails and have been given some code that I do not understand (I am trying to understand it because I am tasked with using the same logic inside another program).
Here is the code:
ids = [1, 2, 3]
users = User.where(account_id: ids)
output = Worksheet.where(created_by: users)
                  .as_json(only: [:created_at, :id])
                  .group_by_week(week_start: :monday) { |w| w["created_at"] }
I am not sure if I am following along, but from what I understand, it seems like I am querying the users whose account_id is 1, 2 or 3, finding the worksheets created by those users, and grouping them by week. However, I do not really understand what the only: [:created_at, :id] does, but I checked through the columns, and there are columns created_at and id in the worksheet table. Also, I am totally lost about what the code below is doing:
{|w| w["created_at"]}
and finally, is it possible to let me know what the output of the program would be like? thanks all!

The as_json(only: [:created_at, :id]) part says "convert this result to JSON, but I only want those two columns" (see the as_json documentation).
The group_by_week(week_start: :monday) call takes a block, which is what the { |w| w["created_at"] } part is. It goes through each result from the previous operations, assigns each in turn to w, and uses w["created_at"] as the value that group_by_week groups on.
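As for what the output looks like: group_by_week comes from the groupdate gem, and with a block it returns a hash mapping each week's start date to the array of rows that fall in that week. A sketch in plain Ruby of roughly what that grouping does (the rows here are hypothetical, shaped like the hashes as_json produces):

```ruby
require "date"

# Hypothetical rows, shaped like the output of as_json(only: [:created_at, :id]).
rows = [
  { "id" => 1, "created_at" => Date.new(2023, 1, 2) },  # a Monday
  { "id" => 2, "created_at" => Date.new(2023, 1, 4) },  # same week
  { "id" => 3, "created_at" => Date.new(2023, 1, 9) },  # the next Monday
]

# Roughly what group_by_week(week_start: :monday) { |w| w["created_at"] } does:
# bucket each row under the Monday that starts its week.
by_week = rows.group_by do |w|
  date = w["created_at"]
  date - ((date.wday - 1) % 7)  # step back to the preceding Monday
end
# by_week has one key per week: 2023-01-02 (two rows) and 2023-01-09 (one row)
```

The real gem also handles time zones and fills in empty weeks depending on options, so treat this only as an illustration of the shape of the result.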

Related

Why does splitting a string return a (bad) value across multiple lines in my Rails API application, but not the console?

I have a GrapeSwaggerRails API application that takes two dates and a comma-delimited string of category IDs. It should query the database for Records with a created_at within the two given dates and a category_id that matches one of the IDs passed to it. I'm not having any trouble with the dates, so I'll skip that for now. But let's say I want Records with categories matching 8, 2, or 1. In the code, it looks like "8,2,1". In the URL, it gets appended as &categories=%228%2C2%2C1%22.
Anyway, I figured one decent way of getting this to do what I want would be to convert that string into an array of integers like this: categories = params[:categories].split(',').map(&:to_i)
But given "8,2,1", the output is this (ignore the comment):
0 # <-- ?????
2
1
Very strange. In the definition of the API, params[:categories] looks like this: "8,2,1". But params[:categories].split(',') becomes the following:
"8
2
1"
That's a bit odd, isn't it? Running the map method on that turns it into that nonsense higher up, converting the "8 to a 0 for reasons I'm hoping to find out here. I know I could probably come at this problem from a different angle and sidestep the issue, but I'd rather try to get to the root of what's going wrong, so I can learn something from it. For reference, here's what the Rails console does when I put (as far as I can tell) the same data into it:
>> "8,2,1".split(',')
#=> ["8", "2", "1"]
map then works as expected.
>> "8,2,1".split(',').map(&:to_i)
#=> [8, 2, 1]
So my question is twofold. What's going wrong with this split function? Why does it behave differently in the console?
Because params[:categories] is actually
'"8,2,1"' # <- the outer ''s are just for illustration of a string.
If you pass &categories=8%2C2%2C1 it should work as expected.
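A minimal reproduction in plain Ruby, showing why the first element became 0 (%22 decodes to a literal double quote, which ends up inside the string):

```ruby
# The string the app actually received still contains the quote characters:
quoted = '"8,2,1"'

parts = quoted.split(',')
# parts == ["\"8", "2", "1\""]  -- the quotes cling to the first and last elements

ids = parts.map(&:to_i)
# "\"8".to_i is 0 because the string starts with a non-digit character
# ids == [0, 2, 1]

# Without the encoded quotes (&categories=8%2C2%2C1), it works as in the console:
clean_ids = "8,2,1".split(',').map(&:to_i)
# clean_ids == [8, 2, 1]
```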

In Rails 3.2 & Rspec 2, how to manage a set of 50,000 pairs of input_string, expected_score pairs?

I am writing specs for a method that 'scores' a string of text according to a fairly complex set of rules involving a large set of keyword combinations.
My test set is 50,000 strings. my_method_being_tested("some test string") produces a score with 3 elements: [boolean, boolean, integer].
I have a tally of 50,000 inputs & expected outputs, something like this
test_set = [ {"test string one" => [true, false, 0] } , { "test string 2" => [false, false, 10] } , ... ]
What is the best way to store/manage a 50,000 element test set when using Rspec, so I can loop thru the array something like:
test_set.each do | a_set |
my_method_being_tested(a_set.key).should == a_set.value
end
There is no underlying ActiveRecord model for the method in my app, so I cannot simply store a fixture and load it into an ActiveRecord table (unless perhaps it makes sense to create an ActiveRecord-less model of some kind and load a fixture into that?).
StackOverflow isn't really set up for answering "what is best" questions, but your basic approach is fine and you just need to make sure that your data structure and access mechanisms match up. Given the code snippets you showed, I would suggest reading up on Ruby hashes and structs.
At a meta level you'd have:
test_set = my_test_setup
test_set.each do |pair|
expect(my_method_being_tested(my_key_accessor(pair))).to match_array(my_value_accessor(pair))
end
If you want to keep test_set as is, then you can change your test loop to be:
test_set.each do |pair|
expect(my_method_being_tested(pair.keys.first)).to match_array(pair.values.first)
end
If you want to keep your test loop as is, you can change your test_set to be:
TestPair = Struct.new(:key, :value)
test_set = [
TestPair.new("test string one", [true, false, 0]),
TestPair.new("test string 2", [false, false, 10]),
... ]
Note that I'm using the new "expect" syntax rather than the deprecated "should" syntax, but that's a separate issue.
UPDATE: As for storing key/value pairs in a file, there are myriad options as well. YAML, as you note in your comment is fine, and you can combine it with DBM, letting you do something like:
require 'yaml/dbm'
YAML::DBM.load('your_yaml_file.yml').each do |key, value|
expect(my_method_being_tested(key)).to match_array(value)
end
That, of course, assumes that you've stored your key/value pairs in the YAML+DBM file in the first place, which gets you back to creating some Ruby to represent the key/value pairs.
A set of 50k items isn't that big to keep in memory, but if you're really concerned about doing the reading and testing of each pair incrementally, you can always read a line at a time from a file. But that still begs the question of what to store in that line (e.g. JSON) and formatting it in the first place.
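A plain YAML file (without the DBM layer) works the same way. A minimal sketch, where my_method_being_tested is a dummy stand-in for the real scoring method and the two entries stand in for the 50,000-pair file:

```ruby
require "yaml"

# Two entries standing in for the 50,000-pair file.
yaml_data = <<~YAML
  "test string one": [true, false, 0]
  "test string 2": [false, false, 10]
YAML

# Dummy stand-in for the real scoring method under test.
def my_method_being_tested(str)
  str.include?("one") ? [true, false, 0] : [false, false, 10]
end

pairs = YAML.safe_load(yaml_data)

# Collect any pairs where the method's score disagrees with the expectation.
failures = pairs.reject do |key, expected|
  my_method_being_tested(key) == expected
end
# In a spec: pairs.each { |k, v| expect(my_method_being_tested(k)).to match_array(v) }
```

Because YAML maps load as ordinary hashes, the spec loop needs no key/value accessor helpers at all.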

How do you search through a hash mapped from an ActiveRecord result?

I have a hash variable that contains the results from an ActiveRecord search, which will be iterated through to display data. My code (transcribed) goes somewhere like this:
hvar=User.map{|x| {:name => x.name, :type => x.type, :section => x.section,
:result => (x.var1/x.var2).round(1)}}
The hash variable 'hvar' would then display the following through an .inspect :
[
{:name=>'Michael', :type=>7, :section=>1, :result=>4.1},
{:name=>'Seymour', :type=>4, :section=>1, :result=>3.9},
{:name=>'Walter', :type=>2, :section=>1, :result=>6.3},
{:name=>'Josephine', :type=>7, :section=>1, :result=>5.4},
{:name=>'Carla', :type=>7, :section=>0, :result=>5.4}
]
So far, so good.
Now, I wish to do a search through that resulting hash, e.g. all those records of type 7, and I'm not sure how to get to it. I found that you could do something like this:
mission=hvar.select{|k| k[:type] == 7}
But it gives me 0 results, which makes sense to me, as I think it is searching through the "first level" of the hash (i.e. 0, 1, 2, 3) instead of the subhashes within.
How could I find all those records with type 7? And to that effect, how could I also do a search on two fields? Say, type == 7 and section == 1.
In case you're wondering, I'm not doing the search from the ActiveRecord itself, cause I have to iterate through every single record and arrange them in a pivoted table that merges this data with another table. So, to make it more efficient I figured to use a hash instead of iterating through the ActiveRecord, which currently it's spitting somewhere around 1700 SQL queries.
Thanks in advance.
Your code is written correctly. The only thing that can really be answered is
how could I also do a search on two fields? Say, type == 7 and section == 1.
The same way you're doing so now, plus an additional condition:
mission = hvar.select { |k| k[:type] == 7 && k[:section] == 1}
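Run against the sample data from the question, both filters return the expected records, which is why the answer says the code is already correct (if the live data returns nothing, one guess is that the real :type values are strings rather than integers, so 7 never matches "7"):

```ruby
# The sample data from the question's .inspect output.
hvar = [
  { name: "Michael",   type: 7, section: 1, result: 4.1 },
  { name: "Seymour",   type: 4, section: 1, result: 3.9 },
  { name: "Walter",    type: 2, section: 1, result: 6.3 },
  { name: "Josephine", type: 7, section: 1, result: 5.4 },
  { name: "Carla",     type: 7, section: 0, result: 5.4 },
]

# All records of type 7: Michael, Josephine and Carla.
type_sevens = hvar.select { |k| k[:type] == 7 }

# Both conditions at once: Michael and Josephine.
mission = hvar.select { |k| k[:type] == 7 && k[:section] == 1 }
```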

Thinking Sphinx can't sort when using delta indexing

I'm not sure who is at fault here, but we have a column in our users table called last_logged_in_at that we use for sorting. This is in a Rails 2.3 project using Thinking Sphinx with delta indexes enabled.
When a record has delta set to true, it is pushed to the bottom even when sorting by last_logged_in_at should put it at the top.
I tried with last_logged_in_at being a datetime, a timestamp and even an integer and the behavior is always the same.
Any ideas why?
The query looks something like:
{:populate=>true,
:match_mode=>:boolean,
:order=>"last_logged_in_at DESC, updated_at DESC",
:per_page=>20,
:with_all=>{:role_id=>17,
:state=>"activated",
:mandator_id=>9,
:profile_active=>true},
:page=>nil}
Sorry, life's crazy busy, hence slow reply.
You're filtering on a string - which Sphinx doesn't currently allow. There are ways around this, though.
Also: You're using :with_all, but :with behaves in exactly the same way in your situation. :with_all is useful when you want to match multiple values on a single attribute. For example, this query will match results where articles have any of the given tag ids:
Article.search :with => {:tag_ids => [1, 2, 3]}
But this next query matches articles with all of the given tag ids:
Article.search :with_all => {:tag_ids => [1, 2, 3]}
I realise neither of these points are directly related to your issue - however, it's best to get the query valid first, and then double-check whether the behaviour is correct or not.
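As for the string-filtering problem itself: one common workaround in that era of Thinking Sphinx (v1/v2) was to index a CRC of the string as an integer attribute and filter on that. This is a sketch from memory, so check it against the docs for your Thinking Sphinx version (the attribute name :state_crc and the SQL function availability are assumptions; CRC32 exists in MySQL):

```ruby
# In the model's define_index (Thinking Sphinx v1/v2 syntax):
define_index do
  # ... existing indexes/attributes ...
  has "CRC32(users.state)", :as => :state_crc, :type => :integer
end

# Then filter on the integer attribute instead of the string:
require 'zlib'
User.search :with => { :state_crc => Zlib.crc32('activated') }
```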

Subquery in Rails report generation

I'm building a report in a Ruby on Rails application and I'm struggling to understand how to use a subquery.
Each 'Survey' has_many 'SurveyResponses' and it is simple enough to retrieve these however I need to group them according to one of the fields, 'jobcode', as I only want to report the information relating to a single jobcode in one line in the report.
However I also need to know the constituent data that makes up the totals for that jobcode. The reason for this is that I need to calculate data such as medians and standard deviations and so need to know the values that make the total.
My thinking is that I retrieve the distinct jobcodes that were reported on for the survey and then as I loop through these I retrieve the individual responses for each jobcode.
Is this the correct way to do this or should I follow a different method?
You could use a named scope to simplify getting the groups of responses:
named_scope :job_group, lambda{|job_code| {:conditions => ["job_code = ?", job_code]}}
Put that in your response model, and use it like this:
job.responses.job_group('some job code')
and you'll get an array of responses. If you're looking to get the mean of the values of one of the attributes on the responses, you can use map:
r = job.responses.job_group('some job code')
r.map(&:total)
=> [1, 5, 3, 8]
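Since the report needs medians and standard deviations, those can be computed from the mapped values in plain Ruby (population standard deviation shown; divide by n - 1 instead for the sample version):

```ruby
values = [1, 5, 3, 8]  # e.g. the result of r.map(&:total)

mean = values.sum.to_f / values.size              # 4.25

# Median: middle element, or the mean of the two middle elements.
sorted = values.sort
mid = sorted.size / 2
median = sorted.size.odd? ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2.0
# median == 4.0

# Population variance and standard deviation.
variance = values.sum { |v| (v - mean)**2 } / values.size
std_dev  = Math.sqrt(variance)
```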
Alternatively, you might find it quicker to write custom SQL in order to get the mean / median / sum of groups of attributes. Going through Rails for this sort of work may cause significant lag.
ActiveRecord::Base.connection.execute("Custom SQL here")
You can also use Model.find_by_sql()
For example:
class User < ActiveRecord::Base
  # Your usual AR model
end
...
def index
  @users = User.find_by_sql "select * from users"
  # etc
end