I have a Pixel model with the attributes media, page, disabled, and selective.
I am looking for the most reusable way to make an API call that selects all objects matching the parameters passed in. The call can take zero parameters or any number of them.
Here is what I have so far. I believe it works, but I'm sure there must be a better solution.
def pixels_by_params
  if params[:media] && params[:page] && params[:selective] && params[:disabled]
    pixels = TrackingPixel.media(params[:media]).page(params[:page]).selective(params[:selective]).disabled(params[:disabled])
  elsif params[:media] && params[:page] && params[:selective]
    pixels = TrackingPixel.media(params[:media]).page(params[:page]).selective(params[:selective])
  elsif params[:media] && params[:page]
    pixels = TrackingPixel.media(params[:media]).page(params[:page])
  elsif params[:media]
    pixels = TrackingPixel.media(params[:media])
  ...
  ...
  ...
end
You're chaining all of them anyway, so why don't you just...
pixels = params[:media] ? TrackingPixel.media(params[:media]) : TrackingPixel.all
pixels = pixels.page(params[:page]) if params[:page]
pixels = pixels.selective(params[:selective]) if params[:selective]
pixels = pixels.disabled(params[:disabled]) if params[:disabled]
Or if you prefer a fancy-fied loop, you could:
pixels = params[:media] ? TrackingPixel.media(params[:media]) : TrackingPixel.all
[:page, :selective, :disabled].each do |attr|
pixels = pixels.send(attr, params[attr]) if params[attr]
end
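Taking that loop one step further, all four keys (media included) can be folded into a single reduce. Here is a self-contained sketch of the pattern: the FilterChain class below is a hypothetical stand-in for an ActiveRecord scope chain, since the real TrackingPixel scopes from the question can't run outside Rails.

```ruby
# FilterChain mimics a chainable scope: each "scope" method records its
# name and argument and returns a new chain, like AR scope chaining does.
class FilterChain
  attr_reader :applied

  def initialize(applied = [])
    @applied = applied
  end

  [:media, :page, :selective, :disabled].each do |name|
    define_method(name) { |value| FilterChain.new(applied + [[name, value]]) }
  end
end

# Reduce over every filter key, applying a scope only when the param is set.
def pixels_by_params(params)
  [:media, :page, :selective, :disabled].reduce(FilterChain.new) do |scope, attr|
    params[attr] ? scope.send(attr, params[attr]) : scope
  end
end

chain = pixels_by_params(media: "tv", disabled: true)
chain.applied  # => [[:media, "tv"], [:disabled, true]]
```

In the real controller the starting value would be `TrackingPixel.all` instead of `FilterChain.new`, and the reduce body stays the same.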
I'm building a spam filter for a job app (think Tinder for jobs). One signal for users who are "over-applying" for jobs is their apply-to-rejection ratio per day.
To inform our threshold, I've come up with a solution to gather that data from the db using a nested hash, i.e. {user1 => {date1 => 0.33, date2 => 0.66}}. My problem is that the ratios are all 1.0. I think it's because I'm looping until either the rejections or the applications are all gone through, so the calculation is always the same number divided by itself.
Here's what I've got so far. Appreciate the help.
users = User.all
ratio_hash = Hash.new
users.each do |user|
if user.job_applications.count > 0 && user.job_rejections.count > 0
ratio_hash[user.name] = Hash.new
apply_array = []
reject_array = []
user.job_rejections.each do |reject|
user.job_applications.each do |apply|
if (apply.user_id.present? && reject.user_id.present?) || reject.user_id.present?
if (apply.user_id == user.id && reject.user_id == user.id) || reject.user_id == user.id
if (apply.created_at.present? && reject.created_at.present?) || reject.created_at.present?
date = (apply_array && reject.created_at.to_date)
if (apply.created_at.to_date == reject.created_at.to_date) || reject.created_at.to_date == date
apply_array << apply.created_at.to_date
reject_array << reject.created_at.to_date
ratio_hash[user.name][(apply.created_at.to_date || reject.created_at.to_date)] = (apply_array.length.round(2)/reject_array.length)
end
end
end
end
end
end
end
end
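For what it's worth, the 1.0 ratios come from pushing to apply_array and reject_array in lockstep inside the same inner loop, so the two lengths always match. Here is a sketch of one way to count each side per day independently and then divide; Event is a hypothetical stand-in for the real job_applications/job_rejections records, assuming only that they respond to created_at:

```ruby
require 'date'

# Stand-in for a record with a created_at timestamp.
Event = Struct.new(:created_at)

# Group each collection by calendar day on its own, then divide the counts,
# so a day with 2 applications and 1 rejection really yields 2.0.
def daily_ratios(applications, rejections)
  applies = applications.group_by { |a| a.created_at.to_date }
  rejects = rejections.group_by { |r| r.created_at.to_date }
  # Only days with at least one rejection give a finite ratio.
  rejects.each_with_object({}) do |(date, rejs), out|
    out[date] = ((applies[date] || []).length.to_f / rejs.length).round(2)
  end
end

applications = [Event.new(Date.new(2023, 1, 1)), Event.new(Date.new(2023, 1, 1)),
                Event.new(Date.new(2023, 1, 2))]
rejections   = [Event.new(Date.new(2023, 1, 1)),
                Event.new(Date.new(2023, 1, 2)), Event.new(Date.new(2023, 1, 2))]

daily_ratios(applications, rejections)
# 2023-01-01 => 2.0, 2023-01-02 => 0.5
```

With real models this would be one pass per association instead of the nested user.job_rejections/user.job_applications loops.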
I'm trying to implement simulated annealing in Ruby for a TSP I'm trying to solve (I converted this code from Java). However, it turns out the annealing is making my results worse! (playerPath gives me a path to anneal; I got the path from a greedy algorithm.) Can someone help me check the code and see if I've got something wrong, or is it just that simulated annealing doesn't always make things better?
#BEGIN ANNEALING
for i in 1..k
temp = 10000000
cooling = 0.003
if (playerPath[i].length > 2) # if path is larger than 2
bestPath = playerPath[i]
while temp > 1
newSolution = playerPath[i];
firstPosition = rand(newSolution.length)
secondPosition = rand(newSolution.length)
if(firstPosition == 0 || firstPosition == newSolution.length-1)
next
end
if(secondPosition == 0 || secondPosition == newSolution.length-1 )
next
end
# swap cities
tempStore = newSolution[firstPosition]
newSolution[firstPosition] = newSolution[secondPosition]
newSolution[secondPosition] = tempStore
# Tabulation
currentEnergy = calculate_distance(playerPath[i])
neighbourEnergy = calculate_distance(newSolution)
if(acceptanceProbability(currentEnergy,neighbourEnergy,temp) > rand)
playerPath[i] = newSolution
end
if(calculate_distance(playerPath[i])< calculate_distance(bestPath))
bestPath = playerPath[i];
end
temp *= (1-cooling);
end
end
end
#END ANNEALING
#acceptanceProbability
def acceptanceProbability(energy, newEnergy,temperature)
# If the new solution is better, accept it
if (newEnergy < energy)
return 1.0
end
# If the new solution is worse, calculate an acceptance probability
return Math.exp((energy - newEnergy) / temperature)
end
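One thing worth checking (an observation rather than a verified fix for this exact program): in Ruby, `newSolution = playerPath[i]` does not copy the array. Both names point at the same object, so the swap mutates playerPath[i] too, and currentEnergy and neighbourEnergy end up measuring the same already-swapped tour. A minimal demonstration of the aliasing:

```ruby
# Assigning an array does NOT copy it; both variables share one object.
path = [:a, :b, :c, :d]
alias_copy = path
alias_copy[1], alias_copy[2] = alias_copy[2], alias_copy[1]
path  # the "original" changed too: [:a, :c, :b, :d]

# .dup makes a shallow copy, so swapping in the copy leaves path intact.
path = [:a, :b, :c, :d]
real_copy = path.dup
real_copy[1], real_copy[2] = real_copy[2], real_copy[1]
path       # => [:a, :b, :c, :d]
real_copy  # => [:a, :c, :b, :d]
```

So in the loop above, `newSolution = playerPath[i].dup` should make the acceptance test genuinely compare two different tours; similarly, bestPath should be saved with `.dup` so later swaps can't corrupt it.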
I need to filter objects in an array.
It works with one parameter:
@usersc = @usersb.select { |user| user.need_appartment? }
but I would like to use more parameters, as in SQL/ActiveRecord:
(need_bedrooms_min >= :nb_bedrooms_min) AND (budget_amount BETWEEN :budget_min AND :budget_max) AND ((need_surface_min BETWEEN :surface_min AND :surface_max) OR (need_surface_max BETWEEN :surface_min AND :surface_max))"+req,{nb_bedrooms_min: params[:nb_bedrooms_min], budget_min: params[:budget_min], budget_max: params[:budget_max],surface_min: params[:surface_min], surface_max: params[:surface_max]}).paginate(:page => params[:page])
I can't find the solution... Can anyone help me?
F.
select does exactly what you need with as many parameters as you might want:
@usersb.select do |user|
  user.need_bedrooms_min >= params[:nb_bedrooms_min].to_i &&
    (params[:budget_min].to_i..params[:budget_max].to_i).include?(user.budget_amount) &&
    ((params[:surface_min].to_i..params[:surface_max].to_i).include?(user.need_surface_min) ||
     (params[:surface_min].to_i..params[:surface_max].to_i).include?(user.need_surface_max))
end
Or, more cleanly:
class User
def needs_apartment?(params)
budget_min, budget_max, surface_min, surface_max, nb_bedrooms_min =
%w{budget_min budget_max surface_min surface_max nb_bedrooms_min}.map{|k| params[k.to_sym].to_i}
budget_range = budget_min..budget_max
surface_range = surface_min..surface_max
need_bedrooms_min >= nb_bedrooms_min &&
  budget_range.include?(budget_amount) &&
  (surface_range.include?(need_surface_min) || surface_range.include?(need_surface_max))
end
end
@usersb.select { |user| user.needs_apartment?(params) }
I have an array of objects from AR.
I want to rarefy them, down to a limit.
The current method looks like:
def rarefied_values(limit = 200)
all_values = self.values.all
rarefied_values = []
chunk_size = (all_values.size / limit.to_f).ceil
if all_values.size > limit
all_values.each_slice(chunk_size) do |chunk|
rarefied_values.push(chunk.first)
end
return rarefied_values
else
return all_values
end
end
Any hints for refactoring?
def rarefied_values(limit = 200)
all_values = values.all
return all_values unless all_values.size > limit
chunk_size = all_values.size / limit
(0...limit).map{|i| all_values[i*chunk_size]}
end
Some general points on refactoring in Ruby:
self can usually be omitted. In a few cases it cannot, for example self.class. In this case, self.values.all => values.all.
If one of the conditional branches is much simpler than the others, place that simple case first and get rid of it with an early return. In this case, return all_values unless all_values.size > limit.
In general, when you need nested conditions, design them so that the simpler cases split off earlier and the complicated cases come toward the end.
Let the code be as lazy as possible. In this case, rarefied_values = [] is not necessary unless all_values.size > limit, so put it inside the conditional section.
Here's a naïve refactor, keeping your same methods, but removing the explicit return calls and only performing certain transformations if necessary:
def rarefied_values(limit = 200)
all_values = self.values.all
if all_values.size <= limit
all_values
else
chunk_size = (all_values.size / limit.to_f).ceil
[].tap{ |rare| all_values.each_slice(chunk_size){ |c| rare << c.first } }
end
end
Here's a faster, more terse version:
def rarefied_values(limit = 200)
all_values = self.values.all
if (size = all_values.size) <= limit
all_values
else
all_values.values_at(*0.step(size-1,(size.to_f/limit).ceil))
end
end
In Ruby/Rails 3, I need to do some heavy text parsing to find a certain string. Right now I'm doing something like the following:
extract_type1 = body.scan(/(stuff)/m).size
extract_type2 = body.scan(/(stuff)/m).size
extract_type3 = body.scan(/(stuff)/m).size
extract_type4 = body.scan(/(stuff)/m).size
extract_type5 = body.scan(/(stuff)/m).size
if extract_type1 > 0
elsif extract_type2 > 0
elsif extract_type3 > 0
elsif extract_type4 > 0
elsif extract_type5 > 0
The problem here is that I keep needing to add extract types as the app grows, and that results in a lot of wasted processing in the common case where extract_type1 > 0 and the rest aren't needed.
At the same time, it's nice and clean to keep the extract logic separated from the if block, which would otherwise be busy, messy, and hard to read.
Any thoughts on how to optimize this without compromising readability?
Thanks
What about storing all the "keywords" you are searching for in an array and iterating over it, like:
stuff = ["stuff1", "stuff2"]
stuff.each do |c_stuff|
if body.scan(/(#{Regexp.escape(c_stuff)})/m).size > 0
# do something
# break the loop
break
end
end
Edit: If you need the index of the element, you can use each_with_index do |c_stuff, c_index|
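Put together, the indexed version of that loop might look like this (the body text and keyword list here are made-up stand-ins for the app's real data):

```ruby
# Illustrative data only; the real keywords come from the app.
body  = "some text mentioning stuff2 somewhere"
stuff = ["stuff1", "stuff2"]

found_index = nil
stuff.each_with_index do |c_stuff, c_index|
  # Stop scanning as soon as the first keyword matches.
  if body.scan(/#{Regexp.escape(c_stuff)}/m).size > 0
    found_index = c_index
    break
  end
end

found_index  # => 1 (first match was "stuff2")
```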
Lazy evaluation might work for you; just convert your extract_X variables to lambdas so that the values are computed on use:
extract_type1 = lambda { body.scan(/(stuff)/m).size }
extract_type2 = lambda { body.scan(/(stuff)/m).size }
extract_type3 = lambda { body.scan(/(stuff)/m).size }
extract_type4 = lambda { body.scan(/(stuff)/m).size }
extract_type5 = lambda { body.scan(/(stuff)/m).size }
if extract_type1.call > 0
elsif extract_type2.call > 0
elsif extract_type3.call > 0
elsif extract_type4.call > 0
elsif extract_type5.call > 0
If you're using the extract_X values more than once then you can add memoization to the lambdas so that the values are computed on first access and then cached so that subsequent accesses would just use the value that was already computed.
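A sketch of that memoization, caching the scan result in a hash the first time the lambda runs (the body and pattern are stand-ins, and the calls counter exists only to show the scan runs once):

```ruby
body  = "stuff stuff other"
calls = 0
memo  = {}

extract_type1 = lambda do
  # ||= computes the scan on first call and reuses the cached count after
  # (0 is truthy in Ruby, so even a zero count is cached correctly).
  memo[:type1] ||= begin
    calls += 1  # counts how many times the scan actually executes
    body.scan(/(stuff)/m).size
  end
end

extract_type1.call  # => 2 (computed)
extract_type1.call  # => 2 (cached; the scan does not run again)
calls               # => 1
```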