I have a performance issue in my application and would like to gather some ideas on how to improve it. The application is very simple: I need to add up values inside a nested table to get the total a user wants to pay out of all their pending payments. The user chooses a number of payments and I calculate how much they will pay.
This is what I have:
jsonstr = [[{ "name": "John",
  "surname": "Doe",
  "pending_payments": [
    {
      "month": "january",
      "amount": 50,
    },
    {
      "month": "february",
      "amount": 40,
    },
    {
      "month": "march",
      "amount": 45,
    },
  ]
}]]
local lunajson = require 'lunajson'
local t = lunajson.decode(jsonstr)
local limit -- I get this from the user
local total = 0
for i = 1, limit do
  total = total + t.pending_payments[i].amount
end
It works, and at the end I get what I need. However, I notice that it takes ages to do the calculation. Each JSON has only twelve pending payments (one per month), yet it takes between two and three seconds to come up with a result! I tried on different machines and with Lua 5.1, 5.2, and 5.3, and the result is the same.
Can anyone please suggest how I can implement this better?
Thank you!
For this simple string, try the test code below, which extracts the amounts directly from the string without a JSON parser:
jsonstr = [[{ "name": "John",
  "surname": "Doe",
  "pending_payments": [
    {
      "month": "january",
      "amount": 50,
    },
    {
      "month": "february",
      "amount": 40,
    },
    {
      "month": "march",
      "amount": 45,
    },
  ]
}]]
for limit = 0, 4 do
  local total = 0
  local n = 0
  for a in jsonstr:gmatch('"amount":%s*(%d+),') do
    n = n + 1
    if n > limit then break end
    total = total + tonumber(a)
  end
  print(limit, total)
end
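To pin down where the time actually goes, here is a minimal timing sketch using os.clock, reusing jsonstr, lunajson, and limit from the question (and assuming the decode succeeds as described there):
-- Time the decode and the summing loop separately, so a slow result can be
-- traced to the right step (or to something outside this code, such as how
-- `limit` is retrieved).
local lunajson = require 'lunajson'
local t0 = os.clock()
local t = lunajson.decode(jsonstr)
local t1 = os.clock()
local total = 0
for i = 1, limit do
  total = total + t.pending_payments[i].amount
end
local t2 = os.clock()
print(string.format("decode: %.6f s, loop: %.6f s", t1 - t0, t2 - t1))
With only twelve entries, both numbers should be far below a millisecond; if they are, the delay is coming from somewhere else.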
I found the delay had nothing to do with the calculation in Lua. It was related to a configurable delay in the retrieval of the limit variable.
I have nothing to share here related to the question asked, since the problem was actually in an external element.
Thanks @lfh for your replies.
I have two arrays, one two-dimensional and the other one-dimensional:
array1 = [["San Francisco", 8], ["New York", 3], ["Madison", 2], ["Washington", 3], ["Tulsa", 3]]
array2 = ["Durham", "Rochester", "New York", "Tulsa", "Kenner", "Washington", "Linton", "Kansas City", "San Francisco", "Madison"]
I want to compare the arrays, checking each city name for existence in both, and show the total users as given in the first array (the second element of each pair), or 0 if the city is not in the first array.
The output should be like this:
Durham (0)
Rochester (0)
New York (3)
Tulsa (3)
Kenner (0)
Washington (3)
...
How can I achieve this in Rails?
EDIT:
Actually, I have tried array1 - array2 to get the differences and then adding those differences to array1 with a second value of 0, but this didn't work for me.
Thanks in advance.
array1 is a perfect candidate to be converted to a Hash.
h = Hash[array1]
array2.each { |city| puts "%s (%d)" % [city, h[city] || 0] }
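For reference, Hash[array1] turns the pairs into a city-to-count lookup, and || 0 supplies the default for cities that are missing:
h = Hash[array1]
# => {"San Francisco"=>8, "New York"=>3, "Madison"=>2, "Washington"=>3, "Tulsa"=>3}
h["New York"] || 0 # => 3
h["Durham"] || 0   # => 0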
Another option: convert array1 to a Hash with a default value of 0 (Array#assoc would also work; see the sketch after the code):
array1 = [["San Francisco", 8], ["New York", 3], ["Madison", 2], ["Washington", 3], ["Tulsa", 3]]
array2 = ["Durham", "Rochester", "New York", "Tulsa", "Kenner", "Washington", "Linton", "Kansas City", "San Francisco", "Madison"]
mapping = Hash[array1]
mapping.default = 0
array2.each do |city|
  puts "#{city} (#{mapping[city]})"
end
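If you would rather use Array#assoc directly, here is a minimal sketch; assoc returns the matching [city, count] pair, or nil when the city is absent:
array2.each do |city|
  pair = array1.assoc(city)
  puts "#{city} (#{pair ? pair[1] : 0})"
end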
I'd suggest looking at Array#uniq and intersection (&); a rough sketch of the intersection idea follows.
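For instance, here is how the intersection could be used (counts is just a lookup hash built from array1; uniq would only matter if array2 contained duplicates):
counts = Hash[array1]
in_both = array2 & counts.keys # cities present in both arrays
array2.each do |city|
  puts "#{city} (#{in_both.include?(city) ? counts[city] : 0})"
end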
This is my code to calculate word frequency:
word_arr= ["I", "received", "this", "in", "email", "and", "found", "it", "a", "good", "read", "to", "share......", "Yes,", "Dr", "M.", "Bakri", "Musa", "seems", "to", "know", "what", "is", "happening", "in", "Malaysia.", "Some", "of", "you", "may", "know.", "He", "is", "a", "Malay", "extra horny", "horny nor", "nor their", "their babes", "babes are", "are extra", "extra SEXY..", "SEXY.. .", ". .", ". .It's", ".It's because", "because their", "their CONDOMS", "CONDOMS are", "are Made", "Made In", "In China........;)", "China........;) &&"]
arr_stop_kwd=["a","and"]
frequencies = Hash.new(0)
word_arr.each do |word|
  unless arr_stop_kwd.include?(word.downcase) || word.match('&&')
    frequencies[word.downcase] += 1
  end
end
When I have 100k words it takes 9.03 seconds, which is too much time. Can I calculate this another way?
Thanks in advance.
Take a look at the Facets gem.
You can do something like this using the frequency method:
require 'facets'
frequencies = (word_arr-arr_stop_kwd).frequency
Note that the stop words can be subtracted from word_arr. Refer to the Array documentation.
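If pulling in a gem is not an option, plain Ruby gets the same counts; this sketch assumes Ruby 2.7+ for Array#tally and mirrors the stop-word and '&&' filtering from the question:
frequencies = word_arr
  .map(&:downcase)                                             # normalise case once
  .reject { |w| arr_stop_kwd.include?(w) || w.include?("&&") } # drop stop words and '&&' tokens
  .tally                                                       # count occurrences of each word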
I am using Nokogiri to grab data from a webpage. I was under the impression that the following would grab the data and return it as an array? Instead I am getting one big string, which is causing a few issues.
home_team = doc.css(".team-home.teams")
If I were to use
home_team = doc.css(".team-home.teams").text
I could understand the data being returned as a string. Am I looking at this the wrong way?
I have even tried
home_team = doc.css(".team-home.teams").map(&:text)
but that seems to be returning a string as well. If I were getting an array back in the console, it would be shown in array format, yes?
Could someone try this in their console:
require 'open-uri'
require 'nokogiri'
FIXTURE_URL = "http://www.bbc.co.uk/sport/football/premier-league/fixtures"
doc = Nokogiri::HTML(open(FIXTURE_URL))
home_team = doc.css(".team-home.teams").map(&:text)
#home_team = doc.css(".team-home.teams")
puts home_team
and confirm whether the output is a string in both cases, and what the difference between the two is? I'm slightly lost at the moment.
Thanks
You are getting an array. It's just that puts is calling to_s on it. Check this out:
require 'open-uri'
require 'nokogiri'
FIXTURE_URL = "http://www.bbc.co.uk/sport/football/premier-league/fixtures"
doc = Nokogiri::HTML(open(FIXTURE_URL))
home_team = doc.css(".team-home.teams").map(&:text)
# home_team = doc.css(".team-home.teams")
puts home_team.class
puts home_team.map(&:strip).inspect
#=> Array
#=> ["Everton", "Aston Villa", "Southampton", "Stoke", "Swansea", "Man Utd", "Sunderland", "Tottenham", "Chelsea", "Wigan", "Sunderland", "Arsenal", "Man City", "Swansea", "West Ham", "Wigan", "Everton", "Aston Villa", "Southampton", "Fulham", "Reading", "Chelsea", "Newcastle", "Norwich", "Stoke", "West Brom", "Liverpool", "Tottenham", "QPR", "Man Utd", "Newcastle", "Arsenal", "Aston Villa", "Everton", "Reading", "Southampton", "Stoke", "Chelsea", "Arsenal", "Fulham", "Norwich", "QPR", "Sunderland", "Swansea", "West Brom", "West Ham", "Tottenham", "Liverpool", "Man Utd", "Man City", "Aston Villa", "Chelsea", "Everton", "Southampton", "Stoke", "Wigan", "Newcastle", "Reading", "Arsenal", "Fulham", "Liverpool", "Man Utd", "Norwich", "QPR", "Sunderland", "Swansea", "Tottenham", "West Brom", "West Ham", "Arsenal", "Aston Villa", "Everton", "Fulham", "Man Utd", "Norwich", "QPR", "Reading", "Stoke", "Sunderland", "Chelsea", "Liverpool", "Man City", "Newcastle", "Southampton", "Swansea", "Tottenham", "West Brom", "West Ham", "Wigan"]
There's a lot of white space in the data. I get an array when I do this:
home_team = doc.css(".team-home.teams").map { |team| team.text.strip }
We need a Rails plugin for US states and cities. Can anyone point me to one?
Maybe this would help: http://github.com/bcardarella/decoder
Interestingly enough, the National Weather Service produces such a data source:
http://www.weather.gov/geodata/catalog/national/html/cities.htm
CityState gem: https://github.com/loureirorg/city-state
CS.states(:us)
# => {:AK=>"Alaska", :AL=>"Alabama", :AR=>"Arkansas", :AZ=>"Arizona", :CA=>"California", :CO=>"Colorado", :CT=>"Connecticut", :DC=>"District of Columbia", :DE=>"Delaware", :FL=>"Florida", :GA=>"Georgia", :HI=>"Hawaii", :IA=>"Iowa", :ID=>"Idaho", :IL=>"Illinois", :IN=>"Indiana", :KS=>"Kansas", :KY=>"Kentucky", :LA=>"Louisiana", :MA=>"Massachusetts", :MD=>"Maryland", :ME=>"Maine", :MI=>"Michigan", :MN=>"Minnesota", :MO=>"Missouri", :MS=>"Mississippi", :MT=>"Montana", :NC=>"North Carolina", :ND=>"North Dakota", :NE=>"Nebraska", :NH=>"New Hampshire", :NJ=>"New Jersey", :NM=>"New Mexico", :NV=>"Nevada", :NY=>"New York", :OH=>"Ohio", :OK=>"Oklahoma", :OR=>"Oregon", :PA=>"Pennsylvania", :RI=>"Rhode Island", :SC=>"South Carolina", :SD=>"South Dakota", :TN=>"Tennessee", :TX=>"Texas", :UT=>"Utah", :VA=>"Virginia", :VT=>"Vermont", :WA=>"Washington", :WI=>"Wisconsin", :WV=>"West Virginia", :WY=>"Wyoming"}
CS.cities(:ak, :us)
# => ["Adak", "Akhiok", "Akiachak", "Akiak", "Akutan", "Alakanuk", "Ambler", "Anchor Point", "Anchorage", "Angoon", "Atqasuk", "Barrow", "Bell Island Hot Springs", "Bethel", "Big Lake", "Buckland", "Chefornak", "Chevak", "Chicken", "Chugiak", "Coffman Cove", "Cooper Landing", "Copper Center", "Cordova", "Craig", "Deltana", "Dillingham", "Douglas", "Dutch Harbor", "Eagle River", "Eielson Air Force Base", "Fairbanks", "Fairbanks North Star Borough", "Fort Greely", "Fort Richardson", "Galena", "Girdwood", "Goodnews Bay", "Haines", "Homer", "Hooper Bay", "Juneau", "Kake", "Kaktovik", "Kalskag", "Kenai", "Ketchikan", "Kiana", "King Cove", "King Salmon", "Kipnuk", "Klawock", "Kodiak", "Kongiganak", "Kotlik", "Koyuk", "Kwethluk", "Levelock", "Manokotak", "May Creek", "Mekoryuk", "Metlakatla", "Mountain Village", "Nabesna", "Naknek", "Nazan Village", "Nenana", "New Stuyahok", "Nikiski", "Ninilchik", "Noatak", "Nome", "Nondalton", "Noorvik", "North Pole", "Northway", "Old Kotzebue", "Palmer", "Pedro Bay", "Petersburg", "Pilot Station", "Point Hope", "Point Lay", "Prudhoe Bay", "Russian Mission", "Sand Point", "Scammon Bay", "Selawik", "Seward", "Shungnak", "Sitka", "Skaguay", "Soldotna", "Stebbins", "Sterling", "Sutton", "Talkeetna", "Teller", "Thorne Bay", "Togiak", "Tok", "Toksook Bay", "Tuntutuliak", "Two Rivers", "Unalakleet", "Unalaska", "Valdez", "Wainwright", "Wasilla"]
It works with countries all over the world. It also uses the MaxMind database, so it's continuously updated (via the CS.update command).
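As a small usage sketch (per the description above, CS.update refreshes the underlying MaxMind data; the :ny state code here just follows the pattern of the :ak example):
CS.update                    # pull the latest MaxMind data
CS.cities(:ny, :us).first(5) # then query as usual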
I just took the data from the NWS and created a Rails plugin called geoinfo, hosted on GitHub. At this point it's still a quick hack, but it contains all the NWS data in the lib/db folder if you don't want to use it as a plugin. Hope this helps.