ENV:
Rails 3.2.15
ruby 1.9.3p194 (2012-04-20 revision 35410) [x86_64-darwin12.2.0]
I set a cookie in a controller and read it from a helper method, and the two results are not the same. Why?
# in controller
cookies[:"position"] = { :value => [ 100,200 ], :expires => 1.years.from_now }
# When I read it back in the same request, the value is an array: [100, 200]
# But when I read this cookie in another request, it comes back as "100&200"
# in helper
module WelcomeHelper
  def get_position
    cookies[:"position"]
  end
end
The get_position method returns "100&200".
Where can I find some documentation? In the code it is described that an array can be stored in a cookie directly and read back directly: https://github.com/rails/rails/blob/v3.2.15/actionpack/lib/action_dispatch/middleware/cookies.rb#L45, so why does the array I stored in the cookie come back as a string?
If you want to use non-string values for your cookies, please convert them to JSON first.
# using the preferred 1.9 hash syntax and no need to quote "position"
cookies[:position] = { value: JSON.generate([ 100,200 ]), expires: 1.years.from_now }
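To read the value back in a later request, parse it again; a minimal sketch, reusing the helper from the question (names unchanged):
module WelcomeHelper
  def get_position
    # Parse the JSON string stored in the cookie back into a Ruby array
    JSON.parse(cookies[:position]) if cookies[:position]
  end
end
With that, get_position returns [100, 200] instead of "100&200".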
Check the Rails 4.1 documentation on Cookies.
The Rails 3.2 documentation caused some confusion on that issue by implying that a Ruby array can be stored directly as a cookie value; this was fixed later by using JSON dump/load and subsequently by the safer JSON generate/parse.
Related
Perhaps my understanding of how this is supposed to work is wrong, but I'm seeing strings stored in my DB when I would expect them to be a jsonb array. Here is how I have things set up:
Migration
t.jsonb :variables, array: true
Model
attribute :variables, :variable, array: true
Custom ActiveRecord::Type
ActiveRecord::Type.register(:variable, Variable::Type)
Custom Variable Type
class Variable::Type < ActiveRecord::Type::Json
  include ActiveModel::Type::Helpers::Mutable

  # Type casts a value from user input (e.g. from a setter). This value may be
  # a string from the form builder, or a ruby object passed to a setter. There
  # is currently no way to differentiate between which source it came from.
  # - value: The raw input, as provided to the attribute setter.
  def cast(value)
    unless value.nil?
      value = Variable.new(value) unless value.kind_of?(Variable)
      value
    end
  end

  # Converts a value from database input to the appropriate ruby type. The
  # return value of this method will be returned from
  # ActiveRecord::AttributeMethods::Read#read_attribute. The default
  # implementation just calls #cast.
  # - value: The raw input, as provided from the database.
  def deserialize(value)
    unless value.nil?
      value = super if value.kind_of?(String)
      value = Variable.new(value) if value.kind_of?(Hash)
      value
    end
  end
end
So this approach does work from the application's perspective: I can set the value as variables = [Variable.new, Variable.new], it is stored in the DB correctly, and it is retrieved back as an array of [Variable, Variable].
What concerns me, and is the root of this question, is that in the database the variables are stored as double-escaped strings rather than as json objects:
{
"{\"token\": \"a\", \"value\": 1, \"default_value\": 1}",
"{\"token\": \"b\", \"value\": 2, \"default_value\": 2}"
}
I would expect them to be stored something more resembling a json object like this:
{
{"token": "a", "value": 1, "default_value": 1},
{"token": "b", "value": 2, "default_value": 2}
}
The reason this matters is that, from my understanding, future querying on this column directly in the DB will be faster/easier if the values are in a json format rather than a string format. Querying through Rails would remain unaffected.
How can I get my Postgres DB to store the array of jsonb properly through rails?
So it turns out that the Rails 5 attributes API is not perfect yet (and not well documented), and the Postgres array support was causing some problems, at least with the way I wanted to use it. The solution takes the same overall approach, but rather than telling Rails to use an array of my custom type, I use a custom type that is itself the array. Code speaks louder than words:
Migration
t.jsonb :variables, default: []
Model
attribute :variables, :variable_array, default: []
Custom ActiveRecord::Type
ActiveRecord::Type.register(:variable_array, VariableArrayType)
Custom Variable Type
class VariableArrayType < ActiveRecord::ConnectionAdapters::PostgreSQL::OID::Jsonb
  def deserialize(value)
    value = super # turns the raw json string into an array of hashes
    if value.kind_of?(Array)
      value.map { |h| Variable.new(h) } # turns the array of hashes into an array of Variables
    else
      value
    end
  end
end
And now, as expected, the db entry is no longer stored as a string, but rather as searchable/indexable jsonb. The whole reason for this song and dance is that I can set the variables attribute using plain old ruby objects...
template.variables = [Variable.new(token: "a", default_value: 1), Variable.new(token: "b", default_value: 2)]
...then have it serialized as its jsonb representation in the DB...
[
{"token": "a", "default_value": 1},
{"token": "b", "default_value": 2}
]
...but more importantly, automatically deserialized and rehydrated back into the plain old ruby object, ready for me to interact with it.
Template.find(123).variables # => [#<Variable:0x87654321 token: "a", default_value: 1>, #<Variable:0x12345678 token: "b", default_value: 2>]
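And since the column now holds real jsonb, it can be queried directly with Postgres's containment operator; a minimal sketch, reusing the Template model and the token key from the example above:
# Find templates whose variables array contains an object with token "a"
Template.where("variables @> ?", [{ token: "a" }].to_json)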
Using the old serialize API causes a write with every save (intentionally, by Rails architectural design), regardless of whether or not the serialized attribute has changed. Doing this all manually by overriding setters/getters is an unnecessary complication, given the numerous ways attributes can be assigned, and is partly the reason for the newer attributes API.
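For reference, the legacy API being contrasted is roughly the following (the model name is reused from above for illustration; serialize was typically paired with a plain text column):
class Template < ApplicationRecord
  # The serialized column is rewritten on every save, changed or not.
  serialize :variables, JSON
end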
If it helps anyone else: if you're using strong params, Rails also wants you to list the permitted keys for the jsonb field in the controller:
def controller_params
  params.require(:parent_key)
        .permit(jsonb_field: [:allowed_key1, :allowed_key2, :allowed_key3])
end
One solution could be to just parse the variable via JSON.parse, push it into an empty array, and then assign it to the attribute.
variables = []
variable = "{\"token\": \"a\", \"value\": 1, \"default_value\": 1}"
variable.class #String
parsed_variable = JSON.parse(variable) #{"token"=>"a", "value"=>1, "default_value"=>1}
parsed_variable.class #Hash
variables.push parsed_variable
I'm having a problem with special characters when casting a hash to a JSON string.
Everything works fine with Ruby 2.0 / Rails 3.2.21, that is,
puts "“".to_json
#"\u201c"
But with Ruby 2.3.0 / Rails 4.2.5.1 I get
puts "“".to_json
#"“"
Is there any way to force Ruby 2.3.0 to convert special characters to unicode style strings (\uXXXX) ?
Remark:
Notice that in Ruby 2.3 / Rails 4, we get
"“".to_json.bytesize == 5 #true
However, in 2.0 we get
"“".to_json.bytesize == 8 #true
So clearly it's the string itself that is different, not different output formats.
I ❤ Rails (just kidding.)
In Rails 3 there was a hilarious method that damaged UTF-8 in JSON. Rails 4, thanks to DHH, is free of this drawback.
So, if one wants the time machine back, the simplest way is to monkeypatch ::ActiveSupport::JSON::Encoding.escape:
module ::ActiveSupport::JSON::Encoding
  def self.escape(string)
    if string.respond_to?(:force_encoding)
      string = string.encode(::Encoding::UTF_8, :undef => :replace)
                     .force_encoding(::Encoding::BINARY)
    end
    json = string.
      gsub(escape_regex) { |s| ESCAPED_CHARS[s] }.
      gsub(/([\xC0-\xDF][\x80-\xBF]|
             [\xE0-\xEF][\x80-\xBF]{2}|
             [\xF0-\xF7][\x80-\xBF]{3})+/nx) { |s|
        s.unpack("U*").pack("n*").unpack("H*")[0].gsub(/.{4}/n, '\\\\u\&')
      }
    json = %("#{json}")
    json.force_encoding(::Encoding::UTF_8) if json.respond_to?(:force_encoding)
    json
  end
end
A more robust solution would be to "corrupt" the result after the fact:
class String
  def rails3_style
    string = encode(::Encoding::UTF_8, :undef => :replace).
             force_encoding(::Encoding::BINARY)
    # The receiver is the output of #to_json here, so the surrounding quotes
    # and basic escapes are already in place; only the multibyte sequences
    # need to be converted back to \uXXXX escapes.
    json = string.
      gsub(/([\xC0-\xDF][\x80-\xBF]|
             [\xE0-\xEF][\x80-\xBF]{2}|
             [\xF0-\xF7][\x80-\xBF]{3})+/nx) { |s|
        s.unpack("U*").pack("n*").unpack("H*")[0].gsub(/.{4}/n, '\\\\u\&')
      }
    json.force_encoding(::Encoding::UTF_8) if json.respond_to?(:force_encoding)
    json
  end
end
puts "“".to_json.rails3_style
#⇒ "\u201c"
I can hardly understand why anybody would want to do this on purpose, but the solution is here.
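As a side note, the plain json gem can also produce ASCII-only output by itself via its ascii_only generator option, bypassing ActiveSupport's encoder entirely; a minimal sketch:
require 'json'

# Wrap the string in an array so the generator has a top-level JSON value
JSON.generate(["“"], ascii_only: true)
# => "[\"\\u201c\"]"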
The linked question, Rails find_or_create by more than one attribute?, shows how to use more than one attribute with ActiveRecord.
How can I use more than one attribute in Mongoid?
Thank you
If you look at the source in lib/mongoid/finders.rb:
# Find the first +Document+ given the conditions, or creates a new one
# with the conditions that were supplied.
...
# @param [ Hash ] attrs The attributes to check.
#
# @return [ Document ] A matching or newly created document.
def find_or_create_by(attrs = {}, &block)
  find_or(:create, attrs, &block)
end
you can see that find_or_create_by accepts a {} as the first argument. You can just pass in several conditions at once
something.find_or_create_by(name: 'john', age: 20)
and it should work.
From the mongoid docs on querying:
Model.find_or_create_by
Find a document by the provided attributes, and if not found create
and return a newly persisted one.
Christoffer,
I ran into a similar issue just recently and eventually figured it out after reading the source in the mongoid git repository:
In the Mongoid 3.1.0 stable branch, this works:
new_object = NewObject.find_or_create_by(indexed_attribute: my_unique_value,
                                          :attributeA => value,
                                          :attributeB => value)
In Python, I can do this:
>>> import urlparse, urllib
>>> q = urlparse.parse_qsl("a=b&a=c&d=e")
>>> urllib.urlencode(q)
'a=b&a=c&d=e'
In Ruby[+Rails] I can't figure out how to do the same thing without "rolling my own," which seems odd. The Rails way doesn't work for me -- it adds square brackets to the names of the query parameters, which the server on the other end may or may not support:
>> q = CGI.parse("a=b&a=c&d=e")
=> {"a"=>["b", "c"], "d"=>["e"]}
>> q.to_params
=> "a[]=b&a[]=c&d[]=e"
My use case is simply that I wish to muck with some of the values in the query-string portion of the URL. It seemed natural to lean on the standard library and/or Rails and write something like this:
uri = URI.parse("http://example.com/foo?a=b&a=c&d=e")
q = CGI.parse(uri.query)
q.delete("d")
q["a"] << "d"
uri.query = q.to_params # should be to_param or to_query instead?
puts Net::HTTP.get_response(uri)
but only if the resulting URI is in fact http://example.com/foo?a=b&a=c&a=d, and not http://example.com/foo?a[]=b&a[]=c&a[]=d. Is there a correct or better way to do this?
In modern Ruby this is simply:
require 'uri'
URI.encode_www_form(hash)
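And since CGI.parse returns a hash of arrays, its output can be fed straight back in to get repeated keys without square brackets, mirroring the Python round trip; for example:
require 'cgi'
require 'uri'

q = CGI.parse("a=b&a=c&d=e") # => {"a"=>["b", "c"], "d"=>["e"]}
URI.encode_www_form(q)       # => "a=b&a=c&d=e"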
Quick Hash to a URL Query Trick:
"http://www.example.com?" + { language: "ruby", status: "awesome" }.to_query
# => "http://www.example.com?language=ruby&status=awesome"
Want to do it in reverse? Use CGI.parse:
require 'cgi'
# Only needed for IRB, Rails already has this loaded
CGI::parse "language=ruby&status=awesome"
# => {"language"=>["ruby"], "status"=>["awesome"]}
Here's a quick function to turn your hash into query parameters:
require 'uri'
def hash_to_query(hash)
  # URI.encode is obsolete in modern Ruby; escape each key and value individually instead.
  hash.map { |k, v| "#{URI.encode_www_form_component(k)}=#{URI.encode_www_form_component(v)}" }.join("&")
end
The way Rails handles query strings of that type means you have to roll your own solution, as you have. It is somewhat unfortunate if you're dealing with non-Rails apps, but it makes sense if you're passing information to and from Rails apps.
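For illustration, the bracketed form is what lets Rack (and therefore Rails) rebuild the array on the receiving side; a small sketch with Rack::Utils:
require 'rack/utils'

Rack::Utils.parse_nested_query("a[]=b&a[]=c&d[]=e")
# => {"a"=>["b", "c"], "d"=>["e"]}

# Without the brackets, the last value wins:
Rack::Utils.parse_nested_query("a=b&a=c")
# => {"a"=>"c"}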
As a simple plain Ruby solution (or RubyMotion, in my case), just use this:
class Hash
  def to_param
    self.to_a.map { |x| "#{x[0]}=#{x[1]}" }.join("&")
  end
end
{ fruit: "Apple", vegetable: "Carrot" }.to_param # => "fruit=Apple&vegetable=Carrot"
It only handles simple hashes, though.
I'm looking for a simple way to parse JSON, extract a value and write it into a database in Rails.
Specifically what I'm looking for, is a way to extract shortUrl from the JSON returned from the bit.ly API:
{
"errorCode": 0,
"errorMessage": "",
"results":
{
"http://www.foo.com":
{
"hash": "e5TEd",
"shortKeywordUrl": "",
"shortUrl": "http://bit.ly/1a0p8G",
"userHash": "1a0p8G"
}
},
"statusCode": "OK"
}
And then take that shortUrl and write it into an ActiveRecord object associated with the long URL.
This is one of those things that I can think through entirely in concept and when I sit down to execute I realize I've got a lot to learn.
These answers are a bit dated. Therefore I give you:
hash = JSON.parse string
Rails should automagically load the json module for you, so you don't need to add require 'json'.
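Applied to the bit.ly response from the question, a minimal sketch of pulling out shortUrl (bitly_response_body is just a placeholder for the JSON string):
hash = JSON.parse(bitly_response_body)
short_url = hash["results"]["http://www.foo.com"]["shortUrl"]
# => "http://bit.ly/1a0p8G"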
Parsing JSON in Rails is quite straightforward:
parsed_json = ActiveSupport::JSON.decode(your_json_string)
Let's suppose the object you want to associate the shortUrl with is a Site object, which has two attributes: short_url and long_url. Then, to get the shortUrl and associate it with the appropriate Site object, you can do something like:
parsed_json["results"].each do |longUrl, convertedUrl|
site = Site.find_by_long_url(longUrl)
site.short_url = convertedUrl["shortUrl"]
site.save
end
This answer is quite old. pguardiario's got it.
One site to check out is JSON implementation for Ruby. This site offers a gem you can install for a much faster C extension variant.
With the benchmarks given on their documentation page, they claim that it is 21.500x faster than ActiveSupport::JSON.decode.
The code would be the same as Milan Novota's answer with this gem, but the parsing would just be:
parsed_json = JSON(your_json_string)
Here is an update for 2013.
Ruby
Ruby 1.9 has a default JSON gem with C extensions. You can use it with
require 'json'
JSON.parse '{ "x": "y" }'
# => {"x"=>"y"}
The parse! variant can be used for trusted sources. There are also other gems, which may be faster than the default implementation. Please refer to multi_json for the list.
Rails
Modern versions of Rails use multi_json, a gem that automatically uses the fastest JSON gem available. Thus, the recommended way is to use
object = ActiveSupport::JSON.decode json_string
Please refer to ActiveSupport::JSON for more information. In particular, the important line in the method source is
data = MultiJson.load(json, options)
Then in your Gemfile, include the gems you want to use. For example,
group :production do
gem 'oj'
end
This can be done as below; you just need JSON.parse, and then you can traverse the parsed result by its keys.
# Not strictly needed in Rails, but required if JSON.parse is not available in your context
require 'json'
# Assuming the data from the bit.ly API is stored in json_data here
json_data = '{
"errorCode": 0,
"errorMessage": "",
"results":
{
"http://www.foo.com":
{
"hash": "e5TEd",
"shortKeywordUrl": "",
"shortUrl": "http://whateverurl",
"userHash": "1a0p8G"
}
},
"statusCode": "OK"
}'
final_data = JSON.parse(json_data)
puts final_data["results"]["http://www.foo.com"]["shortUrl"]
Ruby's bundled JSON is capable of exhibiting a bit of magic on its own.
If you have a string containing JSON serialized data that you want to parse:
JSON[string_to_parse]
JSON will look at the parameter, see it's a String and try decoding it.
Similarly, if you have a hash or array you want serialized, use:
JSON[array_of_values]
Or:
JSON[hash_of_values]
And JSON will serialize it. You can also use the to_json method if you want to avoid the visual similarity of the [] method.
Here are some examples:
hash_of_values = {'foo' => 1, 'bar' => 2}
array_of_values = [hash_of_values]
JSON[hash_of_values]
# => "{\"foo\":1,\"bar\":2}"
JSON[array_of_values]
# => "[{\"foo\":1,\"bar\":2}]"
string_to_parse = array_of_values.to_json
JSON[string_to_parse]
# => [{"foo"=>1, "bar"=>2}]
If you root around in JSON you might notice it's a subset of YAML, and the YAML parser is actually what's handling JSON. You can do this too:
require 'yaml'
YAML.load(string_to_parse)
# => [{"foo"=>1, "bar"=>2}]
If your app is parsing both YAML and JSON, you can let YAML handle both flavors of serialized data.
require 'json'
out = JSON.parse(input)
This will return a Hash.
require 'json'
hash = JSON.parse string
Work with the hash and do what you want to do.
The Oj gem (https://github.com/ohler55/oj) should work. It's simple and fast.
http://www.ohler.com/oj/#Simple_JSON_Writing_and_Parsing_Example
require 'oj'
h = { 'one' => 1, 'array' => [ true, false ] }
json = Oj.dump(h)
# json =
# {
# "one":1,
# "array":[
# true,
# false
# ]
# }
h2 = Oj.load(json)
puts "Same? #{h == h2}"
# true
The Oj gem won't work for JRuby. For JRuby this (https://github.com/ralfstx/minimal-json) or this (https://github.com/clojure/data.json) may be good options.
Ruby is case-sensitive.
require 'json' # json must be lower case
JSON.parse(<json object>)
for example
JSON.parse(response.body) # JSON must be all upper-case
Here's what I would do:
json = "{\"errorCode\":0,\"errorMessage\":\"\",\"results\":{\"http://www.foo.com\":{\"hash\":\"e5TEd\",\"shortKeywordUrl\":\"\",\"shortUrl\":\"http://b.i.t.ly/1a0p8G\",\"userHash\":\"1a0p8G\"}},\"statusCode\":\"OK\"}"
hash = JSON.parse(json, symbolize_names: true) # symbolize_names gives symbol keys such as :results
results = hash[:results]
If you know the source url then you can use:
source_url = "http://www.foo.com".to_sym
results.fetch(source_url)[:shortUrl]
=> "http://b.i.t.ly/1a0p8G"
If you don't know the key for the source url you can do the following:
results.fetch(results.keys[0])[:shortUrl]
=> "http://b.i.t.ly/1a0p8G"
If you don't want to look up keys using symbols, you can convert the keys in the hash to strings (deep_stringify_keys comes from ActiveSupport):
results = hash[:results].deep_stringify_keys
results.fetch(results.keys[0])["shortUrl"]
=> "http://b.i.t.ly/1a0p8G"
If you're concerned the JSON structure might change you could build a simple JSON Schema and validate the JSON before attempting to access keys. This would provide a guard.
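A minimal sketch of that guard, assuming the json-schema gem (the schema below is illustrative, not a complete description of the bit.ly response):
require 'json-schema'

schema = {
  "type"       => "object",
  "required"   => ["results"],
  "properties" => { "results" => { "type" => "object" } }
}

# Accepts the raw JSON string from above; returns true or false
JSON::Validator.validate(schema, json)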
NOTE: Had to mangle the bit.ly url because of posting rules.
You can try something like this:
def details_to_json
  {
    :id => self.id,
    :credit_period_type => self.credit_period_type,
    :credit_payment_period => self.credit_payment_period,
  }.to_json
end