Not showing data with react.rb - ruby-on-rails

I'm just trying to use ReactRB with reactive-record.
I think the problem is in the render part. When I set param :user, type: User in my React component class, I can't see any data in my table. The User model is in the public folder, of course, as ReactRB requires.
Well, in the console I see that the server fetches nothing, but the right data is returned.
What am I missing? Thanks for the help!
The key to the answer is in this screenshot.
The details are that the data comes back from the server as a JSON blob. reactive-record decodes it, and counts on the fact that trying to JSON-parse a plain string raises an error. Opal 0.10 no longer raises StandardError in that case, so the whole thing just hangs.

Just thinking about this... there is a known problem in Opal (https://github.com/opal/opal/issues/1545) and this causes a problem in reactive-record. Please make sure that you are not using Opal 0.10.
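If you are not sure which version you are on, one way to stay on the older Opal is to pin it in your Gemfile (the constraint below is only illustrative, not an official recommendation):

# Gemfile: stay below Opal 0.10 until the issue above is resolved
gem 'opal', '< 0.10'

Then run bundle update opal and check Gemfile.lock to confirm which version is actually loaded.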

One thing to keep in mind is that reactive-record lazy loads records and attributes. So unless someplace in your render you access a particular record/attribute, that attribute will not show up on the client.
It's hard to tell more without a bit more of your code posted, but here is some help.
Let's say your component looks like this:
class Foo < React::Component::Base
  param :user, type: User

  def render
    "user.name = #{user.name}"
  end
end
and someplace either in a controller or in a layout you do this:
render_component '::Foo', {user: User.first}
You might try something very simple like this, just to get familiar with how things work.
What happens should be this: your view renders, and a placeholder for the first User is sent to the component. During rendering the component looks for that user's name attribute, which it does not yet have, so the attribute is queued up to be fetched from the server. Rendering completes, and eventually the data comes down from the server, the local model's data is updated, and components displaying that data are rerendered.
During prerendering all of the above happens internally on the server, and when the component has been rendered the final HTML is delivered along with all the model data that was used in rendering the component. So on first load, if all is working, you will not see any fetches from the server.
So if you try out the above small example, and then go into your javascript console you can say things like this:
Opal.User.$first()
and you will see the underlying model data structure returned (the above is just Ruby written out as JavaScript... Ruby methods get a $ prefix on the JS side)
You can then do this:
Opal.User.$first().$name()
And you can even do this (assuming there are at least 2 user models):
Opal.User.$find(2).$name()
You should get something like "DummyValue" back at first, then you will see a server fetch cycle in the console, and if you repeat the command you will get back the actual value!
This may not be the best forum for more details; if you need more help, drop by https://gitter.im/reactrb/chat.

Related

Is there a way to have a persistent object in memory I can read/write to anywhere in a rails app?

I'm working on a Rails-based web backend, and I've run into a bit of an issue. I'm building a crypto trading application, which relies on knowing the exact current price of many cryptos/stocks. To do this I seem to need a websocket that updates certain data, but I can't figure out how to store that data. I need to be able to write to it on every websocket update, as well as read from it when sending data out to the front end. Both of these actions seem too fast to rely on my database, so I'm wondering if there is a better option. My idea was to use a class with a class method that is set on server startup, then read from/write to that method when needed. The class looks something like this:
class CryptoSocket
  def self.start
    @@cryptos = {
      BTC: 0,
      ETH: 0,
      DOGE: 0
    }
  end

  def self.value(symbol)
    @@cryptos[symbol]
  end
end
Inside the start method a websocket gets opened, and on each message it writes the updated value of the coin to @@cryptos. I call CryptoSocket.start when the server boots up.
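For context, a sketch of what that wiring might look like (this uses the faye-websocket gem; the feed URL and message format are made up for illustration):

require 'faye/websocket'
require 'eventmachine'
require 'json'

class CryptoSocket
  def self.start
    @@cryptos = { BTC: 0, ETH: 0, DOGE: 0 }

    Thread.new do
      EM.run do
        # Hypothetical ticker endpoint; swap in the real exchange feed.
        ws = Faye::WebSocket::Client.new('wss://example-exchange.test/ticker')

        ws.on :message do |event|
          # Assumes messages like {"symbol": "BTC", "price": "64210.5"}
          data = JSON.parse(event.data)
          @@cryptos[data['symbol'].to_sym] = data['price'].to_f
        end
      end
    end
  end
end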
To get the value for a symbol I can just call CryptoSocket.value(symbol) anywhere in my app. This seemed to be working, but I've noticed it sometimes fails with NameError: uninitialized class variable @@cryptos in CryptoSocket.
It seems the issue is triggered by running reload! in the rails console, or by entering a binding.pry and then exiting. My guess is that some garbage collection is happening, but either way it's something I'd like to avoid.
Does anyone have a suggestion for a better way to set this class up? Does Rails have a better way to persist an object in memory? It's fine to lose it when the server shuts down, but I would like to keep access to it while the server stays up.

Rails: Serialization of custom class for flash messages

I can't seem to figure out how flash messages in RoR are serialized for the next page view. When I set a simple value on e.g. flash[:notice], all is well and it gets across to the next page view. When I try to set the value of flash[:notice] to an instance of a custom class, however, it serializes only the properties:
flash[:notice] = Info.notice("Content...", "Title")
... equates to ...
{"type"=>"notice", "content"=>"Content...", "title"=>"Title"}
... which has no knowledge of the class that was serialized. One solution I found was to call .to_yaml before doing the redirect, and then use YAML.load at the later step, but I don't find that viable.
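For illustration, that workaround looks roughly like this (Info.notice is the constructor from the question; its exact signature is assumed):

# Before the redirect: store a YAML dump instead of the object itself
flash[:notice] = Info.notice("Content...", "Title").to_yaml

# On the next request: rebuild the Info instance from the YAML string
info = YAML.load(flash[:notice])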
So my question is: how can I make sure that this object is serialized automatically in a way that lets it be properly deserialized at a later stage?
Rails: 4.2.5.1,
Ruby: 2.2.4p230
Thanks

Rails - how to cache data for server use, serving multiple users

I have a class method (placed in /app/lib/) which performs some heavy calculations and sub-HTTP requests until a result is received.
The result isn't too dynamic, and is requested by multiple users accessing a specific view in the app.
So I want to schedule a periodic run of the method (using cron and the Whenever gem), store the results somewhere on the server in JSON format and, on demand, read just the stored results into the view.
How can this be achieved? What would be the correct way of doing it?
What I currently have:
def heavyMethod
  response = {}
  # some calculations, eventually building the response
  File.open(File.expand_path('../../../tmp/cache/tests_queue.json', __FILE__), "w") do |f|
    f.write(response.to_json)
  end
end
and also a corresponding method that reads this file.
I searched but couldn't find an example of achieving this with the Rails cache conventions (rather than private code that I wrote) for data that isn't related to ActiveRecord.
Thanks!
Your solution should work fine, but using Rails.cache would be cleaner and a bit faster. The Rails guides provide enough information about Rails.cache and how to get it working with memcached; let me summarize how I would use it in your case.
Heavy method
def heavyMethod
  response = {}
  # some calculations, eventually building the response
  Rails.cache.write("heavy_method_response", response)
end
Request
response = Rails.cache.fetch("heavy_method_response")
The only problem here is that when your server starts for the first time the cache will be empty; the same happens if/when memcached restarts.
One advantage is that somewhere along the flow the data you pass in is marshalled into storage and then unmarshalled on the way out, meaning you can pass in complex data structures and don't need to serialize to JSON manually.
Edit: memcached will evict your item if it runs out of memory. That will be very rare, since it uses an LRU (I think) algorithm to expire things and I presume you will read this key often.
To prevent this:
- set expires_in larger than your cron period,
- change your fetch code to fall back to the heavy method if the fetch fails (like Rails.cache.fetch("heavy_method_response") { heavy_method }), and change heavy_method to just return the object (see the sketch below), or
- use something like Redis, which will not delete items.
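Putting those pieces together, a minimal sketch could look like this (the key name and the two-hour expiry are illustrative; heavy_method now just returns the response):

def heavy_method
  response = {}
  # some calculations, eventually building the response
  response
end

# run periodically by cron (via Whenever) to refresh the cache
def refresh_heavy_method_cache
  Rails.cache.write("heavy_method_response", heavy_method, expires_in: 2.hours)
end

# called from the controller/view; recomputes only if the cache entry is missing
def cached_heavy_method_response
  Rails.cache.fetch("heavy_method_response", expires_in: 2.hours) do
    heavy_method
  end
end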

Rails ActiveRecord callbacks

I'm having an issue with a date format. I have a time picker that has the date in a funky format (well, it's a nice format, actually, but not to the computer). I'm trying to have Chronic parse the date so that it can be saved properly.
At first, I was doing, in the create action of my controller:
params[:event][:start] = Chronic.parse(params[:event][:start])
but if and when validation fails, it sends the parsed value back to the view, and then my datetimepicker is all botched.
So, I thought... callback? In my model, I added:
private

def date_change
  self.start = Chronic.parse(self.start)
end
I tried before_save, before_validation, after_validation... but nothing seems to get that date formatted correctly.
As it stands, I keep getting ArgumentError in EventsController#create - Argument out of range. I assume that's because the database is expecting a properly formatted datetime object.
Any idea on how I can accomplish my goal, here, of not changing the params, but still being able to save a properly formatted object?
I'm guessing that the problem is occurring in the start= mutator method that ActiveRecord supplies. If you're doing things like this in your controller:
@event.update_attributes(params[:events])
@event = Event.create(params[:event])
# ...
then create and update_attributes should call start= internally. That should allow you to put the Chronic stuff in your own start=:
def start=(t)
  super(Chronic.parse(t))
end
You might need to adjust that for non-String ts; I'm not sure what Chronic.parse(Time.now), for example, would do. You could also call write_attribute(:start, Chronic.parse(t)) or self[:start] = Chronic.parse(t) if you didn't want to punt to super.
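If you do need to guard against non-String values, a minimal variant might look like this (just a sketch; Chronic's behaviour with Time objects hasn't been checked):

def start=(t)
  # Only hand strings to Chronic; pass Time/DateTime values straight through.
  t = Chronic.parse(t) if t.is_a?(String)
  super(t)
end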
Note that before_validation and similar handlers will be called too late to bypass whatever default string-to-timestamp conversion ActiveRecord is doing, but a mutator override should happen at the right time.
Alternatively, you could parse the time in the controller with something like this:
events = params[:events].dup
events[:start] = Chronic.parse(events[:start])
@event = Event.create(events)
Assumption is the mother of all mess ups :)
- Are you sure the callback is hit? Because if it were, and the error still occurred (as it did), wouldn't it still send the incorrect (because parsed) data back to the view? In case of doubt: log something to make sure it is hit.
- Are you sure which field causes the Argument out of range error?
In most cases bugs are so hard to find/fix because we assume we know the error, but we are looking at it in the wrong way.
Easy ways to test which attribute causes the error:
Open a rails console, build an object with the parameters, save it, and ask for the errors. Something like
e = Event.new(params[:event]) # copy params verbatim from your logfile
e.save
e.errors
and that will display which field causes the error.
Alternatively: use pry and add a binding.pry line just after the save, so you can inspect the errors.
Answer (assuming your assumption was correct)
I see two options to do what you want:
1. Use the after_validation callback, if you are sure the data will always be correct and correctly parsed by Chronic. That way, once validation passes, the field is converted and normally nothing can go wrong anymore, and the value is never sent to the browser again.
Note: if some other attribute is causing the error, this callback is never hit, of course, because the record does not pass validation.
2. Use a virtual attribute, e.g. start_str, which is the visual representation of your start, and convert it to start in a before_save (see the sketch below). It does not really matter that much here, because if validation fails you just show start_str and not the "real" start field.
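A minimal sketch of that second option, assuming start_str is what the datetimepicker posts (the attribute and callback names are illustrative):

class Event < ActiveRecord::Base
  # Virtual attribute the form/datetimepicker binds to instead of start
  attr_accessor :start_str

  before_save :parse_start_str

  private

  def parse_start_str
    self.start = Chronic.parse(start_str) if start_str.present?
  end
end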

Delayed Job object not properly deserialized

I'm having a hard time believing what I'm seeing, but it sure looks like DJ is failing to deserialize an object properly. I look at the DJ record in mongo and I see in the YAML that the object has its text field set, but when the code runs, the text field is not set. Here is some minimal repro code:
class Board
  include Mongoid::Document
  field :text, type: String

  def process_text_field
    raise "Text field is blank" unless self.text
    # Text field gets processed
  end
end

# in a controller
def start_doing_something_slow
  board = Board.find(params[:id])
  board.text = "Text field is set"
  board.save!
  raise "Text disappeared!" unless board.text
  board.delay.process_text_field
  render json: {:result => 'ok'}
end
I invoke the controller method with the browser, and check the DJ record directly in mongo. I see in the YAML that the Board object has the text field correctly set. But when it executes in DJ, it raises the Text field is blank exception.
Somehow it's not deserializing the object properly.
Well this took me about a week to figure out, so I'm posting it here to help others who fall into this trap. Turns out this is a known bug in delayed_job_mongoid. And it's had a simple fix listed right there in the bug report for 10 months.
The problem arises if you use the identity map in mongoid, which acts as an in-process caching layer to the database. For normal web requests, the cache gets cleared between each request, so your controller methods don't use stale versions of the objects. But delayed_job_mongoid doesn't clear the cache between jobs without this patch (which I just put together): https://github.com/collectiveidea/delayed_job_mongoid/pull/38
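Conceptually the fix just clears the identity map before each job runs. A sketch of the idea (not the literal patch; it assumes Mongoid 3's Mongoid::IdentityMap and delayed_job's plugin API):

class ClearIdentityMapPlugin < Delayed::Plugin
  callbacks do |lifecycle|
    lifecycle.before(:perform) do |worker, job|
      # Drop any documents cached by a previous job so this job reloads
      # fresh data from the database.
      Mongoid::IdentityMap.clear
    end
  end
end

Delayed::Worker.plugins << ClearIdentityMapPlugin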
The result is your delayed jobs are sometimes using old versions of the objects, depending on what ran before them, which creates truly bizarre, mysterious failures that are extremely difficult to track down until you understand what's happening.
