I'm working on adding functionality to an existing Spree custom photo printing store to allow photographers to upload their portfolios and sell photos through the site. I created a select2 text field for adding keywords to products using Spree::Taxon(s), and it works fine. I have fields for adding keywords in each language that the site supports (English and French).
However, the ajax query takes an extremely long time to complete (5-15s on average). The ActiveRecord query takes between 5-150ms to complete, and the view rendering takes no more than 60ms to complete. I can't account for the rest of the load time. Does anyone have advice on speeding up returning the result or what could be behind the extra time it takes to complete?
Using MySQL for the database, Ruby 2.2.1 and Rails 4.2.1. My dev environment is: Mac Mini (8gb ram, HDD), Aptana Studio IDE, server running on localhost:3000.
Please don't hesitate to request more clarifying information! I wasn't sure exactly what I needed to post to help with the context of my issue.
Controller to return JSON for the ajax request:
class KeywordTagsController < Spree::StoreController
  respond_to :json

  def find_keywords
    term = params[:q]
    Rails.logger.debug params.inspect
    if params[:locale_str]
      query = []
      query = ["spree_taxons.name like ?", term + '%'] if term
      locale = params[:locale_str].to_sym
      keyword_taxonomy = Spree::Taxonomy.find_by(name: Spree.t("pi-taxonomy-keyword"))
      keyword_taxons = keyword_taxonomy.taxons.with_translations(locale).where(query).order('name asc').select { |t| t.parent_id != nil } if locale
      respond_with keyword_taxons if keyword_taxons
    end
  end
end
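As an aside, the Ruby-level `select { |t| t.parent_id != nil }` loads every matching taxon into memory before filtering. The same condition can usually be pushed into SQL; this is a sketch only, not verified against this schema, and assumes the `with_translations` scope composes with `where.not`:

```ruby
# Same filter expressed in SQL, so non-root taxons are excluded by the
# database instead of in Ruby (sketch; assumes the scopes compose cleanly):
keyword_taxons = keyword_taxonomy.taxons
                                 .with_translations(locale)
                                 .where(query)
                                 .where.not(parent_id: nil)
                                 .order('name asc')
```

`where.not` is available from Rails 4.0 onward, so it should apply to the Rails 4.2.1 setup described here.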
Select2 initializer JavaScript:
$("#keywords_en").select2({
  createSearchChoice: function (term, data) {
    if ($(data).filter(function () {
      return this.text.localeCompare(term) === 0;
    }).length === 0) {
      return {
        id: term,
        text: term
      };
    }
  },
  multiple: true,
  ajax: {
    url: '/keywords/en',
    dataType: 'json',
    data: function (params) {
      return {
        q: params // search term
      };
    },
    results: function (data) {
      return { results: $.map(data, function (keyword, i) {
        return { id: keyword.id, text: keyword.name };
      })};
    }
  },
  tags: true,
  tokenSeparators: [','],
  placeholder: '<%= Spree.t('pi-keywords-placeholder-en') %>',
  initSelection: function (element, callback) {
    var data = [];
    function splitVal(string, separator) {
      var val, i, l;
      if (string === null || string.length < 1) return [];
      val = string.split(separator);
      for (i = 0, l = val.length; i < l; i = i + 1) val[i] = $.trim(val[i]);
      return val;
    }
    $(splitVal(element.val(), ",")).each(function () {
      data.push({
        id: this,
        text: this
      });
    });
    callback(data);
  }
});
Some console output for the request (some of the faster examples):
15:00:51 INFO: Processing by KeywordTagsController#find_keywords as JSON
15:00:51 INFO: Parameters: {"q"=>"", "_"=>"1436986845195", "locale_str"=>"fr"}
15:00:54 INFO: Completed 200 OK in 2870ms (Views: 40.6ms | ActiveRecord: 5.2ms)
15:33:45 INFO: Started GET "/keywords/fr?q=mer&_=1436986845196" for 127.0.0.1 at 2015-07-15 15:33:45 -0400
15:33:48 INFO: Processing by KeywordTagsController#find_keywords as JSON
15:33:48 INFO: Parameters: {"q"=>"mer", "_"=>"1436986845196", "locale_str"=>"fr"}
15:33:50 INFO: Completed 200 OK in 2136ms (Views: 5.4ms | ActiveRecord: 113.4ms)
15:33:58 INFO: Started GET "/keywords/fr?q=&_=1436986845197" for 127.0.0.1 at 2015-07-15 15:33:58 -0400
15:33:58 INFO: Processing by KeywordTagsController#find_keywords as JSON
15:33:58 INFO: Parameters: {"q"=>"", "_"=>"1436986845197", "locale_str"=>"fr"}
15:34:00 INFO: Completed 200 OK in 1885ms (Views: 38.7ms | ActiveRecord: 4.6ms)
It turns out that fetching the query results was only slow because I was running in the development environment. In production it performs at the speed one would expect. I'm posting this answer in case others run into the same question!
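For anyone hitting the same thing: much of the development-mode overhead comes from Rails reloading classes and serving unbundled assets on every request. If you need to benchmark locally, you can temporarily make development behave more like production; the settings below are standard Rails 4 options, shown only as a sketch, and should be reverted after measuring:

```ruby
# config/environments/development.rb -- temporary, for benchmarking only
Rails.application.configure do
  config.cache_classes = true   # skip code reloading on every request
  config.eager_load   = true    # load the whole app up front
  config.assets.debug = false   # serve concatenated assets
end
```

With class caching on you lose live reloading, so this is only worth doing while profiling.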
I'm using Angular + TypeScript.
My request looks like:
getOrders(this.value.id, null, null).subscribe((s) => {
  this.ordersArray = s;
});
For some reason, null is converted to "null" on the server. What could be the reason for this?
Backend console:
Started GET "/admin/api/as_supplier/orders.json?start=null&end=null" for ::1 at 2022-04-19 11:01:32 +0300
Processing by Admin::Api::AsSupplier::OrdersController#index as JSON
Parameters: {"start"=>"null", "end"=>"null"}
Completed 500 Internal Server Error in 30ms (ActiveRecord: 0.0ms)
Or is the problem not on the frontend, but on the backend?
Update:
public getOrders(supplierId?: any, start?: any, end?: any): Observable<IOrders> {
  return this.http
    .get<IOrders>(`${environment.apiUrl}/admin/api/as_supplier/orders.json`, {
      params: {
        start,
        end
      }
    })
    .pipe(tap(response => this.orders = response));
}
You should do something like this:
let params = new HttpParams();
if (start !== null && start !== undefined) {
  params = params.set('start', start);
}
if (end !== null && end !== undefined) {
  params = params.set('end', end);
}
return this.http.get<IOrders>(`${environment.apiUrl}/admin/api/as_supplier/orders.json`, { params });
@Alex, I think the problem is on your server side; try to parse the params before you use them in the rest of your API code.
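If you do want a server-side guard, you could defensively normalize the literal "null"/"undefined" strings that JavaScript clients sometimes serialize into query params. A minimal plain-Ruby sketch (the helper name is mine, not from the original code):

```ruby
# Treat the JavaScript serialization artifacts "null" and "undefined"
# (and blank strings) as missing values before using a param in a query.
def normalize_param(value)
  v = value.to_s.strip
  %w[null undefined].include?(v) || v.empty? ? nil : value
end
```

In the controller you'd then use, for example, `start = normalize_param(params[:start])` before building the query.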
I'm building an Angular 2 front-end app that's getting data from a Rails 5 API. It's a kind of network inventory app.
I've got an Asset form in Angular 2, and there's a multi-select input for the ip_addresses of the asset.
I'm unable to get Rails to accept this data in the back-end.
The asset object:
{"name":"SERVER_A","serial":"XOR-354","location_id":1,"asset_type_id":1,"model_id":3,"ip_address_ids":[5304,5305]}
Here's my asset.service.ts:
createAsset(asset: Asset) {
  let formData = new FormData();
  for (var key in asset) {
    if (key == "ip_address_ids") {
      for (var i = 0; i < asset[key].length; i++) {
        formData.append("asset[" + key + "][]", JSON.stringify(asset[key][i]));
        console.log("asset[" + key + "][]", JSON.stringify(asset[key][i]));
      }
    }
    if (key != "id") {
      formData.append("asset[" + key + "]", asset[key]);
    }
  }
  let headers = new Headers();
  let authToken = localStorage.getItem('auth_token');
  headers.append('Authorization', authToken);
  return this.http.post(this.assetUrl, formData, { headers })
    .map((response: Response) => response.json());
}
This is what I'm getting in the Rails server console:
Started POST "/assets" for ::1 at 2017-02-06 13:58:33 +0100
Processing by AssetsController#create as HTML
Parameters: {"asset"=>{"name"=>"SERVER_A", "description"=>"undefined",
"serial"=>"XOR-354", "asset_tag"=>"undefined",
"firmware"=>"undefined", "ssh"=>"undefined", "telnet"=>"undefined",
"http"=>"undefined", "location_id"=>"1", "asset_type_id"=>"1",
"model_id"=>"3", "prtg_id"=>"undefined", "ticket_id"=>"undefined",
"ip_address_ids"=>"5304,5305"}}
Unpermitted parameter: ip_address_ids
I've permitted the ip_address_ids param:
def asset_params
  params.require(:asset).permit(:name, :description, :serial, :asset_tag, :firmware, :ssh, :telnet, :http, :location_id, :asset_type_id, :model_id, :prtg_id, :ticket_id, :ip_address_ids => [])
end
The strange thing is that if I use the Advanced REST Client in Chrome, it's successful.
Here's an image of the REST Client
The result in the Rails server console:
Started POST "/assets" for ::1 at 2017-02-06 14:04:42 +0100
Processing by AssetsController#create as HTML
Parameters: {"asset"=>{"name"=>"Test", "asset_type_id"=>"2", "location_id"=>"33", "model_id"=>"4", "ip_address_ids"=>["5213", "5214"]}}
I think that the problem is that Angular sends the IDs as a string and the REST Client sends the IDs as an Array of strings.
Any idea on how to fix this?
Your ip_address_ids param is being passed as a string rather than an array. Why not use formData = JSON.stringify({asset: asset}) instead of the custom processing?
If the case is that you want to pass it as an actual string, you should not permit it with => []; :ip_address_ids alone would be enough.
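If you can't change the frontend, another workaround (my own suggestion, not part of the answer above) is to coerce the comma-joined string back into an array on the Rails side before calling permit. The coercion itself is plain Ruby; the helper name is illustrative:

```ruby
# Turn the "5304,5305" string that Angular's FormData produced back into
# an array of id strings; a real array passes through untouched.
def to_id_array(value)
  return value if value.is_a?(Array)
  value.to_s.split(',').map(&:strip).reject(&:empty?)
end
```

In the controller you'd run something like `params[:asset][:ip_address_ids] = to_id_array(params[:asset][:ip_address_ids])` before `asset_params`, so the existing `:ip_address_ids => []` permit works for both clients.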
I started by incorporating ExtJS into my original ASP.NET application, which worked when hardcoding the data stores and binding them to the charts/grids. When I tried proxy URL calls, or even fetching the data from code-behind and wrapping it in JSON, I still could not get the data into the grid. So I gave up and went with ExtJS and Node.js, still using MongoDB; this worked perfectly, but I still have to learn to create a better UI using Express/Jade etc., which is a different project now. Then I came across using MVC with ExtJS, tried the same thing with a sample project (the sample had hardcoded data), and I cannot for the life of me get it to display the data.
Ext.require([
  'Ext.grid.*',
  'Ext.data.*',
  'Ext.util.*',
  'Ext.state.*'
]);

Ext.onReady(function () {
  Ext.QuickTips.init();

  // setup the state provider, all state information will be saved to a cookie
  Ext.state.Manager.setProvider(Ext.create('Ext.state.CookieProvider'));

  Ext.define('User', {
    extend: 'Ext.data.Model',
    fields: [
      { name: 'username', type: 'string' }
    ]
  });

  Ext.define('UserStore', {
    extend: 'Ext.data.Store',
    model: 'User',
    autoload: true,
    proxy: {
      type: 'ajax',
      url: '/dashboard.aspx/getDBData',
      reader: {
        type: 'json',
        root: 'users'
      },
      listeners: {
        exception: function (proxy, response, operation) {
          Ext.MessageBox.show({
            title: 'REMOTE EXCEPTION',
            msg: operation.getError(),
            icon: Ext.MessageBox.ERROR,
            buttons: Ext.Msg.OK
          });
        }
      }
    }
  });

  var myStore = Ext.getStore('UserStore');
The URL I am including here is the code-behind function I initially tried, which accesses MongoDB and returns a JSON result. It's not working.
Now, in the ExtJS Node.js application, I have results coming into localhost:3000/userlist, which returns a list from MongoDB and displays it as follows:
extends layout

block content
  h1.
    User List
  ul
    each user, i in userlist
      li
        a(href="mailto:#{user.email}")= user.username
Now, would it be possible to use the same server, call the base URL, and change the route.js file to return the MongoDB JSON result, or call MongoDB at localhost:27017 and get a result? I'm really confused here.
exports.index = function (db) {
  return function (req, res) {
    var collection = db.get('usercollection');
    collection.find({}, {}, function (e, docs) {
      res.render('userlist', {
        "userlist": docs
      });
    });
  };
};
EDIT:
The first thing I realized, from an ASP.NET perspective, was that I was not calling a web service, just a code-behind method. Any comments will still be appreciated.
EDIT 2:
{"connTime":null,"userName":"101591196589145","clientName":null,
"feedUrl":null,"dconnTime":null,"errMessage":null,"ip":null}
You have identified the root in your store's reader as 'users':
reader: {
  type: 'json',
  root: 'users'
},
But there is no such root in your returned JSON. It should look like:
{"users":[{"connTime":null,"userName":"101591196589145","clientName":null,
"feedUrl":null,"dconnTime":null,"errMessage":null,"ip":null}]}
So I have a .NET MVC project with an Update controller, called from an Ajax POST, that can take a long time to run, which causes a timeout exception.
When I debug it on my local machine it runs fine; however, when I publish it to my Azure website and run the update from there, the request never completes successfully and Chrome's console reports:
POST http://mysiteaddress/Admin/UpdateLibrary/Update?Length=13 504 (Proxy Timeout ( This operation returned because the timeout period expired. ))
Attempting the same operation on a remote desktop within Firefox causes the console to report:
[07:42:13.856] POST http://mysiteaddress/Admin/UpdateLibrary/Update?Length=13 **[HTTP/1.1 502 Bad Gateway 182940ms]**
I've tried setting a long timeout within my web.config file:
<httpRuntime executionTimeout="2000"/>
and within the body of my Ajax call:
$.ajax({
  url: this.action,
  type: 'POST',
  data: $(this).serialize(),
  success: function (data) {
    document.write(data);
  },
  error: function (XMLHttpRequest, textStatus, errorThrown) { // jQuery's option is 'error', not 'failure'
    console.log(XMLHttpRequest);
    console.log(textStatus);
    console.log(errorThrown);
  },
  timeout: 2000000 // milliseconds
});
But no joy.
So this is not really a fix but a workaround. Instead of making a single long request, I had my JavaScript repeatedly query an ActionResult that returned some JSON indicating whether my long-running process had finished. When it had completed, I redirected the browser to a results screen.
$.updateProgressbar = function () {
  $.get('#Url.Action("GetStatus", "UpdateLibrary", new { countryId = countryId }, Request.Url.Scheme)', function (data) {
    $('#progressbar').progressbar('value', data.progress);
    if (data.currentItem != null) {
      $('#currentWorkItem').text('#l12.View_Update_currentlyWorking' + data.currentItem);
    }
    if (data.progress == 100) {
      window.location =
        '#Url.Action("UpdateResults", "UpdateLibrary", new { countryId = countryId }, Request.Url.Scheme)';
    } else {
      setTimeout($.updateProgressbar, 5000);
    }
  });
};

$(function () {
  $("#progressbar").progressbar({ value: 0 });
  setTimeout($.updateProgressbar, 5000);
});
It looks like, going from your local network to the external Azure site, you are going out through a proxy/gateway server. Does your company have any block-lists or white-lists for allowed/disallowed websites that might be intercepting and blocking the request?
I'm trying to upload pictures with Valums' file uploader against a Ruby server: Apache & Nginx + Passenger, Rails 3, Ruby 1.9 (1.8 in dev).
Typically a file over 3 MB will fail with the following trace:
Started POST "/settings/uploadpict?qqfile=venise.JPG&user_id=680251975" for 82.245.125.231 at Tue Apr 05 23:30:30 +0200 2011
TypeError (expected Hash (got String) for param `'):
Rendered /usr/lib/ruby/gems/1.8/gems/actionpack-3.0.5/lib/action_dispatch/middleware/templates/rescues/diagnostics.erb within rescues/layout (17.2ms)
I made sure it's not Apache or Nginx cutting off the upload (Nginx did that at first, and I raised the max request size).
What's puzzling is that my controller is not even called (it starts with a logger call which does not print...), so I'm a bit helpless in tracing the issue.
Any clue?
View code (the controller is never called):
// Valums' Ajax File Upload
function setup_file_upload() {
  var uploader = new qq.FileUploader({
    // pass the dom node (ex. $(selector)[0] for jQuery users)
    element: $("#settings_upload_btn")[0],
    // path to server-side upload script
    action: '/settings/uploadpict',
    // additional data to send, name-value pairs
    params: {
      user_id: <%= @user.fb_id %>
    },
    // validation
    // ex. ['jpg', 'jpeg', 'png', 'gif'] or []
    allowedExtensions: ['jpg', 'jpeg', 'png', 'gif', 'bmp'],
    // each file size limit in bytes
    // this option isn't supported in all browsers
    //sizeLimit: 0, // max size
    //minSizeLimit: 0, // min size
    // set to true to output server response to console
    debug: false,
    // events
    // you can return false to abort submit
    onSubmit: function (id, fileName) {
      // clean up the mess...
      $(".qq-upload-list").empty();
    },
    onProgress: function (id, fileName, loaded, total) {},
    onComplete: function (id, fileName, responseJSON) {
      if (responseJSON["success"] == "false" || responseJSON["success"] == undefined) {
        $(".qq-upload-failed-text").show();
      } else {
        // do the dance
        uploadDone(responseJSON["filename"]);
      }
    },
    onCancel: function (id, fileName) {},
    messages: {
      // error messages, see qq.FileUploaderBasic for content
    },
    showMessage: function (message) { alert(message); }
  });
}
After investigation, I found a workaround: deactivate XHR!
In Valums' fileuploader.js:
qq.UploadHandlerXhr.isSupported = function () {
  return false;
};
After that it works flawlessly using the legacy iframe upload... no more nice progress bar, though...
found this here http://developer.appcelerator.com/question/116980/iphone--rails--xhr--undefined-method-tosym-for-nilnilclass
I found the "problem"; it's in the Rails source code.
The error is in rails/actionpack-3.0.3/instrumentation.rb, line 22.
I don't know if it is really a problem. It occurs because Titanium.Network.createHTTPClient() doesn't send a Content-Type by default.
In my example I wasn't using a Content-Type, which causes the error in Rails.
I added the Content-Type in the code:
// ...
xhr.open('GET', url, false);
xhr.setRequestHeader('Content-Type', 'application/json');
xhr.send();
// ...
Now it works.
I think text/plain could be the default Content-Type for Titanium XHR, rather than empty as it is now. I also created a ticket on the Rails issue tracker: https://rails.lighthouseapp.com/projects/8994/tickets/6546-error-sending-empty-content-type-instrumentationrb22
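For anyone wanting to guard against this on the server instead of patching every client, a small Rack middleware could fill in a default Content-Type when none is sent. This is a sketch of my own (the class name and fallback value are illustrative, not part of Rails):

```ruby
# Hypothetical Rack middleware that supplies a fallback Content-Type when a
# client (such as Titanium's HTTP client) sends none, so Rails' request
# instrumentation does not trip over an empty header.
class DefaultContentType
  def initialize(app, fallback = 'text/plain')
    @app = app
    @fallback = fallback
  end

  def call(env)
    env['CONTENT_TYPE'] = @fallback if env['CONTENT_TYPE'].to_s.strip.empty?
    @app.call(env)
  end
end

# Exercise it against a stub Rack app that echoes the header back.
echo = ->(env) { [200, {}, [env['CONTENT_TYPE']]] }
status, _headers, body = DefaultContentType.new(echo).call({})
```

In a Rails app you would register it with `config.middleware.use DefaultContentType`; an existing Content-Type passes through untouched.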