Can you pick a browser target server-side? - ruby-on-rails

I have a form that lets users select checks, and when submitted, creates a PDF, which opens in a new browser tab. The PDF doesn't have any branding, and will probably open in a plugin anyway, so I don't want it taking over my site's tab. So I set the form's target to _blank.
But it's possible for the user to submit the form without enough information to create the PDF, in which case I flag the error (server-side) and re-render the form. But because I set the form's target, this re-render opens in a new tab as well, and that's not what I want - in this case, I want it to behave as if target were _top.
So the question is: Can I change the browser's rendering target server-side?
Yes, I know that this can be done with client-side JavaScript, but JS annoys me, and I have to do the validation server-side anyway. I may end up having to use it, but please don't suggest it as an answer - I'm more curious if what I'm attempting can even be done.
PS: I'm on Ruby on Rails 2.3.8, in case anyone knows a framework-specific solution.

A workaround for this problem would be to use the Content-Disposition header on the PDF, in order to force the file to be downloaded and avoid the whole "target" approach.
Content-Type: application/pdf
Content-Disposition: attachment; filename="downloaded.pdf"
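In a Rails controller this could look roughly like the following minimal sketch (the export action name and the generate_pdf call are placeholders, not your actual code):
# Hypothetical controller action; generate_pdf stands in for whatever
# builds the PDF data in your app.
def export
  pdf = generate_pdf(params)
  send_data pdf,
            :type        => "application/pdf",
            :filename    => "checks.pdf",
            :disposition => "attachment" # forces a download instead of inline display
end
With :disposition => "attachment", the browser offers a download dialog instead of rendering the PDF in a tab, so the form's target no longer matters.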

No. This is a purely client-specific feature. As a matter of fact, it's quite possible to get a browser that supports only one window, where the target attribute would simply have no effect. There were even efforts to make this attribute disappear from future HTML standards completely (for instance, the XHTML branch had no such attribute).
The only overlap I can think of between HTML and HTTP is the <meta http-equiv> tag (where HTML can affect otherwise HTTP-controlled behavior). HTTP is a transfer protocol, designed to work with just about any kind of data. Letting it control presentation would be a pretty terrible mix of concerns.
Fortunately, we live in a JavaScript-enabled world. It is rather easy to validate a form using an AJAX request, especially with libraries like jQuery.
For instance, this script performs a POST request to a URL (in this case, /pdf/validate) and expects the page to send back "ok" (if everything's good) or something else if there was an error.
<form method="post" action="/pdf/send" id="pdf-form">
    <!-- form stuff here -->
</form>
<script type="text/javascript" src="jquery.js"></script>
<script type="text/javascript">
$(document).ready(function()
{
    // set to true if we are to bypass the check
    // this will happen once we've confirmed the parameters are okay
    var programmaticSubmit = false;

    // attach an event handler for when the form is submitted
    // this allows us to perform our own checks beforehand; we'll do so by
    // cancelling the event the user triggered, and doing the submit ourselves
    // if we detect no error
    $('#pdf-form').submit(function(event)
    {
        // keep a reference to the form; "this" points elsewhere inside
        // the AJAX success callback below
        var form = $(this);

        if (!programmaticSubmit)
        {
            // first off, cancel the event
            event.preventDefault();

            // do an AJAX request to /pdf/validate
            $.ajax("/pdf/validate", {
                type: "POST",
                data: form.serialize(), // send the form data as POST data
                success: function(result)
                {
                    // this gets called if the HTTP request did not end
                    // abnormally (i.e. no 4xx or 5xx status);
                    // you may also want to specify an "error" function to
                    // handle such cases
                    if (result == "ok")
                    {
                        // since the server says the data is okay, we trigger
                        // the event again by ourselves, but bypassing the
                        // checks this time
                        programmaticSubmit = true;
                        form.submit();
                    }
                    else // something went wrong! somehow display the error
                        alert(result);
                }
            });
        }
    });
});
</script>
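On the Rails side, the /pdf/validate action could be a minimal sketch along these lines (the validate_pdf_params helper is an assumption; reuse whatever server-side validation you already run before generating the PDF):
# Hypothetical controller action backing the AJAX check above.
def validate
  errors = validate_pdf_params(params) # placeholder for your existing validation
  if errors.empty?
    render :text => "ok"
  else
    render :text => errors.join(", ")
  end
end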

Related

How to purposely delay an AJAX response while testing with Capybara?

I have a React component that mimics the "link preview" feature that most modern social media sites have. You type in a link and it fetches the image, title, etc...
I do this by having the React component make an AJAX call back to my server to fetch the URL preview data.
While it's fetching I show an intermediate "loading" state (i.e. some loading icon or spinning wheel)
The relevant React snippet looks like
var component = this; // reference to the React component, used in the jQuery callbacks
this.setState({ isLoadingAttachment: true })
return $.ajax({
    type: "GET",
    url: some_url,
    dataType: "json",
    contentType: "application/json",
}).success(function(response){
    // Successful! Do success stuff
    component.setState({ isLoadingAttachment: false })
}).error(function(response) {
    // Uh oh! Handle failure stuff
    component.setState({ isLoadingAttachment: false })
});
Note how the isLoadingAttachment state variable is only valid for a brief second while the server is doing the fetching. Both the success and error scenarios immediately disable it.
I'd like to test some functionality during my "loading" state with my Capybara feature specs. I've mocked all the web calls and the data to be returned by the server, but it all happens so quickly that it passes through the "loading" state before I can even run any expect().. statement on it. I also purposely don't call wait_for_ajax so the page will go ahead without waiting for the ajax, but it's still too fast.
Lastly I also tried purposefully delaying the server call by 1.0 second, but that didn't work either. I assume because the whole thing is single threaded somehow?
# `foo` is an arbitrary method called during the server-side execution
allow_any_instance_of(MyController).
  to receive(:foo) { sleep(1.0) }.and_call_original
Any thoughts on how I could do this?
Thanks!
Capybara starts up the app server in a different thread than the tests. However, if you're using the default Capybara.server setting, you may have issues with your app calling back to itself, since it uses WEBrick by default. Instead you should specify Capybara.server = :puma.
Beyond that, mocking responses is generally a bad idea in feature specs (which are generally meant to be end-to-end tests), since it means you're no longer testing your app's code the way it would run in production. A better solution is to use something like puffing-billy - https://github.com/oesmith/puffing-billy - to mock web responses outside of your app's code, which would allow you to do something like
proxy.stub('https://example.com/proc/').and_return(Proc.new { |params, headers, body|
  sleep 2
  { :text => "Your results" }
})
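As a side note on the server setting mentioned above, a minimal sketch of the configuration (assuming a standard spec_helper.rb / rails_helper.rb setup):
# spec/rails_helper.rb (or wherever Capybara is configured)
require 'capybara/rails'

# Use Puma instead of the default WEBrick so the app under test can
# serve the requests it makes back to itself during the spec.
Capybara.server = :puma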

webix: Validating edits from datatable on the server

I have the following scenario:
A datatable with some editable columns whose input is validated on the client with the webix rules. There are columns, though, that cannot be validated on the client but only on the server (i.e. a unique id/code).
One approach would be to create a rule and validate with webix.ajax in synchronous mode, but I would prefer to avoid that by all means.
I thought I could validate on 'save'. The server can return a status response with error or success. I can catch this with the onAfterUpdate event of the datatable (correct me if there is a better way, but it works this way).
At this point, I would like to display a validation error on the datatable if the server script returns an error status, and mark the row (and possibly the corresponding column/cell) with the error.
I thought I could use the callEvent method on the datatable and fire an onValidationError event, but I didn't manage to make that work.
save: {
    url: "save.php",
    autoupdate: true,
    on: {
        onAfterUpdate: function(response, id, details) {
            if (response.status == 'error')
                myDataTable.callEvent('onValidationError');
        }
    }
}
The documentation states that I can pass some parameters to the event from callEvent, but I could not find any specification in the docs. The code above does not work (the event is not fired).
So the question is: How can I fire an onValidationError event for the datatable using callEvent?
Or, alternatively, what would be another approach in webix to show the error on the datatable when validation happens on the server side?
Thank you.
Instead of calling the onValidationError event you can use
//mark cell, call after error response
myDataTable.addCellCss(id, columnId, "webix_invalid");
//remove mark, call after success response
myDataTable.removeRowCss(id, "webix_invalid");
which will mark the cell as non-valid.
On a side note, if you want to trigger some event with parameters, you can use code like the following. Just beware that triggering an event is not a good way to change the component's state (it can be used to trigger your own event handler, though):
myDataTable.callEvent("event name", [param1, param2, param3])

in C#, msgbox didn't show

Below is part of my code in C#. It is supposed to send an email, show the message box, then redirect to the 'Multihotelbook.aspx' page, but it redirects to that page without showing the message box. I don't know why. Need help.
emailClient.Send(message);
// Response.Write("<script>window.alert('Email sent')</script>");
//ClientScript.RegisterStartupScript(typeof(Page), "myscript", "<script>alert('Email sent');</script>");
// System.Windows.Forms.MessageBox.Show("Email sent");
// MessageBox.Show("Email sent");
// MessageBoxResult result = MessageBox.Show("Email sent", "Confirmation");
//ClientScript.RegisterStartupScript(typeof(Page), "myscript", "<script>alert('Email sent');</script>");
//ScriptManager.RegisterStartupScript(this, typeof(string), "Message", "confirm('Email sent');", true);
//ScriptManager.RegisterStartupScript(this.Page, this.GetType(), "KEY", "alert('Email sent')", true);
Page.ClientScript.RegisterStartupScript(typeof(string), "alert", "<script>alert('Email sent')</script>");
Response.Redirect("Multihotelbook.aspx");
This... looks like ASP.NET code. There is no MessageBox in ASP.NET. Notice that you had to fully reference System.Windows.Forms, which I imagine you also had to add as a reference. Windows Forms and ASP.NET are two very different things.
What exactly are you trying to accomplish? Showing a JavaScript alert()? If that's the case then you can just include some additional JavaScript code in the response.
Except... you're also doing this:
Response.Redirect("Multihotelbook.aspx");
Which means that the response to the client is being clobbered by a header which tells the client to go to Multihotelbook.aspx. So the client never sees anything else you're including in the response, basically anything in RegisterStartupScript.
After this code executes, the client is going to end up on Multihotelbook.aspx. Unless there's a JavaScript alert() on that page, the browser won't show one.
One approach you could try is to pass a flag to Multihotelbook.aspx, something like Multihotelbook.aspx?emailSent=true and in the Page_Load of that page check for that value and, if it's set to true, include JavaScript code in the page to show the alert() (probably using RegisterStartupScript like you're already trying).

Grails file download does not initiate when called from remoteFunction

In my Grails application, a user can click on a g:link which will call my controller to export certain data to a CSV file. This works with no problems.
I then moved that button to a jQuery dialog box and, when the button is clicked, I use
${remoteFunction(action:'export', onSuccess:'closeMe();', id:courseInstance?.id)}
to call the same controller method and close the dialog box. I've confirmed that the method is actually called, and the dialog box closes. The user is not prompted with the CSV download, however. I'm assuming this has something to do with the remoteFunction, but I'm not really sure. Can anyone explain why this might happen, and a potential fix?
Thanks!
With AJAX requests you can't download content as an attachment, so it can't trigger the Save As dialog.
There are a couple of workarounds for this:
Use a plain g:link as before and bind the 'closeMe();' function to the 'click' event. The problem is that you have no control over the error or success response.
Use an iframe: you can create a temporary invisible iframe and set its location to the URL of the file to download. It also has the downside of not being able to handle the success/error response.
The code could be the same as in this answer:
<script type="text/javascript">
function downloadURL(url) {
    var iframe;
    var hiddenIFrameID = 'hiddenDownloader';
    iframe = document.getElementById(hiddenIFrameID);
    if (iframe === null) {
        iframe = document.createElement('iframe');
        iframe.id = hiddenIFrameID;
        iframe.style.display = 'none';
        document.body.appendChild(iframe);
    }
    iframe.src = url;
}
</script>
And the link
Export

File download from JSF with a rendered response

I have some dynamically generated files which I want my JSF 2.0 app to download for the user. I've been able to get this working using the code found in the solution here:
Forcing a save as dialogue from any web browser from JSF application
and a command button in a form on the page
And that works fine except for one hitch. I'd like to be able to render a message back to the user on the initial page that tells them their file is being processed and to please wait. Obviously the responseComplete call stops that from happening. Is there some way to re-render the submitting page and send back a file from the same button?
No, you can't. You can send only one response back per request. The best you could do is use JavaScript to show an initially hidden div or something which contains the message during the onclick. But you'll have the problem that you cannot hide it when the download is completed.
An alternative is to store the file on temp disk and return a full JSF response in which you display a download link that serves the file from temp disk via a standalone servlet.
I think you can use ajax to solve this. Call the method that creates the file from an ajax action and provide a javascript callback to handle the navigation or to show a layer or whatever
<script type="text/javascript">
    function processEvent(data) {
        if (data.status == "begin") {
            showWaitingLayer();
        } else if (data.status == "success") {
            hideWaitingLayer();
            showDownloadLink();
        }
    }
</script>
<h:commandLink action="#{myBean.createDocument}">
<f:ajax onevent="processEvent"/>
</h:commandLink>
