JApplet: getAppletContext().showDocument() with POST data?

I have to develop a Java applet which redirects to another web page. Normally I use the showDocument(URL url) method to do that. But in this case I have to send a lot of data to this page, so I need to do it via POST. showDocument() only allows the GET method.
My question: is it possible to redirect to another web page from within a JApplet AND send POST data in the same request (like showDocument(), but with POST data)? I know that I can make a POST request from within the applet, but that happens in the applet's context.
It's a bit complicated because the script from which the applet is called runs on a client-auth protected server. So I need to make the requests with the browser (because it is already authenticated); if I make these requests from within the applet, the applet has to authenticate again...
thanx
daniel

Directly with the Java applet API this is not possible: showDocument() is the only thing you have, and it supports only GET. You may be able to do something like this with the JavaScript bridge (i.e. call JavaScript functions from the applet, which then send the POST request to the server the way the browser would, and show the result in a new browser window), but I have never used this.
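A rough, untested sketch of that JavaScript-bridge idea, assuming the applet tag carries the mayscript attribute so the applet may call into the page via netscape.javascript.JSObject; the method and field names are made up for illustration:

import java.applet.Applet;
import netscape.javascript.JSObject; // shipped with the Java browser plug-in (plugin.jar)

public class PostRedirectApplet extends Applet {

    // Asks the hosting page (i.e. the browser, not the applet) to POST the data,
    // so the browser's existing client authentication is reused and the
    // navigation behaves like showDocument(), only as a POST.
    public void postRedirect(String targetUrl, String fieldName, String value) {
        JSObject window = JSObject.getWindow(this);
        // Note: targetUrl, fieldName and value would need proper JavaScript escaping.
        String js =
            "var f = document.createElement('form');" +
            "f.method = 'POST';" +
            "f.action = '" + targetUrl + "';" +
            "var i = document.createElement('input');" +
            "i.type = 'hidden';" +
            "i.name = '" + fieldName + "';" +
            "i.value = '" + value + "';" +
            "f.appendChild(i);" +
            "document.body.appendChild(f);" +
            "f.submit();";
        window.eval(js);
    }
}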

Related

How can I send an ajax request from an ActiveX control/NPAPI plugin and process its response?

Currently I have a web page where I am sending an ajax request from JavaScript, and in response the server sends a video file which is saved to the browser's default download location. I want the user to select the download path each and every time the file is downloaded (this can be achieved by changing browser settings, but that is not suitable for me). So I want to include an ActiveX object which can send the ajax request and get its response. First I want to know whether this is possible; if yes, are there any prototypes/examples, or please let me know how it can be achieved.
It is possible; FireBreath has a mechanism called BrowserStreams that would probably work for what you're describing, but honestly I'd suggest against it. See if you can do what you need using an extension; Chrome is dropping support for NPAPI next year and even if they weren't I think it's a really bad idea to use a plugin for something like this.
Up to you, of course. There are examples for making HTTP GET and POST requests in the FBTestPlugin example in FireBreath.

Changing the interface of a webservice without having access to it

I have a website, let's just call it search, open in one of my browser tabs. search has a form which, when submitted, runs queries on a database to which I don't have direct access. The problem with search is that the interface is rather horrible (one cannot save the aforementioned queries, etc.).
I've analyzed the request (with a proxy) which is sent to the server by search and I am able to replicate it. The server even sends back the correct result, but the browser is not able to open it (same-origin policy). Do you have any ideas on how I could tackle this problem?
The answer to your question is: you can't. At least not without using a proxy as suggested in the answer by Walter, and that would mean your website visitors would have to knowingly log in to your website using their other website's credentials (hmm, doesn't sound good...).
The reason you can't do this is related to security: if you could run a script against the tab next to the one with the site open (which is what I'm guessing you want to do), you would be able to mount a CSRF attack and send any data you wish to hack.com.
This is, of course, assuming that there has to be a login somewhere in the process; otherwise there's no reason you couldn't create a simple form which posts the required query and gets the info.
If you did have access to the mentioned website, you would be able to support cross-domain requests using JSONP.
It is not possible to bypass the same-origin policy in JavaScript (assuming that is how you want to do it, considering your question). You need to set up a server-side proxy that makes the request for you and returns the HTML.
A simple way of doing this in PHP would be like this:
<?php
// Forward this page's query string to the search site and echo the HTML it
// returns, so the browser sees it as coming from this (same-origin) page.
echo file_get_contents("http://searchdomainname.com" . "?" . http_build_query($_GET, '', '&'));
?>

Web application for testing POST requests

Is there a web application for testing POST requests? What I imagine is that you would visit the site and it would redirect you to a unique URL. You could then send a POST request to that URL, which would display the request after it was received.
Alternative from Microsoft: WFetch
POST request instruction
This looks like it would be more along the lines of what you're looking for:
http://www.htttools.com
Rest Client is a Firefox add-on that I have used in the past as an HTTP POST/GET testing tool.
The "net" tab in the Firebug plugin for Firefox will show you the contents of all requests including POSTs. You can also intercept and modify them with TamperData.
Fiddler will do the same for Internet Explorer and other Windows programs. Wireshark will also show this information.
There are multiple approaches. If you want to do automated browser-based testing, you could use Selenium/Java or Windmill/Python. Alternatively, if you want to perform white-box testing, you can write scripts that make an HTTP POST request to the web application (e.g. using httplib if you are using Python), obtain the response and verify that the response is as expected.
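A minimal sketch of that white-box idea (the answer mentions Python's httplib; this version assumes plain Java HttpURLConnection, and the endpoint, form fields and expected content are hypothetical):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class PostSmokeTest {
    public static void main(String[] args) throws IOException {
        // Hypothetical endpoint and form data for illustration only.
        URL url = new URL("http://example.com/app/search");
        String body = "query=hello&limit=10";

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }

        int status = conn.getResponseCode();
        if (status != 200) {
            throw new AssertionError("Unexpected HTTP status: " + status);
        }

        StringBuilder response = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            for (String line; (line = in.readLine()) != null; ) {
                response.append(line).append('\n');
            }
        }

        // Verify the body looks like what we expect (hypothetical marker string).
        if (!response.toString().contains("results")) {
            throw new AssertionError("Unexpected response body");
        }
        System.out.println("POST test passed");
    }
}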
RequestBin allows you to create a temporary URL and view the last twenty requests.
With PutsReq you can test requests and simulate responses using JavaScript.

How does Facebook pull website data when it sees you've typed a URL into a wall post?

So I'm writing a post on my wall and type a URL into the main body of the post. As soon as I finish the URL, Facebook creates a little section underneath which has the title, description, and an image from the url I typed.
Without getting too in-depth, how is this done, and what is the best way to make something similar myself?
jQuery (or some other framework that lets you do Ajax easily) to communicate between browser client and webserver
PHP/ASP.NET/Python (or some other scripting framework on the backend) to fetch the url
Facebook also has a meta data specification you might be interested in, to let developers further define what gets shown in a Facebook page.
I believe Facebook is written in PHP. And PHP does this easily.
fopen() can be used to access files on other sites. There are other functions, but this will get you started. Then it's a matter of parsing the HTML you get from the URL to extract what you want.
http://php.net/manual/en/function.fopen.php
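For comparison, a rough Java sketch of the same server-side fetch-and-parse idea (the URL is made up, the regexes are naive, and this is not how Facebook actually does it):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LinkPreview {
    public static void main(String[] args) throws IOException {
        // Hypothetical URL the user typed into the post.
        URL url = new URL("http://example.com/some-article");

        // Fetch the raw HTML server-side (the equivalent of PHP's fopen/file_get_contents).
        StringBuilder html = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(url.openStream(), StandardCharsets.UTF_8))) {
            for (String line; (line = in.readLine()) != null; ) {
                html.append(line).append('\n');
            }
        }

        // Pull out the bits the preview needs: <title> and the Open Graph meta tags.
        System.out.println("title: " + extract(html, "<title>(.*?)</title>"));
        System.out.println("description: " +
                extract(html, "property=\"og:description\"\\s+content=\"(.*?)\""));
        System.out.println("image: " +
                extract(html, "property=\"og:image\"\\s+content=\"(.*?)\""));
    }

    // Naive regex extraction; a real implementation would use an HTML parser.
    private static String extract(CharSequence html, String regex) {
        Matcher m = Pattern.compile(regex, Pattern.CASE_INSENSITIVE).matcher(html);
        return m.find() ? m.group(1) : "";
    }
}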
You have a couple of choices. You can fetch it using Ajax from the client, or you can fetch it from your server.
If doing it from your server in asp.net then you need to use HttpWebRequest.
FB does an asynchronous JavaScript call to fetch that data without reloading the window you're on. Look up Ajax; libraries like jQuery make this easy: http://api.jquery.com/category/ajax/

How to pass data from a web page to an application?

Trying to figure out a way to pass some data/fields from a web page back into my application. This needs to work on Windows/Linux/Mac, so I can't use a DLL or ActiveX. Any ideas?
Here's the flow:
1. The application gathers some data and then sends it to a web page using POST; the page is either embedded in the app or popped up in a new IE window.
2. The web page performs some services and then needs to relay the results back to the application.
The only way to do this that I can think of is writing the results locally from the page in a cookie or something like that and having the application monitor that folder for a specific file.
Alternatively, make a web service that the application hits after passing control to the page; when the page is done, the web service will return the data. This sounds like it might have some performance drawbacks.
Can anyone suggest any better solutions for this?
Thanks
My suggestion:
Break the processing logic out of the web page into a separate assembly. You can then create a web service that handles all of the processing without needing to pass control over to a page.
Your application can then call the Web Service directly and then serialize the results and work with the data quite easily.
Update
Since the page is supplied by a third party, you obviously can't break anything out. The next best thing would be to handle the entire web request internally in your application (rather than popping up a new window).
With this method, you can get the raw HTTP response (and page markup) and work with it directly. You can then parse the response stream and gather the required data from it.
While performing an HTTP request you should be able to retrieve the text returned by the page. For instance, if your HTTP POST hits a Java servlet, the doPost() method is fired and you can perform your actions; you can then use the PrintWriter obtained from the response object (PrintWriter out = response.getWriter();) to write text back to the calling application. I'm not sure whether this helps?
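A small sketch of that servlet side, assuming the standard servlet API is on the classpath; the class name and the form field are made up for illustration:

import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Receives the application's POST and writes the result text straight back
// to the calling application in the response body.
public class ResultServlet extends HttpServlet {
    @Override
    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        String payload = request.getParameter("data"); // hypothetical form field

        // ... do whatever processing the page/service is responsible for ...
        String result = "processed:" + payload;

        response.setContentType("text/plain");
        PrintWriter out = response.getWriter();
        out.print(result);
    }
}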
The fact that the "web page is hosted by a third party and they need to be doing the processing on their servers" is important to this question.
I like your idea of having the app call a web service after it passes the data to the third-party web page. You can always call the web service asynchronously if you're worried about blocking your application while waiting for results from this web service.
Another option is that your application implements an XML-RPC server that can be called from the web page using PHP, Python, or whatever you use to build the website.
A REST server will do the job also...
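As a rough illustration of those last two suggestions, here is a sketch of a tiny in-application HTTP endpoint built on the JDK's own com.sun.net.httpserver package; the port and path are arbitrary assumptions, and the web page (or the script behind it) would POST the results back to this URL:

import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class ResultListener {
    public static void main(String[] args) throws IOException {
        // The application listens locally; the web page (or its backend) POSTs results here.
        HttpServer server = HttpServer.create(new InetSocketAddress(8089), 0);
        server.createContext("/result", (HttpExchange exchange) -> {
            byte[] body = readAll(exchange.getRequestBody());
            String result = new String(body, StandardCharsets.UTF_8);
            System.out.println("Received result: " + result);

            byte[] ok = "received".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, ok.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(ok);
            }
        });
        server.start();
        System.out.println("Waiting for results on http://localhost:8089/result");
    }

    private static byte[] readAll(InputStream in) throws IOException {
        java.io.ByteArrayOutputStream buffer = new java.io.ByteArrayOutputStream();
        byte[] chunk = new byte[4096];
        for (int n; (n = in.read(chunk)) != -1; ) {
            buffer.write(chunk, 0, n);
        }
        return buffer.toByteArray();
    }
}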
