Can you use jQuery POST in a Chrome extension?

I'm trying to get my Chrome extension working with the Google Calendar API. However, the way Google has set up the extension sandbox makes almost anything impossible.
I can't add the Calendar API using JavaScript because I've tried 200 different ways to include the http://www.google.com/jsapi library. Therefore, I want to try to interact with the Calendar API through PHP. Is it even possible to do a POST from a Chrome extension in order to run my PHP file? If not, it's pretty much impossible to interact with any external API that doesn't have a downloadable library, isn't it? If that's the case, I don't see how you can make anything useful with Chrome extensions.

I think you are still having difficulties because you don't completely understand the difference between content scripts and background pages.
Content scripts have certain limits. They can't:
Use chrome.* APIs (except for parts of chrome.extension)
Use variables or functions defined by their extension's pages
Use variables or functions defined by web pages or by other content scripts
Make cross-site XMLHttpRequests
Basically, all they can do is access the DOM of the page where they were injected and communicate with the background page (by sending requests).
The background page, thankfully, doesn't have any of those limits; it just can't access the pages the user is viewing. The good news is that the background page can communicate with content scripts (again, through requests).
As you can see, the background page and content scripts complement each other. If you use both at the same time, you have almost no limitations. All you need to do is split your logic correctly between the two.
As to your initial question: content scripts can't make cross-domain requests, but background pages can. You can read more here.
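As a minimal sketch of that split (the PHP endpoint, the message fields, and the use of jQuery in the background page are my own assumptions, and this uses the chrome.runtime messaging calls), the content script hands the work to the background page, and the background page does the cross-domain POST to an origin whitelisted under "permissions" in manifest.json:

// content script: can't make cross-domain requests, so it forwards the job
chrome.runtime.sendMessage({ action: 'createEvent', payload: { title: 'Meeting' } }, function (response) {
  console.log('background page replied:', response);
});

// background page: allowed to make cross-origin requests to whitelisted origins
chrome.runtime.onMessage.addListener(function (message, sender, sendResponse) {
  if (message.action === 'createEvent') {
    // calendar-proxy.php is a hypothetical PHP script on your own server
    $.post('https://example.com/calendar-proxy.php', message.payload)
      .done(function (data) { sendResponse({ ok: true, data: data }); })
      .fail(function () { sendResponse({ ok: false }); });
    return true; // keep the message channel open for the asynchronous response
  }
});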

Related

WKWebView analog of service worker

I'm trying to implement part of an app with a PWA approach. It works fine on Android, but not on iOS. We need to have
offline content availability
an option to update content dynamically (like special offers or so). With a service worker we show a prompt to update the web content.
As was mentioned here, service workers are not supported within WKWebView (or UIWebView). So is there an analog or alternative solution, like smart cache control?
It seems like it should be possible to store some web content from the app and be able to update it if something changes. Maybe there is already a framework/library/approach for that purpose?
EDIT
Service Workers unavailable in WKWebView in iOS 11.3 - this question explains the status of service workers in WKWebView, but no alternative is given. I would like to discuss any alternative solutions.
One thing I discovered is https://github.com/xtools-at/iOS-PWA-Wrapper. It looks like it works based on AppCache, but https://developer.mozilla.org/en-US/docs/Web/HTML/Using_the_application_cache#Browser_compatibility says it is deprecated and advises using service workers instead (which is not an option for this PWA).
So until WKWebView gets service worker support, you can use AppCache (not yet fully removed from browsers).
You can use this approach and take a look at the source of the page at https://leasingrechnen.at
What do they do? If there is no service worker support in the browser, they load an iframe that points to a page with a manifest.appcache file, so the page is cached.
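A rough sketch of that fallback (the file name is hypothetical; the iframe's document would declare manifest="manifest.appcache" on its <html> tag):

// if service workers are unavailable (as in WKWebView), fall back to a hidden
// iframe whose page uses an AppCache manifest, so its resources get cached
if (!('serviceWorker' in navigator)) {
  var iframe = document.createElement('iframe');
  iframe.src = '/appcache-fallback.html'; // hypothetical page declaring the manifest
  iframe.style.display = 'none';
  document.body.appendChild(iframe);
}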

How is this URL modification possible?

Could anyone please tell me how the site http://www.outsharked.com/imagemapster/default.aspx?what.html works in such a way? It modifies the URL without loading/reloading the page. I think this is not done with HTML5, because it works in IE6, which doesn't support HTML5.
I created that site. The commenter is correct, it uses Javascript to change the URL. There's nothing about how that navigation works that is different for IE6 - that browser supports the necessary client-side functionality to do this kind of thing. The basic functionality involves:
capturing click events on the nav and loading the inner content via AJAX
updating the URL to reflect a working direct URL to the target.
The links are also valid anchor links that, in the absence of Javascript, would go to the same page (but load the whole thing). This is your basic AJAX web site setup with one minor difference. It's common practice to use URLs like this in AJAX/single-page web sites:
http://mysite.com/home#somepage
or even just
http://mysite.com/#somepage
Where the hashtag part represents the actual page a user has navigated to. If someone accessed that url directly, e.g. from outside the site, the site would use Javascript to load the correct content based on the hashtag, after the page had loaded. This means that there might be a little delay for the inner content to reflect the correct page, since it has to run another request after the initial page has loaded from the browser to get the inner content via AJAX.
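As a rough sketch of that hash-based pattern (paths and element IDs here are purely illustrative):

// on load, read the fragment and fetch the matching inner content via AJAX;
// the server never sees anything after the #, so the client has to do this
$(function () {
  var page = window.location.hash.replace('#', '') || 'home';
  $('#content').load('/pages/' + page + '.html'); // hypothetical content URL
});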
I was trying to avoid that by creating a setup that worked completely with and without Javascript. If you go directly to a URL within the site such as http://www.outsharked.com/imagemapster/default.aspx?faq.html you will notice it loads the content directly. This URL will work even if Javascript is disabled. You can't actually do this using hashtags, since hashtag content is not sent to the server. Only the client knows what's after the hashtag in a URL. That's why I was using query strings to represent inner pages.
This site architecture was sort of an experiment at the time. It works pretty well, but the code isn't fantastic; I didn't really do anything else with it, and I'm sure there are other better-fleshed-out/tested/full-featured frameworks out there to do much the same thing.
But it might not be a bad example of the nuts and bolts of creating a basic AJAX navigation setup, as a learning tool, since it's pretty concise, and also does HTML5 history navigation (e.g. so the back button works on modern browsers).
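The nuts and bolts of such a setup look roughly like this (selectors and class names are illustrative, not the site's actual code); the anchors keep their real query-string hrefs, so they still work with Javascript disabled:

// capture nav clicks, load just the inner content, and update the address bar
$(document).on('click', 'a.nav-link', function (e) {
  e.preventDefault();
  var url = this.href; // e.g. default.aspx?faq.html
  $('#content').load(url + ' #content > *'); // fetch the page, keep only the inner content
  if (window.history && window.history.pushState) {
    window.history.pushState({ url: url }, '', url);
  }
});

// make the back/forward buttons restore the right inner content
window.addEventListener('popstate', function (e) {
  if (e.state && e.state.url) {
    $('#content').load(e.state.url + ' #content > *');
  }
});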

How to check which external connections are being used by my website

I'm not 100% sure if this is a programming question, but I do believe I'm targeting the correct audience for this issue.
I've built a web-based frontend for an application. Now the frontend will be deployed to the customer's machine (a localhost-based website). However, this frontend uses Google Maps V3 and some other external components. It will need internet access, but the customer's network is highly secured. Here is where my issues begin.
To make sure everything works as planned, we need to allow the connections that are made when the webpage starts up, so I need a list of URLs that my frontend uses at startup. I mainly need the Google Maps URLs; they are so varied (googleapis.com, gstatic.com, ...).
How can I get a list of these URLs? Is there any Google documentation (didn't find any)?
I've thought about using Firebug and listing all entries in the Network tab. However, that comes to about 2000 items (including all images, scripts, CSS stylesheets, etc. that are being loaded from the local website).
Or is there a tool/workaround to easily find out which connections should be explicitly allowed for the website to work like it should?
Your approach of using the Firebug - Network tab is good. The Chrome Developer Tools - Network view is also very good. I haven't seen a list of everything that gets loaded by the map, but that is because it varies based on how you set up your map. I know that Google works hard to only load what is needed by your map, based on your options.
So if you only use selected map controls, Google will try to limit the image downloads to just what is needed to display the controls your map uses. Of course, if you include additional items, such as a parameter on the URL that loads the drawing tools (libraries=drawing), you will have additional network loading. Google defined these "extra" items as libraries to avoid loading everything; only the maps that need them have to load them.
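For example, the loading footprint depends on how you request the API and which options you pass to the map; a rough sketch (the libraries and callback URL parameters are the documented ones, the map options are just an example):

// load only the libraries you actually use, so fewer extra resources are requested
var script = document.createElement('script');
script.src = 'https://maps.googleapis.com/maps/api/js?libraries=drawing&callback=initMap';
script.async = true;
document.head.appendChild(script);

function initMap() {
  // fewer enabled controls means fewer control images are downloaded
  new google.maps.Map(document.getElementById('map'), {
    center: { lat: 35.15, lng: -90.05 },
    zoom: 10,
    streetViewControl: false,
    mapTypeControl: false
  });
}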
Other than setting your map up and watching what is loaded, I can't think of another option.

Is it possible to use the jQueryUI dialog to pull in an HTML page, similar to a traditional popup?

I'm trying to come up with a way to load a url, (https://tools.usps.com/go/ZipLookupAction!input.action in this case) in a jQueryUI dialog box instead of a traditional popup window.
So far, I've tried pulling in the page using an AJAX (AJAH :D) request, but each time, I get a status 200 but no data.
Here is the code from what I've tried: http://jsfiddle.net/Handyman/aXPU7/1/
I had thought that maybe they don't allow AJAX requests to usps.com, but I tried a couple of my own sites with the same luck.
Is it even possible to do this without an iframe or a traditional style popup?
It might be easier to use a lightbox plugin like Colorbox (jsFiddle).
$('#zipcode_lookup').colorbox({iframe:true, innerWidth:425, innerHeight:344});
Check out the Same Origin Policy.
In computing, the same origin policy is an important security concept for a number of browser-side programming languages, such as JavaScript. The policy permits scripts running on pages originating from the same site to access each other's methods and properties with no specific restrictions, but prevents access to most methods and properties across pages on different sites.
You would need to use JSONP or a proxy.
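A rough sketch of the proxy route (the /proxy.php endpoint is hypothetical; it's a script you would write on the same origin as your page to fetch the remote HTML for you):

// the browser only ever talks to your own origin, so the same origin policy is satisfied
$('#zipcode_lookup').on('click', function (e) {
  e.preventDefault();
  $.get('/proxy.php', { url: 'https://tools.usps.com/go/ZipLookupAction!input.action' })
    .done(function (html) {
      $('<div>').html(html).dialog({ modal: true, width: 500, height: 400 }); // jQuery UI dialog
    });
});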

web scraping/parsing of college course site

Trying to parse/scrape the course site for Memphis. The site is "https://spectrumssb2.memphis.edu/pls/PROD/bwckgens.p_proc_term_date". It appears to be some sort of JavaScript issue, or dynamic generation of the text. I can see the underlying DOM structure using LiveHTTPHeaders/Firefox, but not when I simply view the underlying source/text of the page.
Thoughts/Comments/Pointers would be appreciated...
Well, these modern days a site may be assembled in a few steps. First the main structure is pulled in, and then, often based on the identity of the user, additional AJAX calls are executed. Your best bet is to sniff the HTTP traffic to see what kind of requests are issued between when the site is initially requested and when it's fully built.
Since you are using Firebug, you can get the HttpFox add-on, which gives you what you need.
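Once you've captured the request, you can try replaying it yourself and inspecting the HTML that comes back. A hedged sketch, assuming jQuery is available where you run it (the parameter names below are purely illustrative; copy the real ones from what HttpFox shows for the POST to bwckgens.p_proc_term_date, and run it from the site's own pages, e.g. the browser console, so the same origin policy doesn't block it):

// replay the sniffed POST and look at the markup it returns
$.post('https://spectrumssb2.memphis.edu/pls/PROD/bwckgens.p_proc_term_date',
  { p_term: '201310', p_calling_proc: 'bwckschd.p_disp_dyn_sched' } // illustrative parameters only
).done(function (html) {
  console.log('response length:', html.length);
});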
