Good afternoon:
I am an Etsy affiliate through the AWIN platform. I am having problems generating affiliate links. The message that I get is this:
Unable to embed this url
Embedly is unable to generate an embed for this URL. Please check that the URL is valid (has an http prefix: ex. http://embed.ly) and that the content is publicly accessible.
The point is that even when I use the direct Etsy link (without the affiliate parameters), I still get the same error:
https://www.etsy.com/es/listing/520698100/amigurumi-del-sombrerero-loco-de-alicia?ref=shop_home_active_1
Regards, Marina
That's probably because Embedly doesn't have a free plan.
However, you can use https://microlink.io:
https://api.microlink.io/?url=https://www.etsy.com/es/listing/520698100/amigurumi-del-sombrerero-loco-de-alicia?ref=shop_home_active_1
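For what it's worth, here's a minimal sketch of calling that API from Python with requests; the field names (status, data, title, description) reflect microlink's JSON response as I understand it, so treat them as assumptions:
import requests

listing_url = "https://www.etsy.com/es/listing/520698100/amigurumi-del-sombrerero-loco-de-alicia?ref=shop_home_active_1"

# microlink takes the target URL as a query parameter and returns embed metadata as JSON
resp = requests.get("https://api.microlink.io/", params={"url": listing_url})
resp.raise_for_status()

payload = resp.json()
if payload.get("status") == "success":
    data = payload["data"]
    print(data.get("title"), data.get("description"), data.get("url"))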
I am working on ASP.NET MVC 5. When I click a link (placed on another website), I navigate to the UserDetails.cshtml page. Basically, that third-party site passes the UserName & Password to my site, and using those I authorize the user and display further user info.
That works fine, but the URL looks like this:
localhost:8080//Admin/UserDetails/UserName/PWD.
I don't want to show the UserName & Password in the URL, i.e. the URL should look something like:
localhost:8080//Admin/UserDetails/
One possible solution could be to rewrite the URL in IIS (http://www.hanselman.com/blog/ASPNETMVCAndTheNewIIS7RewriteModule.aspx).
But I believe there is an easier way to handle this by using the routing mechanism of MVC.
Please help me figure this out.
EDIT:
As many of you are confused about why I am not doing a form POST here, let me reframe my question. I have no control over the third-party application, so I can't ask them to do a form POST to my MVC application. Also, the third-party application is an Oracle reporting application (OBI), so doing a POST from that application might not be feasible either...
Let me reverse engineer your requirements from your question:
I want to have an URI that when invoked will give access to a secured section of my website. This URI must be clicked by visitors of a third-party site, whom I give that URI to. I want to hide the credentials from the URI.
You cannot do this; the requirements are conflicting. You cannot hand out URIs that will authenticate anyone who fires a request to them.
You could do something with a token (like http://your-site/auth/$token), but then still, anyone with access to that URI can use it to authenticate themselves, or simply put it up on their own website.
If you have data you want to expose to a third-party site, let that site perform an HTTP request (with tokens, usernames, headers or whatever you want to use to authenticate) in the background to your site, and display the response on their site. Then the visitor won't see that traffic, can't share the URI, and all will be secure.
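To illustrate the shape of that background call, here's a minimal Python sketch; the /api/userdetails endpoint and the pre-shared bearer token are hypothetical stand-ins, not anything from the question:
import requests

# Hypothetical server-to-server call: the third-party site's server hits your API
# with a pre-shared token in a header, instead of putting credentials in a URL.
resp = requests.get(
    "https://your-site.example/api/userdetails",  # hypothetical endpoint
    headers={"Authorization": "Bearer PRESHARED_TOKEN"},
    timeout=10,
)
resp.raise_for_status()
user_info = resp.json()  # rendered server-side; the visitor never sees this request or the token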
No. No. NO. Like seriously, NO. Any sensitive information should be sent via a post body over a secure connection (HTTPS). You can't "hide" information in a GET request, because it's all part of the URI, or the location of a particular resource. If you remove a portion, it's an entirely different location.
UPDATE
I find it extremely hard to believe that any third-party application that needs to authenticate over HTTP, and isn't designed by a chimp with a typewriter, wouldn't support a secure method of doing so, especially if it's an Oracle application. I'm not familiar with this particular app and, no offense meant here, I would more easily believe that you've missed something in the documentation or simply haven't found the right way to do it yet than believe you have to send clear-text credentials over GET.
Regardless, as I said previously, there's no way to hide information in a GET request. All data in a GET is part of the URL, and is therefore plainly visible in the browser location bar or wherever else the URL shows up. Unfortunately, I have no advice for you other than to look closer at the documentation, or even reach out to Oracle if you have to. Whether by POST or something like OAuth, there almost has to be another way.
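For contrast, here's roughly what sending the credentials the safer way looks like, sketched in Python; the endpoint and field names are placeholders rather than the asker's actual app:
import requests

# The credentials travel in the request body over HTTPS instead of in the URL,
# so they don't end up in browser history, bookmarks, or server access logs.
resp = requests.post(
    "https://localhost:8080/Admin/UserDetails",          # placeholder endpoint
    data={"UserName": "someuser", "Password": "secret"},  # placeholder field names
    timeout=10,
)
resp.raise_for_status()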
Is there a way to use Google Analytics to track vanity URLs that redirect to other site locations?
Like this:
http://www.focusonenergy.com/utilities
Resolves to:
http://www.focusonenergy.com/about/participating-utilities
I'd like to know how many visitors used the vanity link. Filtering the Site Content doesn't give an accurate report.
I believe you can, but you either need to decorate the vanity link's href with additional tracking parameters (if you have control of it), or track the URL hit server-side before you redirect the user. I believe Google Analytics has an API you can call from your server-side code for that.
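Here's a rough sketch of the server-side option, assuming a Universal Analytics property and its Measurement Protocol (the UA-XXXXX-Y property ID and the Flask app are placeholders; GA4 uses a different protocol):
import uuid
import requests
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/utilities")
def vanity_redirect():
    # Record a pageview for the vanity path via the Measurement Protocol,
    # then send the visitor on to the real page.
    requests.post("https://www.google-analytics.com/collect", data={
        "v": "1",                  # protocol version
        "tid": "UA-XXXXX-Y",       # placeholder property ID
        "cid": str(uuid.uuid4()),  # anonymous client ID
        "t": "pageview",
        "dp": "/utilities",        # the vanity path being tracked
    }, timeout=5)
    return redirect("http://www.focusonenergy.com/about/participating-utilities", code=302)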
I'm using the YouTube API v3 to automatically upload videos to my own channel. However, the script still needs manual intervention during OAuth 2.0 authorization. How can I make it completely automatic?
1) Access the API using username and password
2) Or find a way to create a permanent OAuth 2.0 authorization
P.S.: I use this script to upload:
https://developers.google.com/youtube/v3/guides/uploading_a_video
The only thing I can think of is web scraping. Basically, programmatically open the web page and get its HTML. Then find the authorization code and store it as a string. I don't know if your scripting language of choice can do it, but Python has Beautiful Soup (links at the bottom). The problem, of course, is accessing the contents of a page like that, which is pretty clearly designed to be reached by a logged-in user from a web browser. I've never done that, but there's the concept of a "login handshake" where you post the data the server needs as you access the page. I've put a few links at the bottom.
Anyway, to give you a better idea of what I mean (for those who may be confused), here's a rough sketch in Python using requests and Beautiful Soup:
import requests
from bs4 import BeautifulSoup

# Fetch the page (a real login/handshake step would normally come first).
page_html = requests.get("http://any-url.net").text

# Parse the HTML and find the element that holds the authorization code.
soup = BeautifulSoup(page_html, "html.parser")
oauth_message = soup.find("p", id="oAuthMessage")

# From there, figure out where the string containing the code is --
# probably just a substring of the <p> element's text.
auth_code = oauth_message.get_text().strip()
You'll have to look at the page source to know which tags to look for specifically, but this can all just be done programmatically/automatically, as you wanted.
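If the page does require a login first, a requests.Session can hold the cookies from the login POST; the login URL and form field names below are purely illustrative:
import requests

session = requests.Session()

# Hypothetical login form; the actual field names depend on the site being scraped.
session.post("http://any-url.net/login", data={"username": "me", "password": "secret"})

# Later requests reuse the session cookies set during the login.
page_html = session.get("http://any-url.net/oauth-page").text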
Links:
Login handshake - Scraping from a website that requires a login?
Beautiful Soup - http://www.crummy.com/software/BeautifulSoup/
google.gov/webScraping - https://www.google.com/search?ie=UTF-8&oe=utf-8&q=how+to+web+scrape+logged+in+page
You can use Google OAuth 2.0 for devices in order to have a fully automatic token renewal process.
So all you need to do is:
Request a device code and a confirmation code
Enter the confirmation code to confirm that your application has access to the specific account
Generate a new, or renew an existing, ACCESS_TOKEN for your device code
Upload the video using your device code and a valid ACCESS_TOKEN
Here is the documentation for it.
And here are some examples.
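A rough sketch of that flow in Python, assuming OAuth client credentials created for a limited-input device (the client ID/secret are placeholders; the endpoints are Google's standard device-flow URLs, so double-check them against the documentation):
import time
import requests

CLIENT_ID = "YOUR_CLIENT_ID"          # placeholder
CLIENT_SECRET = "YOUR_CLIENT_SECRET"  # placeholder
SCOPE = "https://www.googleapis.com/auth/youtube.upload"

# 1. Request a device code and a user confirmation code.
device = requests.post("https://oauth2.googleapis.com/device/code",
                       data={"client_id": CLIENT_ID, "scope": SCOPE}).json()
print("Visit", device["verification_url"], "and enter the code", device["user_code"])

# 2. Poll the token endpoint until access has been confirmed for the account.
while True:
    token = requests.post("https://oauth2.googleapis.com/token", data={
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "device_code": device["device_code"],
        "grant_type": "urn:ietf:params:oauth:grant-type:device_code",
    }).json()
    if "access_token" in token:
        break
    if token.get("error") not in ("authorization_pending", "slow_down"):
        raise RuntimeError(token)
    time.sleep(device.get("interval", 5))

# 3. From now on the script can renew the ACCESS_TOKEN with no manual step,
#    using the refresh token returned alongside it.
refreshed = requests.post("https://oauth2.googleapis.com/token", data={
    "client_id": CLIENT_ID,
    "client_secret": CLIENT_SECRET,
    "refresh_token": token["refresh_token"],
    "grant_type": "refresh_token",
}).json()
access_token = refreshed["access_token"]  # pass this to the upload script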
One of my sites is for old mobile phones that don't accept cookies, so it uses a URL-based Session ID.
However, Google is indexing the Session ID, so when my site is searched on Google, all the results come up with a specific Session ID.
On most occasions, that Session ID is no longer valid by the time a guest clicks on it, but I've had at least one case where a guest clicked on a link from Google and it actually logged them into someone else's account, which is obviously a huge security flaw.
So how can I keep Google from indexing the Session ID in my URLs? In case it helps, the Session ID has always been set to "Representative URL" in Google's Webmaster Tools.
You can do this by placing a robots.txt file in your root web directory to tell Googlebot and all other crawlers not to crawl URLs containing that parameter.
Here is an example:
Let's say the URL you want to block is in the form of:
http://www.mywebsite.com/page.html?id=1234
The robots.txt syntax to block URLs with the id parameter is:
User-agent: *
Disallow: /*?id=
You can find out more about robots.txt at http://www.robotstxt.org
Read more about this at http://www.seochat.com/c/a/Search-Engine-Optimization-Help/Preventing-Duplicate-Content-on-an-ECommerce-Site-from-Session-IDs/1/
Check this out: https://developers.google.com/search/docs/advanced/crawling/consolidate-duplicate-urls. You can set canonical URLs, and Googlebot will use that URL when crawling your webpage; this also solves duplicate-URL issues for the same page.
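For example, reusing the page.html URL from the robots.txt example above, the preferred URL goes in a link tag in the page's head, and crawlers then treat the parameterised variants as duplicates of it:
<link rel="canonical" href="http://www.mywebsite.com/page.html" />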
I've read that it is possible to access Facebook Insights programmatically:
The Graph API provides programmatic access to all of this data so you can integrate Platform data into your own, custom analytics systems.
I have two questions, though:
Is it possible to access data from a domain, using Facebook Insights for domains?
How do I get data from a public URL? I've written this small script that returns the number of shares for a given URL without using data from Facebook Insights for domains, but how do I get all possible information for a given URL (e.g. who shared it, who liked it, who commented, etc.)? Is this even possible?
To get insights for a domain, get the read_insights permission, then GET
https://graph.facebook.com/insights?domain=example.com&access_token=TOKEN
To get insights for a particular URL on your domain, GET
https://graph.facebook.com/?id=YOUR_URL&access_token=TOKEN
Not all of the data you want is available - for example, you can't get the UIDs of the users who shared and liked your URL, but you can get the count.
If you have the comments plugin embedded on your URL, you can get the UIDs of the users who have commented on your URL as comments in the plugin are always public.
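A small sketch of those two calls with Python's requests; the domain, URL and TOKEN are placeholders, and the exact fields returned depend on your permissions and the Graph API version:
import requests

TOKEN = "YOUR_ACCESS_TOKEN"  # needs read_insights for the domain call

# Insights for a whole domain
domain_insights = requests.get(
    "https://graph.facebook.com/insights",
    params={"domain": "example.com", "access_token": TOKEN},
).json()

# Share/like/comment counts for a particular URL on that domain
url_object = requests.get(
    "https://graph.facebook.com/",
    params={"id": "http://example.com/some-page", "access_token": TOKEN},
).json()

print(domain_insights)
print(url_object.get("shares"))  # counts only; UIDs of sharers/likers aren't exposed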