Ok, so here is the scenario:
I have an ActiveX control that uploads files using the HttpWebRequest class. My problem is that I have to specify network credentials in order to get the ActiveX control to work properly behind a proxy server.
Here is the code:
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(m_url);
req.Proxy = new WebProxy("http://myProxyServer:8080");
req.Proxy.Credentials = new NetworkCredential("user", "password", "domain");
How can I get this information from Internet Explorer with no (or minimal) user interaction?
Thank You :)
I managed to do it ;)
private static WebProxy QueryIEProxySettings(string strFileURL)
{
    // A fresh request picks up the system (Internet Explorer) proxy settings by default,
    // so GetProxy resolves the proxy URI that IE would use for this URL.
    HttpWebRequest WebReqt = (HttpWebRequest)WebRequest.Create(strFileURL);
    WebProxy WP = new WebProxy(WebReqt.Proxy.GetProxy(new Uri(strFileURL)));

    // Use the currently logged-on user's credentials instead of hard-coded ones.
    WP.Credentials = CredentialCache.DefaultCredentials;
    return WP;
}
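For completeness, a minimal sketch (untested) of wiring the detected settings into the original upload request; m_url is the upload URL from the first snippet:

// Reuse the proxy detected from the Internet Explorer settings and the logged-on
// user's default credentials instead of hard-coding a proxy address and password.
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(m_url);
req.Proxy = QueryIEProxySettings(m_url);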
I have developed an application with ASP.NET MVC 5 and used Facebook external authentication in it.
When I debug the application on the "localhost" domain, the Facebook login works well, but when I publish the application to the main server, AuthenticationManager.GetExternalLoginInfo() returns null and I get an error like this in the URL:
http://xxxxx.com/Account/ExternalLoginCallback?ReturnUrl=%2Fen&error=access_denied#_=_
I have set the "Site URL" to "http://xxxx.com" and the "Valid OAuth redirect URIs" to "http://xxxx.com/signin-facebook" in the Facebook developer console.
My setup in the Startup.Auth.cs file is:
var FacebookOptions = new Microsoft.Owin.Security.Facebook.FacebookAuthenticationOptions();
FacebookOptions.AppId = ConfigurationManager.AppSettings["Facebook_User_Key"];
FacebookOptions.AppSecret = ConfigurationManager.AppSettings["Facebook_Secret_Key"];
FacebookOptions.Provider = new Microsoft.Owin.Security.Facebook.FacebookAuthenticationProvider()
{
    OnAuthenticated = async context =>
    {
        context.Identity.AddClaim(new System.Security.Claims.Claim("FacebookAccessToken", context.AccessToken));
        foreach (var claim in context.User)
        {
            var claimType = string.Format("urn:facebook:{0}", claim.Key);
            string claimValue = claim.Value.ToString();
            if (!context.Identity.HasClaim(claimType, claimValue))
                context.Identity.AddClaim(new System.Security.Claims.Claim(claimType, claimValue, "XmlSchemaString", "Facebook"));
        }
    }
};
FacebookOptions.SignInAsAuthenticationType = DefaultAuthenticationTypes.ExternalCookie;
app.UseFacebookAuthentication(FacebookOptions);
I don't know why the external login fails only on the server with my main domain name. Please help me with this problem.
I encountered pretty much the same symptoms you describe:
In short:
Facebook authentication worked well on localhost, but after uploading the project to another server (and changing the site URL in the Facebook console), authentication did not succeed.
I would recommend you roll back to the MVC template code, and if that works, look for any changes you have made to the middleware code (Startup.Auth.cs).
In particular, pay attention to code that interacts with LOCAL configuration, such as disk I/O and OS permissions for local services.
My particular case:
Starting from the OWIN/Katana-based Visual Studio template for a Web API project, external login worked perfectly with the Facebook, Microsoft and Google OAuth middleware when testing on localhost.
Later I added some code to Startup.Auth.cs because I needed further authentication behavior.
So this was the original code:
public void ConfigureAuth(IAppBuilder app)
{
    // see WebAPI template of Visual Studio 2013/2015
    ...
    app.UseFacebookAuthentication(
        appId: 99999999,
        appSecret: *******);
}
and this was the replacement:
public void ConfigureAuth(IAppBuilder app)
{
    // see WebAPI template of Visual Studio 2013/2015
    ...
    app.UseFacebookAuthentication(GetFacebookAuth());
}

private FacebookAuthenticationOptions GetFacebookAuth()
{
    string picRequest =
        String.Format("/me/picture?redirect=false&width={0}&height={0}", ProfileInfoClaimsModel.PIC_SIDE_PX);
    var facebookProvider = new FacebookAuthenticationProvider()
    {
        OnAuthenticated = async (context) =>
        {
            var client = new FacebookClient(context.AccessToken);
            dynamic me = client.Get("/me?fields=id,name,locale");
            dynamic mePicture = client.Get(picRequest);
            // storing temporary social profile info TO A LOCAL FOLDER
            // uploading the local folder to a service WITH A LOCAL CREDENTIAL FILE
            ...
        }
    };
    var options = new FacebookAuthenticationOptions()
    {
        AppId = 0123456789,
        AppSecret = ******,
        Provider = facebookProvider,
    };
    return options;
}
You may notice that my comments make the problem obvious: the code points to local resources.
Then I published the project to a virtual server (an Amazon EC2 instance) running Windows Server 2012 with IIS 8.5.
From that moment I kept getting error=access_denied in the redirect from /signin-facebook.
I decided to follow this good old concept and go back to the original template code. Pretty soon I figured out that I had forgotten to configure the new server: for instance, the folder the code refers to did not exist, and the site had no permission to create it.
Obviously, that solved it.
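As an aside, here is a minimal sketch of the kind of defensive check that would have surfaced this earlier; the folder path is a made-up placeholder, not the project's actual path:

// Illustrative only: "profileCachePath" stands in for whatever local folder the
// OnAuthenticated handler writes the temporary profile info to.
string profileCachePath = @"C:\inetpub\myapp\profile-cache";
try
{
    // CreateDirectory is a no-op if the folder already exists.
    System.IO.Directory.CreateDirectory(profileCachePath);
}
catch (UnauthorizedAccessException ex)
{
    // On a freshly configured server the application pool identity may lack this
    // permission, which is the kind of failure that ended up surfacing as
    // error=access_denied in the OAuth callback. Log it instead of letting the
    // OnAuthenticated handler fail silently.
    System.Diagnostics.Trace.TraceError("Cannot create profile cache folder: " + ex);
}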
I'm using GitLab with an external issue tracker (JIRA), and it works well.
My problem is that when I create a new GitLab project (using the API), I have to go to the project's settings in GitLab, manually select the issue tracker I want to use, and manually enter the project ID from my external issue tracker.
This screenshot of the project settings page makes it clearer (image originally hosted on bayimg.com):
(The two fields I am talking about are "Issue tracker" and "Project name or id in issues tracker".)
So here is my question: is there any way to set these two fields automatically, using the API or otherwise? Currently, the GitLab API documentation does not mention anything about external issue tracker settings.
This code helped me automatically set GitLab's external issue tracker settings, using Apache HttpClient and Jsoup.
This code is absolutely not 100% good, but it shows the main idea, which is to recreate the corresponding POST request that the web form sends.
// 1 - Prepare the HttpClient object :
BasicCookieStore cookieStore = new BasicCookieStore();
LaxRedirectStrategy redirectStrategy = new LaxRedirectStrategy();
CloseableHttpClient httpclient = HttpClients.custom()
        .setDefaultCookieStore(cookieStore)
        .setRedirectStrategy(redirectStrategy)
        .build();
try {
    // 2 - Get the "CSRF token" from a <meta> tag of the project's edit page :
    HttpUriRequest getCsrfToken = RequestBuilder.get()
            .setUri(new URI("http://localhost/_NAMESPACE_/_PROJECT_NAME_/edit"))
            .build();
    CloseableHttpResponse responseCsrf = httpclient.execute(getCsrfToken);
    try {
        HttpEntity entity = responseCsrf.getEntity();
        Document doc = Jsoup.parse(EntityUtils.toString(entity));
        String csrf_token = doc.getElementsByAttributeValue("name", "csrf-token").get(0).attr("content");

        // 3 - Fill and submit the "edit" form with the new values :
        HttpUriRequest updateIssueTracker = RequestBuilder
                .post()
                .setUri(new URI("http://localhost/_NAMESPACE_/_PROJECT_NAME_"))
                .addParameter("authenticity_token", csrf_token)
                .addParameter("private_token", "_MY_PRIVATE_TOKEN_")
                .addParameter("_method", "patch")
                .addParameter("commit", "Save changes")
                .addParameter("utf8", "✓")
                .addParameter("project[issues_tracker]", "jira")
                .addParameter("project[issues_tracker_id]", "_MY_JIRA_PROJECT_NAME_")
                .addParameter("project[name]", "...")
                ...
                .build();
        CloseableHttpResponse responseSubmit = httpclient.execute(updateIssueTracker);
        responseSubmit.close();
    } finally {
        responseCsrf.close();
    }
} finally {
    httpclient.close();
}
Change _NAMESPACE_/_PROJECT_NAME_ so that it corresponds to your project URL, replace _MY_PRIVATE_TOKEN_ with your admin account's token, and replace _MY_JIRA_PROJECT_NAME_ with... your JIRA project's name.
I have a simple Web API project based on this example:
http://aspnet.codeplex.com/sourcecontrol/latest#Samples/WebApi/RelaySample/Program.cs
However, in the above sample the relay works with a local server; in my project the relay works with an external web server at a real address, companyX.com.
I am using the relay service (or web proxy service) through a browser; for example, the browser requests relayService.com/companyX, and the relay service responds with data from the external companyX.com site.
The relay works great; however, some headers are not correct, and I need to see what the HttpClient is sending to the remote companyX.com server.
In Fiddler or Charles, only the request/response from my browser to relayService.com is listed; the request/response from the HttpClient to companyX.com never shows up.
relayService.com is running locally on my machine in IIS 7; I'm using the hosts file to direct traffic to relayService.com.
I have tried several variations of the following when creating the HttpClient:
var clientHandler = new HttpClientHandler
{
    CookieContainer = cookies,
    UseCookies = true,
    UseDefaultCredentials = false,
    Proxy = new WebProxy("http://localhost:8888"),
    UseProxy = true,
    AutomaticDecompression = DecompressionMethods.GZip,
    AllowAutoRedirect = false,
    ClientCertificateOptions = ClientCertificateOption.Automatic
};
HttpClient client = new HttpClient(clientHandler);
UPDATE
If I change to UseProxy = false, the service continues to work whether Fiddler is open or closed.
With UseProxy = true the service fails. If Fiddler is open, I get the following error:
Object reference not set to an instance of an object.
at System.DomainNameHelper.IdnEquivalent(String hostname)
at System.Net.HttpWebRequest.GetSafeHostAndPort(Uri sourceUri, Boolean addDefaultPort, Boolean forcePunycode)
at System.Net.HttpWebRequest.GenerateProxyRequestLine(Int32 headersSize)
at System.Net.HttpWebRequest.SerializeHeaders()
at System.Net.HttpWebRequest.EndSubmitRequest()
at System.Net.Connection.CompleteConnection(Boolean async, HttpWebRequest request)
With UseProxy = true and Fiddler CLOSED, I get the following (obvious) error:
No connection could be made because the target machine actively refused it 127.0.0.1:8888
In the same solution I am using HttpWebRequest to download data from the web, and that does show up in Fiddler, so it seems to be an issue with HttpClient.GetAsync().
I have tried this on two machines with identical results.
I have been struggling with this all day; any help would be much appreciated.
Just to recap:
* relayService.com is running locally on my machine, in IIS 7
* the hosts file has "127.0.0.1 relayService.com"
* relayService.com is an MVC Web API site that uses HttpClient.GetAsync() to download content from the live web
* Fiddler/Charles is running locally on the same machine
* browser traffic to the local relayService.com appears in Fiddler/Charles
* HttpClient.GetAsync() traffic to the live web does not appear in Fiddler/Charles
* Fiddler/Charles are both up-to-date versions
Thanks again
You don't need anything in your HOSTS file if you're using Fiddler; you can use Fiddler's Tools > HOSTS to redirect traffic anywhere you'd like.
When trying to capture traffic from a service account (e.g. the ASP.NET account), you typically want to configure that account to use the proxy; see http://fiddler2.com/blog/blog/2013/01/08/capturing-traffic-from-.net-services-with-fiddler for details on that. If you do that, you shouldn't need to configure the proxy object directly in code.
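For reference, the configuration-based approach typically boils down to a defaultProxy element in the service's web.config; this is only a sketch of the standard .NET setting (assuming Fiddler is listening on its default port 8888), not a quote from that post:

<!-- Route all System.Net traffic from the site through Fiddler. -->
<system.net>
  <defaultProxy enabled="true">
    <proxy proxyaddress="http://127.0.0.1:8888" bypassonlocal="false" />
  </defaultProxy>
</system.net>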
The exception you've shown suggests that there's a bug in the GenerateProxyRequestLine function. Is there any change if you update new WebProxy("http://localhost:8888") to new WebProxy("127.0.0.1", 8888)?
Generally speaking, .NET applications will bypass the proxy for URLs pointed at //localhost or //127.0.0.1, so when debugging with Fiddler it's common to use a service URL of //localhost.fiddler so that the traffic is always sent to the proxy.
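Putting the two suggestions above together, a minimal, self-contained sketch (the /companyX route is taken from the question and is only illustrative):

using System;
using System.Net;
using System.Net.Http;

class FiddlerProxyDemo
{
    static void Main()
    {
        // Host/port overload suggested above; assumes Fiddler is listening on 8888.
        var handler = new HttpClientHandler
        {
            Proxy = new WebProxy("127.0.0.1", 8888),
            UseProxy = true
        };

        using (var client = new HttpClient(handler))
        {
            // "localhost.fiddler" avoids .NET's proxy bypass for localhost/127.0.0.1,
            // so a request to the locally hosted relay is still visible in Fiddler.
            var response = client.GetAsync("http://localhost.fiddler/companyX").Result;
            Console.WriteLine(response.StatusCode);
        }
    }
}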
I fixed the problem by making the HttpClient static.
This original version works fine (for the program's functionality) but has the problem with Fiddler described above, where trying to use the proxy throws an error:
private HttpClient _client()
{
    var clientHandler = new HttpClientHandler
    {
        UseCookies = true,
        UseDefaultCredentials = false,
        Proxy = new WebProxy("http://localhost:8888"),
        UseProxy = true,
        AutomaticDecompression = DecompressionMethods.GZip,
        AllowAutoRedirect = true,
        ClientCertificateOptions = ClientCertificateOption.Automatic
    };
    HttpClient client = new HttpClient(clientHandler);
    client.Timeout = TimeSpan.FromMinutes(20);
    return client;
}
The client was then used with:
using (HttpResponseMessage serviceResponse = await _client().GetAsync(getURL(), HttpCompletionOption.ResponseHeadersRead))
{
    // Return response
}
However, the below also works and all traffic shows up in Fiddler!
private static readonly HttpClientHandler _clientHandler = new HttpClientHandler()
{
    //CookieContainer = cookies,
    UseCookies = true,
    UseDefaultCredentials = false,
    Proxy = new WebProxy("http://localhost:8888"),
    UseProxy = false,
    AutomaticDecompression = DecompressionMethods.GZip,
    AllowAutoRedirect = false,
    ClientCertificateOptions = ClientCertificateOption.Automatic,
};

//Create a shared instance of HttpClient and set the request timeout
private static readonly HttpClient _client = new HttpClient(_clientHandler)
{
    Timeout = TimeSpan.FromMinutes(20)
};
The client was used the same way (the only difference is removing the '()' after _client):
using (HttpResponseMessage serviceResponse = await _client.GetAsync(getURL(), HttpCompletionOption.ResponseHeadersRead))
{
    // Return response
}
I have no clue why this works. Any insight would be appreciated.
Thanks,
So typically, if you install Neo4j in your development environment, you will have a locally hosted version of the Neo4j server, which you can usually browse at localhost:7474/db/data.
Your code is like this:
var client = new GraphClient(new Uri("http://localhost:7474/db/data"));
client.Connect();
However, one day you will want to connect to your cloud-based Neo4j server (Heroku, Azure, etc.).
Of course, that means you will have to provide network credentials.
If you do it by hand with HttpWebRequest, it could look like this:
var http = (HttpWebRequest)WebRequest.Create(new Uri("http://<<your_REST_query"));
var cred = new NetworkCredential("yourusername", "yourpassword");
http.Credentials = cred;
var response = http.GetResponse();
var stream = response.GetResponseStream();
But how can I include network credentials when connecting with Neo4jClient? Or is there another option that I don't know about?
We support the standard URI syntax for basic authentication credentials:
var client = new GraphClient(new Uri("http://user:pass#localhost:7474/db/data"));
From version 1.1.0.0:
var username = "app_username";
var password = "1#mGr#phG0d";
var client = new GraphClient(new Uri("http://localhost:7474/db/data"), username, password);
I would like to build a simple REST web service (using Ruby on Rails). However, I would like to be able to call this service from a Windows Mobile app. Is that possible, or do I have to use SOAP?
I don't have much experience with Windows Mobile apps, so it would be nice if you could provide pseudocode or a link to a tutorial for this scenario.
Thanks,
Tam
Yes you can. I've done it a lot using the Win32 WinInet API.
You can also do it in C# using the System.Net HttpWebRequest API.
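A minimal sketch of a POST with HttpWebRequest (the URL and form body are placeholders for whatever your Rails routes expect; the same pattern is available on the .NET Compact Framework):

// Sketch only: post a form-encoded body to a hypothetical Rails resource URL.
string postData = "item[name]=test";
byte[] body = System.Text.Encoding.UTF8.GetBytes(postData);

HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://example.com/items");
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = body.Length;

using (Stream requestStream = request.GetRequestStream())
{
    requestStream.Write(body, 0, body.Length);
}

using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    // The Rails action can render JSON or XML; parse the result as needed.
    string result = reader.ReadToEnd();
}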
dim sendUrl : sendUrl = baseUrl & url
dim objXML : Set objXML = CreateObject("MSXML2.ServerXMLHTTP.6.0")
objXML.open "GET", sendUrl, false
objXML.setRequestHeader "Content-Type", "application/x-www-form-urlencoded"
objXML.send(sendxml)
HttpPost = objXml.responseText
Set objXML = nothing
On the desktop, Microsoft offers a COM interface which can be used to call REST APIs.
Maybe this also exists on Windows Mobile.
Here's an example of using an HttpWebRequest to call the Twitter search API, HTH:
Uri uri = new Uri("http://search.twitter.com/search.json?q=twitter");
String result = String.Empty;
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);

using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    using (Stream responseStream = response.GetResponseStream())
    {
        using (StreamReader readStream = new StreamReader(responseStream, Encoding.UTF8))
        {
            result = readStream.ReadToEnd();
        }
    }
}