We have a requirement to get the printer IP address configured for the default printer in Control Panel from our UWP app.
I was able to retrieve "System.DeviceInterface.PrinterPortName" by filtering on the printer interface class GUID and requesting that property.
But I couldn't get "System.Devices.IpAddress" the same way.
I really need the IP address, because the port name is the user's choice and could be renamed to something that no longer contains the IP address.
Could you share working code to retrieve the IP address via that property, or any other way, in a UWP app?
Below is the working code for the port name; I would like to fetch the IP address of the same port similarly.
// Enumerate printer device interfaces and read the port name of the default printer.
string aqsFilter = "System.Devices.InterfaceClassGuid:=\"{0ecef634-6ef0-472a-8085-5ad023ecbccd}\"";
string[] propertiesToRetrieve = new string[] { "System.DeviceInterface.PrinterPortName" };

DeviceInformationCollection deviceInfoCollection =
    await DeviceInformation.FindAllAsync(aqsFilter, propertiesToRetrieve);

foreach (DeviceInformation deviceInfo in deviceInfoCollection)
{
    if (deviceInfo.IsDefault)
    {
        string strPortName = (string)deviceInfo.Properties["System.DeviceInterface.PrinterPortName"];
        if (!string.IsNullOrEmpty(strPortName))
        {
            // The port name often embeds the IP address (e.g. a standard TCP/IP port), so try to parse it out.
            strPortName = await ParsePortName(strPortName);
            if (!string.IsNullOrEmpty(strPortName))
            {
                _strIPAddress = strPortName;
            }
        }
        break;
    }
}
Retrieving the IP address this way is not endorsed, because the address can change and is therefore unreliable.
That said, if your printer is installed using WSD (Web Services on Devices), the property is technically supported, e.g.:
DEVPKEY_PNPX_IpAddress DEVPROP_TYPE_STRING_LIST 32 "10.137.192.202"
But there is no way to use this reliably without a lot of scenario-specific checks, since the IP address may change.
Furthermore, looking at this example, you are not hitting the DAF providers but enumerating devices: 0ecef634-6ef0-472a-8085-5ad023ecbccd is the printer interface class GUID. It also does not look like the IP address is propagated into the PnP Explorer property bag, so the IP address is not accessible from there.
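For what it's worth, here is a minimal sketch of how you could at least request the property alongside the port name. This is an assumption based on the discussion above, not a confirmed working path: "System.Devices.IpAddress" is typically only populated for WSD-installed printers (where it corresponds to DEVPKEY_PNPX_IpAddress) and will usually come back empty otherwise.

string aqsFilter = "System.Devices.InterfaceClassGuid:=\"{0ecef634-6ef0-472a-8085-5ad023ecbccd}\"";
string[] propertiesToRetrieve = new string[]
{
    "System.DeviceInterface.PrinterPortName",
    "System.Devices.IpAddress"   // assumption: may be absent or empty for non-WSD ports
};

DeviceInformationCollection deviceInfoCollection =
    await DeviceInformation.FindAllAsync(aqsFilter, propertiesToRetrieve);

foreach (DeviceInformation deviceInfo in deviceInfoCollection)
{
    if (!deviceInfo.IsDefault) continue;

    if (deviceInfo.Properties.TryGetValue("System.Devices.IpAddress", out object ipValue) && ipValue != null)
    {
        // PNPX exposes the value as a string list; handle both shapes defensively.
        if (ipValue is string[] addresses && addresses.Length > 0)
        {
            _strIPAddress = addresses[0];
        }
        else if (ipValue is string single && !string.IsNullOrEmpty(single))
        {
            _strIPAddress = single;
        }
    }
    break;
}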
I'm trying to make a program that sends SNMP queries to some switches in the network.
Using the Net-snmp tools, I can send get requests to the switch using its name, and it works fine. But SNMP4J requires an IP address in CommunityTarget, so I get an IllegalArgumentException.
This is the relevant part of the code:
TransportMapping transport = new DefaultUdpTransportMapping();
transport.listen();
CommunityTarget comtarget = new CommunityTarget();
comtarget.setCommunity(new OctetString("public"));
comtarget.setVersion(SnmpConstants.version1);
comtarget.setAddress(new UdpAddress("switchName")); // exception happens here
comtarget.setRetries(2);
comtarget.setTimeout(1000);
How can I work around this?
You can get the IP address through DNS resolution:
InetAddress address = InetAddress.getByName(switchName);
System.out.println(address.getHostAddress());
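From there, a minimal sketch of wiring the resolved address into the SNMP4J target from the question (assuming the standard SNMP port 161) would be:

import java.net.InetAddress;

import org.snmp4j.CommunityTarget;
import org.snmp4j.mp.SnmpConstants;
import org.snmp4j.smi.OctetString;
import org.snmp4j.smi.UdpAddress;

// Resolve the switch's host name via DNS, then build the target from the InetAddress.
InetAddress resolved = InetAddress.getByName(switchName);   // may throw UnknownHostException
CommunityTarget comtarget = new CommunityTarget();
comtarget.setCommunity(new OctetString("public"));
comtarget.setVersion(SnmpConstants.version1);
comtarget.setAddress(new UdpAddress(resolved, 161));         // 161 = default SNMP port
comtarget.setRetries(2);
comtarget.setTimeout(1000);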
I'm trying to use the Valence API to call a few methods. I'm authenticating using https://apitesttool.desire2learnvalence.com, from which I get a User ID and User Key. Now I'm confused about what to pass in the x_a - x_d parameters to get the organization info.
Whatever I pass, I get a 403 Forbidden and an incorrect-token exception.
Could somebody please help? I'm passing the following in the parameters:
x_a : Application ID
x_b : User ID (I got this from https://apitesttool.desire2learnvalence.com)
x_c : the result of the method below, where key is the App Key:

private String calculateParameterExpectation(String key, String httpMethod, String apiPath, long timestamp)
{
    String unsignedResult = String.format("%s&%s&%s", httpMethod, apiPath, timestamp);
    System.out.println(unsignedResult);
    String signedResult = D2LSigner.getBase64HashString(key, unsignedResult);
    return signedResult;
}

x_d : the result of the same method, where key is the User Key (signature) that I got from https://apitesttool.desire2learnvalence.com
I'm not sure what I'm doing wrong.
Please note that each back-end service generates a unique UserID/Key pair to go with each user and each application ID, upon request by a call from that application ID.
This explicitly means that User ID/Key pairs are not transferable from one application to another, nor from one back-end service to another -- every API-using application should request its own User ID/Key pair for making calls on behalf of each distinct user. Even if you used your App ID/Key with the api-test tool, unless you pointed the tool at the same back-end service you're actually making the API calls against, you won't get back a User ID/Key pair you can use later for making the API calls (against another service).
Please also note that the signing mechanism requires that you use the upper-case version of the http-method string (thus GET, not get), and it requires that you use the lower-case version of the api path string (thus /d2l/auth/api/token, not /D2L/AUTH/API/TOKEN). If you're pointing the api-test tool at the same LMS you're wanting to make API calls against, and you're using the same App ID/Key pair with the api-test tool as you're using in your production code, then I would seek to make sure that you're formatting your base string exactly right for signing.
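For illustration, using the helper from the question, here is a sketch of what those casing rules mean in practice (the appKey/userKey variable names and the seconds-based timestamp are assumptions for the example):

// Upper-case HTTP method, lower-case API path.
String httpMethod = "GET";                               // not "get"
String apiPath = "/d2l/auth/api/token";                  // not "/D2L/AUTH/API/TOKEN"
long timestamp = System.currentTimeMillis() / 1000L;     // assumed: Unix timestamp in seconds

// x_c is signed with the App Key, x_d with the User Key.
String xc = calculateParameterExpectation(appKey, httpMethod, apiPath, timestamp);
String xd = calculateParameterExpectation(userKey, httpMethod, apiPath, timestamp);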
I would also encourage you to make fuller use of D2L's own client SDK libraries for the app/user context management and signing, rather than just using the raw signing call from within the library.
How can I stop bad, unidentified bots from crawling my website? Some bad bots whose names don't show up in cPanel for Apache are eating my website's bandwidth.
I have tried robots.txt at batgap.com/robots.txt and have also blocked with .htaccess, but there is no improvement in bandwidth usage. I don't know the IP addresses of those bots, so I can't block them by IP. These bots are consuming so much of the site's bandwidth that I have had to increase the allowance on the server.
I'm from Incapsula and we deal with bad bots on a regular basis.
We've recently released bot-related research that provides insight into the scope of the problem ( http://www.incapsula.com/the-incapsula-blog/item/225-what-google-doesnt-show-you-31-of-website-traffic-can-harm-your-business ), and in light of this data I have to agree with @Leonard Challis - you simply cannot handle bot protection manually.
Having said that, there are bot protection solutions, even free ones (ours included), that can help you with bad bots.
BTW - just as you mentioned, one byproduct of bad-bot visits is a loss of bandwidth.
We've recently become aware of just how surprisingly huge bot-related bandwidth usage really is.
This is an interesting topic by itself.
We believe that by avoiding bad-bot traffic, hosting providers can actually greatly improve their efficiency (hopefully using this to drop costs or improve services). Once you consider the social and business implications of this, you can understand the real scope of the bad-bot problem, which goes way beyond the immediate damage done.
I block 'bad bots' by using PHP.
I filter by IP address primarily, then by User-Agent secondarily.
I make the 'bad bot' wait for up to 999 seconds, then return a very small web page.
Usually (always) the connection times out and zero (0) bytes are returned.
Best of all, I have delayed them for a few minutes before they get to the next victim.
http://gelm.net/How-to-block-Baidu-with-PHP.htm
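A rough sketch of that approach in PHP (the IP prefix and User-Agent names below are placeholders, not recommendations):

<?php
// Tarpit sketch: if the visitor matches a bad IP prefix or User-Agent keyword,
// stall the connection for a long time and then return an almost-empty page.
$bad_ips    = array('203.0.113.');                  // placeholder IP prefix
$bad_agents = array('Baiduspider', 'AhrefsBot');    // placeholder bot names

$ip    = $_SERVER['REMOTE_ADDR'];
$agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

$is_bad = false;
foreach ($bad_ips as $prefix) {
    if (strpos($ip, $prefix) === 0) { $is_bad = true; break; }
}
if (!$is_bad) {
    foreach ($bad_agents as $name) {
        if (stripos($agent, $name) !== false) { $is_bad = true; break; }
    }
}

if ($is_bad) {
    set_time_limit(0);   // let the script outlive the default execution limit
    sleep(999);          // most bots give up (time out) long before this returns
    echo '<html></html>';
    exit;
}
?>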
Unfortunately robots.txt is sometimes ignored by these "bad bots", though if the problem is more genuine search-engine spiders that you don't want to see, they ought to take it into account. I presume with cPanel you can get into the web server (Apache) logs? In there you can look for two things: the IP and the User-Agent. You can find the culprits there and add them to your robots.txt and .htaccess. Note that .htaccess rules denying IP addresses are far better than just relying on robots.txt, because you are taking the choice out of the bot creator's hands.
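For example, a small .htaccess fragment (Apache 2.2-style directives; the IP range and bot names are placeholders) that denies by both User-Agent and IP could look like this:

# Mark requests from unwanted User-Agents (placeholder names)
SetEnvIfNoCase User-Agent "AhrefsBot" bad_bot
SetEnvIfNoCase User-Agent "SemrushBot" bad_bot

# Deny marked requests and a placeholder IP range
Order Allow,Deny
Allow from all
Deny from env=bad_bot
Deny from 203.0.113.0/24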
If you know specific bots which are doing this you should be able to get IP addresses and user-agents from forums, but if it's a more general thing then really I'm afraid it's more of a manual job.
There are other methods that can be used with varying effect, such as mod_security (http://www.askapache.com/htaccess/modsecurity-htaccess-tricks.html) but this will mean you'll have to access your web server configuration.
Finally, you can check the links that point to your web site (using the link: operator on Google). Sometimes, if you have links on spammy forums or the like, this can increase the chances of bots coming to get you. Maybe you can look at the referer URL in the Apache logs - but this is all based on a lot of presumptions, and you'd probably be lucky if it had a great effect.
Block Unwanted Robots/Spiders visitors via PHP
Instructions:
Place the following PHP code at the beginning of your index.php file.
The idea is to put the code in the main site's PHP home page, the main entry point of the site.
If you have other PHP files that are accessed directly via a URL (not counting PHP include or require support files), then place the code at the beginning of those files as well.
For most PHP sites and PHP CMS sites, the root index.php file is the main entry point of the site.
Keep in mind that your site statistics, e.g. AWStats, will still log the hits under Unknown robot (identified by 'bot' followed by a space or one of the following characters _+:,.;/-), but these bots will be blocked from accessing your site's content.
<?php
// ---------------------------------------------------------------------------------------------------------------
// Banned IP Addresses and Bots - Redirects banned visitors who make it past the .htaccess and/or robots.txt files to a URL.
// The $banned_ip_addresses array can contain both full and partial IP addresses, i.e. Full = 123.456.789.101, Partial = 123.456.789. or 123.456. or 123.
// Use partial IP addresses to include all IP addresses that begin with that partial IP address. Partial IP addresses must end with a period.
// The $banned_bots, $banned_unknown_bots, and $good_bots arrays should contain keyword strings found within the User-Agent string.
// The $banned_unknown_bots array is used to identify unknown robots (identified by 'bot' followed by a space or one of the following characters _+:,.;/\-).
// The $good_bots array contains keyword strings used as exemptions when checking for $banned_unknown_bots. If you do not want to use the $good_bots array,
// i.e. $good_bots = array(), then you must remove the keyword strings 'bot.','bot/','bot-' from the $banned_unknown_bots array or else the good bots will also be banned.

$banned_ip_addresses = array('41.','64.79.100.23','5.254.97.75','148.251.236.167','88.180.102.124','62.210.172.77','45.','195.206.253.146');
$banned_bots         = array('.ru','AhrefsBot','crawl','crawler','DotBot','linkdex','majestic','meanpath','PageAnalyzer','robot','rogerbot','semalt','SeznamBot','spider');
$banned_unknown_bots = array('bot ','bot_','bot+','bot:','bot,','bot;','bot\\','bot.','bot/','bot-');
$good_bots           = array('Google','MSN','bing','Slurp','Yahoo','DuckDuck');
$banned_redirect_url = 'http://english-1329329990.spampoison.com';

// Visitor's IP address and browser (User-Agent)
$ip_address = $_SERVER['REMOTE_ADDR'];
$browser    = $_SERVER['HTTP_USER_AGENT'];

// Declared temporary variables
$ipfound = $piece = $botfound = $gbotfound = $ubotfound = '';

// Checks for banned IP addresses and bots
if($banned_redirect_url != ''){

    // Checks for banned IP addresses
    if(!empty($banned_ip_addresses)){
        if(in_array($ip_address, $banned_ip_addresses)){$ipfound = 'found';}
        if($ipfound != 'found'){
            $ip_pieces = explode('.', $ip_address);
            foreach ($ip_pieces as $value){
                $piece = $piece.$value.'.';
                if(in_array($piece, $banned_ip_addresses)){$ipfound = 'found'; break;}
            }
        }
        if($ipfound == 'found'){header("Location: $banned_redirect_url"); exit();}
    }

    // Checks for banned bots
    if(!empty($banned_bots)){
        foreach ($banned_bots as $bbvalue){
            $pos1 = stripos($browser, $bbvalue);
            if($pos1 !== false){$botfound = 'found'; break;}
        }
        if($botfound == 'found'){header("Location: $banned_redirect_url"); exit();}
    }

    // Checks for banned unknown bots (skipped for known good bots)
    if(!empty($good_bots)){
        foreach ($good_bots as $gbvalue){
            $pos2 = stripos($browser, $gbvalue);
            if($pos2 !== false){$gbotfound = 'found'; break;}
        }
    }
    if($gbotfound != 'found'){
        if(!empty($banned_unknown_bots)){
            foreach ($banned_unknown_bots as $bubvalue){
                $pos3 = stripos($browser, $bubvalue);
                if($pos3 !== false){$ubotfound = 'found'; break;}
            }
            if($ubotfound == 'found'){header("Location: $banned_redirect_url"); exit();}
        }
    }
}
// ---------------------------------------------------------------------------------------------------------------
?>
First of all, a kind user named "leppie" tried to help me, but I couldn't get the answer I'm looking for, and it's kind of an urgent matter.
I run a Windows service on Windows 7 under the LocalSystem account. Since this service will be installed on many computers remotely and silently, I guess I need to use LocalSystem in ServiceInstaller.Designer.cs, with the code below:
this.ProcessInstaller.Account = System.ServiceProcess.ServiceAccount.LocalSystem;
this.ProcessInstaller.Password = null;
this.ProcessInstaller.Username = null;
When I run this Windows service, the code below cannot get the currently logged-in user's credentials (the users do not have admin privileges, not even me).
// Look up the current user's e-mail address in Active Directory.
using (DirectoryEntry de = new DirectoryEntry("LDAP://MyDomainName"))
{
    using (DirectorySearcher adSearch = new DirectorySearcher(de))
    {
        // Environment.UserName is expected to be the logged-in user here.
        adSearch.Filter = "(sAMAccountName=" + Environment.UserName + ")";
        SearchResult adSearchResult = adSearch.FindOne();
        UserInternalEmail = GetProperty(adSearchResult, "mail");
    }
}
It has been suggested that I run the Windows service under an AD/LDAP/domain account, but which user should that be?
this.ProcessInstaller.Account = System.ServiceProcess.ServiceAccount.<User ? LocalService ? NetworkService>;
this.ProcessInstaller.Password = "adminpassword";
this.ProcessInstaller.Username = "adminusername";
I mean, let's say a user ABC is an admin, and let's say I knew this ABC admin's username and password; when that admin changes the password, I think it will affect my Windows service, which will be running on 70 computers.
Is there a way to retrieve the user's details from Active Directory? I would really appreciate it if you could provide some code samples.
Thank you very much.
The problem is that Environment.UserName will always return the username of the service account under which the service is running, not the user logged into the machine.
See this question for information on how to get the names of users logged into the workstation. Keep in mind that Windows will allow multiple users to be logged in at the same time.
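As a rough sketch of one common approach (assuming a single interactive user, a reference to System.Management, and the GetProperty helper and UserInternalEmail field from the question): a LocalSystem service can ask WMI which account owns the console session via Win32_ComputerSystem.UserName, then look that account up in Active Directory.

using System;
using System.DirectoryServices;
using System.Management;

// Ask WMI for the interactively logged-on user (returned as DOMAIN\username, or null if nobody is logged on).
string loggedOnUser = null;
using (var searcher = new ManagementObjectSearcher("SELECT UserName FROM Win32_ComputerSystem"))
{
    foreach (ManagementObject mo in searcher.Get())
    {
        loggedOnUser = mo["UserName"] as string;
    }
}

if (!string.IsNullOrEmpty(loggedOnUser))
{
    // Strip the DOMAIN\ prefix to get the sAMAccountName.
    string[] parts = loggedOnUser.Split('\\');
    string samAccountName = parts[parts.Length - 1];

    using (DirectoryEntry de = new DirectoryEntry("LDAP://MyDomainName"))
    using (DirectorySearcher adSearch = new DirectorySearcher(de))
    {
        adSearch.Filter = "(sAMAccountName=" + samAccountName + ")";
        SearchResult adSearchResult = adSearch.FindOne();
        if (adSearchResult != null)
        {
            UserInternalEmail = GetProperty(adSearchResult, "mail");
        }
    }
}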