Webroot Endpoint Protection set an active policy as follows:
Policy Name: Extensions
Policy Value: Install "https://anywhere.webrootcloudav.com/wtsff/live/latest.xpi"
It triggers the "Your browser is being managed by your organization" flag in settings.
It also reinstalls the Webroot extension after removal.
Webroot is managed by IT, but I do not trust it with my browsing habits because of its poor reputation in the past.
The policy is not located in the registry under HKLM\Software\Policies, where such policies were found in the past.
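For completeness, here is a quick sketch of checking the two usual policy sources on Windows (the .xpi URL suggests this is Firefox's policy engine, and about:policies lists whatever is active; the install path below is an assumption):

import os
import winreg

# Check the registry policy branches Firefox reads on Windows.
for hive, name in ((winreg.HKEY_LOCAL_MACHINE, "HKLM"), (winreg.HKEY_CURRENT_USER, "HKCU")):
    try:
        with winreg.OpenKey(hive, r"Software\Policies\Mozilla\Firefox"):
            print(name + r"\Software\Policies\Mozilla\Firefox exists")
    except FileNotFoundError:
        print("No Firefox policy branch under " + name)

# Check for a policies.json dropped into the install's distribution folder (default path assumed).
policies_json = r"C:\Program Files\Mozilla Firefox\distribution\policies.json"
if os.path.exists(policies_json):
    with open(policies_json, encoding="utf-8") as f:
        print(f.read())
else:
    print("No distribution\\policies.json at the default install path")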
Thanks for your help.
I'm getting an error trying to set DNS zone permissions. I'm following the official instructions for a feature released to GA on Dec 5th 2022, "Cloud DNS per resource IAM permissions", and trying to resolve the issue described in this post. Unfortunately I always get "You don't have permission to edit the permissions of the selected resource", despite the fact that my account seems to have suitable roles.
What I did was:
assign "DNS Administrator" role to my account in IAM (see: my IAM roles)
create a public managed zone in Cloud DNS
tick the checkbox for the zone and click the "Permissions" button at the top
The result is this permissions warning.
Still, I believe my account is granted the "DNS Administrator" role, as suggested in the official instructions.
Is this something related to IAM roles/permissions or Cloud DNS issue?
Edit: I don't know how to get it working with the DNS roles, but when I was assigned the Owner role I was able to edit the DNS zone permissions.
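For what it's worth, here is a rough way to double-check the project-level binding from outside the console (a sketch, assuming Application Default Credentials and the google-api-python-client package; the project ID is a placeholder, and "DNS Administrator" corresponds to roles/dns.admin):

from googleapiclient.discovery import build

PROJECT_ID = "my-project-id"  # placeholder

# Read the project-level IAM policy and print who holds the DNS Administrator role.
crm = build("cloudresourcemanager", "v1")
policy = crm.projects().getIamPolicy(resource=PROJECT_ID, body={}).execute()
for binding in policy.get("bindings", []):
    if binding["role"] == "roles/dns.admin":
        print("roles/dns.admin members:", binding.get("members", []))

Note this only shows the project-level policy; it says nothing about which extra permissions the console's per-zone "Permissions" panel itself requires.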
I am using Visual Studio 2019. While building a solution it asks for credentials and the build fails: api.nuget.org is asking for a username and password.
Note: I am using the default public NuGet package source.
Zscaler is installed on my system and was blocking the NuGet URI, hence NuGet was prompting for credentials. When I went through a proxy, it didn't prompt for credentials.
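If you want to confirm whether the feed itself is being intercepted, a quick reachability test against the public v3 service index can help (a sketch, assuming Python with the requests package); a 401/403 status, an HTML body, or proxy headers usually point at Zscaler or a group policy rather than NuGet itself:

import requests

# Fetch the public NuGet v3 service index and show what actually comes back.
resp = requests.get("https://api.nuget.org/v3/index.json", timeout=30)
print(resp.status_code)
print(resp.headers.get("Server", ""), resp.headers.get("Via", ""))
print(resp.text[:200])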
According to the official thread on Visual Studio's Developer Community, you should consider signing out of all accounts (top-right corner of Visual Studio). This should solve your issue.
After clicking on your profile, go to Account settings... and you'll see this prompt:
Simply remove all the accounts & enjoy!
Note that this problem has a good chance of being related to NuGet package installation permissions.
The answer here, for me, was the blocking of downloadable executable files by group policy.
.nupkg was classified as an executable or simply not whitelisted, so a group policy (a company-enforced internet setting) is what was causing the 403 error (on the command line) and this password prompt for api.nuget.org.
The password prompt doesn't really make sense in my context, but I suppose if I were a network admin and entered my network password, it might have worked.
I have recently built a two-tier PKI infrastructure. It consists of an offline root CA named xxxx-ROOTCA and an online enterprise CA named xxxx-SUBCA1.
The server xxxx-SUBCA1 also has an internal web site configured on it to which I want to publish the CRLs.
I have issued a handful of certificates during testing that I would now like to revoke. When I go to manually publish the CRL, I get the following error message: Access is denied. 0x80070005 (WIN32: 5 ERROR_ACCESS_DENIED)
In trying to find a solution to this issue, I have come across several resources stating that the CA's computer account must be given additional rights on the share where the CRL is to be published. I've gone into the share (located at D:\pki on xxxx-SUBCA1) and given the xxxx-SUBCA1$ computer account full control share permissions and full control NTFS permissions. I have also made sure the computer account has the same level of share and NTFS permissions on c:\windows\system32\certsrv\certenroll.
If anyone can help me figure out what I have done wrong here, it would be greatly appreciated.
Best regards,
NTD_1313
You need to give the Cert Publishers group write permission on your share on the web server.
Also, note that (unless this is a lab environment) your web server shouldn't be on the same box as the CA.
Right, this is driving me insane. This works fine locally with Excel 2013, but when the website is published to a remote server with Excel 2010 it fails. From what I can see the DCOM configuration is the same locally as remote.
After fighting with Excel 2010 and DCOM permissions for over an hour now the best I have got is this exception: System.Runtime.InteropServices.COMException (0x80070BBC): Office has detected a problem with this file. To help protect your computer this file cannot be opened.
This is the result of a web application trying to open a *.xls file from a location it has just uploaded to. The application pool is running under ApplicationPoolIdentity, and I have set the Launch and Activation Permissions for this specific app pool identity under mmc -32, so there's no problem launching Excel. What I think I'm facing here is a protected mode issue, as the file is definitely not corrupt.
I've gone into Excel's Trust Center settings and added the location where the *.xls file is uploaded to (and subsequently opened) as a trusted location. If I open the file on the hosting server (under my domain account) I don't get the protected view block on the file; however, the Identity on the DCOM configuration is set to the launching user. So, does this mean one of the following (or something I haven't listed):
I need to add this location as trusted at a group policy level because the account launching the actual application doesn't have this configuration in its profile?
I need to create an actual account on the server and use this account as the Identity for running the application?
... ?
Just to clarify, I've already been down the DCOM Security config route and through the ridiculous issues with C:\Windows\System32\config\systemprofile\Desktop and C:\Windows\SysWOW64\config\systemprofile\Desktop. The configuration is:
.NET 4.5 (classic pipeline) app pool running under ApplicationPoolIdentity
DCOM Config > Security > Launch and Activation Permissions all set for this specific identity (Access Permissions and Configuration Permissions all set to Use Default)
The file is uploaded correctly and appears in the destination; opening it on the server itself (under my domain account) respects the Trusted Location and doesn't give the protected mode warning
The process that parses it fails with the above exception.
Here is a screenshot of the Interop assembly I'm using if this is pertinent.
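For context, the parsing step boils down to a plain COM automation sequence against Excel; the actual code uses the .NET Interop assembly, but the rough pywin32 sketch below (with a placeholder file path) shows the kind of calls the DCOM launch/activation identity has to be allowed to make:

import win32com.client

# Start Excel out of process via COM, open the uploaded workbook, read a cell, and shut down.
excel = win32com.client.Dispatch("Excel.Application")
excel.DisplayAlerts = False
try:
    wb = excel.Workbooks.Open(r"D:\uploads\sample.xls")  # placeholder path
    print("A1 =", wb.Worksheets(1).Cells(1, 1).Value)
    wb.Close(False)  # don't save changes
finally:
    excel.Quit()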
Ok... for anyone stumbling on this issue I have bitten the bullet and had to do the following:
Create a local account (AutomatedOffice in my instance) and set DCOM config to run Excel under this account
Log in as the above account and change the Excel settings to add the folder in the application root as a trusted location and disable protected mode messages
Allow "Network Service" to invoke DCOM processes locally (through server DCOM config and not CLSID config)
Add NTFS permissions for this account on C:\Windows[System32|SYSWOW64]\config\systemprofile\Desktop paths
What was weird: after creating the account I was getting the following exception: Retrieving the COM class factory for component with CLSID {00024500-0000-0000-C000-000000000046} failed due to the following error: 80070005 Access is denied. This was resolved by adding the HOST\Users and HOST\NetworkServices groups to the DCOM security settings (local only!).
You need to add the folder where your website is published to the Trust Center's trusted locations. For example, if your website reads a file from c:\temporal\, you must add that folder in Excel's trusted locations.
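If logging in interactively as the automation account is awkward, the same Trusted Location can also be created directly in that account's registry hive. Below is a sketch for Excel 2010 (version key 14.0); the "Location99" index, folder path, and description are placeholders, and it must run under the profile of the account that actually launches Excel:

import winreg

# Create a Trusted Location entry for Excel 2010 under the current user's hive.
key_path = r"Software\Microsoft\Office\14.0\Excel\Security\Trusted Locations\Location99"
with winreg.CreateKey(winreg.HKEY_CURRENT_USER, key_path) as key:
    winreg.SetValueEx(key, "Path", 0, winreg.REG_SZ, "D:\\site\\uploads\\")  # placeholder folder
    winreg.SetValueEx(key, "AllowSubFolders", 0, winreg.REG_DWORD, 1)
    winreg.SetValueEx(key, "Description", 0, winreg.REG_SZ, "Uploaded workbooks")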
We are providing updates to our Firefox add-on from our own HTTP site. Due to an error in the manual signing process with McCoy, a version of the add-on was published with the wrong public key in its install.rdf. We don't have the matching private key to sign the update.rdf. The Firefox update manager is now silently ignoring our updates.
Is there a way to tell the update manager to prominently notify our users of the problem and suggest they install the fixed version?
Sadly, it is not possible.