Using an X509Certificate2 created from a PFX file for client certificate authentication creates too many temporary files in C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys

We are using client certificate authentication with HttpWebRequest. For this, we create an X509Certificate2 from a PFX file and attach it to the outgoing HttpWebRequest.
What we see is that after the request is made, many temporary files appear in the folder C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys.
As we make more requests, the number of temporary files created in that folder increases continuously.
We create the certificate from the PFX file and then set up the key container. Access is granted explicitly using CryptoKeySecurity and CryptoKeyAccessRule. A private method, SetKeyContainerSecurity, takes the CspKeyContainerInfo and the CryptoKeySecurity, then uses CryptAcquireContext and CryptSetProvParam to set the key container security.
The issue we are facing is the large number of temporary files accumulating on disk at C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys.
What is the correct approach to ensure that these temporary files are deleted?
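Each X509Certificate2 loaded from PFX bytes materializes its private key in a CSP key container file under MachineKeys, and that file lingers unless the container is cleaned up. A minimal sketch of explicit cleanup, assuming the certificate is loaded per request (pfxBytes, password, and url are placeholder variables, not taken from the question's code):

// using System.Net;
// using System.Security.Cryptography;
// using System.Security.Cryptography.X509Certificates;
var cert = new X509Certificate2(pfxBytes, password,
    X509KeyStorageFlags.MachineKeySet);
try
{
    var request = (HttpWebRequest)WebRequest.Create(url);
    request.ClientCertificates.Add(cert);
    using (var response = (HttpWebResponse)request.GetResponse())
    {
        // ... consume the response ...
    }
}
finally
{
    // Delete the temporary key container file under MachineKeys.
    var rsa = cert.PrivateKey as RSACryptoServiceProvider;
    if (rsa != null)
    {
        rsa.PersistKeyInCsp = false; // mark the container as non-persistent
        rsa.Clear();                 // releases the CSP and deletes the container
    }
    cert.Reset(); // release the underlying certificate context
}

Alternatively, if the same PFX is used for every request, loading it once into a long-lived field creates a single key container instead of one per request.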

Is the hotfix at http://support.microsoft.com/kb/931908 relevant? It addresses the issue "On a Windows Server 2003-based or Windows Server 2008-based client computer, the system does not delete a temporary file that is created when an application calls the "CryptQueryObject" function"

Related

A way to run FTP server (vsftpd or proftpd) without tying it to the linux user subsystem

I am looking for a way to set up a simple FTPS server that will serve a single folder containing 2 files, using a dedicated username:password pair.
The issue is that for security reasons I have two requirements:
The server must not give access (even read access) to anything outside the specified folder (the server has world-readable files that should only be accessible to users holding accounts on that server)
I don't want to tie the ftp server with existing users system (the entire ftp application and its config must be independent of the server configuration)
So far, every tutorial I have found uses PAM to configure both vsftpd and proftpd,
while I want a simple config file of username:password:folder triplets, e.g.:
backups:s3cr#t:/backups/origin
backups2:secret:/backups/anonymized
documents:secret:/var/www/data/documents
How can I do it with either vsftpd or proftpd?
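For what it's worth, proftpd can come close to this without PAM via its mod_auth_file virtual users: each user lives in an ftpd.passwd file and is chrooted to its own home directory. A hedged sketch (paths, UIDs, and file locations are assumptions; the ftpasswd tool ships with proftpd):

# Create a virtual user (prompts for the password); the home directory
# becomes the only tree that user can see once chrooted.
ftpasswd --passwd --file=/etc/proftpd/ftpd.passwd \
         --name=backups --home=/backups/origin --shell=/bin/false \
         --uid=2001 --gid=2001

# /etc/proftpd/proftpd.conf (excerpt)
AuthUserFile /etc/proftpd/ftpd.passwd
AuthOrder mod_auth_file.c    # consult only the file, never system accounts
RequireValidShell off        # /bin/false is fine for virtual users
DefaultRoot ~                # chroot every user into their home directory

Serving FTPS would additionally need mod_tls (TLSEngine on plus a certificate), but that part is independent of the virtual-user setup.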

How to authenticate to Cloud Storage from a Docker app on Cloud Run

I have a Node.js app in a Docker container that I'm trying to deploy to Google Cloud Run.
I want my app to be able to read/write files from my GCS buckets that live under the same project, and I haven't been able to find much information around it.
This is what I've tried so far:
1. Hoping it works out of the box
A.k.a. initializing without credentials, like in App Engine.
const { Storage } = require('@google-cloud/storage');
// ...later in an async function
const storage = new Storage();
// This line throws the exception below
const [file] = await storage.bucket('mybucket')
.file('myfile.txt')
.download()
The last line throws this exception:
{ Error: Could not refresh access token: Unsuccessful response status code. Request failed with status code 500"
at Gaxios._request (/server/node_modules/gaxios/build/src/gaxios.js:85:23)
2. Hoping it works out of the box after granting the Storage Admin IAM role to my Cloud Run service account.
Nope. No difference from the previous attempt.
3. Copying my credentials file as a cloudbuild.yaml step:
...
- name: 'gcr.io/cloud-builders/gsutil'
args: ['cp', 'gs://top-secret-bucket/gcloud-prod-credentials.json', '/www/gcloud-prod-credentials.json']
...
It copies the file just fine, but then the file is not visible from my app. I'm still not sure where exactly it was copied to, but listing the /www directory from my app shows no trace of it.
4. Copy my credentials file as a Docker step
Wait, but for that I need to authenticate gsutil, and for that I need the credentials.
So...
What options do I have without uploading my credentials file to version control?
This is how I managed to make it work:
The code for initializing the client library was correct. No changes here from the original question. You don't need to load any credentials if the GCS bucket belongs to the same project as your Cloud Run service.
I learned that the service account [myprojectid]-compute@developer.gserviceaccount.com (aka the "Compute Engine default service account") is the one used by default for running the Cloud Run service unless you specify a different one.
I went to the Service Accounts page and made sure that the mentioned service account was enabled (mine wasn't, this was what I was missing).
Then, in the IAM console, I edited the permissions for that service account and added the Storage Object Admin role.
More information in this article: https://cloud.google.com/run/docs/securing/service-identity
I believe the correct way is to change to a custom service account that has the desired permissions. You can do this under the 'Security' tab when deploying a new revision.
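A hedged example of that route from the command line (the service, image, project, and account names below are placeholders):

# Create a dedicated service account and give it only the storage role it needs.
gcloud iam service-accounts create run-storage-sa
gcloud projects add-iam-policy-binding my-project \
    --member=serviceAccount:run-storage-sa@my-project.iam.gserviceaccount.com \
    --role=roles/storage.objectAdmin

# Deploy the revision under that identity instead of the Compute Engine default.
gcloud run deploy my-service \
    --image=gcr.io/my-project/my-image \
    --service-account=run-storage-sa@my-project.iam.gserviceaccount.com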

Calling secured web service

I am having difficulty connecting to a secured web service. My service provider gave me a WSDL URL (which uses an SSL connection) and a PFX file. The service provider told us that the WSDL can only be accessed by users belonging to our network.
So far I have tried:
(i) Exported the key from the given PFX file to a separate keystore
(ii) Exported the certificate to (a) Java's default truststore and (b) a separate store
I have set system properties:
System.setProperty("javax.net.ssl.keyStore", "C:/Test/keystore.jks");
System.setProperty("javax.net.ssl.keyStorePassword", "test123");
System.setProperty("javax.net.ssl.keyStoreType", "jks");
System.setProperty("javax.net.ssl.trustStore", "C:/Program Files/Java/jdk1.7.0_17/jre/lib/security/cacerts");
System.setProperty("javax.net.ssl.trustStorePassword", "changeit");
I am using the latest versions of CXF and Java. I got several exceptions which I could resolve by Googling the issues, but I am finally stuck with a "Could not send message" exception. How can I deal with this exception?
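When the javax.net.ssl system properties are set too late or overridden elsewhere, configuring TLS directly on the CXF HTTPConduit is more deterministic. A hedged sketch against the CXF client API, assuming 'port' is your generated JAX-WS proxy and reusing the store paths and passwords from the question (checked exception handling omitted for brevity):

import org.apache.cxf.configuration.jsse.TLSClientParameters;
import org.apache.cxf.endpoint.Client;
import org.apache.cxf.frontend.ClientProxy;
import org.apache.cxf.transport.http.HTTPConduit;
import javax.net.ssl.KeyManagerFactory;
import javax.net.ssl.TrustManagerFactory;
import java.io.FileInputStream;
import java.security.KeyStore;

// Client keystore exported from the PFX file (holds your private key).
KeyStore keyStore = KeyStore.getInstance("JKS");
keyStore.load(new FileInputStream("C:/Test/keystore.jks"), "test123".toCharArray());
KeyManagerFactory kmf =
    KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
kmf.init(keyStore, "test123".toCharArray());

// Truststore containing the server's CA chain.
KeyStore trustStore = KeyStore.getInstance("JKS");
trustStore.load(new FileInputStream("C:/Test/truststore.jks"), "changeit".toCharArray());
TrustManagerFactory tmf =
    TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
tmf.init(trustStore);

// Attach both to the HTTP conduit behind the service proxy.
Client client = ClientProxy.getClient(port);
HTTPConduit conduit = (HTTPConduit) client.getConduit();
TLSClientParameters tls = new TLSClientParameters();
tls.setKeyManagers(kmf.getKeyManagers());
tls.setTrustManagers(tmf.getTrustManagers());
conduit.setTlsClientParameters(tls);

If the exception persists, its cause chain (an SSLHandshakeException vs. a ConnectException, for example) usually narrows down whether the problem is trust, key, or network related.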

ASP.NET MVC NESTED Virtual Directory for Content Files

I'm trying to map a virtual directory in my ASP.NET MVC 3 website. The virtual directory contains image files only, and the physical directory is located on another server. When I try to access an image from this directory via a web browser, I get an HTTP 500 error:
Parser Error Message: An error occurred loading a configuration file: Failed to start monitoring changes to '<virtual directory path>' because access is denied.
[EDIT]: After restarting IIS, I see that "access is denied" has been appended to the end of the error message. However, the directory grants the "Everyone" group read access and it's still not working. I've seen posts describing this same error, but the permission-setting solution isn't working for me.
Why is it looking for a configuration file? I've updated my Global.asax to ignore the route in question, and from within IIS 6, I can browse the files located in the virtual directory without issue. I've verified that the virtual directory is NOT set up as an application. Also, the permissions on the directory have read access for Everyone set. What am I doing wrong?
[EDIT #2]: The virtual directory being pointed to is a network shared folder... Does that make any difference?
[EDIT #3]: The IIS 6 hierarchy for what I'm trying to do is this: Default Website -> OurSite (MVC website which is ITSELF a virtual directory) -> Images (virtual directory that I'm struggling with). There definitely is a problem with the fact that this is a nested virtual directory; if I create the directory as a direct child of Default Website instead of Default Website -> OurSite, it works fine.
Thanks,
Andy
Make sure the identity running the App Pool also has execute and list permissions on that virtual folder, since ASP.NET tries to monitor it for configuration changes and caching.
http://support.microsoft.com/kb/316721/
[EDIT] This link indicates it may be the second case, where the account doesn't have permissions to a subfolder of the shared directory.
http://support.microsoft.com/kb/317955
Does the ASPNET account have access all the way down that tree?
Reading Turnkey's answer and your comments, I just wanted to point out that if your AppPool is using NETWORK SERVICE identity then your remote server needs to add the web server's machine/computer account to the shared folder and NTFS permissions NOT the NETWORK SERVICE account.
If you add NETWORK SERVICE to the shared folder and NTFS permissions, you are just adding the local machine's NETWORK SERVICE account not the web server's NETWORK SERVICE account. I hope I am making sense.
Whenever you use a computer's NETWORK SERVICE account to connect remotely you are in effect using the computer's machine account so that's what you need to add.
Say your web server is called PRODSERVER, you need to add the machine/computer account PRODSERVER$ to the remote shared folder's permissions.
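A hedged example of granting that machine account read access on the file server's NTFS permissions (domain, account, and path are placeholders):

rem Run on the remote file server; (OI)(CI) makes the grant inherit
rem to files and subfolders, RX is read and execute.
icacls "D:\Shares\Images" /grant "MYDOMAIN\PRODSERVER$:(OI)(CI)RX"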
Hope that helps.

How can I create a local user profile for the anonymous user of an ASP.Net MVC application under IIS 7?

I've been experimenting with ASP.Net MVC, and have come across a problem that is probably not specifically MVC related. But I cannot get the authentication in the default MVC application (the one created by the wizard when you create a new MVC project) to work properly under IIS 7 on Windows 7.
If I run under the Visual Studio environment, it works, but if I switch the settings to run under IIS instead, I get the following exception trying to submit the login or registration:
Failed to generate a user instance of SQL Server due to failure in retrieving the user's local application data path. Please make sure the user has a local user profile on the computer. The connection will be closed.
I believe that this is because the website runs under my own account in Visual Studio, but under the IUSR account in IIS. Google searches on the exception message have been unhelpful so far.
So, can one create a local user profile for the IUSR account? If so, how? Is there something else I should be doing to get the SQLExpress engine to work under the anonymous account in IIS 7?
I also tried configuring the IIS website to use my account, but since this is my home machine, my account doesn't have a password, and it appears that IIS won't let a website be configured to use an account without a password. Or, since this is my first experience with IIS 7, and its configuration feels very different from IIS 5/6, I may just be missing the right setting that will let me configure the account to use for anonymous access.
EDIT: Some additional information. If I empty the App_Data folder and try again from IIS, SQLExpress attempts to create my database and fails, but the exception message has further information with the following suggestions.
SQLExpress database file auto-creation error:
The connection string specifies a local SQL Server Express instance using a database location within the application's App_Data directory. The provider attempted to automatically create the application services database because the provider determined that the database does not exist. The following configuration requirements are necessary to successfully check for existence of the application services database and automatically create the application services database:
If the application's App_Data directory does not already exist, the web server account must have read and write access to the application's directory. This is necessary because the web server account will automatically create the App_Data directory if it does not already exist.
If the application's App_Data directory already exists, the web server account only requires read and write access to the application's App_Data directory. This is necessary because the web server account will attempt to verify that the SQL Server Express database already exists within the application's App_Data directory. Revoking read access on the App_Data directory from the web server account will prevent the provider from correctly determining if the SQL Server Express database already exists. This will cause an error when the provider attempts to create a duplicate of an already existing database. Write access is required because the web server account's credentials are used when creating the new database.
SQL Server Express must be installed on the machine.
The process identity for the web server account must have a local user profile. See the readme document for details on how to create a local user profile for both machine and domain accounts.
I've pretty extensively confirmed that the first three suggestions have been satisfied. The fourth seems to be the cause of my problems, but I can't figure out how to do that. And although the suggestion claims there is a readme document that describes it, I have not been able to find that document.
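For what it's worth, IIS 7 exposes a per-application-pool switch that loads a user profile for the worker process identity, which is the usual way to satisfy that fourth requirement. A hedged example using appcmd (the pool name is an assumption):

rem Tell the worker process to load a profile for its identity.
%windir%\system32\inetsrv\appcmd.exe set apppool "DefaultAppPool" /processModel.loadUserProfile:true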
I got this problem as well running under medium trust. The process that creates the database requires at least High trust. You can check this by looking in your Web.Config for
<trust level="TrustLevel" />
If no trust level is specified in your Web.config, try adding it and set it to either Full or High. If that doesn't work, the trust level may be locked down in the machine.config of your IIS installation, which you would need to modify.
That being said, the best route I have found to solve this is to just use aspnet_regsql.exe to create the necessary tables and then change the connection string in your Web.Config to look at it directly.
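A hedged example of that route (the instance name is an assumption; -E uses Windows authentication and -A all installs all application services features):

aspnet_regsql.exe -S .\SQLEXPRESS -E -A all

Afterwards, point the membership connection string in Web.config at that database rather than at an auto-attached .mdf under App_Data.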
I solved "The directory 'LocalApplicationData' does not exist." error when running an SSIS package through a SQL job by ensuring the SQL SSIS service and SQL Server Agent service were running under the same account the SQL job was set to use!
In my case this was a domain account.
Solution: try uninstalling any updates applied around the time you started experiencing the issue. I spent countless hours - wasted hours that I will never get back - reading and following every solution possible without success. I uninstalled all SQL Server updates and now everything works fine.
