We are experiencing some issues with EF6 and the Always Encrypted feature.
I believe we need to configure something on the DbContext to tell it how to encrypt or decrypt columns, but I couldn't find a way to do this.
We already have an ADO.NET access layer, and it works perfectly with encrypted fields, but we would rather use EF than ADO.NET.
Symptoms are:
With EF, we are able to query the data, and the decryption process works fine.
The insert process throws the error below:
Operand type clash: varchar is incompatible with varchar(8000) encrypted with (encryption_type = 'DETERMINISTIC', encryption_algorithm_name = 'AEAD_AES_256_CBC_HMAC_SHA_256', column_encryption_key_name = 'CEK_Auto1', column_encryption_key_database_name = 'Development_v2_qa') collation_name = 'SQL_Latin1_General_CP1_CI_AS'
A query with a WHERE clause on an encrypted field throws the same error.
Technologies used:
EF6 with POCO entities
Azure Key Vault for storing the encryption/decryption master key
An SSL certificate to authenticate against Key Vault
Connection string contains "Column Encryption Setting=enabled;" (full example below)
Azure SQL Database
.NET Framework 4.6
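For reference, a full connection string with that setting might look like the following (server, database, and credentials are placeholders, not values from the question):

Server=tcp:myserver.database.windows.net,1433;Initial Catalog=Development_v2_qa;User ID=myUser;Password=myPassword;Column Encryption Setting=enabled;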
ADO
We have code that works fine with ADO.NET; it works with every SqlConnection:
// Instantiate our custom AKV column master key provider.
// It uses the GetToken function as the callback to authenticate to AKV.
SqlColumnEncryptionAzureKeyVaultProvider akvprov = new SqlColumnEncryptionAzureKeyVaultProvider();
akvprov.KeyVaultClient = SecureConfigurationManager.KeyVaultClient;

// Register the instance of the custom provider with SqlConnection.
// SqlColumnEncryptionAzureKeyVaultProvider.ProviderName is the name of the provider;
// it must match the string used when the column master key was created.
Dictionary<string, SqlColumnEncryptionKeyStoreProvider> providers =
    new Dictionary<string, SqlColumnEncryptionKeyStoreProvider>();
providers.Add(SqlColumnEncryptionAzureKeyVaultProvider.ProviderName, akvprov);
SqlConnection.RegisterColumnEncryptionKeyStoreProviders(providers);
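One detail worth knowing (a fact about System.Data.SqlClient, not something stated above): SqlConnection.RegisterColumnEncryptionKeyStoreProviders may only be called once per application domain; a second call throws an exception, so the registration should happen once at startup.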
Yep, I just hit the same problem; I needed to add
[Column(TypeName = "varchar(max)")]
above the field in the POCO type for it to work.
It would be nice if the error were a bit clearer (and nicer still if NVARCHAR actually worked).
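In context, the mapping looks something like this (a minimal sketch; the entity and column names are made up for illustration):

using System.ComponentModel.DataAnnotations.Schema;

public class Person
{
    public int Id { get; set; }

    // Forces EF to send the parameter as varchar(max) instead of a plain
    // varchar whose type clashes with the encrypted column's metadata.
    [Column(TypeName = "varchar(max)")]
    public string Ssn { get; set; }
}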
I'm working through the same issue. The problem is the data type mapping from C# to the database. With Entity Framework and Always Encrypted, the varchar length matters: only varchar(max) (or varchar(8000)) works. I have all of Entity Framework working with Azure Key Vault for all the data types, same as you. The link below shows how to do the insert with inline SQL. I've only worked with Entity Framework and hope I never have to work in inline SQL, though I might have to if I can't shrink the database storage overhead that encryption needs; alternatively I may look at something like Stretch Database, another SQL Server 2016 feature. Thanks Jakub Szymaszek and Microsoft.
I have conceded and made all of my data types varchar(max), and it works just fine. So string = varchar(max). The odd thing is that the encrypted value is nowhere near 8000 characters, but 8000 are probably allocated.
"something1" becomes this after encryption and insert:
0x0190F9D80C3F70890FB154F2123459506AD5BDA165333710D161ED80E42FCAFA882C66FF5B68E412B5F9EE11A9F308201D0AE2BD4032151398171FDBE2F3AEA20D
An interesting thing about varchar(max) is that the data is supposedly stored outside the table it is inserted into, with a pointer in the row, so varchar(max) may only take up the amount shown. (I'm a dev.)
The data type for my column and stored procedure variables:
[testVarChar] varchar COLLATE Latin1_General_BIN2 ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = [CEK_Auto1], ENCRYPTION_TYPE = Randomized, ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256') NOT NULL,
The data type of the parameter targeting the SSN column is set to an ANSI (non-Unicode) string, which maps to the char/varchar SQL Server data type. If the type of the parameter was set to a Unicode string (String), which maps to nchar/nvarchar, the query would fail, as Always Encrypted does not support conversions from encrypted nchar/nvarchar values to encrypted char/varchar values. See SQL Server Data Type Mappings for information about the data type mappings.
https://learn.microsoft.com/en-us/sql/relational-databases/security/encryption/develop-using-always-encrypted-with-net-framework-data-provider
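In ADO.NET terms, that means explicitly typing the parameter as varchar rather than letting it default to nvarchar. A minimal sketch (table, column, and values are made up, not taken from the linked article; connection is an open SqlConnection):

SqlCommand cmd = new SqlCommand(
    "SELECT * FROM [Patients] WHERE [SSN] = @SSN", connection);
// SqlDbType.VarChar maps to the ANSI varchar type. The default for a .NET
// string parameter is nvarchar, which fails against an encrypted varchar column.
cmd.Parameters.Add("@SSN", SqlDbType.VarChar, 11).Value = "795-73-9838";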
Related
My project handles data that is stored in a key-value NoSQL database. The value part is stored as a byte stream. I want a type provider to read my data according to the schema of the byte stream, which is represented as a JSON schema. Can I use the JSON type provider to read this data? If not, what could solve my problem?
If your DB stores the JSON as a byte stream, simply decode it with System.Text.Encoding.UTF8.GetString (replace UTF8 with the appropriate encoding if necessary) to get the JSON as a regular string.
Then you can use the JSON type provider on that string like on any other, as long as you provide a compile-time sample for the type provider to use. A JSON schema doesn't work.
In other words, you need to extract a fully representative sample of your database's JSON contents, then declare the provided types using that sample, either as a string directly embedded in the code, or as a file URI that your development machine can access.
As long as the sample matches the actual structure of your database, it will work at run-time.
// requires the FSharp.Data package
open FSharp.Data

// sample embedded in the code
type Simple1 = JsonProvider<""" { "name":"John", "age":94 } """>
// sample referenced as a file path (verbatim string for the backslashes)
type Simple2 = JsonProvider<@"C:\MyProjectFolder\sample.json">
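At run-time the provided type then parses the decoded string. A minimal usage sketch (the byte array stands in for a value read from the key-value store):

let bytes = System.Text.Encoding.UTF8.GetBytes("""{ "name":"Jane", "age":31 }""")
let json = System.Text.Encoding.UTF8.GetString(bytes)
let doc = Simple1.Parse(json)
printfn "%s is %d years old" doc.Name doc.Age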
I want to use the SqlEntityConnection type provider in f# to query and update a db.
It works well when I use it with the connection string pointing to a live SQL Server DataBase.
type EntityConnection = SqlEntityConnection<"Data Source=myServer;Initial Catalog=myDb;...", Pluralize=true>
Now I want to get rid of the dependency on the live DB and use a local schema file instead. Given what I read on MSDN, I gave the following line a try:
type private EntityConnection = SqlEntityConnection<LocalSchemaFile="mySchemaFile.ssdl", Pluralize=true>
Unfortunately, it doesn't compile and the compiler's message is:
Error 46 The type provider 'Microsoft.FSharp.Data.TypeProviders.DesignTime.DataProviders' reported an error: When using this provider you must specify either a connection string or a connection string name. To specify a connection string, use SqlEntityConnection<"...connection string...">.
So what should I do? If I leave the connection string in, I have the feeling that I haven't really removed the dependency on the DB. For instance, if I change the Data Source to a non-existent server, it doesn't compile.
You can provide a connection string name, along with a connection string in the configuration file.
You can still provide a LocalSchemaFile, which will be used to cache the schema locally.
If you put the schema file under source control, the connection string will only be used as a default when calling GetDataContext() without parameters, not when building or editing code.
You also need to set the ForceUpdate parameter to false.
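Putting those pieces together, the declaration might look like this (a sketch; the connection string name "MyDb" is an assumption and must exist in the config file):

type private EntityConnection =
    SqlEntityConnection<ConnectionStringName="MyDb",
                        LocalSchemaFile="mySchemaFile.ssdl",
                        ForceUpdate=false, Pluralize=true>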
I am using the EF code-first model to get data from a database table with 400,000 records.
But when I use a LINQ query like:
var users = context.UserEntity.Where(c => c.FirstName.Contains("s"));
The above statement gives me all the users whose first name contains 's'. But since this is a huge table, it gives me the following error:
An existing connection was forcibly closed by the remote host
Please suggest the best way to do this. I am binding this data to a GridView. I am thinking of getting the first 500 records each time. Is there any way to do it from the EF side, so that I won't need to do it in SQL?
Thanks
1. Add an index on the column.
2. Increase the connection timeout.
You can create a stored procedure and call it from LINQ:
LINQ to SQL (Part 6 - Retrieving Data Using Stored Procedures)
http://weblogs.asp.net/scottgu/archive/2007/08/16/linq-to-sql-part-6-retrieving-data-using-stored-procedures.aspx
See this answer as well
Calling a SQL Server stored procedure with linq service through c#
Get rid of EF
Set the command timeout to a larger value, e.g. 600 seconds, either via a common key in web.config or in code:
try
{
    conn.Open();
    mySqlCommand.Connection = conn;
    mySqlCommand.CommandTimeout = 600; // timeout in seconds
    // ... execute the command ...
}
finally { conn.Close(); }
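Separately, for the paging the question asks about (fetching 500 records at a time from the EF side), a minimal sketch using OrderBy/Skip/Take; the entity and property names come from the question, while the page variables and the Id key are assumptions:

int pageSize = 500;
int pageIndex = 0; // zero-based page number

// Skip/Take translate to SQL paging, so only one page of rows crosses
// the wire; a stable OrderBy is required for Skip to be valid.
var page = context.UserEntity
                  .Where(c => c.FirstName.Contains("s"))
                  .OrderBy(c => c.Id)
                  .Skip(pageIndex * pageSize)
                  .Take(pageSize)
                  .ToList();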
How would you make the LINQ-to-SQL type provider generate or regenerate its classes?
I've just added a new table to my database, and the type provider can't pick it up. I've tried deleting the line with the type provider and typing it once more - no luck. I've also tried a rebuild... still no luck.
Edit:
I've defined the type provider like:
[<Generate>]
type dbSchema = SqlDataConnection<"conString">
and using it like:
let ctx = dbSchema.GetDataContext()
You're right - this seems to be quite tricky. I'm using the SqlDataConnection type provider in a script file, and the only way to update the schema that I've found so far is to make some minor (irrelevant) change to the connection string - for example, adding a space after the = of one of the parameters:
// before
[<Generate>]
type Northwind = TypeProviders.SqlDataConnection<"data source=.\\sqlexpress;initial catalog=Northwind;integrated security=True">

// after - note the extra space following 'integrated security='
[<Generate>]
type Northwind = TypeProviders.SqlDataConnection<"data source=.\\sqlexpress;initial catalog=Northwind;integrated security= True">
The schema seems to be cached using the connection string as the key, so if you change it back, you get the old schema again. I guess this is probably a bug, so adding whitespace is a possible workaround.
There is also a ForceUpdate parameter, but it doesn't seem to have any effect, and the documentation doesn't say much about it.
I have a problem calling stored procedures that take a fixed-length binary parameter using Entity Framework. The stored procedure ends up being called with 8000 bytes of data no matter what size byte array I pass to the function import. For example, this is the code I am using:
byte[] cookie = new byte[32];
byte[] data = new byte[2];
entities.Insert("param1", "param2", cookie, data);
The parameters are nvarchar(50), nvarchar(50), binary(32), and varbinary(2000).
When I run the code through SQL Profiler, I get this result:
exec [dbo].[Insert] @param1=N'param1',@param2=N'param2',@cookie=0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
[SNIP because of 16000 zeros]
,@data=0x0000
All the parameters went through OK other than the binary(32) cookie. The varbinary(2000) seemed to work fine, and the correct length was maintained.
Is there a way to prevent the extra data being sent to SQL server? This seems like a big waste of network resource.
Use varbinary instead of binary. That was helpful for me.
EF 4 always uses large-ish parameters, because making the parameter size equal to the data size usually means the query can't be reused with new parameter values, so SQL Server can't cache the query plans for you. In other words, what it's doing now is probably more efficient overall than using a smaller parameter, if your data size ever changes.
If your data size never changes, then use a different column width.
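If the padded cookie parameter is a genuine concern, one workaround (a sketch, not the EF-generated call; it assumes a DbContext, i.e. EF 4.1 or later, and System.Data.SqlClient) is to bypass the function import and size the parameters explicitly:

using System.Data;
using System.Data.SqlClient;

// Explicit sizes: the cookie goes over the wire as 32 bytes instead of
// a value padded out to the maximum binary length.
var p1 = new SqlParameter("@param1", SqlDbType.NVarChar, 50) { Value = "param1" };
var p2 = new SqlParameter("@param2", SqlDbType.NVarChar, 50) { Value = "param2" };
var pCookie = new SqlParameter("@cookie", SqlDbType.Binary, 32) { Value = cookie };
var pData = new SqlParameter("@data", SqlDbType.VarBinary, 2000) { Value = data };

entities.Database.ExecuteSqlCommand(
    "EXEC [dbo].[Insert] @param1, @param2, @cookie, @data",
    p1, p2, pCookie, pData);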