EF 4.1 Code First modify generated SQL query - entity-framework-4

I am using EF Code First to run queries against the database. However, I would like to modify the SQL commands Code First generates, to add extra things before they reach the SQL Server database. Is it possible to change the final SQL query before it is executed?
Thanks.

Related

Fetch file from database server

I'm building a project where the front end is React and the back end is Ruby on Rails with a Postgres DB. A required feature is the ability for users to export large datasets.
I have the following code snippet that creates a CSV and stores it on the database server.
query = <<-SQL
COPY (SELECT * FROM ORDERS WHERE ORDERS.STORE_ID = ? OFFSET ? LIMIT ?) to '/temp/out.txt' WITH CSV HEADER
SQL
query_result = Order.find_by_sql([query, store_id.to_i, offset.to_i, 1000000])
How would I be able to retrieve that file to send to the front end? I've seen examples that use copy_data and get_copy_data, but I couldn't get them to work with a parameterized query. Any help would be great. Thanks!
There are two problems with your approach:
COPY doesn't support parameters, so you will have to construct the complete query string on the client side (beware of SQL injection).
COPY ... TO 'file' requires superuser rights or membership in the pg_write_server_files role.
Don't even think of running an application as a superuser.
Even without that, allowing client code to create files on the database server opens you up to the risk of denial of service through a full file system.
I think that the whole idea is ill-conceived. If you have a large query result, the database server will automatically use temporary files if an intermediate result won't fit into memory. Keep it simple.
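If you do want CSV export without a server-side file, the pg gem's copy_data / get_copy_data (which the question mentions) can stream COPY ... TO STDOUT to the client instead. Here is a hedged sketch: since COPY takes no bind parameters, the build_copy_sql helper below is a hypothetical stand-in that whitelists integers before interpolating them; the table and column names mirror the question, and the connection part is shown as comments because it needs a live PostgreSQL connection.

```ruby
# Hypothetical helper: COPY accepts no bind parameters, so validate the
# values ourselves and interpolate only verified Integers (no injection
# surface is left, because Integer#to_s cannot contain SQL).
def build_copy_sql(store_id, offset, limit)
  [store_id, offset, limit].each do |v|
    raise ArgumentError, "expected Integer, got #{v.inspect}" unless v.is_a?(Integer)
  end
  "COPY (SELECT * FROM orders WHERE store_id = #{store_id} " \
    "OFFSET #{offset} LIMIT #{limit}) TO STDOUT WITH CSV HEADER"
end

# With the pg gem, the rows stream to the client and no file is ever
# written on the database server:
#
#   conn = ActiveRecord::Base.connection.raw_connection
#   csv = ""
#   conn.copy_data(build_copy_sql(store_id, offset, 1_000_000)) do
#     while (row = conn.get_copy_data)
#       csv << row
#     end
#   end
#   send_data csv, filename: "orders.csv"   # in a Rails controller
```

This also avoids the superuser/pg_write_server_files requirement entirely, because COPY ... TO STDOUT needs no special privileges.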

How can I run a SQL query if I delete my model in Rails?

I want to ask what will happen if I delete the models in the Rails project and run my SQL query. Is it necessary to have a model to write raw SQL queries?
is it necessary to have a model to write Raw SQL queries?
No, if you're using ActiveRecord, then you can use the ActiveRecord::Base.connection.execute and/or ActiveRecord::Base.connection.exec_query methods.
Similarly, you can even run prepared statements through the ActiveRecord::Base.connection.raw_connection.prepare method, which comes from the underlying adapter.
Provided you have run a migration and the table is present in your database, it is not necessary to have the model file present locally to run raw SQL queries.
In doing this you will miss out on a lot of the perks ActiveRecord/Ruby objects provide, and you make the process much more error-prone and vulnerable to SQL injection attacks.
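A minimal sketch of what that looks like, assuming a Rails app with ActiveRecord configured; the orders table and column names are hypothetical, and the actual calls are shown as comments because they need a live database connection:

```ruby
# A plain SQL string -- no Order model exists anywhere in the app.
sql = "SELECT id, total FROM orders WHERE store_id = 42"

# Inside a Rails app you could run either of these:
#
#   result = ActiveRecord::Base.connection.exec_query(sql)
#   result.columns  # column names as strings
#   result.rows     # plain arrays of values, no ActiveRecord objects
#
#   ActiveRecord::Base.connection.execute(sql)  # adapter-native result
#
#   # Prepared statement via the raw adapter connection (pg adapter):
#   pg = ActiveRecord::Base.connection.raw_connection
#   pg.prepare("orders_by_store",
#              "SELECT id, total FROM orders WHERE store_id = $1")
#   pg.exec_prepared("orders_by_store", [42])
puts sql
```

exec_query gives you a typed result set to iterate over, while execute returns whatever the adapter's driver returns, so exec_query is usually the friendlier choice when no model is available.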

How to export the DDL of domains (database schema) of a new version for an update

I developed an application with Grails earlier. Now, per a new requirement, I need to modify the existing domain classes, add new ones, and change or establish new relationships between them as well.
The new requirement has been implemented and I am going to deploy to the production environment. However, the DBA wants a script to change the production database DDL; the DBA does not allow the application to auto-create / update the database schema while bootstrapping.
I know how to export the DDL for creating tables, but that script drops and recreates tables, which means all data would be lost.
What I don't know is how to export DDL for an update (no dropping/recreating of tables). Does anybody have a good suggestion?
You cannot expect the existing data to fit the new database schema as is.
For example, say you have a table Sample with a contactNumber field that has the nullable: true constraint in your existing schema, and in your new schema this has been changed to nullable: true & unique: true.
In such cases the database will fail to keep the existing data intact while adapting to the new schema (any duplicate contact numbers already stored would violate the new unique constraint).
To preserve the existing data, you may have to go through a tedious process like this:
Take a backup of the existing database.
Make a note of the modifications you have made to the existing domain classes.
Find out which modifications may lead to failure / data loss.
Drop the earlier database schema, deploy the new application, and let it create the new schema.
Write a script or utility which processes the data from the database backup and stores it according to the new database schema. Make sure the utility can handle every modification (constraint changed, field added, field removed) made to the schema.

EF Code First deployment

I have an MVC code-first application that creates a database based on my schema. On the production machine, we will need to first create the database (empty, with no tables), so that we can assign the proper username and password to my dbcontext in the connection string. Considering that the DB is created on production, what should my code do?
I can't use DropCreateDatabaseIfModelChanges because a DB will already exist with no metadata, and I can't use DropCreateDatabaseAlways because I need the schema created only the first time.
I also tried this:
if (context.Database.Exists() && !context.Database.CompatibleWithModel(false))
{
    context.Database.Delete();
}
context.Database.CreateIfNotExists();
but context.Database.CompatibleWithModel(false) always returns true on an empty database, for some reason...
You might need to deploy the database schema manually via a SQL script, which you can generate from your production database instance. Note that CompatibleWithModel(false) returning true is expected here: the false argument tells EF not to throw when no model metadata exists, and in that case EF assumes the model is compatible.

Bulk upsert with Ruby on Rails

I have a Rails 3 application where I need to ingest an XML file provided by an external system into a Postgres database. I would like to use something like ActiveRecord-Import but this does not appear to handle upsert capabilities for Postgres, and some of the records I will be ingesting will already exist, but will need to be updated.
Most of what I'm reading recommends writing SQL on the fly, but this seems like a problem that has probably been solved already; I just can't find the solution.
Thanks.
You can do upserts on MySQL and PostgreSQL with the upsert gem.
If you're looking for raw speed, you could use nokogiri and upsert.
It might be easier to import the data using data_miner, which uses nokogiri and upsert internally.
If you are on PostgreSQL 9.1 or later, you should use writable common table expressions. Something like:
WITH updates (id) AS (
UPDATE mytable SET .....
WHERE ....
RETURNING id
)
INSERT INTO mytable (....)
SELECT ...
FROM mytemptable
WHERE id NOT IN (select id from updates);
In this case you bulk-load things into a temp table first; the statement then updates the existing records from the temp table according to your logic and inserts the rest.
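Driving that two-step flow from Ruby can look like the sketch below. It is hedged: mytable and mytemptable are the placeholder names from the snippet above, the id/status columns are invented for illustration, and the execute call is commented out because it needs a live PostgreSQL connection.

```ruby
# Writable-CTE upsert (PostgreSQL 9.1+): update the rows that already
# exist, then insert only the ones the UPDATE did not touch -- all in
# one statement. Table and column names here are placeholders.
UPSERT_SQL = <<-SQL
  WITH updates AS (
    UPDATE mytable m
       SET status = t.status
      FROM mytemptable t
     WHERE m.id = t.id
    RETURNING m.id
  )
  INSERT INTO mytable (id, status)
  SELECT t.id, t.status
    FROM mytemptable t
   WHERE t.id NOT IN (SELECT id FROM updates)
SQL

# Step 1: bulk-load the parsed XML records into mytemptable (e.g. with
# activerecord-import, which handles plain inserts fine).
# Step 2: run the statement once:
#
#   ActiveRecord::Base.connection.execute(UPSERT_SQL)
```

The RETURNING clause is what ties the two halves together: it feeds the updated ids to the INSERT's NOT IN filter, so each incoming row is either updated or inserted, never both.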
It's a two-step thing. First you need to fetch the XML file. If it's provided by a user via a form, that's lucky for you; otherwise you need to fetch it using Ruby's standard HTTP library or a gem like mechanize (which is actually really great).
The second step is really easy. You read the whole XML file into a string and then convert it into a hash with this piece of code:
Hash.from_xml(xml_string)
Then you can parse and work with the data...
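Hash.from_xml comes from ActiveSupport, so it's available in any Rails app; outside Rails, Ruby's bundled REXML can do the same walk. A self-contained sketch, where the <orders> structure is invented for illustration:

```ruby
require "rexml/document"

xml_string = <<-XML
<orders>
  <order><id>1</id><status>shipped</status></order>
  <order><id>2</id><status>pending</status></order>
</orders>
XML

# Turn each <order> element into a plain hash of tag name => text,
# roughly what Hash.from_xml(xml_string)["orders"]["order"] gives you.
doc = REXML::Document.new(xml_string)
orders = REXML::XPath.match(doc, "//order").map do |el|
  el.children
    .select { |c| c.is_a?(REXML::Element) }
    .each_with_object({}) { |c, h| h[c.name] = c.text }
end

orders.each { |o| puts "#{o['id']}: #{o['status']}" }
# prints "1: shipped" then "2: pending"
```

From here, the upsert step from the accepted answer takes over: each hash is one row to bulk-load and merge.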
