We are using Breeze.js on the client side as a data access layer over Entity Framework.
We need to maintain the audit log for all the data changes on server.
Can somebody please advise what could be the best way to do that?
The two options that occur to me are:
1) Server-side triggers
2) Using the Breeze beforeSaveEntities mechanism to intercept the Breeze saveChanges call and add audit records directly in code on the server.
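For option 2, a minimal sketch of what the server-side interception could look like. This assumes the newer Breeze.ContextProvider / Breeze.ContextProvider.EF6 NuGet packages (older versions use the Breeze.WebApi namespace), and the `MyDbContext` and `AuditRecord` types are hypothetical names, not anything the question defines:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Breeze.ContextProvider;
using Breeze.ContextProvider.EF6;

// Hypothetical audit entity; MyDbContext is assumed to expose a
// DbSet<AuditRecord> so EF can persist it with the rest of the save.
public class AuditRecord
{
    public int Id { get; set; }
    public string EntityType { get; set; }
    public string Action { get; set; }
    public DateTime ChangedAtUtc { get; set; }
}

public class AuditingContextProvider : EFContextProvider<MyDbContext>
{
    protected override Dictionary<Type, List<EntityInfo>> BeforeSaveEntities(
        Dictionary<Type, List<EntityInfo>> saveMap)
    {
        var audits = new List<EntityInfo>();
        foreach (var entityInfo in saveMap.Values.SelectMany(v => v))
        {
            // Skip audit records themselves to avoid auditing the audit.
            if (entityInfo.Entity is AuditRecord) continue;

            var audit = new AuditRecord
            {
                EntityType = entityInfo.Entity.GetType().Name,
                Action = entityInfo.EntityState.ToString(), // Added / Modified / Deleted
                ChangedAtUtc = DateTime.UtcNow
            };
            // CreateEntityInfo wraps the new record so Breeze saves it too.
            audits.Add(CreateEntityInfo(audit, EntityState.Added));
        }
        if (audits.Count > 0 && !saveMap.ContainsKey(typeof(AuditRecord)))
        {
            saveMap.Add(typeof(AuditRecord), audits);
        }
        return saveMap;
    }
}
```

Because the audit rows are added to the same save map, they are persisted in the same transaction as the changes they describe, which is one advantage over triggers being harder to keep in application code.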
Related
I need some advice on how to proceed with this design please:
I use BreezeJS internally on top of my Entity Framework Web API. So
all my data retrievals and saves are done in the Breeze style.
I now need to expose an API for external consumption which has a subset of
the calls I use internally.
Ideally I would like to consume the same API internally.
I do not want to tell consumers that they have to use BreezeJS on their client side to consume my API (I am using Swagger on top of it).
So, for now I have a separate set of API calls for the External API which go through different validation code and have a structure like:
Get(item)
Get(items)
List item
Post(item)
etc.
My question is: am I going down the wrong track? Should I be using BreezeJS on the External API, or stop using Breeze internally and consume the External API myself, or perhaps wrap the External API methods around Breeze and restrict saving of the entity graph to only the entity type being queried at the time?
I am using localStorage to maintain data on the client side in MVC.
I need to read the localStorage data in an MVC controller. Is this possible?
If it is, please explain how to do it.
The server doesn't have access to the client browser, which is where the local data is stored. You will need to send the data to the server using JavaScript.
If you can give us some more information, I can give you an example that would solve your problem.
What version of MVC are you using (MVC 2/3/4, Web API, etc.)?
What is the reason you need this data visible to the server?
Do you want the controller to do something with this data before it returns a new view?
etc.
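As a rough sketch of the third case above: the client reads localStorage and POSTs it up (e.g. `$.post('/Home/SaveClientState', { payload: localStorage.getItem('myKey') })`), and a controller action receives it before building the next view. The controller, action, and parameter names here are all assumptions:

```csharp
using System.Web.Mvc;

public class HomeController : Controller
{
    // Receives the localStorage payload the client posted via JavaScript.
    [HttpPost]
    public ActionResult SaveClientState(string payload)
    {
        // Stash it somewhere server-side (Session, DB, etc.) so later
        // actions can use it when returning a view.
        Session["clientState"] = payload;
        return Json(new { ok = true });
    }
}
```

The key point stands either way: the server only ever sees this data after the browser explicitly sends it in a request.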
I need to know how many requests I received through the OData protocol. Is there any way I can keep track of this? I need it for analytics purposes. (Using a custom tool is not allowed, but it would still be great if you could point me to some.)
Thanks, Ritwik
A few options you could consider:
If you need to capture the data into your own custom format/DB, hook into the WCF Data Services ProcessingPipeline (there are general events there for pre- and post-processing of queries and data change events) - CodeProject Example
If you need your own format but for specific entities, check out the Query and Change Interceptors - MSDN
If you can use another framework and need a quick, ready-to-go dashboard (built into IIS), check out AppFabric Monitoring (requires some install/setup on the server) - WCF Data Service and AppFabric
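For the second option, a minimal sketch of counting requests with WCF Data Services interceptors. The service class, `MyEntities` context, `Order` entity, and counter fields are all assumed names for illustration:

```csharp
using System;
using System.Linq.Expressions;
using System.Threading;
using System.Data.Services;

public class OrdersService : DataService<MyEntities>
{
    private static long _queryCount;
    private static long _changeCount;

    // Called once for every query against the Orders entity set.
    [QueryInterceptor("Orders")]
    public Expression<Func<Order, bool>> OnQueryOrders()
    {
        Interlocked.Increment(ref _queryCount);
        return o => true; // count only; don't filter anything out
    }

    // Called for every insert/update/delete against Orders.
    [ChangeInterceptor("Orders")]
    public void OnChangeOrders(Order order, UpdateOperations operations)
    {
        Interlocked.Increment(ref _changeCount);
    }
}
```

One interceptor pair is needed per entity set, which is why this fits the "specific entities" case; for service-wide totals, the ProcessingPipeline events are the better hook.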
Project:
Exposing an Entity Framework ObjectContext, configured with the code-first approach, via OData (WCF Data Services).
Everything works fine for simple queries and CUD operations.
However, I can't see how to configure default loading of related entities (server side).
I.e., if my entity Customer has a collection of Addresses and a one-to-one relation to an entity called Manager, how can I configure my ObjectContext so that every query on Customers automatically loads all the Addresses and the Manager of the Customer entities?
I know that on the client side, the caller can use query().Expand("path") to eager-load data. But I want to specify it on the server side, so that every query on Customers entities behaves as if .Include("Addresses") or .Include("Manager") were configured by default.
Any idea?
The only 'hack' we can think of is an HttpModule that intercepts GET requests and adds a ?$expand=XXX query option to the URL. This would be my last resort if we cannot find anything better...
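For reference, that HttpModule hack could be sketched roughly like this, assuming the service lives at a path ending in /Customers and that the module is registered in web.config (the module name and paths are assumptions):

```csharp
using System;
using System.Web;

public class DefaultExpandModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            var ctx = ((HttpApplication)sender).Context;
            var req = ctx.Request;

            // Only touch GETs on the Customers set that don't already expand.
            if (req.HttpMethod == "GET" &&
                req.Path.EndsWith("/Customers", StringComparison.OrdinalIgnoreCase) &&
                string.IsNullOrEmpty(req.QueryString["$expand"]))
            {
                var qs = req.QueryString.ToString(); // no leading '?'
                var newQs = string.IsNullOrEmpty(qs)
                    ? "$expand=Addresses,Manager"
                    : qs + "&$expand=Addresses,Manager";

                // Rewrite the request so the data service sees the $expand.
                ctx.RewritePath(req.Path, req.PathInfo, newQs);
            }
        };
    }

    public void Dispose() { }
}
```

It is a hack in the sense that the expansion lives in HTTP plumbing rather than in the data model, but it does keep clients unaware of it.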
Thanks for your help!
You could try using a query interceptor.
http://msdn.microsoft.com/en-us/library/dd744837.aspx
Our current project at work is a new MVC web site that will use a WCF service primarily to access a 3rd party billing system via a web service as well as a small SQL database for user personalization. The WCF service uses nHibernate for the SQL database.
We'd like to implement some sort of web farm for load balancing as well as failover and maintenance. I'm trying to decide the best way to handle nHibernate's caching and database concurrency if there are multiple WCF services running.
Some scenarios I've been thinking about...
1) Multiple IIS servers, one WCF server. With this setup, the WCF server would be a single point of failure, but there would be no issues with nHibernate caching or database concurrency.
2) Multiple IIS servers, each with its own WCF service. This removes the single point of failure, but now nHibernate on one machine would not know about database changes made by another machine.
Some solutions to number 2 would be to use an IStatelessSession so we're not doing any caching and nHibernate always fetches directly from the database. This might be the most feasible option, as our personalization database has very few objects in it. I'm also considering a second-level cache such as memcached or Velocity, but it may be overkill for this system.
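The IStatelessSession idea would look roughly like this; the repository and the `UserSettings` entity are hypothetical names standing in for the personalization data:

```csharp
using NHibernate;

public class PersonalizationRepository
{
    private readonly ISessionFactory _sessionFactory;

    public PersonalizationRepository(ISessionFactory sessionFactory)
    {
        _sessionFactory = sessionFactory;
    }

    public UserSettings GetSettings(int userId)
    {
        // A stateless session has no identity map and no first-level cache,
        // so every call here goes straight to the database - no stale reads
        // across servers in the farm.
        using (IStatelessSession session = _sessionFactory.OpenStatelessSession())
        {
            return session.Get<UserSettings>(userId);
        }
    }
}
```

Note that stateless sessions also skip lazy loading, cascades, and interceptors/events, so they suit simple read/write personalization records better than complex object graphs.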
I'm putting this out there to see if anyone has experience doing this sort of architecture and to get some ideas for a solution. Thanks!
Am I missing something here? I don't see a problem with nHibernate on the web servers.
The application cache would not be a problem, as each nHibernate box would keep its own cache, populated from the datastore. Look at creating a table that can be monitored for reasons to do a cache refresh. We used to do this using the CacheDependency class in .NET 2.0, which would detect changes and then remove the relevant item from the cache. So if a user inserted a new product, the cache entry would be dropped and the next call to get the products would load the cache again. It's old, but check out http://msdn.microsoft.com/en-us/magazine/cc163955.aspx#S2 for the concept. Cheers
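A sketch of that table-watch pattern using SqlCacheDependency (the polling-based notification from ASP.NET 2.0). The database entry name and table name are assumptions, and the table must first be enabled for notifications via aspnet_regsql or SqlCacheDependencyAdmin:

```csharp
using System;
using System.Web;
using System.Web.Caching;

public static class ProductCache
{
    public static object GetProducts(Func<object> loadFromDb)
    {
        var cached = HttpRuntime.Cache["products"];
        if (cached != null) return cached;

        var products = loadFromDb();

        // "PersonalizationDb" must match a <sqlCacheDependency> database
        // entry configured in web.config.
        var dependency = new SqlCacheDependency("PersonalizationDb", "Products");

        // Any change to the Products table (from ANY server in the farm)
        // invalidates this entry, so the next call reloads from the database.
        HttpRuntime.Cache.Insert("products", products, dependency);
        return products;
    }
}
```

Because the invalidation signal comes from the database itself, each server's local cache gets refreshed even when the change was made by a different box, which is exactly the scenario 2 concern.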
I would suggest not doing caching until not doing caching becomes a problem. Your DB will do its own caching to save you searching for the same data repeatedly, so the only thing you have to worry about is data across the wire. Judging by your description, you're not going to have a problem there. If you ever get to a stage where you do, use a distributed cache - allowing your servers to cache separately will cause you bouncing data problems on refresh.