We've got a bunch of data that we'd like to expose to the world, hosted on an ASP.NET MVC website. I'd like to deliver it using technology that is easy for end developers to implement and not tied to any particular platform, rather than something unpopular with developers or incompatible with their tooling.
The kinds of requests we expect are mainly retrieving search results (not many parameters), but down the line we'd like to provide catalogue lookups and the like, which may be more complex.
Bearing this in mind, what is the preferred means of doing this?
Windows Communication Foundation (WCF) can be used to create either SOAP services (a good fit if your consumers are businesses using Visual Studio/.NET or Java) or REST services (for consumers on other platforms). Those are the preferred means of exposing public APIs.
If you want maximum exposure, it is probably best to use the REST approach, since it is easier to consume from "web" languages like JavaScript. Microsoft has extensive resources on putting together a REST API using WCF.
Honestly, for the kinds of requests you say you need to handle, which all seem to read data rather than modify it, the difference is almost trivial: you can switch from SOAP to REST by changing a few attributes/configuration options, and you could even host both at the same time with very little additional code. As long as you stick to WCF and avoid outdated technology like ASMX/WSE, you will be fine.
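To make the "few attributes" point concrete, here is a minimal sketch (the contract name and URI template are made up) of a single WCF contract that can back a SOAP endpoint and a REST endpoint at the same time:

```csharp
using System.Collections.Generic;
using System.Runtime.Serialization;
using System.ServiceModel;
using System.ServiceModel.Web;

[DataContract]
public class SearchResult
{
    [DataMember] public string Title { get; set; }
    [DataMember] public string Url { get; set; }
}

// One contract, two bindings: a SOAP endpoint simply ignores the [WebGet]
// attribute, while a REST endpoint uses it to map GET /search?q=... to Search().
[ServiceContract]
public interface ISearchService
{
    [OperationContract]
    [WebGet(UriTemplate = "search?q={query}", ResponseFormat = WebMessageFormat.Json)]
    List<SearchResult> Search(string query);
}
```

In configuration you would then register two endpoints for the same contract: one with basicHttpBinding for SOAP, and one with webHttpBinding (plus the webHttp endpoint behavior) for REST.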
Reasons to use REST:
Consumable from almost anywhere (including JavaScript, RSS readers, etc.)
It's popular (in use by Google, Twitter, etc.)
Supports many different data formats (JSON, Atom, etc.)
Reasons to use SOAP:
Standardized security protocol (encryption, non-repudiation, etc.)
Distributed transactions
Message Queuing
That's not an exhaustive list but it should give you an idea of who the target markets are for each. If you're hosting a very open, very public site designed to be consumed by anyone and everyone, go with REST. If the service is part of a business system and you need to guarantee reliability, security, and consistency of data, you'll want to go with SOAP. Choose the appropriate technology based on your target market.
Create a RESTful API. As a developer who often consumes web services, it's what I would expect and prefer.
Many popular services (Digg/Twitter/Netflix/Google) are moving to REST over SOAP, so you would be wise to follow suit.
If you do create a REST API you should also create a WADL file. It's WSDL for REST. WADL isn't well supported yet, but the files aren't hard to create, and they'll become more useful as support increases.
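For illustration, a skeletal WADL for a hypothetical search resource might look something like this (the base URI and parameter are invented):

```xml
<application xmlns="http://wadl.dev.java.net/2009/02"
             xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <resources base="http://example.com/api/">
    <resource path="search">
      <method name="GET" id="search">
        <request>
          <param name="q" style="query" type="xsd:string" required="true"/>
        </request>
        <response>
          <representation mediaType="application/json"/>
        </response>
      </method>
    </resource>
  </resources>
</application>
```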
You will want to check out OData. Look at odata.org and live.visitmix.com/videos
This will give you REST access, metadata support like in SOAP, and interoperability with the whole Office stack, and if you are using WCF Data Services you can implement it in a matter of hours, days at most.
Take a look at netflix.com; they have done it right (IMHO).
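As a rough sketch of how little code WCF Data Services needs (CatalogueEntities is a placeholder for your own Entity Framework context):

```csharp
using System.Data.Services;        // WCF Data Services
using System.Data.Services.Common;

// Exposes every entity set in the model as a read-only OData feed.
public class CatalogueDataService : DataService<CatalogueEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        // Read-only access to all sets; lock this down per set in production.
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V3;
    }
}
```

With just that, clients get $filter, $orderby, $top and the rest of the OData query options on every exposed entity set.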
The community of developers using OData for their REST implementations seems to be the smallest of all the REST communities I usually come across.
Any reasons?
There is virtually no contract. A service consumer has no idea how to use the service (for example, what are valid Command arguments, encoding expectations, and so on).
The interface errs on the side of being too liberal in what it will accept.
The contract does not provide enough information to consumers on how to use the service. If a consumer must read something other than the service’s signature to understand how to use the service, the factoring of the service should be reviewed.
Consumers are expected to be familiar with the database and table structures prior to consuming the Web service. This results in a tight coupling between service providers and consumers.
Performance will suffer due to dependencies on late binding and encoding/decoding between boundaries within the same service.
Source: https://docs.servicestack.net/why-not-odata
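To make the coupling point concrete, a typical OData request bakes entity set and property names from the underlying model straight into the URL (a hypothetical example):

```http
GET /odata/Products?$filter=UnitPrice gt 10&$orderby=ProductName&$top=20
```

Rename a column that the model mirrors and every consumer issuing queries like this breaks.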
OData is a great standard to expose datasets with good tool support (Excel, Tableau, PowerBI...).
As far as I'm concerned it has saved me a lot of time and effort: projection/sorting/filtering are available out of the box without having to code anything (especially with .NET). It's my go-to option for RESTful APIs over table-like structures.
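For instance, with ASP.NET Web API and the OData package, a single attribute lights all of that up (a minimal sketch; AppDbContext and Product are placeholders):

```csharp
using System.Linq;
using System.Web.Http;
using System.Web.OData;   // from the Microsoft.AspNet.OData package

// [EnableQuery] translates $filter, $orderby, $select, $top, etc.
// into LINQ over the returned IQueryable -- no query-parsing code needed.
public class ProductsController : ODataController
{
    private readonly AppDbContext _db = new AppDbContext();

    [EnableQuery]
    public IQueryable<Product> Get()
    {
        return _db.Products;
    }
}
```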
I had an interesting conversation with a contractor from one of the major outsourcing companies the other day. He has built restful APIs for many customers and when I asked if he used OData sometimes, he replied 'we don't do OData, we prefer Json' (sigh...).
So I guess one of the possible answers to your question is ignorance, many simply don't know OData or understand it...
How do I know when a website uses RDF?
For example, I know that eBay and Amazon use RDF because I've read it in many articles, but how do I tell in practice?
In practice, there is currently no single standardized way for websites to "advertise" their use of RDF. You find out by them informing you about it in some fashion, e.g. by them publishing a link to API documentation that describes how they use RDF, or indeed by writing an article about it, so pretty much the same way you find out about any REST API / web service. Of course, in the case of RDF/linked data you are often helped by the fact that other datasets you already know about may be linking to the new source, thus making it discoverable.
There are some attempts at defining more standardized mechanisms for 'advertising' a website's linked data use. The W3C VoID specification is the closest thing to a standard in that regard: it provides a vocabulary for publishers to describe the data and access mechanisms they offer, and it also gives pointers on how to make things discoverable. Unfortunately, it is not (yet) very widely adopted.
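To give a flavour of it, a minimal VoID description in Turtle might look like this (the dataset and URIs are invented):

```turtle
@prefix void:    <http://rdfs.org/ns/void#> .
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix :        <http://example.org/void#> .

:catalogue a void:Dataset ;
    dcterms:title "Example Product Catalogue" ;
    void:sparqlEndpoint <http://example.org/sparql> ;
    void:dataDump <http://example.org/dumps/catalogue.nt> ;
    void:exampleResource <http://example.org/product/42> .
```

The spec also suggests publishing such a description at the site's /.well-known/void location so that clients have a predictable place to look.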
I need to make a recommendation on approaches for allowing web service (WCF) documentation (wsdl, schemas, locations etc.) to be stored and found. Being able to monitor the services would be a definite bonus.
This needs to be considered in the wider context of moving to an SOA built, where possible, with Microsoft technologies that should be accessible by clients from other frameworks. The aim is to develop a system in which clients do not need to change if a service is moved or new versions are brought online - it should be possible to write the client 'knowing' just one address / location which is capable of directing them appropriately.
Having a central location for the service documentation is important too; our Business Analysts should be able to find all they need to about the services we provide from a central place. We would also want (potentially) to expose that repository of service information to partners as well. I know we could generate wsdls and manually manage them (create a folder somewhere and zip them up before sending them out) but that seems very labour intensive and prone to error (on my part).
As I see it at the moment there are two broad approaches:
Write something bespoke that uses WS-Discovery and a dynamic routing service which can respond to client requests (see the sketch after this list).
Get an off-the-shelf solution.
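To illustrate what I mean by the first approach, here is roughly the shape of it using WCF 4's ad-hoc WS-Discovery support (CatalogueService and its contract are stand-ins for our real services):

```csharp
using System;
using System.ServiceModel;
using System.ServiceModel.Discovery;

[ServiceContract]
public interface ICatalogueService
{
    [OperationContract]
    string Lookup(string id);
}

public class CatalogueService : ICatalogueService
{
    public string Lookup(string id) { return "item " + id; }
}

class DiscoveryDemo
{
    static void Main()
    {
        // Host side: an ordinary WCF service made discoverable over UDP multicast.
        var host = new ServiceHost(typeof(CatalogueService),
                                   new Uri("http://localhost:8000/catalogue"));
        host.AddServiceEndpoint(typeof(ICatalogueService),
                                new BasicHttpBinding(), string.Empty);
        host.Description.Behaviors.Add(new ServiceDiscoveryBehavior());
        host.AddServiceEndpoint(new UdpDiscoveryEndpoint());
        host.Open();

        // Client side: locate the service by contract instead of a hard-coded address.
        var discoveryClient = new DiscoveryClient(new UdpDiscoveryEndpoint());
        FindResponse found = discoveryClient.Find(
            new FindCriteria(typeof(ICatalogueService)));
        Console.WriteLine("Found at: " + found.Endpoints[0].Address);
    }
}
```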
I have to say that an off-the-shelf solution is the most likely approach to be accepted, but I have to at least consider the alternatives. For the off-the-shelf solutions I have identified:
BizTalk
WSO2 ESB and WSO2 Governance Registry
as possibly providing the features.
What I need to know
Am I right with my understanding of the broad approaches?
Are there any other approaches I should consider evaluating?
Specifically I also need to know pros and cons of any approach I consider and have an idea of how it could be implemented.
To start with, I would definitely not go with BizTalk or any WS-* SOAP-based protocol.
Go simpler and you'll be a happy man in the end.
For the middleware I would go with MassTransit,
or, if you prefer, NServiceBus, which I'm not a big fan of, but which provides another level of enterprise support. If you choose to go with event-driven SOA you'd get async operations as a bonus.
With the middleware layer defined, it is time to define the API layer. I would not expose my services to the outside world; if the middleware is event-based, the services within it can only respond to events placed on the bus. So I would use ASP.NET Web API with a REST interface to receive outside requests and, based on the request type, create the related message (command) and place it on the bus.
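Something like this, as a very rough sketch of that REST front door (SubmitSearch is a made-up command type, and the bus instance would be wired up elsewhere with MassTransit's configuration API):

```csharp
using System.Net;
using System.Threading.Tasks;
using System.Web.Http;
using MassTransit;

// Hypothetical command; consumers inside the middleware react to it,
// while the outside world only ever talks to the REST facade.
public class SubmitSearch
{
    public string Query { get; set; }
}

public class SearchController : ApiController
{
    private readonly IBus _bus;

    public SearchController(IBus bus) { _bus = bus; }

    // POST api/search -- translate the HTTP request into a bus message.
    public async Task<IHttpActionResult> Post([FromBody] SubmitSearch request)
    {
        await _bus.Publish(request);
        return StatusCode(HttpStatusCode.Accepted); // work continues asynchronously
    }
}
```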
Way too high-level, but I hope it helps.
We have a 3-tier Delphi application written using RemObjects DataAbstract. Many of our customers are asking for an API so they can interact with it using their own applications.
The API must allow the clients to call methods with various parameters and return results ranging from simple parameters to whole datasets.
What types of API can you recommend and how difficult are they to implement?
Since you've written your application using RemObjects DataAbstract then you've got just about everything you need already waiting for you in your application.
RemObjects DataAbstract includes the RemObjects SDK, which is one of the most flexible and easy ways to build an API available. The RemObjects SDK lets you expose methods to your customers in a multitude of ways: from native binary RemObjects calls, to XML-RPC, to JSON, to SOAP, to a local DLL, to Windows Messages, to Named Pipes... even via SMTP/POP.
The beauty is that you'll be able to design one API and then easily expose it to your customers via any or all of these different mechanisms. Just design your API methods, then ask your customers how they'd like to consume them; chances are RemObjects have a message/channel combination that matches their request.
Publish the API as functions in a DLL. Easy enough to code, but limited by what a DLL can expose (only plain functions, etc.) and not easy to call from scripts, for example.
Publish the API as COM objects. A bit more complex to implement (especially if you have never used COM before), but very flexible. Can be easily called from scripts, if needed.
Use a standard generic RPC mechanism like SOAP or REST. Better suited for servers; not difficult to implement, but requires an active "listener" to receive the calls.
Use your own protocol. Longer to implement, and although it can be faster than SOAP or REST, it also requires more work on the customer side.
Besides the plain business-logic API, I think it will also be a big advantage if the application offers APIs for generic tasks like:
logging / audit trails
monitoring (performance, statistics)
rights administration
basic administration (shutdown / go to maintenance mode)
messaging (send notifications to users or applications)
Is it good practice to develop the web service and website in two different languages, on two different servers? E.g. right now I have a Java web service running on GlassFish and a Ruby on Rails presentation layer running on the same server.
I'd like to leave the web service on the same server but use Ruby 1.9 running under Passenger.
Is it a good idea? I don't have experience in the architecture of web apps.
If you write a contract-first web service that consumes and produces XML, you can talk to any client that can make an HTTP GET or POST request in the appropriate format. SOAP or REST, it doesn't matter.
I've written Java/Spring web services that started with an XSD. A Yahoo UI RIA client took the WSDL, made an HTTP POST to send the request document, and displayed the XML response in a nice data grid.
Technically, yes, you can most certainly do that. That is one of the advantages of using web services: they are interoperable.
However, I would give some thought to what happens if someone else has to maintain it but has expertise in only one of the two platforms (RoR or Java). It is always best to ask :-)
In terms of the architecture of the system, yes, this is a "good practice". By good, I mean that it achieves the goals, does no harm, and enforces separation of concerns.
I've been developing on an architecture that has a similar structure. The user interface is .NET and uses Java web services. Those web services are then responsible for all interaction with the persistence media, third-party components, etc.
I'd say in any system you should be working to abstract your user interface logic from your business logic. It's just good separation of concerns. Using web services to do that is just one way to achieve that goal. I'd recommend using web services in the case that you will re-use those business services in other use cases in your system.
One more thing: after using two different technologies for the UI and WS for the last 8 years, I've learned that most of the challenges are organizational, not technical. For example, it's harder to find new developers who have both of the skills you're looking for to maintain your app. You end up having to find an expert in one and then train them on the other technology.
It depends on how similar they are.
If your web service basically mirrors your website in functionality - then it makes a lot of sense to reuse existing code and thus to make them the same thing on the same server.
Note: this is not the same thing as entangling tiers, since your views are still separate from your business logic.
From the Ruby on Rails perspective, the "web service" and "website" are often interchangeable, as they are exactly the same code with only the view template differing (HTML for the website, XML for the web service).
If you build with a RESTful architecture in mind from the beginning, then you can achieve this with the minimum of duplication and with all application layers correctly decoupled.