How do I know when a web site uses RDF?
For example, I know that eBay and Amazon use RDF because I've read it in many articles, but how do I tell in practice?
In practice, there is currently no single standardized way for websites to "advertise" their use of RDF. You find out because they inform you about it in some fashion, e.g. by publishing a link to API documentation that describes how they use RDF, or because an article gets written about it, so pretty much the same way you find out about any REST API / web service. Of course, in the case of RDF/linked data you are often helped by the fact that other datasets you already know about may be linking to the new source, thus making it discoverable.
There are some attempts at defining more standardized mechanisms for 'advertising' a website's linked data use. The W3C VoID specification is the closest thing to a standard in that regard: it provides a vocabulary for publishers to describe the data and access mechanisms they offer, and it also gives pointers on how to make things discoverable. Unfortunately, it is not (yet) very widely adopted.
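As an illustration, a minimal sketch of VoID-based discovery (Python with the rdflib library) might look like the following; it assumes the publisher follows the spec's suggested convention of serving a description document at /.well-known/void, and the site URL is made up:

    from rdflib import Graph, Namespace
    from rdflib.namespace import RDF

    VOID = Namespace("http://rdfs.org/ns/void#")

    g = Graph()
    # Hypothetical publisher; the VoID note suggests /.well-known/void as a
    # discovery location, but not every publisher follows it.
    g.parse("http://example.org/.well-known/void")

    # List every dataset the publisher declares, plus any SPARQL endpoints.
    for dataset in g.subjects(RDF.type, VOID.Dataset):
        print("Dataset:", dataset)
        for endpoint in g.objects(dataset, VOID.sparqlEndpoint):
            print("  SPARQL endpoint:", endpoint)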
Of all the REST implementations I usually come across, the community of developers using OData seems to be the smallest.
Any reasons?
There is virtually no contract. A service consumer has no idea how to use the service (for example, what are valid Command arguments, encoding expectations, and so on).
The interface errs on the side of being too liberal in what it will accept.
The contract does not provide enough information to consumers on how to use the service. If a consumer must read something other than the service’s signature to understand how to use the service, the factoring of the service should be reviewed.
Consumers are expected to be familiar with the database and table structures prior to consuming the Web service. This results in a tight coupling between service providers and consumers.
Performance will suffer due to dependencies on late binding and encoding/decoding between boundaries within the same service.
Source: https://docs.servicestack.net/why-not-odata
OData is a great standard for exposing datasets, with good tool support (Excel, Tableau, PowerBI...).
As far as I'm concerned it has saved me a lot of time and effort: projecting, sorting, filtering and so on are available out of the box without having to code anything (especially with .NET). It's my go-to option for RESTful APIs on table-like structures.
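To make that concrete, here is a minimal sketch of the kind of query a consumer gets for free once a collection is exposed via OData (Python with the requests library; the service URL is hypothetical, but the $select/$filter/$orderby/$top query options are part of the standard):

    import requests

    BASE = "https://example.com/odata/Products"  # hypothetical OData collection

    params = {
        "$select": "Name,Price",    # projection
        "$filter": "Price lt 20",   # filtering
        "$orderby": "Price desc",   # sorting
        "$top": "10",               # paging
    }

    resp = requests.get(BASE, params=params, headers={"Accept": "application/json"})
    resp.raise_for_status()

    # OData v4 JSON responses wrap the entities in a "value" array.
    for product in resp.json()["value"]:
        print(product["Name"], product["Price"])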
I had an interesting conversation with a contractor from one of the major outsourcing companies the other day. He has built restful APIs for many customers and when I asked if he used OData sometimes, he replied 'we don't do OData, we prefer Json' (sigh...).
So I guess one of the possible answers to your question is ignorance: many simply don't know OData or don't understand it...
I am looking for advice to save me time. I am planning to create a Q&A web app for my university, a Stack Overflow clone. I know Rails and I know Angular, but I have never used them together. One option for me is to use the Firebase APIs because it's simple. My question is: which is easier, making Angular consume Rails APIs or Firebase APIs? Or are the steps the same no matter what I use to create the APIs?
This is not going to be the same process. It's also not a direct comparison.
Firebase is a hosted third party 'backend-as-a-service' and you use what they give you for API calls, but you can more or less rely on the API working as advertised (though the docs can be less than useful in places).
That is versus building your own RoR API, which means just that: you pick the groceries just as you want them, but you also get to fix all the bugs in both your API and your client. You also still need to select a DB.
A more direct comparison would be 'should I pick Node or RoR?' Your question as it's posed is really a question of your own backend implementation versus a hosted package.
Which is easier?
There is not a real, single answer.
If by easier you mean 'most direct', Firebase has an Angular library called AngularFire. It translates a lot of Firebase paradigms into a pretty familiar Angular pattern, with a couple of nice extras. You focus on your client code and DB design; it handles the server operations. That seems fairly direct to me.
If by easier you mean 'most flexibility', it's hard to see how building your own doesn't give you that.
If by easier you mean 'less work for me', then 'it depends'. If you are comfortable writing RoR backends and less familiar with NoSQL patterns, then you can probably put it together faster on your own, setting up your API methods and selecting a DB you are comfortable with. If you feel stronger with Angular than RoR, then learning Firebase paradigms might be a shorter climb.
For what it's worth, given its limited set of API calls, there is probably more focused support for specific questions about Firebase. But you do sacrifice the option of doing it 'your way,' and the RoR community is far larger than Firebase's so you can probably still get plenty of help. Like I said, it can really depend.
I am trying to get UserProfile data from a SharePoint 2010 site using Objective-C within Xcode. Right now I am using the SOAP service in my project. Is anyone able to point me in the right direction here? Thank you.
You probably mean "iOS" or "Cocoa" instead of Xcode.
If possible, avoid SOAP. It's much easier to access a web service via REST, using JSON as the transport format, and in 99.8% of all use cases a RESTful web service and JSON will fulfill all of your requirements.
What you need to accomplish your task can be summarized as "networking development", which involves NSURLConnection (and related classes), NSJSONSerialization, and a few other system classes depending on your needs.
Unless you stick with a RESTful web service, JSON, and moderate requirements, networking can quickly become complex. And it becomes unnecessarily complex when using SOAP. You may want to utilize a third-party library to help here.
I'm assuming you are already familiar with the basic major principles of programming in Objective-C and for Mac OS X and iOS. So, I would suggest starting by reading examples from the Apple docs that involve networking and NSURLConnection (e.g. MVCNetworking).
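To show how little is involved on the REST/JSON side, here is a language-agnostic sketch of the pattern in Python (the endpoint URL and field name are made up); on iOS the two steps map onto NSURLConnection/NSURLSession for the request and NSJSONSerialization for the parsing:

    import json
    import urllib.request

    # Hypothetical REST endpoint returning a JSON user profile.
    url = "https://example.com/api/userprofiles/jdoe"

    with urllib.request.urlopen(url) as response:
        profile = json.loads(response.read().decode("utf-8"))

    print(profile.get("displayName"))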
We're actually looking to integrate Moses into our localization workflow. Our application is in Java and we're looking at using Moses' functionalities using xml-rpc calls.
Specifically, we're looking at APIs for:
Incremental training (i.e. avoiding having to retrain the model from scratch every time we wish to use some new training data)
Domain-specific training (i.e. it should maintain separate phrase tables for each domain that the input data belongs to)
Decoding
The tutorial says that these can be achieved via XML-RPC calls, but I can't find any examples or clear explanations of how to do them. Can someone please provide some examples?
Also, I would like to know if the training and decoding phases can be done in a distributed manner.
Thanks!
This question is perfectly suitable for the Moses mailing list:
http://www.statmt.org/moses/?n=Moses.MailingLists
Moses server documentation (via XML-RPC):
http://www.statmt.org/moses/?n=Moses.AdvancedFeatures#ntoc28
However, I have had better experiences with moses/contrib/web/bin/daemon.pl, which also runs a server, and you communicate with it via a TCP stream.
General examples are harder to find (everyone has a different environment, ...), but make your question more specific and send it to the Moses mailing list. (E.g. someone had a problem with server installation: http://comments.gmane.org/gmane.comp.nlp.moses.user/7242)
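For what it's worth, a minimal decoding call against a running mosesserver looks roughly like the sketch below (Python's standard xmlrpc.client here; from Java you would do the same with any XML-RPC client library). Host, port, and the exact parameter set are assumptions, so check them against the server documentation linked above:

    import xmlrpc.client

    # Assumes a mosesserver instance started along the lines of:
    #   mosesserver -f moses.ini --server-port 8080
    server = xmlrpc.client.ServerProxy("http://localhost:8080/RPC2")

    params = {
        "text": "das ist ein kleines haus",  # tokenized source sentence
        "align": "true",                     # also return word alignment info
    }

    result = server.translate(params)  # the server exposes a "translate" method
    print(result["text"])              # the translation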
We've got a bunch of data that we'd like to expose to the world, hosted on an ASP.NET MVC website. I'd like to ensure that we deliver it using technology that is easy for end developers to implement and not tied to any particular platform, rather than technology that is unpopular with or incompatible with what developers use.
The kind of requests we expect are mainly to retrieve search results (not many parameters), but down the line we'd like to be able to provide catalogue lookups and the like, which may be more complex.
Bearing this in mind, what is the preferred means of doing this?
Windows Communication Foundation can be used to create both SOAP services (great if your consumers are businesses, using Visual Studio/.NET or Java) or REST services (for people on other platforms). Those are the preferred means of exposing public APIs.
If you want maximum exposure, probably best to use the REST approach, since it is easier to consume from "web" languages like JavaScript. Microsoft has extensive resources on putting together a REST API using WCF.
Honestly, for the kinds of requests you say you need to handle, which all seem to be looking up data as opposed to modifying it, the difference is almost trivial: you can switch from SOAP to REST simply by changing a few attributes/configuration options, and you could technically even host both at the same time using very little additional code. As long as you stick to WCF and don't use outdated technology like ASMX/WSE, you will be fine.
Reasons to use REST:
Consumable from almost anywhere (including JavaScript, RSS readers, etc.)
It's popular (in use by Google, Twitter, etc.)
Supports many different data formats (JSON, Atom, etc.)
Reasons to use SOAP:
Standardized security protocol (encryption, non-repudiation, etc.)
Distributed transactions
Message Queuing
That's not an exhaustive list but it should give you an idea of who the target markets are for each. If you're hosting a very open, very public site designed to be consumed by anyone and everyone, go with REST. If the service is part of a business system and you need to guarantee reliability, security, and consistency of data, you'll want to go with SOAP. Choose the appropriate technology based on your target market.
Create a RESTful API. As a developer who often consumes web services, it's what I would expect and prefer.
Many popular services (digg/twitter/netflix/google) are moving to REST over SOAP, so you would be wise to follow suit.
If you do create a REST API you should also create a WADL file. It's like WSDL for REST. They're not well supported yet, but they're not hard to create and they'll become more useful as support increases.
You will want to check out OData. Look at odata.org and live.visitmix.com/videos.
This will give you REST access, metadata support as in SOAP, and interoperability with the whole Office stack; if you are using WCF Data Services you can implement it in a matter of hours, days at most.
Take a look at netflix.com; they have done it right (IMHO).