I am trying to build a number classification model in CoreML and want to use a Naive Bayes classifier, but I can't find out how to use one. My algorithm uses Naive Bayes.
At the moment, coremltools supports only the following types of classifiers:
SVMs (scikit-learn)
Neural networks (Keras, Caffe)
Decision trees and their ensembles (scikit-learn, xgboost)
Linear and logistic regression (scikit-learn)
However, implementing Naïve Bayes in Swift yourself is not that hard; see this implementation, for example.
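To give a sense of how little code the algorithm needs, here is a minimal Gaussian Naive Bayes sketch in Python (the class name and the variance-smoothing constant are my own, not from any library); the same logic ports directly to Swift:

import numpy as np

class GaussianNaiveBayes:
    def fit(self, X, y):
        # Per-class priors and per-feature means/variances
        self.classes = np.unique(y)
        self.priors = np.array([np.mean(y == c) for c in self.classes])
        self.means = np.array([X[y == c].mean(axis=0) for c in self.classes])
        # Small constant added for numerical stability (my choice)
        self.vars = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes])
        return self

    def predict(self, X):
        # Pick the class maximizing log P(c) + sum_i log N(x_i | mean_ci, var_ci)
        log_post = np.log(self.priors) - 0.5 * np.sum(
            np.log(2 * np.pi * self.vars)
            + (X[:, None, :] - self.means) ** 2 / self.vars, axis=2)
        return self.classes[np.argmax(log_post, axis=1)]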
In the TensorFlow library, what does the tf.estimator.LinearClassifier class do in linear regression models? (In other words, what is it used for?)
A linear classifier is nothing but logistic regression.
According to the TensorFlow documentation, tf.estimator.LinearClassifier is used to "train a linear model to classify instances into one of multiple possible classes. When number of possible classes is 2, this is binary classification."
Linear regression predicts a value while the linear classifier predicts a class. Classification aims at predicting the probability of each class given a set of inputs.
For an implementation of tf.estimator.LinearClassifier, please follow this tutorial by guru99.
To learn more about linear classifiers, read this article.
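To make that concrete, here is a minimal sketch of the estimator in use, assuming the tf.estimator / tf.feature_column API that the question is about; the tiny dataset is made up:

import numpy as np
import tensorflow as tf

# One numeric feature named 'x' with two components
feature_columns = [tf.feature_column.numeric_column('x', shape=[2])]

# n_classes=2 gives binary classification (a logistic regression under the hood)
estimator = tf.estimator.LinearClassifier(feature_columns=feature_columns, n_classes=2)

def input_fn():
    features = {'x': np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]], np.float32)}
    labels = np.array([0, 0, 1, 1], np.int32)
    return tf.data.Dataset.from_tensor_slices((features, labels)).repeat().batch(4)

estimator.train(input_fn=input_fn, steps=100)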
I'm looking for a SVM implementation with support for non-linear kernels and one-vs-rest scenario, to perform a multi-label classification. Preferably, written in Python, or that I can call from Python with wrappers.
I was looking into sklearn, and there are two implementations to use SVM for classification:
sklearn.svm.LinearSVC supports multi-label classification with a one-vs-rest scheme, but it is based on liblinear and therefore supports only linear kernels.
sklearn.svm.SVC is based on libsvm and supports non-linear kernels, but multi-label classification is done under a one-vs-one reduction: it trains K(K − 1)/2 binary classifiers for a K-way multiclass problem.
More info also here:
http://scikit-learn.org/stable/modules/multiclass.html
Does anyone know of any other SVM implementations that directly support multi-label classification and non-linear kernels?
One possible solution could also be to adapt the code based on sklearn.svm.SVC to perform one-vs-rest; has this been attempted before?
The Binary Relevance problem-transformation method uses a one-vs-rest approach for multi-label classification. You can easily implement SVM with non-linear kernels using the scikit-multilearn library. The following is sample Python code, where each row of train_y is a binary indicator vector marking the labels that apply (for instance, [0,0,1,0,1,0]):
from skmultilearn.problem_transform import BinaryRelevance
from sklearn.svm import SVC

# Base classifier: an SVM with a non-linear (RBF) kernel
svm = SVC(kernel='rbf')

# Binary Relevance fits one binary classifier per label (one-vs-rest)
cls = BinaryRelevance(classifier=svm)
cls.fit(train_x, train_y)
predictions = cls.predict(test_x)
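As a footnote on the last part of the question: this one-vs-rest adaptation also exists in plain scikit-learn, so no custom code is needed. A minimal sketch using sklearn.multiclass.OneVsRestClassifier, with the same (assumed) train_x/train_y as above:

from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

# One RBF-kernel SVM is trained per label (one-vs-rest)
ovr = OneVsRestClassifier(SVC(kernel='rbf'))
ovr.fit(train_x, train_y)  # train_y: binary indicator matrix, one column per label
predictions = ovr.predict(test_x)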
The Naive Bayes algorithm assumes independence among features. What are some text classification algorithms that are not naive, i.e. that do not assume independence among their features?
The answer is very straightforward, since nearly every classifier (besides Naive Bayes) is not naive. Feature independence is a very rare assumption, and it is not made by (among a huge list of others):
logistic regression (in NLP community known as maximum entropy model)
linear discriminant analysis (Fisher's linear discriminant)
kNN
support vector machines
decision trees / random forests
neural nets
...
You are asking about text classification, but there is nothing really special about text, and you can use any existing classifier for such data.
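As an illustration, here is a short scikit-learn sketch of a non-naive text classifier: logistic regression over TF-IDF features, which learns correlated feature weights jointly rather than assuming independence. The corpus and labels are invented for the example:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["cheap pills buy now", "meeting moved to friday",
         "win a free prize today", "lunch at noon?"]
labels = [1, 0, 1, 0]  # 1 = spam, 0 = ham (made-up toy data)

# Logistic regression (maximum entropy) over TF-IDF features
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)
print(model.predict(["free pills, buy now"]))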
I have read many times in the literature that there are several data mining methods (for example: decision trees, k-nearest neighbour, SVM, Bayes classification) and likewise data mining algorithms (the k-nearest neighbour algorithm, the Naive Bayes algorithm).
Does a DM method use different DM algorithms, or are they the same thing?
An example to clarify - is there any difference between the below?
I'm using the Naive Bayes classification method.
I'm using the Naive Bayes classification algorithm.
Or is "Bayes" the method and "Naive Bayes" the algorithm?
I googled a lot to find an answer, and then thought this would be the place where someone could clear up my doubt.
In a classification algorithm we have a model part and a prediction part.
Normally, while testing, we get an accuracy rate.
Likewise, is there any accuracy rate/confidence measure for the model in the Naive Bayes algorithm?
Evaluation is (usually) not part of the classifier.
It's something you do separately, to judge whether you did a good job or not.
If you classify your test data using Naive Bayes, you can perform exactly the same kind of evaluation as with any other classifier!
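For instance, a minimal scikit-learn sketch (the Iris data here is just a stand-in for your own): accuracy comes from a separate evaluation step, and predict_proba gives a per-instance confidence:

from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GaussianNB().fit(X_train, y_train)

# Evaluation is a separate step, identical for any classifier
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))

# Per-instance "confidence": posterior class probabilities
print(model.predict_proba(X_test[:3]))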