Since Microsoft became more open-source friendly, I have started to see more and more emphasis on machine learning, such as ML.NET and Infer.NET.
I want to know what the difference is between the two, since both come from Microsoft. What are the pros and cons of each framework?
From MSDN:
https://blogs.msdn.microsoft.com/dotnet/2018/10/08/announcing-ml-net-0-6-machine-learning-net/
Infer.NET is now open-source and becoming part of the ML.NET family
On October 5th 2018, Microsoft Research announced the open-sourcing of
Infer.NET – a cross-platform framework for model-based machine
learning.
Infer.NET differs from traditional machine learning frameworks in that
it requires users to specify a statistical model of their problem.
This allows for high interpretability, incorporating domain knowledge,
doing unsupervised/semi-supervised learning, as well as online
inference – the ability to learn as new data arrives. The approach and
many of its applications are described in our free online book for
beginners.
Places where Infer.NET is used at Microsoft include TrueSkill – a
skill rating system for matchmaking in Halo and Gears of War, Matchbox
– a recommender system in Azure Machine Learning, and Alexandria –
automatic knowledge base construction for Satori, to name a few.
We’re working with the Infer.NET team to make it part of the ML.NET
family. Steps already taken in this direction include releasing under
the .NET Foundation and changing the package name and namespaces to
Microsoft.ML.Probabilistic.
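To give a concrete feel for what "specifying a statistical model" and online inference mean, here is a tiny, purely illustrative Python sketch of Bayesian online updating for a coin's bias. This is not Infer.NET code (Infer.NET models are written in C# via Microsoft.ML.Probabilistic); the sketch only mirrors the idea of defining a model up front and refining beliefs as each observation arrives.

    # Illustrative only: this is NOT Infer.NET, just the underlying idea of
    # model-based, online Bayesian inference in plain Python.

    # Prior belief about the coin's bias: Beta(1, 1), i.e. uniform on [0, 1]
    alpha, beta = 1.0, 1.0

    stream = [1, 0, 1, 1, 0, 1, 1]  # observations arriving one at a time

    for flip in stream:
        # Conjugate Beta-Bernoulli update: each observation immediately
        # refines the posterior, which is what "online inference" refers to.
        alpha += flip
        beta += 1 - flip
        mean = alpha / (alpha + beta)
        print(f"observed {flip}, posterior mean of bias = {mean:.3f}")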
ML.NET is aimed at .NET developers in general, while Infer.NET comes from Microsoft Research and is now open source as well, but both of them are for machine learning concepts and algorithms.
I need advice on which libraries and game engines I should use for an ML project.
My goal is to create a machine learning model for pruning trees. I believe I have to create a game with a generic tree model with some randomness, then create a reinforcement learning model and train the ML model inside the game. The ML model must be able to first find the branch that must be cut and then find a path to move a robotic arm near that branch to cut it. I have experience in C++ and Java, but I prefer C++. Could you give me advice on which library I should use for ML, and which language and game engine I should use for creating the game? I have a little experience in OpenGL. If it doesn't make any difference, my preferred language is C++, but I know that I should use the right tool for the right job, and Python is the leader in ML, so if it will save time and energy I have nothing against learning Python.
My recommendation is to learn and use Python for your ML project. Though there is some work in R, for your future in ML, your best bet is to learn and use Python. The community is great, and there are many frameworks that can work out-of-the-box.
After a quick search, I did find a framework called Robot Framework, which is highly starred on GitHub: https://github.com/robotframework/robotframework. I will say, however, that I am not personally familiar with using this framework, but it may be helpful to you.
In terms of tree-based algorithms, you might want to start exploring with XGBoost. It can be found here: https://github.com/dmlc/xgboost.
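If you do end up training on tabular features at some point, a minimal XGBoost sketch using its scikit-learn-style API looks like this (the synthetic dataset and the hyperparameters are just placeholders, not recommendations):

    # Minimal XGBoost sketch via its scikit-learn-style API.
    # Dataset and hyperparameters are placeholders for illustration only.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from xgboost import XGBClassifier

    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = XGBClassifier(n_estimators=100, max_depth=3, learning_rate=0.1)
    model.fit(X_train, y_train)
    print("test accuracy:", model.score(X_test, y_test))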
The Watson Machine Learning service provides three options for training deep learning models. The docs list the following:
There are several ways to train models. Use one of the following
methods to train your model:
Experiment Builder
Command line interface (CLI)
Python client
I believe these approaches will differ with their (1) maturity and (2) the features they support.
What are the differences between these approaches? To ensure this question meets the quality requirements, can you please provide an objective list of the differences? Providing your answer as a community wiki answer will also allow the answer to be updated over time as the list changes.
If you feel this question is not a good fit for Stack Overflow, please leave a comment explaining why and I will do my best to improve it.
The reason to use each of these approaches depends on a user's skill set and how they fit the training/monitoring/deployment steps into their workflow:
Command Line Interface (CLI)
The CLI is useful for quick, ad hoc access to details about your training runs. It's also useful if you're building a data science workflow out of shell scripts.
Python Library
WML's Python library allows users to integrate their model training and deployment into a programmatic workflow. It can be used both within notebooks and from IDEs. The library has become the most widely used way of running batch training experiments.
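As a rough illustration of that programmatic workflow, here is a hedged sketch. The package, class, and method names below are assumptions written from memory and may not match the current client; treat them as placeholders and check the WML Python client documentation for the real API.

    # Hypothetical sketch of a WML-style programmatic training workflow.
    # All names below (package, class, methods) are illustrative placeholders;
    # consult the official Watson Machine Learning Python client docs.
    from watson_machine_learning_client import WatsonMachineLearningAPIClient  # assumed import

    wml_credentials = {
        "url": "https://us-south.ml.cloud.ibm.com",  # example endpoint
        "apikey": "YOUR_API_KEY",
        "instance_id": "YOUR_INSTANCE_ID",
    }

    client = WatsonMachineLearningAPIClient(wml_credentials)

    # Typical flow: start a training run from a stored definition, then poll it.
    # The exact calls below are assumptions, not verified signatures.
    run_details = client.training.run("YOUR_TRAINING_DEFINITION_UID")
    run_uid = client.training.get_run_uid(run_details)
    print(client.training.get_status(run_uid))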
Experiment Builder UI
This is the "easy button" for executing batch training experiments within Watson Studio. It's a quick way to learn the basics of the batch training capabilities in Watson Studio. At present, it's not expected that data scientists would use Experiment Builder as their primary way of starting batch training experiments. Perhaps as Model Builder matures, this could change but the Python library is more flexible for integrating into production workflows.
I am new to the field of machine learning. I am planning to use Python as the programming language for implementing algorithms and Java for the system architecture.
As far as I understand, machine learning is more about modeling data specific to the domain, visualizing the data, and choosing appropriate models and parameters. Implementing the models/algorithms is the last and relatively easy step.
MATLAB seems to have everything for machine learning, but it is too expensive and requires learning a new language.
What tools, other than a programming language, do I generally need for machine learning in enterprise projects? Things like data modeling, visualization, etc.
After a couple of years of trial and error, I would suggest you go directly with Python, possibly with scikit-learn or TensorFlow (if you want to go hardcore :).
I also tried R in the past, and while it is a very valid language it has some limitations: it is single-threaded by default, and although there are solutions for that, they are not as clean as in Python.
Also, Python seems to be THE language for machine learning: it is easy to learn and fast (depending on the interpreter implementation, of course), and there is huge support for it, with lots of tutorials and documentation; more importantly, the libraries are actively developed and supported.
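To illustrate how little code a first model takes in Python, here is a minimal scikit-learn sketch (the iris dataset and the model choice are only for demonstration):

    # Minimal scikit-learn example: train and evaluate a simple classifier.
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    clf = LogisticRegression(max_iter=1000)
    clf.fit(X_train, y_train)
    print("test accuracy:", clf.score(X_test, y_test))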
Finally, I recommend you consider Spyder as a good IDE for data science. I also tried Rodeo, but it does not seem as mature and stable as Spyder.
Hope this helps.
Are there any machine learning packages that implement spiking neural networks? Or are there any other stand-alone implementations that could get me started working with them?
A Python library named Brian ought to be useful for you.
There's also what I believe is a programming language named NEURON, but Brian is fairly easy to learn, at least for the basics. It took me a while, though, to figure out how to do a couple of small things, since it's a really high-level language or whatnot.
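To give an idea of what basic Brian (Brian 2) code looks like, here is a small sketch of a leaky integrate-and-fire neuron group, in the style of the official tutorials; the parameters are arbitrary:

    # Small Brian 2 sketch: a group of leaky integrate-and-fire neurons.
    from brian2 import NeuronGroup, SpikeMonitor, run, ms

    tau = 10 * ms
    eqs = 'dv/dt = (1.2 - v) / tau : 1'  # membrane potential drifts toward 1.2

    group = NeuronGroup(10, eqs, threshold='v > 1', reset='v = 0', method='exact')
    monitor = SpikeMonitor(group)

    run(100 * ms)
    print("spikes per neuron:", monitor.count[:])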
There are several other SNN platforms these days that allow you to run classification. I have worked with NeuCube (https://kedri.aut.ac.nz/R-and-D-Systems/neucube), which is a Matlab- and Java-based SNN platform.
Also, check out the Akida Development Environment (ADE) from BrainChip Inc (https://brainchipinc.com/). One of the best features of ADE is that its APIs are based on the TensorFlow/Keras structure, and it also supports a CNN2SNN converter to use your deep learning models in the SNN domain. SNN models developed using this platform can be deployed on their neuromorphic processor, Akida.
I believe there are other platforms such as PyNN and Nengo (with compatibility for running models on Loihi) within the SNN domain.
Here are links for the Brian simulator:
https://github.com/brian-team/brian2
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2605403/
http://briansimulator.org/
You can install the Nengo Loihi library to deploy not only spiking neural networks but also neuromorphic neural networks.
Here's the link to their website: https://www.nengo.ai/nengo-loihi/v1.0.0/index.html
On Kaggle you can find an implementation for the CIFAR-10 dataset, loaded locally, using the Nengo Loihi library. Here's the link:
https://www.kaggle.com/migueltoms/neuromorphic-ciphar-10-loihi-comparison-of-results
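For reference, a basic Nengo model looks like the sketch below. As far as I know, NengoLoihi exposes its own Simulator so that roughly the same model can target Loihi hardware or its emulator, but check the NengoLoihi documentation for the exact details.

    # Minimal Nengo sketch: represent a sine wave with a spiking ensemble.
    import numpy as np
    import nengo

    with nengo.Network() as model:
        stim = nengo.Node(lambda t: np.sin(2 * np.pi * t))  # input signal
        ens = nengo.Ensemble(n_neurons=100, dimensions=1)   # spiking population
        nengo.Connection(stim, ens)
        probe = nengo.Probe(ens, synapse=0.01)              # filtered decoded output

    # Swapping this for nengo_loihi.Simulator(model) is, as I understand it,
    # how the same network targets Loihi or its emulator (an assumption;
    # see the NengoLoihi docs).
    with nengo.Simulator(model) as sim:
        sim.run(1.0)

    print(sim.data[probe][:5])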
Applying machine learning techniques, more specifically text mining techniques, in a browser environment (mainly JavaScript) or as a web application is not a very widely discussed topic.
I want to build my own web application / browser extension that can perform a certain level of text classification / visualization. I would like to know if there are any open-source projects that apply text mining techniques in a web application, or even better, as browser extensions?
So far, these are the projects/discussions I gathered with days of random searching:
For text mining in web applications:
http://text-processing.com/ with demo (closed source, with a limited API)
uClassify (closed source, no info about the library base)
For machine learning in JavaScript:
Discussion on the possibility of machine learning in
JavaScript (mainly saying that Node.js is going to change the landscape)
brain - JavaScript supervised machine learning
A demo project with Naive Bayes implemented in JavaScript
For web application text mining, the architectures I can think of:
Python libraries (e.g. NLTK or scikit-learn) + Django (a minimal sketch of this option appears after these lists)
Java libraries (a lot) + Play! framework
Even R based + rApache
Some popular machine learning libraries:
Python - PyBrain
Apache - Mahout
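To make the first architecture above concrete, the server-side piece can be as small as the following scikit-learn sketch; the tiny training set is made up for illustration, and a Django (or Flask) view would then expose classify() over HTTP to the browser:

    # Minimal server-side text classifier (Naive Bayes over bag-of-words).
    # The training sentences are invented placeholders; a real app would
    # train on its own labelled corpus and expose classify() via a web view.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    texts = ["great product, love it", "terrible, waste of money",
             "really happy with this", "awful experience, do not buy"]
    labels = ["pos", "neg", "pos", "neg"]

    model = make_pipeline(CountVectorizer(), MultinomialNB())
    model.fit(texts, labels)

    def classify(text):
        """Return the predicted label for a piece of text."""
        return model.predict([text])[0]

    print(classify("love the experience"))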
I'll give you my favourites:
Brain.js
ConvNetJS
It has been 7 years since this question was asked, but there is a chance that machine learning will get native browser support: https://webmachinelearning.github.io/
(just make sure you upvote the GitHub issues about adding training capabilities, otherwise you might end up with only third-party model support :-) )