Machine learning loss jumps all over during training - machine-learning

What should I do when the loss starts to jump all over during training?
[Plot of the training loss omitted.]
I have tried different optimizers such as SGD and Adam, as well as different learning rates and learning-rate decay, but nothing really helped. Thank you.
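For illustration, here is a minimal sketch of two common remedies for an erratic loss curve: a modest learning rate with step decay, plus gradient clipping. It assumes a PyTorch setup with synthetic regression data; the actual model, data, and framework from the question are not shown.

```python
import torch
from torch import nn, optim

# Synthetic regression data as a stand-in for the (unknown) original dataset.
torch.manual_seed(0)
X = torch.randn(512, 20)
y = X @ torch.randn(20, 1) + 0.1 * torch.randn(512, 1)

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 1))
criterion = nn.MSELoss()

# A modest learning rate plus step decay; a step size that stays too large
# is a common cause of a loss curve that jumps around late in training.
optimizer = optim.Adam(model.parameters(), lr=1e-3)
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

for epoch in range(50):
    optimizer.zero_grad()
    loss = criterion(model(X), y)
    loss.backward()
    # Gradient clipping is another common way to tame erratic updates.
    nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()
    scheduler.step()
```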

Related

Neural Net learning only after first couple of epochs

I have a general question for the Machine Learning experts out there:
How can it be that a neural network starts learning only after a couple of epochs? For the first few epochs nothing changes, and then all of a sudden rapid progress occurs. What exactly can be behind this?
In my case I am dealing with a ResNet-type architecture and image processing.
[Figure: Loss Function]
Greetings

Deep Learning equivalents of classical Machine Learning models

The question seems simple: can we find a neural network equivalent for every classical Machine Learning model?
For example:
Linear regression is a single-layer perceptron with a linear activation (a minimal sketch of this case is given below).
PCA corresponds to a linear auto-encoder with a single hidden layer.
Ridge or Lasso can be approximated by adding weight decay (L2 or L1 regularization) when constructing the network.
If the answer to the first question is yes, how can I find equivalents for decision trees and SVMs?
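To make the linear-regression point concrete, here is a minimal sketch (with made-up synthetic data and my own variable names) showing that a single linear neuron trained by gradient descent on squared error recovers the same solution as closed-form linear regression:

```python
import numpy as np

# Synthetic data: y = Xw + b + noise (purely illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w, true_b = np.array([1.5, -2.0, 0.5]), 0.7
y = X @ true_w + true_b + 0.05 * rng.normal(size=200)

# "Neural network" view: one linear neuron trained by gradient descent
# on mean squared error -- exactly the linear-regression objective.
w, b = np.zeros(3), 0.0
lr = 0.1
for _ in range(2000):
    err = X @ w + b - y
    w -= lr * (X.T @ err) / len(y)
    b -= lr * err.mean()

# Closed-form least squares for comparison.
Xb = np.hstack([X, np.ones((len(X), 1))])
w_closed = np.linalg.lstsq(Xb, y, rcond=None)[0]

print(w, b)        # gradient-descent neuron
print(w_closed)    # closed-form linear regression (weights, then bias)
```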

metric learning and contrastive learning difference

I have researched some materials and know that the goal of both contrastive learning and metric learning is to learn an embedding space in which similar sample pairs stay close to each other while dissimilar ones are far apart. But what is the difference between metric learning and contrastive learning? I cannot figure it out.
Can someone give some advice? Thanks.
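As a concrete point of reference, the classic pairwise contrastive loss is one example of an objective used in both settings: it pulls similar pairs together and pushes dissimilar pairs at least a margin apart. A minimal NumPy sketch with toy embeddings (all names and numbers here are illustrative):

```python
import numpy as np

def contrastive_loss(z1, z2, same, margin=1.0):
    """Pairwise contrastive loss: pull embeddings of similar pairs together,
    push dissimilar pairs apart until they are at least `margin` away."""
    d = np.linalg.norm(z1 - z2, axis=1)                 # Euclidean distance per pair
    pos = same * d**2                                   # similar pairs: shrink distance
    neg = (1 - same) * np.maximum(0.0, margin - d)**2   # dissimilar pairs: enforce margin
    return np.mean(pos + neg)

# Two toy pairs of 2-D embeddings: the first pair is similar (label 1),
# the second pair is dissimilar (label 0).
z1 = np.array([[0.1, 0.2], [0.9, 0.8]])
z2 = np.array([[0.1, 0.3], [0.1, 0.1]])
same = np.array([1, 0])

print(contrastive_loss(z1, z2, same))
```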

Using Machine Learning for Price Prediction

What machine learning method should I use to predict prices of things like stocks, gold, etc.?
I prefer using Python, but I can't find a starting point; it seems so complicated to me and I have no clue how to start.
Regarding the machine learning method: regression is used for price prediction, since the target is a continuous variable. There is a wide range of regression techniques in machine learning, from simple linear regression, SVR, random forests, and CatBoost up to RNNs. Depending on the target problem, the available datasets, and the computing resources, one of these algorithms can be used.
Yes, Python is a great language to get started with machine learning. And linear regression is definitely a good way to start on this regression task if you are new. Gradually, you can explore other techniques in scikit-learn before jumping directly into RNNs. Scikit-learn is an excellent machine learning library for beginners and professionals alike.
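As a concrete starting point, here is a minimal scikit-learn sketch of the linear-regression route; the features and "prices" below are synthetic placeholders rather than real stock or gold data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: 3 hypothetical features (e.g. lagged indicators)
# and a continuous "price" target.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 3))
prices = 100 + X @ np.array([5.0, -3.0, 2.0]) + rng.normal(scale=1.0, size=500)

X_train, X_test, y_train, y_test = train_test_split(
    X, prices, test_size=0.2, random_state=0
)

model = LinearRegression().fit(X_train, y_train)
print("Test MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```

The same train/fit/predict pattern carries over to the other regressors mentioned above (SVR, random forests, and so on), which is part of what makes scikit-learn a convenient place to start.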

convnet accuracy suddenly drops

I was trying to train an emotion recognition model on the fer2013 dataset using the architecture proposed in this paper.
The paper uses a different dataset than mine, so I made some modifications to the stride and filter sizes.
After a couple of hours of training, the accuracy on both the training and test sets suddenly drops.
After that, the accuracy just stays around 0.1-0.2 for both sets and never improves again.
Does anybody know about this phenomenon?
In any neural network training, if both accuracies (training and validation) improve at first and then start decreasing, it is a sign that your network is failing to converge. More precisely, your optimizer has started overshooting.
The most likely reason for this is a high learning rate. Reduce your learning rate and then check your example again. Also, in your linked paper (at least at first glance), I couldn't see the learning rate mentioned. Since your data is different from the paper's, the same learning rate might not work as well.
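For illustration, here is a minimal sketch of that suggestion (a smaller starting learning rate plus a reduce-on-plateau schedule), assuming a PyTorch setup; the small CNN and the random 48x48 grayscale batch below are placeholders, not the paper's architecture:

```python
import torch
from torch import nn, optim

# Placeholder CNN for 48x48 grayscale inputs with 7 classes (as in fer2013).
model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(32 * 12 * 12, 7),
)
criterion = nn.CrossEntropyLoss()

# Smaller starting learning rate, then halve it whenever the loss plateaus.
optimizer = optim.Adam(model.parameters(), lr=1e-4)
scheduler = optim.lr_scheduler.ReduceLROnPlateau(optimizer, factor=0.5, patience=2)

# Random stand-in batch instead of real fer2013 images.
x = torch.randn(64, 1, 48, 48)
y = torch.randint(0, 7, (64,))

for epoch in range(10):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
    scheduler.step(loss.item())   # lower the LR when the loss stops improving
```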
