I need to use the perceptron algorithm to study the learning rate and the asymptotic error on some datasets that are not linearly separable.
In order to do this, I need to understand a few parameters of the constructor. I have spent hours googling them, but I still can't quite understand what they do or how to use them.
The ones that give me the most trouble are alpha and eta0.
I understand that every update of the algorithm is:

w(t+1) = w(t) + r * (d - y(t)) * x

where (d - y(t)) just gives the desired + or -, in order to increase or decrease the components of the vector, and r is the learning rate that smooths the update.
From the scikit-learn documentation (https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.Perceptron.html):
'alpha' is a constant that multiplies the regularization term if regularization is used.
'eta0' is a constant by which the updates are multiplied.
What is the regularization term that alpha multiplies in the perceptron? Where does it appear in the update formula above?
Is eta0 the r in the formula above?
Both of these parameters seem to slow down the learning, presumably to make it more robust; I would like to understand how to use them to best effect.
Thank you in advance; I will appreciate any answer, even a partial one.
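To make the question concrete, here is roughly how I am setting these parameters (a sketch based on my reading of the docs; the non-separable toy data via flip_y is just an assumption of mine, as is my understanding that alpha is ignored unless penalty is set):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import Perceptron

# Toy data made non-separable by flipping 20% of the labels
X, y = make_classification(n_samples=500, n_features=2, n_informative=2,
                           n_redundant=0, flip_y=0.2, random_state=0)

# No regularization: alpha would have no effect here, eta0 scales each update
clf_plain = Perceptron(penalty=None, eta0=1.0, random_state=0).fit(X, y)

# With penalty='l2', alpha multiplies the regularization term
clf_reg = Perceptron(penalty='l2', alpha=1e-3, eta0=0.5,
                     random_state=0).fit(X, y)

print(clf_plain.score(X, y), clf_reg.score(X, y))
```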