Logistic Regression with Variable Fractional Gradient Descent Method

2020 
Logistic regression is a classic classification method in machine learning. Classical logistic regression solves for the optimal parameters of the loss function with ordinary gradient descent, which easily falls into local extrema. The fractional-order gradient descent method, in turn, cannot converge to the exact extreme point: a deviation always remains. We therefore construct a new iteration based on a variable fractional-order gradient descent method to ensure global convergence of the optimization algorithm. In this paper, this new variable fractional-order gradient descent method is used to solve for the optimal parameters of the loss function, which overcomes the limitation that the gradient function imposes on the step size. Logistic regression combined with the variable fractional-order gradient descent method is then applied to a data dimensionality reduction problem to verify the effectiveness of the optimized algorithm.
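The idea can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: it assumes a common Caputo-style approximation in which the gradient step is scaled by |w_k − w_{k−1}|^{1−α}/Γ(2−α), and a simple linear schedule that drives the fractional order α toward 1 so the update approaches classical gradient descent near convergence. The function names, the schedule, and all hyperparameter values are illustrative assumptions.

```python
import numpy as np
from math import gamma

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fractional_logistic_regression(X, y, alpha0=0.8, lr=0.1, n_iter=500):
    """Logistic regression trained with a simplified variable
    fractional-order gradient step (illustrative sketch only).

    alpha0   initial fractional order in (0, 1); the order is
             driven toward 1 over the iterations (assumed schedule)
    lr       base learning rate
    """
    n, d = X.shape
    w = np.zeros(d)
    w_prev = np.zeros(d)
    for k in range(n_iter):
        # Variable order: move alpha from alpha0 toward 1 so the
        # update tends to the classical gradient step at the end,
        # removing the fixed-order method's residual deviation.
        alpha = alpha0 + (1.0 - alpha0) * k / n_iter
        p = sigmoid(X @ w)
        grad = X.T @ (p - y) / n            # gradient of the log-loss
        # Caputo-style scaling with the previous iterate as the
        # lower terminal; floor the difference to avoid a zero step.
        diff = np.maximum(np.abs(w - w_prev), 1e-3)
        scale = diff ** (1.0 - alpha) / gamma(2.0 - alpha)
        w_new = w - lr * scale * grad
        w_prev, w = w, w_new
    return w

# Usage on synthetic linearly separable data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
w = fractional_logistic_regression(X, y)
acc = np.mean((sigmoid(X @ w) > 0.5) == (y == 1))
```

As α approaches 1, the scaling factor tends to 1 and the iteration reduces to ordinary gradient descent, which is the mechanism the abstract credits for restoring convergence to the exact extreme point.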