Python
Dplll, 2018-02-11 19:54:57

Gradient descent algorithm: why does it work this way and not otherwise?

def minimize_stochastic(target_fn, gradient_fn, x, y, theta_0, alpha_0=0.01):

    data = list(zip(x, y))
    theta = theta_0                             # initial guess for the parameters
    alpha = alpha_0                             # initial step size
    min_theta, min_value = None, float("inf")   # best parameters / best value seen so far
    iterations_with_no_improvement = 0

    # stop after 100 consecutive iterations with no improvement
    while iterations_with_no_improvement < 100:
        value = sum( target_fn(x_i, y_i, theta) for x_i, y_i in data )

        if value < min_value:
            # found a new minimum: remember it and go back to the original step size
            min_theta, min_value = theta, value
            iterations_with_no_improvement = 0
            alpha = alpha_0
        else:
            # no improvement, so shrink the step size
            iterations_with_no_improvement += 1
            alpha *= 0.9

        # take a gradient step for each of the data points, in random order
        for x_i, y_i in in_random_order(data):
            gradient_i = gradient_fn(x_i, y_i, theta)
            theta = vector_subtract(theta, scalar_multiply(alpha, gradient_i))

    return min_theta
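
For reference, the snippet also relies on a few helpers that are not shown (in_random_order, vector_subtract, scalar_multiply). The sketches below are only an assumption about how they might be defined, using plain lists of numbers:

import random

def vector_subtract(v, w):
    # subtract corresponding elements of two equally long vectors
    return [v_i - w_i for v_i, w_i in zip(v, w)]

def scalar_multiply(c, v):
    # multiply every element of the vector v by the scalar c
    return [c * v_i for v_i in v]

def in_random_order(data):
    # yield the elements of data in a randomly shuffled order;
    # note that this indexes into data, so it needs a real sequence, not a one-shot iterator
    indexes = list(range(len(data)))
    random.shuffle(indexes)
    for i in indexes:
        yield data[i]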

If, in the line data = list(zip(x, y)), I don't write list, the function returns the initial theta regardless of the input. Why?


1 answer
Sergey Gornostaev, 2018-02-11
@adelshin23

Because zip returns an iterator, and an iterator can be consumed only once. Accordingly, the line

value = sum( target_fn(x_i, y_i, theta) for x_i, y_i in data )

works on the first pass, but by then the iterator is exhausted, so the loop for x_i, y_i in in_random_order(data): gets no items and theta is never updated. If you use list, then the data variable contains not the iterator but the data it produces, and that list can be iterated over as many times as needed.
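
To make the one-shot behaviour concrete, here is a minimal, self-contained sketch (the toy values are illustrative, not from the original question):

x = [1, 2, 3]
y = [4, 5, 6]

pairs = zip(x, y)                     # an iterator: it can be walked through only once

print(sum(a + b for a, b in pairs))   # 21 -- this first pass consumes the iterator
print(list(pairs))                    # []  -- nothing is left for a second pass

pairs = list(zip(x, y))               # a list: it can be iterated over any number of times
print(sum(a + b for a, b in pairs))   # 21
print(list(pairs))                    # [(1, 4), (2, 5), (3, 6)]

The same thing happens inside minimize_stochastic: the sum(...) in the first pass of the while loop exhausts data, the inner for loop then never runs, theta is never updated, and the function ends up returning the initial theta_0.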
