What are Weights in Machine Learning

What are Weights?

Lesson Details:
June 29, 2020

I: Introduction

A: The Learning Phase

B: The Execution Phase

C: The Update Phase

D: The Inference Phase

E: The Tasks Needed to Perform Machine Learning

II: Body

A: What are weights?

1. Weight initialization and its role in neural networks

2. Parameter initialization and the effect on model accuracy

3. Common weight initialization methods and their advantages and disadvantages (e.g., random, normal, lognormal)

4. Discussion on how to choose a weight initialization method.
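To make the initialization methods above concrete, here is a minimal NumPy sketch of the three schemes named in this section (random uniform, normal, and lognormal). The layer shape and the specific ranges and standard deviations are illustrative assumptions, not values prescribed by the lesson.

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded for reproducibility

# Hypothetical dense-layer shape: 784 inputs, 128 outputs.
fan_in, fan_out = 784, 128

# Random uniform initialization in a small symmetric range around zero.
w_uniform = rng.uniform(-0.05, 0.05, size=(fan_in, fan_out))

# Normal (Gaussian) initialization with zero mean and small std. dev.
w_normal = rng.normal(0.0, 0.01, size=(fan_in, fan_out))

# Lognormal initialization: all weights positive and right-skewed,
# unlike the symmetric uniform and normal schemes.
w_lognormal = rng.lognormal(mean=-5.0, sigma=0.5, size=(fan_in, fan_out))

print(w_uniform.shape, w_lognormal.min() > 0)
```

Note the trade-off the section discusses: symmetric zero-centered schemes (uniform, normal) keep early activations balanced, while a lognormal scheme constrains weights to be positive, which changes the signals a layer can initially express.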

B: The Execution Phase

1. Why is the Execution Phase important?

2. Variations in the execution phase, such as batch size and learning rate (the speed of learning), and their effect on learning.

3. Discussion on how to choose a learning rate and optimal values for other parameters such as batch size and momentum.
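The interaction between learning rate and momentum described above can be sketched with a single parameter update rule. This is a toy illustration minimizing f(w) = w², not a prescription for real values; the learning rate and momentum constants are assumptions chosen so the example converges.

```python
def sgd_momentum_step(w, grad, velocity, lr=0.01, momentum=0.9):
    # The velocity term accumulates a decaying sum of past gradients,
    # so consistent gradient directions build up speed.
    velocity = momentum * velocity - lr * grad
    return w + velocity, velocity

# Minimize f(w) = w^2, whose gradient is 2w, starting from w = 5.
w, v = 5.0, 0.0
for _ in range(200):
    w, v = sgd_momentum_step(w, grad=2 * w, velocity=v)

print(abs(w) < 1e-2)  # w has been driven close to the minimum at 0
```

Raising `lr` too far makes the iterates oscillate and diverge, while a tiny `lr` converges slowly; momentum dampens oscillation along steep directions while accelerating progress along shallow ones, which is why the two are usually tuned together.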

C: The Update Phase: How does backpropagation work?

1. What is backpropagation and why is it useful?

2. How do we implement backpropagation? The backpropagation algorithm, learning via gradient descent, and the chain rule.

3. Backpropagation through time.

4. Practical considerations: computational complexity, convergence issues, and stability issues.

5. Optimizing the cost function. Back to our simple example of using a neural network to predict house prices: what is the cost function? What are its components? What is the cost-function optimization problem, and how do we solve it?

6. Stochastic gradient descent, stochastic gradient descent with momentum, and stochastic gradient descent with Nesterov accelerated gradient.

7. Further techniques: batch normalization, learning rate decay, RMS propagation, momentum strategies, and better weight initialization techniques.
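The house-price example mentioned in this section can be worked end to end with a one-neuron linear model: a forward pass, a mean-squared-error cost, gradients obtained via the chain rule, and a gradient descent update. The toy data and the learning rate are assumptions for illustration.

```python
import numpy as np

# Hypothetical toy data: house size (1000s of sq ft) vs price (100k USD),
# generated from price = 1.0 * size + 0.5 with no noise.
x = np.array([1.0, 1.5, 2.0, 2.5, 3.0])
y = np.array([1.5, 2.0, 2.5, 3.0, 3.5])

w, b = 0.0, 0.0  # weight and bias, initialized to zero
lr = 0.1

for _ in range(500):
    y_hat = w * x + b                # forward pass
    err = y_hat - y
    cost = np.mean(err ** 2)         # MSE cost: mean of squared errors
    # Backward pass via the chain rule:
    # dC/dw = dC/dy_hat * dy_hat/dw = 2*err * x, averaged over samples.
    dw = np.mean(2 * err * x)
    db = np.mean(2 * err)
    w -= lr * dw                     # gradient descent update
    b -= lr * db

print(round(w, 2), round(b, 2))      # should approach 1.0 and 0.5
```

The components of the cost are visible here: the model's prediction `y_hat`, the residual `err`, and the squared-error average. The optimization problem is to find the `w` and `b` that minimize that average, which gradient descent solves by repeatedly stepping against the gradient.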

III: Conclusion
