Lecture 29: Linear Regression from Scratch - Part 2: Forward Propagation

In the context of linear regression, "forward propagation" usually refers to the process of making predictions using the learned coefficients (slope and intercept) of the model. Forward propagation involves applying the model equation to new data points to obtain the predicted dependent variable values.

Let's continue the previous Python implementation and include the forward propagation step to make predictions using the learned coefficients.

def mean(values):
    return sum(values) / float(len(values))

def simple_linear_regression(x, y):
    # Calculate the mean of x and y
    x_mean, y_mean = mean(x), mean(y)

    # Calculate the slope (m) and y-intercept (b)
    numerator = sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, y))
    denominator = sum((xi - x_mean)**2 for xi in x)
    m = numerator / denominator
    b = y_mean - m * x_mean

    return m, b

def predict(x, m, b):
    return [m * xi + b for xi in x]

def r_squared(y_true, y_pred):
    mean_y = mean(y_true)
    ss_total = sum((yi - mean_y)**2 for yi in y_true)
    ss_res = sum((yi - ypi)**2 for yi, ypi in zip(y_true, y_pred))
    r2 = 1.0 - (ss_res / ss_total)
    return r2

# Example usage:
x_train = [1, 2, 3, 4, 5]
y_train = [2, 3, 4, 2, 3]

# Find coefficients
m, b = simple_linear_regression(x_train, y_train)

# Make predictions
x_test = [6, 7, 8]
predictions = predict(x_test, m, b)

print("Slope (m):", m)
print("Y-Intercept (b):", b)
print("Predictions:", predictions)

In this example, after obtaining the slope (m) and y-intercept (b) from the simple_linear_regression function, we can use the predict function to make predictions for new data points (x_test). The forward propagation step involves applying the model equation y = mx + b to each value in x_test to get the corresponding predicted y values.
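For readers who prefer a vectorized version, here is a minimal sketch of the same forward pass using NumPy; the forward name and the NumPy dependency are illustrative additions, not part of the lecture code:

import numpy as np

def forward(x, m, b):
    # Vectorized forward pass: y_hat = m * x + b applied to every element of x
    return m * np.asarray(x, dtype=float) + b

# m and b are the coefficients returned by simple_linear_regression above
print("Predictions:", forward([6, 7, 8], m, b))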

The output will show the predicted y values for the x_test data points based on the coefficients learned from the training data (x_train and y_train). With this training data the fitted line is approximately y = 0.1x + 2.5, so the predictions for x_test = [6, 7, 8] come out to roughly 3.1, 3.2, and 3.3.

Keep in mind that this is a basic from-scratch implementation of linear regression. In practice, libraries like scikit-learn provide additional functionality, better handling of edge cases and larger datasets, and performance optimizations.
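As a point of comparison, here is a rough sketch of the same workflow using scikit-learn's LinearRegression (this assumes scikit-learn and NumPy are installed; the variable names simply mirror the example above):

import numpy as np
from sklearn.linear_model import LinearRegression

# scikit-learn expects a 2-D feature array: one row per sample, one column per feature
X_train = np.array([1, 2, 3, 4, 5]).reshape(-1, 1)
y_train = np.array([2, 3, 4, 2, 3])
X_test = np.array([6, 7, 8]).reshape(-1, 1)

model = LinearRegression()
model.fit(X_train, y_train)  # learns the slope and intercept
print("Slope (m):", model.coef_[0])
print("Y-Intercept (b):", model.intercept_)
print("Predictions:", model.predict(X_test))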
