NPTEL Introduction to Machine Learning Assignment 10 Answers 2023


Q.1. Consider the following one dimensional data set: 12, 22, 2, 3, 33, 27, 5, 16, 6, 31, 20, 37, 8 and 18. Given k=3 and initial cluster centers to be 5, 6 and 31, what are the final cluster centres obtained on applying the k-means algorithm?

  • 5, 18, 30
  • 5, 18, 32
  • 6, 19, 32
  • 4.8, 17.6, 32
  • None of the above

Q.2. For the previous question, in how many iterations will the k-means algorithm converge?

  • 2
  • 3
  • 4
  • 6
  • 7
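For Q.1 and Q.2 you can verify the result by running k-means by hand or in code. Below is a minimal NumPy sketch (not an official solution) that starts from the given centres 5, 6 and 31; note that the iteration count asked for in Q.2 depends on whether you also count the final pass in which nothing changes.

```python
import numpy as np

# 1-D data and initial centres from Q.1
data = np.array([12, 22, 2, 3, 33, 27, 5, 16, 6, 31, 20, 37, 8, 18], dtype=float)
centers = np.array([5.0, 6.0, 31.0])

for iteration in range(1, 100):
    # assign each point to its nearest centre (absolute distance, since data is 1-D)
    labels = np.argmin(np.abs(data[:, None] - centers[None, :]), axis=1)
    # recompute each centre as the mean of its assigned points
    # (assumes no cluster goes empty, which holds for this data)
    new_centers = np.array([data[labels == k].mean() for k in range(len(centers))])
    if np.allclose(new_centers, centers):
        break  # centres stopped moving
    centers = new_centers

print("final centres:", centers)   # works out to 4.8, 17.6 and 32 for this data
print("passes run:", iteration)    # the last pass leaves the centres unchanged
```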

Q.3. In the lecture on the BIRCH algorithm, it is stated that using the number of points N, the sum of points SUM and the sum of squared points SS, we can determine the centroid and radius of the combination of any two clusters A and B. How do you determine the centroid of the combined cluster (in terms of N, SUM and SS of both clusters)?
  • Answer: C
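For reference, the clustering features (N, SUM, SS) used by BIRCH are additive, so the centroid of the merged cluster is (SUM_A + SUM_B) / (N_A + N_B). A small NumPy sketch (the function names are just illustrative):

```python
import numpy as np

def merge_centroid(N_a, SUM_a, N_b, SUM_b):
    """Centroid of the union of clusters A and B from their CF entries.

    The clustering features are additive: the merged cluster has
    N = N_a + N_b points and linear sum SUM = SUM_a + SUM_b, so the
    centroid is the merged SUM divided by the merged N.
    """
    return (np.asarray(SUM_a) + np.asarray(SUM_b)) / (N_a + N_b)

def merge_radius(N_a, SUM_a, SS_a, N_b, SUM_b, SS_b):
    """Radius of the merged cluster: sqrt(SS/N - ||SUM/N||^2)."""
    N = N_a + N_b
    SUM = np.asarray(SUM_a) + np.asarray(SUM_b)
    SS = SS_a + SS_b
    c = SUM / N
    return np.sqrt(SS / N - np.dot(c, c))

# e.g. two 1-D clusters A = {2, 3, 5} and B = {6, 8}:
print(merge_centroid(3, 10.0, 2, 14.0))   # (10 + 14) / (3 + 2) = 4.8
```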

Q.4. What assumption does the CURE clustering algorithm make with regard to the shape of the clusters?
  • No assumption
  • Spherical
  • Elliptical

Q.5. What would be the effect of increasing MinPts in DBSCAN while retaining the same Eps parameter? (Note that more than one statement may be correct)

  • Increase in the sizes of individual clusters
  • Decrease in the sizes of individual clusters
  • Increase in the number of clusters
  • Decrease in the number of clusters
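Intuitively, raising MinPts at a fixed Eps means fewer points qualify as core points, so clusters tend to shrink and more points end up as noise or get split off. A quick way to see the effect is to sweep min_samples in scikit-learn's DBSCAN on any 2-D toy data (the dataset and parameter values below are just for illustration):

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.datasets import make_moons

# toy 2-D data just to illustrate the trend; any dataset will do
X, _ = make_moons(n_samples=300, noise=0.08, random_state=0)

for min_pts in (3, 10, 25):
    labels = DBSCAN(eps=0.15, min_samples=min_pts).fit_predict(X)
    n_clusters = len(set(labels)) - (1 if -1 in labels else 0)  # -1 marks noise
    n_noise = int(np.sum(labels == -1))
    print(f"MinPts={min_pts}: {n_clusters} clusters, {n_noise} noise points")
```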

For the next question, kindly download the dataset – DS1. The first two columns in the dataset correspond to the coordinates of each data point. The third column corresponds to the actual cluster label.

Q.6. Visualize the dataset DS1. Which of the following algorithms will be able to recover the true clusters? (First check by visual inspection and then write code to see if the result matches what you expected; a sketch is given after the answer.)


  • K-means clustering
  • Single link hierarchical clustering
  • Complete link hierarchical clustering
  • Average link hierarchical clustering

  • Answer: B, D
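If you want to check the options programmatically, here is a rough sketch. It assumes DS1 was saved locally as a comma-separated file named "DS1.csv" with columns x, y, true_label; adjust the path and delimiter to however you downloaded the file.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.metrics import adjusted_rand_score

# adjust the file name / delimiter to however you saved DS1
data = np.loadtxt("DS1.csv", delimiter=",")
X, y_true = data[:, :2], data[:, 2]
k = len(np.unique(y_true))

# visual inspection first
plt.scatter(X[:, 0], X[:, 1], c=y_true, s=10)
plt.title("DS1 coloured by true cluster label")
plt.show()

# then compare each algorithm's labels against the true labels
models = {
    "k-means": KMeans(n_clusters=k, n_init=10, random_state=0),
    "single link": AgglomerativeClustering(n_clusters=k, linkage="single"),
    "complete link": AgglomerativeClustering(n_clusters=k, linkage="complete"),
    "average link": AgglomerativeClustering(n_clusters=k, linkage="average"),
}
for name, model in models.items():
    labels = model.fit_predict(X)
    print(f"{name}: adjusted Rand index = {adjusted_rand_score(y_true, labels):.3f}")
```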

Disclaimer: These answers are provided only for discussion purposes; if any answer turns out to be wrong, please do not blame us. If you have any doubts or suggestions regarding any question, kindly leave a comment. The solutions are provided by Infospothub. This tutorial is only for discussion and learning purposes.

About NPTEL Introduction to Machine Learning Course:

With the increased availability of data from varied sources, there has been increasing attention paid to various data-driven disciplines such as analytics and machine learning. In this course we intend to introduce some of the basic concepts of machine learning from a mathematically well-motivated perspective. We will cover the different learning paradigms and some of the more popular algorithms and architectures used in each of these paradigms.

Course Outcome:
  • Week 0: Probability Theory, Linear Algebra, Convex Optimization – (Recap)
  • Week 1: Introduction: Statistical Decision Theory – Regression, Classification, Bias Variance
  • Week 2: Linear Regression, Multivariate Regression, Subset Selection, Shrinkage Methods, Principal Component Regression, Partial Least squares
  • Week 3: Linear Classification, Logistic Regression, Linear Discriminant Analysis
  • Week 4: Perceptron, Support Vector Machines
  • Week 5: Neural Networks – Introduction, Early Models, Perceptron Learning, Backpropagation, Initialization, Training & Validation, Parameter Estimation – MLE, MAP, Bayesian Estimation
  • Week 6: Decision Trees, Regression Trees, Stopping Criterion & Pruning, Loss Functions, Categorical Attributes, Multiway Splits, Missing Values, Decision Trees – Instability, Evaluation Measures
  • Week 7: Bootstrapping & Cross Validation, Class Evaluation Measures, ROC curve, MDL, Ensemble Methods – Bagging, Committee Machines and Stacking, Boosting
  • Week 8: Gradient Boosting, Random Forests, Multi-class Classification, Naive Bayes, Bayesian Networks
  • Week 9: Undirected Graphical Models, HMM, Variable Elimination, Belief Propagation
  • Week 10: Partitional Clustering, Hierarchical Clustering, Birch Algorithm, CURE Algorithm, Density-based Clustering
  • Week 11: Gaussian Mixture Models, Expectation Maximization
  • Week 12: Learning Theory, Introduction to Reinforcement Learning, Optional videos (RL framework, TD learning, Solution Methods, Applications)
CRITERIA TO GET A CERTIFICATE:

Average assignment score = 25% of average of best 8 assignments out of the total 12 assignments given in the course.
Exam score = 75% of the proctored certification exam score out of 100

Final score = Average assignment score + Exam score

YOU WILL BE ELIGIBLE FOR A CERTIFICATE ONLY IF AVERAGE ASSIGNMENT SCORE >=10/25 AND EXAM SCORE >= 30/75. If one of the 2 criteria is not met, you will not get the certificate even if the Final score >= 40/100.
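For example, with hypothetical scores: if the average of your best 8 assignments is 80/100, the assignment component is 0.25 × 80 = 20/25; if the proctored exam score is 60/100, the exam component is 0.75 × 60 = 45/75; the final score is then 20 + 45 = 65/100, and both eligibility conditions (20 >= 10 and 45 >= 30) are met.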