Are you looking for IBM Machine Learning – Dimensionality Reduction Exam Answers by Cognitive Class? If so, this article will help you find all the questions asked in the Cognitive Class IBM Machine Learning – Dimensionality Reduction quiz, covering both the module review questions and the final exam.

In this course, you will learn the theory behind dimension reduction, and get some hands-on practice using Principal Components Analysis (PCA) and Exploratory Factor Analysis (EFA) on survey data.
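As a quick preview of the kind of hands-on work the course covers, here is a minimal PCA sketch in Python using scikit-learn (an assumption on my part — the course's own labs may use a different toolkit, and the toy data below is purely illustrative):

```python
import numpy as np
from sklearn.decomposition import PCA

# Toy "survey" data: 100 respondents answering 6 correlated items,
# generated from 2 underlying latent traits (hypothetical data).
rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 2))           # two underlying traits
loadings = rng.normal(size=(2, 6))           # how each item reflects the traits
X = latent @ loadings + 0.1 * rng.normal(size=(100, 6))

# Reduce the 6 items to 2 principal components that capture
# as much of the variance as possible.
pca = PCA(n_components=2)
scores = pca.fit_transform(X)

print(scores.shape)                          # (100, 2)
print(pca.explained_variance_ratio_.sum())   # high, since 2 traits drive the data
```

Because only two latent traits generate the six items here, the first two components explain nearly all of the variance — the same intuition the course builds with real survey data.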

| Course Detail | Information |
| --- | --- |
| Requirement | Basic knowledge of operating systems (UNIX/Linux) |
| Course Start | Any time, self-paced |
| Course Maker | Konstantin Tskhay |
| Min Pass Mark | 70% |
| All Review Questions | 50% |
| Final Exam | 50% |
| True/False | 1 Attempt |
| Other Questions | 2 Attempts |
| IBM Machine Learning – Dimensionality Reduction | Click Here |

### IBM Cognitive Class – Machine Learning – Dimensionality Reduction Answers

### Module 1: Data Series

**1. Which of the following techniques can be used to reduce the dimensions of the population?**

**2. Cluster Analysis partitions the columns of the data, whereas principal component and exploratory factor analyses partition the rows of the data. True or false?**

**3. Which of the following options are true? Select all that apply.**

### Module 2: Data Refinement

**1. Which of the following options is true?**

**2. PCA is a method to reduce your data to the fewest ‘principal components’ while maximizing the variance explained. True or false?**

**3. Which of the following techniques was NOT covered in this lesson?**

### Module 3: Exploring Data

**1. EFA is commonly used in which of the following applications? Select all that apply.**

**2. Which of the following options is an example of an Oblique Rotation?**

**3. An Orthogonal Rotation assumes that factors are correlated with each other. True or false?**
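Rotations come up repeatedly in this module: an orthogonal rotation (e.g. varimax) assumes the factors are uncorrelated, while an oblique rotation (e.g. promax) allows them to correlate. Here is a minimal sketch of factor analysis with a varimax rotation using scikit-learn (an assumption on my part — the course's labs may use other tooling, such as R's psych package):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Toy data: 200 respondents, 6 items driven by 2 latent factors
# (hypothetical loading pattern for illustration only).
rng = np.random.default_rng(1)
factors = rng.normal(size=(200, 2))
loadings = np.array([[0.9, 0.8, 0.7, 0.0, 0.0, 0.0],
                     [0.0, 0.0, 0.0, 0.9, 0.8, 0.7]])
X = factors @ loadings + 0.3 * rng.normal(size=(200, 6))

# Fit a 2-factor model with an orthogonal (varimax) rotation.
fa = FactorAnalysis(n_components=2, rotation="varimax")
fa.fit(X)

# After rotation, each item should load strongly on only one factor,
# which makes the factor structure easier to interpret.
print(np.round(fa.components_, 2))
```

The rotated loading matrix recovers the simple structure we built into the data: the first three items load on one factor and the last three on the other.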

### Machine Learning – Dimensionality Reduction Final Exam Answers

**1. Why might you use cluster analysis as an analytic strategy?**

**2. Suppose you have 100,000 individuals in a dataset, and each individual varies along 60 dimensions. On average, the dimensions are correlated at r = .45. You want to group the variables together, so you decide to run principal component analysis. How many meaningful, higher-order components can you extract?**

**3. What technique should you use to identify the dimensions that hang together?**

**4. What are loadings?**

**5. When would you use PCA over EFA?**

**6. What is uniqueness?**

**7. Suppose you are looking to extract the major dimensions of a parrot’s personality. Which technique would you use?**

**8. Suppose you have 60 variables in a dataset, and you know that 2 components explain the data very well. How many components can you extract?**

**9. When would you use an orthogonal rotation?**

**10. When would you use confirmatory factor analysis?**

**11. Which of the following is NOT a rule when deciding on the number of factors?**

**12. What is one assumption of factor analysis?**

**13. What is an eigenvector?**

**14. What is a promax rotation?**

**15. What is the cut-off point for the Common Variance Explained rule?**

**16. Why would you try to reduce dimensions?**

**17. If you have 20 variables in a dataset, how many dimensions are there?**

**18. What term describes the amount of variance of each variable explained by the factor structure?**

**19. What package contains the necessary functions to perform PCA and EFA?**

**20. What is the best method for identifying the number of factors to extract?**

### Wrap Up

I hope this article helped you find all the "Cognitive Class Answers: IBM Machine Learning – Dimensionality Reduction Quiz Answers". If it helped you learn something new for free, share it on social media so others can find it too, and check out the other free courses we have shared here.