# write your code
# case 1
In some cases, sequential application of PCA and Logistic Regression (LR) yields the very same separating hyperplane as applying LR alone (let us denote this as $PCA + LR \sim LR$). In such cases there is a specific relation $R$ between the projection hyperplane of PCA and the separating hyperplane of LR.
What is the relation $R$? Describe it and explain why $R$ is necessary and sufficient for $PCA + LR \sim LR$.
Write your answer here.
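One way to probe the relation empirically (a minimal sketch, assuming scikit-learn; the dataset and the equal-hyperplane check are illustrative assumptions, not the full proof): if the normal vector of the LR separating hyperplane lies in the span of the retained principal components, projecting with PCA discards nothing that LR uses, so PCA + LR should reproduce LR's hyperplane. The sketch embeds 2-D data in 3-D with a degenerate third coordinate, so the top-2 components span exactly the informative plane:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 300
# Two overlapping Gaussian blobs separated along the first axis.
X2 = rng.normal(size=(n, 2))
X2[:, 0] += np.repeat([-1.5, 1.5], n // 2)
y = np.repeat([0, 1], n // 2)
# Embed in 3-D with an identically zero third coordinate: the top-2
# principal components then span exactly the informative plane, and the
# LR normal vector lies in that span (its third weight is zero).
X = np.c_[X2, np.zeros(n)]

lr = LogisticRegression().fit(X, y)

pca = PCA(n_components=2).fit(X)
lr_pca = LogisticRegression().fit(pca.transform(X), y)

# Map the PCA-space hyperplane back to the original coordinates
# (PCA centers the data, so the intercept must absorb the mean shift).
w_back = lr_pca.coef_ @ pca.components_
b_back = lr_pca.intercept_ - w_back @ pca.mean_

print("max |coef diff|:", np.abs(lr.coef_ - w_back).max())
print("intercept diff :", np.abs(lr.intercept_ - b_back).max())
```

Up to solver tolerance the two hyperplanes coincide; if the third coordinate carried class-relevant variation that PCA dropped, the mapped-back hyperplane would differ from LR's.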
Provide an example in which both PCA followed by LDA and LDA alone perform worse than QDA. Give a theoretical explanation of the proposed case.
# write your code here
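A candidate construction (a sketch, assuming scikit-learn; the specific scales 0.5 and 2.0 are arbitrary choices): two classes with the same mean but different covariances. LDA's discriminant direction is driven by the difference of class means under a pooled covariance, so with equal means it has no signal; PCA is unsupervised and only reorders total variance, so it cannot restore that signal. QDA fits a covariance per class, giving a quadratic (here roughly circular) boundary that separates the tight inner cloud from the wide outer one:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)
n = 500  # points per class
# Same mean (the origin) for both classes, different covariance scales:
# class 0 is a tight Gaussian, class 1 a wide Gaussian around the same center.
X = np.vstack([rng.normal(scale=0.5, size=(n, 2)),
               rng.normal(scale=2.0, size=(n, 2))])
y = np.repeat([0, 1], n)

lda = LinearDiscriminantAnalysis().fit(X, y)
pca_lda = make_pipeline(PCA(n_components=1),
                        LinearDiscriminantAnalysis()).fit(X, y)
qda = QuadraticDiscriminantAnalysis().fit(X, y)

print("LDA       acc:", lda.score(X, y))      # near chance level
print("PCA + LDA acc:", pca_lda.score(X, y))  # near chance level
print("QDA       acc:", qda.score(X, y))      # close to the Bayes rate
```

Any linear boundary through this data misclassifies roughly half of each class, while the Bayes-optimal boundary is the circle where the two class densities are equal, which QDA approximates directly.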
Provide an example in which both PCA followed by LDA and LDA alone have accuracy on the validation set below 100%, whereas QDA has accuracy on the validation set equal to 100%, and:
# write your code here
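One way to meet the exact-100% requirement (a sketch, assuming scikit-learn; the margin band around the axes is an assumption chosen to make the classes QDA-separable): XOR-style data where the class is the sign of $x_1 x_2$ and no point lies within distance 1 of either axis. Both class means are at the origin, so LDA, with or without a PCA projection, is near chance; QDA's fitted boundary is close to the axes $x_1 x_2 = 0$ and stays inside the empty band, so it typically classifies every validation point correctly:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 400
# XOR-style data with a margin: coordinates have magnitude in [1, 3],
# random signs, and the label is the sign of the product x1 * x2.
X = rng.uniform(1.0, 3.0, size=(n, 2)) * rng.choice([-1, 1], size=(n, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)

X_tr, X_va, y_tr, y_va = train_test_split(
    X, y, test_size=0.5, random_state=0, stratify=y)

lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
pca_lda = make_pipeline(PCA(n_components=1),
                        LinearDiscriminantAnalysis()).fit(X_tr, y_tr)
qda = QuadraticDiscriminantAnalysis().fit(X_tr, y_tr)

print("LDA       valid acc:", lda.score(X_va, y_va))
print("PCA + LDA valid acc:", pca_lda.score(X_va, y_va))
print("QDA       valid acc:", qda.score(X_va, y_va))
```

Both class means are zero, so LDA's linear discriminant carries no information; PCA onto one component collapses the two coordinates whose interaction encodes the label. QDA recovers the label because the per-class covariances have opposite off-diagonal signs, and the difference of the two quadratic discriminants is proportional to $x_1 x_2$.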