CSCI 3346, Fall 2018
Prof. Alvarez

Exam 2 Topics
(also see PS 5-7, their solutions, and the relevant sections of the textbook)

The second exam will emphasize topics covered since the first exam (see below), but will also rely on foundational topics from the first part of the course that are not included in the list below (e.g., classification, model complexity, overfitting and underfitting, MDL).

Bayesian predictive techniques
  Terminology
    Class prior distribution, P(c), where c ranges over the classes
    Class-conditional attribute distribution, P(x | c)
    Posterior class distribution, P(c | x)
  Bayes' rule
    Posterior class distribution in terms of the others
  Naive Bayes
    Conditional independence of attributes
    Application to document categorization
  Bayes networks
    Conditional independence relationships from the network graph
      (conditional independence from non-descendants given parents)
    Specifying a Bayes network: probability tables
    Forward and backward inference using the chain rule and Bayes' rule

Logistic regression and neural networks
  Learning as optimization
    Cross-entropy loss function for logistic regression
    Mean-squared error loss function for neural networks
  Gradient descent training
    Stepping downhill in the loss landscape
    Weight update equations
  Perceptrons
    Computation of output from inputs, weights, and biases
    Learning weights (and biases) from training data
    Limited representational capacity
  Multi-layer neural networks
    Terminology: input, hidden, output layers
    Role of hidden layers as feature extractors
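The Bayes' rule topic above (posterior class distribution in terms of the prior and the class-conditional distribution) can be sketched in a few lines. The class names and probability values below are invented for illustration only; they are not from the course materials.

```python
# Bayes' rule sketch: P(c | x) = P(x | c) P(c) / P(x), with P(x) obtained
# by summing the unnormalized posteriors over all classes.
# Toy numbers (hypothetical): two classes, one observed attribute value x.
priors = {"spam": 0.4, "ham": 0.6}        # class prior distribution P(c)
likelihood = {"spam": 0.08, "ham": 0.01}  # class-conditional P(x | c)

# Unnormalized posteriors P(x | c) P(c), then normalize by the evidence P(x).
unnorm = {c: likelihood[c] * priors[c] for c in priors}
evidence = sum(unnorm.values())
posterior = {c: unnorm[c] / evidence for c in unnorm}

print(posterior)  # spam ≈ 0.842, ham ≈ 0.158
```

Note that the evidence P(x) serves only as a normalizing constant, which is why Naive Bayes classification can pick the class maximizing the unnormalized product P(x | c) P(c).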
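The "gradient descent training" and "cross-entropy loss" topics can be illustrated together with a minimal one-dimensional logistic regression. The data set and learning rate below are invented for illustration; the gradient used is the standard derivative of the cross-entropy loss for the sigmoid model.

```python
import math

# Gradient descent on cross-entropy loss for 1-D logistic regression.
# Toy data (hypothetical): inputs above ~2 belong to class 1.
data = [(0.5, 0), (1.0, 0), (1.5, 0), (2.5, 1), (3.0, 1), (3.5, 1)]
w, b, lr = 0.0, 0.0, 0.5

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

for _ in range(2000):                 # step downhill in the loss landscape
    gw = gb = 0.0
    for x, t in data:
        p = sigmoid(w * x + b)        # predicted P(class 1 | x)
        gw += (p - t) * x             # d(cross-entropy)/dw, summed over examples
        gb += (p - t)                 # d(cross-entropy)/db
    w -= lr * gw / len(data)          # weight update equations
    b -= lr * gb / len(data)

# After training, the model separates the two groups.
print(sigmoid(w * 1.0 + b) < 0.5, sigmoid(w * 3.0 + b) > 0.5)  # True True
```

The simple form of the gradient, (p - t) per example, is a consequence of pairing the sigmoid output with the cross-entropy loss rather than mean-squared error.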
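The perceptron topics (computing the output from inputs, weights, and a bias, and learning the weights from training data) can be sketched on the linearly separable AND function. The learning rate and epoch count are illustrative choices, not from the course.

```python
# Perceptron sketch: output = step(w . x + b); on a misclassified example,
# update w += lr * (t - y) * x and b += lr * (t - y).
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # logical AND
w, b, lr = [0.0, 0.0], 0.0, 0.1

for _ in range(20):                   # a few epochs suffice on separable data
    for x, t in data:
        y = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
        err = t - y                   # zero when the example is classified correctly
        w[0] += lr * err * x[0]
        w[1] += lr * err * x[1]
        b += lr * err

predictions = [1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0 for x, _ in data]
print(predictions)  # [0, 0, 0, 1]
```

The same loop fails to converge on XOR, which is one way to see the limited representational capacity of a single perceptron and the motivation for hidden layers.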