
Introduction to Machine Learning - University Of Maryland


Introduction to Machine Learning, CMSC 422. Instructor: Marine Carpuat, marine@cs.umd.edu.


Transcription of Introduction to Machine Learning - University Of Maryland

1
Introduction to Machine Learning (CMSC 422)

What is this course about?
- Machine Learning studies algorithms for learning to do stuff, by finding (and exploiting) patterns in data

What can we do with Machine Learning?
- Analyze genomics data
- Recognize objects in images
- Analyze text and speech
- Teach robots how to cook from YouTube videos
- A question answering system beat Jeopardy champion Ken Jennings at quiz bowl!
- Sometimes machines even perform better than humans!

Machine Learning paradigm: programming by example
- Replace "human writing code" with "human supplying data"
- Most central issue: generalization. How to abstract from "training" examples to "test" examples?

A growing and fast-moving field
- Broad applicability: finance, robotics, vision, machine translation, medicine, etc.

2
- Close connection between theory and practice
- An open field, with lots of room for new work!

Course goals
By the end of the semester, you should be able to:
- Look at a problem
- Identify whether ML is an appropriate solution
- If so, identify what types of algorithms might be applicable
- Apply those algorithms

This course is not:
- A survey of ML algorithms
- A tutorial on ML toolkits such as Weka, TensorFlow, ...

Topics
Foundations of supervised learning:
- Decision trees and inductive bias
- Geometry and nearest neighbors
- Perceptron
- Practical concerns: feature design, evaluation, debugging
- Beyond binary classification
Advanced supervised learning:
- Linear models and gradient descent
- Support vector machines
- Naive Bayes models and probabilistic modeling
- Neural networks
- Kernels
- Ensemble learning
Unsupervised learning:
- K-means
- PCA
- Expectation maximization

What you can expect from the instructors
We are here to help you learn by:
- Introducing concepts from multiple perspectives: theory and practice, readings and class time
- Providing opportunities to practice, and feedback to help you stay on track: homeworks, programming assignments

Teaching assistants: Ryan Dorson, Joe Yue-Hei Ng

What I expect from you
- Work hard (this is a 3-credit class!)

3
- Do a lot of math (calculus, linear algebra, probability)
- Do a fair amount of programming
- Come to class prepared: do the required readings!

Highlights from course logistics

Grading:
- Participation (5%)
- Homeworks (15%): ~10, almost weekly
- Programming projects (30%): 3 of them, in teams of two or three students
- Midterm exam (20%), in class
- Final exam (30%), cumulative, in class

HW01 is due Wed 2:59pm. No late homeworks.

Where to find things:
- Read the syllabus here:
- Find the readings: A Course in Machine Learning
- View and submit assignments: Canvas
- Check your grades: Canvas
- Ask and answer questions, participate in discussions and surveys, contact the instructors, and everything else: Piazza
Please use Piazza instead of email.

Today's topics
What does it mean to "learn by example"?

4
- Classification tasks
- Inductive bias
- Formalizing learning

Classification tasks
- How would you write a program to distinguish a picture of me from a picture of someone else?
  Provide example pictures of me and pictures of other people, and let a classifier learn to distinguish the two.
- How would you write a program to distinguish whether a sentence is grammatical or not?
  Provide examples of grammatical and ungrammatical sentences, and let a classifier learn to distinguish the two.
- How would you write a program to distinguish cancerous cells from normal cells?
  Provide examples of cancerous and normal cells, and let a classifier learn to distinguish the two.
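The "provide examples and let a classifier learn" recipe above can be sketched in a few lines. This is an illustrative example of mine, not code from the course; the function names and the 2-D data points are invented. A 1-nearest-neighbor classifier makes the point cleanly, since "training" is nothing more than storing labeled examples:

```python
# Hypothetical illustration (not course code): learning by example with a
# 1-nearest-neighbor classifier. Training just stores labeled examples;
# prediction copies the label of the closest stored example.

def distance_sq(p, q):
    """Squared Euclidean distance between two 2-D points."""
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

def predict_1nn(examples, x):
    """Label x like its nearest training example."""
    _, label = min(examples, key=lambda ex: distance_sq(ex[0], x))
    return label

# Invented training set: class A clusters near (0, 0), class B near (5, 5).
train = [((0, 0), "A"), ((1, 0), "A"), ((0, 1), "A"),
         ((5, 5), "B"), ((4, 5), "B"), ((5, 4), "B")]

print(predict_1nn(train, (0.5, 0.5)))  # A: closest to the (0, 0) cluster
print(predict_1nn(train, (4.5, 4.0)))  # B: closest to the (5, 5) cluster
```

Nearest neighbors reappear later in the topic list; the point here is only that no hand-written classification rule appears anywhere, just labeled data.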

5
- Provide examples of cancerous and normal cells, and let a classifier learn to distinguish the two.

Let's try it
- Your task: learn a classifier to distinguish class A from class B from examples
- Examples of class A and examples of class B (example figures not preserved in this transcription)
- Now: predict the class of new examples using what you've learned
- What if I told ...

Ingredients needed for learning
- Training vs. test examples: memorizing the training examples is not enough! We need to generalize to make good predictions on test examples.
- Inductive bias: many classifier hypotheses are plausible; we need assumptions about the nature of the relation between examples and classes.

Machine Learning as function approximation

Problem setting:
- Set of possible instances $X$
- Unknown target function $f : X \to Y$
- Set of function hypotheses $H = \{ h \mid h : X \to Y \}$

Input: training examples $\{(x^{(1)}, y^{(1)}), \ldots, (x^{(N)}, y^{(N)})\}$

6
The training examples are examples of the unknown target function $f$.
Output: a hypothesis $h \in H$ that best approximates the target function $f$.

Formalizing induction: loss function
- A loss function $\ell(y, \hat{y})$, where $y$ is the truth and $\hat{y}$ is the system's prediction
- For example, zero/one loss: $\ell(y, f(x)) = 0$ if $y = f(x)$, and $1$ otherwise
- The loss captures our notion of what is important to learn

Formalizing induction: data generating distribution
- Where does the data come from? The data generating distribution $D$: a probability distribution over $(x, y)$ pairs
- We don't know what $D$ is! We only get a random sample from it: our training data

Formalizing induction: expected loss
- $f$ should make good predictions, as measured by the loss $\ell$, on future examples that are also drawn from $D$
- Formally, the expected loss $\varepsilon$ of $f$ over $D$ with respect to $\ell$ should be small:
  $\varepsilon = \mathbb{E}_{(x,y) \sim D}[\ell(y, f(x))] = \sum_{(x,y)} D(x, y)\, \ell(y, f(x))$

Formalizing induction: training error
- We can't compute the expected loss because we don't know what $D$ is
- We only have a sample of training examples $\{(x^{(1)}, y^{(1)}), \ldots, (x^{(N)}, y^{(N)})\}$

7
All we can compute is the training error:
$\hat{\varepsilon} = \frac{1}{N} \sum_{n=1}^{N} \ell(y^{(n)}, f(x^{(n)}))$

Formalizing induction
- Given: a loss function $\ell$ and a sample from some unknown data distribution $D$
- Our task: compute a function $f$ that has low expected error over $D$ with respect to $\ell$:
  $\mathbb{E}_{(x,y) \sim D}[\ell(y, f(x))] = \sum_{(x,y)} D(x, y)\, \ell(y, f(x))$

Recap: introducing machine learning
- What does it mean to "learn by example"?
- Classification tasks
- Learning requires examples + inductive bias
- Generalization vs. memorization
- Formalizing the learning problem: function approximation, learning as minimizing expected loss

Your tasks before next class
- Check out the course webpage, Canvas, Piazza
- Do the readings
- Get started on HW01, due Wednesday 2:59pm
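The loss, expected loss, and training error defined on the last pages can be made concrete in a short numeric sketch. Everything below (the hypothesis `f`, the distribution `D`, the sample, the threshold hypotheses) is an invented toy example of mine, not material from the course; the distribution is written out explicitly only so the expected-loss sum can be evaluated exactly:

```python
# Hypothetical sketch (all names and numbers are mine): zero/one loss,
# expected loss under an artificially known distribution D, and training
# error over a finite sample.

def zero_one_loss(y, y_hat):
    """l(y, y_hat) = 0 if the prediction is correct, 1 otherwise."""
    return 0 if y == y_hat else 1

def f(x):
    """A candidate hypothesis: predict "A" below 1.5, "B" otherwise."""
    return "A" if x < 1.5 else "B"

# An invented data-generating distribution D over (x, y) pairs (sums to 1).
# In real learning D is unknown; it is spelled out here for illustration.
D = {(0, "A"): 0.4, (1, "A"): 0.1, (1, "B"): 0.2, (2, "B"): 0.3}

# Expected loss: sum over (x, y) of D(x, y) * l(y, f(x)).
expected_loss = sum(p * zero_one_loss(y, f(x)) for (x, y), p in D.items())
# f errs only on the pair (1, "B"), so expected_loss = 0.2

# Training error: average loss over a sample, the only quantity we can
# actually compute, since D itself is never observed.
sample = [(0, "A"), (1, "A"), (2, "B"), (0, "A"), (2, "B")]
training_error = sum(zero_one_loss(y, f(x)) for x, y in sample) / len(sample)
# f happens to classify this sample perfectly (training_error = 0.0), even
# though its expected loss under D is 0.2: training error is an estimate.

# Formalizing induction: among a small hypothesis set H, pick the
# hypothesis with the lowest training error.
def make_threshold(t):
    """Threshold classifier: "A" below t, "B" at or above t."""
    return lambda x: "A" if x < t else "B"

H = {t: make_threshold(t) for t in (0.5, 1.5, 2.5)}
errors = {t: sum(zero_one_loss(y, h(x)) for x, y in sample) / len(sample)
          for t, h in H.items()}
best_t = min(errors, key=errors.get)  # threshold 1.5 fits the sample best
```

Note how the training error (0.0) underestimates the expected loss (0.2) here: the gap between fit on the sample and performance under $D$ is exactly the generalization issue the slides emphasize.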

