
About Machine Learning 2

Machine Learning 

👉 Machine Learning 2 Syllabus


UNIT-1 : Introduction & Concept Learning and the General to Specific Ordering

Introduction – Well-Posed Learning Problems, Designing a Learning System, Perspectives and Issues in Machine Learning, Introduction to Supervised, Unsupervised and Reinforcement Learning.

Concept Learning and the General to Specific Ordering – Introduction, A Concept Learning Task, Concept Learning as Search, Find-S: Finding a Maximally Specific Hypothesis, Version Spaces and the Candidate Elimination Algorithm.

👉 Machine Learning 2 - UNIT-1 (A) Notes: Introduction & Concept Learning and the General to Specific Ordering Notes

👉 Machine Learning 2 - UNIT-1 (A) PPTs: Introduction PPTs

👉 Machine Learning 2 - UNIT-1 (B) PPTs: Concept Learning and the General to Specific Ordering PPTs

👉 Machine Learning 2 - UNIT-1 Questions

UNIT-2: Decision Tree Learning & Artificial Neural Networks

Decision Tree Learning – Introduction, Decision Tree Representation, Appropriate Problems for Decision Tree Learning, The Basic Decision Tree Learning Algorithm, Issues in Decision Tree Learning.

Artificial Neural Networks – Introduction, Neural Network Representation, Appropriate Problems for Neural Network Learning, Perceptrons, Multilayer Networks and the Back-Propagation Algorithm.

👉 Machine Learning 2 - UNIT-2 (A) Notes: Decision Tree Learning Notes

👉 Machine Learning 2 - UNIT-2 (A) PPTs: Decision Tree Learning PPTs

👉 Machine Learning 2 - UNIT-2 (B) Notes: Artificial Neural Networks Notes

👉 Machine Learning 2 - UNIT-2 (B) PPTs: Artificial Neural Networks PPTs

👉 Machine Learning 2 - UNIT-2: Decision Tree and Artificial Neural Network Questions

UNIT-3: Bayesian Learning & Instance-Based Learning

Bayesian Learning – Introduction, Bayes Theorem, Bayes Theorem and Concept Learning, Bayes Optimal Classifier, Naive Bayes Classifier, Bayesian Belief Networks, EM Algorithm.

Instance-Based Learning – Introduction, K-Nearest Neighbor Algorithm, Locally Weighted Regression, Remarks on Lazy and Eager Learning.

👉 Machine Learning 2 - UNIT-3 (A) Notes: Bayesian Learning Notes

👉 Machine Learning 2 - UNIT-3 (A) PPTs: Bayesian Learning PPTs

👉 Machine Learning 2 - UNIT-3 (B) Notes: Instance-Based Learning Notes

👉 Machine Learning 2 - UNIT-3 (B) PPTs: Instance-Based Learning PPTs

👉 Machine Learning 2 - UNIT-3 Questions

UNIT-4: Genetic Algorithms & Learning Sets of Rules

Genetic Algorithms – Motivation, Genetic Algorithms, An Illustrative Example, Genetic Programming, Models of Evolution and Learning, Parallelizing Genetic Algorithms.

Learning Sets of Rules – Introduction, Sequential Covering Algorithms, Learning Rule Sets: Summary, Learning First-Order Rules, Learning Sets of First-Order Rules: FOIL.

👉 Machine Learning 2 - UNIT-4 (A) Notes: Genetic Algorithms Notes

👉 Machine Learning 2 - UNIT-4 (A) PPTs: Genetic Algorithms PPTs

👉 Machine Learning 2 - UNIT-4 (B) Notes: Learning Sets of Rules Notes

👉 Machine Learning 2 - UNIT-4 (B) PPTs: Learning Sets of Rules PPTs

👉 Machine Learning 2 - UNIT-4 Questions

UNIT-5: Analytical Learning & Reinforcement Learning

Analytical Learning – Introduction, Learning with Perfect Domain Theories: PROLOG-EBG, Explanation-Based Learning of Search Control Knowledge.

Reinforcement Learning – Introduction, The Learning Task, Q-Learning, Nondeterministic Rewards and Actions, Temporal Difference Learning, Generalizing from Examples, Relationship to Dynamic Programming.

👉 Machine Learning 2 - UNIT-5 (A) Notes: Analytical Learning Notes

👉 Machine Learning 2 - UNIT-5 (A) PPTs: Analytical Learning PPTs

👉 Machine Learning 2 - UNIT-5 (B) Notes: Reinforcement Learning Notes

👉 Machine Learning 2 - UNIT-5 (B) PPTs: Reinforcement Learning PPTs

👉 Machine Learning 2 - UNIT-5 Questions


Text Books:

1. Machine Learning – Tom M. Mitchell – McGraw-Hill (MGH).

2. Machine Learning: An Algorithmic Perspective – Stephen Marsland – Taylor & Francis (CRC).

Reference Books:

1. Machine Learning Methods in the Environmental Sciences: Neural Networks and Kernels – William W. Hsieh – Cambridge University Press.

2. Pattern Classification – Richard O. Duda, Peter E. Hart and David G. Stork – John Wiley & Sons Inc., 2001.





 



CONCEPT LEARNING AS SEARCH

• Concept learning can be viewed as the task of searching through a large space of hypotheses implicitly defined by the hypothesis representation.

• The goal of this search is to find the hypothesis that best fits the training examples.


Example:

• Consider the instances X and hypotheses H in the EnjoySport learning task.

• The attribute Sky has three possible values, and AirTemp, Humidity, Wind, Water and Forecast each have two possible values, so the instance space X contains exactly 3*2*2*2*2*2 = 96 distinct instances.

• There are 5*4*4*4*4*4 = 5120 syntactically distinct hypotheses within H.

• Every hypothesis containing one or more "Ø" symbols represents the empty set of instances; that is, it classifies every instance as negative.

• Therefore, there are only 1 + (4*3*3*3*3*3) = 973 semantically distinct hypotheses.

A CONCEPT LEARNING TASK – Instance Space



Hypothesis Space

✓ Similarly, there are 5 * 4 * 4 * 4 * 4 * 4 = 5120 syntactically distinct hypotheses within H.

✓ Notice, however, that every hypothesis containing one or more "Ø" symbols represents the empty set of instances; that is, it classifies every instance as negative.

✓ Therefore, the number of semantically distinct hypotheses is only 1 + (4 * 3 * 3 * 3 * 3 * 3) = 973.

✓ Our EnjoySport example is a very simple learning task, with a relatively small, finite hypothesis space.
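The counting above can be reproduced with a short Python sketch (illustrative only; the attribute value counts are the ones stated in the text):

# Counting instances and hypotheses for the EnjoySport task.
from math import prod

# Number of possible values per attribute:
# Sky has 3; AirTemp, Humidity, Wind, Water and Forecast have 2 each.
attribute_values = [3, 2, 2, 2, 2, 2]

# Instance space X: one value chosen per attribute.
num_instances = prod(attribute_values)                    # 3*2*2*2*2*2 = 96

# Syntactically distinct hypotheses: each attribute may also be
# "?" (any value) or "Ø" (no value), i.e. (k + 2) choices per attribute.
num_syntactic = prod(k + 2 for k in attribute_values)     # 5*4*4*4*4*4 = 5120

# Semantically distinct hypotheses: every hypothesis containing a "Ø"
# classifies all instances negative, so they collapse into a single
# empty hypothesis; that leaves (k + 1) choices per attribute, plus 1.
num_semantic = 1 + prod(k + 1 for k in attribute_values)  # 1 + 4*3*3*3*3*3 = 973

print(num_instances, num_syntactic, num_semantic)         # 96 5120 973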



General-to-Specific Ordering of Hypotheses


• To illustrate the general-to-specific ordering, consider the two hypotheses

h1 = (Sunny, ?, ?, Strong, ?, ?)

h2 = (Sunny, ?, ?, ?, ?, ?)

• Now consider the sets of instances that are classified positive by h1 and by h2. Because h2 imposes fewer constraints on the instance, it classifies more instances as positive.

• In fact, any instance classified positive by h1 will also be classified positive by h2. Therefore, we say that h2 is more general than h1.
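This relation can be made concrete with a small Python sketch (an illustration, not taken from the textbook), where a hypothesis is a tuple of attribute constraints and "?" accepts any value:

# Test whether hypothesis hg is more general than (or equal to) hypothesis hs:
# every instance that hs classifies positive must also be classified positive by hg.
# (Hypotheses containing the "Ø" constraint are ignored in this simple sketch.)
def more_general_or_equal(hg, hs):
    return all(cg == "?" or cg == cs for cg, cs in zip(hg, hs))

h1 = ("Sunny", "?", "?", "Strong", "?", "?")
h2 = ("Sunny", "?", "?", "?", "?", "?")

print(more_general_or_equal(h2, h1))   # True  -> h2 is more general than h1
print(more_general_or_equal(h1, h2))   # False -> h1 is not more general than h2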

 



More-General-Than Relation between Hypotheses

[Figure: the set X of all instances (left) and the set H of all hypotheses (right), with arrows depicting the more-general-than relation]

• In the figure, the box on the left represents the set X of all instances, and the box on the right represents the set H of all hypotheses.

• Each hypothesis corresponds to some subset of X – the subset of instances that it classifies positive.

• The arrows connecting hypotheses represent the more-general-than relation, with the arrow pointing toward the less general hypothesis.

• Note that the subset of instances characterized by h2 subsumes the subset characterized by h1; hence h2 is more-general-than h1.
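The subsumption claim can also be checked directly with a short Python sketch (illustrative; the attribute values are those of the EnjoySport task described above):

from itertools import product

# All 96 instances of the EnjoySport instance space X.
values = [("Sunny", "Cloudy", "Rainy"), ("Warm", "Cold"), ("Normal", "High"),
          ("Strong", "Weak"), ("Warm", "Cool"), ("Same", "Change")]
X = list(product(*values))

def covers(h, x):
    # True if hypothesis h classifies instance x positive.
    return all(c == "?" or c == v for c, v in zip(h, x))

h1 = ("Sunny", "?", "?", "Strong", "?", "?")
h2 = ("Sunny", "?", "?", "?", "?", "?")

pos_h1 = {x for x in X if covers(h1, x)}
pos_h2 = {x for x in X if covers(h2, x)}

print(pos_h1 <= pos_h2)            # True: h2's positive set subsumes h1's
print(len(pos_h1), len(pos_h2))    # 16 32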


CONCEPT LEARNING

• Learning involves acquiring general concepts from specific training examples.

• Example: People continually learn general concepts or categories such as
✓ "bird,"
✓ "car,"
✓ "situations in which I should study more in order to pass the exam," etc.

• Each such concept can be viewed as describing some subset of objects or events defined over a larger set.

 

• Alternatively, each concept can be thought of as a Boolean-valued function defined over this larger set.

• Example: A function defined over all animals, whose value is true for birds and false for other animals.

 

Definition: Concept learning - Inferring a Boolean-valued function from training examples of its input and output

What is Concept Learning…?

        “A Task of acquiring a potential hypothesis (Solution) that best fits the given training examples”.



