Classification and Modeling with Linguistic Information Granules: Advanced Approaches to Linguistic Data Mining
Springer Science & Business Media, Nov 19, 2004 - 308 pages

Many approaches have already been proposed for classification and modeling in the literature. These approaches are usually based on mathematical models. Computer systems can easily handle mathematical models even when they are complicated and nonlinear (e.g., neural networks). On the other hand, it is not always easy for human users to intuitively understand mathematical models even when they are simple and linear. This is because human information processing is based mainly on linguistic knowledge, while computer systems are designed to handle symbolic and numerical information. A large part of our daily communication is based on words. We learn from various media such as books, newspapers, magazines, TV, and the Internet through words. We also communicate with others through words. While words play a central role in human information processing, linguistic models are not often used in the fields of classification and modeling. If there is no goal other than the maximization of accuracy in classification and modeling, mathematical models may always be preferred to linguistic models. On the other hand, linguistic models may be chosen if emphasis is placed on interpretability.
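As an informal illustration of what a "linguistic model" looks like (a minimal sketch, not code from the book), the snippet below evaluates how well a numerical pattern matches a rule such as "If x1 is small and x2 is large then Class 1". The triangular membership functions for small/medium/large on the unit interval, the product as the AND operator, and the names `triangular` and `rule_compatibility` are all illustrative assumptions.

```python
def triangular(x: float, a: float, b: float, c: float) -> float:
    """Membership grade of x in a triangular fuzzy set with support [a, c] and peak b."""
    if x == b:
        return 1.0
    if x <= a or x >= c:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Illustrative linguistic terms for an attribute normalized to [0, 1].
TERMS = {
    "small":  (0.0, 0.0, 0.5),
    "medium": (0.0, 0.5, 1.0),
    "large":  (0.5, 1.0, 1.0),
}

def rule_compatibility(pattern, antecedent):
    """Compatibility grade of a pattern with a rule antecedent,
    here taken as the product of the membership grades of its linguistic terms."""
    grade = 1.0
    for x, term in zip(pattern, antecedent):
        a, b, c = TERMS[term]
        grade *= triangular(x, a, b, c)
    return grade

# "If x1 is small and x2 is large then Class 1"
antecedent, consequent_class = ("small", "large"), 1
# Pattern (0.2, 0.9) matches the rule with grade 0.6 * 0.8 = 0.48.
print(rule_compatibility((0.2, 0.9), antecedent), "-> Class", consequent_class)
```

The point of such a rule is that a human reader can interpret it directly in words, in contrast with the weights of a trained neural network.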
Contents
1 Linguistic Information Granules | 1
1.1 Mathematical Handling of Linguistic Terms | 2
1.2 Linguistic Discretization of Continuous Attributes | 4
2 Pattern Classification with Linguistic Rules | 11
2.2 Linguistic Rule Extraction for Classification Problems | 12
2.2.1 Specification of the Consequent Class | 13
2.2.2 Specification of the Rule Weight | 17
2.3 Classification of New Patterns by Linguistic Rules | 20
2.3.2 Voting-Based Method | 22
2.4 Computer Simulations | 25
2.4.1 Comparison of Four Definitions of Rule Weights | 26
2.4.2 Simulation Results on Iris Data | 29
2.4.3 Simulation Results on Wine Data | 32
2.4.4 Discussions on Simulation Results | 35
3 Learning of Linguistic Rules | 39
3.1.2 Illustration of the Learning Algorithm Using Artificial Test Problems | 41
3.1.3 Computer Simulations on Iris Data | 45
3.1.4 Computer Simulations on Wine Data | 47
3.2.1 Learning Algorithm | 48
3.2.2 Illustration of the Learning Algorithm Using Artificial Test Problems | 50
3.2.3 Computer Simulations on Iris Data | 54
3.2.4 Computer Simulations on Wine Data | 56
3.3 Related Issues | 57
3.3.2 Adjustment of Membership Functions | 62
4 Input Selection and Rule Selection | 69
4.2 Input Selection | 70
4.2.2 Simulation Results | 71
4.3 Genetic Algorithm-Based Rule Selection | 75
4.3.1 Basic Idea | 76
4.3.2 Generation of Candidate Rules | 77
4.3.3 Genetic Algorithms for Rule Selection | 80
4.3.4 Computer Simulations | 87
4.4 Some Extensions to Rule Selection | 89
4.4.1 Heuristics in Genetic Algorithms | 90
4.4.2 Prescreening of Candidate Rules | 93
4.4.3 Computer Simulations | 96
5 Genetics-Based Machine Learning | 103
5.2 Michigan-Style Algorithm | 105
5.2.3 Algorithm | 107
5.2.4 Computer Simulations | 108
5.2.5 Extensions to the Michigan-Style Algorithm | 111
5.3 Pittsburgh-Style Algorithm | 116
5.3.1 Coding of Rule Sets | 117
5.3.3 Algorithm | 119
5.4 Hybridization of the Two Approaches | 121
5.4.2 Hybrid Algorithm | 124
5.4.3 Computer Simulations | 125
5.4.4 Minimization of the Number of Linguistic Rules | 126
6 Multi-Objective Design of Linguistic Models | 131
6.2 Multi-Objective Genetic Algorithms | 134
6.2.2 Elitist Strategy | 135
6.3 Multi-Objective Rule Selection | 136
6.4 Multi-Objective Genetics-Based Machine Learning | 139
7 Comparison of Linguistic Discretization with Interval Discretization | 143
7.1 Effects of Linguistic Discretization | 144
7.1.2 Effect in the Classification Phase | 146
7.1.3 Summary of Effects of Linguistic Discretization | 147
7.2.2 Specification of Partially Fuzzified Linguistic Discretization | 150
7.3 Comparison Using Homogeneous Discretization | 151
7.3.2 Simulation Results on Wine Data | 154
7.4 Comparison Using Inhomogeneous Discretization | 155
7.4.1 Entropy-Based Inhomogeneous Interval Discretization | 156
7.4.2 Simulation Results on Iris Data | 157
7.4.3 Simulation Results on Wine Data | 158
8 Modeling with Linguistic Rules | 161
8.2 Linguistic Rule Extraction for Modeling Problems | 162
8.2.1 Linguistic Association Rules for Modeling Problems | 163
8.2.2 Specification of the Consequent Part | 165
8.2.3 Other Approaches to Linguistic Rule Generation | 166
8.2.4 Estimation of Output Values by Linguistic Rules | 169
8.2.6 Limitations and Extensions | 172
8.2.7 Non-Standard Fuzzy Reasoning Based on the Specificity of Each Linguistic Rule | 174
8.3 Modeling of Nonlinear Fuzzy Functions | 177
9 Design of Compact Linguistic Models | 181
9.1.2 Handling as a Single-Objective Optimization Problem | 182
9.1.3 Handling as a Three-Objective Optimization Problem | 183
9.2 Multi-Objective Rule Selection | 185
9.2.3 Three-Objective Genetic Algorithm for Rule Selection | 187
9.2.4 Simple Numerical Example | 189
9.3 Fuzzy Genetics-Based Machine Learning | 190
9.3.1 Coding of Rule Sets | 192
9.3.3 Simple Numerical Example | 194
9.4 Comparison of Two Schemes | 196
10 Linguistic Rules with Consequent Real Numbers | 199
10.2 Local Learning of Consequent Real Numbers | 201
10.2.2 Incremental Learning Algorithm | 203
10.3 Global Learning | 205
10.3.1 Incremental Learning Algorithm | 206
10.3.2 Comparison Between Two Learning Schemes | 207
10.4 Effect of the Use of Consequent Real Numbers | 208
10.4.2 Simulation Results | 210
10.5 Twin-Table Approach | 211
10.5.1 Basic Idea | 212
10.5.2 Determination of Consequent Linguistic Terms | 213
10.5.3 Numerical Example | 215
11 Handling of Linguistic Rules in Neural Networks | 219
11.1 Problem Formulation | 220
11.1.2 Multi-Layer Feedforward Neural Networks | 221
11.2 Handling of Linguistic Rules Using Membership Values | 222
11.2.2 Network Architecture | 223
11.3 Handling of Linguistic Rules Using Level Sets | 225
11.3.2 Network Architecture | 226
11.4 Handling of Linguistic Rules Using Fuzzy Arithmetic | 228
11.4.3 Network Architecture | 230
11.4.4 Computer Simulation | 233
12 Learning of Neural Networks from Linguistic Rules | 235
12.2 Learning from Linguistic Rules for Classification Problems | 237
12.2.3 Extended Back-Propagation Algorithm | 238
12.2.4 Learning from Linguistic Rules and Numerical Data | 241
12.3 Learning from Linguistic Rules for Modeling Problems | 245
12.3.3 Extended Back-Propagation Algorithm | 246
12.3.4 Learning from Linguistic Rules and Numerical Data | 247
13 Linguistic Rule Extraction from Neural Networks | 251
13.1 Neural Networks and Linguistic Rules | 252
13.2.1 Basic Idea | 253
13.2.3 Computer Simulations | 254
13.3 Linguistic Rule Extraction for Classification Problems | 258
13.3.1 Basic Idea | 259
13.3.3 Computer Simulations | 263
13.3.4 Rule Extraction Algorithm | 265
13.3.5 Decreasing the Measurement Cost | 267
13.4 Difficulties and Extensions | 270
13.4.1 Scalability to High-Dimensional Problems | 271
14 Modeling of Fuzzy Input-Output Relations | 277
14.1.1 Linear Fuzzy Regression Models | 278
14.1.2 Fuzzy Rule-Based Systems | 280
14.1.3 Fuzzified Takagi-Sugeno Models | 281
14.1.4 Fuzzified Neural Networks | 283
14.2 Modeling of Fuzzy Mappings | 285
14.2.2 Fuzzy Rule-Based Systems | 286
14.2.4 Fuzzified Neural Networks | 287
14.3.1 Fuzzy Classification of Non-Fuzzy Patterns | 288
14.3.2 Fuzzy Classification of Interval Patterns | 291
14.3.4 Effect of Fuzzification of Input Patterns | 292