The ability to adapt to changing circumstances (i.e., to learn) appears to be fundamental to the nature of intelligence. However, the construction of computer programs is a difficult and demanding exercise that usually requires that much time be spent on what is euphemistically called debugging—tracking down and eliminating errors in a program. The intricacies and complexity of debugging can escalate dramatically if a program is self-modifying (i.e., if it learns), because learning changes some aspects of the original program. This means that when a programmer needs to delve into a learning program to alter or correct its behavior, he or she first has to understand the detailed results of the learning in addition to the complexities of the basic program as originally written. Consequently, most AI systems have little or no learning behavior built into them. Explorations of mechanisms to support adaptive behavior tend to be pursued as an end in themselves, and these explorations constitute the AI subarea of machine learning.
Complex kinds of data can be analyzed, including data associated with symbolic values (e.g., a numeric attribute whose value is not merely a number but a probabilistic value, that is, a number together with a probability or an associated confidence interval) or with a set of possible modalities of a numeric or categorical attribute. The analysis may also involve data presented as graphs or trees, or as curves (e.g., the curve of the time evolution of a measurement); such data are called continuous, as opposed to the discrete data of the traditional attribute-value representation.
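To make these distinctions concrete, the following Python sketch contrasts a classic attribute-value record with a probabilistic value (a number carried together with a confidence interval) and a continuous time series. The class names ProbabilisticValue and TimeSeries are illustrative assumptions, not part of any particular library.

```python
from dataclasses import dataclass
from typing import Dict, List, Union

# A traditional attribute-value record: each attribute maps to a single
# symbolic or numeric value (discrete data).
Record = Dict[str, Union[str, float]]

@dataclass
class ProbabilisticValue:
    """A numeric value together with an uncertainty estimate,
    expressed here as a confidence interval (illustrative class)."""
    value: float
    lower: float            # lower bound of the confidence interval
    upper: float            # upper bound of the confidence interval
    confidence: float = 0.95

@dataclass
class TimeSeries:
    """Continuous data: the evolution of a measurement over time,
    as opposed to a single discrete attribute value (illustrative class)."""
    timestamps: List[float]
    measurements: List[float]

# Example instances of the three kinds of data described above.
tabular_example: Record = {"color": "red", "weight": 12.4}
probabilistic_example = ProbabilisticValue(value=37.2, lower=36.8, upper=37.6)
series_example = TimeSeries(timestamps=[0.0, 1.0, 2.0],
                            measurements=[0.1, 0.4, 0.9])
```

The point of the sketch is only that a "value" in the training data need not be a single number or symbol; it can itself be a structured object that the analysis must handle on its own terms.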
Analysis of the training data
The design of computational systems capable of reasoning in an intelligent, human-like way necessarily involves a formalization of a number of concepts that are central to human cognition, such as truth, uncertainty, similarity, and utility. The latter three concepts are naturally considered gradual in the sense of being a matter of degree: one can be more or less uncertain about an event, two objects can be more or less similar, and the utility of a product can vary in a gradual way. The gradual nature of these concepts is also reflected in the corresponding formal theories, such as probability and utility theory, which are numerical by nature. It is hence somewhat surprising that, with the same naturalness, the notion of truth is commonly considered a bivalent concept: logically, a proposition is either true or false, with nothing in between. Clearly, this conception, which pervades modern science and thinking, has a long-standing tradition in Western philosophy, with roots in the school of Aristotle. It also manifests itself in the mathematical set theory due to Cantor. Admittedly, formal systems based on bivalent logic and set theory (including theories of uncertainty based on such systems, such as probability theory) have proved extremely useful in science, where they paved the way for the remarkable success of the exact and engineering sciences in the last century.