In today’s column, I examine the crucial debate about whether human intelligence is actually a form of computational intelligence. The premise is this: some fervently assert that we have already figured out how to get AI on par with human intelligence, as evidenced by modern-era LLMs, generative AI, and computational transformers. Furthermore, and here’s the kicker, human intelligence is claimed to be the same as computational intelligence… Continue reading…
By : Lance Eliot
Source: Forbes
Critics:
In computer science, computational intelligence (CI) refers to concepts, paradigms, algorithms and implementations of systems that are designed to show “intelligent” behavior in complex and changing environments. These systems are aimed at mastering complex tasks in a wide variety of technical or commercial areas and offer solutions that recognize and interpret patterns, control processes, support decision-making or autonomously manoeuvre vehicles or robots in unknown environments, among other things.
These concepts and paradigms are characterized by the ability to learn or adapt to new situations, to generalize, to abstract, to discover and to associate. Nature-analog or nature-inspired methods, such as neuroevolution, play a key role in computational intelligence. CI approaches primarily address those complex real-world problems for which mathematical or traditional modeling is not appropriate for various reasons:
- the processes cannot be described exactly with complete knowledge,
- the processes are too complex for mathematical reasoning,
- they contain uncertainties, such as unforeseen changes in the environment or in the process itself, or
- the processes are simply stochastic in nature.

Thus, CI techniques are properly aimed at processes that are ill-defined, complex, nonlinear, time-varying and/or stochastic.
Artificial intelligence (AI) is used in the media, but also by some of the scientists involved, as a kind of umbrella term for the various techniques associated with it or with CI. Craenen and Eiben state that attempts to define or at least describe CI can usually be assigned to one or more of the following groups:
- “Relative definition” comparing CI to AI
- Conceptual treatment of key notions and their roles in CI
- Listing of the (established) areas that belong to it
The relationship between CI and AI has been a frequently discussed topic during the development of CI. While the above list implies that they are synonyms, the vast majority of AI/CI researchers working on the subject consider them to be distinct fields, where either:
- CI is an alternative to AI
- AI includes CI
- CI includes AI
The view of the first of the above three points goes back to Zadeh, the founder of the fuzzy set theory, who differentiated machine intelligence into hard and soft computing techniques, which are used in artificial intelligence on the one hand and computational intelligence on the other. In hard computing (HC) and AI, inaccuracy and uncertainty are undesirable characteristics of a system, while soft computing (SC) and thus CI focus on dealing with these characteristics.
The adjacent figure illustrates these relationships and lists the most important CI techniques. Another frequently mentioned distinguishing feature is the representation of information in symbolic form in AI and in sub-symbolic form in CI techniques. Hard computing is a conventional computing method based on the principles of certainty and accuracy and it is deterministic. It requires a precisely stated analytical model of the task to be processed and a prewritten program, i.e. a fixed set of instructions.
The models used are based on Boolean logic (also called crisp logic), where e.g. an element can be either a member of a set or not and there is nothing in between. When applied to real-world tasks, systems based on HC result in specific control actions defined by a mathematical model or algorithm. If an unforeseen situation occurs that is not included in the model or algorithm used, the action will most likely fail. Soft computing, on the other hand, is based on the fact that the human mind is capable of storing information and processing it in a goal-oriented way, even if it is imprecise and lacks certainty.
SC is based on the model of the human brain with probabilistic thinking, fuzzy logic and multi-valued logic. Soft computing can process a wealth of data and perform a large number of computations, which may not be exact, in parallel. For hard problems for which no satisfying exact solutions based on HC are available, SC methods can be applied successfully. SC methods are usually stochastic in nature, i.e. they are randomly defined processes that can be analyzed statistically but not predicted with precision.
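The crisp-versus-fuzzy distinction described above can be sketched in a few lines of Python. The temperature thresholds below are illustrative assumptions, not part of any standard:

```python
# Crisp (Boolean) membership: a temperature is either "hot" or it is not.
def crisp_hot(temp_c):
    return 1.0 if temp_c >= 30.0 else 0.0

# Fuzzy membership: the degree of "hot" ramps up gradually between 25 and 35 C,
# so values in between belong to the set "hot" only to some degree.
def fuzzy_hot(temp_c):
    if temp_c <= 25.0:
        return 0.0
    if temp_c >= 35.0:
        return 1.0
    return (temp_c - 25.0) / 10.0

for t in (20.0, 28.0, 32.0, 40.0):
    print(t, crisp_hot(t), fuzzy_hot(t))
```

At 28 °C the crisp set answers a flat "not hot" (0.0), while the fuzzy set assigns a partial membership of 0.3 — exactly the "something in between" that Boolean logic excludes.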
Up to now, the results of some CI methods, such as deep learning, cannot be verified and it is also not clear what they are based on. This problem represents an important scientific issue for the future. AI and CI are catchy terms, but they are also so similar that they can be confused. The meaning of both terms has developed and changed over a long period of time, with AI being used first. Bezdek describes this impressively and concludes that such buzzwords are frequently used and hyped by the scientific community, science management and (science) journalism.
This is not least because AI and biological intelligence are emotionally charged terms, and it is still difficult to find a generally accepted definition for the basic term intelligence.

Evolutionary computation can be seen as a family of methods and algorithms for global optimization, which are usually based on a population of candidate solutions. They are inspired by biological evolution and are often summarized as evolutionary algorithms. These include genetic algorithms, evolution strategies, genetic programming and many others.
They are considered as problem solvers for tasks not solvable by traditional mathematical methods and are frequently used for optimization including multi-objective optimization. Since they work with a population of candidate solutions that are processed in parallel during an iteration, they can easily be distributed to different computer nodes of a cluster. As often more than one offspring is generated per pairing, the evaluations of these offspring, which are usually the most time-consuming part of the optimization process, can also be performed in parallel.
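The population-based scheme described above can be sketched as a minimal (mu + lambda) evolution strategy. The objective (the sphere function), population sizes, mutation strength and generation count below are illustrative choices, not a specific published configuration:

```python
import random

# Objective to minimize: the sphere function f(x) = sum(x_i^2), optimum at 0.
def sphere(x):
    return sum(v * v for v in x)

def evolve(dim=5, mu=10, lam=20, sigma=0.3, generations=200, seed=0):
    rng = random.Random(seed)
    # Initial population of mu random candidate solutions.
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(mu)]
    for _ in range(generations):
        # Each offspring is a Gaussian mutation of a randomly chosen parent;
        # the lam offspring evaluations are independent and could run in parallel.
        offspring = []
        for _ in range(lam):
            parent = rng.choice(pop)
            offspring.append([v + rng.gauss(0, sigma) for v in parent])
        # (mu + lambda) selection: parents compete with offspring for survival.
        pop = sorted(pop + offspring, key=sphere)[:mu]
    return pop[0]

best = evolve()
print(sphere(best))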
In the course of optimization, the population learns about the structure of the search space and stores this information in the chromosomes of the solution candidates. After a run, this knowledge can be reused for similar tasks by adapting some of the “old” chromosomes and using them to seed a new population. Swarm intelligence is based on the collective behavior of decentralized, self-organizing systems, typically consisting of a population of simple agents that interact locally with each other and with their environment.
Despite the absence of a centralized control structure that dictates how the individual agents should behave, local interactions between such agents often lead to the emergence of global behavior. Among the recognized representatives of algorithms based on swarm intelligence are particle swarm optimization and ant colony optimization. Both are metaheuristic optimization algorithms that can be used to (approximately) solve difficult numerical or complex combinatorial optimization tasks.
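A minimal particle swarm optimization sketch along these lines might look as follows. The inertia weight and acceleration coefficients are common textbook-style values, and the 2-D objective is an illustrative assumption:

```python
import random

# Objective to minimize; optimum at (0, 0).
def objective(x, y):
    return x * x + y * y

def pso(n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5), rng.uniform(-5, 5)] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # each particle's best-seen position
    gbest = min(pbest, key=lambda p: objective(*p))  # swarm-wide best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(2):
                # Velocity update: inertia plus pulls toward the personal
                # and global bests -- purely local interaction rules.
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if objective(*pos[i]) < objective(*pbest[i]):
                pbest[i] = pos[i][:]
                if objective(*pbest[i]) < objective(*gbest):
                    gbest = pbest[i][:]
    return gbest

print(objective(*pso()))
```

No particle is told where the optimum is; the swarm's convergence emerges from each particle blending its own memory with the best solution found so far by any neighbor.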
Since both methods, like the evolutionary algorithms, are based on a population and also on local interaction, they can be easily parallelized and show comparable learning properties. In complex application domains, Bayesian networks provide a means to efficiently store and evaluate uncertain knowledge. A Bayesian network is a probabilistic graphical model that represents a set of random variables and their conditional dependencies by a directed acyclic graph.
The probabilistic representation makes it easy to draw conclusions based on new information. In addition, Bayesian networks are well suited for learning from data. Their wide range of applications includes medical diagnostics, risk management, information retrieval, text analysis (e.g. for spam filters), credit rating of companies, and the operation of complex industrial processes.
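As a toy illustration of drawing conclusions from new information, consider a two-node network for the spam-filter case: a Spam variable with an edge to a word-occurrence variable. All probabilities below are invented purely for illustration:

```python
# Toy Bayesian network: Spam -> word "offer" appears in the message.
p_spam = 0.4             # prior P(Spam); illustrative value
p_word_given_spam = 0.7  # P("offer" | Spam)
p_word_given_ham = 0.1   # P("offer" | not Spam)

# New information arrives: the word "offer" is observed.
# Inference via Bayes' rule: P(Spam | "offer").
p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)
p_spam_given_word = p_word_given_spam * p_spam / p_word
print(round(p_spam_given_word, 3))  # prints 0.824
```

Observing the word raises the spam probability from the 0.4 prior to about 0.82; in a full network the same rule is applied along the directed acyclic graph of conditional dependencies.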
Artificial immune systems are another group of population-based metaheuristic learning algorithms designed to solve clustering and optimization problems. These algorithms are inspired by the principles of theoretical immunology and the processes of the vertebrate immune system, and use the learning and memory properties of the immune system to solve a problem.
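A minimal clonal-selection-style sketch (cloning the fittest artificial "cells" and hypermutating the clones) might look as follows for a 1-D optimization task. The population size, clone counts and mutation scales are illustrative assumptions, not a specific published configuration:

```python
import random

# Affinity to maximize; the optimum "antigen" sits at x = 3.
def affinity(x):
    return -(x - 3.0) ** 2

def clonal_selection(pop_size=10, n_clones=5, generations=100, seed=2):
    rng = random.Random(seed)
    pop = [rng.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=affinity, reverse=True)
        clones = []
        for rank, cell in enumerate(pop):
            # Higher-affinity cells receive more clones, and their clones
            # mutate with smaller steps (hypermutation grows with rank).
            for _ in range(max(1, n_clones - rank)):
                step = 0.1 * (rank + 1)
                clones.append(cell + rng.gauss(0, step))
        # Memory/selection: keep the best pop_size cells overall.
        pop = sorted(pop + clones, key=affinity, reverse=True)[:pop_size]
    return pop[0]

print(clonal_selection())
```

The clone-and-mutate operators here mirror the mutation and selection operators of evolutionary algorithms, which is exactly the family resemblance noted below.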
Operators similar to those known from evolutionary algorithms are used to clone and mutate artificial lymphocytes. Artificial immune systems offer interesting capabilities such as adaptability, self-learning, and robustness that can be used for various tasks in data processing, manufacturing systems, system modeling and control, fault detection, or cybersecurity. According to bibliometric studies, computational intelligence plays a key role in research.
All the major academic publishers accept manuscripts in which a combination of fuzzy logic, neural networks and evolutionary computation is discussed. On the other hand, computational intelligence is rarely available in the university curriculum. The number of technical universities at which students can attend such a course is limited; only British Columbia, the Technical University of Dortmund (involved in the European fuzzy boom) and Georgia Southern University offer courses from this domain.
The reason major universities ignore the topic is that they lack the resources. The existing computer science courses are so dense that by the end of the semester there is no room left for fuzzy logic. Sometimes it is taught as a subtopic in existing introductory courses, but in most cases universities prefer courses about classical AI concepts based on Boolean logic, Turing machines and toy problems such as blocks world.
For a while now, with the rise of STEM education, the situation has changed somewhat. There are some efforts that favor multidisciplinary approaches, allowing students to understand complex adaptive systems. However, these objectives are discussed only on a theoretical basis; actual university curricula have not yet been adapted.