(From Various Sources)

Artificial Intelligence Definition

The subfield of computer science concerned with the concepts and methods of symbolic inference by computer and symbolic knowledge representation for use in making inferences. AI can be seen as an attempt to model aspects of human thought on computers. It is also sometimes defined as trying to solve by computer any problem that a human can currently solve faster.

Examples of AI problems are computer vision (building a system that can understand images as well as a human) and natural language processing (building a system that can understand and speak a human language as well as a human). These may appear to be modular, but all attempts so far (1993) to solve them have foundered on the amount of context information and "intelligence" they seem to require.

What is Artificial Intelligence?

The branch of computer science concerned with making computers behave like humans. The term was coined in 1956 by John McCarthy at Dartmouth College. Artificial intelligence includes:

  1. game playing: programming computers to play games such as chess and checkers
  2. expert systems: programming computers to make decisions in real-life situations (for example, some expert systems help doctors diagnose diseases based on symptoms)
  3. natural language: programming computers to understand natural human languages
  4. neural networks: systems that simulate intelligence by attempting to reproduce the types of physical connections that occur in animal brains
  5. robotics: programming computers to see and hear and react to other sensory stimuli

Genetic Algorithms

A genetic algorithm is a method for searching for the optimum solution to a complex problem, based on the principles of natural selection. It's basically an automated, intelligent approach to trial and error. Given specific formulas, rules, or arrangements to be optimized, a genetic algorithm can find a solution.

If you have an optimization problem with, say, 10 parameters, and each of those parameters could take on, say, 100 values, you have what is known as a very large "search space" for a solution. In fact, the number of possible combinations would be 100 to the 10th power - that's 100,000,000,000,000,000,000 possibilities!
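The arithmetic is easy to verify: 100 to the 10th power is the same as 10 to the 20th, a 1 followed by 20 zeros.

```python
# 10 parameters, each with 100 possible values, gives a search space of
# 100 ** 10 combinations - the same as 10 ** 20.
combinations = 100 ** 10
print(f"{combinations:,}")  # 100,000,000,000,000,000,000
```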

In the past, people would solve problems like this by making intelligent guesses about the values of the parameters, followed by as much trial and error as they could afford, time-wise. This way you could get a solution within your lifetime, just not necessarily a good one.

A genetic algorithm approaches the problem by using the principles of natural selection. First, a number of solutions (a population) are created by setting the parameters randomly throughout the search space. From this population of solutions, the worst are discarded and the best solutions are then "bred" with each other by mixing the parameters (genes) from the most successful organisms, thus creating a new population. Additionally, every so often a gene will be altered slightly to produce a "mutation". As in real life, this type of continuous adaptation creates a very robust organism. The whole process continues through many "generations", with the best genes being handed down to future generations.
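The cycle described above - random population, discard the worst, breed the best, mutate occasionally, repeat for many generations - can be sketched in a few dozen lines. This is a minimal illustration with an invented toy fitness function (every name and parameter here is our own, not from any particular GA library); it uses the 10-parameter, 100-value setup from the search-space example.

```python
import random

N_PARAMS, PARAM_RANGE = 10, 100          # 10 genes, each in 0..99
POP_SIZE, GENERATIONS = 50, 100
MUTATION_RATE = 0.05                     # chance of altering each gene

def fitness(solution):
    # Toy objective (an assumption for illustration): the closer each
    # parameter is to 42, the fitter the organism. Maximum fitness is 0.
    return -sum(abs(gene - 42) for gene in solution)

def random_solution():
    return [random.randrange(PARAM_RANGE) for _ in range(N_PARAMS)]

def crossover(a, b):
    # "Breed" two parents by taking each gene from one parent or the other.
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(solution):
    # Every so often, alter a gene slightly - the "mutation" step.
    return [random.randrange(PARAM_RANGE) if random.random() < MUTATION_RATE
            else gene for gene in solution]

def evolve():
    population = [random_solution() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # Keep the best half, discard the worst, refill by breeding survivors.
        population.sort(key=fitness, reverse=True)
        survivors = population[:POP_SIZE // 2]
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in range(POP_SIZE - len(survivors))]
        population = survivors + children
    return max(population, key=fitness)

best = evolve()
```

Because the best half of each generation survives unchanged, the best fitness never decreases; after a hundred generations the population typically clusters near the optimum.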

The result is typically a very good solution to the problem. Genetic algorithms allow us to solve problems that were previously considered too large or complicated. Additionally, genetic algorithms are useful in the very tricky area of nonlinear problems.

Logic Programming

Logic programming is, like functional programming, a declarative way of composing programs. In brief, declarative programming is much more concerned with what should be computed and much less with how it should be done. But it is also much more than that: I think it is very educational to understand logic programming and its virtues even if you are never going to write a logic program.

A logic programming language is a formalism for expressing one's ideas in the form of some chosen logic. The execution of the program then consists of proving (or disproving) that some logical consequence follows from those expressions. There are, however, different logics, different ways of proving theorems in those logics, and different ways to implement a certain mechanized form of reasoning on a traditional computer. The best-known logic programming language, Prolog, is based on (a restricted case of) predicate logic, a theorem-proving algorithm called SLD-resolution, and an (approximate) implementation based on simple depth-first search. Later languages have added parallelism, types, modules, and more declarative and flexible control mechanisms.
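The flavour of that execution model - a goal is proved either because it is a known fact or because some rule reduces it to subgoals, searched depth-first - can be sketched in a few lines. This is a deliberately tiny toy (no variables, no unification, so far short of real SLD-resolution), with an invented family-relations example:

```python
# Known facts, encoded as tuples: ("relation", arg1, arg2).
facts = {
    ("parent", "tom", "bob"),
    ("parent", "bob", "ann"),
}

# Each rule says: the head goal holds if every subgoal in the body holds.
# (Ground rules only - real Prolog rules contain variables.)
rules = [
    (("grandparent", "tom", "ann"),
     [("parent", "tom", "bob"), ("parent", "bob", "ann")]),
]

def prove(goal):
    """Depth-first backward chaining: Prolog's basic execution strategy."""
    if goal in facts:
        return True
    for head, body in rules:
        if head == goal and all(prove(subgoal) for subgoal in body):
            return True
    return False

print(prove(("grandparent", "tom", "ann")))  # True
print(prove(("parent", "ann", "tom")))       # False
```

The program states only *what* is true (facts and rules); the proof search supplies the *how*, which is exactly the declarative split the text describes.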

The inherently declarative nature of logic programming stems from the fact that each program has a natural meaning: the only difficulty is to make sure that the meaning of the program and the meaning of the problem are the same.

Machine Learning

Machine Learning is the area of Artificial Intelligence that focuses on developing principles and techniques for automating the acquisition of knowledge. Some machine learning methods can dramatically reduce the cost of developing knowledge-based software by extracting knowledge directly from existing databases. Other machine learning methods enable software systems to improve their performance over time with minimal human intervention. These approaches are expected to enable the development of effective software for autonomous systems that must operate in poorly understood environments.
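A minimal sketch of the idea of improving performance from data rather than from reprogramming: a one-nearest-neighbour classifier (our own toy illustration, not a method named in the text) predicts the label of the closest stored example, so its behaviour is refined simply by adding records to its "database".

```python
def nearest_neighbour(database, query):
    """Predict the label of the stored example closest to the query.

    database: list of (feature_vector, label) pairs
    query: feature vector to classify
    """
    def sq_distance(record):
        features, _label = record
        return sum((a - b) ** 2 for a, b in zip(features, query))
    return min(database, key=sq_distance)[1]

# A tiny labelled database (invented data for illustration).
database = [([1.0, 1.0], "low"), ([9.0, 9.0], "high")]
print(nearest_neighbour(database, [2.0, 1.5]))   # low

# "Learning" here is just acquiring another labelled example -
# future predictions change with no change to the code.
database.append(([5.0, 5.0], "medium"))
print(nearest_neighbour(database, [4.5, 5.5]))   # medium
```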