A computational decision problem $P$ consists of a family $\bar x_n$ of questions (`instances') that are to be answered with yes or no. For a function problem, instances are to be answered with values from some set.

Complexity is fundamental to theoretical computer science. For a problem $P$, it deals with the question:

How expensive (in terms of resources like memory or, primarily, time) is a computational solution to $P$?

However, the question of computability logically precedes that of complexity:

Does $P$ admit of a computational solution at all?

Many problems are clearly computable (e.g. by means of exhaustive search); but for others, computability is not obvious or even fails.
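As an illustrative sketch (not taken from the text): the subset-sum decision problem is clearly computable by exhaustive search, since an instance of $n$ values admits only $2^n$ candidate subsets to check.

```python
from itertools import combinations

def subset_sum(values, target):
    """Decide by exhaustive search whether some subset of `values`
    sums to `target`.  Computability is obvious: only finitely many
    (namely 2**len(values)) candidate subsets exist, each checkable
    in finitely many steps."""
    for r in range(len(values) + 1):
        for combo in combinations(values, r):
            if sum(combo) == target:
                return True
    return False
```

Of course, such brute force establishes computability only; its exponential running time is precisely the kind of cost that complexity theory then examines.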

Complexity generally measures the amount of some resource necessary or sufficient to solve $P$, asymptotically in terms of the input size. Typical resources are running time (#steps) or space (#memory cells); others, for instance in parallel computing, include the number of processors or the volume of communication.
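A minimal (hypothetical) illustration of "resource measured asymptotically in the input size": instrumenting a naive linear scan with a step counter shows its worst-case running time growing linearly, i.e. $O(n)$ in the size $n$ of the input.

```python
def comparisons_in_naive_search(n):
    """Count the comparisons a linear scan performs on an input of
    size n, in the worst case (sought element absent)."""
    data = list(range(n))
    steps = 0
    for x in data:
        steps += 1       # one comparison charged per element inspected
        if x == -1:      # never true: models the worst case
            break
    return steps
```

Doubling the input size doubles the step count, which is exactly the asymptotic statement "running time $O(n)$".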

Turing machines are generally agreed to be the appropriate mathematical idealization of the capabilities of actual digital computers, that is, for problems involving finite binary strings, integers, or anything encodable as such (e.g. rational numbers as numerator/denominator pairs, or finite graphs). With regard to computations on real numbers, on the other hand, two major extensions have become well established:

  • Recursive Analysis considers computation on reals in terms of infinite approximations by rational numbers.
  • Blum-Shub-Smale machines (`algebraic model') perform finitely many arithmetic operations on real numbers exactly.
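To illustrate the first approach with a hedged sketch (the function name and precision convention are my own, not from the text): in Recursive Analysis a real number such as $\sqrt2$ is "given" by an algorithm that, on input $n$, outputs a rational within $2^{-n}$ of it. Bisection with exact rational arithmetic provides such an algorithm.

```python
from fractions import Fraction

def sqrt2_approx(n):
    """Return a rational q with |q - sqrt(2)| <= 2**-n, in the spirit
    of recursive analysis: the real sqrt(2) is represented by a program
    producing arbitrarily good rational approximations on demand."""
    lo, hi = Fraction(1), Fraction(2)    # sqrt(2) lies in [1, 2]
    while hi - lo > Fraction(1, 2**n):
        mid = (lo + hi) / 2
        if mid * mid <= 2:               # exact rational comparison
            lo = mid
        else:
            hi = mid
    return lo                            # lo <= sqrt(2) <= lo + 2**-n
```

A BSS machine, by contrast, would treat $\sqrt2$ as a single exact entity on which it performs arithmetic operations and sign tests at unit cost, with no approximation involved.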