0
votes

The time complexity of an algorithm can differ depending on the programming language it is implemented in, because certain operations that are possible in one language are not possible in another. Turing-completeness also says nothing about what time complexities a language can achieve.
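
For example, here is a rough Python sketch of the kind of difference I mean (the cons-list encoding below is just for illustration): binary search is O(log n) when the language provides O(1) random access, but if the only sequence type available is a linked list, each access inside the same algorithm costs O(n), so the overall bound gets worse.

```python
def binary_search(xs, target):
    """O(log n) total, assuming xs[i] is an O(1) operation
    (true for a Python list / an array)."""
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return True
        elif xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return False

def nth(cons_list, i):
    """In a language whose only sequence type is a linked (cons) list,
    reaching index i costs O(i), so using it inside binary search gives
    O(n log n) instead of O(log n)."""
    node = cons_list
    for _ in range(i):
        node = node[1]   # follow the 'tail' pointer
    return node[0]       # the 'head' value

# e.g. binary_search([1, 3, 5, 7, 9], 7) -> True
# e.g. nth((1, (3, (5, (7, (9, None))))), 3) -> 7
```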

My question is: what are the requirements for a programming language to be able to solve every problem with the best time complexity achievable in any language? Would being Turing-complete, combined with the ability to inspect and modify data structures in constant time, be enough?

1
Being a non-deterministic Turing machine? (Or a quantum computer, if you disallow the first?) – amit
I'm not getting your point. Turing-completeness apart, how do you say "the time complexity of algorithms can differ from programming language to programming language in which it is implemented"? Can you give an example? – vish4071

1 Answer

2
votes

I think that it's provably impossible to build a single programming language or model of computation that can optimally solve every computational problem.

There's a result in theoretical computer science called the time hierarchy theorem which says that for any reasonable (time-constructible) function f(n), there are problems that can be solved in time O(f(n)) on a Turing machine but not in time o(f(n) / log f(n)). The proof essentially works like this: consider the problem "does a Turing machine M reject input w within f(|w|) steps?" You can solve this deterministically in time O(f(n)^k) for some constant k just by simulating M on w for f(|w|) steps and seeing what happens. However, if you could solve this problem "too much faster" than that, then a halting-problem-like argument kicks in: you can build a program that asks whether it is going to reject its own input within f(|w|) steps, and then does the opposite of whatever the answer says.
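
To make the diagonalization step concrete, here is a rough Python sketch of the shape of that argument. The `simulate` helper is purely hypothetical - a stand-in for a universal simulator with a step budget - and is stubbed out; only the structure of the contradiction matters.

```python
def f(n):
    # Any time-constructible bound; n**3 is a placeholder choice.
    return n ** 3

def simulate(program, data, step_budget):
    """Hypothetical universal simulator: run `program` on `data` for at
    most `step_budget` steps and return True iff it rejects in time.
    Stubbed out here; it only marks where the real simulation would go."""
    raise NotImplementedError("stand-in for a step-bounded simulator")

def D(program_description):
    """Diagonalizing program: ask what the given program does on its own
    description within f(n) steps, then do the opposite."""
    n = len(program_description)
    rejects_in_time = simulate(program_description,
                               program_description,
                               step_budget=f(n))
    return "accept" if rejects_in_time else "reject"

# Running D on its own description is the contradiction: D would have to
# do the opposite of whatever D does, which is impossible if the bounded
# rejection question could be answered much faster than f(n) steps.
```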

I am fairly confident that for any feasible model of computation, you could find an analogue of the time hierarchy theorem. For example, suppose you have a computer of type X, and consider the problem "does X accept w within f(|w|) steps?" A computer of type X can solve this, but it can't do it too quickly or it results in a contradiction. Therefore, we could probably argue that there are some functions a(x) and b(x) such that a computer of type X could solve this problem in time O(a(x)) but not time O(a(x) / b(x)). So now go and define a model of computation X' that is basically X, but in which the operation "solve the problem 'does X accept w within f(|w|) steps?'" is a built-in operation that requires one step. Now, this model of computation X' can solve at least one problem faster than X. We could keep iterating this construction to build a model of computation X'' that can solve certain problems faster than X', a model of computation X''' that can solve certain problems faster than X'', etc.
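
Very roughly, the iterated construction looks like this (a loose Python sketch; `Model`, `accepts_within`, and `extend` are all invented names, and the self-simulation is stubbed out):

```python
class Model:
    """Stand-in for 'a computer of type X'."""
    def __init__(self, name, oracle=None):
        self.name = name
        self.oracle = oracle      # one-step built-in operation, if any

    def accepts_within(self, w, steps):
        """The question 'does this model accept w within `steps` steps?'.
        Solvable by the model itself, just not too quickly; stubbed."""
        raise NotImplementedError("stand-in for bounded self-simulation")

def extend(base):
    """Build the next model X': identical to `base`, except that the
    bounded acceptance question about `base` is a single primitive step."""
    return Model(name=base.name + "'",
                 oracle=lambda w, steps: base.accepts_within(w, steps))

# X, X', X'', ...: each model answers in one step a question that the
# previous model provably cannot answer that quickly, so no model in the
# chain is "best".
X = Model("X")
X_prime = extend(X)
X_double_prime = extend(X_prime)
```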

Therefore, I'm fairly confident that you can't find a single "best" model of computation, since any model of computation can be used against itself to define a model of computation that's even faster than it on some specific problem. (This is somewhat related to Gödel's incompleteness theorem - any sufficiently powerful consistent formal system has true statements it can't prove, and if you fix this by adding such a statement as an axiom, then the new system still has statements it can't prove, etc.)