
Question

I'm wondering why the following argument doesn't work for showing that the existence of a Las Vegas algorithm also implies the existence of a deterministic algorithm:

Suppose that there is a Las Vegas algorithm A that solves some graph problem P, i.e., A takes an n-node input graph G as input (I'm assuming the number of edges is ≤ n) and eventually yields a correct output, while terminating within time T(G) with some nonzero probability.

Suppose that there is no deterministic algorithm that solves P. Let A_ρ be the deterministic algorithm given by running the Las Vegas algorithm A with a fixed bit string ρ as its random string. Let k = k(n) be the number of n-node input graphs (with ≤ n edges). Since there is no deterministic algorithm for P, it follows that, for every ρ, the deterministic algorithm A_ρ fails on at least one of the k input graphs. Returning to the Las Vegas algorithm A, this means that A has a failure probability of at least 1/k on some input, a contradiction to A being a Las Vegas algorithm.
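
The step from "for every ρ, A_ρ fails somewhere" to "A fails with probability ≥ 1/k on some input" is an implicit averaging argument; spelled out (a sketch in notation introduced just here, with ρ drawn uniformly from A's random strings):

\[
\mathbb{E}_{\rho}\bigl[\#\{G : A_{\rho}\ \text{fails on}\ G\}\bigr] \;\ge\; 1
\quad\Longrightarrow\quad
\exists\, G^{*}:\ \Pr_{\rho}\bigl[A_{\rho}\ \text{fails on}\ G^{*}\bigr] \;\ge\; \tfrac{1}{k},
\]

because the expectation on the left equals the sum, over the k input graphs, of the individual failure probabilities.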

Explanation / Answer

There is no contradiction here. A Las Vegas algorithm terminates in expected polynomial time, say Q(n). Markov's inequality shows that, for any given polynomial R(n), it terminates within time Q(n)R(n) with probability 1 − 1/R(n). The number of graphs on n vertices, k(n), is exponential in n, so the failure probability 1/R(n) that this guarantees is much larger than 1/k(n); a failure probability of at least 1/k(n) on some input is therefore entirely compatible with A being Las Vegas.
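
To spell out the Markov step (a sketch; here T_G denotes the running time of A on a fixed n-node input G, a name introduced just for this calculation):

\[
\Pr\bigl[T_G \ge R(n)\,Q(n)\bigr] \;\le\; \frac{\mathbb{E}[T_G]}{R(n)\,Q(n)} \;\le\; \frac{Q(n)}{R(n)\,Q(n)} \;=\; \frac{1}{R(n)}.
\]

As for the count, already the n-node graphs with exactly n edges give

\[
k(n) \;\ge\; \binom{\binom{n}{2}}{n} \;\ge\; \left(\frac{n-1}{2}\right)^{\!n},
\]

so 1/k(n) is exponentially smaller than 1/R(n) for every polynomial R.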

Another subtle point is uniformity. A deterministic algorithm needs to handle inputs of every size, while your A_ρ, having a fixed finite random string ρ, can only handle inputs of bounded size. In general, a randomized algorithm can use an unbounded number of random bits. However, your argument can still be formulated in terms of non-uniform models such as circuits, and there it is still wrong, for the reason described above.
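
For concreteness, the non-uniform version could be set up as follows (a sketch under assumptions of my own: truncate A after t(n) = Q(n)R(n) steps, so that it reads at most t(n) random bits and each A_ρ is a finite circuit):

\[
\rho \in \{0,1\}^{t(n)}, \qquad
\Pr_{\rho}\bigl[A_{\rho}\ \text{fails on } G\bigr] \;\le\; \frac{1}{R(n)} \quad \text{for each fixed } G,
\]

and the union bound over all k(n) inputs gives only k(n)/R(n) ≫ 1, so one cannot conclude that some single ρ works on every input.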