Decision, Optimisation & Construction

Posted on Nov 18, 2023

(You can watch the video version of this post on YouTube.)


Now when we talk about an optimisation problem, there are actually different variants that we're not usually explicit about. What is, say, the independent set problem?

INDEPENDENT SET (CONSTRUCTION)
GIVEN: a graph $G$
QUESTION: what is the largest independent set in $G$?

INDEPENDENT SET (OPTIMISATION)
GIVEN: a graph $G$
QUESTION: what is the size of the largest independent set in $G$?

INDEPENDENT SET (DECISION)
GIVEN: a graph $G$, an integer $k$
QUESTION: does there exist an independent set of size at least $k$?

We don’t usually care that much about this distinction: algorithms usually solve the construction version (but only usually! We’ll talk about some exceptions another time), complexity theory is usually about the decision version, and we don’t mind because the variants are typically all polynomially related.

Going down the list, a solution to one variant lets you pretty much read off the answer to the next. In the other direction, if there is no funny business with weird problems and complexity classes, you can get from Decision to Optimisation efficiently, for instance by binary search over $k$. What I wanted to talk about today is how to get from Decision to Construction. In some specific cases it’s a fun puzzle, and sometimes you can be smart about it.
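
As a quick aside, here is what the Decision → Optimisation direction looks like concretely for INDEPENDENT SET. This is only a sketch: the oracle name `has_independent_set_of_size` is made up, standing in for whatever decision procedure you happen to have.

```python
def max_independent_set_size(n, has_independent_set_of_size):
    """Decision -> Optimisation via binary search on k.

    `has_independent_set_of_size(k)` is a hypothetical black box answering
    "does G contain an independent set of size at least k?"; n = |V(G)|.
    """
    lo, hi = 0, n                      # size 0 is trivially achievable
    while lo < hi:
        mid = (lo + hi + 1) // 2       # round up so the search always progresses
        if has_independent_set_of_size(mid):
            lo = mid                   # achievable: the optimum is at least mid
        else:
            hi = mid - 1               # not achievable: the optimum is below mid
    return lo                          # about log2(n) oracle calls in total
```

Monotonicity is what makes this work: if an independent set of size $k$ exists, then so does one of every smaller size, so the Yes answers form a prefix and binary search applies.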


Let’s say we have a procedure to decide satisfiability: given a formula, it tells us Yes or No. How do we now find a satisfying assignment? Suppose I can adaptively run the procedure on multiple inputs, as in a Turing reduction. Here’s one way to do it.

First you check the entire formula: this must be Yes, or there is no satisfying assignment at all. Then you pick an arbitrary variable and set it to true, let’s say. Just substitute it everywhere: that’s a valid SAT instance again, so we can check it. If Yes, then there exists a satisfying assignment that sets this variable to true; if No, then every satisfying assignment sets it to false – remember: the entire thing was satisfiable. So substitute false instead. Either way, we have now fixed one of the variables and still have a satisfiable formula on the remaining ones.

So with one query per variable (plus the initial check), we find a satisfying assignment. You might notice that later queries work on smaller instances, or think about entropy or something, but let’s not get into that.
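
In code, that loop looks something like the sketch below. The interface is invented for illustration: `is_satisfiable` stands in for the assumed decision procedure, and `formula.substitute(var, value)` is an assumed helper that fixes one variable and simplifies the formula.

```python
def find_satisfying_assignment(formula, variables, is_satisfiable):
    """Turn a SAT decision oracle into a satisfying assignment.

    Both `is_satisfiable(formula)` and `formula.substitute(var, value)`
    are hypothetical interfaces assumed for this sketch.
    """
    if not is_satisfiable(formula):
        return None                      # no satisfying assignment exists at all
    assignment = {}
    for var in variables:
        fixed_true = formula.substitute(var, True)
        if is_satisfiable(fixed_true):
            # some satisfying assignment sets var = True; commit to that
            assignment[var], formula = True, fixed_true
        else:
            # the current formula is satisfiable, so every satisfying
            # assignment must set var = False; commit to that instead
            assignment[var], formula = False, formula.substitute(var, False)
    return assignment
```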


Instead, let’s do Hamiltonian Cycle. How can we find a Hamiltonian cycle given a procedure to decide Hamiltonicity?

  1. Same idea as for SAT. Pick an arbitrary edge and remove it: is the graph still Hamiltonian? If Yes, there exists a Hamiltonian cycle without this edge, so just forget about it. If No, put it back: all (remaining) Hamiltonian cycles use this edge. Do this to every edge and you’ll be left with exactly the edges of a Hamiltonian cycle. The number of queries is the number of edges of the graph.

  2. Pick an arbitrary vertex and check all its incident edges as above. Only two edges will survive. Follow one of these edges and repeat the process at the next vertex: the edge we arrived by must stay, we just need to find the “outgoing” edge from this vertex. This effectively walks along the cycle, discovering it as you go.

  3. Let’s find this “outgoing edge” more efficiently. Rather than checking each incident edge individually, let’s remove half of them. The same logic as before says that either this is fine and we can forget about that half forever, or the outgoing edge must be in the removed half, and the other half is no good. Either way, you get rid of half of the incident edges, so you find a valid outgoing edge with about $\log_2(\Delta)$ queries, where $\Delta$ is the degree of the vertex. In the worst case that’s $O(n \log n)$ queries instead of $O(n^2)$. (There’s a code sketch of this walk after the list.)
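
Here is a sketch of approaches 2 and 3 combined. Everything about the interface is an assumption for illustration: the oracle `is_hamiltonian` takes a set of edges (as two-element frozensets) and answers True or False, and the graph is given as a vertex list plus an edge list.

```python
def find_hamiltonian_cycle(vertices, edges, is_hamiltonian):
    """Recover a Hamiltonian cycle from a yes/no Hamiltonicity oracle.

    `is_hamiltonian` is a hypothetical black box taking a set of
    frozenset({u, v}) edges over `vertices`; the rest of the interface
    is likewise an assumption made for this sketch.
    """
    current = {frozenset(e) for e in edges}
    if not is_hamiltonian(current):
        return None                              # nothing to construct

    def neighbours(v):
        return [u for u in vertices if u != v and frozenset((v, u)) in current]

    # Approach 2, first step: at an arbitrary start vertex, test its incident
    # edges one at a time; exactly the two edges of some Hamiltonian cycle survive.
    start = vertices[0]
    for u in neighbours(start):
        trial = current - {frozenset((start, u))}
        if is_hamiltonian(trial):
            current = trial                      # some cycle avoids this edge
    first_step = neighbours(start)[0]            # follow one of the two survivors

    # Approach 3: walk along the cycle. The edge we arrived by is forced, so we
    # binary-search among the other incident edges for the unique outgoing edge,
    # permanently discarding half of the candidates with each oracle call.
    cycle = [start]
    prev, here = start, first_step
    while here != start:
        cycle.append(here)
        candidates = [u for u in neighbours(here) if u != prev]
        while len(candidates) > 1:
            half = candidates[: len(candidates) // 2]
            rest = candidates[len(candidates) // 2:]
            trial = current - {frozenset((here, u)) for u in half}
            if is_hamiltonian(trial):
                current = trial                  # the outgoing edge lies in `rest`
                candidates = rest
            else:
                # every surviving cycle leaves `here` through `half`, so the
                # edges towards `rest` can never be the outgoing edge: drop them
                current -= {frozenset((here, u)) for u in rest}
                candidates = half
        prev, here = here, candidates[0]
    return cycle                                 # vertices in cycle order
```

Roughly speaking, this makes one query per edge at the start vertex plus one binary search per remaining vertex, so on the order of $n + n\log_2 n$ queries in the worst case.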

So if you have a nonconstructive algorithm, maybe you can be smart about how to use it.