Solving SAT via Positive Supercompilation

  • There are many very simple heuristics for solving 3-SAT, some even with analytical convergence estimates. A simple random walk (Algorithm P in Knuth's Satisfiability fascicle), which iterates through the clauses and flips a random variable whenever it encounters a violated clause, will do the trick too.
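
    A minimal sketch of that random-walk idea (in the spirit of Knuth's Algorithm P, though this version picks a violated clause at random rather than sweeping the clauses in order; the clause encoding and function name are my own):

    ```python
    import random

    def random_walk_sat(clauses, n_vars, max_flips=100000):
        """Random-walk SAT: start from a random assignment; while some
        clause is violated, flip a random variable from a violated clause.
        Clauses are lists of nonzero ints, DIMACS-style: literal v means
        "variable v is true", -v means "variable v is false"."""
        assign = {v: random.choice([True, False]) for v in range(1, n_vars + 1)}
        for _ in range(max_flips):
            violated = [c for c in clauses
                        if not any((lit > 0) == assign[abs(lit)] for lit in c)]
            if not violated:
                return assign      # satisfying assignment found
            lit = random.choice(random.choice(violated))
            assign[abs(lit)] = not assign[abs(lit)]
        return None                # gave up; this is not a proof of UNSAT

    # (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
    clauses = [[1, 2], [-1, 3], [-2, -3]]
    sol = random_walk_sat(clauses, 3)
    ```

    The catch, of course, is the flip budget: on satisfiable 3-SAT instances the walk succeeds in expectation, but the expected number of flips can grow exponentially with the number of variables.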

  • Conversion of an NFA graph to a DFA (in regex processing) seems to be a kind of supercompilation. The NFA is treated as a process and is simulated on every possible input symbol at each step. The supercompiler notes which sets of states it can be in and turns each such set into a DFA state.
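
    That is the classic subset construction; a minimal sketch (epsilon moves omitted for simplicity, and the transition-table encoding here is my own):

    ```python
    def nfa_to_dfa(nfa, start, alphabet):
        """Subset construction: from each reachable *set* of NFA states,
        simulate the NFA on every symbol; each set becomes one DFA state.
        `nfa` maps (state, symbol) -> set of successor states."""
        start_set = frozenset([start])
        dfa = {}                      # (frozenset, symbol) -> frozenset
        todo, seen = [start_set], {start_set}
        while todo:
            s = todo.pop()
            for a in alphabet:
                t = frozenset(q2 for q in s for q2 in nfa.get((q, a), ()))
                dfa[(s, a)] = t
                if t not in seen:
                    seen.add(t)
                    todo.append(t)
        return dfa, start_set

    # NFA over {a, b} accepting strings ending in "ab" (accept state 2)
    nfa = {(0, 'a'): {0, 1}, (0, 'b'): {0}, (1, 'b'): {2}}
    dfa, d0 = nfa_to_dfa(nfa, 0, 'ab')
    ```

    The supercompilation analogy: driving the "process" (the NFA) through all possible inputs and merging the resulting configurations into residual states.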

  • Ah, should've double checked before posting. Thank you everyone who corrected me. Leaving it up for others.

    > SAT is an NP-complete problem, meaning that it is at least as hard as any other problem in NP

    I think this statement is backwards. Instead, any other problem in NP is at least as hard as SAT.

  • I'm almost completely clueless about both 3-SAT and supercompilation, but I'm willing to bet that if you found a neat trick that solves 3-SAT problems easily, then either or both of the following hold: (a) there are classes and/or sizes of 3-SAT problems for which your solution breaks down badly (goes exponential), or (b) what you're doing reduces to a 3-SAT strategy that the people who enter 3-SAT contests[1] already knew about a long time ago.

    [1] http://www.satcompetition.org/

  • Is this just compiler-optimized memoization? I mean, this is great for parallelization as well, but I thought this was standard in the functional programming paradigm. Maybe I am wrong.
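
    For contrast, memoization in the ordinary sense is run-time caching of results, as in this small sketch; supercompilation is different in that it transforms the program itself at compile time, specializing and unfolding code rather than caching values:

    ```python
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def fib(n):
        # Memoization: results of earlier calls are cached at run time,
        # so the naive exponential recursion becomes linear.
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    result = fib(50)
    ```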