Local Search

1.  The Maximum Cut Problem

    --  Input: An undirected graph G = (V, E).

    --  Goal: A cut (A, B), i.e., a partition of V into two non-empty sets, that maximizes the number of crossing edges.

    --  Fact: NP-hard (the decision version is NP-complete).

    --  Computationally tractable special case: bipartite graphs (i.e., graphs with a cut such that all edges are crossing). Solvable in linear time via breadth-first search: put the odd BFS levels in one set and the even levels in the other, as in the sketch below.
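    --  A minimal sketch of the bipartite case in Python (the function name, adjacency-list format, and 0-indexed vertices are illustrative assumptions, not from the notes):

        from collections import deque

        def bfs_bipartite_cut(n, edges):
            """For a bipartite graph on vertices 0..n-1, return a cut (A, B)
            in which every edge crosses: even BFS levels go to A, odd to B."""
            adj = [[] for _ in range(n)]
            for u, v in edges:
                adj[u].append(v)
                adj[v].append(u)
            side = [None] * n              # 0 = set A, 1 = set B
            for s in range(n):             # loop handles disconnected graphs
                if side[s] is not None:
                    continue
                side[s] = 0
                queue = deque([s])
                while queue:
                    u = queue.popleft()
                    for w in adj[u]:
                        if side[w] is None:
                            side[w] = 1 - side[u]   # next level, other set
                            queue.append(w)
            A = {v for v in range(n) if side[v] == 0}
            return A, set(range(n)) - A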

 

2.  A Local Search Algorithm

    --  Notation: For a cut (A, B) and a vertex v, define:

        -- c_v(A, B) = # of edges incident on v that cross (A, B)

        -- d_v(A, B) = # of edges incident on v that don't cross (A, B)

    --  Local search algorithm:

        -- Let (A, B) be an arbitrary cut of G.

        -- While there is a vertex v with d_v(A, B) > c_v(A, B):

            -- Move v to the other side of the cut.

               [Key point: this increases the number of crossing edges by d_v(A, B) - c_v(A, B) > 0.]

        -- Return the final cut (A, B).

    --  Running time: Terminates within C(n, 2) = n(n-1)/2 iterations (the maximum possible number of edges). [Each iteration increases the number of crossing edges by at least 1, and the number of crossing edges never exceeds |E| <= C(n, 2).]

    --  Performance guarantee: This local search algorithm always outputs a cut in which the number of crossing edges is at least 50% of the maximum possible (in fact, at least 50% of |E|, which upper-bounds the maximum).
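    --  A runnable sketch of the local search above (names are mine; with weight sums in place of the edge counts, the same loop handles the weighted variant of section 5):

        import random

        def local_search_max_cut(n, edges):
            """Start from an arbitrary cut; while some vertex v has
            d_v(A, B) > c_v(A, B), move v to the other side."""
            side = [random.randint(0, 1) for _ in range(n)]   # arbitrary cut
            adj = [[] for _ in range(n)]
            for u, v in edges:
                adj[u].append(v)
                adj[v].append(u)
            improved = True
            while improved:
                improved = False
                for v in range(n):
                    c = sum(1 for w in adj[v] if side[w] != side[v])  # crossing
                    d = len(adj[v]) - c                               # non-crossing
                    if d > c:          # moving v gains d - c crossing edges
                        side[v] = 1 - side[v]
                        improved = True
            A = {v for v in range(n) if side[v] == 0}
            return A, set(range(n)) - A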

 

3.  The expected number of crossing edges of a random cut is already |E|/2.

    -- Proof: Consider a random cut (A, B), where each vertex is placed in A or B independently with probability 1/2. For each edge e in E, define

        X_e = 1 if e crosses (A, B), 0 otherwise.

        We have E[X_e] = Pr[X_e = 1] = 1/2.

        By linearity of expectation, E[# crossing edges] = E[Sum(e in E){X_e}] = Sum(e in E){E[X_e]} = |E|/2.
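    -- A quick sanity check of this calculation by simulation (illustrative only; the function name is mine):

        import random

        def estimate_crossing(n, edges, trials=10000):
            """Estimate E[# crossing edges] of a uniformly random cut;
            the estimate should approach |E| / 2 as trials grows."""
            total = 0
            for _ in range(trials):
                side = [random.randint(0, 1) for _ in range(n)]
                total += sum(1 for u, v in edges if side[u] != side[v])
            return total / trials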

 

4.  Proof of Performance Guarantee

    -- Let (A, B) be a locally optimal cut. Then, for every vertex v, d_v(A, B) <= c_v(A, B).

    -- Summing over all v in V:

       Sum(v in V){d_v(A, B)} <= Sum(v in V){c_v(A, B)}

       [The left side counts each non-crossing edge twice; the right side counts each crossing edge twice.]

    -- So:

       2[# of non-crossing edges] <= 2[# of crossing edges]

       Adding 2[# of crossing edges] to both sides: 2|E| <= 4[# of crossing edges]

       [# of crossing edges] >= 1/2 |E|

 

5.  The Weighted Maximum Cut Problem

    -- Generalization: Each edge e in E has a nonnegative weight w_e.

    -- Goal: Maximize the total weight of crossing edges.

    -- Notes:

        -- Local search is still well defined: move a vertex whenever doing so increases the total weight of crossing edges.

        -- The 50% performance guarantee still holds for locally optimal cuts (and, in expectation, for a random cut), now measured in edge weight.

        -- But the algorithm is no longer guaranteed to converge in a polynomial number of iterations.

 

6.  Neighborhoods

    -- Let X = set of candidate solutions to a problem.

       Examples: cuts of a graph, TSP tours, constraint satisfaction problem (CSP) variable assignments

    -- Key ingredient: Neighborhoods

       For each x in X, specify which y in X are its "neighbors"

       Examples: x, y are neighboring cuts <==> they differ by moving one vertex

                 x, y are neighboring variable assignments <==> they differ in the value of a single variable

                 x, y are neighboring TSP tours <==> they differ in exactly two edges
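    -- Illustrative neighborhood generators for two of these examples (the list-based representations are assumptions):

        def cut_neighbors(side):
            """Cuts encoded as 0/1 lists: neighbors differ by moving one vertex."""
            for v in range(len(side)):
                neighbor = side[:]
                neighbor[v] = 1 - neighbor[v]
                yield neighbor

        def tour_neighbors(tour):
            """TSP tours encoded as vertex orderings: the 2-change neighbors,
            which differ in exactly two edges, come from reversing a
            contiguous segment of the tour."""
            n = len(tour)
            for i in range(n - 1):
                for j in range(i + 2, n):
                    if i == 0 and j == n - 1:
                        continue   # reversing everything after tour[0] gives the same cycle
                    yield tour[:i + 1] + tour[i + 1:j + 1][::-1] + tour[j + 1:]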

 

7.  A Generic Local Search Algorithm

    --  Let x = some initial solution.

    --  While the current solution x has a superior neighboring solution y: Set x := y

    --  Return the final (locally optimal) solution x
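    --  The generic algorithm as a sketch, parameterized by a neighborhood and an objective to maximize (the signature is an assumption):

        def local_search(x, neighbors, objective):
            """Move to a strictly better neighbor until none exists,
            then return the locally optimal solution x."""
            while True:
                fx = objective(x)
                better = next((y for y in neighbors(x) if objective(y) > fx), None)
                if better is None:
                    return x              # x is locally optimal
                x = better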

 

8.  Local Search FAQ

    --  How to pick initial solution x?

        -- Use a random solution as an initial solution. Run many independent trials of local search, return the best locally optimal solution found.

        -- Use your best heuristics to find an initial solution and use local search as a postprocessing step to make your solution even better.

    --  If there are multiple superior neighbors y, which one should be chosen?

        -- Choose one at random

        -- Choose the one giving the biggest improvement

        -- Use more complex heuristics

    --  How to define neighborhoods?

        -- Bigger neighborhoods (each solution has many neighbors) ==> slower to verify local optimality, but fewer (bad) local optima

        -- Find "sweet spot" between solution quality and efficient searchability.

    --  Is local search guaranteed to terminate (eventually)?

        -- If X is finite and every local step improves some objective function, then yes.

    --  Is local search guaranteed to converge quickly?

        -- Usually not.

    --  Are locally optimal solutions generally good approximations to globally optimal ones?

        -- No. [To mitigate, run randomized local search many times and remember the best locally optimal solution found; see the sketch below.]
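    --  The mitigation above as a sketch, reusing local_search from the sketch after section 7 (random_solution is a hypothetical generator of random candidate solutions):

        def local_search_with_restarts(random_solution, neighbors, objective, trials):
            """Run local search from many independent random starts;
            keep the best locally optimal solution found."""
            best = None
            for _ in range(trials):
                x = local_search(random_solution(), neighbors, objective)
                if best is None or objective(x) > objective(best):
                    best = x
            return best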

 

9.  The 2-SAT Problem

    --  Input:

        -- n boolean variables x1, x2, ..., xn (each can be set to TRUE or FALSE)

        -- m clauses of 2 literals each (a "literal" is xi or !xi)

           Example: (x1 v x2) ^ (!x1 v x3) ^ (x3 v x4) ^ (!x2 v !x4)

    --  Output: "Yes" if there is an assignment that simultaneously satisfies every clause, "no" otherwise.

           Example: "yes", via (e.g.) x1 = x3 =TRUE and x2 = x4 =FALSE
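    --  One convenient encoding (an assumption, reused in the sketch after section 10): a literal is a signed integer, +i for xi and -i for !xi, and an assignment is a list of booleans.

        def literal_value(lit, assignment):
            """assignment[i] holds the value of x(i+1); negation flips it."""
            value = assignment[abs(lit) - 1]
            return value if lit > 0 else not value

        def satisfies(clauses, assignment):
            """True iff every 2-literal clause has at least one true literal."""
            return all(literal_value(a, assignment) or literal_value(b, assignment)
                       for a, b in clauses)

        # The example instance: (x1 v x2) ^ (!x1 v x3) ^ (x3 v x4) ^ (!x2 v !x4)
        clauses = [(1, 2), (-1, 3), (3, 4), (-2, -4)]
        print(satisfies(clauses, [True, False, True, False]))   # True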

 

10.  Papadimitriou's 2-SAT Algorithm

    --  Repeat log2 n times:

        -- Choose a random initial assignment.

        -- Repeat 2n^2 times:

            -- If the current assignment satisfies all clauses, halt + report it.

            -- Else, pick an arbitrary unsatisfied clause and flip the value of one of its two variables [chosen uniformly at random].

    --  Report "unsatisfiable"

    --  Obvious good points:

        -- Runs in polynomial time

        -- Always correct on unsatisfiable instances    

    --  Key question: If there's a satisfying assignment, will the algorithm find one (with probability close to 1)?
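    --  A sketch of the algorithm, reusing literal_value and satisfies from the encoding after section 9 (helper names are mine):

        import math
        import random

        def papadimitriou_2sat(n, clauses):
            """Return a satisfying assignment, or None for "unsatisfiable"."""
            for _ in range(max(1, math.ceil(math.log2(n)))):
                assignment = [random.random() < 0.5 for _ in range(n)]
                for _ in range(2 * n * n):
                    unsat = [c for c in clauses
                             if not (literal_value(c[0], assignment) or
                                     literal_value(c[1], assignment))]
                    if not unsat:
                        return assignment           # halt + report it
                    a, b = random.choice(unsat)     # an unsatisfied clause
                    lit = random.choice((a, b))     # flip one of its variables
                    assignment[abs(lit) - 1] = not assignment[abs(lit) - 1]
            return None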

 

11.  Random Walks

    --  Setup: Initially (at time 0), at position 0.

    --  At each time step, your position goes up or down by 1, with 50/50 probability.

    --  Except at position 0, where you move to position 1 with 100% probability.

    --  For an integer n >= 0, let T_n = number of steps until the random walk reaches position n. Claim: E[T_n] = n^2.

        --  Let Z_i = number of random walk steps to reach n starting from position i. (Note Z_0 = T_n.)

        --  Edge cases: E[Z_n] = 0, E[Z_0] = 1 + E[Z_1].

        --  For i in {1, 2, ..., n-1},

            E[Z_i] = Pr[go left] E[Z_i | go left] + Pr[go right] E[Z_i | go right]

                   = 1/2 (1 + E[Z_{i-1}]) + 1/2 (1 + E[Z_{i+1}])

            Rearranging: E[Z_i] - E[Z_{i+1}] = E[Z_{i-1}] - E[Z_i] + 2

            Since E[Z_0] - E[Z_1] = 1, induction gives E[Z_i] - E[Z_{i+1}] = 2i + 1. Telescoping from i = 0 to n-1: E[Z_0] = 1 + 3 + 5 + ... + (2n-1) + E[Z_n] = n^2.

    -- Corollary: Pr[T_n > 2n^2] <= 1/2.

        -- n^2 = E[T_n] = Sum(k = 0 to 2n^2){k Pr[T_n = k]} + Sum(k = 2n^2 + 1 to infinity){k Pr[T_n = k]}

                        >= 2n^2 * Pr[T_n > 2n^2]

           so Pr[T_n > 2n^2] <= n^2 / (2n^2) = 1/2.
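    -- A simulation of the walk (illustrative only): empirical hitting times should come out near n^2.

        import random

        def hitting_time(n):
            """Steps for the reflecting walk on {0, 1, ...} to first reach n."""
            position, steps = 0, 0
            while position < n:
                position += 1 if (position == 0 or random.random() < 0.5) else -1
                steps += 1
            return steps

        # E[T_10] = 100; the empirical mean should be close.
        print(sum(hitting_time(10) for _ in range(2000)) / 2000)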

 

12.  For a satisfiable 2-SAT instance with n variables, Papadimitriou's algorithm produces a satisfying assignment with probability >= 1 - 1/n.

    --  Fix an arbitrary satisfying assignment a*.

    --  Let a_t = the algorithm's assignment after inner iteration t (t = 0, 1, ..., 2n^2).

    --  Let X_t = number of variables on which a_t and a* agree. (X_t in {0, 1, ..., n} is a random variable.)

    --  Note: If X_t = n, the algorithm halts with the satisfying assignment a*.

    --  Key point: Suppose a_t is not a satisfying assignment and the algorithm picks an unsatisfied clause with variables xi, xj.

    --  Consequence of the algorithm's random flip:

        -- If a* and a_t differ on both xi and xj, then X_{t+1} = X_t + 1 (100% probability). [Both of a_t's values on xi, xj disagree with a*, so either flip increases agreement.]

        -- If a* and a_t differ on exactly one of xi, xj, then X_{t+1} = X_t + 1 (50% probability) or X_{t+1} = X_t - 1 (50% probability).

    --  The random variables X_0, X_1, ..., X_{2n^2} behave just like a random walk on the nonnegative integers, except that:

        -- Sometimes they move right with 100% probability (instead of 50%) [when a_t and a* differ on both xi and xj].

        -- Might have X_0 > 0 instead of X_0 = 0.

        -- Might stop early, before X_t = n [the algorithm may halt on a satisfying assignment other than a*].

    --  All three deviations only make the walk reach n sooner, so the probability that a single iteration of the outer loop fails to find a satisfying assignment is at most Pr[T_n > 2n^2] <= 1/2.

        Pr[algorithm fails] <= Pr[all log2 n independent trials fail] <= (1/2)^(log2 n) = 1/n.
