Optimizing the backtracking algorithm solving Sudoku


Question


I'm hoping to optimize my backtracking algorithm for my Sudoku Solver.


What it does now:

The recursive solver function takes a Sudoku puzzle with various given values.

I scan through all the empty slots in the puzzle, looking for the slot with the fewest possibilities, and get its list of candidate values.

I then loop over that list, placing each value from it in the slot and recursively solving, until the entire grid is filled.


This implementation still takes an incredibly long time for some puzzles, and I hope to optimize it further. Does anyone have any ideas how I might do that?


Here's my code in Java if you're interested.

public int[][] Solve(int[][] slots) {
    // recursive solve v2 : optimization revision

    int[] least = new int[3];
    least[2] = Integer.MAX_VALUE;
    PuzzleGenerator value_generator = new PuzzleGenerator();
    LinkedList<Integer> least_values = null;

    // 1: find a slot with the least possible solutions
    // 2: recursively solve.

    // 1 - scour through all slots.
    int i = 0;
    int j = 0;
    while (i < 9) {
        j = 0;
        while (j < 9) {
            if (slots[i][j] == 0) {
                int[] grid_posi = { i, j };
                LinkedList<Integer> possible_values = value_generator
                        .possibleValuesInGrid(grid_posi, slots);
                if ((possible_values.size() < least[2])
                        && (possible_values.size() != 0)) {
                    least[0] = i;
                    least[1] = j;
                    least[2] = possible_values.size();
                    least_values = possible_values;
                }
            }
            j++;
        }
        i++;
    }

    // 2 - work on the slot
    if (least_values != null) {
        for (int x : least_values) {
            int[][] tempslot = new int[9][9];
            ArrayDeepCopy(slots, tempslot);
            tempslot[least[0]][least[1]] = x;

            /*ConsoleInterface printer = new gameplay.ConsoleInterface();
            printer.printGrid(tempslot);*/

            int[][] possible_sltn = Solve(tempslot);
            if (noEmptySlots(possible_sltn)) {
                System.out.println("Solved");
                return possible_sltn;
            }
        }
    }
    if (this.noEmptySlots(slots)) {
        System.out.println("Solved");
        return slots;
    }
    slots[0][0] = 0;
    return slots;
}

Answer 1:


I had an assignment to do just that: build the fastest Sudoku solver in Java. I ended up winning the contest with a time of 0.3 milliseconds.

I didn't use the Dancing Links algorithm and didn't compare against it, but some contestants must have tried it, yet my closest competitor took about 15 milliseconds.

I simply used a recursive backtracking algorithm, augmented it with 4 "rules" (which made backtracking unnecessary for almost every puzzle), and kept a bit field as the list of legal values for each position.

I wrote a blog post about it and posted the code here: http://www.byteauthor.com/2010/08/sudoku-solver/
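
The blog post has the full code; as a minimal sketch of the bit-field idea only (my own illustration, not the author's implementation), each row, column and box keeps a 9-bit mask of the digits already used, so the legal values for a cell fall out of three lookups and a couple of bit operations:

// Sketch: bit fields of used digits per row/column/box (illustrative, not the linked code).
// Bit k of a mask set means digit k+1 is already used in that unit.
class BitBoard {
    final int[] rows = new int[9], cols = new int[9], boxes = new int[9];
    final int[][] grid = new int[9][9];

    int box(int r, int c) { return (r / 3) * 3 + c / 3; }

    // Mask of digits still legal at (r, c); bit k set means digit k+1 is allowed.
    int candidates(int r, int c) {
        return ~(rows[r] | cols[c] | boxes[box(r, c)]) & 0x1FF;
    }

    void place(int r, int c, int digit) {
        int bit = 1 << (digit - 1);
        grid[r][c] = digit;
        rows[r] |= bit;
        cols[c] |= bit;
        boxes[box(r, c)] |= bit;
    }

    void unplace(int r, int c) {
        int bit = 1 << (grid[r][c] - 1);
        grid[r][c] = 0;
        rows[r] &= ~bit;
        cols[c] &= ~bit;
        boxes[box(r, c)] &= ~bit;
    }
}

Integer.bitCount(candidates(r, c)) then gives the number of remaining possibilities in constant time, which makes a fewest-candidates scan like the one in the question much cheaper than rebuilding a list for every cell.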




Answer 2:


A long time ago I wrote a Sudoku solver (several years back, but I keep all the code I write). It hasn't been generalized to sizes bigger than the usual Sudoku, but it is pretty fast.

It solves the following in 103 ms (on a Core 2 Duo 1.86 GHz) and really hasn't been optimized:

        {0,0,0,0,7,0,9,4,0},
        {0,7,0,0,9,0,0,0,5},
        {3,0,0,0,0,5,0,7,0},
        {0,8,7,4,0,0,1,0,0},
        {4,6,3,0,0,0,0,0,0},
        {0,0,0,0,0,7,0,8,0},
        {8,0,0,7,0,0,0,0,0},
        {7,0,0,0,0,0,0,2,8},
        {0,5,0,2,6,8,0,0,0},            

How fast is yours, and on which board is it slow? Are you sure you're not constantly revisiting paths that shouldn't be revisited?

Here's the meat of the algorithm:

private static void solveRec( final IPlatform p ) {
    if (p.fullBoardSolved()) {
        solved = p;
        return;
    }
    boolean newWayTaken = false;
    for (int i = 0; i < 9 && !newWayTaken; i++) {
        for (int j = 0; j < 9 && !newWayTaken; j++) {
            if (p.getByteAt(i, j) == 0) {
                newWayTaken = true;
                final Set<Byte> s = p.avail(i / 3, j /3);
                for (Iterator<Byte> it = s.iterator(); it.hasNext();) {
                    final Byte b = it.next();
                    if (!p.columnContains(j, b) && !p.lineContains(i, b)) {
                        final IPlatform ptemp = duplicateChangeOne(p, b, i, j);
                        solveRec(ptemp);
                        if (solved != null) {
                            return;
                        }
                    }
                }
            }
        }
    }
}

And the IPlatform abstraction (please be nice, it was written many years ago, before I knew that prefixing interface names with 'I' isn't the convention in Java):

public interface IPlatform {

    byte getByteAt(int i, int j);

    boolean lineContains(int line, int value);

    boolean columnContains(int column, int value);

    Set<Byte> avail(int i, int j);

    boolean fullBoardSolved();

}



Answer 3:


Do some constraint propagation before each nondeterministic step.

In practice this means that you have some rules which detect forced values and insert them, and only when this no longer makes progress do you resort to backtracking search through the possible values.

Most Sudoku puzzles for humans are designed so that they don't need backtracking at all.
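
As a minimal sketch of one such rule ("naked singles": a cell with exactly one legal candidate must take it), reusing the hypothetical BitBoard helper sketched under Answer 1:

// Sketch of a "naked singles" propagation pass (illustrative helper names).
// Fill every cell that has exactly one legal candidate, repeating until nothing
// changes; return false if some empty cell has no candidates (dead end).
static boolean propagate(BitBoard b) {
    boolean progress = true;
    while (progress) {
        progress = false;
        for (int r = 0; r < 9; r++) {
            for (int c = 0; c < 9; c++) {
                if (b.grid[r][c] != 0) continue;
                int cand = b.candidates(r, c);
                if (cand == 0) return false;              // contradiction, so backtrack
                if (Integer.bitCount(cand) == 1) {        // forced value
                    b.place(r, c, Integer.numberOfTrailingZeros(cand) + 1);
                    progress = true;
                }
            }
        }
    }
    return true;
}

You would run this before every guess and undo its placements along with the guess when backtracking.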




Answer 4:


A while ago I implemented Donald Knuth's Dancing Links and his Algorithm X for Sudoku in Ruby (a language not known for being particularly efficient). For the few examples I checked, it took a few milliseconds on my 1.5 GHz laptop.

You can look at Wikipedia to see how Dancing Links works and adapt it to Sudoku yourself, or take a look at "A Sudoku Solver in Java implementing Knuth's Dancing Links Algorithm".

PS: Algorithm X is a backtracking algorithm.
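
As a rough illustration of how Sudoku maps onto the exact-cover problem that Algorithm X solves (this is the standard construction, not code from the linked article), every candidate "digit d at row r, column c" becomes one row of a 729 x 324 0/1 matrix that covers exactly four constraint columns:

// Sketch: build the 729 x 324 exact-cover matrix for Sudoku.
// Matrix row index r*81 + c*9 + (d-1) stands for "digit d at cell (r, c)".
static boolean[][] buildExactCoverMatrix() {
    boolean[][] m = new boolean[729][324];
    for (int r = 0; r < 9; r++)
        for (int c = 0; c < 9; c++)
            for (int d = 1; d <= 9; d++) {
                int row = r * 81 + c * 9 + (d - 1);
                int box = (r / 3) * 3 + c / 3;
                m[row][r * 9 + c] = true;               // cell (r, c) gets exactly one digit
                m[row][81 + r * 9 + (d - 1)] = true;    // row r contains digit d exactly once
                m[row][162 + c * 9 + (d - 1)] = true;   // column c contains digit d exactly once
                m[row][243 + box * 9 + (d - 1)] = true; // box contains digit d exactly once
            }
    return m;
}

The givens of a puzzle are handled by pre-selecting the corresponding rows; Dancing Links is then just an efficient way to run Algorithm X over this matrix.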




Answer 5:


I think a big optimization would be to keep not only the state of the board, but also, for each row/column/square, whether it contains each of the digits 1-9. Then, to check whether a position can hold a number, you only need to check that the row, column and square containing that position don't already contain the number (just three array lookups).

Another big speed loss has to be making a new array for every recursive call. Instead, make the change in the array before the recursive call and undo it after the call returns. In other words, add the invariant that solve may change slots while it runs, but leaves it as it was when the function was called.

Also, as written, every time Solve returns you have to check whether the board is solved. Instead, if solve doesn't find a solution it should simply return null, and if it finds one it should return it. That way you can cheaply test whether the recursive call found a solution.

Does placing a number in the square with the fewest options really help? Without that the code is a lot simpler (you don't have to save things in linked lists etc.)

Here is my pseudocode:

for (square on the board)
    if (square is empty)
        for (possible value)
            if (this square can hold this value) {
                place value on the board
                mark that this row/col/square now contains this value

                recursive call
                if the recursive call succeeded, return its result

                unmark that this row/col/square contains this value
                undo placing the value on the board
            }
        return failure            (no value fit in this empty square)
return solved                     (no empty squares left)

Here is my code (I haven't tested it):

public int[][] solve(int[][] board, boolean[][] row, boolean[][] col, boolean[][] square){
    boolean noEmpty = true;
    for(int i = 0; i < 9; i++){
        for(int j = 0; j < 9; j++){
            if(board[i][j] == 0){
                noEmpty = false;
                int sq = (i/3)*3 + (j/3);
                for(int v = 1; v <= 9; v++){
                    if(!row[i][v-1] && !col[j][v-1] && !square[sq][v-1]){
                        // place v and mark it used in this row/column/square
                        board[i][j] = v;
                        row[i][v-1] = true;
                        col[j][v-1] = true;
                        square[sq][v-1] = true;
                        int[][] ans = solve(board, row, col, square);
                        if(ans != null)
                            return ans;
                        // undo the placement before trying the next value
                        square[sq][v-1] = false;
                        col[j][v-1] = false;
                        row[i][v-1] = false;
                        board[i][j] = 0;
                    }
                }
                // no value fits in this cell, so this branch is a dead end
                return null;
            }
        }
    }
    if(noEmpty){
        // no empty cells left: return a copy of the solved board
        int[][] ans = new int[9][9];
        for(int i = 0; i < 9; i++)
            for(int j = 0; j < 9; j++)
                ans[i][j] = board[i][j];
        return ans;
    }else{
        return null;
    }
}



Answer 6:


Finding the slot with the fewest possibilities is incredibly expensive, and for a traditional Sudoku puzzle it probably isn't worth the overhead.

An easier optimization is to keep track of how many times each digit has been used (making sure to include the digits the puzzle was seeded with), and when you try to place a digit in a slot, start with the one that has been used the least. This makes your algorithm more likely to start down a successful path rather than a failed one.
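
A minimal sketch of that ordering (the names are illustrative, not from the question's code):

import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Sketch: order the digits 1-9 by how rarely they already appear on the board,
// counting the puzzle's given digits as well. Least-used digits come first.
static List<Integer> digitsByUsage(int[][] board) {
    int[] used = new int[10];              // used[d] = how many times digit d appears
    for (int[] row : board)
        for (int v : row)
            if (v != 0) used[v]++;

    List<Integer> digits = new ArrayList<>();
    for (int d = 1; d <= 9; d++) digits.add(d);
    digits.sort(Comparator.comparingInt(d -> used[d]));
    return digits;
}

You would then iterate over digitsByUsage(...) instead of 1..9 when trying values for a slot.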

Also, do check out Artificial Intelligence: A Modern Approach as suggested by Imsasu. It is a fantastic book and covers recursive backtracking in good detail.

P.S. I'm curious as to the performance gains (if any) given by your "step 1" optimization. Do you have a figure?




Answer 7:


I recently wrote a program in Python that can solve a Sudoku puzzle. It is basically a backtracking algorithm which brute forces the search space. I have posted more details on the actual algorithm in this thread.

Here however I would like to focus more on the optimization process. To be more precise, I have explored different approaches to minimize the solve time and the number of iterations. And this is more about the algorithmic improvements that can be made, rather than the programming ones.

Having thought about it, there aren't many things in a backtracking brute-force algorithm that can be optimized (happy to be proven wrong here). The two real improvements that can be made are: first, the method by which the next blank cell is chosen, and second, the method by which the next possible digit is chosen. These two choices can make the difference between going down a dead-end search path and going down one that ends with a solution.

Next, I sat down and tried to come up with different methods for the aforementioned two choices. Here's what I came up with.

The next blank cell can be chosen in the following ways:

  • A - the first cell from left to right, from top to bottom
  • B - the first cell from right to left, from bottom to top
  • C - a randomly chosen cell
  • D - the closest cell to the center of the grid
  • E - the cell that currently has the fewest choices available (choice here means a digit from 1 to 9)
  • F - the cell that currently has the most choices available
  • G - the cell that has the fewest blank related cells (a related cell is one in the same row, the same column or the same 3x3 quadrant)
  • H - the cell that has the most blank related cells
  • I - the cell that is closest to all filled cells (as measured from cell center point to cell center point)
  • J - the cell that is furthest from all filled cells
  • K - the cell whose related blank cells have the fewest available choices
  • L - the cell whose related blank cells have the most available choices

And the next digit can be chosen in the following ways:

  • 0 - the lowest digit
  • 1 - the highest digit
  • 2 - a randomly chosen digit
  • 3 - heuristically, the least used digit across the board
  • 4 - heuristically, the most used digit across the board
  • 5 - the digit that will cause related blank cells to have the least number of choices available
  • 6 - the digit that will cause related blank cells to have the most number of choices available
  • 7 - the digit that is the least common available choice among related blank cells
  • 8 - the digit that is the most common available choice among related blank cells
  • 9 - the digit that is the least common available choice across the board
  • a - the digit that is the most common available choice across the board

So I have programmed the above methods into the program. The preceding digits and letters can be passed as parameters to the program and it will use the corresponding optimization method. What's more, because sometimes two and more cells can have the same score, there is an option to provide a second sorting parameter. For example parameter "EC" would mean choose a random cell from all the cells that have the fewest choices available.

The first function assigns weights multiplied by 1000 and the second function adds new weights multiplied by 1. Thus, if for example three cells get the same weight from the first function, e.g. 3000, 3000, 3000, then the second function adds its own weights, e.g. 3111, 3256, 3025. The sorting always chooses the lowest weight. And if the opposite is needed, the weight functions are called with -1000 and -1, but the sorting still chooses the lowest weight.
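
The answer's program is Python and isn't shown here, but purely as an illustration of that composite weighting, here is a sketch in Java, with blank cells represented as int[]{row, col} and the two criteria passed in as functions (all names are mine):

import java.util.Comparator;
import java.util.List;
import java.util.function.ToIntFunction;

// Sketch of the composite weighting: the primary criterion dominates through the
// *1000 factor, the secondary criterion only breaks ties, and passing -1000 / -1
// turns a "fewest" criterion into a "most" criterion while the selection still
// takes the lowest total score.
static int[] pickCell(List<int[]> emptyCells,
                      ToIntFunction<int[]> primary, int primaryFactor,
                      ToIntFunction<int[]> secondary, int secondaryFactor) {
    return emptyCells.stream()
            .min(Comparator.comparingInt(cell ->
                    primaryFactor * primary.applyAsInt(cell)
                  + secondaryFactor * secondary.applyAsInt(cell)))
            .orElseThrow();
}

A "fewest choices" criterion would pass the candidate count with factor 1000; a "most choices" criterion would pass the same count with factor -1000.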

Before proceeding it's worth mentioning that the program will always choose a blank cell (not a filled one) and will always choose a digit that is within the current Sudoku constraints of the cell (doing otherwise is just so unreasonable).

Having the above, I then decided to run the program with every possible combination of parameters to see what happens and which ones perform best - basically to brute force the brute force :) There are 12 methods for choosing a cell and 11 methods for choosing a digit, so in theory there are 17,424 combinations to try, but I removed some unnecessary ones (such as "AA", "BB", etc.) and also excluded the random methods, as they are all terribly inefficient, so the number of combinations in the end was 12,100. Each run was done on the same Sudoku puzzle, an easy one:

0,3,0,0,9,0,6,1,0
6,0,8,5,0,3,4,9,7
0,9,0,6,7,0,0,0,3
0,5,0,8,0,4,0,0,1
1,6,0,3,0,0,9,8,2
0,0,2,9,6,0,3,0,0
0,8,0,1,3,0,2,0,6
3,0,5,0,4,6,0,7,9
0,4,6,0,8,0,1,0,0

...and the search space is 36,691,771,392. This is just the product of the number of choices for each blank cell of the given puzzle. It is an overestimate, because as soon as one cell is filled it reduces the number of choices for other cells, but it is the fastest and easiest score I could come up with.

I wrote a short script (in Python, of course) that automated the whole testing process - it ran the solver for each set of parameters, recorded the completion time and dumped everything into a file. I did 20 runs of each combination because I was getting some 0 times from the time.time() function for single runs, and if any combination took more than 10 seconds to complete, the script would abort it and move on to the next one.

The script completed in 13:04:31 hours on a laptop with Intel Core i7-4712MQ 2.30GHz, no more than 2 out of 8 cores were being used and the average CPU load was about 12%. 8,652 out of the 12,100 combinations completed in under 10 seconds.

And the winners are (numbers adjusted back to single-run times/iterations):

1) Fastest time, 1.55 ms: "A0" and "A1" with 84 iterations and 46 backtrack iterations, and "B0", "B01", "B1", "B10", "BA01", "BA1", "BD01", "BD1" and "BD10" with 65 iterations and 27 backtrack iterations. The fastest methods are the simplest ones, like A, B and D. Another method does not appear until ranking position 308, and that is "E0".

2) Fewest iterations, 38 with 0 backtrack iterations: surprisingly, many methods managed to achieve this, the fastest being "B17", "B6", "B7", "BA16", "BA60", "BA7", "BD17" and "BD70" with a time of 2.3 ms, and the slowest being "IK91", "JK91", "KI91", "KJ91", "KJ9a", "IK9a", "JK9a" and "KI9a" with times of about 107 ms. Also surprisingly, method F has a few good positions here, such as "FB6" with 7 ms (???)

Overall, A, B, D, E, G and K seemed to perform significantly better than C, F, H and L, with I and J somewhere in between. Also, the choice of digit didn't seem to matter much.

And finally, let's see how these winning methods handle the world's hardest Sudoku puzzle, as claimed by this article: http://www.telegraph.co.uk/news/science/science-news/9359579/Worlds-hardest-sudoku-can-you-crack-it.html (bearing in mind that algorithms are not universally fast - some may do better on certain Sudoku puzzles but not on others). The puzzle is:

8,0,0,0,0,0,0,0,0
0,0,3,6,0,0,0,0,0
0,7,0,0,9,0,2,0,0
0,5,0,0,0,7,0,0,0
0,0,0,0,4,5,7,0,0
0,0,0,1,0,0,0,3,0
0,0,1,0,0,0,0,6,8
0,0,8,5,0,0,0,1,0
0,9,0,0,0,0,4,0,0

...and the search space is 95,865,912,019,648,512 x 10^20.

The winner is "A0", finishing in 1092 ms with 49,559 iterations and 49,498 backtrack iterations. Most of the others didn't do very well: "A0", "A1", "B0", "B01", "B1", "B10", "BA01", "BA1", "BD01", "BD1" and "BD10" finished in about 2500 ms and 91k iterations, and the rest took 30+ seconds and 400k+ iterations.

But that's not enough, so I ran a full test of all parameter sets for the hardest Sudoku too, this time doing a single run instead of 20 and with a cut-off time of 2.5 seconds. The script completed in 8:23:30 hours. 149 out of the 12,100 combinations completed in under 2.5 seconds. The winners in both categories are "E36", "E37", "EA36" and "EA37" with a time of 109 ms, 362 iterations and 301 backtrack iterations. Also, the first 38 positions were dominated by methods beginning with "E".

Overall, E tops the charts - no doubt about it, just by looking at the summary spreadsheet. A, B, I and J have a few rankings but nothing much, and the rest did not even make it under 2.5 seconds once.

In conclusion, I think it is safe to say that if the Sudoku puzzle is an easy one, brute-force it with the least complex method, but if it is a hard one, it's worth paying the overhead of the smarter choosing methods.

Hope this helps :)




Answer 8:


You should probably use a profiler to see which statement is taking the most time, and then think about how to optimize that.

Without using a profiler, my guess is that the problem is creating a new PuzzleGenerator from scratch each time and passing slots as an argument to the possibleValuesInGrid method. I think that means PuzzleGenerator recalculates everything from scratch on every call, for each position and each slots configuration, whereas it might be much more efficient if it remembered previous results and updated them incrementally.




Answer 9:


The results from my optimization of the backtracking algorithm for Sudoku are below. You can download the code from http://yikes.com/~bear/suds.c. It is purely based on the pigeonhole principle, and I found it to generally be faster than rules-based solving.

Using the values from another post in this thread, I get a result of 7 ms on a Core 2 Duo @ 2.2 GHz, or 3 ms on a Core i5. This compares with the other poster's result of about 100 ms, although that may have been measured in a different way. Timing added in http://yikes.com/~bear/suds2.c.

I wrote this 10 years ago and would certainly optimize it differently if I re-did the problem.
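
The C source isn't reproduced here, but here is a minimal Java sketch of the pigeonhole ("hidden single") rule it relies on: if a digit has only one legal position left in a row, it must go there. The sketch reuses the hypothetical BitBoard helper from the sketch under Answer 1; columns and boxes are handled analogously.

// Sketch of a "hidden singles" (pigeonhole) pass over rows.
// If a digit fits in exactly one empty cell of a row, place it there.
static boolean placeHiddenSinglesInRows(BitBoard b) {
    boolean progress = false;
    for (int r = 0; r < 9; r++) {
        for (int d = 1; d <= 9; d++) {
            int bit = 1 << (d - 1);
            int onlyCol = -1, count = 0;
            for (int c = 0; c < 9; c++) {
                if (b.grid[r][c] == 0 && (b.candidates(r, c) & bit) != 0) {
                    onlyCol = c;
                    count++;
                }
            }
            if (count == 1) {     // only one cell in this row can hold d
                b.place(r, onlyCol, d);
                progress = true;
            }
        }
    }
    return progress;
}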

$ ./a.out 000070940070090005300005070087400100463000000000007080800700000700000028050268000
[----------------------- Input  Data ------------------------]

*,*,*   *,7,*   9,4,*   
*,7,*   *,9,*   *,*,5   
3,*,*   *,*,5   *,7,*   

*,8,7   4,*,*   1,*,*   
4,6,3   *,*,*   *,*,*   
*,*,*   *,*,7   *,8,*   

8,*,*   7,*,*   *,*,*   
7,*,*   *,*,*   *,2,8   
*,5,*   2,6,8   *,*,*   

[------------------ Solution 01 -------------------]

2,1,5   8,7,6   9,4,3   
6,7,8   3,9,4   2,1,5   
3,4,9   1,2,5   8,7,6   

5,8,7   4,3,2   1,6,9   
4,6,3   9,8,1   7,5,2   
1,9,2   6,5,7   3,8,4   

8,2,6   7,4,3   5,9,1   
7,3,4   5,1,9   6,2,8   
9,5,1   2,6,8   4,3,7   

Time: 0.003s Cyles: 8619081 


Source: https://stackoverflow.com/questions/1518346/optimizing-the-backtracking-algorithm-solving-sudoku
