Question
This is problem 9.5 from Cracking the Coding Interview 5th edition
The Problem: Write a method to compute all permutations of a string
Here is my solution, coded in Java (test it, it works :) )
import java.util.LinkedList;
import java.util.Queue;

public static void generatePerm(String s) {
    Queue<Character> poss = new LinkedList<Character>();
    int len = s.length();
    // Load every character of the input into the queue of available letters.
    for (int count = 0; count < len; count++)
        poss.add(s.charAt(count));
    generateRecurse(poss, len, "");
}

private static void generateRecurse(Queue<Character> possibles, int n, String word) {
    if (n == 0)
        System.out.println(word);
    else {
        for (int count = 0; count < n; count++) {
            // Take the next available letter, recurse on the rest, then put it back.
            char first = possibles.remove();
            generateRecurse(possibles, n - 1, word + first);
            possibles.add(first);
        }
    }
}
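For a quick sanity check, assuming the two methods above live in the same class, calling the method on a three-letter word prints all 3! = 6 permutations. The exact order shown below comes from tracing the queue-based backtracking by hand, so treat it as illustrative:
generatePerm("top");
// Output (one permutation per line):
// top
// tpo
// opt
// otp
// pto
// pot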
I agreed with the author that my solution runs in O(n!) time complexity, because to solve this problem you have to consider factorials: for a word like "top" there are three possibilities for the first letter, two for the second, and so on.
However she didn't make any mention of space complexity. I know that interviewers love to ask you the time and space complexity of your solution. What would the space complexity of this solution be?
My initial guess was O(n²), because there are n recursive calls at each level n. So you would add n + (n-1) + (n-2) + ... + 1 to get n(n+1)/2, which is in O(n²). I reasoned that there are n recursive calls because you have to backtrack n times at each level, and that space complexity is the number of recursive calls your algorithm makes. For example, when considering all the permutations of "TOP", at the top level (n = 3), 3 recursive calls are made: gR([O,P], 2, "T"), gR([P,T], 2, "O"), and gR([T,O], 2, "P"). Is my analysis of space complexity correct?
Answer 1:
I think you got the right answer but for the wrong reason. The number of recursive calls doesn't have anything to do with it. When you make a recursive call, it will add a certain amount of space to the stack; but when that call exits, the stack space is released. So suppose you have something like this:
void method(int n) {
    if (n == 1) {
        for (int i = 0; i < 10000; i++) {
            method(0);
        }
    }
}

method(1);
Although method calls itself 10000 times, there will still be no more than 2 invocations of method on the stack at any one time. So the space complexity would be O(1) [constant].
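To make that distinction concrete, here is a small sketch in Java; the class and counter names (StackDepthDemo, totalCalls, depth, maxDepth) are my own and not part of the answer. It counts both the total number of calls made and the maximum number of invocations alive on the stack at once for the example above:
public class StackDepthDemo {
    static int depth = 0;      // invocations currently on the stack
    static int maxDepth = 0;   // deepest the stack ever got
    static int totalCalls = 0; // every call ever made

    static void method(int n) {
        depth++;
        totalCalls++;
        maxDepth = Math.max(maxDepth, depth);
        if (n == 1) {
            for (int i = 0; i < 10000; i++) {
                method(0);
            }
        }
        depth--; // this frame is about to be released
    }

    public static void main(String[] args) {
        method(1);
        // Prints: totalCalls = 10001, maxDepth = 2
        System.out.println("totalCalls = " + totalCalls + ", maxDepth = " + maxDepth);
    }
}
The 10000 inner calls each reuse the space just released by the previous one, which is why maxDepth never goes past 2.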
The reason your algorithm has space complexity O(n²) is because of the word string. When n gets down to 0, there will be len stack entries being taken up by invocations of generateRecurse. There will be len stack entries at most, so the space usage of the stack will only be O(n); but each of those stack entries has its own word, which will all exist on the heap at the same time; and the lengths of those word parameters are 1, 2, ..., len, which of course do add up to (len * (len+1)) / 2, which means the space usage will be O(n²).
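As a rough check of that reasoning, one could instrument the recursion with counters. The variant below is my own sketch (the names PermSpaceDemo, liveFrames, and liveWordChars are not from the post); it skips the printing and only records how many frames, and how many word characters, are alive at the deepest point:
import java.util.LinkedList;
import java.util.Queue;

public class PermSpaceDemo {
    static int liveFrames = 0;     // invocations of generateRecurse currently on the stack
    static int maxLiveFrames = 0;
    static int liveWordChars = 0;  // total length of the word strings held by live frames
    static int maxLiveWordChars = 0;

    static void generateRecurse(Queue<Character> possibles, int n, String word) {
        liveFrames++;
        liveWordChars += word.length();
        maxLiveFrames = Math.max(maxLiveFrames, liveFrames);
        maxLiveWordChars = Math.max(maxLiveWordChars, liveWordChars);
        if (n != 0) { // printing omitted; we only measure space here
            for (int count = 0; count < n; count++) {
                char first = possibles.remove();
                generateRecurse(possibles, n - 1, word + first);
                possibles.add(first);
            }
        }
        liveWordChars -= word.length();
        liveFrames--;
    }

    public static void main(String[] args) {
        String s = "topmost"; // len = 7
        Queue<Character> poss = new LinkedList<Character>();
        for (int i = 0; i < s.length(); i++) {
            poss.add(s.charAt(i));
        }
        generateRecurse(poss, s.length(), "");
        // Expect maxLiveFrames = len + 1 = 8 (counting the innermost n == 0 call)
        // and maxLiveWordChars = 0 + 1 + ... + len = len*(len+1)/2 = 28.
        System.out.println("maxLiveFrames = " + maxLiveFrames
                + ", maxLiveWordChars = " + maxLiveWordChars);
    }
}
The frame count grows linearly with the word length while the live word characters grow quadratically, which matches the O(n) stack / O(n²) overall analysis above.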
MORE ABOUT STACK FRAMES: It appears that an explanation of the basics of stack frames would be helpful...
A "stack frame" is just an area of memory that's part of the "stack". Typically, the stack is a predefined area of memory; the location and size of stack frames, however, are not predefined. When a program is first executed, there won't be anything on the stack (actually, there will probably be some initial data there, but let's say there's nothing, just to keep things simple). So the stack area of memory looks like this:
bottom of stack                                           top of stack
----------------------------------------------------------------------
| nothing                                                            |
----------------------------------------------------------------------
^
+--- stack pointer
(This assumes that the stack grows upward, from lower to higher addresses. Many machines have stacks that grow downward. To simplify, I'll keep assuming that this is a machine whose stack grows upward.)
When a method (function, procedure, subroutine, etc.) is called, a certain area of the stack is allocated. The area is enough to hold the method's local variables (or references to them), parameters (or references to them), some data so that the program will know where to go back when you return, and possibly other information--the other information is highly dependent on the machine, the programming language, and the compiler. In Java, the first method will be main, so a frame is set aside for it and the stack now looks like this:
bottom of stack                                           top of stack
----------------------------------------------------------------------
| main's frame | nothing                                             |
----------------------------------------------------------------------
               ^
               +--- stack pointer
Note that the stack pointer has moved up. Now main calls method1. Since method1 will return to main, the local variables and parameters of main have to be preserved for when main gets to resume executing. A new frame, of some size, is allocated on the stack:
bottom of stack                                           top of stack
----------------------------------------------------------------------
| main's frame | method1's frame | nothing                           |
----------------------------------------------------------------------
                                 ^
                                 +--- stack pointer
and then method1 calls method2:
bottom of stack                                           top of stack
----------------------------------------------------------------------
| main's frame | method1's frame | method2's frame | nothing         |
----------------------------------------------------------------------
                                                   ^
                                                   +--- stack pointer
Now method2 returns. After method2 returns, its parameters and local variables will no longer be accessible. Therefore, the entire frame can be thrown out. This is done by moving the stack pointer back to where it was before. (The "previous stack pointer" is one of the things saved in some frame.) The stack goes back to looking like this:
bottom of stack                                           top of stack
----------------------------------------------------------------------
| main's frame | method1's frame | nothing                           |
----------------------------------------------------------------------
                                 ^
                                 +--- stack pointer
This means that, at this point, the machine will see the portion of the stack starting with the stack pointer as "unused". It's not really correct to speak of method2's frame being reused. You can't really use something that has ceased to exist, and method2's frame no longer exists. Conceptually, all there is is a big empty area of the stack. If method1 calls another method, whether it's method2, method1 recursively, System.out.println, or something else, a new frame will be created at the place where the stack pointer is now pointing. This frame could be smaller, equal, or larger in size than the method2 frame used to be. It will take up part or all of the memory where the method2 frame was. If it's another call to method2, it doesn't matter whether it's called with the same or different parameters. It can't matter, because the program doesn't remember what parameters were used last time. All it knows is that the area of memory starting with the stack pointer is empty and available for use. The program has no idea what frame most recently lived there. That frame is gone, gone, gone.
If you can follow this, you can see that when computing the space complexity and when looking just at the amount of space used by the stack, the only thing that matters is, how many frames can exist on the stack at any one point in time? Frames that may have existed in the past but no longer do are not relevant to the computation, no matter what parameters the methods were called with.
(P.S. In case anyone was planning to point out how I'm technically wrong about this or that detail--I already know that this is a gross oversimplification.)
Source: https://stackoverflow.com/questions/29224822/is-my-analysis-of-space-complexity-correct