LIS: Wikipedia
There is one thing that I can't understand:
why is X[M[i]] a non-decreasing sequence?
You can surely check this video for explanation:
https://www.youtube.com/watch?v=nf3YG4CnTbg&feature=youtu.be
My code for the O(n log n) approach is:
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n;
    cin >> n;                          // length of the array
    vector<int> v(n);
    for (int i = 0; i < n; i++) {
        cin >> v[i];
    }
    vector<int> d(n + 1, INT_MAX);     // auxiliary array of smallest tails
    for (int i = 0; i < n; i++) {      // note: i < n, not i <= n
        *lower_bound(d.begin(), d.end(), v[i]) = v[i];
    }
    for (int i = 0; i <= n; i++) {     // i <= n, so an all-increasing array prints n
        if (d[i] == INT_MAX) {
            cout << i;                 // length of the LIS
            return 0;
        }
    }
}
This is the O(n*lg(n)) solution from The Hitchhiker’s Guide to the Programming Contests (note: this implementation assumes there are no duplicates in the list). The trick is that, after each element is processed, the set holds exactly the smallest possible end element for every achievable subsequence length, so its final size is the length of the LIS:
set<int> st;
set<int>::iterator it;
st.clear();
for (int i = 0; i < n; i++) {
    st.insert(array[i]);
    it = st.find(array[i]);
    it++;
    if (it != st.end()) st.erase(it);
}
cout << st.size() << endl;
To account for duplicates one could check, for example, if the number is already in the set. If it is, ignore the number, otherwise carry on using the same method as before. Alternatively, one could reverse the order of the operations: first remove, then insert. The code below implements this behaviour:
set<int> st;
set<int>::iterator it;
st.clear();
for (int i = 0; i < n; i++) {
    it = st.lower_bound(a[i]);
    if (it != st.end()) st.erase(it);
    st.insert(a[i]);
}
cout << st.size() << endl;
The second algorithm can be extended to find the longest increasing subsequence (LIS) itself by maintaining a parent array that stores, for each element, the position of the previous element of the LIS in the original array.
typedef pair<int, int> IndexValue;   // (index in the original array, value)

struct IndexValueCompare {
    inline bool operator()(const IndexValue &one, const IndexValue &another) const {
        return one.second < another.second;
    }
};

vector<int> LIS(const vector<int> &sequence) {
    vector<int> parent(sequence.size());
    set<IndexValue, IndexValueCompare> s;   // current "tails", ordered by value
    for (int i = 0; i < (int)sequence.size(); ++i) {
        IndexValue iv(i, sequence[i]);
        if (i == 0) {
            s.insert(iv);
            continue;
        }
        auto index = s.lower_bound(iv);
        if (index != s.end()) {
            if (sequence[i] < sequence[index->first]) {
                if (index != s.begin()) {
                    parent[i] = (--index)->first;
                    index++;
                }
                s.erase(index);
            }
        } else {
            parent[i] = s.rbegin()->first;
        }
        s.insert(iv);
    }
    // Walk back from the largest tail through the parent links.
    vector<int> result(s.size());
    int index = s.rbegin()->first;
    for (auto iter = s.rbegin(); iter != s.rend(); index = parent[index], ++iter) {
        result[distance(iter, s.rend()) - 1] = sequence[index];
    }
    return result;
}
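A minimal driver for the function above (my own example: the sample values are arbitrary but distinct, as the function assumes; the needed headers are listed for completeness):
#include <iostream>
#include <set>
#include <vector>
using namespace std;

// ... IndexValue, IndexValueCompare and LIS() from above go here ...

int main() {
    vector<int> sequence = {2, 5, 3, 7, 11, 8, 10, 13, 6};
    vector<int> lis = LIS(sequence);
    for (int x : lis) cout << x << " ";   // prints one longest increasing subsequence
    cout << "\nlength: " << lis.size() << "\n";
    return 0;
}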
Let's first look at the n^2 algorithm:
dp[0] = 1;
for (int i = 1; i < len; i++) {
    dp[i] = 1;
    for (int j = 0; j < i; j++) {
        if (array[i] > array[j]) {
            if (dp[i] < dp[j] + 1) {
                dp[i] = dp[j] + 1;
            }
        }
    }
}
Now the improvement happens at the second loop: basically, you can improve the speed by using binary search. Besides the array dp[], let's have another array c[]. c[] is pretty special: c[i] is the minimum possible value of the last element of an increasing subsequence of length i.
sz = 1;
c[1] = array[0]; /* at this point, the minimum value of the last element of the size-1 increasing sequence must be array[0] */
dp[0] = 1;
for (int i = 1; i < len; i++) {
    if (array[i] < c[1]) {
        c[1] = array[i]; /* you have to update the minimum value right now */
        dp[i] = 1;
    }
    else if (array[i] > c[sz]) {
        c[sz + 1] = array[i];
        dp[i] = sz + 1;
        sz++;
    }
    else {
        int k = binary_search(c, sz, array[i]); /* you want to find k so that c[k-1] < array[i] < c[k] */
        c[k] = array[i];
        dp[i] = k;
    }
}
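Note that binary_search here is the snippet's own helper rather than std::binary_search. A minimal sketch of what it could look like, assuming c[1..sz] is sorted in increasing order (which is exactly why the trick works) and that this branch is only reached when c[1] <= array[i] <= c[sz]:
/* hypothetical helper: returns the smallest k in [1, sz] with c[k] >= key,
   i.e. (for distinct values) the k with c[k-1] < key < c[k] */
int binary_search(const int c[], int sz, int key) {
    int lo = 1, hi = sz;
    while (lo < hi) {
        int mid = lo + (hi - lo) / 2;
        if (c[mid] >= key)
            hi = mid;        /* c[mid] is a candidate; keep looking to the left */
        else
            lo = mid + 1;    /* c[mid] < key; the answer lies to the right */
    }
    return lo;
}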
You cannot understand it because the code on Wikipedia is wrong (I strongly believe so). It is not only wrong, but the variables are poorly named as well. Still, it made me spend time understanding how it works :D.
After reading about patience sort, I rewrote the algorithm, and I also wrote a corrected binary search.
Like insertion sort, patience sort finds the appropriate place for the next item by doing a binary search. The binary search is done on the card piles, which are built in sorted order. Let me assign a variable for the card piles (I am talking about playing cards because patience is a simplified card game).
//! top_card_list[h] holds the index of the top card of the pile of height h
int top_card_list[n+1];
for (int i = 0; i <= n; i++) {
    top_card_list[i] = -1;   // -1 means no pile of that height exists yet
}
Now top_card_list[h] contains the index of the top card of the card pile of height h. Patience sort places the card in hand over the highest top card that is smaller than it (or the opposite). For further sorting notes, please refer to the Wikipedia page for patience sort.
3
* 7 2
-------------------------------------------------------------
Pile of cards above (top card is larger than the lower cards)
(note that a pile of cards represents a longest increasing subsequence too!)
Now, to find a number while we do dynamic programming for the longest increasing subsequence, we run an inner loop, which is O(n):
for (int i = 1; i < n; i++) {            // outer loop
    for (int j = 0; j < i; j++) {        // inner loop
        if (arr[i] > arr[j]) {
            if (memo_len[i] < (memo_len[j] + 1)) {
                // relaxation
                memo_len[i] = memo_len[j] + 1;
                result = std::max(result, memo_len[i]);
                pred[i] = j;
            }
        }
    }
}
And the inner loop is there to find the highest top card that is smaller than our card in hand.
But we know that we can do this by binary search, because the top cards of the piles, taken in order of increasing pile height, form an increasing sequence (exercise: prove the correctness). That way we can do it in O(log(number of piles)) time. Now, O(number of piles) = O(number of cards) (but the number of cards is 52, so it should be O(1)! Just joking). So the total application runs in O(n log n) time.
Here is the revised DP with binary search.
for (int i = 1; i < n; i++) {
    pile_height[i] = 1;
    const int j = pile_search(top_card_list, arr, pile_len, arr[i]);
    if (arr[i] > arr[j]) {
        if (pile_height[i] < (pile_height[j] + 1)) {
            // relaxation
            pile_height[i] = pile_height[j] + 1;
            result = std::max(result, pile_height[i]);
            pile_len = std::max(pile_len, pile_height[i]);
        }
    }
    if (-1 == top_card_list[pile_height[i]] || arr[top_card_list[pile_height[i]]] > arr[i]) {
        top_card_list[pile_height[i]] = i;   // the top card of this pile is now card i
    }
}
Here is the correct pile search below. It is simply a binary search, but it finds the index of the top card that is smaller than the card in hand.
inline static int pile_search(const int *top_card_list, const vector<int> &arr,
                              int pile_len, int strict_upper_limit) {
    int start = 1, bound = pile_len;
    while (start < bound) {
        if (arr[top_card_list[bound]] < strict_upper_limit) {
            return top_card_list[bound];
        }
        int mid = (start + bound) / 2 + ((start + bound) & 1);   // round up so the loop always makes progress
        if (arr[top_card_list[mid]] >= strict_upper_limit) {
            // go lower
            bound = mid - 1;
        } else {
            start = mid;
        }
    }
    return top_card_list[bound];
}
Notice that, unlike Wikipedia's version, it returns top_card_list[bound] (my fix). Also notice where top_card_list[] is updated in the DP. This code is tested for the boundary cases. I hope it helps.
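For completeness, here is how the pieces above could be wired together. The wrapper below is my own (the original answer does not show the surrounding declarations); it assumes pile_search() from above is already declared and lightly condenses the nested ifs of the DP loop:
#include <algorithm>
#include <vector>
using namespace std;

int lis_length(const vector<int>& arr) {
    const int n = arr.size();
    if (n == 0) return 0;
    vector<int> pile_height(n, 1);         // pile_height[i] = height of the pile card i lands on
    vector<int> tops(n + 1, -1);           // tops[h] = index of the top card of the pile of height h
    int *top_card_list = tops.data();      // matches the const int* parameter of pile_search
    int pile_len = 1, result = 1;
    top_card_list[1] = 0;                  // the first card (index 0) starts the pile of height 1
    for (int i = 1; i < n; i++) {          // the "revised DP with binary search" from above
        pile_height[i] = 1;
        const int j = pile_search(top_card_list, arr, pile_len, arr[i]);
        if (arr[i] > arr[j] && pile_height[i] < pile_height[j] + 1) {
            pile_height[i] = pile_height[j] + 1;
            result = std::max(result, pile_height[i]);
            pile_len = std::max(pile_len, pile_height[i]);
        }
        if (top_card_list[pile_height[i]] == -1 || arr[top_card_list[pile_height[i]]] > arr[i]) {
            top_card_list[pile_height[i]] = i;
        }
    }
    return result;                         // length of the LIS
}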
I came up with this:
set<int> my_set;
set<int>::iterator it;
vector<int> out;
out.clear();
my_set.clear();
for (int i = 1; i <= n; i++) {
    my_set.insert(a[i]);
    it = my_set.find(a[i]);
    it++;
    if (it != my_set.end())
        my_set.erase(it);    // a[i] replaces the next larger element
    else
        out.push_back(a[i]); // a[i] is the largest so far, so the set grew
}
cout<< out.size();
We need to maintain lists of increasing sequences.
In general, we have a set of active lists of varying length. We are adding an element A[i] to these lists. We scan the lists (looking at their end elements) in decreasing order of length and check the end elements of all the lists to find a list whose end element is smaller than A[i] (a floor value).
Our strategy is determined by the following conditions:
1. If A[i] is the smallest among all end candidates of active lists, we start a new active list of length 1.
2. If A[i] is the largest among all end candidates of active lists, we clone the largest active list and extend it by A[i].
3. If A[i] is in between, we find the list with the largest end element that is smaller than A[i], then clone and extend this list by A[i]. We discard all other lists of the same length as the modified list.
Note that at any instant during the construction of active lists, the following condition is maintained:
“the end element of a smaller list is smaller than the end elements of larger lists”.
It will be clearer with an example (a compact code sketch follows the worked example below); let us take the example from the wiki:
{0, 8, 4, 12, 2, 10, 6, 14, 1, 9, 5, 13, 3, 11, 7, 15}.
A[0] = 0. Case 1. There are no active lists, create one.
0.
-----------------------------------------------------------------------------
A[1] = 8. Case 2. Clone and extend.
0.
0, 8.
-----------------------------------------------------------------------------
A[2] = 4. Case 3. Clone, extend and discard.
0.
0, 4.
0, 8. Discarded
-----------------------------------------------------------------------------
A[3] = 12. Case 2. Clone and extend.
0.
0, 4.
0, 4, 12.
-----------------------------------------------------------------------------
A[4] = 2. Case 3. Clone, extend and discard.
0.
0, 2.
0, 4. Discarded.
0, 4, 12.
-----------------------------------------------------------------------------
A[5] = 10. Case 3. Clone, extend and discard.
0.
0, 2.
0, 2, 10.
0, 4, 12. Discarded.
-----------------------------------------------------------------------------
A[6] = 6. Case 3. Clone, extend and discard.
0.
0, 2.
0, 2, 6.
0, 2, 10. Discarded.
-----------------------------------------------------------------------------
A[7] = 14. Case 2. Clone and extend.
0.
0, 2.
0, 2, 6.
0, 2, 6, 14.
-----------------------------------------------------------------------------
A[8] = 1. Case 3. Clone, extend and discard.
0.
0, 1.
0, 2. Discarded.
0, 2, 6.
0, 2, 6, 14.
-----------------------------------------------------------------------------
A[9] = 9. Case 3. Clone, extend and discard.
0.
0, 1.
0, 2, 6.
0, 2, 6, 9.
0, 2, 6, 14. Discarded.
-----------------------------------------------------------------------------
A[10] = 5. Case 3. Clone, extend and discard.
0.
0, 1.
0, 1, 5.
0, 2, 6. Discarded.
0, 2, 6, 9.
-----------------------------------------------------------------------------
A[11] = 13. Case 2. Clone and extend.
0.
0, 1.
0, 1, 5.
0, 2, 6, 9.
0, 2, 6, 9, 13.
-----------------------------------------------------------------------------
A[12] = 3. Case 3. Clone, extend and discard.
0.
0, 1.
0, 1, 3.
0, 1, 5. Discarded.
0, 2, 6, 9.
0, 2, 6, 9, 13.
-----------------------------------------------------------------------------
A[13] = 11. Case 3. Clone, extend and discard.
0.
0, 1.
0, 1, 3.
0, 2, 6, 9.
0, 2, 6, 9, 11.
0, 2, 6, 9, 13. Discarded.
-----------------------------------------------------------------------------
A[14] = 7. Case 3. Clone, extend and discard.
0.
0, 1.
0, 1, 3.
0, 1, 3, 7.
0, 2, 6, 9. Discarded.
0, 2, 6, 9, 11.
----------------------------------------------------------------------------
A[15] = 15. Case 2. Clone and extend.
0.
0, 1.
0, 1, 3.
0, 1, 3, 7.
0, 2, 6, 9, 11.
0, 2, 6, 9, 11, 15. <-- LIS List
Also, check that we have maintained the condition: “the end element of a smaller list is smaller than the end elements of larger lists”.
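A compact sketch of this strategy (my own condensation: since only the end elements of the active lists matter, the three cases collapse into a single lower_bound step on a sorted array of tails; the sample data is the sequence from the walkthrough above):
#include <algorithm>
#include <iostream>
#include <vector>
using namespace std;

int main() {
    vector<int> A = {0, 8, 4, 12, 2, 10, 6, 14, 1, 9, 5, 13, 3, 11, 7, 15};
    vector<int> tail;                       // tail[k] = end element of the active list of length k+1
    for (int x : A) {
        auto it = lower_bound(tail.begin(), tail.end(), x);
        if (it == tail.end())
            tail.push_back(x);              // case 2: clone the largest list and extend it
        else
            *it = x;                        // cases 1 and 3: the new, smaller end replaces the old one
    }
    cout << tail.size() << endl;            // prints 6, the LIS length for this input
    return 0;
}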
This algorithm is called Patience Sorting.
http://en.wikipedia.org/wiki/Patience_sorting
So, pick a suit from a deck of cards. Find the longest increasing subsequence of cards from the shuffled suit. You will never forget the approach.
Complexity: O(N log N)
Source: http://www.geeksforgeeks.org/longest-monotonically-increasing-subsequence-size-n-log-n/