How to make elements of vector unique? (remove non adjacent duplicates)

Backend · open · 11 answers · 978 views
Asked by 醉梦人生, 2020-11-29 05:01

I have a vector containing a few non-adjacent duplicates.

As a simple example, consider:

2 1 6 1 4 6 2 1 1

I am trying to make this vector unique by removing the non-adjacent duplicates while preserving the order of the elements, so the result would be:

2 1 6 4

11 answers
  • 2020-11-29 05:30

    Based on @Corden's answer, but uses a lambda expression, so duplicates are filtered out rather than copied into the output vector:

        set<int> s; // tracks the values already seen
        vector<int> nodups;
        // copy each element of the input vector v unless it was seen before
        remove_copy_if(v.begin(), v.end(), back_inserter(nodups),
            [&s](int x){
                return !s.insert(x).second; // .second == false means x already exists
            } );
    
  • 2020-11-29 05:31

    You can remove some of the loops in fa's answer using remove_copy_if:

    class NotSeen
    {
    public:
      explicit NotSeen (std::set<int> & seen) : m_seen (seen) { }

      // True for repeat occurrences: insert().second is true only the
      // first time a value goes into the set.
      bool operator ()(int i) const  {
        return !(m_seen.insert (i).second);
      }

    private:
      std::set<int> & m_seen;
    };
    
    void removeDups (std::vector<int> const & iv, std::vector<int> & ov) {
      std::set<int> seen;
      std::remove_copy_if (iv.begin ()
          , iv.end ()
          , std::back_inserter (ov)
          , NotSeen (seen));
    }
    

    This has no effect on the complexity of the algorithm (i.e. as written it is also O(n log n)). You can improve on this using unordered_set, or, if the range of your values is small enough, you could simply use an array or a bit array.

  • 2020-11-29 05:31

    Given that your input is in vector<int> foo, you can use remove to do the legwork for you. Then, if you want to shrink the vector, just use erase; otherwise, just use last as your one-past-the-end iterator when you want the vector with duplicates removed but order preserved:

    auto last = end(foo);
    
    for(auto first = begin(foo); first < last; ++first)
        last = remove(next(first), last, *first);
    
    foo.erase(last, end(foo));
    


    As far as time complexity goes, this is O(nm), where n is the number of elements and m is the number of unique elements. As far as space complexity goes, it uses O(1) extra space because it does the removal in place.

  • 2020-11-29 05:33

    Without using a temporary set it's possible to do this with (possibly) some loss of performance:

    template<class Iterator>
    Iterator Unique(Iterator first, Iterator last)
    {
        while (first != last)
        {
            Iterator next(first);
            last = std::remove(++next, last, *first);
            first = next;
        }
    
        return last;
    }
    

    used as in:

    vec.erase( Unique( vec.begin(), vec.end() ), vec.end() );
    

    For smaller data sets, the implementation simplicity and lack of extra allocation required may offset the theoretical higher complexity of using an additional set. Measurement with a representative input is the only way to be sure, though.

  • 2020-11-29 05:33

    Based on @fa's answer. It can also be rewritten using the STL algorithm std::stable_partition:

    struct dupChecker_ {
        inline dupChecker_() : tmpSet() {}
        inline bool operator()(int i) {
            // true for the first occurrence of i, false for later duplicates
            return tmpSet.insert(i).second;
        }
    private:
        std::set<int> tmpSet;
    };

    dupChecker_ isFirst;
    // std::ref shares one set even if the algorithm copies the predicate,
    // which the standard permits for stateful function objects
    k.erase(std::stable_partition(k.begin(), k.end(), std::ref(isFirst)), k.end());
    

    This way it is more compact and we do not need to manage the iterators ourselves.

    It does not seem to introduce much of a performance penalty either. I use it in a project that frequently handles quite large vectors of complex types, and it makes no real difference there.

    Another nice feature is that the notion of uniqueness can be adjusted by giving the set a custom comparator, std::set<int, myCmp_> tmpSet;. For instance, in my project this is used to ignore certain rounding errors.

  • 2020-11-29 05:35

    I would do it with two iterators on the vector:

    The first one reads the data and inserts it into a temporary set.

    When the value read was not already in the set, you copy it from the read iterator to the write iterator and increment the write iterator.

    At the end you keep only the data up to the write iterator.

    The complexity is O( n log n ), as the lookup for duplicated elements uses the set, not the vector.

    #include <vector>
    #include <set>
    #include <iostream>

    int main(int argc, char* argv[])
    {
        std::vector< int > k ;

        k.push_back( 2 );
        k.push_back( 1 );
        k.push_back( 6 );
        k.push_back( 1 );
        k.push_back( 4 );
        k.push_back( 6 );
        k.push_back( 2 );
        k.push_back( 1 );
        k.push_back( 1 );

        {
            std::vector< int >::iterator r , w ;

            std::set< int > tmpset ;

            for( r = k.begin() , w = k.begin() ; r != k.end() ; ++r )
            {
                if( tmpset.insert( *r ).second )
                {
                    *w++ = *r ;
                }
            }

            k.erase( w , k.end() );
        }

        {
            std::vector< int >::iterator r ;

            for( r = k.begin() ; r != k.end() ; ++r )
            {
                std::cout << *r << std::endl ;
            }
        }
    }