I am trying to "combine" two ArrayLists, producing a new ArrayList that contains all the numbers from the two combined lists, but without any duplicate elements, and with the elements in order.
Add the contents of ArrayList1 and ArrayList2 into a single list, ArrayList3. Now convert it into a set:

Set<Integer> uniqueSet = new HashSet<>(arrayList3);

The set will contain only the unique elements.

Note: an ArrayList allows duplicate values; a Set does not. Hope that solves your problem.
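For illustration, a minimal sketch of that approach (assuming two Integer lists; the variable names are just placeholders):

import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

List<Integer> arrayList3 = new ArrayList<>(list1); // copy the first list
arrayList3.addAll(list2);                          // append the second list

Set<Integer> uniqueSet = new HashSet<>(arrayList3); // duplicates are dropped here
List<Integer> result = new ArrayList<>(uniqueSet);  // back to a list if needed

One caveat: a HashSet does not preserve insertion order, so if order matters, use a LinkedHashSet (encounter order) or a TreeSet (sorted order) instead.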
Here is one solution using Java 8:

import java.util.Collection;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

List<Integer> noDupes = Stream.of(list1, list2)
        .flatMap(Collection::stream) // flatten the two lists into one stream
        .distinct()                  // keep only the first occurrence of each element
        // .sorted()                 // uncomment if you want a sorted list
        .collect(Collectors.toList());
The Java 8 Stream API can be used for this purpose:
ArrayList<String> list1 = new ArrayList<>();
list1.add("A");
list1.add("B");
list1.add("A");
list1.add("D");
list1.add("G");
ArrayList<String> list2 = new ArrayList<>();
list2.add("B");
list2.add("D");
list2.add("E");
list2.add("G");
List<String> noDup = Stream.concat(list1.stream(), list2.stream())
.distinct()
.collect(Collectors.toList());
noDup.forEach(System.out::println); // prints A, B, D, G, E (first-occurrence order)
In passing, it shouldn't be forgotten that distinct() relies on equals() (and hashCode()) to detect duplicates, so custom element types must override both.
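For example, a short sketch with a hypothetical Point class (not from the question) shows why both overrides matter:

import java.util.Objects;
import java.util.stream.Stream;

class Point {
    final int x, y;
    Point(int x, int y) { this.x = x; this.y = y; }

    // Without these two overrides, distinct() would treat
    // two equal-looking points as different elements.
    @Override public boolean equals(Object o) {
        if (!(o instanceof Point)) return false;
        Point p = (Point) o;
        return x == p.x && y == p.y;
    }
    @Override public int hashCode() { return Objects.hash(x, y); }
}

// prints 1; without the overrides it would print 2
System.out.println(Stream.of(new Point(1, 2), new Point(1, 2)).distinct().count());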
Your nested for loop

for(int j = 0; j < array2.size(); i++){

is infinite: j stays at zero forever, while i keeps increasing inside this loop. You then get an IndexOutOfBoundsException once i reaches plusArray.size().
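Presumably the increment was meant to be j++ (a guess, since only the loop header is shown):

for (int j = 0; j < array2.size(); j++) {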
I see your point that you don't want to use the built-in functions for merging or removing duplicates from the ArrayList. Your first code runs forever because the outer for loop's condition is always true: you keep adding elements to plusArray, so its size grows with every addition, i stays less than it, and the condition never fails. Tip: first merge the two lists, then remove the duplicate elements from the merged list, as in the sketch below. :)
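A minimal sketch of that two-step approach without Set or distinct() (assuming Integer lists; the method and variable names are made up for illustration):

import java.util.ArrayList;

static ArrayList<Integer> mergeWithoutDuplicates(ArrayList<Integer> list1, ArrayList<Integer> list2) {
    // Step 1: merge both lists into one.
    ArrayList<Integer> merged = new ArrayList<>(list1);
    merged.addAll(list2);

    // Step 2: copy over only the elements not seen before.
    ArrayList<Integer> result = new ArrayList<>();
    for (Integer value : merged) {
        boolean alreadySeen = false;
        for (Integer kept : result) {
            if (kept.equals(value)) {
                alreadySeen = true;
                break;
            }
        }
        if (!alreadySeen) {
            result.add(value);
        }
    }
    return result;
}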
You don't have to hand-code this. The problem definition is precisely the behavior of Apache Commons Collections' CollectionUtils#collate. It is also overloaded for different sort orders and for allowing duplicates.
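A sketch of how that call might look (assuming commons-collections4 is on the classpath and, as collate requires, both input lists are already sorted):

import java.util.List;
import org.apache.commons.collections4.CollectionUtils;

// Merges two pre-sorted lists; the boolean flag controls duplicates:
// false = drop duplicates, true = keep them.
List<Integer> merged = CollectionUtils.collate(list1, list2, false);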