Question
I have two entities in a one-to-many relationship, joined by composite primary keys. Because Spring Data generates a wrong count-distinct query for the Oracle database, the SQL result contains a cartesian join, which repeats the parent row once for every child row. I need to find the distinct parent objects based on their composite keys, then collect each parent's child objects into a list and set that list as a property of the parent object.
I am able to find the distinct parent objects based on the composite keys of the parent. The following is the relevant code:
@SafeVarargs
private static <T> Predicate<T> distinctByKeys(Function<? super T, ?>... keyExtractors)
{
    // remembers every key combination that has already been seen
    final Map<List<?>, Boolean> seen = new ConcurrentHashMap<>();
    return t ->
    {
        // build the key combination for this element from all extractors
        final List<?> keys = Arrays.stream(keyExtractors)
                .map(ke -> ke.apply(t))
                .collect(Collectors.toList());
        // keep the element only if this combination has not been seen before
        return seen.putIfAbsent(keys, Boolean.TRUE) == null;
    };
}
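For context, the filter is applied roughly like this (a minimal sketch; Book, getTitle(), getPages(), and the input list are as sketched further down in the question). It keeps only the first row per composite key, so the pages of the dropped duplicate rows are lost, which is the remaining problem:
// Sketch only: getTitle() and getPages() are the assumed accessors of the composite key.
List<Book> distinctParents = list.stream()
        .filter(distinctByKeys(Book::getTitle, Book::getPages))
        .collect(Collectors.toList());
// Result: one Book per composite key, but "Core Java" keeps only [Page 1];
// the pages of the dropped duplicate rows are lost.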
Suppose I have the following input:
list.add(new Book("Core Java", 200, new ArrayList(){{add("Page 1");}}));
list.add(new Book("Core Java", 200, new ArrayList(){{add("Page 2");}}));
list.add(new Book("Learning Freemarker", 150, new ArrayList(){{add("Page 15");}}));
list.add(new Book("Spring MVC", 300, new ArrayList(){{add("Page 16");}}));
list.add(new Book("Spring MVC", 300, new ArrayList(){{add("Page 17");}}));
I need to produce the following output:
Core Java, 200, [Page 1, Page 2]
Learning Freemarker, 150, [Page 15]
Spring MVC, 300, [Page 16, Page 17]
Any help in this regard would be much appreciated.
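For reference, the snippets above and in the answer assume a Book class roughly like the following minimal sketch (the field and accessor names are assumptions; the real classes are JPA entities with composite keys):
// Minimal sketch of the assumed Book class; the real parent/child classes are JPA entities.
public class Book {
    private final String title;
    private final int pages;
    private final List<String> list;   // child values, e.g. page names

    public Book(String title, int pages, List<String> list) {
        this.title = title;
        this.pages = pages;
        this.list = list;
    }

    public String getTitle() { return title; }
    public int getPages() { return pages; }
    public List<String> getList() { return list; }

    @Override
    public String toString() {
        return "Book[title='" + title + "', pages=" + pages + ", list=" + list + "]";
    }
}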
Answer 1:
One option is to use Collectors.toMap(), using the title and pages values as the key. If you find duplicates, you can merge both page lists:
Collection<Book> result = list.stream()
        .collect(Collectors.toMap(
                // key: the composite key (title, pages)
                b -> Map.entry(b.getTitle(), b.getPages()),
                // value: a copy of the book
                b -> new Book(b.getTitle(), b.getPages(), b.getList()),
                // on duplicate keys: merge the two page lists into one book
                (b1, b2) -> new Book(b1.getTitle(), b1.getPages(),
                        Stream.concat(b1.getList().stream(), b2.getList().stream())
                                .collect(Collectors.toList())),
                // LinkedHashMap keeps the insertion order of the input list
                LinkedHashMap::new))
        .values();
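The merge function is invoked only when two books map to the same (title, pages) key, so for the sample input it runs once for the two Core Java rows and once for the two Spring MVC rows; the LinkedHashMap supplier preserves the insertion order of the input, which is why the merged result comes out in the same order as the desired output.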
Alternatively you can use Collectors.groupingBy():
List<Book> result = list.stream().collect(
        // group by the composite key (title, pages), preserving insertion order
        Collectors.groupingBy(b -> Map.entry(b.getTitle(), b.getPages()), LinkedHashMap::new,
                // downstream: flatten the page lists of all books in a group into one list
                Collectors.flatMapping(b -> b.getList().stream(), Collectors.toList())))
        .entrySet().stream()
        // rebuild a Book from each (key, merged page list) entry
        .map(e -> new Book(e.getKey().getKey(), e.getKey().getValue(), e.getValue()))
        .collect(Collectors.toList());
The second approach creates a Map that uses title and pages as the key and merges the page lists of all books with the same key using Collectors.flatMapping(). After that you map the entries back to your Book objects.
If you cannot use Collectors.flatMapping() (it was only added in Java 9), use Collectors.mapping() instead. It collects the page lists into a List<List<String>>, which is then flattened while mapping the entries back to Book:
List<Book> r = list.stream().collect(
        Collectors.groupingBy(b -> Map.entry(b.getTitle(), b.getPages()), LinkedHashMap::new,
                // downstream: collect the page lists as a List<List<String>>
                Collectors.mapping(Book::getList, Collectors.toList())))
        .entrySet().stream()
        .map(e -> new Book(e.getKey().getKey(), e.getKey().getValue(),
                // flatten the nested lists while mapping back to Book
                e.getValue().stream().flatMap(Collection::stream).collect(Collectors.toList())))
        .collect(Collectors.toList());
If you cannot use Map.entry() (also Java 9+), use new AbstractMap.SimpleEntry<>() instead, as in the sketch below.
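For instance, combining both fallbacks, a variant of the mapping approach that should work on Java 8 might look like this (a sketch under the same assumed Book class):
// Same grouping approach, but AbstractMap.SimpleEntry as the composite key (works on Java 8)
List<Book> java8Result = list.stream().collect(
        Collectors.groupingBy(
                b -> new AbstractMap.SimpleEntry<>(b.getTitle(), b.getPages()),
                LinkedHashMap::new,
                Collectors.mapping(Book::getList, Collectors.toList())))
        .entrySet().stream()
        .map(e -> new Book(e.getKey().getKey(), e.getKey().getValue(),
                e.getValue().stream().flatMap(Collection::stream).collect(Collectors.toList())))
        .collect(Collectors.toList());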
The result in all cases will be this:
Book[title='Core Java', pages=200, list=[Page 1, Page 2]]
Book[title='Learning Freemarker', pages=150, list=[Page 15]]
Book[title='Spring MVC', pages=300, list=[Page 16, Page 17]]
Source: https://stackoverflow.com/questions/57221600/how-to-merge-child-objects-in-list-based-on-duplicate-parent-object