java - Flatten a Map<Integer, List<String>> to Map<String, Integer> with stream and lambda
I want to flatten a Map which associates an Integer key to a list of Strings, without losing the key mapping. I am curious whether it is possible, and useful, to do so with stream and lambda.
We start with something like this:

Map<Integer, List<String>> mapFrom = new HashMap<>();

Let's assume mapFrom is populated somewhere, and looks like:

1: a, b, c
2: d, e, f
etc.

Let's also assume that the values in the lists are unique.
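For concreteness, such a map could be populated like this (a minimal sketch matching the example data above):

Map<Integer, List<String>> mapFrom = new HashMap<>();
mapFrom.put(1, Arrays.asList("a", "b", "c"));  // key 1 maps to a, b, c
mapFrom.put(2, Arrays.asList("d", "e", "f"));  // key 2 maps to d, e, f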
Now, I want to "unfold" it to get a second map like:

a: 1
b: 1
c: 1
d: 2
e: 2
f: 2
etc.
I could do it like this (or very similarly, using forEach):

Map<String, Integer> mapTo = new HashMap<>();
for (Map.Entry<Integer, List<String>> entry : mapFrom.entrySet()) {
    for (String s : entry.getValue()) {
        mapTo.put(s, entry.getKey());
    }
}
Now let's assume that I want to use lambda instead of nested for loops. I would probably do something like this:

Map<String, Integer> mapTo = mapFrom.entrySet().stream().map(e -> {
    e.getValue().stream().?
    // Here I can iterate on each list,
    // but my best try only gives me a flat map for each key,
    // and I wouldn't know how to flatten it.
}).collect(Collectors.toMap(/* a String value */, /* an Integer key */));
I also gave flatMap a try, but I don't think it's the right way to go, because although it helps me get rid of the dimensionality issue, I lose the key in the process.
In a nutshell, my two questions are:

- Is it possible to use streams and lambda to achieve this?
- Is it useful (performance, readability) to do so?
You need to use flatMap to flatten the values into a new stream, but since you still need the original keys for collecting into a Map, you have to map to a temporary object holding both key and value, e.g.

Map<String, Integer> mapTo = mapFrom.entrySet().stream()
    .flatMap(e -> e.getValue().stream()
        .map(v -> new AbstractMap.SimpleImmutableEntry<>(e.getKey(), v)))
    .collect(Collectors.toMap(Map.Entry::getValue, Map.Entry::getKey));
The Map.Entry is a stand-in for the nonexistent tuple type; any other type capable of holding two objects of different types would be sufficient.
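As a side note, on Java 9 and later the static factory Map.entry(k, v) is one such option for the temporary pair, which shortens the pipeline a bit; a minimal sketch of the same idea:

Map<String, Integer> mapTo = mapFrom.entrySet().stream()
    .flatMap(e -> e.getValue().stream()
        .map(v -> Map.entry(e.getKey(), v)))   // Java 9+ immutable entry as the pair holder
    .collect(Collectors.toMap(Map.Entry::getValue, Map.Entry::getKey));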
An alternative not requiring these temporary objects is a custom collector:

Map<String, Integer> mapTo = mapFrom.entrySet().stream().collect(
    HashMap::new,
    (m, e) -> e.getValue().forEach(v -> m.put(v, e.getKey())),
    Map::putAll);
This differs from toMap in that it overwrites duplicate keys silently, whereas toMap without a merger function throws an exception if there is a duplicate key. Basically, this custom collector is a parallel-capable variant of

Map<String, Integer> mapTo = new HashMap<>();
mapFrom.forEach((k, l) -> l.forEach(v -> mapTo.put(v, k)));
But note that this task wouldn't benefit from parallel processing, not even with a large input map. Only if there were additional computationally intensive tasks within the stream pipeline that could benefit from SMP would there be a chance of gaining something from parallel streams. So perhaps the concise, sequential Collection API solution is preferable.
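If the assumption that the list values are unique ever stops holding, a merge function can be passed to toMap to choose the behavior explicitly instead of getting an exception. A minimal sketch; keeping the first key is just one illustrative policy:

Map<String, Integer> mapTo = mapFrom.entrySet().stream()
    .flatMap(e -> e.getValue().stream()
        .map(v -> new AbstractMap.SimpleImmutableEntry<>(e.getKey(), v)))
    .collect(Collectors.toMap(
        Map.Entry::getValue,
        Map.Entry::getKey,
        (first, second) -> first));  // on duplicate values, keep the key seen first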