SPARK - How to force error on sc.parallelize -


question:

This statement gives the right result no matter how the parallelization is specified. Why is the result always correct?

Reading a big file, or using a mapPartitions approach, results in a minor loss of accuracy, so why not here? It must be something simple, but I cannot see it.

val rdd = sc.parallelize(Array("a", "b", "c", "d", "e", "f"), 5)
rdd.sliding(2).collect()

Reading a big file, or using a mapPartitions approach, results in a minor loss of accuracy,

It won't. `sliding` explicitly handles partition boundaries: each partition shares its leading elements with the preceding partition, so windows that span two partitions are reconstructed. The result is exact regardless of the data source or the number of partitions.
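A minimal sketch illustrating this, assuming a running SparkContext `sc` (in Spark, `sliding` on an RDD is provided by the MLlib RDD helpers):

```scala
import org.apache.spark.mllib.rdd.RDDFunctions._  // provides RDD.sliding

// The same data, partitioned three different ways.
val data = Array("a", "b", "c", "d", "e", "f")
val results = Seq(1, 3, 5).map { numPartitions =>
  sc.parallelize(data, numPartitions)
    .sliding(2)        // windows of 2 consecutive elements
    .collect()
    .map(_.mkString)   // each window rendered as a string, e.g. "ab"
    .toSeq
}

// All partitionings yield the identical window sequence.
assert(results.distinct.size == 1)
```

Whatever the partition count, every run produces the same windows ("ab", "bc", "cd", "de", "ef"), because the boundary windows are rebuilt from elements exchanged between neighboring partitions.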

