SPARK - How to force error on sc.parallelize


question:

This statement gives the right result no matter how the parallelization is specified. Why does it give the correct result?

Reading a big file, or using a mapPartitions approach, results in a minor loss of accuracy (see the sketch after the example below), so why not here? It must be something simple, but I cannot see it.

val rdd = sc.parallelize(Array("a", "b", "c", "d", "e", "f"), 5)
rdd.sliding(2).collect()
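
Presumably the "mapPartitions approach" meant here is a hand-rolled, per-partition sliding window along the lines of the sketch below (this exact formulation is my assumption, not code from the question). Because each partition is windowed on its own, any pair that straddles a partition boundary is silently dropped:

// Hypothetical sketch of a naive per-partition sliding window.
// Each partition is windowed independently, so pairs that straddle a
// partition boundary are dropped -- the "minor loss of accuracy" above.
val naive = rdd.mapPartitions(_.sliding(2).withPartial(false).map(_.toArray))
naive.collect()   // fewer pairs than rdd.sliding(2) when numSlices > 1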

"Reading a big file, or using a mapPartitions approach, results in a minor loss of accuracy"

It won't. The result is exact, independent of the source or of the number of partitions: sliding carries the leading elements of each partition over to the preceding partition, so windows that span partition boundaries are still produced.
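
A quick way to convince yourself, assuming spark-mllib is on the classpath (sliding comes from org.apache.spark.mllib.rdd.RDDFunctions): run the same data through sliding with different numbers of partitions and compare.

import org.apache.spark.mllib.rdd.RDDFunctions._

val data = Array("a", "b", "c", "d", "e", "f")
val one  = sc.parallelize(data, 1).sliding(2).collect()
val five = sc.parallelize(data, 5).sliding(2).collect()

// Both yield Array(a, b), Array(b, c), ..., Array(e, f): no window that
// spans a partition boundary is lost.
one.map(_.mkString(",")).sameElements(five.map(_.mkString(",")))   // true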

