SPARK - How to force an error on sc.parallelize
question:
This statement gives the right result no matter how the parallelization is specified. Why is the result always correct?
Reading a big file, or a mapPartitions approach, can result in a minor loss of accuracy at partition boundaries, so why not here? It must be something simple, but I cannot see it.
import org.apache.spark.mllib.rdd.RDDFunctions._

val rdd = sc.parallelize(Array("a", "b", "c", "d", "e", "f"), 5)
rdd.sliding(2).collect()
It won't lose anything here. The result is exact, independent of the source and of the number of partitions. sliding comes from org.apache.spark.mllib.rdd.RDDFunctions, and its underlying SlidingRDD handles partition boundaries itself: each partition is effectively extended with the first windowSize - 1 elements of the following partition before windowing, so windows that straddle a boundary are still produced exactly once.
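To see the difference between naive per-partition windowing and boundary-aware windowing, here is a plain-Python sketch (not Spark code; partitions are simulated with lists, and the function names `naive_sliding` and `boundary_aware_sliding` are hypothetical). It assumes, for simplicity, that every non-final partition is non-empty and that the window head needed from the next partition fits in that one partition; the real SlidingRDD handles the general case.

```python
# Plain-Python simulation (hypothetical, not the Spark API) of why
# sliding(2) stays exact across partition boundaries.
data = ["a", "b", "c", "d", "e", "f"]

def partition(xs, n):
    """Split xs into n roughly equal partitions, like parallelize(xs, n)."""
    k, r = divmod(len(xs), n)
    parts, i = [], 0
    for p in range(n):
        size = k + (1 if p < r else 0)
        parts.append(xs[i:i + size])
        i += size
    return parts

def naive_sliding(parts, w):
    """Window each partition independently (mapPartitions-style):
    windows that straddle a boundary are silently lost."""
    out = []
    for p in parts:
        out += [tuple(p[i:i + w]) for i in range(len(p) - w + 1)]
    return out

def boundary_aware_sliding(parts, w):
    """Extend each partition with the first w-1 elements of the next
    partition before windowing, as SlidingRDD effectively does."""
    out = []
    for idx, p in enumerate(parts):
        head = parts[idx + 1][:w - 1] if idx + 1 < len(parts) else []
        ext = p + head
        for i in range(len(p)):          # windows START inside this partition
            if i + w <= len(ext):        # ...and must fit in the extension
                out.append(tuple(ext[i:i + w]))
    return out
```

With 5 partitions of 6 elements, the naive version keeps only the one window that falls entirely inside a partition, while the boundary-aware version reproduces all five windows of the flat sequence, matching what sliding(2).collect() returns.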