How can I solve nested RDD in Spark

val combinations: Array[Array[(Int, Int)]] = Array(Array((1953,1307), (1953,527), (1953,1272), (1953,1387), (1953,318)), Array(( ...))...)

import org.jblas.DoubleMatrix

val simOnly = combinations.map { x => x.map { case (item_1, item_2) =>
  val itemFactor_1 = modelMLlib.productFeatures.lookup(item_1).head
  val itemFactor_2 = modelMLlib.productFeatures.lookup(item_2).head
  val itemVector_1 = new DoubleMatrix(itemFactor_1)
  val itemVector_2 = new DoubleMatrix(itemFactor_2)
  cosineSimilarity(itemVector_1, itemVector_2)
}}

This is my code for computing the cosine similarity between pairs of items.

The problem is that Apache Spark does not support nested RDDs: you cannot call an RDD operation such as productFeatures.lookup from inside a transformation on another RDD. How can I solve this properly? Please help me!
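A common workaround (not from the original post, just a sketch) is to avoid calling `lookup` on the `productFeatures` RDD inside another RDD operation: collect the factors into a driver-side map first, and broadcast it if the outer collection is itself an RDD. The Spark-specific calls are shown as comments so the arithmetic below runs on its own; `modelMLlib` and `sc` are the names assumed from the post, and the feature values here are made-up stand-ins:

```scala
object SimSketch {
  // Plain-array cosine similarity, standing in for the jblas
  // DoubleMatrix-based helper used in the post.
  def cosineSimilarity(a: Array[Double], b: Array[Double]): Double = {
    val dot   = a.zip(b).map { case (x, y) => x * y }.sum
    val normA = math.sqrt(a.map(x => x * x).sum)
    val normB = math.sqrt(b.map(x => x * x).sum)
    dot / (normA * normB)
  }

  def main(args: Array[String]): Unit = {
    // In Spark (assumed API names from MLlib's MatrixFactorizationModel):
    //   val features   = modelMLlib.productFeatures.collectAsMap()
    //   val featuresBC = sc.broadcast(features)
    // Then, inside any RDD transformation, read featuresBC.value(itemId)
    // instead of calling productFeatures.lookup(itemId).
    //
    // Stand-in feature map so this sketch runs without a Spark cluster:
    val features: Map[Int, Array[Double]] = Map(
      1953 -> Array(1.0, 0.0),
      1307 -> Array(1.0, 0.0),
      527  -> Array(0.0, 1.0)
    )

    val combinations = Array(Array((1953, 1307), (1953, 527)))

    // Plain Scala collections here; if combinations were an RDD,
    // the same map body would use featuresBC.value.
    val simOnly = combinations.map(_.map { case (item1, item2) =>
      (item1, item2, cosineSimilarity(features(item1), features(item2)))
    })

    simOnly.flatten.foreach { case (a, b, s) => println(s"$a,$b,$s") }
  }
}
```

This works when the factor matrix fits in driver memory, which is usually the case for the product side of an ALS model; if it does not, a join between a pair RDD of (itemId, itemId) and `productFeatures` is the alternative.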