Your code looks overly complicated and the relevant parts are missing. If possible, please post the complete snippet, including the retrieval/type of the rows, so we get the complete picture and can try to help.
As a first simplification you can convert aMap to Seq[(String, (String, String))] and then flatMap it to flatten the nested tuples into a Seq[String], which you can pass to toDF via varargs expansion.
val colNames: Seq[String] = aMap.toSeq.flatMap(kv => Seq(kv._1, kv._2._1, kv._2._2))
Depending on the actual type of aMap this may not compile as-is; here we assume it to be Map[String, (String, String)].
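A minimal runnable sketch of that flattening, assuming aMap really is a Map[String, (String, String)] with the single entry shown later in this thread:

```scala
// Assumed shape from the thread: key -> (description, comments)
val aMap: Map[String, (String, String)] = Map("admit" -> ("description", "comments"))

// flatMap each (key, (v1, v2)) entry into three plain strings
val colNames: Seq[String] = aMap.toSeq.flatMap { case (k, (v1, v2)) => Seq(k, v1, v2) }
```

These names can then be passed on via varargs expansion, e.g. df.toDF(colNames: _*) inside a Spark session.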
Best Regards
I am getting the table schema through a Map, which I have converted to a Seq and am passing to toDF.
It's not really a Spark question: .toDF() takes column names as Strings.
Something like atrb.head.productIterator.toSeq.map(_.toString)? But it's not clear what you mean the column names to be.
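That suggestion amounts to the following sketch; atrb mirrors the buffer built in the original post below:

```scala
import scala.collection.mutable.ListBuffer

// Rebuild the buffer of (key, description, comments) tuples from the post
val atrb = ListBuffer(("admit", "description", "comments"))

// productIterator erases the tuple's element types, so this is a Seq[Any] ...
val asAny: Seq[Any] = atrb.head.productIterator.toSeq

// ... and mapping toString over it recovers the Seq[String] that toDF expects
val newCol: Seq[String] = asAny.map(_.toString)
```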
Hi,
Can someone please help me convert a Seq[Any] to a Seq[String]?
For the line
val df = row.toSeq.toDF(newCol.toSeq: _*)
I get a type mismatch error.
I converted the Map
val aMap = Map("admit" -> ("description", "comments"))
to a Seq:
import scala.collection.mutable.ListBuffer
var atrb = ListBuffer[(String, String, String)]()
for ((key, value) <- aMap) {
  atrb += ((key, value._1, value._2))
}
var newCol = atrb.head.productIterator.toList
Please someone help me on this.
Thanks
--
Roland Johann
Data Architect/Data Engineer
phenetic GmbH
Lütticher Straße 10, 50674 Köln, Germany
Mobil: +49 172 365 26 46
Mail: [hidden email]
Web: phenetic.io
Handelsregister: Amtsgericht Köln (HRB 92595)
Geschäftsführer: Roland Johann, Uwe Reimann