# Convert Seq[Any] to Seq[String]

## Convert Seq[Any] to Seq[String]

Hi,

Can someone please help me convert a Seq[Any] to a Seq[String]? For the line

```scala
val df = row.toSeq.toDF(newCol.toSeq: _*)
```

I get that error message. I converted the Map

```scala
val aMap = Map("admit" -> ("description", "comments"))
```

to a Seq:

```scala
var atrb = ListBuffer[(String, String, String)]()
for ((key, value) <- aMap) {
  atrb += ((key, value._1, value._2))
}
var newCol = atrb.head.productIterator.toList.toSeq
```

Please, someone help me with this.

Thanks
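For reference, the type error comes from `productIterator`, which is typed `Iterator[Any]`, so `newCol` ends up as a `Seq[Any]` while `toDF` expects `String` column names. A minimal pure-Scala sketch (no Spark needed) reproducing the types:

```scala
import scala.collection.mutable.ListBuffer

val aMap = Map("admit" -> ("description", "comments"))

var atrb = ListBuffer[(String, String, String)]()
for ((key, value) <- aMap) {
  atrb += ((key, value._1, value._2))
}

// productIterator is Iterator[Any], so the result is statically a Seq[Any],
// even though every element happens to be a String at runtime.
val newCol: Seq[Any] = atrb.head.productIterator.toList
```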

## Re: Convert Seq[Any] to Seq[String]

It's not really a Spark question. `.toDF()` takes column names. `atrb.head.toSeq.map(_.toString)`? But it's not clear what you mean the column names to be.

On Fri, Dec 18, 2020 at 8:37 AM Vikas Garg <[hidden email]> wrote:
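The idea in the reply above, mapping each element to its string form, can be sketched like this (using `productIterator`, since `Tuple3` itself has no `toSeq` in Scala 2):

```scala
// A tuple like the head of atrb in the original question.
val atrbHead: (String, String, String) = ("admit", "description", "comments")

// Convert the tuple's elements (statically Any) into Strings explicitly.
val newCol: Seq[String] = atrbHead.productIterator.map(_.toString).toList
```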
## Re: Convert Seq[Any] to Seq[String]

I am getting the table schema through a Map, which I have converted to a Seq and am passing to toDF.

On Fri, 18 Dec 2020 at 20:13, Sean Owen <[hidden email]> wrote:
## Re: Convert Seq[Any] to Seq[String]

Your code looks overly complicated, and the relevant parts are missing. If possible, please post the complete snippet, including the retrieval and type of the rows, so we get the complete picture and can try to help.

As a first simplification, you can convert aMap to a Seq[(String, (String, String))] and flat-map it to flatten the nested tuple into a Seq, which you then pass to toDF via var-arg expansion:

```scala
val colNames: Seq[String] = aMap.toSeq.flatMap(kv => Seq(kv._1, kv._2._1, kv._2._2))
```

Depending on the actual type of aMap this may lead to problems, as we assume it to be Map[String, (String, String)].

Best Regards

Vikas Garg <[hidden email]> wrote on Fri, 18 Dec 2020 at 15:46:

--
Roland Johann
Data Architect/Data Engineer
phenetic GmbH
Lütticher Straße 10, 50674 Köln, Germany
Mobil: +49 172 365 26 46
Mail: [hidden email]
Web: phenetic.io
Handelsregister: Amtsgericht Köln (HRB 92595)
Geschäftsführer: Roland Johann, Uwe Reimann
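The flatMap suggestion above can be run end to end without Spark (assuming, as the reply does, that aMap has type Map[String, (String, String)]):

```scala
val aMap: Map[String, (String, String)] = Map("admit" -> ("description", "comments"))

// Flatten each (key, (v1, v2)) entry into three column-name strings.
val colNames: Seq[String] = aMap.toSeq.flatMap { case (k, (v1, v2)) => Seq(k, v1, v2) }

// In Spark, colNames could then be passed as row.toSeq.toDF(colNames: _*).
```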