Custom line/record delimiter

Custom line/record delimiter

sk skk
Hi,

Is there an option to write a CSV or text file with a custom record/line separator through Spark?

I could not find any reference in the API. I am having an issue while loading data into a warehouse: one of the columns in the CSV contains a newline character, and the warehouse will not let me escape that newline character.

Thank you,
Sk
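
For illustration, here is a minimal sketch in Scala of the situation described above; the data, column names, and output paths are made up. By default Spark's CSV writer quotes a field that contains a newline, and if the downstream warehouse cannot cope with embedded newlines at all, one common workaround is to replace them before writing:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.regexp_replace

    val spark = SparkSession.builder().appName("csv-newline-demo").getOrCreate()
    import spark.implicits._

    // Hypothetical data: the "comment" column contains an embedded newline.
    val df = Seq(
      (1, "line one\nline two"),
      (2, "no newline here")
    ).toDF("id", "comment")

    // Default CSV write: the multi-line value is wrapped in quotes, which
    // some warehouse loaders cannot parse as a single record.
    df.write.mode("overwrite").csv("/tmp/with_newlines")

    // Possible workaround: replace embedded newlines before writing so that
    // every record stays on one physical line.
    df.withColumn("comment", regexp_replace($"comment", "[\\r\\n]+", " "))
      .write.mode("overwrite").csv("/tmp/without_newlines")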

Re: Custom line/record delimiter

Hyukjin Kwon
Hi, 


There's a PR - https://github.com/apache/spark/pull/18581 - and a JIRA - SPARK-21289 - for this.
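
If that change is available in your Spark version, my understanding is that it exposes a lineSep write option for text-based sources. A rough, hypothetical sketch (option name taken from the PR; separator and path made up):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("lineSep-demo").getOrCreate()
    import spark.implicits._

    // Hypothetical: whether the lineSep write option exists depends on
    // whether the change from SPARK-21289 has landed in your Spark version.
    val df = Seq("record one", "record two").toDF("value")

    df.write
      .mode("overwrite")
      .option("lineSep", "\u0001")   // custom record separator instead of "\n"
      .text("/tmp/custom_delimiter_output")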

Alternatively, you could check out the multiLine option for CSV and see if it is applicable.
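
A minimal sketch of that option on the read side, assuming Spark 2.2 or later; the path is hypothetical. With multiLine enabled, the CSV parser allows quoted fields to span multiple physical lines:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("multiLine-demo").getOrCreate()

    // Read back a CSV in which some quoted fields contain newlines.
    val readBack = spark.read
      .option("header", "false")
      .option("multiLine", "true")   // allow quoted fields to span lines (Spark 2.2+)
      .csv("/tmp/with_newlines")

    readBack.show(truncate = false)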


Thanks.




Re: Custom line/record delimiter

sk skk
Thanks for the update, Kwon.

Regards,

