I have a DataFrame with one column, where each row of that column holds an array of string values. The values in my Spark 2.2 DataFrame look like:

["123", "abc", "2017", "ABC"]

How can I split each element of the array into its own column?
You can do something like the example below:
import org.apache.spark.sql.functions._
import spark.implicits._  // needed for toDF and the 'col / $"col" syntax (assumes a SparkSession named spark)

val ds = Seq(
  Array("123", "abc", "2017", "ABC"),
  Array("456", "def", "2001", "ABC"),
  Array("789", "ghi", "2017", "DEF")).toDF("col")

// element_at uses 1-based indexing into the array column
ds.withColumn("col1", element_at('col, 1))
  .withColumn("col2", element_at('col, 2))
  .withColumn("col3", element_at('col, 3))
  .withColumn("col4", element_at('col, 4))
  .drop('col)
  .show()
+----+----+----+----+
|col1|col2|col3|col4|
+----+----+----+----+
| 123| abc|2017| ABC|
| 456| def|2001| ABC|
| 789| ghi|2017| DEF|
+----+----+----+----+
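Note that element_at was only added to org.apache.spark.sql.functions in Spark 2.4, so it is not available on Spark 2.2. On 2.2 you can get the same result with Column.getItem, which uses 0-based indexing; a minimal sketch, reusing the ds DataFrame from above:

// Spark 2.2-compatible alternative: getItem is 0-based
ds.withColumn("col1", $"col".getItem(0))
  .withColumn("col2", $"col".getItem(1))
  .withColumn("col3", $"col".getItem(2))
  .withColumn("col4", $"col".getItem(3))
  .drop("col")
  .show()

This produces the same output as shown above.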