I have built a DataFrame using concat, which produces a string column.
import sqlContext.implicits._
val df = s
You cannot convert it to a double because it is simply not a valid double representation. If you want an array, just use the array
function:
import org.apache.spark.sql.functions.array
df.select(array($"k", $"v").as("test"))
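For instance, assuming k and v are numeric columns (the frame and its values below are illustrative, not from the question), this builds the array directly from the numbers, with no string round trip at all:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.array

val spark = SparkSession.builder.master("local[*]").getOrCreate()
import spark.implicits._

// Hypothetical frame with two double columns:
val df = Seq((1.0, 2.0), (3.0, 4.0)).toDF("k", "v")

// array combines the columns element-wise into a single array<double> column:
val result = df.select(array($"k", $"v").as("test"))
result.printSchema() // the schema shows test as array of double
```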
You can also try to split the string and cast the result, but it is far from optimal:
import org.apache.spark.sql.types.{ArrayType, DoubleType}
import org.apache.spark.sql.functions.split
df.select(split($"test", ",").cast(ArrayType(DoubleType)))
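If you are stuck with the string column, here is a minimal end-to-end sketch of the split-and-cast route; the use of concat_ws to rebuild the string column and the sample values are my assumptions for illustration:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{concat_ws, split}
import org.apache.spark.sql.types.{ArrayType, DoubleType}

val spark = SparkSession.builder.master("local[*]").getOrCreate()
import spark.implicits._

// Hypothetical string column of the form "1.0,2.0", built with concat_ws:
val dfConcat = Seq((1.0, 2.0), (3.0, 4.0)).toDF("k", "v")
  .select(concat_ws(",", $"k", $"v").as("test"))

// Split on the separator, then cast array<string> to array<double>:
val parsed = dfConcat.select(
  split($"test", ",").cast(ArrayType(DoubleType)).as("test"))
```

Note that with a cast, any element that cannot be parsed as a double becomes null instead of raising an error, which is one reason this route is more fragile than building the array from the numeric columns directly.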