Question
I am using Spark 1.6 with Scala.
I created an index in Elasticsearch containing an object. The object "params" was created as a Map[String, Map[String, String]]. Example:
val params: Map[String, Map[String, String]] = Map(
  "p1" -> Map("p1_detail" -> "table1"),
  "p2" -> Map("p2_detail" -> "table2", "p2_filter" -> "filter2"),
  "p3" -> Map("p3_detail" -> "table3")
)
That gives me records that look like the following:
{
  "_index": "x",
  "_type": "1",
  "_id": "xxxxxxxxxxxx",
  "_score": 1,
  "_timestamp": 1506537199650,
  "_source": {
    "a": "toto",
    "b": "tata",
    "c": "description",
    "params": {
      "p1": {
        "p1_detail": "table1"
      },
      "p2": {
        "p2_detail": "table2",
        "p2_filter": "filter2"
      },
      "p3": {
        "p3_detail": "table3"
      }
    }
  }
}
Then I am trying to read the Elasticsearch index in order to update the values.
Spark reads the index with the following schema:
|-- a: string (nullable = true)
|-- b: string (nullable = true)
|-- c: string (nullable = true)
|-- params: struct (nullable = true)
| |-- p1: struct (nullable = true)
| | |-- p1_detail: string (nullable = true)
| |-- p2: struct (nullable = true)
| | |-- p2_detail: string (nullable = true)
| | |-- p2_filter: string (nullable = true)
| |-- p3: struct (nullable = true)
| | |-- p3_detail: string (nullable = true)
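For reference, a DataFrame with this schema can be obtained along these lines (a sketch assuming the elasticsearch-spark connector is on the classpath and a Spark 1.6 sqlContext is in scope; the index/type names are taken from the record above):
import org.apache.spark.sql.DataFrame

// Read the "x" index (type "1") through the ES-Hadoop connector.
val df: DataFrame = sqlContext.read
  .format("org.elasticsearch.spark.sql")
  .load("x/1")
df.printSchema()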
My problem is that the object is read back as a struct. Since I am not very familiar with StructType, I would like to have a Map instead so I can manage and update the fields easily.
I tried to receive the object in a UDF as a Map, but I get the following error:
User class threw exception: org.apache.spark.sql.AnalysisException: cannot resolve 'UDF(params)' due to data type mismatch: argument 1 requires map<string,map<string,string>> type, however, 'params' is of struct<p1:struct<p1_detail:string>,p2:struct<p2_detail:string,p2_filter:string>,p3:struct<p3_detail:string>> type.;
UDF code snippet:
val getSubField: Map[String, Map[String, String]] => String =
  (params: Map[String, Map[String, String]]) => {
    val return_string = params("p1").getOrElse("p1_detail", null.asInstanceOf[String])
    return_string
  }
My question: how can I convert this struct to a Map? I already saw the toMap method in the documentation, but I cannot figure out how to use it (I am not familiar with implicit parameters) as I am a Scala beginner.
Thanks in advance,
Answer 1:
I finally solved it as follows:
import org.apache.spark.sql.Row
import org.apache.spark.sql.functions.udf

/* Converts a Row to a Map, skipping null fields */
def convertRowToMap[T](row: Row): Map[String, T] = {
  row.schema.fieldNames
    .filter(field => !row.isNullAt(row.fieldIndex(field)))
    .map(field => field -> row.getAs[T](field))
    .toMap
}

/* UDF that converts the nested Row to a Map[String, Map[String, String]] */
val rowToMap: Row => Map[String, Map[String, String]] = (row: Row) => {
  val map_temp = convertRowToMap[Row](row)
  val map_to_return = map_temp.map { case (k, v) => k -> convertRowToMap[String](v) }
  map_to_return
}

val udfrowToMap = udf(rowToMap)
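A minimal usage sketch (here df is assumed to be the DataFrame read from the Elasticsearch index; the column name comes from the question):
import org.apache.spark.sql.functions.col

// Replace the struct column with its Map equivalent.
val dfWithMap = df.withColumn("params", udfrowToMap(col("params")))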
Answer 2:
You can't declare the UDF parameter's type from the StructType object; declare it as Row instead.
// Schema of the "params" column
def schema: StructType = (new StructType)
  .add("p1", (new StructType).add("p1_detail", StringType))
  .add("p2", (new StructType).add("p2_detail", StringType).add("p2_filter", StringType))
  .add("p3", (new StructType).add("p3_detail", StringType))

// Not allowed: a schema value cannot be used as the parameter's type
val extractVal: schema => collection.Map[Nothing, Nothing] = _.getMap(0)
Solution:
// UDF example that processes a struct column by taking it as a Row
val extractVal: (Row) => collection.Map[Nothing, Nothing] = _.getMap(0)

// You would implement something similar
val getSubField: Row => String =
  (params: Row) => {
    val p1 = params.getAs[Row]("p1")
    .........
    null
  }
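For completeness, here is a runnable variant of that sketch (hypothetical names; df is assumed to be the DataFrame read from the index, and the field names come from the question's schema):
import org.apache.spark.sql.Row
import org.apache.spark.sql.functions.{col, udf}

// Extract "p1_detail" from the nested struct, returning null when the
// intermediate struct is absent.
val getP1Detail: Row => String = (params: Row) => {
  val p1 = params.getAs[Row]("p1")
  if (p1 != null) p1.getAs[String]("p1_detail") else null
}

val getP1DetailUdf = udf(getP1Detail)
val withDetail = df.withColumn("p1_detail", getP1DetailUdf(col("params")))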
I hope this helps!
Source: https://stackoverflow.com/questions/46566374/spark-scala-nested-structtype-conversion-to-map