How can I retrieve the alias for a DataFrame in Spark

Asked by 北海茫月 on 2021-01-15 17:13

I'm using Spark 2.0.2. I have a DataFrame that has an alias on it, and I'd like to be able to retrieve that. A simplified example of why I'd want that is below.

3 Answers
  •  Answered by 囚心锁ツ on 2021-01-15 18:06

    You can try something like this, but I wouldn't go so far as to claim it is supported. It inspects the analyzed logical plan, which is internal API and changed shape between versions:

    • Spark < 2.1:

      import org.apache.spark.sql.catalyst.plans.logical.SubqueryAlias
      import org.apache.spark.sql.Dataset
      
      def getAlias(ds: Dataset[_]) = ds.queryExecution.analyzed match {
        case SubqueryAlias(alias, _) => Some(alias)
        case _ => None
      }
      
    • Spark 2.1+:

      def getAlias(ds: Dataset[_]) = ds.queryExecution.analyzed match {
        case SubqueryAlias(alias, _, _) => Some(alias)
        case _ => None
      }
      

    Example usage:

      val plain = Seq((1, "foo")).toDF
      getAlias(plain)
      // Option[String] = None

      val aliased = plain.alias("a dataset")
      getAlias(aliased)
      // Option[String] = Some(a dataset)

