
Join two DataFrames without a common column (Spark, Scala)

纵然是瞬间 submitted on 2019-12-01 13:49:11
I have two DataFrames with different sets of columns, and I need to join them. For example:

df1 has the columns Customer_name, Customer_phone, Customer_age
df2 has the columns Order_name, Order_ID

The two DataFrames have no common column, and they also differ in their numbers of rows and columns. I tried to add a dummy row-index column as follows:

    val dfr = df1.withColumn("row_index", monotonically_increasing_id())

But since I am using Spark 2, the monotonically_increasing_id method is not working for me. Is there any way to join these two DataFrames?
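Since the question is really about aligning rows by position, one common workaround (an assumption on my part, not something stated in the question) is to give each row an explicit consecutive index and join on it. In Spark this is typically done with `df.rdd.zipWithIndex` rather than `monotonically_increasing_id`, because the latter produces unique but non-consecutive values, so the two index columns would not line up. The core idea can be sketched on plain Scala collections standing in for the DataFrames:

```scala
// Sketch of a positional join, with plain Scala Seqs standing in for DataFrames.
// In Spark the analogous steps would be df.rdd.zipWithIndex on each side,
// followed by a join on the generated index; the data here is made up.
object ZipJoinSketch {
  def positionalJoin[A, B](left: Seq[A], right: Seq[B]): Seq[(A, B)] = {
    // zipWithIndex plays the role of the synthetic row_index column
    val indexedLeft  = left.zipWithIndex.map { case (row, i) => (i, row) }
    val rightByIndex = right.zipWithIndex.map { case (row, i) => (i, row) }.toMap

    // inner join on the index: rows without a partner on the other side are dropped
    indexedLeft.flatMap { case (i, l) => rightByIndex.get(i).map(r => (l, r)) }
  }

  def main(args: Array[String]): Unit = {
    val customers = Seq(("Alice", "555-0100", 30), ("Bob", "555-0101", 42))
    val orders    = Seq(("Book", "ORD-1"), ("Pen", "ORD-2"))
    positionalJoin(customers, orders).foreach(println)
  }
}
```

Note that a positional join is only meaningful if the row order of both sides is deterministic; in a distributed Spark job that usually means sorting both DataFrames explicitly before indexing them.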