How to capture a hive exit status or error code using JDBC API

旧巷少年郎 2021-01-26 16:40

I am executing an insert query in Hive using the JDBC API, but the query is not running. Could someone suggest what is going wrong? Also, please let me know how to capture the error.
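At the plain JDBC level, the driver reports failures by throwing `java.sql.SQLException`, which carries a vendor-specific error code and a five-character SQLSTATE. A minimal sketch (the class name `CaptureSqlError` and the simulated state/code values are illustrative, not from any real HiveServer2 response):

```java
import java.sql.SQLException;

public class CaptureSqlError {
    // Wrap any statement.execute(...) / executeUpdate(...) call in
    // try { ... } catch (SQLException e) { describe(e); } to capture this.
    static String describe(SQLException e) {
        // getErrorCode() is the vendor-specific code, getSQLState() the
        // standard SQLSTATE; chained errors are reachable via getNextException().
        return "state=" + e.getSQLState()
             + " code=" + e.getErrorCode()
             + " msg=" + e.getMessage();
    }

    public static void main(String[] args) {
        // Simulated failure (no live HiveServer2 here); the Hive JDBC driver
        // throws SQLException with these same fields populated.
        SQLException e =
            new SQLException("Error while compiling statement", "42000", 40000);
        System.out.println(describe(e));
        // -> state=42000 code=40000 msg=Error while compiling statement
    }
}
```

Note that if `execute()` returns without throwing, JDBC considers the statement successful; for more detail on *why* a Hive query failed you need the operation logs described in the second answer.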

2 answers
  • I think there are still some issues with Hive's insert/update/delete functionality in Hive 1.2.1, and you seem to be on an even earlier Hive 0.14, so I wouldn't recommend using these features yet. Please see a similar issue reported by other users:

    hive 1.2.1 error on delete command

  • 2021-01-26 17:20

    Too bad you are stuck with Hive 0.13, because...

    Starting with Hive 0.14.0, HiveServer2 operation logs are available for Beeline clients. https://cwiki.apache.org/confluence/display/Hive/HiveServer2+Clients#HiveServer2Clients-HiveServer2Logging
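    For reference, server-side dispatch of these operation logs is governed by a couple of HiveServer2 properties (names taken from that wiki page; check the values and defaults against your Hive version, since they changed across releases). A hive-site.xml sketch:

    ```xml
    <!-- hive-site.xml on the HiveServer2 host -->
    <property>
      <!-- master switch for per-operation log retrieval by clients -->
      <name>hive.server2.logging.operation.enabled</name>
      <value>true</value>
    </property>
    <property>
      <!-- verbosity: NONE, EXECUTION, PERFORMANCE, or VERBOSE (Hive 1.2+) -->
      <name>hive.server2.logging.operation.level</name>
      <value>VERBOSE</value>
    </property>
    ```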

    Once the log dispatch is activated server-side, you can retrieve these log entries from your Java code -- either asynchronously, or en masse when execution is over, with something like...

      private static void DumpHiveMessages(java.sql.Statement stmtGeneric)
      {
        try
        {
          // getQueryLog() / hasMoreLogs() are Hive-specific extensions,
          // so we must downcast from the generic JDBC Statement
          org.apache.hive.jdbc.HiveStatement stmtExtended =
              (org.apache.hive.jdbc.HiveStatement) stmtGeneric;
          for (String sLogMessage : stmtExtended.getQueryLog())
          { JustTraceIt("HIVE SAYS>" + sLogMessage); }
          if (stmtExtended.hasMoreLogs())
          { JustTraceIt("WARNING>(...log stream still open...)"); }
        }
        catch (Exception duh)
        { JustTraceIt("WARNING>Error while accessing Hive log stream");
          JustTraceIt("WARNING>" + MakeSenseOfDirtyHadoopException(duh));
        }
      }
    

    That stuff is not really documented, but the source code for HiveStatement shows several non-JDBC-standard methods such as getQueryLog and hasMoreLogs -- plus getYarnATSGuid for Hive 2+ and more for Hive 3+.
    Here is the link to the "master" branch on GitHub; switch to whichever version you are actually using (possibly an old 1.2 for compatibility with Spark).
