Passing HBase credentials in an Oozie Java action

Submitted by 两盒软妹~` on 2019-12-05 03:57:22

Question


I need to schedule an Oozie Java action that interacts with secured HBase, so I need to provide HBase credentials to the Java action. I am using a secured Hortonworks HDP 2.2 environment; my workflow XML is below:

<workflow-app xmlns="uri:oozie:workflow:0.4" name="solr-wf">
    <credentials>
        <credential name="hbase" type="hbase">
        </credential>
    </credentials>

    <start to="java-node"/>
    <action name="java-node" cred="hbase">
        <java>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <main-class>com.test.hbase.TestHBaseSecure</main-class>
            <arg>${arg1}</arg>
        </java>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Java failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>

I have also modified the Oozie property to include the HbaseCredentials class:

oozie.credentials.credentialclasses=hcat=org.apache.oozie.action.hadoop.HCatCredentials,hbase=org.apache.oozie.action.hadoop.HbaseCredentials
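
For reference, that property lives in oozie-site.xml (set through Ambari in this environment); the entry would look roughly like this:

<property>
    <name>oozie.credentials.credentialclasses</name>
    <value>hcat=org.apache.oozie.action.hadoop.HCatCredentials,hbase=org.apache.oozie.action.hadoop.HbaseCredentials</value>
</property>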

But I am not able to run the job; it throws an error. The stack trace is below:

java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
    at org.apache.oozie.action.hadoop.HbaseCredentials.copyHbaseConfToJobConf(HbaseCredentials.java:60)
    at org.apache.oozie.action.hadoop.HbaseCredentials.addtoJobConf(HbaseCredentials.java:49)
    at org.apache.oozie.action.hadoop.JavaActionExecutor.setCredentialTokens(JavaActionExecutor.java:1054)
    at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:913)
    at org.apache.oozie.action.hadoop.JavaActionExecutor.start(JavaActionExecutor.java:1135)
    at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:228)
    at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:63)
    at org.apache.oozie.command.XCommand.call(XCommand.java:281)
    at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:323)
    at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:252)
    at org.apache.oozie.service.CallableQueueService$CallableWrapper.run(CallableQueueService.java:174)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

Other jobs run fine; it is only the job with the HBase interaction that fails. I have included all of the HBase jars in my lib directory, and I am not able to figure out the issue.

Updated workflow.xml:

<workflow-app xmlns="uri:oozie:workflow:0.4" name="${appName}">
    <credentials>
        <credential name="hbase-cred" type="hbase">
            <property>
                <name>hbase.master.kerberos.principal</name>
                <value>hbase/_HOST@ABC.COM</value>
            </property>

            <property>
                <name>hbase.master.keytab.file</name>
                <value>/etc/security/keytabs/hbase.service.keytab</value>
            </property>

            <property>
                <name>hbase.regionserver.kerberos.principal</name>
                <value>hbase/_HOST@ABC.COM</value>
            </property>

            <property>
                <name>hbase.regionserver.keytab.file</name>
                <value>/etc/security/keytabs/hbase.service.keytab</value>
            </property>

            <property>
                <name>hbase.security.authentication</name>
                <value>kerberos</value>
            </property>

            <property>
                <name>hbase.zookeeper.quorum</name>
                <value>dev1-dn2,dev1-dn3,dev1-dn1</value>
            </property>

            <property>
                <name>zookeeper.znode.parent</name>
                <value>/hbase-secure</value>
            </property>
        </credential>
    </credentials>
    <start to="java-node" />
    <action name="java-node" cred='hbase-cred'>
        <java>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <main-class>com.test.hbase.TestHBaseSecure</main-class>
        </java>
        <ok to="end" />
        <error to="fail" />
    </action>
    <kill name="fail">
        <message>Java failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end" />
</workflow-app>
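
For illustration, a minimal Java action that reads from secured HBase could look like the sketch below; it relies on the delegation token that the Oozie credential is supposed to obtain, so it does no explicit Kerberos login itself (the table name and row key are placeholders, and the API is the 0.98-era HBase client bundled with HDP 2.2):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class TestHBaseSecure {
    public static void main(String[] args) throws Exception {
        // Picks up hbase-site.xml from the action's classpath; the HBase
        // delegation token injected by the Oozie credential is already
        // attached to the current user, so no kinit/keytab handling here.
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "some_table"); // placeholder table name
        try {
            Result row = table.get(new Get(Bytes.toBytes("some-row-key")));
            System.out.println("Row empty? " + row.isEmpty());
        } finally {
            table.close();
        }
    }
}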

Answer 1:


This solution was tested on HDP 2.2.8:

  1. Copy the following jars to /usr/hdp/current/oozie-server/oozie-server/webapps/oozie/WEB-INF/lib:

    • hbase-client-*-hadoop2.jar
    • hbase-common-*-hadoop2.jar
    • hbase-protocol-*-hadoop2.jar
    • hbase-server-*-hadoop2.jar
    • htrace-core-2.04.jar
  2. Restart the Oozie server.




Answer 2:


These "credentials" are managed by the Oozie service, not by your job.

So, if Hortonworks had done a decent job of packaging their distro...

  1. hbase-common-*-hadoop2.jar would have been deployed in /usr/hdp/current/oozie-client/libserver/ on installation
  2. that JAR would not conflict with other JARs over the definition of class org.apache.hadoop.conf.Configuration
  3. and in the end you would be able to manage HBase credentials in Oozie

That is not the case for HDP 2.2.4 as installed on our Prod cluster. Arghh. The damn thing is broken in that damn release. You've got to manage the Kerberos ticket all by yourself, downloading a keytab <file> from HDFS and creating the TGT before actually connecting to HBase. We've been there.
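
For illustration, a rough sketch of that do-it-yourself login is below; the principal, keytab file name and table are placeholders, and the keytab is assumed to have already been shipped next to the action (e.g. via a <file> element):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.security.UserGroupInformation;

public class ManualKerberosHBaseAction {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.security.authentication", "kerberos");

        // Obtain the TGT ourselves instead of relying on the Oozie credential;
        // "user.keytab" is expected in the container's working directory.
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab("myuser@ABC.COM", "user.keytab");

        HTable table = new HTable(conf, "some_table"); // placeholder table name
        try {
            table.get(new Get(Bytes.toBytes("some-row-key")));
        } finally {
            table.close();
        }
    }
}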

Have a look at that post for some insights about how it can be done.



Source: https://stackoverflow.com/questions/33212535/passing-hbase-credentials-in-oozie-java-action
