I wrote a Spark job which registers a temp table, and I expose it via beeline (JDBC client):
$ ./bin/beeline
beeline> !connect jdbc:hive2://IP:10003 -n
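For reference, a minimal sketch of what such a job can look like (assuming Spark 2.x with Hive support; the object name, input path, and table name are placeholders, and the port matches the JDBC URL above):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

object TempTableJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("TempTableJob")
      .config("hive.server2.thrift.port", "10003") // port used in the JDBC URL above; can also be passed via --hiveconf
      .enableHiveSupport()
      .getOrCreate()

    // Register a DataFrame as a temp table; it is only visible through this application's session
    spark.read.json("/path/to/input.json").createOrReplaceTempView("my_temp_table")

    // Start a Thrift server inside this application so beeline can query the temp table
    HiveThriftServer2.startWithContext(spark.sqlContext)
  }
}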
Enable impersonation (doAs) in your hive-site.xml:

<property>
  <name>hive.server2.enable.doAs</name>
  <value>true</value>
</property>
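With doAs enabled, HiveServer2 executes statements as the user passed on connect instead of the service user running the Thrift server, so that service user must be allowed to impersonate others. For example (endUser is a hypothetical end-user account):

beeline> !connect jdbc:hive2://IP:10003 -n endUser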
Also, if you want user ABC to be able to impersonate all users from all hosts (*), add the properties below to your core-site.xml:
<property>
  <name>hadoop.proxyuser.ABC.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.ABC.hosts</name>
  <value>*</value>
</property>
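If the cluster is already running, these proxy-user settings can usually be applied without a full restart by refreshing the NameNode and ResourceManager (assuming a standard HDFS/YARN deployment):

$ hdfs dfsadmin -refreshSuperUserGroupsConfiguration
$ yarn rmadmin -refreshSuperUserGroupsConfiguration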