I created a Spark job that registers a temporary table, and when I expose it via beeline (JDBC client):
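For context, a minimal sketch of how such a job is typically exposed (the port 10003 matches the connect string below; the script path and `--master yarn` are assumptions based on a standard Spark distribution):

```shell
# Start the Spark Thrift Server (HiveServer2-compatible JDBC endpoint)
# on the non-default port 10003 used in the beeline connect string below.
./sbin/start-thriftserver.sh \
  --hiveconf hive.server2.thrift.port=10003 \
  --master yarn
```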
$ ./bin/beeline
beeline> !connect jdbc:hive2://IP:10003 -n ram -p xxxx
0: jdbc:hive2://IP> show tables;
+------------+--------------+
| tableName  | isTemporary  |
+------------+--------------+
| f238       | true         |
+------------+--------------+
2 rows selected (0.309 seconds)
0: jdbc:hive2://IP>
I can see the table. But when I run a query against it, I get this error message:
0: jdbc:hive2://IP> select * from f238;
Error: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: ram is not allowed to impersonate ram (state=,code=0)
0: jdbc:hive2://IP>
I have this in hive-site.xml:
<property>
<name>hive.metastore.sasl.enabled</name>
<value>false</value>
<description>If true, the metastore Thrift interface will be secured with SASL. Clients must authenticate with Kerberos.</description>
</property>
<property>
<name>hive.server2.enable.doAs</name>
<value>false</value>
</property>
<property>
<name>hive.server2.authentication</name>
<value>NONE</value>
</property>
And I have this in core-site.xml:
<property>
<name>hadoop.proxyuser.hive.groups</name>
<value>*</value>
</property>
<property>
<name>hadoop.proxyuser.hive.hosts</name>
<value>*</value>
</property>
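As a sanity check (assuming the Hadoop client tools are on the PATH), you can print which proxyuser values the configuration actually resolves to; note that the NameNode and ResourceManager must also have picked up any changes to core-site.xml:

```shell
# Print the effective proxyuser settings as resolved from core-site.xml
hdfs getconf -confKey hadoop.proxyuser.hive.hosts
hdfs getconf -confKey hadoop.proxyuser.hive.groups
```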
Full log:
ERROR [pool-19-thread-2] thriftserver.SparkExecuteStatementOperation: Error running hive query:
org.apache.hive.service.cli.HiveSQLException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: ram is not allowed to impersonate ram
at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.runInternal(SparkExecuteStatementOperation.scala:259)
at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:171)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1.run(SparkExecuteStatementOperation.scala:182)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Which configuration am I missing?
Check that hive.server2.enable.doAs is set to true in hive-site.xml:
<property>
<name>hive.server2.enable.doAs</name>
<value>true</value>
</property>
Also, if you want a user ABC to be able to impersonate all (*), add the below properties to your core-site.xml:
<property>
<name>hadoop.proxyuser.ABC.groups</name>
<value>*</value>
</property>
<property>
<name>hadoop.proxyuser.ABC.hosts</name>
<value>*</value>
</property>
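After editing core-site.xml, restarting the affected daemons is the safe route; alternatively (assuming HDFS and YARN admin access), the proxyuser settings can be reloaded without a full restart:

```shell
# Reload proxyuser (superuser group) settings on the NameNode and ResourceManager
hdfs dfsadmin -refreshSuperUserGroupsConfiguration
yarn rmadmin -refreshSuperUserGroupsConfiguration
```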