apache/submarine

Operation not allowed when using spark-security in my Thrift server; maybe the table owner is lost?

shenbinglife opened this issue · 1 comment

I imported spark-security into my Thrift server with Ranger 2.0 and Spark 2.4.5.
But I cannot insert into, or even select from, a table that I created myself.

Error Msg:

2021-05-13 17:59:08,180 DEBUG [OperationManager-Background-Pool: Thread-164] org.apache.ranger.plugin.policyengine.RangerPolicyEngineImpl:516  - <== RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null} elements={database=default; table=t1; } }} accessType={select} user={user1} userGroups={hive hadoop ficommon default_1000 } accessTime={Thu May 13 17:59:08 CST 2021} clientIPAddress={null} forwardedAddresses={} remoteIPAddress={null} clientType={null} action={QUERY} requestData={null} sessionId={null} resourceMatchingScope={SELF} clusterName={} clusterType={} context={token:USER={user1} } }, policyType=0): RangerAccessResult={isAccessDetermined={false} isAllowed={false} isAuditedDetermined={true} isAudited={true} policyType={0} policyId={-1} zoneName={null} auditPolicyId={3} policyVersion={null} evaluatedPoliciesCount={4} reason={null} additionalInfo={}}

....

org.apache.ranger.authorization.spark.authorizer.SparkAccessControlException: Permission denied: user [user1] does not have [SELECT] privilege on [default/t1]
        at org.apache.ranger.authorization.spark.authorizer.RangerSparkAuthorizer$$anonfun$checkPrivileges$1.apply(RangerSparkAuthorizer.scala:123)
        at org.apache.ranger.authorization.spark.authorizer.RangerSparkAuthorizer$$anonfun$checkPrivileges$1.apply(RangerSparkAuthorizer.scala:98)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at org.apache.ranger.authorization.spark.authorizer.RangerSparkAuthorizer$.checkPrivileges(RangerSparkAuthorizer.scala:98)
        at org.apache.spark.sql.catalyst.optimizer.RangerSparkAuthorizerExtension.apply(RangerSparkAuthorizerExtension.scala:62)
        at org.apache.spark.sql.catalyst.optimizer.RangerSparkAuthorizerExtension.apply(RangerSparkAuthorizerExtension.scala:36)
        at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:87)
        at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:84)
        at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:124)
        at scala.collection.immutable.List.foldLeft(List.scala:84)
        at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:84)
        at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:76)
        at scala.collection.immutable.List.foreach(List.scala:392)
        at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:76)
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$optimizedPlan$1.apply(QueryExecution.scala:74)
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$optimizedPlan$1.apply(QueryExecution.scala:74)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:789)

I set the table owner on the RangerSparkResource, and then the problem disappeared.
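The DEBUG log above shows `ownerUser={null}` on the resource, so any Ranger policy granting access to the `{OWNER}` placeholder cannot match. Since spark-security's `RangerSparkResource` extends Ranger's `RangerAccessResourceImpl`, the workaround amounts to populating the owner before policy evaluation. A minimal sketch (illustrative only, not the actual patch; it assumes the owner is looked up from Spark's catalog metadata via `CatalogTable.owner`) might look like this:

```scala
import org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.catalyst.TableIdentifier

val spark = SparkSession.builder().enableHiveSupport().getOrCreate()

// Fetch the table metadata; CatalogTable.owner carries the creating user.
val table = spark.sessionState.catalog
  .getTableMetadata(TableIdentifier("t1", Some("default")))

// Build the Ranger resource for default/t1 and set its owner, so that
// {OWNER}-based policies can match instead of leaving ownerUser null.
val resource = new RangerAccessResourceImpl()
resource.setValue("database", "default")
resource.setValue("table", "t1")
resource.setOwnerUser(table.owner)
```

With the owner populated, `RangerPolicyEngineImpl.evaluatePolicies` can resolve owner-scoped policies for the requesting user instead of returning `isAllowed={false}`.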

The [Submarine Spark Security] functionality has been moved to the apache/incubator-kyuubi standalone project.