Apache Spark: how to configure a user and password for the Spark Thrift Server

7z5jn7bk  posted on 2022-11-16 in Apache

I am trying to connect to the Spark Thrift Server through beeline. I start the Spark Thrift Server like this:

start-thriftserver.sh --master yarn-client --num-executors 2 --conf spark.driver.memory=2g --executor-memory 3g

and my Spark conf/hive-site.xml looks like this:

<configuration>
    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://node001:3306/hive?useSSL=false</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
    </property>
    <property>
        <name>hive.server2.authentication</name>
        <value>NONE</value>
    </property>
    <property>
        <name>hive.server2.thrift.client.user</name>
        <value>root</value>
    </property>
    <property>
        <name>hive.server2.thrift.client.password</name>
        <value>123456</value>
    </property>
    <property>
        <name>hive.server2.thrift.port</name>
        <value>10001</value>
    </property>
    <property>
        <name>hive.security.authorization.enabled</name>
        <value>true</value>
        <description>enable or disable the hive client authorization</description>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
    </property>            
    <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>123456</value>
    </property>
</configuration>

When I connect to the Spark Thrift Server with the beeline CLI, it prompts me for a username and password, but I just hit Enter without typing anything (i.e. without entering the username and password configured in hive-site.xml) and I can still access Spark. How do I make this configuration take effect?
Thank you :)

beeline> !connect jdbc:hive2://node001:10001
Connecting to jdbc:hive2://node001:10001
Enter username for jdbc:hive2://node001:10001: 
Enter password for jdbc:hive2://node001:10001: 
18/12/06 16:13:44 INFO Utils: Supplied authorities: node001:10001
18/12/06 16:13:44 INFO Utils: Resolved authority: node001:10001
18/12/06 16:13:45 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://node001:10001
Connected to: Spark SQL (version 2.4.0)
Driver: Hive JDBC (version 1.2.1.spark2)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://node001:10001> show databases;
+-------------------------+--+
|      databaseName       |
+-------------------------+--+
| default                 |
| test                    |
+-------------------------+--+
7 rows selected (0.142 seconds)
0: jdbc:hive2://node001:10001>
ars1skjm  1#

“But I just hit Enter without typing anything (without entering the username and password configured in hive-site.xml) and I can still access Spark.”
Step 1:
Update your conf/hive-site.xml; the key part is setting hive.server2.authentication to CUSTOM:

<configuration>
  <property>
    <name>hive.server2.authentication</name>
    <value>CUSTOM</value>
  </property>
  <property>
    <name>hive.server2.custom.authentication.class</name>
    <value>yourFullyQualifiedClassName</value>
  </property>
</configuration>

Step 2:
Assuming your fully qualified class name is com.example.Authentication, implement the org.apache.hive.service.auth.PasswdAuthenticationProvider interface as shown below. How exactly you implement it is of course up to you:

package com.example;

import javax.security.sasl.AuthenticationException;

import org.apache.hive.service.auth.PasswdAuthenticationProvider;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.security.crypto.bcrypt.BCryptPasswordEncoder;

public class Authentication implements PasswdAuthenticationProvider {

    private static final Logger LOG = LoggerFactory.getLogger(Authentication.class);

    @Override
    public void Authenticate(String userName, String passwd)
            throws AuthenticationException {
        LOG.info("user: " + userName + " try login.");

        String passwdBase = getPasswd(userName);
        if (null == passwdBase) {
            String message = "User not found. user:" + userName;
            LOG.info(message);
            throw new AuthenticationException(message);
        }

        BCryptPasswordEncoder bCryptPasswordEncoder =
                new BCryptPasswordEncoder(10);
        boolean isSuccess = bCryptPasswordEncoder.matches(passwd, passwdBase);
        if (!isSuccess) {
            String message = "User name and password is mismatch. user:" + userName;
            LOG.info(message);
            throw new AuthenticationException(message);
        }
    }

    // Look up the stored BCrypt hash for this user (e.g. from a database or a
    // properties file); returning null means the user does not exist.
    private String getPasswd(String userName) {
        // ... your lookup logic goes here ...
        return null;
    }
}
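The passwdBase value that getPasswd returns is compared with BCryptPasswordEncoder.matches, so it must be a BCrypt hash rather than the plain-text password. Here is a minimal sketch of how such a hash could be produced with the same encoder (the class name HashDemo and the password 123456 are just placeholders):

import org.springframework.security.crypto.bcrypt.BCryptPasswordEncoder;

public class HashDemo {
    public static void main(String[] args) {
        // Use the same strength (10) as the Authentication class above.
        BCryptPasswordEncoder encoder = new BCryptPasswordEncoder(10);
        String hash = encoder.encode("123456");
        System.out.println(hash);                              // store this; getPasswd should return it
        System.out.println(encoder.matches("123456", hash));   // prints true
    }
}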

Step 3:
Place two jars in {spark_home}/lib (see the note after this list):

  • The jar built from step 2, for example sts_authen-1.0-SNAPSHOT.jar
  • spring-security-core-5.6.6.RELEASE.jar
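
Note: spring-security-core is only needed because the example in step 2 uses its BCryptPasswordEncoder; if your implementation checks passwords some other way, ship whatever libraries your own class depends on instead.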

Step 4:
Start the Thrift server:

start-thriftserver.sh --master yarn-client --num-executors 2 --conf spark.driver.memory=2g --executor-memory 3g --driver-class-path {yourDatabaseDriverJarPath}:{yourSparkHome}/lib/sts_authen-1.0-SNAPSHOT.jar:{yourSparkHome}/lib/spring-security-core-5.6.6.RELEASE.jar
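
After restarting, connecting from beeline should only succeed when you supply a username and password that your Authenticate implementation accepts, for example (yourUser and yourPassword are placeholders):

beeline> !connect jdbc:hive2://node001:10001 yourUser yourPassword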
