Hadoop: setting an environment variable at runtime for the Google BigQuery connector

7ajki6be · asked 2021-06-02 · in Hadoop

I am writing a Google BigQuery connector for Spark, which uses the Google Hadoop connector under the hood.
Currently the Google Hadoop connector requires a Google environment variable that points to the credentials JSON file.
This can be annoying when you launch clusters outside of the Dataproc world.
Is it bad practice to set it at runtime in code? Or is there a workaround to tell the Hadoop connector to ignore the environment variable, since the key file is already set in the "fs.gs.auth.service.account.json.keyfile" Hadoop configuration?
Dennis, since you are a contributor on this project, maybe you can help here as well?
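
For reference, the Hadoop-configuration route mentioned above can be wired up on the SparkSession like the sketch below. This is only an illustration: the app name and key-file path are placeholders, and "fs.gs.auth.service.account.json.keyfile" is the property named in the question.

import org.apache.spark.sql.SparkSession

// Sketch only: point the connector at the key file through Hadoop
// configuration instead of relying on an environment variable.
val spark = SparkSession.builder()
  .appName("bigquery-connector-test") // placeholder app name
  .getOrCreate()

// The path below is a placeholder for the actual credentials JSON file.
spark.sparkContext.hadoopConfiguration
  .set("fs.gs.auth.service.account.json.keyfile", "/path/to/creds.json")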

ckx4rj1h · answer #1

For those interested, I just set the variables at runtime in Scala using the gist below:
https://gist.github.com/jaytaylor/770bc416f0dd5954cf0f
Here is the code in case the gist goes offline:

import java.util.{Collections, Map => JavaMap}

trait EnvHacker {

  /**
   * Portable method for setting env vars on both *nix and Windows via reflection.
   * @see http://stackoverflow.com/a/7201825/293064
   */
  def setEnv(newEnv: Map[String, String]): Unit = {
    try {
      // The JVM caches the process environment in java.lang.ProcessEnvironment;
      // mutating its backing maps makes later System.getenv() calls see the new values.
      val processEnvironmentClass = Class.forName("java.lang.ProcessEnvironment")
      val theEnvironmentField = processEnvironmentClass.getDeclaredField("theEnvironment")
      theEnvironmentField.setAccessible(true)
      val env = theEnvironmentField.get(null).asInstanceOf[JavaMap[String, String]]
      newEnv.foreach { case (k, v) => env.put(k, v) }
      // Windows keeps a second, case-insensitive copy of the environment.
      val theCaseInsensitiveEnvironmentField =
        processEnvironmentClass.getDeclaredField("theCaseInsensitiveEnvironment")
      theCaseInsensitiveEnvironmentField.setAccessible(true)
      val cienv = theCaseInsensitiveEnvironmentField.get(null).asInstanceOf[JavaMap[String, String]]
      newEnv.foreach { case (k, v) => cienv.put(k, v) }
    } catch {
      case _: NoSuchFieldException =>
        // Fallback (typical on *nix JVMs): rewrite the unmodifiable map backing
        // System.getenv(). Note this clears the existing view and replaces it with newEnv.
        try {
          val classes = classOf[Collections].getDeclaredClasses
          val env = System.getenv()
          for (cl <- classes) {
            if (cl.getName == "java.util.Collections$UnmodifiableMap") {
              val field = cl.getDeclaredField("m")
              field.setAccessible(true)
              val map = field.get(env).asInstanceOf[JavaMap[String, String]]
              map.clear()
              newEnv.foreach { case (k, v) => map.put(k, v) }
            }
          }
        } catch {
          case e2: Exception => e2.printStackTrace()
        }
      case e1: Exception => e1.printStackTrace()
    }
  }
}
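
A minimal usage sketch follows. The object name is made up, and GOOGLE_APPLICATION_CREDENTIALS plus the key-file path are assumptions based on the question, not part of the gist.

// Hypothetical usage: mix in EnvHacker and patch the environment before
// anything in the Google auth chain reads it.
object CredsBootstrap extends EnvHacker {
  def main(args: Array[String]): Unit = {
    // GOOGLE_APPLICATION_CREDENTIALS is the variable the Google auth libraries
    // look up; the key-file path below is a placeholder.
    setEnv(Map("GOOGLE_APPLICATION_CREDENTIALS" -> "/path/to/creds.json"))
    // ... then build the SparkSession and run the job as usual
  }
}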
