Launching a Hadoop MapReduce job from Python without PuTTY/SSH

oknwwptz asked on 2021-06-02 in Hadoop

I run Hadoop MapReduce jobs by logging in over SSH with PuTTY, which means typing a hostname/IP address, login name, and password into PuTTY to get an SSH command-line window. Once in the SSH console, I issue the appropriate MR command, for example:
hadoop jar /usr/lib/hadoop-0.20-mapreduce/contrib/streaming/hadoop-streaming-2.0.0-mr1-cdh4.0.1.jar -file /nfs_home/appers/user1/mapper.py -file /nfs_home/appers/user1/reducer.py -mapper '/usr/lib/python_2.7.3/bin/python mapper.py' -reducer '/usr/lib/python_2.7.3/bin/python reducer.py' -input /ccexp/data/test_xml/0901282-510179094535002-oozie-oozi-w/extractout/*/*.xml -output /user/ccexptest/output/user1/mroutput
What I would like to do is use Python to replace this clumsy process, so that I can launch the MapReduce job from within a Python script and avoid the PuTTY/SSH login entirely.
Can this be done? If so, can someone show me how?
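
In short, yes: an SSH client library such as paramiko can open the connection and run the same command string non-interactively. Here is a minimal sketch of the idea, with placeholder host, credentials, and command (the accepted answer below gives a complete version):

import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('xx.xx.xx.xx', username='myuser', password='mypassword')

# Any command you would type in the PuTTY window can be run this way
stdin, stdout, stderr = client.exec_command('hadoop fs -ls /')
print(stdout.read())
client.close()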

Answer 1, by 4ktjp1zp:

I solved this with the following script:

# Python 2 script; requires the paramiko SSH library (pip install paramiko)
import paramiko

# Define connection info

host_ip = 'xx.xx.xx.xx'
user = 'xxxxxxxx'
pw = 'xxxxxxxx'

# Paths

input_loc = '/nfs_home/appers/extracts/*/*.xml'
output_loc = '/user/lcmsprod/output/cnielsen/'
python_path = "/usr/lib/python_2.7.3/bin/python"
hdfs_home = '/nfs_home/appers/cnielsen/'
output_log = r'C:\Users\cnielsen\Desktop\MR_Test\MRtest011316_0.txt'

# File names

xml_lookup_file = 'product_lookups.xml'
mapper = 'Mapper.py'
reducer = 'Reducer.py'
helper_script = 'Process.py'
product_name = 'test1'
output_ref = 'test65'

# ----------------------------------------------------

def buildMRcommand(product_name):
    # Assemble the hadoop streaming invocation as one string; the mapper and
    # reducer calls are wrapped in single quotes so the remote shell passes
    # each of them to hadoop as a single argument.
    space = " "
    mr_command_list = [ 'hadoop', 'jar', '/share/hadoop/tools/lib/hadoop-streaming.jar',
                        '-files', hdfs_home+xml_lookup_file,
                        '-file', hdfs_home+mapper,
                        '-file', hdfs_home+reducer,
                        '-mapper', "'"+python_path, mapper, product_name+"'",
                        '-file', hdfs_home+helper_script,
                        '-reducer', "'"+python_path, reducer+"'",
                        '-input', input_loc,
                        '-output', output_loc+output_ref]

    MR_command = space.join(mr_command_list)
    print MR_command
    return MR_command
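
# For reference, with the settings above, buildMRcommand('test1') produces:
#   hadoop jar /share/hadoop/tools/lib/hadoop-streaming.jar
#     -files /nfs_home/appers/cnielsen/product_lookups.xml
#     -file /nfs_home/appers/cnielsen/Mapper.py
#     -file /nfs_home/appers/cnielsen/Reducer.py
#     -mapper '/usr/lib/python_2.7.3/bin/python Mapper.py test1'
#     -file /nfs_home/appers/cnielsen/Process.py
#     -reducer '/usr/lib/python_2.7.3/bin/python Reducer.py'
#     -input /nfs_home/appers/extracts/*/*.xml
#     -output /user/lcmsprod/output/cnielsen/test65
# (wrapped here for readability; the script emits it as a single line)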

# ----------------------------------------------------

def unbuffered_lines(f):
    # Yield complete lines from a paramiko channel file as they arrive,
    # rather than waiting for the remote command to finish.
    line_buf = ""
    while not f.channel.exit_status_ready():
        line_buf += f.read(1)
        if line_buf.endswith('\n'):
            yield line_buf
            line_buf = ''
    # Drain anything that arrived after the exit status was set
    line_buf += f.read()
    if line_buf:
        yield line_buf

# ----------------------------------------------------

# Open the SSH connection (the programmatic equivalent of the PuTTY login)
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # auto-accept unknown host keys
client.connect(host_ip, username=user, password=pw)

# Build Commands: list the working directory, and merge the job's HDFS
# output into a single local file with 'hadoop fs -getmerge'

list_dir = "ls "+hdfs_home+" -l"
getmerge = "hadoop fs -getmerge "+output_loc+output_ref+" "+hdfs_home+"test_011216_0.txt"

# Run Command: exactly one exec_command should be active at a time.
# Swap in buildMRcommand() to launch the MR job, or getmerge to collect output.

stdin, stdout, stderr = client.exec_command(list_dir)
# stdin, stdout, stderr = client.exec_command(buildMRcommand(product_name))
# stdin, stdout, stderr = client.exec_command(getmerge)

print "Executing command..."
writer = open(output_log, 'w')

# Stream stderr live; hadoop streaming reports job progress on stderr
for l in unbuffered_lines(stderr):
    e = '[stderr] ' + l
    print '[stderr] ' + l.strip('\n')
    writer.write(e)

# Then dump whatever the command wrote to stdout
for line in stdout:
    r = '[stdout]' + line
    print '[stdout]' + line.strip('\n')
    writer.write(r)

client.close()
writer.close()
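
Two optional hardenings, sketched under stated assumptions rather than taken from the original answer: paramiko can authenticate with an SSH private key instead of a hard-coded password, and the channel's exit status tells you whether the hadoop command actually succeeded. This sketch reuses the names from the script above, and the key path is a hypothetical placeholder:

import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
# Authenticate with a private key file instead of embedding a password
client.connect(host_ip, username=user, key_filename='/home/xxxxxxxx/.ssh/id_rsa')

stdin, stdout, stderr = client.exec_command(buildMRcommand(product_name))
job_output = stdout.read()    # blocks until the remote command finishes
job_errors = stderr.read()

# recv_exit_status() returns the remote command's exit code (0 = success)
if stdout.channel.recv_exit_status() != 0:
    print("MapReduce job failed:")
    print(job_errors)

client.close()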
