I am running this Hadoop program; this is the reducer:
public static class joinsReduce extends Reducer<Text, Text, Text, Text>
{ // start class reduce (1)
    public void reduce(Text key, Iterable<Text> values, Context context)
            throws IOException, InterruptedException
    { // start method reduce (2)
        String result = "";
        Map<String, String> memOf = new HashMap<String, String>();
        Map<String, String> subOrg = new HashMap<String, String>();
        Map<String, String> email = new HashMap<String, String>();
        List<String> studList = new ArrayList<String>();
        String line = "";
        String source = "";
        String sub = "";
        String obj = "";
        String hashKey = "";
        String hashVal = "";
        int count = 0;

        for (Text value : values)
        { // start iterate over values (3)
            line = value.toString();
            String[] parts = line.trim().split(",");
            source = parts[0].trim();
            sub = parts[1].trim();
            obj = parts[2].trim();
            hashKey = sub;
            hashVal = obj;
            if (source.equals("type"))
            {
                studList.add(sub);
            }
            if (source.equals("memberOf"))
            {
                memOf.put(hashKey, hashVal);
            } // end if memberOf
            if (source.equals("subOrganizationOf"))
            {
                subOrg.put(hashKey, hashVal);
            } // end if subOrganizationOf
            if (source.equals("emailAddress"))
            { // (4)
                email.put(hashKey, hashVal);
            } // end if emailAddress (4)
        } // end reading loop iterating over values (3)

        String y = "";
        String z = "";
        String z1 = "";
        for (String x : studList)
        { // (6)
            y = memOf.get(x);
            z = email.get(x);
            z1 = subOrg.get(y);
            if (y != null && z != null && z1 != null)
            { // (8)
                result = x + ',' + y + ',' + z + ',' + z1;
                context.write(new Text(x), new Text(result));
            } // end inner if (8)
        } // end iterating loop (6)
    } // end method reduce (2)
} // end class (1)
It throws an error: Java heap space, at about 92% of the reduce phase. I am using 9 reducers and I do not want to increase that number.
Is there a way to reduce the Java memory usage on the code side? Or can the Java heap size be increased beyond my main memory (4 GB)?
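For example, would it help to create the two output Text objects once and reuse them for every write, instead of allocating new Text objects inside the loop? Below is a minimal, unverified sketch of that idea (the class name JoinsReduceSketch is just a placeholder; the in-memory maps would still buffer every value for a key, so I am not sure it solves the heap problem):

import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

// Unverified sketch: same join as above, but the two output Text objects are
// created once and reused for every write instead of being allocated inside
// the loop. The four collections still buffer every value for the key, so
// they remain the dominant heap cost.
public class JoinsReduceSketch extends Reducer<Text, Text, Text, Text> {

    private final Text outKey = new Text(); // reused across writes
    private final Text outVal = new Text(); // reused across writes

    @Override
    public void reduce(Text key, Iterable<Text> values, Context context)
            throws IOException, InterruptedException {
        Map<String, String> memOf = new HashMap<String, String>();
        Map<String, String> subOrg = new HashMap<String, String>();
        Map<String, String> email = new HashMap<String, String>();
        List<String> studList = new ArrayList<String>();

        // Bucket each value by its source tag, as in the original reducer.
        for (Text value : values) {
            String[] parts = value.toString().trim().split(",");
            String source = parts[0].trim();
            String sub = parts[1].trim();
            String obj = parts[2].trim();
            if (source.equals("type")) {
                studList.add(sub);
            } else if (source.equals("memberOf")) {
                memOf.put(sub, obj);
            } else if (source.equals("subOrganizationOf")) {
                subOrg.put(sub, obj);
            } else if (source.equals("emailAddress")) {
                email.put(sub, obj);
            }
        }

        // Emit one joined record per student, reusing the output objects.
        for (String x : studList) {
            String y = memOf.get(x);
            String z = email.get(x);
            String z1 = subOrg.get(y);
            if (y != null && z != null && z1 != null) {
                outKey.set(x);
                outVal.set(x + ',' + y + ',' + z + ',' + z1);
                context.write(outKey, outVal);
            }
        }
    }
}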
This is the setting I am using right now:
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx3072m</value>
  <description>No description</description>
</property>
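As far as I understand, that property only sets the heap for map tasks; the reduce tasks, where the error happens, are configured through separate properties. A sketch of what I believe the reduce-side settings look like in mapred-site.xml (values are just examples, assuming Hadoop 2.x / YARN):

<!-- Example values only: reduce-task JVM heap plus the matching YARN container size. -->
<property>
  <name>mapreduce.reduce.java.opts</name>
  <value>-Xmx3072m</value>
  <description>JVM heap for reduce tasks (example value)</description>
</property>
<property>
  <name>mapreduce.reduce.memory.mb</name>
  <value>4096</value>
  <description>YARN container size for reduce tasks (example value)</description>
</property>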
If going beyond the available memory is possible, and I change -Xmx3072m to -Xmx4096m, what do I have to do for the change to take effect?