August 2022
The principle is as follows:
job.setMapperClass(MaxTemperatureMapper.class);
job.setReducerClass(MaxTemperatureReducer.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);
System.exit(job.waitForCompletion(true) ? 0 : 1);
YARN is Hadoop's cluster resource management system. It allows Hadoop to move the MapReduce computation to each machine hosting a portion of the data. A MapReduce job is a unit of work that the client wants to be performed: it consists of the input data, the MapReduce program, and configuration information. Hadoop runs the job by dividing it into tasks, of which there are two types: map tasks and reduce tasks.
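The map-and-reduce split described above can be sketched in plain Java with no Hadoop dependency. This is only an illustration of the data flow (the class and method names below are invented for the sketch, not Hadoop APIs): records are mapped to (year, temperature) pairs, the "shuffle" groups values by year, and the reduce step takes the maximum per year.

```java
import java.util.*;

// Hadoop-free sketch of the map -> shuffle -> reduce flow.
// MiniMapReduce and maxByYear are illustrative names, not Hadoop APIs.
public class MiniMapReduce {
    // Each record is a {year, temperature} pair of strings.
    public static Map<String, Integer> maxByYear(List<String[]> records) {
        // "Map" + "shuffle": group temperature values by year.
        Map<String, List<Integer>> grouped = new HashMap<>();
        for (String[] rec : records) {
            grouped.computeIfAbsent(rec[0], k -> new ArrayList<>())
                   .add(Integer.parseInt(rec[1]));
        }
        // "Reduce": take the maximum of each year's values.
        Map<String, Integer> result = new HashMap<>();
        for (Map.Entry<String, List<Integer>> e : grouped.entrySet()) {
            result.put(e.getKey(), Collections.max(e.getValue()));
        }
        return result;
    }

    public static void main(String[] args) {
        List<String[]> recs = Arrays.asList(
                new String[]{"1950", "0"},
                new String[]{"1950", "22"},
                new String[]{"1949", "111"});
        System.out.println(maxByYear(recs).get("1950")); // prints 22
        System.out.println(maxByYear(recs).get("1949")); // prints 111
    }
}
```

In real Hadoop, the grouping step is performed by the framework between the map and reduce tasks; the sketch only shows the shape of the computation.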
The result screenshots are as follows:
The flow chart is as follows:
The code is as follows:
public void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
    String line = value.toString();
    String year = line.substring(15, 19);
    int airTemperature;
    if (line.charAt(87) == '+') { // parseInt doesn't like leading plus signs
        airTemperature = Integer.parseInt(line.substring(88, 92));
    } else {
        airTemperature = Integer.parseInt(line.substring(87, 92));
    }
    context.write(new Text(year), new IntWritable(airTemperature));
}
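The mapper above depends on fixed column offsets in the NCDC weather records: the year occupies columns 15–18 and a signed temperature (in tenths of a degree) sits around columns 87–91. The standalone sketch below builds a synthetic 92-character line with just those two fields filled in (real NCDC records carry many more fields; the layout here is an assumption covering only the slice the mapper reads) and applies the same substring logic:

```java
public class NcdcLineDemo {
    // Build a synthetic 92-character record: year at columns 15-18,
    // signed temperature at columns 87-91. Everything else is filler.
    public static String sampleLine() {
        StringBuilder sb = new StringBuilder("0".repeat(92));
        sb.replace(15, 19, "1950");   // year field
        sb.replace(87, 92, "+0022");  // +2.2 degrees C, stored in tenths
        return sb.toString();
    }

    public static void main(String[] args) {
        String line = sampleLine();
        String year = line.substring(15, 19);
        // Same branch as the mapper: skip a leading '+' before parsing.
        int airTemperature = line.charAt(87) == '+'
                ? Integer.parseInt(line.substring(88, 92))
                : Integer.parseInt(line.substring(87, 92));
        System.out.println(year + " -> " + airTemperature); // prints 1950 -> 22
    }
}
```

This is why the mapper can parse each line with nothing more than substring(): the input format is fixed-width, so no delimiter splitting is needed.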
The code is as follows:
public class MaxTemperatureReducer
        extends Reducer<Text, IntWritable, Text, IntWritable> {

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int maxValue = Integer.MIN_VALUE;
        for (IntWritable value : values) {
            maxValue = Math.max(maxValue, value.get());
        }
        context.write(key, new IntWritable(maxValue));
    }
}
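The heart of the reducer is a max fold: start at Integer.MIN_VALUE and take Math.max over each value for the key. That loop can be checked in isolation with plain integers (a minimal sketch; `maxFold` is a made-up helper name, not part of the Hadoop API):

```java
import java.util.*;

public class MaxFoldDemo {
    // Same fold as the reducer body: seed with Integer.MIN_VALUE,
    // then Math.max over every value in the iterable.
    public static int maxFold(Iterable<Integer> values) {
        int maxValue = Integer.MIN_VALUE;
        for (int value : values) {
            maxValue = Math.max(maxValue, value);
        }
        return maxValue;
    }

    public static void main(String[] args) {
        // Temperatures for one year, as the reducer would receive them.
        System.out.println(maxFold(Arrays.asList(111, 78, -11))); // prints 111
    }
}
```

Seeding with Integer.MIN_VALUE means any real temperature replaces the seed on the first iteration, which is why no separate "first element" case is needed in the reducer.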
The flow chart is as follows:
The code is as follows:
FileInputFormat.addInputPath(job, new Path(args[0]));
FileOutputFormat.setOutputPath(job, new Path(args[1]));
job.setMapperClass(MaxTemperatureMapper.class);
job.setReducerClass(MaxTemperatureReducer.class);
The code is as follows:
Configuration conf = new Configuration();
conf.set("fs.defaultFS", "hdfs://hadoop:9000");
Job job = Job.getInstance(conf, "max temperature");
job.setJarByClass(MaxTemperature.class);
job.setJobName("Max temperature");
FileInputFormat.addInputPath(job, new Path(args[0]));
FileOutputFormat.setOutputPath(job, new Path(args[1]));
The code is as follows:
public static void main(String[] args) throws Exception {
    if (args.length != 2) {
        System.err.println("Usage: MaxTemperature <input path> <output path>");
        System.exit(-1);
    }
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://hadoop:9000");
The code is as follows:
job.setMapperClass(MaxTemperatureMapper.class);
job.setReducerClass(MaxTemperatureReducer.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);
System.exit(job.waitForCompletion(true) ? 0 : 1);
The result screenshot is as follows: