
實(shí)驗(yàn)?zāi)康?#xff1a;

1.掌握MapReduce的基本編程流程;

2.掌握MapReduce序列化的使用;

實(shí)驗(yàn)內(nèi)容:

一、在本地創(chuàng)建名為MapReduceTest的Maven工程,在pom.xml中引入相關(guān)依賴包,配置log4j.properties文件,搭建windwos開發(fā)環(huán)境。 編程實(shí)現(xiàn)以下內(nèi)容:
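As a minimal sketch of that setup, the pom.xml dependency section might look like the following; hadoop-client 3.1.3 matches the cluster version used later in this lab, and the slf4j-log4j12 binding backs the log4j.properties file (exact artifacts and versions are assumptions):

<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>3.1.3</version>
    </dependency>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-log4j12</artifactId>
        <version>1.7.30</version>
    </dependency>
</dependencies>

A typical console-only log4j.properties would then be:

log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - %m%n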

(1) Create a com.nefu.(xingming).maxcount package and write three classes, a Mapper, a Reducer, and a Driver, that compute the maximum spending for each student ID.

The input file data.txt has one record per line in the following format:

record number \t student ID \t date \t total spending

The required output format is:

student ID \t maximum spending
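As a hypothetical illustration of these two formats, the tab-separated input lines

1	2021001	2023-10-01	52
2	2021001	2023-10-02	38
3	2022002	2023-10-01	45

would reduce to one maximum per student ID:

2021001	52
2022002	45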

ZnMapper.java

package com.nefu.zhangna.maxcount;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

import java.io.IOException;

public class ZnMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    private Text outk = new Text();
    private IntWritable outv = new IntWritable();

    @Override
    public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
        // Input line: record number \t student ID \t date \t total spending
        String line = value.toString();
        String[] content = line.split("\t");
        String schoolnumber = content[1];
        String totalFee = content[3];
        // Emit <student ID, spending> so the reducer can take the maximum per ID
        outk.set(schoolnumber);
        outv.set(Integer.parseInt(totalFee));
        context.write(outk, outv);
    }
}

ZnReducer.java

package com.nefu.zhangna.maxcount;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

import java.io.IOException;

public class ZnReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private IntWritable outv = new IntWritable();

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
        // Keep the largest spending value seen for this student ID
        int total = 0;
        for (IntWritable value : values) {
            if (value.get() > total) {
                total = value.get();
            }
        }
        outv.set(total);
        context.write(key, outv);
    }
}

ZnDriver.java

package com.nefu.zhangna.maxcount;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.CombineTextInputFormat;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

public class ZnDriver {
    public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException, URISyntaxException {
        Configuration configuration = new Configuration();
        Job job = Job.getInstance(configuration);
        //FileSystem fs = FileSystem.get(new URI("hdfs://hadoop101:8020"), configuration, "hadoop");
        //fs.copyFromLocalFile(new Path("D://mapreducetest//data.txt"), new Path("/zn/data.txt"));
        job.setJarByClass(ZnDriver.class);
        job.setMapperClass(ZnMapper.class);
        job.setReducerClass(ZnReducer.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);
        //job.setOutputKeyClass(Text.class);
        //job.setOutputValueClass(StudentBean.class);
        //job.setInputFormatClass(CombineTextInputFormat.class);       // otherwise the default is TextInputFormat.class
        //CombineTextInputFormat.setMaxInputSplitSize(job, 4194304);   // cap split size at 4 MB
        FileInputFormat.setInputPaths(job, new Path("D:\\mapreducetest\\data.txt"));
        FileOutputFormat.setOutputPath(job, new Path("D:\\cluster\\shiyan3-1"));
        boolean result = job.waitForCompletion(true);
        System.exit(result ? 0 : 1);
    }
}

(2)測(cè)試上述程序,查看運(yùn)行結(jié)果

原數(shù)據(jù)

mapreduce之后

(3) Check the logs: how many splits and how many MapTasks are there? (screenshot)

The Number of splits entry in the log shows one input split, and Starting task: attempt_local649325949_0001_m_000000_0 shows one MapTask.

(4) Add a file data1.txt and rerun the program: how many splits and MapTasks are there now? (screenshot)

With two input files, the number of splits is 2, so there are two MapTasks.

(5) Use CombineTextInputFormat so that data.txt and data1.txt are placed in a single split.

After enabling CombineTextInputFormat in the driver class, the log shows only one split, as in the sketch below.

(6) Upload data.txt to HDFS.
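One way is the FileSystem API from the commented-out lines in ZnDriver, which connects to the NameNode at hdfs://hadoop101:8020 as user hadoop and copies the local file up:

FileSystem fs = FileSystem.get(new URI("hdfs://hadoop101:8020"), configuration, "hadoop");
fs.copyFromLocalFile(new Path("D://mapreducetest//data.txt"), new Path("/zn/data.txt"));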

(7) Use Maven to package the program as a jar, upload it to the Hadoop cluster, and check whether it runs correctly.

To package the program as a jar with Maven, add the packaging plugin to pom.xml:
<build>
    <plugins>
        <plugin>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.6.1</version>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
            </configuration>
        </plugin>
    </plugins>
</build>

Package the program into a jar.
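The build step is typically run from the project root as:

mvn clean package

which writes the jar into the target/ directory; the exact jar name follows the project's artifactId and version.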

Upload the jar to the /opt/module/hadoop-3.1.3/testcode directory on hadoop101.
Make sure the Hadoop cluster has started normally, then run the jar file (a typical invocation is sketched below).
In this attempt, the Java runtime environment had a problem.
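For reference, a typical invocation would be (the jar name is hypothetical; the driver class is the one written above):

hadoop jar MapReduceTest-1.0-SNAPSHOT.jar com.nefu.zhangna.maxcount.ZnDriver

Note that ZnDriver as written hardcodes local Windows paths, so a genuine cluster run would also need the input and output paths pointed at HDFS locations.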

II. Create a com.nefu.(xingming).serialize package and write ScoreBean, Mapper, Reducer, and Driver classes that compute the average score for each student ID and write the results into three separate files by grade.

The input file mydata.txt has the format:

student ID \t name \t score

The output format (three files in total) is:

student ID \t name \t average score

MyPartition.java

package com.nefu.zhangna.serialize;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Partitioner;

public class MyPartition extends Partitioner<Text, ScoreBean> {
    @Override
    public int getPartition(Text text, ScoreBean studentBean, int numPartitions) {
        // Route each record by the enrollment year embedded in the student ID;
        // the returned value must lie in [0, numReduceTasks)
        String snum = text.toString();
        int partition;
        if (snum.contains("2021")) {
            partition = 0;
        } else if (snum.contains("2022")) {
            partition = 1;
        } else {
            partition = 2;
        }
        return partition;
    }
}

ScoreBean.java

package com.nefu.zhangna.serialize;

import org.apache.hadoop.io.Writable;

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

public class ScoreBean implements Writable {
    private String name;
    private Double score;

    // Hadoop requires a public no-arg constructor for deserialization
    public ScoreBean() {
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public Double getScore() {
        return score;
    }

    public void setScore(Double score) {
        this.score = score;
    }

    @Override
    public void write(DataOutput out) throws IOException {
        // Field order here must match readFields exactly
        out.writeUTF(name);
        out.writeDouble(score);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        this.name = in.readUTF();
        this.score = in.readDouble();
    }

    @Override
    public String toString() {
        return this.name + "\t" + this.score;
    }
}

ZnMapper1.java

package com.nefu.zhangna.serialize;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

import java.io.IOException;

public class ZnMapper1 extends Mapper<LongWritable, Text, Text, ScoreBean> {
    private Text outk = new Text();
    private ScoreBean outv = new ScoreBean();

    @Override
    protected void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
        // Input line: student ID \t name \t score
        String line = value.toString();
        String[] content = line.split("\t");
        String schoolnumber = content[0];
        String name = content[1];
        String score = content[2];
        outk.set(schoolnumber);
        outv.setName(name);
        outv.setScore(Double.parseDouble(score));
        context.write(outk, outv);
    }
}

ZnReducer1.java

package com.nefu.zhangna.serialize;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

import java.io.IOException;

public class ZnReducer1 extends Reducer<Text, ScoreBean, Text, ScoreBean> {
    private ScoreBean outv = new ScoreBean();

    @Override
    protected void reduce(Text key, Iterable<ScoreBean> values, Context context) throws IOException, InterruptedException {
        // Sum this student's scores and count the records to compute the average
        double score = 0;
        int sum = 0;
        String name = null;
        for (ScoreBean value : values) {
            sum = sum + 1;
            score = score + value.getScore();
            name = value.getName();
        }
        outv.setName(name);
        outv.setScore(score / sum);
        context.write(key, outv);
    }
}

ZnDriver1.java

package com.nefu.zhangna.serialize;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

import java.io.IOException;

public class ZnDriver1 {
    public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException {
        Configuration configuration = new Configuration();
        Job job = Job.getInstance(configuration);
        job.setJarByClass(ZnDriver1.class);
        job.setMapperClass(ZnMapper1.class);
        job.setReducerClass(ZnReducer1.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(ScoreBean.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(ScoreBean.class);
        // The custom partitioner needs as many reduce tasks as partitions (three grades)
        job.setPartitionerClass(MyPartition.class);
        job.setNumReduceTasks(3);
        FileInputFormat.setInputPaths(job, new Path("D:\\mapreducetest\\mydata.txt"));
        FileOutputFormat.setOutputPath(job, new Path("D:\\cluster\\serialize"));
        boolean result = job.waitForCompletion(true);
        System.exit(result ? 0 : 1);
    }
}
