1. Add the Hadoop plugin
- Place hadoop-eclipse-plugin-2.7.0.jar into eclipse\plugins
hadoop-eclipse-plugin-2.7.0.jar download link: https://pan.baidu.com/s/194y5Qli83YBPzKoAtGsvOA password: 5b6t
2. Set the Hadoop installation directory
- After restarting Eclipse, find Hadoop Map/Reduce in Preferences and set the installation directory
3. Configure the plugin
- Open Window -> Perspective -> Open Perspective -> Other, and open Map/Reduce
Note: if the Map/Reduce view does not appear, try Window -> Show View -> Other
In the Map/Reduce Locations view, right-click and create a new Hadoop location
In the Host field, enter the IP of the Hadoop master node; the port is 9001
If the connection succeeds, the files in the HDFS cluster will appear under DFS Locations in the Project Explorer
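The Host and port entered in the new location should match what the cluster itself is configured with, not arbitrary values. As a rough sketch (the hostname `master` and the port are assumptions; check the cluster's own core-site.xml for the real `fs.defaultFS` value):

```xml
<!-- core-site.xml on the cluster: the DFS Master host/port in the
     Eclipse location dialog should agree with this address -->
<property>
    <name>fs.defaultFS</name>
    <value>hdfs://master:9000</value>
</property>
```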
4. Set up the local development environment
- Copy hadoop.dll and winutils.exe into the bin folder of the local Hadoop directory
- Copy hadoop.dll into C:\Windows\System32
hadoop.dll and winutils.exe can be found in the bin folder of the hadoop-common-2.7.x-bin-master directory.
5. Set environment variables
- HADOOP_HOME = F:\software\hadoop-common-2.7.x-bin-master
- Append %HADOOP_HOME%\bin;%HADOOP_HOME%\sbin to Path
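If editing the Windows environment variables is inconvenient, the same location can also be supplied from code before any Hadoop classes are used: Hadoop's `Shell` utility class honors the `hadoop.home.dir` system property as an alternative to the HADOOP_HOME variable. A minimal sketch, reusing the path from the step above (the path is an assumption about your local layout):

```java
public class HadoopHomeSetup {
    public static void main(String[] args) {
        // Alternative to the HADOOP_HOME environment variable: Hadoop's
        // Shell class also checks the hadoop.home.dir system property and
        // looks for winutils.exe under <hadoop.home.dir>\bin.
        System.setProperty("hadoop.home.dir", "F:\\software\\hadoop-common-2.7.x-bin-master");
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

Set the property at the very start of main, before creating any `Configuration` or `FileSystem` object, or it will be read too late.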
6. Add the dependency JARs
Edit pom.xml. The two core dependencies are:
- hadoop-common
- hadoop-hdfs
<properties>
    <org.apache.hadoop.version>2.7.2</org.apache.hadoop.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${org.apache.hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>${org.apache.hadoop.version}</version>
    </dependency>
</dependencies>
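With these two dependencies on the classpath, the connection can be verified from code as well, mirroring what DFS Locations shows in Eclipse. A minimal sketch, assuming the NameNode is reachable at hdfs://master:9000 (replace host and port with your cluster's actual `fs.defaultFS` value); it needs a running cluster to execute:

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsListDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // hdfs://master:9000 is an assumed address; use your own fs.defaultFS
        FileSystem fs = FileSystem.get(URI.create("hdfs://master:9000"), conf);
        // List the HDFS root directory, like expanding DFS Locations in Eclipse
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath() + "\t"
                    + (status.isDirectory() ? "dir" : "file"));
        }
        fs.close();
    }
}
```

If this fails with a winutils/HADOOP_HOME error on Windows, revisit steps 4 and 5 above.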
-- Done!