Flink: configuration for connecting to HDFS and reading files
Hadoop version: 2.7.3
Configuration for running a Flink program locally on Windows that reads files from HDFS
1. Add the Hadoop and httpclient dependencies to the pom.xml (a minimal read example follows the dependency list):
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-hadoop-fs</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>${hadoop.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>${httpclient.version}</version>
</dependency>
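With these dependencies on the classpath, a minimal sketch of reading an HDFS file from a locally running Flink job might look like the following (DataSet API, Flink 1.x). The namenode host `namenode-host`, port `9000`, and the file path are placeholders, not values from this setup; substitute your own cluster address.

import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;

public class ReadHdfsFile {
    public static void main(String[] args) throws Exception {
        // Local execution environment: the job runs inside the IDE on Windows
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // hdfs://<namenode-host>:<port>/<path> -- host, port and path are placeholders
        DataSet<String> lines =
                env.readTextFile("hdfs://namenode-host:9000/user/flink/input.txt");

        // Print the file contents to verify that the HDFS connection works
        lines.print();
    }
}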
2. Download the Windows Hadoop binaries and configure the HADOOP_HOME environment variable
https://github.com/SweetInk/hadoop-common-bin
After downloading, place it in any location and set the HADOOP_HOME environment variable to the path of the 2.7.3 folder.
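If you prefer not to change system environment variables, Hadoop also honors the hadoop.home.dir system property, which can be set from code before any Hadoop class is loaded. The sketch below assumes the download was unpacked to D:/hadoop-common-bin/2.7.3 (a placeholder path) and checks that winutils.exe is present, which is what Hadoop needs on Windows.

import java.io.File;

public class HadoopHomeCheck {
    public static void main(String[] args) {
        // Alternative to the HADOOP_HOME environment variable: set the
        // "hadoop.home.dir" system property before any Hadoop class is loaded.
        // The path below is a placeholder for wherever the download was unpacked.
        System.setProperty("hadoop.home.dir", "D:/hadoop-common-bin/2.7.3");

        // On Windows, Hadoop expects winutils.exe under %HADOOP_HOME%\bin; verify it exists.
        File winutils = new File(System.getProperty("hadoop.home.dir"), "bin/winutils.exe");
        System.out.println("winutils.exe found: " + winutils.exists());
    }
}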