Following some references I found online, I configured remote access to an HDFS cluster from a Windows 7 machine: I added the system variables $HADOOP_HOME and $HADOOP_USER, and added $HADOOP_HOME\bin to the environment PATH. That bin directory contains hadoop.dll, winutils.exe, and similar binaries I downloaded from the web. But when I run my program in Eclipse, I hit a problem. The code is as follows:
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class App {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        try {
            // Upload a local file to HDFS -- this part works.
            FileSystem fs = FileSystem.get(conf);
            fs.copyFromLocalFile(new Path("D:\\testhdfs.txt"), new Path("/user/hado_cli/dist"));
        } catch (IOException e) {
            e.printStackTrace();
            System.out.println("Put file to HDFS failed");
        }
        try {
            // Copy the file from HDFS back to the local disk -- this part fails.
            FileSystem fs = FileSystem.get(conf);
            fs.copyToLocalFile(new Path("/user/hado_cli/dist/testhdfs.txt"), new Path("D:\\hdfstest"));
        } catch (IOException e) {
            e.printStackTrace();
            System.out.println("Get file from HDFS failed");
        }
    }
}
The first block, which uploads a local file to HDFS, runs without any problem. But when the program copies the file from HDFS back to my machine, it throws the following error:
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createFileWithMode0(Ljava/lang/String;JJJI)Ljava/io/FileDescriptor;
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createFileWithMode0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createFileOutputStreamWithMode(NativeIO.java:559)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:219)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:209)
at org.apache.hadoop.fs.RawLocalFileSystem.createOutputStreamWithMode(RawLocalFileSystem.java:307)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:295)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:328)
at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.<init>(ChecksumFileSystem.java:393)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:456)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:435)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:923)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:904)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:801)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:368)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:341)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:292)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2017)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1986)
at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1962)
Earlier, because the hadoop.dll and related files I found online did not match the Hadoop version on the cluster, uploading to HDFS threw a similar error; after I switched to a different version the upload started working, but downloading still fails. My guesses are:
- Version mismatch: the cluster runs hadoop2.6.0-cdh5.9.1 and my program's jars are that same version, but the hadoop.dll and related files I found online were built for Apache Hadoop 2.6.0. Is this a version mismatch, or are the dll files themselves bad? But then why does uploading work?
- I verified that the program can read files from HDFS, but from the stack trace it looks like it fails while creating the output stream on my local machine. Could it be a permissions issue?
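To test the second guess, one thing I'm considering (a sketch only, not yet verified on my setup) is to bypass copyToLocalFile entirely: read from HDFS as before, but write the local file with a plain java.io.FileOutputStream, so the local side never touches Hadoop's RawLocalFileSystem or the native createFileWithMode0 call that appears in the stack trace. The paths are the same ones from my test above; the local file name is my own choice:

```java
import java.io.FileOutputStream;
import java.io.OutputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class ManualGet {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        // Read from HDFS through the cluster filesystem, as in the working upload path.
        FSDataInputStream in = fs.open(new Path("/user/hado_cli/dist/testhdfs.txt"));
        // Write locally with plain java.io, avoiding RawLocalFileSystem and
        // therefore the NativeIO$Windows native code in hadoop.dll entirely.
        OutputStream out = new FileOutputStream("D:\\hdfstest\\testhdfs.txt");
        IOUtils.copyBytes(in, out, 4096, true); // true = close both streams when done
    }
}
```

If this works, it would at least confirm that reading from the cluster is fine and the failure is only in the local write path. Another variant I have seen suggested is the four-argument overload fs.copyToLocalFile(false, src, dst, true), which uses the raw local filesystem and skips the checksum (.crc) machinery, but I have not confirmed whether that avoids the native call on my version.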
Can anyone help me figure this out?