
Hello everyone,

I want to run some MapReduce jobs with HBase locally in TOS for Big Data 7.0 on Windows 10 x64 with Cloudera 5.13.0. I don't have Hadoop installed on my PC.

I have created a Maven project with Hadoop 2.6.0 on another machine, and I added it to my Job as a dependency of a user routine. I can successfully connect to HBase and add new tables, but when I try to submit the MapReduce job I receive the following error:
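For reference, the dependency section of such a Maven project might look like the sketch below. The Hadoop version is the one named above; the HBase artifact versions are assumptions for a CDH 5.13-era setup and should be matched to your cluster:

```xml
<dependencies>
  <!-- Hadoop client APIs (MapReduce job submission) -->
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.6.0</version>
  </dependency>
  <!-- HBase client and MapReduce integration (version assumed) -->
  <dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>1.2.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-server</artifactId>
    <version>1.2.0</version>
  </dependency>
</dependencies>
```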

Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
	at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Native Method)
	at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode(NativeIO.java:524)
	at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:478)
	at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:532)
	at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:509)
	at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:312)
	at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:133)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:144)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Unknown Source)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)

I have found some suggested solutions on the web.

I put hadoop.dll, winutils.exe, and the other files from this link into the C:\winutils\bin folder, added that folder to PATH, and set HADOOP_HOME as shown in the figure. I also put hadoop.dll into the C:\Windows\System32 folder. But I still get the same error. Can anybody help me resolve this problem?
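One thing worth knowing here: a JVM only sees environment variables that existed when it started, so if the Studio was already running when PATH and HADOOP_HOME were set, it will not pick them up. As a sketch of a common workaround (the C:\winutils path is the one used above; adjust to your layout), the hadoop.home.dir system property can be set programmatically:

```java
import java.io.File;

public class WinutilsSetup {
    /**
     * Points Hadoop's native-code loader at a local winutils install by
     * setting the "hadoop.home.dir" system property. Returns true only
     * if winutils.exe is actually present under <hadoopHome>\bin.
     */
    public static boolean configureHadoopHome(String hadoopHome) {
        System.setProperty("hadoop.home.dir", hadoopHome);
        return new File(hadoopHome, "bin" + File.separator + "winutils.exe").isFile();
    }

    public static void main(String[] args) {
        // Path used in this thread; adjust to your own layout.
        boolean ok = configureHadoopHome("C:\\winutils");
        System.out.println("winutils.exe found: " + ok);
    }
}
```

Note that org.apache.hadoop.util.Shell reads this property in a static initializer, so the call has to happen before the first Hadoop class is loaded.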

Hello,

When you unzip the winutils download from https://github.com/steveloughran/winutils, do you see many files under the bin directory?

Copy the hadoop.dll file and place it in the C:\Windows\System32 directory.

To run a Big Data batch job using "Use local mode" in Spark, we need to set up HADOOP_HOME as a system environment variable:

1. Click on Path.
2. Click the Edit button.
3. Add the Hadoop path as below:
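After editing the system variables (and restarting the Studio so the new JVM actually inherits them), it can help to verify what the JVM sees. A small diagnostic sketch, using only the file names mentioned in the steps above:

```java
import java.io.File;

public class HadoopHomeCheck {
    /** Returns a short diagnostic describing whether HADOOP_HOME looks usable. */
    public static String diagnose(String hadoopHome) {
        if (hadoopHome == null || hadoopHome.isEmpty()) {
            return "HADOOP_HOME is not set";
        }
        File bin = new File(hadoopHome, "bin");
        if (!new File(bin, "winutils.exe").isFile()) {
            return "winutils.exe missing under " + bin;
        }
        if (!new File(bin, "hadoop.dll").isFile()) {
            return "hadoop.dll missing under " + bin;
        }
        return "OK";
    }

    public static void main(String[] args) {
        // Reads the variable set in the steps above; a JVM started before
        // the variable was edited will still see the old (or no) value.
        System.out.println(diagnose(System.getenv("HADOOP_HOME")));
    }
}
```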

Best regards

Sabrina

Hello,

Thank you for your response.

I did all these steps; the error message became "incompatible version of winutils":

java.io.IOException: Cannot run program "C:\winutils\bin\winutils.exe":
CreateProcess error=216, This version of %1 is not compatible with the version of Windows currently running. Check the system information of your computer, then contact the software publisher.
	at java.lang.ProcessBuilder.start(Unknown Source)
	at org.apache.hadoop.util.Shell.runCommand(Shell.java:523)
	at org.apache.hadoop.util.Shell.run(Shell.java:482)
	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:776)
	at org.apache.hadoop.util.Shell.execCommand(Shell.java:869)
	at org.apache.hadoop.util.Shell.execCommand(Shell.java:852)
	at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:733)
	at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:491)
	at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:532)
	at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:509)
	at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:312)
	at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:133)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:144)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Unknown Source)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
	at routines.HBase_Hadoop_Operation.ManyScan_SingleJob_PerTable_operation(HBase_Hadoop_Operation.java:371)
	at routines.HBase_Transformation.ProcessQuery(HBase_Transformation.java:71)
	at local_project.contextexpl_0_1.contextexpl.tTuto_1Process(contextexpl.java:726)
	at local_project.contextexpl_0_1.contextexpl.tHBaseConnection_1Process(contextexpl.java:547)
	at local_project.contextexpl_0_1.contextexpl.runJobInTOS(contextexpl.java:1425)
	at local_project.contextexpl_0_1.contextexpl.main(contextexpl.java:1225)
Caused by: java.io.IOException: CreateProcess error=216, Cette version de %1 n’est pas compatible avec la version de Windows actuellement exécutée. Vérifiez dans les informations système de votre ordinateur, puis contactez l’éditeur de logiciel
	at java.lang.ProcessImpl.create(Native Method)
	at java.lang.ProcessImpl.<init>(Unknown Source)
	at java.lang.ProcessImpl.start(Unknown Source)
	... 25 more
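Error 216 on 64-bit Windows usually means the winutils.exe being launched is the wrong architecture (a 32-bit build) or is not a real executable at all — for example, a GitHub HTML page saved as winutils.exe instead of the raw binary. One way to check, sketched here in plain Java with no Hadoop dependency, is to read the machine field of the file's PE header:

```java
import java.io.IOException;
import java.io.RandomAccessFile;

public class PeArchCheck {
    /** Reads the PE header machine field: 0x014c = x86, 0x8664 = x64, -1 = not a PE file. */
    public static int machineType(String exePath) throws IOException {
        try (RandomAccessFile f = new RandomAccessFile(exePath, "r")) {
            if (f.readUnsignedByte() != 'M' || f.readUnsignedByte() != 'Z') {
                return -1; // not an executable (e.g. an HTML error page saved as .exe)
            }
            f.seek(0x3C);                                     // offset of the PE header pointer
            int peOffset = Integer.reverseBytes(f.readInt()); // PE files are little-endian
            f.seek(peOffset + 4);                             // skip the "PE\0\0" signature
            int lo = f.readUnsignedByte();
            int hi = f.readUnsignedByte();
            return (hi << 8) | lo;                            // COFF Machine field
        }
    }

    public static void main(String[] args) throws IOException {
        // Path used in this thread; adjust to your own layout.
        int machine = machineType("C:\\winutils\\bin\\winutils.exe");
        if (machine == 0x8664)     System.out.println("winutils.exe is 64-bit");
        else if (machine == 0x14c) System.out.println("winutils.exe is 32-bit (won't run x64 Hadoop natives)");
        else                       System.out.println("not a valid Windows executable");
    }
}
```

If this reports 32-bit or "not a valid Windows executable", re-download winutils.exe and hadoop.dll for your Hadoop version from the raw binaries in the winutils repository rather than via the web page view.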
[statistics] disconnected