The Hadoop Eclipse plug-in (installed in the Eclipse IDE or Spring Tool Suite) eases the experience of developing Map/Reduce programs on Hadoop. The Hadoop distribution does not ship a pre-built Hadoop Eclipse plug-in jar, but it does include the plug-in's source code.
In this post, we'll go through all the steps needed to build the Hadoop Eclipse plug-in from source and install it in Eclipse / Spring Tool Suite, so that the Hadoop environment can be accessed from the IDE's "Map/Reduce" perspective.
Navigate to <Hadoop-Installation-Directory>/src/contrib/eclipse-plugin. Three files in this directory will be modified: build.properties, build.xml and META-INF/MANIFEST.MF.
1. Try Ant Build
Let us first see what happens if we try to build the Hadoop Eclipse plug-in from the provided source code as-is.
Terminal
static-217:hadoop srccodes$ cd /Users/srccodes/hadoop/src/contrib/eclipse-plugin
static-217:eclipse-plugin srccodes$ ant jar
Buildfile: /Users/srccodes/hadoop/src/contrib/eclipse-plugin/build.xml

check-contrib:
     [echo] eclipse.home unset: skipping eclipse plugin

init:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: /Users/srccodes/hadoop/ivy/ivy-2.1.0.jar
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:configure] :: loading settings :: file = /Users/srccodes/hadoop/ivy/ivysettings.xml

ivy-resolve-common:

ivy-retrieve-common:
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = /Users/srccodes/hadoop/ivy/ivysettings.xml

compile:

jar:

BUILD SUCCESSFUL
Total time: 2 seconds
The echo message above indicates that eclipse.home is not set, so the build of the plug-in is skipped.
2. Modify build.properties
Open build.properties in a text editor and set eclipse.home to your Eclipse / STS installation directory. Also define the versions of the different jars required by the Hadoop Eclipse plug-in.
build.properties
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

output.. = bin/
bin.includes = META-INF/,\
               plugin.xml,\
               resources/,\
               classes/,\
               classes/,\
               lib/

# set eclipse installation path
eclipse.home=/Users/srccodes/software/springsource/sts-3.2.0.RELEASE

# version of different jars required by Hadoop eclipse plugin
version=1.2.0
commons-cli.version=1.2
commons-configuration.version=1.6
commons-httpclient.version=3.0.1
commons-lang.version=2.4
jackson-core-asl.version=1.8.8
jackson-mapper-asl.version=1.8.8
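The version numbers above must match the jars that actually ship with your Hadoop distribution. As a quick sanity check (a hedged sketch; the path /Users/srccodes/hadoop is the installation used in this post and may differ on your machine), list the relevant jars and compare their version suffixes with build.properties:
Terminal
# hadoop-core jar sits at the root of the distribution
ls /Users/srccodes/hadoop/hadoop-core-*.jar
# dependency jars whose versions appear in build.properties live under lib/
ls /Users/srccodes/hadoop/lib/ | grep -E 'commons-(cli|configuration|httpclient|lang)|jackson-(core|mapper)-asl'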
3. Try Ant Build Once Again
Now eclipse.home is set, but we are not done yet. To find out why, try the ant build once again. This time we get Java compilation errors, as shown below, because the Hadoop classes are not yet on the plug-in's compilation classpath.
Terminal
static-166:eclipse-plugin srccodes$ ant jar
Buildfile: /Users/srccodes/hadoop/src/contrib/eclipse-plugin/build.xml

check-contrib:

init:
     [echo] contrib: eclipse-plugin

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: /Users/srccodes/hadoop/ivy/ivy-2.1.0.jar
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:configure] :: loading settings :: file = /Users/srccodes/hadoop/ivy/ivysettings.xml

ivy-resolve-common:

ivy-retrieve-common:
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = /Users/srccodes/hadoop/ivy/ivysettings.xml

compile:
     [echo] contrib: eclipse-plugin
    [javac] /Users/srccodes/hadoop/src/contrib/eclipse-plugin/build.xml:62: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds
    [javac] Compiling 45 source files to /Users/srccodes/hadoop/build/contrib/eclipse-plugin/classes
    [javac] /Users/srccodes/hadoop/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/DFSFolder.java:28: package org.apache.hadoop.fs does not exist
    [javac] import org.apache.hadoop.fs.FileStatus;
    [javac] ^
    [javac] /Users/srccodes/hadoop/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/DFSFolder.java:29: package org.apache.hadoop.fs does not exist
    [javac] import org.apache.hadoop.fs.Path;
    [javac] ^
    [javac] /Users/srccodes/hadoop/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/DFSPath.java:25: package org.apache.hadoop.hdfs does not exist
    [javac] import org.apache.hadoop.hdfs.DistributedFileSystem;
    [javac] ^
    :
    :
    :
    :
    [javac] ^
    [javac] /Users/srccodes/hadoop/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/server/HadoopServer.java:291: cannot find symbol
    [javac] symbol  : class Configuration
    [javac] location: class org.apache.hadoop.eclipse.server.HadoopServer
    [javac]   public Configuration getConfiguration() {
    [javac] ^
    [javac] /Users/srccodes/hadoop/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/server/ConfProp.java:24: package org.apache.hadoop.conf does not exist
    [javac] import org.apache.hadoop.conf.Configuration;
    [javac] ^
    [javac] /Users/srccodes/hadoop/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/server/HadoopServer.java:468: cannot find symbol
    [javac] symbol  : class FileSystem
    [javac] location: class org.apache.hadoop.eclipse.server.HadoopServer
    [javac]   public FileSystem getDFS() throws IOException {
    [javac] ^
    :
    :
    :
    :
    [javac] /Users/srccodes/hadoop/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/server/HadoopJob.java:302: package Counters does not exist
    [javac]       Counters.Group group = counters.getGroup(groupName);
    [javac] ^
    [javac] /Users/srccodes/hadoop/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/server/HadoopJob.java:305: package Counters does not exist
    [javac]       for (Counters.Counter counter : group) {
    [javac] ^
    [javac] /Users/srccodes/hadoop/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/dfs/DFSFile.java:74: cannot find symbol
    [javac] symbol  : class FileStatus
    [javac] location: class org.apache.hadoop.eclipse.dfs.DFSFile
    [javac]     FileStatus fs = getDFS().getFileStatus(path);
    [javac] ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 100 errors

BUILD FAILED
/Users/srccodes/hadoop/src/contrib/eclipse-plugin/build.xml:62: Compile failed; see the compiler error output for details.

Total time: 3 seconds
4. Modify build.xml
Open build.xml and modify the path (id="classpath") and the target (name="jar") as shown below. The classpath now also picks up the hadoop-core jar from ${hadoop.root}, which resolves the compilation errors above, and the jar target copies the required dependency jars into lib/ before packaging the plug-in.
build.xml
<?xml version="1.0" encoding="UTF-8" standalone="no"?>

<!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->

<project default="jar" name="eclipse-plugin">

  <import file="../build-contrib.xml"/>

  <path id="eclipse-sdk-jars">
    <fileset dir="${eclipse.home}/plugins/">
      <include name="org.eclipse.ui*.jar"/>
      <include name="org.eclipse.jdt*.jar"/>
      <include name="org.eclipse.core*.jar"/>
      <include name="org.eclipse.equinox*.jar"/>
      <include name="org.eclipse.debug*.jar"/>
      <include name="org.eclipse.osgi*.jar"/>
      <include name="org.eclipse.swt*.jar"/>
      <include name="org.eclipse.jface*.jar"/>
      <include name="org.eclipse.team.cvs.ssh2*.jar"/>
      <include name="com.jcraft.jsch*.jar"/>
    </fileset>
  </path>

  <!-- Override classpath to include Eclipse SDK jars -->
  <path id="classpath">
    <pathelement location="${build.classes}"/>
    <pathelement location="${hadoop.root}/build/classes"/>
    <path refid="eclipse-sdk-jars"/>
    <fileset dir="${hadoop.root}">
      <include name="*.jar"/>
    </fileset>
  </path>

  <!-- Skip building if eclipse.home is unset. -->
  <target name="check-contrib" unless="eclipse.home">
    <property name="skip.contrib" value="yes"/>
    <echo message="eclipse.home unset: skipping eclipse plugin"/>
  </target>

  <target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">
    <echo message="contrib: ${name}"/>
    <javac
      encoding="${build.encoding}"
      srcdir="${src.dir}"
      includes="**/*.java"
      destdir="${build.classes}"
      debug="${javac.debug}"
      deprecation="${javac.deprecation}">
      <classpath refid="classpath"/>
    </javac>
  </target>

  <!-- Override jar target to specify manifest -->
  <target name="jar" depends="compile" unless="skip.contrib">
    <mkdir dir="${build.dir}/lib"/>
    <copy file="${hadoop.root}/hadoop-core-${version}.jar" tofile="${build.dir}/lib/hadoop-core.jar" verbose="true"/>
    <copy file="${hadoop.root}/lib/commons-cli-${commons-cli.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.root}/lib/commons-configuration-${commons-configuration.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.root}/lib/commons-httpclient-${commons-httpclient.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.root}/lib/commons-lang-${commons-lang.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.root}/lib/jackson-core-asl-${jackson-core-asl.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.root}/lib/jackson-mapper-asl-${jackson-mapper-asl.version}.jar" todir="${build.dir}/lib" verbose="true"/>
    <jar jarfile="${build.dir}/hadoop-${name}-${version}.jar" manifest="${root}/META-INF/MANIFEST.MF">
      <fileset dir="${build.dir}" includes="classes/ lib/"/>
      <fileset dir="${root}" includes="resources/ plugin.xml"/>
    </jar>
  </target>

</project>
5. Modify MANIFEST.MF
Open MANIFEST.MF in a text editor and modify Bundle-ClassPath as shown below, so that the dependency jars copied into lib/ by build.xml end up on the plug-in's classpath.
META-INF/MANIFEST.MF
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-Name: MapReduce Tools for Eclipse
Bundle-SymbolicName: org.apache.hadoop.eclipse;singleton:=true
Bundle-Version: 0.18
Bundle-Activator: org.apache.hadoop.eclipse.Activator
Bundle-Localization: plugin
Require-Bundle: org.eclipse.ui,
 org.eclipse.core.runtime,
 org.eclipse.jdt.launching,
 org.eclipse.debug.core,
 org.eclipse.jdt,
 org.eclipse.jdt.core,
 org.eclipse.core.resources,
 org.eclipse.ui.ide,
 org.eclipse.jdt.ui,
 org.eclipse.debug.ui,
 org.eclipse.jdt.debug.ui,
 org.eclipse.core.expressions,
 org.eclipse.ui.cheatsheets,
 org.eclipse.ui.console,
 org.eclipse.ui.navigator,
 org.eclipse.core.filesystem,
 org.apache.commons.logging
Eclipse-LazyStart: true
Bundle-ClassPath: classes/,
 lib/hadoop-core.jar,
 lib/commons-cli-1.2.jar,
 lib/commons-configuration-1.6.jar,
 lib/jackson-core-asl-1.8.8.jar,
 lib/commons-httpclient-3.0.1.jar,
 lib/jackson-mapper-asl-1.8.8.jar,
 lib/commons-lang-2.4.jar
Bundle-Vendor: Apache Hadoop
6. Build Hadoop Eclipse Plug-in
Execute ant jar (or ant clean package). On a successful build, the Hadoop Eclipse plug-in jar (hadoop-eclipse-plugin-1.2.0.jar) is created inside <Hadoop-Installation-Directory>/build/contrib/eclipse-plugin/.
Terminal
static-217:eclipse-plugin srccodes$ ant jar
Buildfile: /Users/srccodes/hadoop/src/contrib/eclipse-plugin/build.xml

check-contrib:

init:
     [echo] contrib: eclipse-plugin
    [mkdir] Created dir: /Users/srccodes/hadoop/build/contrib/eclipse-plugin
    [mkdir] Created dir: /Users/srccodes/hadoop/build/contrib/eclipse-plugin/classes
    [mkdir] Created dir: /Users/srccodes/hadoop/build/contrib/eclipse-plugin/test
    [mkdir] Created dir: /Users/srccodes/hadoop/build/contrib/eclipse-plugin/system
    [mkdir] Created dir: /Users/srccodes/hadoop/build/contrib/eclipse-plugin/system/classes
    [mkdir] Created dir: /Users/srccodes/hadoop/build/contrib/eclipse-plugin/examples
    [mkdir] Created dir: /Users/srccodes/hadoop/build/contrib/eclipse-plugin/test/logs

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: /Users/srccodes/hadoop/ivy/ivy-2.1.0.jar
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:configure] :: loading settings :: file = /Users/srccodes/hadoop/ivy/ivysettings.xml

ivy-resolve-common:

ivy-retrieve-common:
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = /Users/srccodes/hadoop/ivy/ivysettings.xml

compile:
     [echo] contrib: eclipse-plugin
    [javac] /Users/srccodes/hadoop/src/contrib/eclipse-plugin/build.xml:64: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds
    [javac] Compiling 45 source files to /Users/srccodes/hadoop/build/contrib/eclipse-plugin/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

jar:
    [mkdir] Created dir: /Users/srccodes/hadoop/build/contrib/eclipse-plugin/lib
     [copy] Copying 1 file to /Users/srccodes/hadoop/build/contrib/eclipse-plugin/lib
     [copy] Copying /Users/srccodes/hadoop/hadoop-core-1.2.0.jar to /Users/srccodes/hadoop/build/contrib/eclipse-plugin/lib/hadoop-core.jar
     [copy] Copying 1 file to /Users/srccodes/hadoop/build/contrib/eclipse-plugin/lib
     [copy] Copying /Users/srccodes/hadoop/lib/commons-cli-1.2.jar to /Users/srccodes/hadoop/build/contrib/eclipse-plugin/lib/commons-cli-1.2.jar
     [copy] Copying 1 file to /Users/srccodes/hadoop/build/contrib/eclipse-plugin/lib
     [copy] Copying /Users/srccodes/hadoop/lib/commons-configuration-1.6.jar to /Users/srccodes/hadoop/build/contrib/eclipse-plugin/lib/commons-configuration-1.6.jar
     [copy] Copying 1 file to /Users/srccodes/hadoop/build/contrib/eclipse-plugin/lib
     [copy] Copying /Users/srccodes/hadoop/lib/commons-httpclient-3.0.1.jar to /Users/srccodes/hadoop/build/contrib/eclipse-plugin/lib/commons-httpclient-3.0.1.jar
     [copy] Copying 1 file to /Users/srccodes/hadoop/build/contrib/eclipse-plugin/lib
     [copy] Copying /Users/srccodes/hadoop/lib/commons-lang-2.4.jar to /Users/srccodes/hadoop/build/contrib/eclipse-plugin/lib/commons-lang-2.4.jar
     [copy] Copying 1 file to /Users/srccodes/hadoop/build/contrib/eclipse-plugin/lib
     [copy] Copying /Users/srccodes/hadoop/lib/jackson-core-asl-1.8.8.jar to /Users/srccodes/hadoop/build/contrib/eclipse-plugin/lib/jackson-core-asl-1.8.8.jar
     [copy] Copying 1 file to /Users/srccodes/hadoop/build/contrib/eclipse-plugin/lib
     [copy] Copying /Users/srccodes/hadoop/lib/jackson-mapper-asl-1.8.8.jar to /Users/srccodes/hadoop/build/contrib/eclipse-plugin/lib/jackson-mapper-asl-1.8.8.jar
      [jar] Building jar: /Users/srccodes/hadoop/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-1.2.0.jar

BUILD SUCCESSFUL
Total time: 4 seconds
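To double-check that the dependency jars were actually bundled, you can list the contents of the freshly built plug-in jar (a hedged sketch; it assumes the JDK's jar tool is on your PATH and uses the build path from this post):
Terminal
# expect classes/, lib/hadoop-core.jar, the commons-* and jackson-* jars, plugin.xml and the manifest
jar tf /Users/srccodes/hadoop/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-1.2.0.jar | grep -E '^(lib/|plugin\.xml|META-INF/MANIFEST\.MF)'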
7. Install Hadoop Eclipse Plug-in
Copy the Hadoop Eclipse plug-in jar (hadoop-eclipse-plugin-1.2.0.jar) into the <Eclipse-Installation-Directory>/plugins directory, then start (or restart, if already running) the Eclipse / Spring Tool Suite IDE.
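For example, with the STS installation directory used earlier in build.properties (paths are illustrative; substitute your own IDE installation directory). If the plug-in does not show up after a restart, launching the IDE once with the -clean option can help refresh its cached plug-in metadata.
Terminal
# copy the freshly built plug-in into the IDE's plugins directory
cp /Users/srccodes/hadoop/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-1.2.0.jar \
   /Users/srccodes/software/springsource/sts-3.2.0.RELEASE/plugins/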
8. Check Hadoop Eclipse Plug-in
Navigate to Window --> Open Perspective --> Other and select the Map/Reduce perspective. You'll be able to see the Map/Reduce Locations window, where a New Hadoop Location can be added.
Now the Eclipse environment for Hadoop is ready for you to explore.