Why am I getting this exception: java.lang.NoClassDefFoundError?

Time: 2021-01-21 08:24:21

I am trying to use HBase and Hadoop together. When I run the JAR file, I get a java.lang.NoClassDefFoundError. Here is my source code:

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.mapreduce.Job;

public class TwitterTable {

    final static Charset ENCODING = StandardCharsets.UTF_8;
    final static String FILE_NAME = "/home/hduser/project04/sample.txt";

    public static class Mapper1 extends TableMapper<ImmutableBytesWritable, IntWritable>
    {
        byte[] value;

        @Override
        public void map(ImmutableBytesWritable row, Result values, Context context) throws IOException 
        {
            // Read the value stored in the "text" column family (empty qualifier)
            value = values.getValue(Bytes.toBytes("text"), Bytes.toBytes(""));
            String valueStr = Bytes.toString(value);
            System.out.println("GET: " + valueStr);
            // Nothing is written to the context, so this map emits no output pairs
        }
    }

    // Note: this reducer is never wired into the job (no initTableReducerJob call), so it is unused
    public static class Reducer1 extends TableReducer<ImmutableBytesWritable, IntWritable, ImmutableBytesWritable> {

        public void reduce(ImmutableBytesWritable key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {

        }
    }

    public static void main( String args[] ) throws IOException, ClassNotFoundException, InterruptedException 
    {
        Configuration conf = new Configuration();

        Job job = Job.getInstance(conf, "TwitterTable");
        job.setJarByClass(TwitterTable.class);

        // Create the table with a single "text" column family if it does not exist yet
        HTableDescriptor ht = new HTableDescriptor(TableName.valueOf("twitter"));
        ht.addFamily(new HColumnDescriptor("text"));
        HBaseAdmin hba = new HBaseAdmin(conf);

        if (!hba.tableExists("twitter"))
        {
            hba.createTable(ht);
            System.out.println("Table Created!");
        }
        hba.close();

        // Read the file and add it to the table (the call is currently commented out below)
        TwitterTable getText = new TwitterTable();

        Scan scan = new Scan();
        String columns = "text"; 
        scan.addColumn(Bytes.toBytes(columns), Bytes.toBytes(""));


        // Map-only job: feed rows from the "twitter" table through Mapper1 using the scan above
        TableMapReduceUtil.initTableMapperJob("twitter", scan, Mapper1.class, ImmutableBytesWritable.class,
                IntWritable.class, job);

        job.waitForCompletion(true);

        //getText.readTextFile(FILE_NAME);
    }

    void readTextFile(String aFileName) throws IOException
    {
        Path path = Paths.get(aFileName);
        try (BufferedReader reader = Files.newBufferedReader(path, ENCODING)) {
            String line = null;
            while ((line = reader.readLine()) != null) {
                // Parse each line and write it to the table
                addToTable(line);
            }
        }
        System.out.println("all done!");
    }

    void addToTable(String line) throws IOException
    {
        Configuration conf = new Configuration();
        HTable table = new HTable(conf, "twitter");

        // Expected line format: "row","text" (quoted, comma-separated)
        String[] fields = line.split(",");

        String row = fields[0].replace("\"", "");
        String text = fields[1].replace("\"", "");

        Put put = new Put(Bytes.toBytes(row));
        put.addColumn(Bytes.toBytes("text"), Bytes.toBytes(""), Bytes.toBytes(text));
        table.put(put);
        table.flushCommits();
        table.close();   
    }   
}
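
A side note on the error itself: a NoClassDefFoundError at run time means a class that was present at compile time cannot be found on the runtime classpath. Besides fixing the client classpath, TableMapReduceUtil can ship the HBase jars with the job so that map tasks on the cluster can also load them. A minimal sketch, assuming the HBase jars are at least on the client classpath at submit time (it uses the initTableMapperJob overload that takes an addDependencyJars flag, in place of the 6-argument call in main above):

        // Same call as in main, but with addDependencyJars = true so the
        // HBase/ZooKeeper jars are packaged and shipped with the job
        TableMapReduceUtil.initTableMapperJob("twitter", scan, Mapper1.class,
                ImmutableBytesWritable.class, IntWritable.class, job, true);

        // Alternatively, keep the original call and add the jars explicitly:
        TableMapReduceUtil.addDependencyJars(job);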

I added the classpath to hadoop-env.sh, but still no luck. I don't know what the problem is. Here is the classpath from my hadoop-env.sh:

export HADOOP_CLASSPATH=\
/usr/lib/hbase/hbase-1.0.0/lib/hbase-common-1.0.0.jar:\
/usr/lib/hbase/hbase-1.0.0/lib/hbase-client.jar:\
/usr/lib/hbase/hbase-1.0.0/lib/log4j-1.2.17.jar:\
/usr/lib/hbase/hbase-1.0.0/lib/hbase-it-1.0.0.jar:\
/usr/lib/hbase/hbase-1.0.0/lib/hbase-common-1.0.0-tests.jar:\
/usr/lib/hbase/hbase-1.0.0/conf:\
/usr/lib/hbase/hbase-1.0.0/lib/zookeeper-3.4.6.jar:\
/usr/lib/hbase/hbase-1.0.0/lib/protobuf-java-2.5.0.jar:\
/usr/lib/hbase/hbase-1.0.0/lib/guava-12.0.1.jar
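
As an aside, rather than listing individual jars by hand, HBase can print the classpath its MapReduce clients need. A sketch, assuming the hbase launcher script lives under the install directory shown above and that this HBase version provides the mapredcp command:

# Let HBase compute the jar list; append the conf directory as before
export HADOOP_CLASSPATH="$(/usr/lib/hbase/hbase-1.0.0/bin/hbase mapredcp):/usr/lib/hbase/hbase-1.0.0/conf"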

1 Answer

#1


OK, I found it. Maybe you cannot add everything to the classpath that way. In that case, copy all the libraries from HBase into a directory that Hadoop already puts on its classpath (see hadoop-env.sh):

HADOOP_DIR/contrib/capacity-scheduler
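
For example, a sketch of the copy step (the source path is taken from the classpath above; HADOOP_DIR stands for the Hadoop install directory, as in the answer). This works because the stock hadoop-env.sh appends every jar under contrib/capacity-scheduler to HADOOP_CLASSPATH:

# Copy the HBase libraries into a directory hadoop-env.sh already scans
cp /usr/lib/hbase/hbase-1.0.0/lib/*.jar $HADOOP_DIR/contrib/capacity-scheduler/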

It worked for me.
