
Error while starting the collector


Error while starting the collector

Wenlong Pu
Hello,
 
I am installing Chukwa, and the agent is running OK.
When I try to run the collector with "${CHUKWA_HOME}/bin/chukwa collector",
I get the error below and the terminal hangs.
 
 
My Hadoop cluster is running fine: the namenode, datanode, jobtracker, and tasktracker are all up.
 
 
 
Thanks!

 
Terminal output:
 
[hadoop@master chukwa-0.4.0]$
2011-11-17 13:12:38.022::INFO:  Logging to STDERR via org.mortbay.log.StdErrLog
2011-11-17 13:12:38.117::INFO:  jetty-6.1.11
 
collector.log:
 
2011-11-17 13:12:38,212 INFO main ChukwaConfiguration - chukwaConf is /usr/local/chukwa-0.4.0/bin/../conf
2011-11-17 13:12:38,882 INFO main root - initing servletCollector
2011-11-17 13:12:38,890 INFO main PipelineStageWriter - using pipelined writers, pipe length is 2
2011-11-17 13:12:38,900 INFO Thread-6 SocketTeeWriter - listen thread started
2011-11-17 13:12:38,903 INFO main SeqFileWriter - rotateInterval is 300000
2011-11-17 13:12:38,903 INFO main SeqFileWriter - outputDir is /chukwa/logs/
2011-11-17 13:12:38,903 INFO main SeqFileWriter - fsname is hdfs://192.168.1.120:54310
2011-11-17 13:12:38,904 INFO main SeqFileWriter - filesystem type from core-default.xml is org.apache.hadoop.hdfs.DistributedFileSystem
2011-11-17 13:12:39,099 ERROR main SeqFileWriter - can't connect to HDFS, trying default file system instead (likely to be local)
java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
        at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:37)
        at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:34)
        at org.apache.hadoop.security.UgiInstrumentation.create(UgiInstrumentation.java:51)
        at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:196)
        at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:159)
        at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:216)
        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:409)
        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:395)
        at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1418)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1319)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:226)
        at org.apache.hadoop.chukwa.datacollection.writer.SeqFileWriter.init(SeqFileWriter.java:123)
        at org.apache.hadoop.chukwa.datacollection.writer.PipelineStageWriter.init(PipelineStageWriter.java:88)
        at org.apache.hadoop.chukwa.datacollection.collector.servlet.ServletCollector.init(ServletCollector.java:112)
        at org.mortbay.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:433)
        at org.mortbay.jetty.servlet.ServletHolder.doStart(ServletHolder.java:256)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
        at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:616)
        at org.mortbay.jetty.servlet.Context.startContext(Context.java:140)
        at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:513)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
        at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130)
        at org.mortbay.jetty.Server.doStart(Server.java:222)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
        at org.apache.hadoop.chukwa.datacollection.collector.CollectorStub.main(CollectorStub.java:121)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        ... 25 more
And this is my core-site.xml of my hadoop
 
<configuration>
  <property>
        <name>fs.default.name</name>
        <value>hdfs://192.168.1.120:54310</value>
  </property>
</configuration>

 
----------------------------------------------------------------------------
 
Best Regards,

Wenlong Pu   
 
 


Re: Error while starting the collector

Ahmed Fathalla
We just had a user on this group report the same problem. According to him, Hadoop 0.20.203.0 wasn't working, but things worked when he switched to Hadoop 0.20.2.

Can you try doing that?


--
Ahmed Fathalla

Re: Error while starting the collector

TARIQ
In reply to this post by Wenlong Pu
Hi Wenlong,
  Which version of Chukwa and Hadoop are you using?

Regards,
    Mohammad Tariq




Re: Error while starting the collector

Eric Yang-3
In reply to this post by Wenlong Pu
It looks like you are using Hadoop 0.20.2xx.  You will need to have
commons-configuration*.jar from HADOOP_HOME/lib or
HADOOP_HOME/share/hadoop/lib on the collector's classpath for the
collector to work.  Hope this helps.
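For reference, Eric's fix can be sketched as a small shell snippet. All
paths here are assumptions based on the defaults seen in this thread;
adjust HADOOP_HOME and CHUKWA_HOME for your layout:

```shell
# Hypothetical paths -- adjust for your installation.
HADOOP_HOME=${HADOOP_HOME:-/usr/local/hadoop}
CHUKWA_HOME=${CHUKWA_HOME:-/usr/local/chukwa-0.4.0}

# Locate the Commons Configuration jar shipped with Hadoop
# (newer layouts put it under share/hadoop/lib instead of lib).
jar=$(find "$HADOOP_HOME/lib" "$HADOOP_HOME/share/hadoop/lib" \
        -name 'commons-configuration*.jar' 2>/dev/null | head -n 1)

if [ -n "$jar" ]; then
    # Copying it into Chukwa's lib directory puts it on the
    # collector's classpath at the next start.
    cp "$jar" "$CHUKWA_HOME/lib/"
else
    echo "commons-configuration*.jar not found under $HADOOP_HOME" >&2
fi
```

After copying the jar, restart the collector so the new classpath entry
is picked up.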

regards,
Eric


Re: Error while starting the collector

3con
Eric, what do you mean by this:

You will need to have commons-configuration*.jar from HADOOP_HOME/lib

I do not see a commons-configuration*.jar in HADOOP_HOME/lib. Where do I move or place this jar, and what are the next steps? Any help will be greatly appreciated. I am using CDH3 Hadoop 0.20.2-cdh3u4 and Chukwa 0.4.0, and I am getting the same error when trying to connect to HDFS. Is there a compatibility issue between these versions?

Thank you.
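As a starting point, one way to hunt for the jar on a CDH3 install is to
check the common package locations. The directory list below is an
assumption (CDH3 packages typically install under /usr/lib/hadoop);
extend it for your setup:

```shell
# Search likely Hadoop lib directories for the Commons Configuration jar.
# The candidate directories are guesses for CDH3-style installs.
for dir in "${HADOOP_HOME:-/usr/local/hadoop}/lib" \
           /usr/lib/hadoop/lib \
           /usr/lib/hadoop-0.20/lib; do
    if [ -d "$dir" ]; then
        find "$dir" -name 'commons-configuration*.jar'
    fi
done
```

If the jar turns up nowhere, it can also be downloaded from the Apache
Commons Configuration project and dropped into CHUKWA_HOME/lib.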