Spark Troubleshooting: java.lang.NoSuchMethodError: io.netty.channel.DefaultFileRegion.<init>(Ljava/io/File;JJ)V

I have been working with Zeppelin recently. Zeppelin is a web-based notebook that provides interactive data analysis and visualization.

On the backend it can connect to multiple data processing engines, such as Spark and Hive.

While running Spark on Zeppelin I hit several exceptions; this post summarizes one of them.

Related reading:

Spark Troubleshooting: java.io.InvalidClassException: org.apache.commons.lang3.time.FastDateParser; local class incompatible: stream classdesc serialVersionUID = 2, local class serialVersionUID = 3

Spark Troubleshooting: java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD


Environment:

OS: CentOS 7.2

Zeppelin: 0.8.0

Spark: Apache Spark 2.1.0

Problem description:

When running a Spark program in a Zeppelin notebook, the task is submitted, but the job fails just as it is about to execute. The exception is as follows (a minimal paragraph that reproduces it is shown right after the stack trace):

ERROR [2018-07-12 01:09:06,126] ({rpc-server-3-4} TransportRequestHandler.java[operationComplete]:201) - Error sending result StreamResponse{streamId=/jars/spark-interpreter-0.8.0-SNAPSHOT.jar, byteCount=14033895, body=FileSegmentManagedBuffer{file=/data/software/test/zeppelin-0.8.0/interpreter/spark/spark-interpreter-0.8.0-SNAPSHOT.jar, offset=0, length=14033895}} to /192.168.1.40:37456; closing connection
io.netty.handler.codec.EncoderException: java.lang.NoSuchMethodError: io.netty.channel.DefaultFileRegion.<init>(Ljava/io/File;JJ)V
	at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:107)
	at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:658)
	at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:716)
	at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:651)
	at io.netty.handler.timeout.IdleStateHandler.write(IdleStateHandler.java:266)
	at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:658)
	at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:716)
	at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:706)
	at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:741)
	at io.netty.channel.DefaultChannelPipeline.writeAndFlush(DefaultChannelPipeline.java:895)
	at io.netty.channel.AbstractChannel.writeAndFlush(AbstractChannel.java:240)
	at org.apache.spark.network.server.TransportRequestHandler.respond(TransportRequestHandler.java:194)
	at org.apache.spark.network.server.TransportRequestHandler.processStreamRequest(TransportRequestHandler.java:150)
	at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:111)
	at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
	at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
	at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NoSuchMethodError: io.netty.channel.DefaultFileRegion.<init>(Ljava/io/File;JJ)V
	at org.apache.spark.network.buffer.FileSegmentManagedBuffer.convertToNetty(FileSegmentManagedBuffer.java:133)
	at org.apache.spark.network.protocol.MessageEncoder.encode(MessageEncoder.java:58)
	at org.apache.spark.network.protocol.MessageEncoder.encode(MessageEncoder.java:33)
	at io.netty.handler.codec.MessageToMessageEncoder.write(MessageToMessageEncoder.java:89)
	... 34 more
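For context, the stream that fails to send in the log above is the interpreter jar itself (/jars/spark-interpreter-0.8.0-SNAPSHOT.jar), which every executor has to fetch from the driver, so any paragraph that runs a Spark action reproduces the error. A hypothetical minimal paragraph (the code itself is irrelevant; it only has to trigger a job):

%spark
// Any action that makes the executors fetch the interpreter jar
// triggers the failing transfer shown in the log above.
val nums = sc.parallelize(1 to 100)
println(nums.count())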

The key line of the exception is:

java.lang.NoSuchMethodError: io.netty.channel.DefaultFileRegion.<init>(Ljava/io/File;JJ)V

Exception analysis:

A NoSuchMethodError like this is usually caused by a jar conflict; in this case the conflict is between netty jars.
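To confirm which netty jar the interpreter JVM actually loaded, and whether it has the constructor Spark needs, a reflection check helps. This is a minimal sketch, assuming it runs on the same classpath as the Zeppelin Spark interpreter (for example in a %spark paragraph or a small standalone program):

import io.netty.channel.DefaultFileRegion

// Which jar supplied the class?
val loadedFrom = classOf[DefaultFileRegion].getProtectionDomain.getCodeSource.getLocation
println(s"DefaultFileRegion loaded from: $loadedFrom")

// Does this version have the DefaultFileRegion(File, long, long) constructor
// that Spark's network code expects?
val hasFileCtor =
  try {
    classOf[DefaultFileRegion].getConstructor(
      classOf[java.io.File], java.lang.Long.TYPE, java.lang.Long.TYPE)
    true
  } catch {
    case _: NoSuchMethodException => false
  }
println(s"DefaultFileRegion(File, long, long) present: $hasFileCtor")

If this prints a jar under zeppelin/lib and false for the constructor, you are hitting exactly the mismatch described here.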

The netty jar shipped with Spark is netty-all-4.0.42.Final.jar, while the netty jar under zeppelin/lib is a different version that does not provide the DefaultFileRegion(File, long, long) constructor.

Because the two versions are inconsistent, the copy that wins on the interpreter's classpath breaks Spark's network code.
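You can also compare what the two sides ship on disk. A minimal sketch; the Zeppelin path is taken from the log above, and the SPARK_HOME fallback is an assumption, so adjust both to your installation:

import java.io.File

// List the netty jars in a directory (empty result if the directory is missing).
def nettyJars(dir: String): Seq[String] =
  Option(new File(dir).listFiles())
    .getOrElse(Array.empty[File])
    .collect { case f if f.getName.toLowerCase.contains("netty") => f.getName }
    .toSeq

val zeppelinLib = "/data/software/test/zeppelin-0.8.0/lib"
val sparkJars   = sys.env.getOrElse("SPARK_HOME", "/usr/local/spark") + "/jars"

println("Zeppelin netty jars: " + nettyJars(zeppelinLib).mkString(", "))
println("Spark netty jars:    " + nettyJars(sparkJars).mkString(", "))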

Solution:

Replace the netty jar under zeppelin/lib with Spark's netty-all-4.0.42.Final.jar, then restart Zeppelin. After that, re-running the notebook works and the exception is gone.
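After the restart, a quick re-run of the reflection check from the analysis section should print a location ending in netty-all-4.0.42.Final.jar and resolve the constructor without throwing. A one-paragraph re-check (same assumptions as above):

// Both lines should now succeed: the second one throws NoSuchMethodException
// only if the old, incompatible netty jar is still on the classpath.
println(classOf[io.netty.channel.DefaultFileRegion]
  .getProtectionDomain.getCodeSource.getLocation)
println(classOf[io.netty.channel.DefaultFileRegion]
  .getConstructor(classOf[java.io.File], java.lang.Long.TYPE, java.lang.Long.TYPE))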

Reference for this fix: https://stackoverflow.com/questions/44801089/why-does-spark-fail-with-java-lang-nosuchmethoderror-io-netty-channel-defaultf


If this post helped you, give it a like!

赫墨拉

I am a newbie who loves big data; this blog is where I share my growth and experience.
