
java.lang.NoClassDefFoundError: org/apache/spark/SparkConf #32

Open
tyx11111 opened this issue Dec 27, 2017 · 4 comments

Comments

@tyx11111

```java
package com.oreilly.learningsparkexamples.java;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class JavaSparkTest {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setMaster("local").setAppName("My App");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // JavaRDD<String> rdd = sc.textFile("D:\工作\20171227\newdesc.xml");
        // JavaRDD<String> lines = rdd.filter(line -> line.contains("desc"));
        // System.out.println(lines.count());
    }
}
```

I cloned the repository directly, and all the dependencies have finished downloading, but it keeps failing with an error saying the jar cannot be found:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkConf
at com.oreilly.learningsparkexamples.java.JavaSparkTest.main(JavaSparkTest.java:12)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkConf
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 6 more
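A `NoClassDefFoundError` like this usually means the Spark jars are not on the *runtime* classpath, even if the code compiled. A quick way to check, independent of Spark itself, is a reflective class lookup; `ClasspathCheck` is a hypothetical helper name, just a sketch for diagnosis:

```java
public class ClasspathCheck {
    // Returns true if the named class can be loaded by the current classloader.
    static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String cls = "org.apache.spark.SparkConf";
        System.out.println(cls + (isOnClasspath(cls) ? " found" : " NOT found"));
    }
}
```

If this prints "NOT found" when run from the same run configuration, the problem is the classpath (e.g. a `provided`-scoped dependency), not the application code.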

The IDE I am using is IntelliJ IDEA.

@qihouying

I am running into the same problem.

@robin-lai

I am running into the same problem.

@summer1897

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>${spark.version}</version>
    <!--<scope>provided</scope>-->
</dependency>
```

Comment out the `scope`, or change `provided` to `compile`, because this jar is needed when the application is finally packaged and submitted.
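Applying that suggestion, the dependency would read as follows (a sketch; it assumes `${spark.version}` is defined in the POM's `<properties>` section, as in the example above):

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>${spark.version}</version>
    <!-- compile is the default scope: the jar is on both the compile and runtime classpath -->
    <scope>compile</scope>
</dependency>
```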

@olashile

java.lang.ClassNotFoundException: org.apache.spark.SparkConf
at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:436)

Still having this issue; has it been solved?

Below is my build.sbt:

```scala
name := "ProjectA"

version := "0.1"

//scalaVersion := "2.13.1"

scalaVersion := "2.11.8"

// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.1"
```
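For comparison, a build.sbt along those lines with the resolution made explicit might look like this (a sketch, not a confirmed fix; note that the `java.base/` frames in the stack trace above indicate a JDK 9+ runtime, while Spark 2.4.x is built and tested against Java 8, so running on Java 8 may also be worth trying):

```scala
name := "ProjectA"

version := "0.1"

// Spark 2.4.x publishes artifacts for Scala 2.11 and 2.12, not 2.13
scalaVersion := "2.11.8"

// %% appends the Scala binary version, so this resolves
// org.apache.spark:spark-core_2.11:2.4.1 onto the compile and run classpath
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.1"
```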
