
Spark SQL and Hive cause "Error: Could not find or load main class"

I developed a Spark application in Scala that communicates with Hive. It works fine inside IntelliJ IDEA, but when I build a jar with all dependencies I get the error

Error: Could not find or load main class

After further debugging, I found that the error appears whenever I include Spark SQL or Hive. This is a Maven project; my pom.xml is below.
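(For context, a minimal sketch of the kind of entry point involved; the actual application code was not posted. It assumes a main object spk.kspark, the class named as mainClass in the answer below, and it assumes the spark-hive dependency is on the classpath:)

package spk

import org.apache.spark.sql.SparkSession

// Hypothetical sketch only: the real application code was not posted.
// Assumes the (commented-out) spark-hive_2.11 dependency is re-enabled,
// because enableHiveSupport() requires spark-hive at runtime.
object kspark {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder()
      .appName("spk")
      .enableHiveSupport()
      .getOrCreate()

    // Communicate with Hive through Spark SQL.
    spark.sql("SHOW TABLES").show()

    spark.stop()
  }
}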

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

<modelVersion>4.0.0</modelVersion>

<groupId>spk</groupId>
<artifactId>spk</artifactId>
<version>1.0-SNAPSHOT</version>

<dependencies>

<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.2</version>
</dependency>

<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.0.2</version>
</dependency>

<!--<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.11</artifactId>
    <version>2.0.2</version>
</dependency>-->

<!--<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-exec</artifactId>
    <version>1.2.2</version>
</dependency>-->

<!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>2.0.1</version>
</dependency>

<!-- https://mvnrepository.com/artifact/com.101tec/zkclient -->
<dependency>
    <groupId>com.101tec</groupId>
    <artifactId>zkclient</artifactId>
    <version>0.11</version>
</dependency>

<!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.10.0.0</version>
</dependency>

</dependencies>

<build>
<sourceDirectory>src/main/scala</sourceDirectory>
<plugins>
    <plugin>
        <groupId>org.scala-tools</groupId>
        <artifactId>maven-scala-plugin</artifactId>
        <version>2.11</version>
        <executions>
            <execution>
                <goals>
                    <goal>compile</goal>
                    <goal>testCompile</goal>
                </goals>
                <configuration>
                    <args>
                        <!--<arg>-make:transitive</arg>-->
                        <arg>-dependencyfile</arg>
                        <arg>${project.build.directory}/.scala_dependencies</arg>
                    </args>
                </configuration>
            </execution>
        </executions>
    </plugin>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.11</version>
        <configuration>
            <useFile>false</useFile>
            <disableXmlReport>true</disableXmlReport>
            <!-- If you have classpath issue like NoDefClassError,... -->
            <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
            <includes>
                <include>**/*Test.*</include>
                <include>**/*Suite.*</include>
            </includes>
        </configuration>
    </plugin>
</plugins>
</build>
</project>
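(Note: nothing in the build section above produces a runnable jar; it only compiles the Scala sources and configures Surefire, so a jar built from it carries no Main-Class manifest entry. That is the gap the answer below closes with the maven-assembly-plugin.)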

社区小助手 2018-12-06 15:49:23
1 Answer
  • 社区小助手 is the administrator of the Spark China community. I regularly post livestream recaps and other useful articles, and I also consolidate the Spark questions and answers raised in the DingTalk group.

    The IntelliJ IDEA build with all dependencies had some problems. I got it working with the following steps:

    1) Add to pom.xml:

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-assembly-plugin</artifactId>
            <version>3.1.0</version>
            <configuration>
                <archive>
                    <manifest>
                        <mainClass>spk.kspark</mainClass>
                    </manifest>
                </archive>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
            </configuration>
        </plugin>

    2) Go to the project directory and open an Ubuntu terminal

    3) mvn install

    4) mvn clean compile assembly:single or mvn package assembly:single (a quick check of the result is sketched below)
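    (A hedged way to check and run the result; the path target/spk-1.0-SNAPSHOT-jar-with-dependencies.jar is assumed from the pom coordinates and the jar-with-dependencies descriptor above:)

        jar tf target/spk-1.0-SNAPSHOT-jar-with-dependencies.jar | grep kspark
        spark-submit --class spk.kspark --master local[*] target/spk-1.0-SNAPSHOT-jar-with-dependencies.jar

    The first command confirms that spk/kspark.class actually landed in the fat jar; the second names the main class explicitly via --class, so it runs even if the manifest entry were missing.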

    2019-07-17 23:18:35