native

3 people follow this tag

code_xzh

"WEEX Cross-Platform Development in Practice" book giveaway

About the book: In recent years, with the rise of the "big front-end" concept, the boundary between mobile and front-end development has grown increasingly blurred, and a wave of cross-platform mobile frameworks and approaches has emerged: from early Hybrid technologies such as PhoneGap and Ionic to the now familiar React Native, WEEX, and Flutter, all reflecting how front-end technology is reshaping mobile development.

WEEX is a cross-platform mobile framework open-sourced by Alibaba, originally built to address frequent release cycles and multi-platform development. With WEEX, developers can use ordinary web front-end technology to build high-performance, extensible apps with a native experience and deploy them to Android, iOS, and the web. Mastering WEEX can therefore also be a good stepping stone toward joining Alibaba.

As a tutorial that runs from introduction to hands-on practice, the book is divided into nine chapters and aims to give readers a complete picture of WEEX cross-platform development and how to apply it to real projects.

Chapters 1-4 cover the introduction and fundamentals: an overview of WEEX, environment setup, WEEX basics, and the components and modules commonly used in WEEX development. These chapters include many examples and give readers a basic understanding of WEEX.

Chapters 5-8 cover advanced topics: Rax, Vue.js, BindingX, and WEEX Eros. To speed up development, readers are encouraged to use WEEX scaffolds such as WEEX Eros and weexplus directly.

Chapter 9 is a hands-on project that exercises the fundamentals together and rounds out the reader's understanding of WEEX.

Discussion:
1. How well do you know the popular cross-platform technologies (React Native, WEEX, and Flutter)?
2. Which cross-platform technologies have you used in your study or work, and what do you think of them?
3. What is your view of WEEX as Alibaba's open-source cross-platform mobile framework?
4. Beyond your day-to-day work, how do you keep improving your skills?

Join the discussion in the comments for a chance to win a copy of the book. Winners will be chosen based on the quality of their comments and the discussion they spark, so act fast!

Which APIs does In-Person Pay (当面付) support?

A quick question: does In-Person Pay only support the APIs listed in this document: https://docs.open.alipay.com/194/105203/ ? After enabling In-Person Pay, can I call the alipay.trade.app.pay API? I would also like to know: for the orderStr parameter documented at https://myjsapi.alipay.com/jsapi/native/trade-pay.html, besides the query string assembled for an alipay.trade.app.pay request, which other APIs' query strings does it accept?

游客h677xkqjsqs4e

Local Java application on Alibaba Cloud cannot connect to the local ZooKeeper

java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:1.7.0_181]
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:744) ~[na:1.7.0_181]
    at org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1078) ~[zookeeper-3.3.1.jar:3.3.1-942149]
2019-06-23 06:58:24,599 [localhost-startStop-1-SendThread(localhost:2181)] INFO org.apache.zookeeper.ClientCnxn - Opening socket connection to server localhost/127.0.0.1:2181
2019-06-23 06:58:24,600 [localhost-startStop-1-SendThread(localhost:2181)] WARN org.apache.zookeeper.ClientCnxn - Session 0x0 for server null, unexpected error, closing socket connection and attempting reconnect

Connecting manually from the command line works fine:
[zk: localhost:2181(CONNECTED) 1] ls /
[dubbo, testnode, funi, zookeeper]

The Java application and ZooKeeper are on the same server.
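One possible cause (an assumption, not something the log alone proves) is that the JVM resolves "localhost" differently from the CLI client, for example to the IPv6 address ::1 while ZooKeeper listens on IPv4 only. A minimal sketch to see what the JVM actually resolves "localhost" to:

```java
import java.net.InetAddress;

public class ResolveLocalhost {
    public static void main(String[] args) throws Exception {
        // Print every address the JVM resolves "localhost" to. If an IPv6
        // address (::1) comes first while ZooKeeper only listens on IPv4,
        // the connection may be refused even though the CLI client works.
        for (InetAddress addr : InetAddress.getAllByName("localhost")) {
            System.out.println(addr.getHostAddress());
        }
    }
}
```

If ::1 appears first, try connecting with 127.0.0.1 explicitly or run the JVM with -Djava.net.preferIPv4Stack=true.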

caishow2

login.do is the URL of a backend admin login page. After entering the username and password, the page turns into the error below. What is going on?

login.do is the URL of a backend admin login page. After entering the username and password, the page turns into this:

nested exception is org.apache.ibatis.exceptions.PersistenceException:
### Error querying database. Cause: org.springframework.jdbc.CannotGetJdbcConnectionException: Could not get JDBC Connection; nested exception is java.sql.SQLException: Unable to open a test connection to the given database.
JDBC url = jdbc:mysql://127.0.0.1:3306/bt3_oa?autoReconnect=true&useUnicode=true&characterEncoding=UTF-8&useServerPrepStmts=true&rewriteBatchedStatements=true, username = root.
Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception:
------
com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: Could not create connection to database server. Attempted reconnect 3 times. Giving up.
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:395)
    at com.mysql.jdbc.Util.getInstance(Util.java:370)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:999)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:973)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:959)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:904)
    at com.mysql.jdbc.ConnectionImpl.connectWithRetries(ConnectionImpl.java:2379)
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2300)
    at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:818)
    at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:31)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:395)
    at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:400)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:347)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at

The whole trace would not fit, so I only copied about half of it.

游客lplm6xso3kx3e

Error when using Spark Streaming to connect to LogHub. What is the problem?

"main" java.lang.ClassNotFoundException: Failed to find data source: loghub. Please find packages at http://spark.apache.org/third-party-projects.html at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:652) at org.apache.spark.sql.streaming.DataStreamReader.load(DataStreamReader.scala:159) at com.aliyun.emr.examples.sql.streaming.RealtimeComputation.main(RealtimeComputation.java:51) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52) at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:896) at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198) at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228) at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137) at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) Caused by: java.lang.ClassNotFoundException: loghub.DefaultSource at java.net.URLClassLoader.findClass(URLClassLoader.java:381) at java.lang.ClassLoader.loadClass(ClassLoader.java:424) at java.lang.ClassLoader.loadClass(ClassLoader.java:357) at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$27$$anonfun$apply$15.apply(DataSource.scala:635) at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$27$$anonfun$apply$15.apply(DataSource.scala:635) at scala.util.Try$.apply(Try.scala:192) at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$27.apply(DataSource.scala:635) at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$27.apply(DataSource.scala:635) at scala.util.Try.orElse(Try.scala:84) at 
org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:635)
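The ClassNotFoundException usually means the LogHub connector jar is on the IDE classpath but never reaches the spark-submit classpath. A minimal sketch of the lookup that fails (the provider class name is taken from the trace above):

```java
public class LoghubSourceCheck {
    public static void main(String[] args) {
        // Spark maps the short name "loghub" to a provider class and loads
        // it with Class.forName; if the connector jar is missing from the
        // spark-submit classpath, this is exactly the lookup that fails.
        try {
            Class.forName("loghub.DefaultSource");
            System.out.println("loghub connector is on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("loghub connector missing: " + e.getMessage());
        }
    }
}
```

If the class is missing, pass the connector jar to spark-submit with --jars, or build a shaded jar that includes it.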

开源大数据EMR

Notes on creating a cluster with low-spec instance types

Notes on creating a cluster with low-spec instance types

dzdgcs

Must a DingTalk mini program, H5 micro-app, or E-app use Alibaba Cloud RDS to be published?

To publish a DingTalk mini program, H5 micro-app, or E-app, must we use the RDS service provided by Alibaba Cloud? Any guidance would be appreciated.

游客jvqfylvxkbvrg

getAcsResponse(request) 出错 Failed resolution of: Ljavax/xml/bind/DatatypeConverter;

Process: com.live.live, PID: 30455
java.lang.NoClassDefFoundError: Failed resolution of: Ljavax/xml/bind/DatatypeConverter;
    at com.aliyuncs.auth.HmacSHA1Signer.signString(HmacSHA1Signer.java:22)
    at com.aliyuncs.RpcAcsRequest.signRequest(RpcAcsRequest.java:158)
    at com.aliyuncs.DefaultAcsClient.doAction(DefaultAcsClient.java:247)
    at com.aliyuncs.DefaultAcsClient.doAction(DefaultAcsClient.java:180)
    at com.aliyuncs.DefaultAcsClient.doAction(DefaultAcsClient.java:77)
    at com.aliyuncs.DefaultAcsClient.getAcsResponse(DefaultAcsClient.java:106)
    at com.player.player.StsServiceSample.main(StsServiceSample.java:49)
    at com.player.player.AliyunPlayerSkinActivity.requestMpsSts(AliyunPlayerSkinActivity.java:191)
    at com.player.player.AliyunPlayerSkinActivity.onCreate(AliyunPlayerSkinActivity.java:166)
    at android.app.Activity.performCreate(Activity.java:6845)
    at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1119)
    at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2700)
    at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2808)
    at android.app.ActivityThread.-wrap12(ActivityThread.java)
    at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1541)
    at android.os.Handler.dispatchMessage(Handler.java:102)
    at android.os.Looper.loop(Looper.java:165)
    at android.app.ActivityThread.main(ActivityThread.java:6375)
    at java.lang.reflect.Method.invoke(Native Method)
    at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:912)
    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:802)
Caused by: java.lang.ClassNotFoundException: Didn't find class "javax.xml.bind.DatatypeConverter" on path: DexPathList[[zip file "/data/app/com.live.live-1/base.apk"],nativeLibraryDirectories=[/data/app/com.live.live-1/lib/arm64, /data/app/com.live.live-1/base.apk!/lib/arm64-v8a, /system/lib64, /vendor/lib64]]
    at dalvik.system.BaseDexClassLoader.findClass(BaseDexClassLoader.java:74)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:380)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:312)
    at com.aliyuncs.auth.HmacSHA1Signer.signString(HmacSHA1Signer.java:22)
    at com.aliyuncs.RpcAcsRequest.signRequest(RpcAcsRequest.java:158)
    at com.aliyuncs.DefaultAcsClient.doAction(DefaultAcsClient.java:247)
    at com.aliyuncs.DefaultAcsClient.doAction(DefaultAcsClient.java:180)
    at com.aliyuncs.DefaultAcsClient.doAction(DefaultAcsClient.java:77)
    at com.aliyuncs.DefaultAcsClient.getAcsResponse(DefaultAcsClient.java:106)
    at com.player.player.StsServiceSample.main(StsServiceSample.java:49)
    at com.player.player.AliyunPlayerSkinActivity.requestMpsSts(AliyunPlayerSkinActivity.java:191)
    at com.player.player.AliyunPlayerSkinActivity.onCreate(AliyunPlayerSkinActivity.java:166)
    at android.app.Activity.performCreate(Activity.java:6845)
    at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1119)
    at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2700)
    at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2808)
    at android.app.ActivityThread.-wrap12(ActivityThread.java)
    at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1541)
    at android.os.Handler.dispatchMessage(Handler.java:102)
    at android.os.Looper.loop(Looper.java:165)
    at android.app.ActivityThread.main(ActivityThread.java:6375)
    at java.lang.reflect.Method.invoke(Native Method)
    at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:912)
    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:802)

My request parameters should be fine; I don't understand why the call just won't go through.
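The root cause is that Android does not ship javax.xml.bind (it was also removed from the JDK in Java 11), so the SDK's signer cannot resolve DatatypeConverter. A sketch of the signing step only, assuming from the class name that the SDK uses DatatypeConverter just to Base64-encode an HMAC-SHA1 digest, rewritten with java.util.Base64 (available on Java 8 and Android API 26+):

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class HmacSha1Signer {
    // Sign a string with HMAC-SHA1 and Base64-encode the digest using
    // java.util.Base64 instead of javax.xml.bind.DatatypeConverter,
    // which is absent on Android.
    public static String sign(String source, String accessKeySecret) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(new SecretKeySpec(accessKeySecret.getBytes(StandardCharsets.UTF_8), "HmacSHA1"));
        byte[] digest = mac.doFinal(source.getBytes(StandardCharsets.UTF_8));
        return Base64.getEncoder().encodeToString(digest);
    }

    public static void main(String[] args) throws Exception {
        // "GET&..." and "testSecret&" are placeholder inputs, not a real
        // canonicalized request or key.
        System.out.println(sign("GET&...", "testSecret&"));
    }
}
```

In practice the cleaner fixes are to move STS signing to a server instead of calling getAcsResponse from the app, or to add a JAXB dependency that provides javax.xml.bind on Android; the snippet only illustrates the missing operation.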

小六码奴

如何在OSX 10.14上修复Zlib丢失错误?

I hit this "zlib is missing" error while installing a gem for Rails on Ruby; please let me know a solution:

Fetching nokogiri 1.10.2
Installing nokogiri 1.10.2 with native extensions
Gem::Ext::BuildError: ERROR: Failed to build gem native extension.
    current directory: /usr/local/lib/ruby/gems/2.6.0/gems/nokogiri-1.10.2/ext/nokogiri
/usr/local/opt/ruby/bin/ruby -I /usr/local/Cellar/ruby/2.6.2/lib/ruby/2.6.0 -r ./siteconf20190407-34092-u44l37.rb extconf.rb
checking if the C compiler accepts -I /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include/libxml2... yes
checking if the C compiler accepts -Wno-error=unused-command-line-argument-hard-error-in-future... no
Building nokogiri using packaged libraries.
Using mini_portile version 2.4.0
checking for iconv.h... yes
checking for gzdopen() in -lz... no
zlib is missing; necessary for building libxml2

extconf.rb failed
Could not create Makefile due to some reason, probably lack of necessary libraries and/or headers. Check the mkmf.log file for more details. You may need configuration options.

I also reinstalled the OS X 10.14 SDK headers, but it didn't help.

李博 bluemind

Node.js cannot connect to MongoDB

Problem description: MongoDB installed successfully on Mac. Opening localhost:27017 in a browser shows "It looks like you are trying to access MongoDB over HTTP on the native driver port." From the command line, mongo can enter the database and show dbs lists the admin/config/local databases. Connecting to MongoDB from Node:

let MongoClient = require('mongodb').MongoClient;
let url = "mongodb://localhost:27017/test";
MongoClient.connect(url, function(err, db) {
    if (err) throw err;
    console.log('database created');
    db.close();
});

always fails with:

MongoNetworkError: failed to connect to server [localhost:27017] on first connect [MongoNetworkError: getaddrinfo ENOTFOUND localhost localhost:27017]

From what I found, this means mongod is not running, but I did start it from the command line:

mongod --dbpath <custom directory>
mongo  // can run show dbs

This question and the accepted answer below come from the Yunqi community Redis & MongoDB group: https://yq.aliyun.com/articles/690084. Click the link to join the group.

游客aqpqenklvwob2

.so library not found

Code in the Android Application class:

@Override
public void onCreate() {
    super.onCreate();
    QupaiHttpFinal.getInstance().initOkHttpFinal();
    AlivcSdkCore.register(getApplicationContext());
    AlivcSdkCore.setLogLevel(AlivcSdkCore.AlivcLogLevel.AlivcLogDebug);
}

Error log:

java.lang.UnsatisfiedLinkError: dalvik.system.PathClassLoader[DexPathList[[zip file "/data/app/com.media.edit-1/base.apk", zip file "/data/app/com.media.edit-1/split_lib_dependencies_apk.apk", zip file "/data/app/com.media.edit-1/split_lib_slice_0_apk.apk", zip file "/data/app/com.media.edit-1/split_lib_slice_1_apk.apk", zip file "/data/app/com.media.edit-1/split_lib_slice_2_apk.apk", zip file "/data/app/com.media.edit-1/split_lib_slice_3_apk.apk", zip file "/data/app/com.media.edit-1/split_lib_slice_4_apk.apk", zip file "/data/app/com.media.edit-1/split_lib_slice_5_apk.apk", zip file "/data/app/com.media.edit-1/split_lib_slice_6_apk.apk", zip file "/data/app/com.media.edit-1/split_lib_slice_7_apk.apk", zip file "/data/app/com.media.edit-1/split_lib_slice_8_apk.apk", zip file "/data/app/com.media.edit-1/split_lib_slice_9_apk.apk"],nativeLibraryDirectories=[/data/app/com.media.edit-1/lib/arm, /data/app/com.media.edit-1/base.apk!/lib/armeabi-v7a, /data/app/com.media.edit-1/split_lib_dependencies_apk.apk!/lib/armeabi-v7a, /data/app/com.media.edit-1/split_lib_slice_0_apk.apk!/lib/armeabi-v7a, /data/app/com.media.edit-1/split_lib_slice_1_apk.apk!/lib/armeabi-v7a, /data/app/com.media.edit-1/split_lib_slice_2_apk.apk!/lib/armeabi-v7a, /data/app/com.media.edit-1/split_lib_slice_3_apk.apk!/lib/armeabi-v7a, /data/app/com.media.edit-1/split_lib_slice_4_apk.apk!/lib/armeabi-v7a, /data/app/com.media.edit-1/split_lib_slice_5_apk.apk!/lib/armeabi-v7a, /data/app/com.media.edit-1/split_lib_slice_6_apk.apk!/lib/armeabi-v7a, /data/app/com.media.edit-1/split_lib_slice_7_apk.apk!/lib/armeabi-v7a, /data/app/com.media.edit-1/split_lib_slice_8_apk.apk!/lib/armeabi-v7a, /data/app/com.media.edit-1/split_lib_slice_9_apk.apk!/lib/armeabi-v7a, /system/lib, /vendor/lib]]] couldn't find "libalivc_conan.so"
    at java.lang.Runtime.loadLibrary0(Runtime.java:972)
    at java.lang.System.loadLibrary(System.java:1567)
    at com.aliyun.sys.AbstractNativeLoader.loadLocalLibrary(SourceFile:75)
    at com.aliyun.sys.AbstractNativeLoader.<clinit>(SourceFile:34)
    at com.aliyun.sys.AlivcSdkCore.register(SourceFile:52)
    at com.media.edit.App.onCreate(App.java:45)
    at android.app.Instrumentation.callApplicationOnCreate(Instrumentation.java:1032)
    at android.app.ActivityThread.handleBindApplication(ActivityThread.java:5876)
    at android.app.ActivityThread.-wrap3(ActivityThread.java)
    at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1699)
    at android.os.Handler.dispatchMessage(Handler.java:102)
    at android.os.Looper.loop(Looper.java:154)
    at android.app.ActivityThread.main(ActivityThread.java:6682)
    at java.lang.reflect.Method.invoke(Native Method)
    at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1520)
    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1410)
2019-02-28 10:28:02.480 5390-5390/com.media.edit E/AliYunLog: Load .so failed!
java.lang.UnsatisfiedLinkError: dlopen failed: library "libcurl.so" not found
    at java.lang.Runtime.loadLibrary0(Runtime.java:977)
    at java.lang.System.loadLibrary(System.java:1567)
    at com.aliyun.sys.AbstractNativeLoader.loadLocalLibrary(SourceFile:75)
    at com.aliyun.sys.AbstractNativeLoader.<clinit>(SourceFile:44)
    at com.aliyun.sys.AlivcSdkCore.register(SourceFile:52)
    at com.media.edit.App.onCreate(App.java:45)
    at android.app.Instrumentation.callApplicationOnCreate(Instrumentation.java:1032)
    at android.app.ActivityThread.handleBindApplication(ActivityThread.java:5876)
    at android.app.ActivityThread.-wrap3(ActivityThread.java)
    at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1699)
    at android.os.Handler.dispatchMessage(Handler.java:102)
    at android.os.Looper.loop(Looper.java:154)
    at android.app.ActivityThread.main(ActivityThread.java:6682)
    at java.lang.reflect.Method.invoke(Native Method)
    at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1520)
    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1410)
2019-02-28 10:28:02.487 5390-5390/com.media.edit E/art: No implementation found for void com.aliyun.sys.AlivcSdkCore.nativeSetLogLevel(int) (tried Java_com_aliyun_sys_AlivcSdkCore_nativeSetLogLevel and Java_com_aliyun_sys_AlivcSdkCore_nativeSetLogLevel__I)
2019-02-28 10:28:02.488 5390-5390/com.media.edit E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.media.edit, PID: 5390
java.lang.UnsatisfiedLinkError: No implementation found for void com.aliyun.sys.AlivcSdkCore.nativeSetLogLevel(int) (tried Java_com_aliyun_sys_AlivcSdkCore_nativeSetLogLevel and Java_com_aliyun_sys_AlivcSdkCore_nativeSetLogLevel__I)
    at com.aliyun.sys.AlivcSdkCore.nativeSetLogLevel(Native Method)
    at com.aliyun.sys.AlivcSdkCore.setLogLevel(SourceFile:62)
    at com.media.edit.App.onCreate(App.java:46)
    at android.app.Instrumentation.callApplicationOnCreate(Instrumentation.java:1032)
    at android.app.ActivityThread.handleBindApplication(ActivityThread.java:5876)
    at android.app.ActivityThread.-wrap3(ActivityThread.java)
    at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1699)
    at android.os.Handler.dispatchMessage(Handler.java:102)
    at android.os.Looper.loop(Looper.java:154)
    at android.app.ActivityThread.main(ActivityThread.java:6682)
    at java.lang.reflect.Method.invoke(Native Method)
    at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1520)
    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1410)
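The later JNI "No implementation found" failures cascade from the first missing library: once libalivc_conan.so and libcurl.so fail to load, none of the SDK's native methods are registered. A desktop-runnable sketch of loading each required library defensively so the first missing one surfaces as a single clear message (the library names are taken from the log; whether your APK's armeabi-v7a directory actually contains them is the thing to verify):

```java
public class NativeLoaderCheck {
    // Try to load one native library; report instead of crashing so the
    // first missing .so is visible before any JNI method is called.
    static boolean tryLoad(String lib) {
        try {
            System.loadLibrary(lib);
            return true;
        } catch (UnsatisfiedLinkError e) {
            System.err.println("missing native library: " + lib + " (" + e.getMessage() + ")");
            return false;
        }
    }

    public static void main(String[] args) {
        // "alivc_conan" and "curl" are the libraries named in the log above.
        for (String lib : new String[]{"alivc_conan", "curl"}) {
            tryLoad(lib);
        }
    }
}
```

On the Android side, a common cause (an assumption to check, not something the log proves) is that the SDK's .so files for the armeabi-v7a ABI were excluded from the APK, for example by an abiFilters setting or a packaging option.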

社区小助手

SparkContext fails to start with master set to "yarn"

I am trying to run a SparkContext from the Scala API (in a Play Framework app). When I set the Spark master to "local" it works fine, but when I set the master to "yarn" it throws:

[SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.]

The container logs contain:

Error: Could not find or load main class org.apache.spark.deploy.yarn.ExecutorLauncher

If I run spark-shell --master yarn, the SparkContext starts without any problem. Here is my code:

val sparkS = SparkSession.builder
  .config("spark.hadoop.validateOutputSpecs", "false")
  .config("spark.executor.memory", "4g")
  .config("spark.driver.memory", "3g")
  .config("spark.rpc.message.maxSize", "2047")
  .config("SPARK_DIST_CLASSPATH", "/usr/local/spark/jars/*")
  .config("spark.yarn.archive", "hdfs://localhost:54310/spark-libs.jar")
  .config("spark.yarn.jars", "/usr/local/spark/jars/*")
  .config("spark.executor.extraJavaOptions", "-XX:+PrintGCDetails -Dkey=value -Dnumbers=\"one two three\"")
  .config("spark.executor.extraLibraryPath", "/usr/local/hadoop-2.8.5/lib/native:/usr/local/hadoop-2.8.5/lib/native/Linux-amd64-64")
  .config("HADOOP_CONF_DIR", "/usr/local/hadoop-2.8.5/etc/hadoop")
  .config("spark.yarn.am.waitTime", "1d")
  .master("yarn")
  .getOrCreate

黄二刀

[@炯轩][¥20] How do I integrate React Native into a native Android app?

How do I integrate React Native into a native Android app?

jack胡

[@小川游鱼][¥20] How do I fix the error when running Java: unable to load native library: libjava.jnilib

How do I fix the error when running Java: unable to load native library: libjava.jnilib

flink小助手

java.lang.ClassNotFoundException: com.mongodb.hadoop.mapred.MongoInputFormat when running Apache Flink

I am building a Flink system in Java that reads from and writes to MongoDB, using mongo-hadoop-core 1.3.2 and Apache Flink 1.6 on JDK 1.8.0_181. When I run the source from Eclipse everything works, but whenever I run the JAR file with the flink command I get this error. Here is my pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.apache.flink</groupId>
    <artifactId>test-mongodb-2</artifactId>
    <version>1.0-SNAPSHOT</version>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.0.0</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <artifactSet>
                                <excludes>
                                    <exclude>com.google.code.findbugs:jsr305</exclude>
                                    <exclude>org.slf4j:*</exclude>
                                    <exclude>log4j:*</exclude>
                                </excludes>
                            </artifactSet>
                            <filters>
                                <filter>
                                    <!-- Do not copy the signatures in the META-INF folder.
                                         Otherwise, this might cause SecurityExceptions when using the JAR. -->
                                    <artifact>*:*</artifact>
                                    <excludes>
                                        <exclude>META-INF/*.SF</exclude>
                                        <exclude>META-INF/*.DSA</exclude>
                                        <exclude>META-INF/*.RSA</exclude>
                                    </excludes>
                                </filter>
                            </filters>
                            <transformers>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                    <mainClass>my.programs.main.clazz</mainClass>
                                </transformer>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

    <dependencies>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-java</artifactId>
            <version>1.6.0</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java_2.11</artifactId>
            <version>1.6.0</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-core</artifactId>
            <version>1.6.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.11</artifactId>
            <version>1.6.0</version>
        </dependency>
        <dependency>
            <groupId>org.mongodb.mongo-hadoop</groupId>
            <artifactId>mongo-hadoop-core</artifactId>
            <version>1.3.2</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-hadoop-compatibility_2.11</artifactId>
            <version>1.6.0</version>
        </dependency>
    </dependencies>
</project>

But after I changed the mongo-hadoop-core version to 1.3.2, it produces a different error:

java.lang.RuntimeException: Could not look up the main(String[]) method from the class MongoDBExample: org/apache/flink/api/java/hadoop/mapred/HadoopInputFormat
    at org.apache.flink.client.program.PackagedProgram.hasMainMethod(PackagedProgram.java:499)
    at org.apache.flink.client.program.PackagedProgram.<init>(PackagedProgram.java:218)
    at org.apache.flink.client.program.PackagedProgram.<init>(PackagedProgram.java:128)
    at org.apache.flink.client.cli.CliFrontend.buildProgram(CliFrontend.java:856)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:206)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1044)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1120)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
    at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1120)
Caused by: java.lang.NoClassDefFoundError: org/apache/flink/api/java/hadoop/mapred/HadoopInputFormat
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
    at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
    at java.lang.Class.getMethod0(Class.java:3018)
    at java.lang.Class.getMethod(Class.java:1784)
    at org.apache.flink.client.program.PackagedProgram.hasMainMethod(PackagedProgram.java:493)
    ... 11 more
Caused by: java.lang.ClassNotFoundException: org.apache.flink.api.java.hadoop.mapred.HadoopInputFormat
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 17 more

Where exactly is this going wrong?

flink小助手

Running the generated jar file in Apache Flink

I am currently trying to run my first Flink application. I have tested the Java file (KMeans.java) in the IDE and it works fine, but I cannot get this class to run as a jar from the command line. The build with mvn clean package succeeds, but when I run the jar with flink run -c KMeans name.jar, I get the following error message:

The program finished with the following exception:
org.apache.flink.client.program.ProgramInvocationException: The program's entry point class 'KMeans' was not found in the jar file.
    at org.apache.flink.client.program.PackagedProgram.loadMainClass(PackagedProgram.java:617)
    at org.apache.flink.client.program.PackagedProgram.<init>(PackagedProgram.java:199)
    at org.apache.flink.client.cli.CliFrontend.buildProgram(CliFrontend.java:856)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:206)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1044)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1120)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.

So I looked in the generated target folder, and there is a KMeans.class file in the classes folder. What am I doing wrong?
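flink run -c expects the fully qualified class name; if KMeans.class sits inside a package in the jar, the bare name "KMeans" will not be found. When -c is omitted, Flink falls back to the jar's Main-Class manifest attribute instead. A small sketch that round-trips that attribute (the name org.example.KMeans is hypothetical):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.util.jar.Attributes;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;
import java.util.jar.Manifest;

public class ManifestCheck {
    // Write a tiny jar whose manifest declares a Main-Class, then read the
    // attribute back: the same attribute `flink run` falls back to when no
    // -c option is given.
    static String roundTripMainClass(String mainClass) throws Exception {
        Manifest mf = new Manifest();
        mf.getMainAttributes().put(Attributes.Name.MANIFEST_VERSION, "1.0");
        mf.getMainAttributes().put(Attributes.Name.MAIN_CLASS, mainClass);

        File jar = File.createTempFile("demo", ".jar");
        try (JarOutputStream out = new JarOutputStream(new FileOutputStream(jar), mf)) {
            // no class entries needed for this demo
        }
        try (JarFile jf = new JarFile(jar)) {
            return jf.getManifest().getMainAttributes().getValue(Attributes.Name.MAIN_CLASS);
        } finally {
            jar.delete();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("Main-Class: " + roundTripMainClass("org.example.KMeans"));
    }
}
```

Running jar tf name.jar shows the real package path of KMeans.class; pass that dotted name to -c, or have the build plugin write it into the manifest as Main-Class.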

社区小助手

Spark cannot connect to Ignite using the JDBC thin driver

I am using Java 8, Spark 2.1.1, Ignite 2.5, and BoneCP 0.8.0. The Maven pom.xml looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>test</groupId>
    <artifactId>ignite-tester</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
        <maven.compiler.target>1.8</maven.compiler.target>
        <maven.compiler.source>1.8</maven.compiler.source>
        <java.version>1.8</java.version>
        <kafka.version>0.10.1.2.6.2.0-205</kafka.version>
        <spark.version>2.1.1.2.6.2.0-205</spark.version>
    </properties>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-assembly-plugin</artifactId>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                        <configuration>
                            <archive>
                                <manifest>
                                    <addClasspath>true</addClasspath>
                                    <mainClass>spark.IgniteTester</mainClass>
                                </manifest>
                            </archive>
                            <descriptorRefs>
                                <descriptorRef>jar-with-dependencies</descriptorRef>
                            </descriptorRefs>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

    <dependencies>
        <dependency>
            <groupId>org.apache.ignite</groupId>
            <artifactId>ignite-core</artifactId>
            <version>2.5.0</version>
        </dependency>
        <dependency>
            <groupId>com.jolbox</groupId>
            <artifactId>bonecp</artifactId>
            <version>0.8.0.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql-kafka-0-10_2.11</artifactId>
            <version>${spark.version}</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>${spark.version}</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.11</artifactId>
            <version>${spark.version}</version>
            <scope>compile</scope>
        </dependency>
    </dependencies>
</project>

My project compiles into a "fat" jar that contains all dependencies, but running the following code on the Spark cluster:

public static void main(String[] args) {
    try {
        Class.forName("org.apache.ignite.IgniteJdbcThinDriver").newInstance();
        BoneCPConfig config = new BoneCPConfig();
        config.setJdbcUrl("jdbc:ignite:thin://myhost:10840;user=myusername;password=mypassword");
        pool = new BoneCP(config);
    } catch (Exception e) {
        logger.error("could not load Ignite driver", e);
        return;
    }
}

results in the following exception:

ERROR IgniteTester: could not load Ignite driver
java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:ignite:thin://myhost:10840;user=myusername;password=mypassword, username = null. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception:
------
java.sql.SQLException: No suitable driver found for jdbc:ignite:thin://myhost:10840;user=myusername;password=mypassword
    at java.sql.DriverManager.getConnection(DriverManager.java:689)
    at java.sql.DriverManager.getConnection(DriverManager.java:208)
    at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
    at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
    at spark.IgniteTester.main(IgniteTester.java:56)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:751)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at com.jolbox.bonecp.PoolUtil.generateSQLException(PoolUtil.java:192)
    at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:422)
    at spark.IgniteTester.main(IgniteTester.java:56)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:751)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.sql.SQLException: No suitable driver found for jdbc:ignite:thin://myhost:10840;user=myusername;password=mypassword
    at java.sql.DriverManager.getConnection(DriverManager.java:689)
    at java.sql.DriverManager.getConnection(DriverManager.java:208)
    at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
    at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
    ... 10 more

The submit script looks like this:

spark-submit \
  --class spark.IgniteTester \
  --master yarn \
  --deploy-mode master \
  --driver-memory 1g \
  --executor-cores 1 \
  --num-executors 1 \
  --executor-memory 1664mb \
  ignite-tester.jar

With a "local" Spark instance, it connects to Ignite using the thin JDBC driver without problems.
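"No suitable driver" means DriverManager polled the drivers registered in that JVM and none accepted the URL, which on a cluster usually indicates that IgniteJdbcThinDriver never reached (or was never registered in) the driver or executor JVM. A sketch of the probe; no Ignite jar is present here, so the probe is expected to fail:

```java
import java.sql.DriverManager;
import java.sql.SQLException;

public class DriverCheck {
    // Ask DriverManager for a connection. Every *registered* driver is
    // polled and, when none accepts the URL, the call fails with
    // "No suitable driver found for <url>".
    static String probe(String url) {
        try {
            DriverManager.getConnection(url);
            return "connected";
        } catch (SQLException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(probe("jdbc:ignite:thin://myhost:10840"));
    }
}
```

If the class is on the classpath, an explicit Class.forName("org.apache.ignite.IgniteJdbcThinDriver") before building the pool registers it; it is also worth checking that the fat-jar packaging preserved ignite-core's META-INF/services/java.sql.Driver entry (an assumption about the failure mode, not something the log confirms).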

flink小助手

Apache Flink (stable release 1.6.2) does not work

"最近,apache flink的稳定版本(1.6.2)发布了。我读了这个指令。但是当我运行以下命令时: ./bin/flink run examples/streaming/SocketWindowWordCount.jar --port 9000我收到以下错误:The program finished with the following exception:org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 264564a337d4c6705bde681b34010d28) at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:268) at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:486) at org.apache.flink.streaming.api.environment.StreamContextEnvironment.execute(StreamContextEnvironment.java:66) at org.apache.flink.streaming.examples.socket.SocketWindowWordCount.main(SocketWindowWordCount.java:92) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529) at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421) at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:426) at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:816) at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:290) at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:216) at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1053) at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1129) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836) at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41) at 
org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1129) Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed. at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146) at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:265) ... 20 more Caused by: java.net.ConnectException: Connection refused (Connection refused) at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206) at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188) at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392) at java.net.Socket.connect(Socket.java:589) at org.apache.flink.streaming.api.functions.source.SocketTextStreamFunction.run(SocketTextStreamFunction.java:96) at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:94) at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:58) at org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:99) at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:300) at org.apache.flink.runtime.taskmanager.Task.run(Task.java:711) at java.lang.Thread.run(Thread.java:748) 我找到了这个链接:当我按照flink-1.4的快速入门并使用“./bin/flink run examples / streaming / SocketWindowWordCount.jar --port 9000”时,Flink程序无法提交。但是,它没有帮助。我试着用Flink 1.6.2 with Hadoop® 2.8以及Flink 1.5.5 with Hadoop® 2.8上mac os和ubuntu。但我得到了同样的错误。 "

李博 bluemind

[Flink] Failover shows TaskManager lost/killed

Problem description: you hit an exception like the following:

java.lang.Exception: The assigned slot: SimpleSlot (1) - container_e05_1505041177764_25978_01_000033 @ hdpet2mainse011132138130.et2.tbsite.net (dataPort=59317) - ALLOCATED/ALIVE is asked to release from TaskManager: container_e05_1505041177764_25978_01_000033 @ hdpet2mainse011132138130.et2.tbsite.net (dataPort=59317), probably due to TaskManager lost/killed
    at org.apache.flink.runtime.instance.SimpleSlot.releaseSlot(SimpleSlot.java:217)
    at org.apache.flink.runtime.instance.SlotPool.releaseTaskManager(SlotPool.java:699)
    at sun.reflect.GeneratedMethodAccessor44.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:183)
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:135)
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.access$000(AkkaRpcActor.java:72)
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor$1.apply(AkkaRpcActor.java:110)
    at akka.actor.ActorCell$$anonfun$become$1.applyOrElse(ActorCell.scala:534)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
    at akka.actor.UntypedActor.aroundReceive(UntypedActor.scala:97)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
    at akka.actor.ActorCell.invoke(ActorCell.scala:487)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
    at akka.dispatch.Mailbox.run(Mailbox.scala:220)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

Cause: some operators in the job either never requested native memory or requested too little, so the container was killed by YARN. Typically the state_size of group by or join operators was left unconfigured; it defaults to 0, in which case no native memory is requested.