
java.rmi.RemoteException: no protocol exception (Java has no coroutines)



If you want to understand the java.rmi.RemoteException: no protocol exception (and, along the way, why Java has no coroutines), this article is for you. We examine the exception from several angles, explain how the related pieces work, and walk through concrete cases. Hopefully it helps!

Contents of this article:

  • java.rmi.RemoteException: no protocol exception (Java has no coroutines)
  • How are Dubbo RMI and Spring RMI combined?
  • Exception in thread "RMI TCP Connection (idle)" java.lang.OutOfMemoryError: Java heap space
  • Notes on resolving org.apache.hadoop.ipc.RemoteException(java.io.IOException) in HBase
  • Hive JDBC: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.sec...

java.rmi.RemoteException: no protocol exception (Java has no coroutines)

How to resolve the java.rmi.RemoteException: no protocol exception?

We are trying to connect to a SOAP endpoint, https://ABC.u1.app.cloud.net/OrderManagementService/15.09.wsdl, and we see the full exception below. There are plenty of posts about java.net.MalformedURLException: no protocol, but that is not quite the exception I am seeing.

I have also read a few blog posts suggesting the application simply cannot read the URL. We are logging the URL, though, so the application definitely has it. Any suggestions?

The exception I actually see is javax.xml.rpc.JAXRPCException: no protocol:, and I have found no clear explanation of what it really means.

Thanks!

Code block:

OrderManagementServiceSoapBindingStub stubOMS = new OrderManagementServiceSoapBindingStub();

// Attach a WS-Security username-token credential provider to the stub.
List<CredentialProvider> credProviders = new ArrayList<CredentialProvider>();
CredentialProvider cp = new ClientUNTCredentialProvider(username.getBytes(), password.getBytes());
credProviders.add(cp);
stubOMS._setProperty(WSSecurityContext.CREDENTIAL_PROVIDER_LIST, credProviders);

// Build the service proxy from the WSDL URL and submit the order.
OrderManagementServicePortProxy proxy = new OrderManagementServicePortProxy(wsdlUrl);
proxy.submitOrder(order);

Log:

2020-09-10 14:54:59,689 INFO  [OMSUtility]([ACTIVE] ExecuteThread: '2' for queue: 'weblogic.kernel.Default (self-tuning)' for workmanager: PortOut@null@default):: Other exception :javax.xml.rpc.JAXRPCException: no protocol:  https://om-api-gateway-int.u1.app.cloud.comcast.net/OrderManagementService/15.09.wsdl
2020-09-10 14:54:59,689 INFO  [AdminJdbcControlImpl]([ACTIVE] ExecuteThread: '2' for queue: 'weblogic.kernel.Default (self-tuning)' for workmanager: PortOut@null@default):: Exception occured while calling OMS : Exception is :
java.rmi.RemoteException: no protocol:  https://om-api-gateway-int.u1.app.cloud.comcast.net/OrderManagementService/15.09.wsdl
    at com.comcast.tpp.pat.utils.OMSUtility.submitOrderToOMS(OMSUtility.java:117)
    at com.comcast.tpp.pat.utils.OMSUtility.callOMStoCreateWO(OMSUtility.java:425)
    at com.comcast.tpp.pat.customcontrol.AdminJdbcControlImpl.startPostValidations(AdminJdbcControlImpl.java:223)
    at com.comcast.tpp.pat.customcontrol.AdminJdbcControlBean.startPostValidations(AdminJdbcControlBean.java:727)
    at Controller.processLsrData(Controller.java:2866)
    at Controller.submitLSR(Controller.java:1254)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.beehive.netui.pageflow.FlowController.invokeActionMethod(FlowController.java:870)
    at org.apache.beehive.netui.pageflow.FlowController.getActionMethodForward(FlowController.java:809)
    at org.apache.beehive.netui.pageflow.FlowController.internalExecute(FlowController.java:478)
    at org.apache.beehive.netui.pageflow.PageFlowController.internalExecute(PageFlowController.java:306)
    at org.apache.beehive.netui.pageflow.FlowController.execute(FlowController.java:336)
    at org.apache.beehive.netui.pageflow.internal.FlowControllerAction.execute(FlowControllerAction.java:52)
    at org.apache.struts.action.RequestProcessor.processActionPerform(RequestProcessor.java:419)
    at org.apache.beehive.netui.pageflow.PageFlowRequestProcessor.access$201(PageFlowRequestProcessor.java:97)
    at org.apache.beehive.netui.pageflow.PageFlowRequestProcessor$ActionRunner.execute(PageFlowRequestProcessor.java:2044)
    at org.apache.beehive.netui.pageflow.interceptor.action.internal.ActionInterceptors.wrapAction(ActionInterceptors.java:91)
    at org.apache.beehive.netui.pageflow.PageFlowRequestProcessor.processActionPerform(PageFlowRequestProcessor.java:2116)
    at org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:224)
    at org.apache.beehive.netui.pageflow.PageFlowRequestProcessor.processInternal(PageFlowRequestProcessor.java:556)
    at org.apache.beehive.netui.pageflow.PageFlowRequestProcessor.process(PageFlowRequestProcessor.java:853)
    at org.apache.beehive.netui.pageflow.AutoRegisteractionServlet.process(AutoRegisteractionServlet.java:631)
    at org.apache.beehive.netui.pageflow.PageFlowActionServlet.process(PageFlowActionServlet.java:158)
    at org.apache.struts.action.ActionServlet.doPost(ActionServlet.java:432)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
    at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
    at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
    at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:301)
    at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:60)
    at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:60)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3748)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3714)
    at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
    at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
    at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2283)
    at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2182)
    at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1491)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:256)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:221)

Workaround

No confirmed fix for this problem has been recorded yet.
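One hedged observation from the log above: the message prints "no protocol:  https://..." with two spaces after the colon, which usually means the URL string itself begins with a whitespace or another invisible character (picked up from a properties file, for instance), so the parser finds nothing recognizable as "https:" at position zero. Below is a minimal sketch of guarding against that before constructing the proxy; the helper class and the trimming step are assumptions of this write-up, not a confirmed fix from the original thread.

    import java.net.MalformedURLException;
    import java.net.URL;

    final class WsdlUrlCheck {
        // Hypothetical helper: strip stray whitespace/control characters
        // (including non-breaking spaces) and fail fast on an unparseable URL.
        static String clean(String wsdlUrl) {
            String cleaned = wsdlUrl.trim().replaceAll("[\\p{Cntrl}\\u00A0]", "");
            try {
                new URL(cleaned); // throws MalformedURLException ("no protocol") on garbage
            } catch (MalformedURLException e) {
                throw new IllegalArgumentException("Bad WSDL URL: [" + cleaned + "]", e);
            }
            return cleaned;
        }
    }

If this diagnosis holds, passing WsdlUrlCheck.clean(wsdlUrl) into new OrderManagementServicePortProxy(...) should either make the failure disappear or surface it as a clear MalformedURLException at startup.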

How are Dubbo RMI and Spring RMI combined?

How are Dubbo RMI and Spring RMI combined? And how does Dubbo's invocation parameter tie into RMI? A sketch follows below.
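The question is not answered in the original thread. What follows is a hedged sketch: to my understanding, Dubbo 2.x implements its rmi protocol on top of Spring's remoting classes, exporting services through RmiServiceExporter on the provider side and building consumer proxies the way RmiProxyFactoryBean does; Dubbo's Invocation (method name plus arguments) is then replayed as a plain Java call on that proxy. OrderService here is a hypothetical interface for illustration.

    import org.springframework.remoting.rmi.RmiProxyFactoryBean;
    import org.springframework.remoting.rmi.RmiServiceExporter;

    public class RmiBridgeSketch {

        public interface OrderService {
            String submit(String order);
        }

        // Provider side: roughly what <dubbo:protocol name="rmi" port="1099"/>
        // triggers internally -- the service is bound into an RMI registry.
        public static RmiServiceExporter export(OrderService impl) throws Exception {
            RmiServiceExporter exporter = new RmiServiceExporter();
            exporter.setServiceName("OrderService");          // registry lookup name
            exporter.setService(impl);                        // the real implementation
            exporter.setServiceInterface(OrderService.class);
            exporter.setRegistryPort(1099);                   // RMI registry port
            exporter.afterPropertiesSet();                    // performs the bind
            return exporter;
        }

        // Consumer side: a dynamic proxy that turns each method call
        // (Dubbo's Invocation, effectively) into an RMI remote call.
        public static OrderService refer(String host) throws Exception {
            RmiProxyFactoryBean factory = new RmiProxyFactoryBean();
            factory.setServiceUrl("rmi://" + host + ":1099/OrderService");
            factory.setServiceInterface(OrderService.class);
            factory.afterPropertiesSet();
            return (OrderService) factory.getObject();
        }
    }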

Exception in thread "RMI TCP Connection (idle)" java.lang.OutOfMemoryError: Java heap space

My program hits this out-of-memory error. Watching it with VisualVM (bundled with the JDK) shows only 60-odd MB of heap in use, yet VisualVM lists many threads named RMI TCP Connection. What causes this error on those threads, and how do I fix it? Are there other memory-analysis tools worth trying?
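No answer was recorded for this one either, so two hedged pointers. First, "RMI TCP Connection" threads belong to the JVM's RMI transport, and a very common source is the JMX-over-RMI connection that monitoring tools such as VisualVM and JConsole open themselves, so those threads may be a side effect of the monitoring rather than the leak. Second, the usual next step is a heap dump: capture one with jmap -dump:live,format=b,file=heap.hprof <pid>, or start the JVM with -XX:+HeapDumpOnOutOfMemoryError, and analyze it in Eclipse MAT or VisualVM. A dump can also be triggered from inside the application; a small sketch (the output file name is a placeholder):

    import com.sun.management.HotSpotDiagnosticMXBean;
    import java.lang.management.ManagementFactory;

    public class HeapDumpSketch {
        public static void main(String[] args) throws Exception {
            HotSpotDiagnosticMXBean mx =
                    ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
            // live = true keeps only reachable objects, like jmap -dump:live
            mx.dumpHeap("rmi-oom.hprof", true);
        }
    }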

Notes on resolving org.apache.hadoop.ipc.RemoteException(java.io.IOException) in HBase

ERROR: Can't get master address from ZooKeeper; znode data == null. Keep in mind this is only the surface symptom; the real problem is:

File /hbase/.tmp/hbase.version could only be replicated to 0 nodes instead of minReplica

Many posts online suggest two fixes:

  • stop/start: restart HBase
  • reformat HDFS with hdfs namenode -format (do not casually format the Hadoop NameNode)

I tried those approaches for an hour or two without finding the problem; in the end the answer was hiding in each application's own log.

HBase startup has plenty of pitfalls, but don't panic: the pits exist mostly because we don't know the system well yet. I spent a whole morning chasing this one. Back in May I could start standalone HBase, Hadoop, and ZooKeeper just fine; I then shut all three down because my Aliyun server was needed for something else, and when I started them again in September they refused to come up.

org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /hbase/.tmp/hbase.version could only be replicated to 0 nodes instead of minReplication (=1).  There are 0 datanode(s) running and no node(s) are excluded in this operation.
        at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1622)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3351)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:683)
        at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.addBlock(AuthorizationProviderProxyClientProtocol.java:214)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:495)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2216)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2212)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1796)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2210)

        at org.apache.hadoop.ipc.Client.call(Client.java:1472)
        at org.apache.hadoop.ipc.Client.call(Client.java:1409)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
        at com.sun.proxy.$Proxy17.addBlock(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:413)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:256)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
        at com.sun.proxy.$Proxy18.addBlock(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.hbase.fs.HFileSystem$1.invoke(HFileSystem.java:279)
        at com.sun.proxy.$Proxy19.addBlock(Unknown Source)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1812)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1608)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:772)
2018-09-22 10:50:56,289 INFO  [app:60000.activeMasterManager] regionserver.HRegionServer: STOPPED: Unhandled exception. Starting shutdown.
2018-09-22 10:50:56,290 INFO  [master/app.server/172.16.216.42:60000] regionserver.HRegionServer: Stopping infoServer
2018-09-22 10:50:56,320 INFO  [master/app.server/172.16.216.42:60000] mortbay.log: Stopped SelectChannelConnector@0.0.0.0:60010

I opened the data cache directories for HBase, Hadoop, and ZooKeeper, and they still held May's data. The nasty part is that a restart does not overwrite the old files by itself. Lesson for next time: don't use kill to stop these processes.

[root@app hbase-1.2.0-cdh5.10.0]# cd data/tmp/
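Before restarting, it is worth verifying what the exception itself claims, namely that zero DataNodes are alive: hdfs dfsadmin -report lists live and dead DataNodes, and jps shows whether the NameNode and DataNode processes are running at all (my own checklist, not a step from the original notes).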

Restart HBase, Hadoop, and ZooKeeper, then open the HBase shell client:

[root@app bin]# ./hbase shell
2018-09-22 11:12:00,809 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
2018-09-22 11:12:03,263 WARN  [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
HBase Shell; enter 'help<RETURN>' for list of supported commands.
Type "exit<RETURN>" to leave the HBase Shell
Version 1.2.0-cdh5.10.0, rUnknown, Fri Jan 20 12:18:02 PST 2017

hbase(main):001:0> list
TABLE                                                                                                                                                                                        
0 row(s) in 0.3760 seconds

=> []
hbase(main):002:0>

Finally, run jps to confirm that every expected process actually started. Mine is a standalone deployment, so treat this list as a reference only.

[root@app tmp]# jps
4336 Jps
2529 HRegionServer
2418 HMaster
2276 QuorumPeerMain
1947 DataNode
2109 SecondaryNameNode
2847 Main
1823 NameNode
[root@app tmp]#

 

Hive JDBC: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.sec...

Today, while using JDBC to work with Hive, I first started Hive's remote service with hiveserver2 & (running in the background). Running the program from Eclipse then produced this error:

java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://192.168.182.11:10000/default: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate anonymous
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:224)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
at java.sql/java.sql.DriverManager.getConnection(DriverManager.java:678)
at java.sql/java.sql.DriverManager.getConnection(DriverManager.java:252)
at demo.jdbc.JDBCUtils.getConnection(JDBCUtils.java:31)
at demo.jdbc.DemoTest.main(DemoTest.java:16)
Caused by: org.apache.hive.service.cli.HiveSQLException: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate anonymous
at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:267)
at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:258)
at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:683)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:200)
... 5 more
Caused by: org.apache.hive.service.cli.HiveSQLException: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate anonymous
at org.apache.hive.service.cli.session.SessionManager.createSession(SessionManager.java:419)
at org.apache.hive.service.cli.session.SessionManager.openSession(SessionManager.java:362)
at org.apache.hive.service.cli.CLIService.openSessionWithImpersonation(CLIService.java:193)
at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:440)
at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:322)
at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1377)
at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1362)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate anonymous
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:89)
at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
at com.sun.proxy.$Proxy37.open(Unknown Source)
at org.apache.hive.service.cli.session.SessionManager.createSession(SessionManager.java:410)
... 13 more
Caused by: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate anonymous
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:606)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:544)
at org.apache.hive.service.cli.session.HiveSessionImpl.open(HiveSessionImpl.java:164)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
... 21 more
Caused by: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException:User: root is not allowed to impersonate anonymous
at org.apache.hadoop.ipc.Client.call(Client.java:1475)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy30.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy31.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1301)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1426)
at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:704)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:650)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:582)
... 28 more
java.lang.NullPointerException
at demo.jdbc.DemoTest.main(DemoTest.java:19)

The key error is: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException: User: root is not allowed to impersonate anonymous

Fix: enable the WebHDFS REST interface and configure Hadoop's proxy-user rules so that root is allowed to act on HDFS on behalf of other users.

First, enable the REST interface by adding the following to hdfs-site.xml:


<property>
  <name>dfs.webhdfs.enabled</name>
  <value>true</value>
</property>

Then add the following to core-site.xml:


<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
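For these settings to take effect, restart HDFS (or run hdfs dfsadmin -refreshSuperUserGroupsAndProxyUsers) and then restart hiveserver2. Separately, the "impersonate anonymous" wording suggests the JDBC connection carried no username at all; a minimal sketch of passing one explicitly follows (the credentials are placeholders; use whatever user your proxy-user rules actually allow):

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class HiveJdbcSketch {
        public static void main(String[] args) throws Exception {
            // Pass an explicit user so HiveServer2 does not fall back to "anonymous".
            Connection conn = DriverManager.getConnection(
                    "jdbc:hive2://192.168.182.11:10000/default", "root", "");
            System.out.println(conn.getMetaData().getDatabaseProductName());
            conn.close();
        }
    }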

 

 

That is all we have on the java.rmi.RemoteException: no protocol exception (and Java's lack of coroutines). Thank you for taking the time to read. For more on the related topics covered here, such as how Dubbo RMI and Spring RMI are combined, Exception in thread "RMI TCP Connection (idle)" java.lang.OutOfMemoryError: Java heap space, the HBase notes on org.apache.hadoop.ipc.RemoteException(java.io.IOException), and Hive JDBC: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.sec..., remember to search this site.
