I am using the AMQP sample code from Alibaba Cloud IoT, but it simply will not connect, and it also throws a reflection error. I have tried many approaches without success. The code is below; for security I have omitted some of the subscription-address parameters, and I am sure the subscription address is not the problem.
public class AmqpClient {
    private final static Logger logger = LoggerFactory.getLogger(AmqpClient.class);
    private static String accessKey = "s";
    private static String accessSecret = "sl";
    private static String consumerGroupId = "s";
    // iotInstanceId: instance ID. For public instances created before July 30, 2021 (exclusive), pass an empty string.
    private static String iotInstanceId = "iot-06z00cal1n9fiuu";
    // The consumer-group status page in the console shows this clientId.
    // Use a unique identifier such as the machine UUID, MAC address, or IP as the clientId, so that different clients can be told apart.
    private static String clientId = "s";
    // ${YourHost} is the endpoint domain; see the AMQP client access documentation.
    private static String host = "s";
    // Number of connections started by a single process.
    // A single connection has limited consumption throughput; see the usage limits, at most 64 connections.
    // The connection count affects the consumption rate and rebalancing; roughly one extra connection per 500 QPS is recommended.
    private static int connectionCount = 4;
    // Asynchronous thread pool for business processing. Tune the pool parameters to your workload,
    // or process received messages with any other asynchronous mechanism.
    private final static ExecutorService executorService = new ThreadPoolExecutor(
            Runtime.getRuntime().availableProcessors(),
            Runtime.getRuntime().availableProcessors() * 2, 60, TimeUnit.SECONDS,
            new LinkedBlockingQueue<>(50000));
    public static void main(String[] args) throws Exception {
        List<Connection> connections = new ArrayList<>();
        // For parameter details, see the AMQP client access documentation.
        for (int i = 0; i < connectionCount; i++) {
            long timeStamp = System.currentTimeMillis();
            // Signing method: hmacmd5, hmacsha1 and hmacsha256 are supported.
            String signMethod = "hmacsha1";
            // userName assembly; see the AMQP client access documentation.
            String userName = clientId + "-" + i + "|authMode=aksign"
                    + ",signMethod=" + signMethod
                    + ",timestamp=" + timeStamp
                    + ",authId=" + accessKey
                    + ",iotInstanceId=" + iotInstanceId
                    + ",consumerGroupId=" + consumerGroupId
                    + "|";
            // Compute the signature; for the password assembly, see the AMQP client access documentation.
            String signContent = "authId=" + accessKey + "&timestamp=" + timeStamp;
            String password = doSign(signContent, accessSecret, signMethod);
            String connectionUrl = "failover:(amqps://" + host + ":5671?amqp.idleTimeout=80000)"
                    + "?failover.reconnectDelay=30";
            Hashtable<String, String> hashtable = new Hashtable<>();
            hashtable.put("connectionfactory.SBCF", connectionUrl);
            hashtable.put("queue.QUEUE", "default");
            hashtable.put(Context.INITIAL_CONTEXT_FACTORY, "org.apache.qpid.jms.jndi.JmsInitialContextFactory");
            Context context = new InitialContext(hashtable);
            ConnectionFactory cf = (ConnectionFactory) context.lookup("SBCF");
            Destination queue = (Destination) context.lookup("QUEUE");
            // Create the connection.
            Connection connection = cf.createConnection(userName, password);
            connections.add(connection);
            ((JmsConnection) connection).addConnectionListener(myJmsConnectionListener);
            // Create the session.
            // Session.CLIENT_ACKNOWLEDGE: message.acknowledge() must be called manually after a message is received.
            // Session.AUTO_ACKNOWLEDGE: the SDK ACKs automatically (recommended).
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            connection.start();
            // Create the receiver.
            MessageConsumer consumer = session.createConsumer(queue);
            consumer.setMessageListener(messageListener);
        }
        logger.info("amqp demo is started successfully, and will exit after 60s ");
        // Stop the demo after a while.
        Thread.sleep(60 * 1000);
        logger.info("run shutdown");
        connections.forEach(c -> {
            try {
                c.close();
            } catch (JMSException e) {
                logger.error("failed to close connection", e);
            }
        });
        executorService.shutdown();
        if (executorService.awaitTermination(10, TimeUnit.SECONDS)) {
            logger.info("shutdown success");
        } else {
            logger.info("failed to handle messages");
        }
    }
    private static MessageListener messageListener = new MessageListener() {
        @Override
        public void onMessage(final Message message) {
            try {
                // 1. Always ACK after receiving a message.
                //    Recommended: create the session with Session.AUTO_ACKNOWLEDGE, so the ACK happens automatically.
                //    Alternative: with Session.CLIENT_ACKNOWLEDGE, message.acknowledge() must be called here.
                // message.acknowledge();
                // 2. Process received messages asynchronously and keep onMessage free of time-consuming logic;
                //    blocking this thread for too long can delay the SDK's message callbacks.
                executorService.submit(new Runnable() {
                    @Override
                    public void run() {
                        processMessage(message);
                    }
                });
            } catch (Exception e) {
                logger.error("submit task occurs exception ", e);
            }
        }
    };
    /**
     * Put the business logic for handling received messages here.
     */
    private static void processMessage(Message message) {
        try {
            byte[] body = message.getBody(byte[].class);
            String content = new String(body);
            String topic = message.getStringProperty("topic");
            String messageId = message.getStringProperty("messageId");
            logger.info("receive message"
                    + ",\n topic = " + topic
                    + ",\n messageId = " + messageId
                    + ",\n content = " + content);
        } catch (Exception e) {
            logger.error("processMessage occurs error ", e);
        }
    }
    private static JmsConnectionListener myJmsConnectionListener = new JmsConnectionListener() {
        /**
         * The connection was established successfully.
         */
        @Override
        public void onConnectionEstablished(URI remoteURI) {
            logger.info("onConnectionEstablished, remoteUri:{}", remoteURI);
        }
        /**
         * The connection finally failed after the maximum number of retries.
         */
        @Override
        public void onConnectionFailure(Throwable error) {
            logger.error("onConnectionFailure, {}", error.getMessage());
        }
        /**
         * The connection was interrupted.
         */
        @Override
        public void onConnectionInterrupted(URI remoteURI) {
            logger.info("onConnectionInterrupted, remoteUri:{}", remoteURI);
        }
        /**
         * The connection was restored automatically after being interrupted.
         */
        @Override
        public void onConnectionRestored(URI remoteURI) {
            logger.info("onConnectionRestored, remoteUri:{}", remoteURI);
        }
        @Override
        public void onInboundMessage(JmsInboundMessageDispatch envelope) {}
        @Override
        public void onSessionClosed(Session session, Throwable cause) {}
        @Override
        public void onConsumerClosed(MessageConsumer consumer, Throwable cause) {}
        @Override
        public void onProducerClosed(MessageProducer producer, Throwable cause) {}
    };
    /**
     * Compute the signature; for the password assembly, see the AMQP client access documentation.
     */
    private static String doSign(String toSignString, String secret, String signMethod) throws Exception {
        SecretKeySpec signingKey = new SecretKeySpec(secret.getBytes(), signMethod);
        Mac mac = Mac.getInstance(signMethod);
        mac.init(signingKey);
        byte[] rawHmac = mac.doFinal(toSignString.getBytes());
        return Base64.encodeBase64String(rawHmac);
    }
}
The error output is below:
"C:\Program Files\Java\jdk-20\bin\java.exe" "-javaagent:C:\Program Files\JetBrains\IntelliJ IDEA 2022.1.3\lib\idea_rt.jar=55153:C:\Program Files\JetBrains\IntelliJ IDEA 2022.1.3\bin" -Dfile.encoding=GBK -classpath C:\Users\Ayuan\Desktop\amqp-demo\amqp-demo\target\classes;C:\development_tools\apache-maven-3.8.6\repository\org\apache\qpid\qpid-jms-client\0.56.0\qpid-jms-client-0.56.0.jar;C:\development_tools\apache-maven-3.8.6\repository\org\apache\geronimo\specs\geronimo-jms_2.0_spec\1.0-alpha-2\geronimo-jms_2.0_spec-1.0-alpha-2.jar;C:\development_tools\apache-maven-3.8.6\repository\org\apache\qpid\proton-j\0.33.8\proton-j-0.33.8.jar;C:\development_tools\apache-maven-3.8.6\repository\io\netty\netty-buffer\4.1.55.Final\netty-buffer-4.1.55.Final.jar;C:\development_tools\apache-maven-3.8.6\repository\io\netty\netty-common\4.1.55.Final\netty-common-4.1.55.Final.jar;C:\development_tools\apache-maven-3.8.6\repository\io\netty\netty-handler\4.1.55.Final\netty-handler-4.1.55.Final.jar;C:\development_tools\apache-maven-3.8.6\repository\io\netty\netty-resolver\4.1.55.Final\netty-resolver-4.1.55.Final.jar;C:\development_tools\apache-maven-3.8.6\repository\io\netty\netty-codec\4.1.55.Final\netty-codec-4.1.55.Final.jar;C:\development_tools\apache-maven-3.8.6\repository\io\netty\netty-transport\4.1.55.Final\netty-transport-4.1.55.Final.jar;C:\development_tools\apache-maven-3.8.6\repository\io\netty\netty-transport-native-epoll\4.1.55.Final\netty-transport-native-epoll-4.1.55.Final-linux-x86_64.jar;C:\development_tools\apache-maven-3.8.6\repository\io\netty\netty-transport-native-unix-common\4.1.55.Final\netty-transport-native-unix-common-4.1.55.Final.jar;C:\development_tools\apache-maven-3.8.6\repository\io\netty\netty-transport-native-kqueue\4.1.55.Final\netty-transport-native-kqueue-4.1.55.Final-osx-x86_64.jar;C:\development_tools\apache-maven-3.8.6\repository\io\netty\netty-codec-http\4.1.55.Final\netty-codec-http-4.1.55.Final.jar;C:\development_tools\apache-maven-3.8.6\reposit
ory\commons-codec\commons-codec\1.10\commons-codec-1.10.jar;C:\development_tools\apache-maven-3.8.6\repository\org\slf4j\slf4j-api\1.7.5\slf4j-api-1.7.5.jar;C:\development_tools\apache-maven-3.8.6\repository\ch\qos\logback\logback-classic\1.2.3\logback-classic-1.2.3.jar;C:\development_tools\apache-maven-3.8.6\repository\ch\qos\logback\logback-core\1.2.3\logback-core-1.2.3.jar com.aliyun.iotx.demo.AmqpClient
23:12:30.064 [main] DEBUG org.apache.qpid.jms.provider.failover.FailoverProvider - Initiating initial connection attempt task
23:12:30.069 [FailoverProvider: async work thread] DEBUG org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[1] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 in-progress
23:12:30.246 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.logging.InternalLoggerFactory - Using SLF4J as the default logging framework
23:12:30.276 [FailoverProvider: async work thread] DEBUG io.netty.channel.MultithreadEventLoopGroup - -Dio.netty.eventLoopThreads: 24
23:12:30.304 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.InternalThreadLocalMap - -Dio.netty.threadLocalMap.stringBuilder.initialSize: 1024
23:12:30.305 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.InternalThreadLocalMap - -Dio.netty.threadLocalMap.stringBuilder.maxSize: 4096
23:12:30.314 [FailoverProvider: async work thread] DEBUG io.netty.channel.nio.NioEventLoop - -Dio.netty.noKeySetOptimization: false
23:12:30.314 [FailoverProvider: async work thread] DEBUG io.netty.channel.nio.NioEventLoop - -Dio.netty.selectorAutoRebuildThreshold: 512
23:12:30.340 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent - Platform: Windows
23:12:30.343 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent0 - -Dio.netty.noUnsafe: false
23:12:30.343 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent0 - Java version: 20
23:12:30.346 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent0 - sun.misc.Unsafe.theUnsafe: available
23:12:30.347 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent0 - sun.misc.Unsafe.copyMemory: available
23:12:30.348 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent0 - java.nio.Buffer.address: available
23:12:30.352 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent0 - direct buffer constructor: unavailable
java.lang.UnsupportedOperationException: Reflective setAccessible(true) disabled
at io.netty.util.internal.ReflectionUtil.trySetAccessible(ReflectionUtil.java:31)
at io.netty.util.internal.PlatformDependent0$4.run(PlatformDependent0.java:238)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:319)
at io.netty.util.internal.PlatformDependent0.<clinit>(PlatformDependent0.java:232)
at io.netty.util.internal.PlatformDependent.isAndroid(PlatformDependent.java:293)
at io.netty.util.internal.PlatformDependent.<clinit>(PlatformDependent.java:92)
at io.netty.channel.nio.NioEventLoop.newTaskQueue0(NioEventLoop.java:279)
at io.netty.channel.nio.NioEventLoop.newTaskQueue(NioEventLoop.java:150)
at io.netty.channel.nio.NioEventLoop.<init>(NioEventLoop.java:138)
at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:146)
at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:37)
at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:84)
at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:58)
at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:47)
at io.netty.channel.MultithreadEventLoopGroup.<init>(MultithreadEventLoopGroup.java:59)
at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:86)
at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:81)
at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:68)
at org.apache.qpid.jms.transports.netty.NettyTcpTransport.connect(NettyTcpTransport.java:151)
at org.apache.qpid.jms.provider.amqp.AmqpProvider.connect(AmqpProvider.java:230)
at org.apache.qpid.jms.provider.failover.FailoverProvider$14.run(FailoverProvider.java:747)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:577)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
at java.base/java.lang.Thread.run(Thread.java:1623)
23:12:30.355 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent0 - java.nio.Bits.unaligned: available, true
23:12:30.358 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent0 - jdk.internal.misc.Unsafe.allocateUninitializedArray(int): unavailable
java.lang.IllegalAccessException: class io.netty.util.internal.PlatformDependent0$6 cannot access class jdk.internal.misc.Unsafe (in module java.base) because module java.base does not export jdk.internal.misc to unnamed module @3c085c88
at java.base/jdk.internal.reflect.Reflection.newIllegalAccessException(Reflection.java:394)
at java.base/java.lang.reflect.AccessibleObject.checkAccess(AccessibleObject.java:709)
at java.base/java.lang.reflect.Method.invoke(Method.java:569)
at io.netty.util.internal.PlatformDependent0$6.run(PlatformDependent0.java:352)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:319)
at io.netty.util.internal.PlatformDependent0.<clinit>(PlatformDependent0.java:343)
at io.netty.util.internal.PlatformDependent.isAndroid(PlatformDependent.java:293)
at io.netty.util.internal.PlatformDependent.<clinit>(PlatformDependent.java:92)
at io.netty.channel.nio.NioEventLoop.newTaskQueue0(NioEventLoop.java:279)
at io.netty.channel.nio.NioEventLoop.newTaskQueue(NioEventLoop.java:150)
at io.netty.channel.nio.NioEventLoop.<init>(NioEventLoop.java:138)
at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:146)
at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:37)
at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:84)
at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:58)
at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:47)
at io.netty.channel.MultithreadEventLoopGroup.<init>(MultithreadEventLoopGroup.java:59)
at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:86)
at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:81)
at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:68)
at org.apache.qpid.jms.transports.netty.NettyTcpTransport.connect(NettyTcpTransport.java:151)
at org.apache.qpid.jms.provider.amqp.AmqpProvider.connect(AmqpProvider.java:230)
at org.apache.qpid.jms.provider.failover.FailoverProvider$14.run(FailoverProvider.java:747)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:577)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
at java.base/java.lang.Thread.run(Thread.java:1623)
23:12:30.358 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent0 - java.nio.DirectByteBuffer.<init>(long, int): unavailable
23:12:30.358 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent - sun.misc.Unsafe: available
23:12:30.377 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent - maxDirectMemory: 4139778048 bytes (maybe)
23:12:30.378 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.tmpdir: C:\Users\Ayuan\AppData\Local\Temp (java.io.tmpdir)
23:12:30.378 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.bitMode: 64 (sun.arch.data.model)
23:12:30.380 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.maxDirectMemory: -1 bytes
23:12:30.380 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.uninitializedArrayAllocationThreshold: -1
23:12:30.383 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.CleanerJava9 - java.nio.ByteBuffer.cleaner(): available
23:12:30.384 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.noPreferDirect: false
23:12:30.398 [FailoverProvider: async work thread] DEBUG io.netty.util.internal.PlatformDependent - org.jctools-core.MpscChunkedArrayQueue: available
23:12:30.479 [FailoverProvider: async work thread] DEBUG io.netty.channel.DefaultChannelId - -Dio.netty.processId: 7800 (auto-detected)
23:12:30.483 [FailoverProvider: async work thread] DEBUG io.netty.util.NetUtil - -Djava.net.preferIPv4Stack: false
23:12:30.483 [FailoverProvider: async work thread] DEBUG io.netty.util.NetUtil - -Djava.net.preferIPv6Addresses: false
23:12:30.502 [FailoverProvider: async work thread] DEBUG io.netty.util.NetUtilInitializations - Loopback interface: lo (Software Loopback Interface 1, 127.0.0.1)
23:12:30.504 [FailoverProvider: async work thread] DEBUG io.netty.util.NetUtil - Failed to get SOMAXCONN from sysctl and file \proc\sys\net\core\somaxconn. Default: 200
23:12:30.553 [FailoverProvider: async work thread] DEBUG io.netty.channel.DefaultChannelId - -Dio.netty.machineId: d8:c0:a6:ff:fe:a3:c9:17 (auto-detected)
23:12:30.578 [FailoverProvider: async work thread] DEBUG io.netty.util.ResourceLeakDetector - -Dio.netty.leakDetection.level: simple
23:12:30.578 [FailoverProvider: async work thread] DEBUG io.netty.util.ResourceLeakDetector - -Dio.netty.leakDetection.targetRecords: 4
23:12:30.623 [FailoverProvider: async work thread] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.numHeapArenas: 24
23:12:30.624 [FailoverProvider: async work thread] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.numDirectArenas: 24
23:12:30.624 [FailoverProvider: async work thread] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.pageSize: 8192
23:12:30.624 [FailoverProvider: async work thread] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.maxOrder: 11
23:12:30.624 [FailoverProvider: async work thread] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.chunkSize: 16777216
23:12:30.624 [FailoverProvider: async work thread] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.smallCacheSize: 256
23:12:30.624 [FailoverProvider: async work thread] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.normalCacheSize: 64
23:12:30.624 [FailoverProvider: async work thread] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.maxCachedBufferCapacity: 32768
23:12:30.624 [FailoverProvider: async work thread] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.cacheTrimInterval: 8192
23:12:30.624 [FailoverProvider: async work thread] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.cacheTrimIntervalMillis: 0
23:12:30.624 [FailoverProvider: async work thread] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.useCacheForAllThreads: true
23:12:30.624 [FailoverProvider: async work thread] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.maxCachedByteBuffersPerChunk: 1023
23:12:30.644 [FailoverProvider: async work thread] DEBUG io.netty.buffer.ByteBufUtil - -Dio.netty.allocator.type: pooled
23:12:30.644 [FailoverProvider: async work thread] DEBUG io.netty.buffer.ByteBufUtil - -Dio.netty.threadLocalDirectBufferSize: 0
23:12:30.644 [FailoverProvider: async work thread] DEBUG io.netty.buffer.ByteBufUtil - -Dio.netty.maxThreadLocalCharBufferSize: 16384
23:12:31.084 [FailoverProvider: async work thread] INFO org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[1] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 failed
23:12:31.129 [FailoverProvider: async work thread] DEBUG org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[2] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 in-progress
23:12:31.136 [FailoverProvider: async work thread] INFO org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[2] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 failed
23:12:31.209 [FailoverProvider: async work thread] DEBUG org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[3] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 in-progress
23:12:31.217 [FailoverProvider: async work thread] INFO org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[3] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 failed
23:12:31.353 [FailoverProvider: async work thread] DEBUG org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[4] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 in-progress
23:12:31.361 [FailoverProvider: async work thread] INFO org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[4] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 failed
23:12:31.609 [FailoverProvider: async work thread] DEBUG org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[5] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 in-progress
23:12:31.614 [FailoverProvider: async work thread] INFO org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[5] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 failed
23:12:32.104 [FailoverProvider: async work thread] DEBUG org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[6] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 in-progress
23:12:32.111 [FailoverProvider: async work thread] INFO org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[6] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 failed
23:12:33.073 [FailoverProvider: async work thread] DEBUG org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[7] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 in-progress
23:12:33.078 [FailoverProvider: async work thread] INFO org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[7] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 failed
23:12:35.009 [FailoverProvider: async work thread] DEBUG org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[8] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 in-progress
23:12:35.016 [FailoverProvider: async work thread] INFO org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[8] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 failed
23:12:38.869 [FailoverProvider: async work thread] DEBUG org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[9] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 in-progress
23:12:38.874 [FailoverProvider: async work thread] INFO org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[9] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 failed
23:12:46.570 [FailoverProvider: async work thread] DEBUG org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[10] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 in-progress
23:12:46.614 [FailoverProvider: async work thread] INFO org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[10] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 failed
23:12:46.615 [FailoverProvider: async work thread] WARN org.apache.qpid.jms.provider.failover.FailoverProvider - Failed to connect after: 10 attempt(s) continuing to retry.
23:13:01.990 [FailoverProvider: async work thread] DEBUG org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[11] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 in-progress
23:13:02.032 [FailoverProvider: async work thread] INFO org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[11] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 failed
23:13:32.040 [FailoverProvider: async work thread] DEBUG org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[12] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 in-progress
23:13:32.084 [FailoverProvider: async work thread] INFO org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[12] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 failed
23:14:02.096 [FailoverProvider: async work thread] DEBUG org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[13] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 in-progress
23:14:02.134 [FailoverProvider: async work thread] INFO org.apache.qpid.jms.provider.failover.FailoverProvider - Connection attempt:[13] to: amqps://1248579809886774.iot-amqp.cn.shanghai.aliyuncs.com:5671 failed
Process finished with exit code 130
There are several possible reasons why the Java SDK fails to connect over AMQP. Common issues and fixes:
Verify the configuration: check that the connection string, username, password, exchange name, and other settings are filled in correctly.
Verify the port number: if the default port is in use, try another available port for the connection.
Verify network connectivity: make sure the machine can actually reach the AMQP server.
Check firewall rules: if a firewall is in place, make sure it allows the Java SDK to reach the AMQP server's port.
Confirm the AMQP server is running: if everything above checks out and the connection still fails, the server may be down or faulty.
Try another Java SDK version: if none of the above resolves it, try connecting with a different version of the Java SDK.
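To rule out a wrong password, the username/password pair can be rebuilt outside the SDK and inspected. This is a minimal sketch mirroring the question's doSign method; accessKey and accessSecret are placeholder values, not real credentials.

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Standalone rebuild of the demo's signature, useful for comparing the
// generated password against what the access documentation describes.
public class AmqpSignCheck {

    static String sign(String content, String secret, String method) throws Exception {
        // JCA algorithm names are case-insensitive, so "hmacsha1" resolves to HmacSHA1.
        Mac mac = Mac.getInstance(method);
        mac.init(new SecretKeySpec(secret.getBytes(StandardCharsets.UTF_8), method));
        return Base64.getEncoder().encodeToString(mac.doFinal(content.getBytes(StandardCharsets.UTF_8)));
    }

    public static void main(String[] args) throws Exception {
        long timeStamp = System.currentTimeMillis();
        String accessKey = "testKey";       // placeholder
        String accessSecret = "testSecret"; // placeholder
        // Note the separator: the sign content must be "authId=...&timestamp=...".
        String signContent = "authId=" + accessKey + "&timestamp=" + timeStamp;
        String password = sign(signContent, accessSecret, "hmacsha1");
        System.out.println("signContent = " + signContent);
        System.out.println("password    = " + password);
    }
}
```

An HMAC-SHA1 signature is always 20 raw bytes (28 Base64 characters); a password of a different shape points at a signing bug rather than a server-side problem.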
For the connection failure, work through the steps above. As for the reflection error, the Java code may be using reflection without correctly referencing or invoking the relevant classes and methods; check the code for such issues and fix them.
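On JDK 9 and later (the log shows JDK 20), the netty stack traces above are known self-diagnostics: netty probes sun.misc.Unsafe and jdk.internal.misc reflectively, logs the failed probe at DEBUG level, and falls back to a safer code path, so by themselves they are usually not what breaks the connection. They can typically be silenced by opening the relevant modules at launch; a possible set of flags (adapt the classpath and main class to your project):

```shell
java \
  --add-opens java.base/java.nio=ALL-UNNAMED \
  --add-opens java.base/jdk.internal.misc=ALL-UNNAMED \
  -Dio.netty.tryReflectionSetAccessible=true \
  -cp <your-classpath> com.aliyun.iotx.demo.AmqpClient
```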
Also, add logging during development so that connection and error details can be inspected; it helps narrow down the problem.
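The failover log above only reports each attempt as "failed" without the underlying cause; the actual exception usually becomes visible once provider logging is turned up. A minimal logback.xml sketch (the logger name is taken from the log output above, and the project already ships logback-classic):

```xml
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger - %msg%n</pattern>
    </encoder>
  </appender>
  <!-- Raise qpid-jms provider logging to TRACE to see why each connection attempt fails. -->
  <logger name="org.apache.qpid.jms.provider" level="TRACE"/>
  <root level="INFO">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>
```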
OP, the connection failure may be caused by the following:
Network problems: check that the device has a working Internet connection, and review firewall settings.
Wrong credentials: check that the AMQP username and password are correct.
A router or firewall blocking the AMQP protocol: open the AMQP port on the router or firewall.
A fault on the AMQP server side: contact Alibaba Cloud technical support to investigate.
For the reflection error, please provide more detailed error information so it can be analyzed.
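The network check in the list above can be automated: if a plain TCP connection to the endpoint on port 5671 cannot be opened, the problem is network or firewall reachability rather than credentials. It is also worth comparing the exact endpoint spelling in the log against the console. A small probe (the host below is a placeholder, not a real endpoint):

```java
import java.net.InetSocketAddress;
import java.net.Socket;

// Quick reachability probe for the AMQP endpoint: returns true only when a
// TCP connection can be opened within the timeout. This says nothing about
// credentials, only about network/firewall reachability.
public class AmqpReachability {

    static boolean canConnect(String host, int port, int timeoutMillis) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMillis);
            return true;
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Placeholder: replace with your ${uid}.iot-amqp.${region}.aliyuncs.com endpoint.
        String host = "example.iot-amqp.cn-shanghai.aliyuncs.com";
        System.out.println("reachable on 5671: " + canConnect(host, 5671, 3000));
    }
}
```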
I do not see the AMQP library being imported in the code; the error may be caused by a missing dependency. Try adding the following dependencies:
<dependencies>
    <dependency>
        <groupId>org.apache.qpid</groupId>
        <artifactId>proton-j</artifactId>
        <version>0.33.10</version>
    </dependency>
    <dependency>
        <groupId>org.apache.qpid</groupId>
        <artifactId>qpid-jms-client</artifactId>
        <version>0.60.0</version>
    </dependency>
</dependencies>
If the connection problem persists after adding the dependencies, double-check the connection settings as well.
It is hard to pinpoint the exact cause from the information provided, but based on the error description, the following points may be relevant:
1. Whether the Alibaba Cloud IoT AMQP sample code is compatible with the AMQP version in use. AMQP has multiple versions with differences between them; make sure the sample matches the version you use.
2. Whether the sample sets the authentication information correctly. An AMQP connection must authenticate; if the credentials are wrong or missing, the connection cannot be established.
3. Whether the sample sets the connection parameters correctly, such as the connection timeout and heartbeat interval.
4. Whether the sample sets up AMQP entities such as queues or exchanges correctly. AMQP requires these entities to be declared before messages can be sent or received.
5. The reflection error may be caused by a different runtime class loader; check that the sample runs under the correct class loader.
If all of the above has been checked and the problem remains, consider another access method such as MQTT or HTTP, and consult the Alibaba Cloud IoT official documentation or community for further help.