Preface
This post records the problems I ran into while implementing a P2P video feature with AWS KVS with WebRTC, analyzes their causes, and provides solutions, in the hope that it helps other developers who hit the same issues. Background on AWS KVS and WebRTC is covered in the previous article, 【Android使用AWS KVS & WebRTC实现P2P需要掌握的知识点】 (key knowledge for implementing P2P on Android with AWS KVS & WebRTC).
Problem 1: black screen on the viewer side
Description: Samsung, OnePlus, Xiaomi and similar models can pull the video stream, while OPPO, vivo, Huawei and others (including Google phones) show a black screen.
Key project information:
- aws-android-sdk-xxx:2.50.1 (Amazon SDK)
- org.webrtc:google-webrtc:1.0.32006 (Google WebRTC SDK)
- Publisher (master): in this project the master is a device that only supports the H.264 encoder, so the viewer app must be able to use an H.264 decoder. (The WebRTC source currently whitelists only a few major vendors' chipsets, so even when the phone's hardware supports H.264 it may be unusable, producing the black screen.)
Cause: the WebRTC build used on Android only supports H.264 encoding and decoding in hardware, and only on certain chipsets. If the device lacks hardware H.264 or has an unsupported chipset, only VP8/VP9 are available. The only chipset prefixes accepted are OMX.qcom. (Qualcomm Snapdragon) and OMX.Exynos. (Samsung Exynos); anything else has to be added yourself.
// WebRTC source: the check for H.264 High Profile support
private boolean isH264HighProfileSupported(MediaCodecInfo info) {
    String name = info.getName();
    if (VERSION.SDK_INT >= 21 && name.startsWith("OMX.qcom.")) {
        return true;
    } else {
        return VERSION.SDK_INT >= 23 && name.startsWith("OMX.Exynos.");
    }
}
Tip: check the app's run logs. Comparing the SDP offer produced after calling createOffer with the SDP answer received tells you which codecs the master and viewer each support.
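As a minimal sketch of that tip, the codec names in an offer/answer can be pulled from the `a=rtpmap:` attribute lines of the SDP string (the class name and sample SDP below are illustrative, not part of the project):

```java
import java.util.ArrayList;
import java.util.List;

public class SdpCodecInspector {
    // Extracts codec names from "a=rtpmap:<pt> <codec>/<clock>" lines of an SDP blob.
    public static List<String> listCodecs(String sdp) {
        List<String> codecs = new ArrayList<>();
        for (String line : sdp.split("\r?\n")) {
            if (line.startsWith("a=rtpmap:")) {
                // e.g. "a=rtpmap:125 H264/90000" -> "H264"
                String[] parts = line.split(" ");
                if (parts.length > 1) {
                    codecs.add(parts[1].split("/")[0]);
                }
            }
        }
        return codecs;
    }

    public static void main(String[] args) {
        String offer = "v=0\r\n"
                + "a=rtpmap:96 VP8/90000\r\n"
                + "a=rtpmap:125 H264/90000\r\n";
        System.out.println(listCodecs(offer));                  // [VP8, H264]
        System.out.println(listCodecs(offer).contains("H264")); // true
    }
}
```

If "H264" appears in the offer but not in the answer (or vice versa), one side could not negotiate it, which is exactly the black-screen case above.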
Solution:
Since the limitation lives in the WebRTC source's built-in encoder factory, the fix is to define a custom factory with our own support checks. The code follows.
- Custom hardware video encoder factory (essentially a copy of the stock HardwareVideoEncoderFactory with modifications)
package org.webrtc;
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.os.Build;
import androidx.annotation.Nullable;
import static org.webrtc.MediaCodecUtils.EXYNOS_PREFIX;
import static org.webrtc.MediaCodecUtils.INTEL_PREFIX;
import static org.webrtc.MediaCodecUtils.QCOM_PREFIX;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
/**
 * Custom hardware video encoder factory.
 * Lets WebRTC negotiate H.264 on devices rejected by the stock factory;
 * see {@link CustomHardwareVideoEncoderFactory#isHardwareSupportedInCurrentSdkH264(MediaCodecInfo)}.
 * The stock source only whitelists a few vendors' chipsets, so H.264 may be
 * missing from the SDP even when the phone's hardware supports it.
 */
@SuppressWarnings("deprecation") // API 16 requires the use of deprecated methods.
public class CustomHardwareVideoEncoderFactory implements VideoEncoderFactory {
private static final String TAG = "CustomHardwareVideoEncoderFactory";
// Forced key frame interval - used to reduce color distortions on Qualcomm platforms.
private static final int QCOM_VP8_KEY_FRAME_INTERVAL_ANDROID_L_MS = 15000;
private static final int QCOM_VP8_KEY_FRAME_INTERVAL_ANDROID_M_MS = 20000;
private static final int QCOM_VP8_KEY_FRAME_INTERVAL_ANDROID_N_MS = 15000;
/**
 * Also match OMX.google.xxx by default, e.g. OMX.google.h264.encoder
 * (Google's software codecs); most phones ship these.
 */
static final String GOOGLE_PREFIX = "OMX.google.";
// List of devices with poor H.264 encoder quality.
// HW H.264 encoder on below devices has poor bitrate control - actual
// bitrates deviates a lot from the target value.
private static final List<String> H264_HW_EXCEPTION_MODELS =
Arrays.asList("SAMSUNG-SGH-I337", "Nexus 7", "Nexus 4");
@Nullable
private final EglBase14.Context sharedContext;
private final boolean enableIntelVp8Encoder;
private final boolean enableH264HighProfile;
@Nullable
private final Predicate<MediaCodecInfo> codecAllowedPredicate;
@Nullable
private final VideoEncoderSupportedCallback videoEncoderSupportedCallback;
/**
* Creates a HardwareVideoEncoderFactory that supports surface texture encoding.
*
* @param sharedContext The textures generated will be accessible from this context. May be null,
* this disables texture support.
* @param enableIntelVp8Encoder true if Intel's VP8 encoder enabled.
* @param enableH264HighProfile true if H264 High Profile enabled.
* @param videoEncoderSupportedCallback optional callback for device-specific support checks.
*/
public CustomHardwareVideoEncoderFactory(
EglBase.Context sharedContext, boolean enableIntelVp8Encoder, boolean enableH264HighProfile, @Nullable VideoEncoderSupportedCallback videoEncoderSupportedCallback) {
this(sharedContext, enableIntelVp8Encoder, enableH264HighProfile,
/* codecAllowedPredicate= */ null, videoEncoderSupportedCallback);
}
/**
* Creates a HardwareVideoEncoderFactory that supports surface texture encoding.
*
* @param sharedContext The textures generated will be accessible from this context. May be null,
* this disables texture support.
* @param enableIntelVp8Encoder true if Intel's VP8 encoder enabled.
* @param enableH264HighProfile true if H264 High Profile enabled.
* @param codecAllowedPredicate optional predicate to filter codecs. All codecs are allowed
* when predicate is not provided.
* @param videoEncoderSupportedCallback optional callback for device-specific support checks.
*/
public CustomHardwareVideoEncoderFactory(EglBase.Context sharedContext, boolean enableIntelVp8Encoder,
boolean enableH264HighProfile, @Nullable Predicate<MediaCodecInfo> codecAllowedPredicate,
@Nullable VideoEncoderSupportedCallback videoEncoderSupportedCallback) {
// Texture mode requires EglBase14.
if (sharedContext instanceof EglBase14.Context) {
this.sharedContext = (EglBase14.Context) sharedContext;
} else {
Logging.w(TAG, "No shared EglBase.Context. Encoders will not use texture mode.");
this.sharedContext = null;
}
this.enableIntelVp8Encoder = enableIntelVp8Encoder;
this.enableH264HighProfile = enableH264HighProfile;
this.codecAllowedPredicate = codecAllowedPredicate;
this.videoEncoderSupportedCallback = videoEncoderSupportedCallback;
}
@Deprecated
public CustomHardwareVideoEncoderFactory(boolean enableIntelVp8Encoder, boolean enableH264HighProfile) {
this(null, enableIntelVp8Encoder, enableH264HighProfile, null);
}
@Nullable
@Override
public VideoEncoder createEncoder(VideoCodecInfo input) {
// HW encoding is not supported below Android Kitkat.
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.KITKAT) {
return null;
}
VideoCodecMimeType type = VideoCodecMimeType.valueOf(input.name);
MediaCodecInfo info = findCodecForType(type);
if (info == null) {
return null;
}
String codecName = info.getName();
String mime = type.mimeType();
Integer surfaceColorFormat = MediaCodecUtils.selectColorFormat(
MediaCodecUtils.TEXTURE_COLOR_FORMATS, info.getCapabilitiesForType(mime));
Integer yuvColorFormat = MediaCodecUtils.selectColorFormat(
MediaCodecUtils.ENCODER_COLOR_FORMATS, info.getCapabilitiesForType(mime));
if (type == VideoCodecMimeType.H264) {
boolean isHighProfile = H264Utils.isSameH264Profile(
input.params, MediaCodecUtils.getCodecProperties(type, /* highProfile= */ true));
boolean isBaselineProfile = H264Utils.isSameH264Profile(
input.params, MediaCodecUtils.getCodecProperties(type, /* highProfile= */ false));
if (!isHighProfile && !isBaselineProfile) {
return null;
}
if (isHighProfile && !isH264HighProfileSupported(info)) {
return null;
}
}
return new HardwareVideoEncoder(new MediaCodecWrapperFactoryImpl(), codecName, type,
surfaceColorFormat, yuvColorFormat, input.params, getKeyFrameIntervalSec(type),
getForcedKeyFrameIntervalMs(type, codecName), createBitrateAdjuster(type, codecName),
sharedContext);
}
@Override
public VideoCodecInfo[] getSupportedCodecs() {
// HW encoding is not supported below Android Kitkat.
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.KITKAT) {
return new VideoCodecInfo[0];
}
List<VideoCodecInfo> supportedCodecInfos = new ArrayList<>();
// Generate a list of supported codecs in order of preference:
// VP8, VP9, H264 (high profile), H264 (baseline profile).
for (VideoCodecMimeType type : new VideoCodecMimeType[]{VideoCodecMimeType.VP8,
VideoCodecMimeType.VP9, VideoCodecMimeType.H264}) {
MediaCodecInfo codec = findCodecForType(type);
if (codec != null) {
String name = type.name();
// TODO(sakal): Always add H264 HP once WebRTC correctly removes codecs that are not
// supported by the decoder.
if (type == VideoCodecMimeType.H264 && isH264HighProfileSupported(codec)) {
supportedCodecInfos.add(new VideoCodecInfo(
name, MediaCodecUtils.getCodecProperties(type, /* highProfile= */ true)));
}
supportedCodecInfos.add(new VideoCodecInfo(
name, MediaCodecUtils.getCodecProperties(type, /* highProfile= */ false)));
}
}
return supportedCodecInfos.toArray(new VideoCodecInfo[0]);
}
@Nullable
private MediaCodecInfo findCodecForType(VideoCodecMimeType type) {
int codecCount = MediaCodecList.getCodecCount();
for (int i = 0; i < codecCount; ++i) {
MediaCodecInfo info = null;
try {
info = MediaCodecList.getCodecInfoAt(i);
} catch (IllegalArgumentException e) {
Logging.e(TAG, "Cannot retrieve encoder codec info", e);
}
if (info == null || !info.isEncoder()) {
continue;
}
if (isSupportedCodec(info, type)) {
return info;
}
}
// No support for this type.
return null;
}
// Returns true if the given MediaCodecInfo indicates a supported encoder for the given type.
private boolean isSupportedCodec(MediaCodecInfo info, VideoCodecMimeType type) {
if (!MediaCodecUtils.codecSupportsType(info, type)) {
return false;
}
// Check for a supported color format.
if (MediaCodecUtils.selectColorFormat(
MediaCodecUtils.ENCODER_COLOR_FORMATS, info.getCapabilitiesForType(type.mimeType()))
== null) {
return false;
}
return isHardwareSupportedInCurrentSdk(info, type) && isMediaCodecAllowed(info);
}
// Returns true if the given MediaCodecInfo indicates a hardware module that is supported on the
// current SDK.
private boolean isHardwareSupportedInCurrentSdk(MediaCodecInfo info, VideoCodecMimeType type) {
switch (type) {
case VP8:
return isHardwareSupportedInCurrentSdkVp8(info);
case VP9:
return isHardwareSupportedInCurrentSdkVp9(info);
case H264:
return isHardwareSupportedInCurrentSdkH264(info);
}
return false;
}
private boolean isHardwareSupportedInCurrentSdkVp8(MediaCodecInfo info) {
String name = info.getName();
// QCOM Vp8 encoder is supported in KITKAT or later.
boolean isSupported = (name.startsWith(QCOM_PREFIX) && Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT)
// Exynos VP8 encoder is supported in M or later.
|| (name.startsWith(EXYNOS_PREFIX) && Build.VERSION.SDK_INT >= Build.VERSION_CODES.M)
// Intel Vp8 encoder is supported in LOLLIPOP or later, with the intel encoder enabled.
|| (name.startsWith(INTEL_PREFIX) && Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP
&& enableIntelVp8Encoder);
if (isSupported) {
return true;
} else {
// Fall back to the caller's own VP8 support check
return videoEncoderSupportedCallback != null && videoEncoderSupportedCallback.isSupportedVp8(info);
}
}
private boolean isHardwareSupportedInCurrentSdkVp9(MediaCodecInfo info) {
String name = info.getName();
boolean isSupported = (name.startsWith(QCOM_PREFIX) || name.startsWith(EXYNOS_PREFIX))
// Both QCOM and Exynos VP9 encoders are supported in N or later.
&& Build.VERSION.SDK_INT >= Build.VERSION_CODES.N;
if (isSupported) {
return true;
} else {
// Fall back to the caller's own VP9 support check
return videoEncoderSupportedCallback != null && videoEncoderSupportedCallback.isSupportedVp9(info);
}
}
private boolean isHardwareSupportedInCurrentSdkH264(MediaCodecInfo info) {
// First, H264 hardware might perform poorly on this model.
if (H264_HW_EXCEPTION_MODELS.contains(Build.MODEL)) {
return false;
}
String name = info.getName();
// QCOM H264 encoder is supported in KITKAT or later.
boolean isSupported = (name.startsWith(QCOM_PREFIX) && Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) ||
// Exynos H264 encoder is supported in LOLLIPOP or later.
(name.startsWith(EXYNOS_PREFIX) && Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) ||
// Most phones support OMX.google.xxx; accept it by default
(name.startsWith(GOOGLE_PREFIX) && Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP);
if (isSupported) {
return true;
} else {
// Fall back to the caller's own H264 support check
return videoEncoderSupportedCallback != null && videoEncoderSupportedCallback.isSupportedH264(info);
}
}
private boolean isMediaCodecAllowed(MediaCodecInfo info) {
if (codecAllowedPredicate == null) {
return true;
}
return codecAllowedPredicate.test(info);
}
private int getKeyFrameIntervalSec(VideoCodecMimeType type) {
switch (type) {
case VP8:// Fallthrough intended.
case VP9:
return 100;
case H264:
return 20;
}
throw new IllegalArgumentException("Unsupported VideoCodecMimeType " + type);
}
private int getForcedKeyFrameIntervalMs(VideoCodecMimeType type, String codecName) {
if (type == VideoCodecMimeType.VP8 && codecName.startsWith(QCOM_PREFIX)) {
if (Build.VERSION.SDK_INT == Build.VERSION_CODES.LOLLIPOP
|| Build.VERSION.SDK_INT == Build.VERSION_CODES.LOLLIPOP_MR1) {
return QCOM_VP8_KEY_FRAME_INTERVAL_ANDROID_L_MS;
} else if (Build.VERSION.SDK_INT == Build.VERSION_CODES.M) {
return QCOM_VP8_KEY_FRAME_INTERVAL_ANDROID_M_MS;
} else if (Build.VERSION.SDK_INT > Build.VERSION_CODES.M) {
return QCOM_VP8_KEY_FRAME_INTERVAL_ANDROID_N_MS;
}
}
// Other codecs don't need key frame forcing.
return 0;
}
private BitrateAdjuster createBitrateAdjuster(VideoCodecMimeType type, String codecName) {
if (codecName.startsWith(EXYNOS_PREFIX)) {
if (type == VideoCodecMimeType.VP8) {
// Exynos VP8 encoders need dynamic bitrate adjustment.
return new DynamicBitrateAdjuster();
} else {
// Exynos VP9 and H264 encoders need framerate-based bitrate adjustment.
return new FramerateBitrateAdjuster();
}
}
// Other codecs don't need bitrate adjustment.
return new BaseBitrateAdjuster();
}
private boolean isH264HighProfileSupported(MediaCodecInfo info) {
return enableH264HighProfile && Build.VERSION.SDK_INT > Build.VERSION_CODES.M
&& info.getName().startsWith(EXYNOS_PREFIX);
}
}
- Encoder factory extension functions
package org.webrtc
import android.media.MediaCodecInfo
@JvmOverloads
fun createCustomVideoEncoderFactory(
eglContext: EglBase.Context?,
enableIntelVp8Encoder: Boolean,
enableH264HighProfile: Boolean,
codecAllowedPredicate: Predicate<MediaCodecInfo>? = null,
videoEncoderSupportedCallback: VideoEncoderSupportedCallback? = null
): DefaultVideoEncoderFactory = DefaultVideoEncoderFactory(
CustomHardwareVideoEncoderFactory(
eglContext,
enableIntelVp8Encoder,
enableH264HighProfile,
codecAllowedPredicate,
videoEncoderSupportedCallback
)
)
fun createCustomVideoEncoderFactory(
eglContext: EglBase.Context?,
enableIntelVp8Encoder: Boolean,
enableH264HighProfile: Boolean,
videoEncoderSupportedCallback: VideoEncoderSupportedCallback?
): DefaultVideoEncoderFactory = createCustomVideoEncoderFactory(
eglContext,
enableIntelVp8Encoder,
enableH264HighProfile,
null,
videoEncoderSupportedCallback
)
- Callback (controls which additional encoders are enabled)
package org.webrtc;
import android.media.MediaCodecInfo;
import androidx.annotation.NonNull;
public interface VideoEncoderSupportedCallback {
/**
 * Invoked when the built-in check fails; also see
 * {@link CustomHardwareVideoEncoderFactory#isHardwareSupportedInCurrentSdkVp8(MediaCodecInfo)}.
 * Mind whether the current {@link android.os.Build.VERSION#SDK_INT} supports the codec.
 *
 * @param info encoder info
 * @return whether VP8 is supported
 */
default boolean isSupportedVp8(@NonNull MediaCodecInfo info) {
return false;
}
/**
 * Invoked when the built-in check fails; also see
 * {@link CustomHardwareVideoEncoderFactory#isHardwareSupportedInCurrentSdkVp9(MediaCodecInfo)}.
 * Mind whether the current {@link android.os.Build.VERSION#SDK_INT} supports the codec.
 *
 * @param info encoder info
 * @return whether VP9 is supported
 */
default boolean isSupportedVp9(@NonNull MediaCodecInfo info) {
return false;
}
/**
 * Invoked when the built-in check fails; also see
 * {@link CustomHardwareVideoEncoderFactory#isHardwareSupportedInCurrentSdkH264(MediaCodecInfo)}.
 * Mind whether the current {@link android.os.Build.VERSION#SDK_INT} supports the codec.
 *
 * Note: avoid the Huawei HiSilicon encoder (OMX.hisi.video.encoder.avc); its H.264 encoding is broken.
 *
 * @param info encoder info
 * @return whether H264 is supported
 */
boolean isSupportedH264(@NonNull MediaCodecInfo info);
}
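Following the HiSilicon note above, a viewer that must use H.264 can implement `isSupportedH264` permissively while still rejecting the broken encoder. The decision reduces to a name-prefix check, sketched here as a plain helper (the class and method names are ours, not from any SDK):

```java
public class H264EncoderPolicy {
    // Huawei's HiSilicon AVC encoder is reported to produce broken H.264 streams,
    // so it is excluded even when the device advertises hardware support.
    private static final String HISI_AVC_ENCODER = "OMX.hisi.video.encoder.avc";

    // Accepts any H.264 encoder name except HiSilicon ones.
    public static boolean isUsableH264Encoder(String codecName) {
        return !codecName.startsWith("OMX.hisi.");
    }

    public static void main(String[] args) {
        System.out.println(isUsableH264Encoder("OMX.MTK.VIDEO.ENCODER.AVC")); // true
        System.out.println(isUsableH264Encoder(HISI_AVC_ENCODER));            // false
    }
}
```

In the callback, `isSupportedH264(info)` would then simply return `H264EncoderPolicy.isUsableH264Encoder(info.getName())`.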
- Replacing the encoder factory in the project
(1) Code before the change
PeerConnectionFactory.initialize(
PeerConnectionFactory.InitializationOptions
.builder(context)
.createInitializationOptions()
)
peerConnectionFactory = PeerConnectionFactory.builder()
.setVideoDecoderFactory(DefaultVideoDecoderFactory(rootEglBase!!.eglBaseContext))
.setVideoEncoderFactory(
DefaultVideoEncoderFactory(
rootEglBase!!.eglBaseContext,
true/*ENABLE_INTEL_VP8_ENCODER*/,
true/*ENABLE_H264_HIGH_PROFILE*/
)
)
.createPeerConnectionFactory()
(2) Code after the change (note: which encoders to enable depends on what the master supports; alternatively, enable all of the mainstream ones)
PeerConnectionFactory.initialize(
PeerConnectionFactory.InitializationOptions
.builder(context)
.createInitializationOptions()
)
// Create the custom encoder factory
val encoderFactory = createCustomVideoEncoderFactory(rootEglBase!!.eglBaseContext,
enableIntelVp8Encoder = true,
enableH264HighProfile = true,
videoEncoderSupportedCallback = object : VideoEncoderSupportedCallback {
override fun isSupportedVp8(info: MediaCodecInfo): Boolean {
// VP8 encoding disabled
return false
}
override fun isSupportedVp9(info: MediaCodecInfo): Boolean {
// VP9 encoding disabled
return false
}
override fun isSupportedH264(info: MediaCodecInfo): Boolean {
// H264 encoding enabled
return true
}
})
peerConnectionFactory = PeerConnectionFactory.builder()
.setVideoDecoderFactory(DefaultVideoDecoderFactory(rootEglBase!!.eglBaseContext))
.setVideoEncoderFactory(encoderFactory) // replaced with the custom encoder factory
.createPeerConnectionFactory()
Problem 2: viewer cannot pull the stream across networks
Description: this project uses Amazon's KVS SDK for P2P video, but instead of generating an awsconfiguration.json file per the official docs and bundling it, it fetches awsAccessKey, awsSecretKey, sessionToken and region from a backend API. That means the wiring in Amazon's official demo has to be changed. demo github url
Details: since awsconfiguration.json is no longer used, every place that reads it must be replaced. I found three such places, and I missed one during development: I could pull the stream from the device, while QA could not. After repeated testing it turned out that streaming failed whenever the device and the app were on different Wi-Fi networks. The cause was that the TURN configuration code still defaulted to the credentials from awsconfiguration.json, which no longer existed, so TURN was silently broken and cross-network sessions could not pull video. (TURN is a relay service that exists precisely to carry media across different networks.)
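A quick way to confirm TURN is actually usable is to watch the ICE candidates gathered during a session: with a working TURN configuration you should see candidates of type `relay` in addition to `host`/`srflx`. The candidate string carries the type after the literal token `typ`, so the check is a small parse (a sketch; the sample candidate string is illustrative):

```java
public class IceCandidateTypeCheck {
    // Returns the "typ" field of an ICE candidate line, e.g. "host", "srflx", "relay".
    public static String candidateType(String candidate) {
        String[] parts = candidate.split(" ");
        for (int i = 0; i < parts.length - 1; i++) {
            if (parts[i].equals("typ")) {
                return parts[i + 1];
            }
        }
        return "";
    }

    public static void main(String[] args) {
        String relay = "candidate:3 1 udp 41885439 198.51.100.7 3478 typ relay "
                + "raddr 203.0.113.5 rport 54321";
        System.out.println(candidateType(relay)); // relay
        // If no "relay" candidates ever appear, TURN is not reachable and
        // cross-network sessions will fail exactly as described above.
    }
}
```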
Solution: the changes below are shown against the demo; you will need to port them into your own project. Note that the official demo implements both the master and the viewer.
- Changes (1) and (2): in the demo's StreamWebRtcConfigurationFragment class, find the following
Before:
private AWSKinesisVideoClient getAwsKinesisVideoClient(final String region) {
final AWSKinesisVideoClient awsKinesisVideoClient = new AWSKinesisVideoClient(
KinesisVideoWebRtcDemoApp.getCredentialsProvider().getCredentials());
awsKinesisVideoClient.setRegion(Region.getRegion(region));
awsKinesisVideoClient.setSignerRegionOverride(region);
awsKinesisVideoClient.setServiceNameIntern("kinesisvideo");
return awsKinesisVideoClient;
}
private AWSKinesisVideoSignalingClient getAwsKinesisVideoSignalingClient(final String region, final String endpoint) {
final AWSKinesisVideoSignalingClient client = new AWSKinesisVideoSignalingClient(
KinesisVideoWebRtcDemoApp.getCredentialsProvider().getCredentials());
client.setRegion(Region.getRegion(region));
client.setSignerRegionOverride(region);
client.setServiceNameIntern("kinesisvideo");
client.setEndpoint(endpoint);
return client;
}
After:
// The keys and token passed to BasicSessionCredentials(awsAccessKey, awsSecretKey, sessionToken) now come from the backend API
private AWSKinesisVideoClient getAwsKinesisVideoClient(final String region) {
final AWSKinesisVideoClient awsKinesisVideoClient = new AWSKinesisVideoClient(
new BasicSessionCredentials(awsAccessKey, awsSecretKey, sessionToken)
);
awsKinesisVideoClient.setRegion(Region.getRegion(region));
awsKinesisVideoClient.setSignerRegionOverride(region);
awsKinesisVideoClient.setServiceNameIntern("kinesisvideo");
return awsKinesisVideoClient;
}
// Same change as above; this is the spot I missed, which broke cross-network streaming
// This method provides the KVS signaling channel client; the demo uses it to fetch the ICE server configuration, so it directly affects whether TURN relaying works
private AWSKinesisVideoSignalingClient getAwsKinesisVideoSignalingClient(final String region, final String endpoint) {
final AWSKinesisVideoSignalingClient client = new AWSKinesisVideoSignalingClient(
new BasicSessionCredentials(awsAccessKey, awsSecretKey, sessionToken)
);
client.setRegion(Region.getRegion(region));
client.setSignerRegionOverride(region);
client.setServiceNameIntern("kinesisvideo");
client.setEndpoint(endpoint);
return client;
}
- Change (3): in the demo's WebRtcActivity class, modify the following inside initWsConnection()
Before:
runOnUiThread(new Runnable() {
@Override
public void run() {
mCreds = KinesisVideoWebRtcDemoApp.getCredentialsProvider().getCredentials();
}
});
After:
runOnUiThread(new Runnable() {
@Override
public void run() {
//mCreds = KinesisVideoWebRtcDemoApp.getCredentialsProvider().getCredentials();
// These keys and the token are also obtained from the backend API
mCreds = new BasicSessionCredentials(awsAccessKey, awsSecretKey, sessionToken);
}
});
Summary
These are the problems I hit during development. It was my first time working with AWS KVS and WebRTC, so I recorded the solutions here; the issues you run into may of course differ. If this is also your first time with these technologies, read the documentation carefully and understand the official demo's logic first; it will save you many bugs. Finally, if this article helped you, please give it a like!