A detailed analysis of the WebRTC video capture, encoding, and sending flow (hopefully useful to readers who want a deeper look at the internals)

I usually write these articles along two lines of thought:
1. Which object's method gets called at the next step.
2. How the object at the current step gets associated with the object at the next step.
This walkthrough mainly explains how each step's object is wired up to the next step's object. What each step does internally cannot be shown in full detail, or the article would become far too large, so readers will need to study the parts that matter to them on their own.
//-----------------------------------------------------------------------------
//
// Creating the video encoder and associating it with the source
//
//-----------------------------------------------------------------------------
1. Java
PeerConnectionClient::createPeerConnectionInternal
peerConnection.addTrack(createVideoTrack(videoCapturer), mediaStreamLabels);
1.1 The setup done by createVideoTrack
PeerConnectionClient::createVideoTrack(VideoCapturer capturer)
//---------------------------------------------------------
// This creates an AndroidVideoTrackSource object at the JNI layer
//---------------------------------------------------------
videoSource = factory.createVideoSource(capturer.isScreencast());
//--------------------------------------------------------
// This binds a track object to the source
//--------------------------------------------------------
localVideoTrack = factory.createVideoTrack(VIDEO_TRACK_ID, videoSource);
1.2
PeerConnectionFactory::createVideoTrack(String id, VideoSource source)
checkPeerConnectionFactoryExists();
return new VideoTrack(nativeCreateVideoTrack(nativeFactory, id, source.getNativeVideoTrackSource()));
1.3
nativeCreateVideoTrack
1.4
Java_org_webrtc_PeerConnectionFactory_nativeCreateVideoTrack(JNIEnv* env, jclass jcaller, jlong factory, jstring id, jlong nativeVideoSource) {
return JNI_PeerConnectionFactory_CreateVideoTrack(env, factory, base::android::JavaParamRef<jstring>(env, id), nativeVideoSource);
}
1.5 ./sdk/android/src/jni/pc/peer_
JNI_PeerConnectionFactory_CreateVideoTrack(JNIEnv* jni, jlong native_factory, const JavaParamRef<jstring>& id, jlong native_source) {
rtc::scoped_refptr<VideoTrackInterface> track = PeerConnectionFactoryFromJava(native_factory)->CreateVideoTrack(
JavaToStdString(jni, id), reinterpret_cast<VideoTrackSourceInterface*>(native_source));
return jlongFromPointer(track.release());
}
native_source here is the AndroidVideoTrackSource.
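A side note on the jlong round-trip in steps 1.4/1.5: Java holds the native object only as an opaque 64-bit handle, and the JNI layer casts it back to a pointer. A minimal, self-contained sketch of that pattern (type names simplified, not the real WebRTC classes):

#include <cstdint>

// Simplified stand-ins for the real WebRTC types.
struct VideoTrackSourceInterface {
  virtual ~VideoTrackSourceInterface() = default;
};
struct AndroidVideoTrackSource : VideoTrackSourceInterface {};

using jlong = int64_t;  // JNI's 64-bit integer type.

// What the JNI "create" path does: allocate natively, hand the
// address back to Java as an opaque jlong.
jlong CreateNativeSource() {
  return reinterpret_cast<jlong>(
      static_cast<VideoTrackSourceInterface*>(new AndroidVideoTrackSource()));
}

// What JNI_PeerConnectionFactory_CreateVideoTrack does with
// native_source: cast the jlong back to the pointer it started as.
VideoTrackSourceInterface* FromHandle(jlong native_source) {
  return reinterpret_cast<VideoTrackSourceInterface*>(native_source);
}

int main() {
  jlong handle = CreateNativeSource();  // stored in the Java VideoSource
  VideoTrackSourceInterface* source = FromHandle(handle);
  delete source;  // the real code manages lifetime with rtc::scoped_refptr
}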
1.6 ./pc/peer_
rtc::scoped_refptr<VideoTrackInterface> PeerConnectionFactory::CreateVideoTrack(const std::string& id, VideoTrackSourceInterface* source) {
RTC_DCHECK(signaling_thread_->IsCurrent());
rtc::scoped_refptr<VideoTrackInterface> track(VideoTrack::Create(id, source, worker_thread_));
return VideoTrackProxy::Create(signaling_thread_, worker_thread_, track);
}
1.7 ./pc/
rtc::scoped_refptr<VideoTrack> VideoTrack::Create(const std::string& id, VideoTrackSourceInterface* source, rtc::Thread* worker_thread) {
rtc::RefCountedObject<VideoTrack>* track = new rtc::RefCountedObject<VideoTrack>(id, source, worker_thread);
return track;
}
From the flow above we can see that the VideoTrack contains the AndroidVideoTrackSource.
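Note also that step 1.6 does not return the raw VideoTrack but wraps it in a VideoTrackProxy. Conceptually (a heavily simplified sketch; the real proxies are generated by WebRTC's proxy macros and marshal calls with rtc::Thread), the proxy bounces every call onto the owning thread before touching the wrapped object:

#include <functional>

// Simplified stand-in for the real VideoTrack.
struct VideoTrack {
  bool enabled = true;
  bool set_enabled(bool enable) { enabled = enable; return true; }
};

// Sketch of a thread-affine proxy: every method marshals the call to
// the owning thread (modeled here as a plain callable) and only then
// invokes the real object.
struct VideoTrackProxy {
  VideoTrack* track;
  std::function<void(std::function<void()>)> invoke_on_worker;

  bool set_enabled(bool enable) {
    bool result = false;
    invoke_on_worker([&] { result = track->set_enabled(enable); });
    return result;
  }
};

int main() {
  VideoTrack track;
  // For the sketch, "invoke" just runs the closure inline.
  VideoTrackProxy proxy{&track, [](std::function<void()> fn) { fn(); }};
  proxy.set_enabled(false);
  return track.enabled ? 1 : 0;
}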
2. Java
PeerConnection::addTrack
RtpSender newSender = nativeAddTrack(track.getNativeMediaStreamTrack(), streamIds);
3.
Java_org_webrtc_PeerConnection_nativeAddTrack
JNI_PeerConnection_AddTrack
4. ./sdk/android/src/jni/pc/
JNI_PeerConnection_AddTrack
ExtractNativePC(jni, j_pc)->AddTrack(reinterpret_cast<MediaStreamTrackInterface*>(native_track),
JavaListToNativeVector<std::string, jstring>(jni, j_stream_labels, &JavaToNativeString));
5.
PeerConnection::AddTrack(rtc::scoped_refptr<MediaStreamTrackInterface> track, const std::vector<std::string>& stream_ids)
auto sender_or_error = (IsUnifiedPlan() ? AddTrackUnifiedPlan(track, stream_ids) : AddTrackPlanB(track, stream_ids));
6.
PeerConnection::AddTrackPlanB(rtc::scoped_refptr<MediaStreamTrackInterface> track, const std::vector<std::string>& stream_ids)
auto new_sender = CreateSender(media_type, track->id(), track, adjusted_stream_ids, {});
if (track->kind() == MediaStreamTrackInterface::kAudioKind) {
new_sender->internal()->SetMediaChannel(voice_media_channel());
GetAudioTransceiver()->internal()->AddSender(new_sender);
const RtpSenderInfo* sender_info = FindSenderInfo(local_audio_sender_infos_, new_sender->internal()->stream_ids()[0], track->id());
if (sender_info) {
new_sender->internal()->SetSsrc(sender_info->first_ssrc);
}
} else {
RTC_DCHECK_EQ(MediaStreamTrackInterface::kVideoKind, track->kind());
new_sender->internal()->SetMediaChannel(video_media_channel());
GetVideoTransceiver()->internal()->AddSender(new_sender);
const RtpSenderInfo* sender_info = FindSenderInfo(local_video_sender_infos_, new_sender->internal()->stream_ids()[0], track->id());
if (sender_info) {
new_sender->internal()->SetSsrc(sender_info->first_ssrc);
}
}
return rtc::scoped_refptr<RtpSenderInterface>(new_sender);
7.
PeerConnection::CreateSender(cricket::MediaType media_type, const std::string& id, rtc::scoped_refptr<MediaStreamTrackInterface> track, const std::vector<std::string>& stream_ids, const std::vector<RtpEncodingParameters>& send_encodings)
sender = RtpSenderProxyWithInternal<RtpSenderInternal>::Create(signaling_thread(), VideoRtpSender::Create(worker_thread(), id, this));
NoteUsageEvent(UsageEvent::VIDEO_ADDED);
bool set_track_succeeded = sender->SetTrack(track); ---> step 8
RTC_DCHECK(set_track_succeeded);
sender->internal()->set_stream_ids(stream_ids);
sender->internal()->set_init_send_encodings(send_encodings);
8.
VideoRtpSender::SetTrack(MediaStreamTrackInterface* track)
RtpSenderBase::SetTrack(MediaStreamTrackInterface* track)
SetSend();
9.
VideoRtpSender::SetSend()
cricket::VideoOptions options;
VideoTrackSourceInterface* source = video_track()->GetSource();
if (source) {
options.is_screencast = source->is_screencast();
options.video_noise_reduction = source->needs_denoising();
}
switch (cached_track_content_hint_) {
case VideoTrackInterface::ContentHint::kNone:
break;
case VideoTrackInterface::ContentHint::kFluid:
options.is_screencast = false;
break;
case VideoTrackInterface::ContentHint::kDetailed:
case VideoTrackInterface::ContentHint::kText:
options.is_screencast = true;
break;
}
bool success = worker_thread_->Invoke<bool>(RTC_FROM_HERE, [&] {
return video_media_channel()->SetVideoSend(ssrc_, &options, video_track());
});
10.
WebRtcVideoChannel::SetVideoSend(uint32_t ssrc, const VideoOptions* options, rtc::VideoSourceInterface<webrtc::VideoFrame>* source)
const auto& kv = send_streams_.find(ssrc);
if (kv == send_streams_.end()) {
// Allow unknown ssrc only if source is null.
RTC_CHECK(source == nullptr);
RTC_LOG(LS_ERROR) << "No sending stream on ssrc " << ssrc;
return false;
}
return kv->second->SetVideoSend(options, source);
send_streams_ maps each SSRC to a WebRtcVideoChannel::WebRtcVideoSendStream; the entries are created by WebRtcVideoChannel::AddSendStream(const StreamParams& sp), covered in the later section on how the WebRtcVideoChannel object's AddSendStream populates send_streams_ (WebRtcVideoSendStream). A rough sketch of this relationship is shown below.
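A minimal sketch of that map (names simplified, not the actual declarations): AddSendStream inserts a per-SSRC stream object, and step 10's SetVideoSend looks it up and forwards to it.

#include <cstdint>
#include <map>

// Simplified stand-in for WebRtcVideoChannel::WebRtcVideoSendStream.
struct SendStream {
  bool SetVideoSend() { return true; }  // options/source omitted
};

// One entry per sending SSRC; AddSendStream inserts, SetVideoSend looks up.
std::map<uint32_t, SendStream*> send_streams_;

bool SetVideoSend(uint32_t ssrc) {
  auto kv = send_streams_.find(ssrc);
  if (kv == send_streams_.end())
    return false;  // no sending stream on this ssrc
  return kv->second->SetVideoSend();
}

int main() {
  SendStream stream;
  send_streams_[0x1234] = &stream;  // what AddSendStream ends up doing
  return SetVideoSend(0x1234) ? 0 : 1;
}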
11.
WebRtcVideoChannel::WebRtcVideoSendStream::SetVideoSend(const VideoOptions* options,
rtc::VideoSourceInterface<webrtc::VideoFrame>* source)
ReconfigureEncoder();  ---> flow 12
source_ = source;
if (source && stream_) {
// This statement ties the source and the encoder object together
stream_->SetSource(this, GetDegradationPreference());
}
stream_ is the VideoSendStream object; for its creation see
void WebRtcVideoChannel::WebRtcVideoSendStream::RecreateWebRtcStream(), which does
stream_ = call_->CreateVideoSendStream(std::move(config), parameters_.encoder_config.Copy());
Now let's continue the analysis from the stream_->SetSource interface.
11.1
void VideoSendStream::SetSource(rtc::VideoSourceInterface<webrtc::VideoFrame>* source,
const DegradationPreference& degradation_preference) {
RTC_DCHECK_RUN_ON(&thread_checker_);
video_stream_encoder_->SetSource(source, degradation_preference);
}
11.2
void VideoStreamEncoder::SetSource(rtc::VideoSourceInterface<VideoFrame>* source,
const DegradationPreference& degradation_preference)
// This is where the source gets associated with the encoder
source_proxy_->SetSource(source, degradation_preference);
source_proxy_ is created when the VideoStreamEncoder is constructed:
./video/video_
VideoStreamEncoder::VideoStreamEncoder(
Clock* clock,
uint32_t number_of_cores,
VideoStreamEncoderObserver* encoder_stats_observer,
const VideoStreamEncoderSettings& settings,
std::unique_ptr<OveruseFrameDetector> overuse_detector,
TaskQueueFactory* task_queue_factory)
: source_proxy_(new VideoSourceProxy(this)),
The VideoSourceProxy's video_stream_encoder_ is this VideoStreamEncoder:
./video/video_
explicit VideoSourceProxy(VideoStreamEncoder* video_stream_encoder)
: video_stream_encoder_(video_stream_encoder),
degradation_preference_(DegradationPreference::DISABLED),
source_(nullptr),
max_framerate_(std::numeric_limits<int>::max()) {}
11.3 This is where the VideoStreamEncoder gets associated with the source
void VideoSourceProxy::SetSource(rtc::VideoSourceInterface<VideoFrame>* source,
const DegradationPreference& degradation_preference) {
// Called on libjingle's worker thread.
RTC_DCHECK_RUN_ON(&main_checker_);
rtc::VideoSourceInterface<VideoFrame>* old_source = nullptr;
rtc::VideoSinkWants wants;
{
rtc::CritScope lock(&crit_);
degradation_preference_ = degradation_preference;
old_source = source_;
source_ = source;
wants = GetActiveSinkWantsInternal();
}
if (old_source != source && old_source != nullptr) {
old_source->RemoveSink(video_stream_encoder_);
}
if (!source) {
return;
}
// source is the WebRtcVideoChannel::WebRtcVideoSendStream object set above
source->AddOrUpdateSink(video_stream_encoder_, wants);
}
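The detach-then-attach dance above is worth noting: the encoder sink is removed from the old source before the new source starts feeding it, so frames never arrive from two sources at once. A small sketch of the same swap semantics (simplified types, not the real classes):

#include <algorithm>
#include <vector>

struct Sink {};

// A frame source that keeps a list of registered sinks.
struct Source {
  std::vector<Sink*> sinks;
  void AddOrUpdateSink(Sink* sink) { sinks.push_back(sink); }
  void RemoveSink(Sink* sink) {
    sinks.erase(std::remove(sinks.begin(), sinks.end(), sink), sinks.end());
  }
};

// Mirrors VideoSourceProxy::SetSource: detach the encoder sink from the
// old source before attaching it to the new one.
struct Proxy {
  Sink* encoder_sink;
  Source* source = nullptr;
  void SetSource(Source* new_source) {
    Source* old_source = source;
    source = new_source;
    if (old_source != nullptr && old_source != new_source)
      old_source->RemoveSink(encoder_sink);
    if (new_source != nullptr)
      new_source->AddOrUpdateSink(encoder_sink);
  }
};

int main() {
  Sink encoder;
  Source a, b;
  Proxy proxy{&encoder};
  proxy.SetSource(&a);  // encoder attached to a
  proxy.SetSource(&b);  // detached from a, attached to b
  return a.sinks.empty() && b.sinks.size() == 1 ? 0 : 1;
}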
11.4
void WebRtcVideoChannel::WebRtcVideoSendStream::AddOrUpdateSink(
rtc::VideoSinkInterface<webrtc::VideoFrame>* sink, const rtc::VideoSinkWants& wants)
encoder_sink_ = sink;
source_->AddOrUpdateSink(encoder_sink_, wants);
11.5
void VideoTrack::AddOrUpdateSink(rtc::VideoSinkInterface<VideoFrame>* sink, const rtc::VideoSinkWants& wants) {
RTC_DCHECK(worker_thread_->IsCurrent());
VideoSourceBase::AddOrUpdateSink(sink, wants);
rtc::VideoSinkWants modified_wants = wants;
modified_wants.black_frames = !enabled();
video_source_->AddOrUpdateSink(sink, modified_wants);
}
video_source_ is the AndroidVideoTrackSource, so this final call registers the VideoStreamEncoder as a sink on the capture source: the association between source and encoder is now complete, and every captured frame will flow from the source, through the track, into the encoder.
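Putting steps 11.1 to 11.5 together: the registration call travels from the encoder side down to the capturer-side source, and afterwards frames travel the opposite way. A minimal, self-contained sketch of the finished chain (class names simplified, not the real WebRTC types):

#include <iostream>
#include <vector>

struct VideoFrame { int id; };

struct VideoSinkInterface {
  virtual ~VideoSinkInterface() = default;
  virtual void OnFrame(const VideoFrame& frame) = 0;
};

struct VideoSourceInterface {
  virtual ~VideoSourceInterface() = default;
  virtual void AddOrUpdateSink(VideoSinkInterface* sink) = 0;
};

// Bottom of the chain: the capturer-backed source (AndroidVideoTrackSource
// in this trace) fans captured frames out to its registered sinks.
struct TrackSource : VideoSourceInterface {
  std::vector<VideoSinkInterface*> sinks;
  void AddOrUpdateSink(VideoSinkInterface* sink) override {
    sinks.push_back(sink);
  }
  void DeliverFrame(const VideoFrame& frame) {
    for (VideoSinkInterface* s : sinks) s->OnFrame(frame);
  }
};

// VideoTrack: forwards sink registration to its source (step 11.5).
struct Track : VideoSourceInterface {
  VideoSourceInterface* source;
  explicit Track(VideoSourceInterface* s) : source(s) {}
  void AddOrUpdateSink(VideoSinkInterface* sink) override {
    source->AddOrUpdateSink(sink);
  }
};

// WebRtcVideoSendStream: remembers the encoder sink and forwards the
// registration to the track (step 11.4).
struct SendStream : VideoSourceInterface {
  VideoSourceInterface* source = nullptr;  // the Track, set by SetVideoSend
  VideoSinkInterface* encoder_sink = nullptr;
  void AddOrUpdateSink(VideoSinkInterface* sink) override {
    encoder_sink = sink;
    source->AddOrUpdateSink(encoder_sink);
  }
};

// VideoStreamEncoder: the final sink; in the real code OnFrame hands the
// frame to the encoder, whose output goes on to the RTP sender.
struct Encoder : VideoSinkInterface {
  void OnFrame(const VideoFrame& frame) override {
    std::cout << "encoding frame " << frame.id << "\n";
  }
};

int main() {
  TrackSource source;       // AndroidVideoTrackSource
  Track track(&source);     // VideoTrack wrapping the source (step 1)
  SendStream stream;        // WebRtcVideoSendStream
  stream.source = &track;   // SetVideoSend (steps 10/11)
  Encoder encoder;          // VideoStreamEncoder
  stream.AddOrUpdateSink(&encoder);  // step 11.3: source->AddOrUpdateSink
  source.DeliverFrame({1}); // capture -> track -> stream -> encoder
}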
