New video streaming manager - ICameraStreamManager - addReceiveStreamListener vs addFrameListener
Hello!
I'm looking at the following documentation for the new video streaming manager:
- Comparison between the old and the new API.
- Official SDK docs for ICameraStreamManager
The docs for the new class say the following:
- Add the video stream data listener addReceiveStreamListener to receive the video stream data. The video stream data can be used for functions such as self-decoding display and third-party livestreaming.
- Add the frame data listener addFrameListener to receive frame data. Frame data can be used for algorithm processing of functions such as AI recognition.
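For reference, registering the two listeners looks roughly like the sketch below. It is only a minimal sketch based on the linked v5 docs; the package names, the LEFT_OR_MAIN / YUV420_888 values, and the exact callback signatures are assumptions and should be checked against the SDK version you use.

```java
// Minimal sketch of registering both listeners (DJI MSDK v5, Java).
// Package names, enum values and callback signatures are assumptions
// based on the linked docs and may differ in your SDK version.
import dji.sdk.keyvalue.value.common.ComponentIndexType;
import dji.v5.manager.datacenter.MediaDataCenter;
import dji.v5.manager.interfaces.ICameraStreamManager;

public class StreamListenerSetup {

    public void registerListeners() {
        ICameraStreamManager manager =
                MediaDataCenter.getInstance().getCameraStreamManager();

        // Encoded stream packets (H.264/H.265) - what the docs recommend
        // for self-decoding display and third-party live streaming.
        manager.addReceiveStreamListener(ComponentIndexType.LEFT_OR_MAIN,
                (data, offset, length, info) -> {
                    // data[offset .. offset + length) holds encoded video data.
                });

        // Decoded frames in a requested pixel format - what the docs
        // recommend for algorithm processing such as AI recognition.
        manager.addFrameListener(ComponentIndexType.LEFT_OR_MAIN,
                ICameraStreamManager.FrameFormat.YUV420_888,
                (frameData, offset, length, width, height, format) -> {
                    // frameData holds one decoded frame in the requested format.
                });
    }
}
```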
Based on the comparison table between the old and new API, addFrameListener replaces addYuvDataListener. We have always used addYuvDataListener to get YUV data for our custom WebRTC live-streaming pipeline, so we find it interesting that you actually recommend addReceiveStreamListener for third-party live streaming. The old-API counterpart of addReceiveStreamListener() is addStreamDataListener, right? For that function, the docs say the following:
- If you want to decode the H.264 video stream by yourself, you can call addStreamDataListener to monitor the H.264 video stream and handle the decoding and streaming yourself.
So it seems to deliver the already encoded H.264 video, which is not useful if you want to use, for example, VP8/VP9/H.265 for the transmission, since you first have to decode the H.264 stream. Why is addReceiveStreamListener then recommended over addFrameListener for live streaming, when the latter gives you raw YUV data, which is much more suitable for encoding to various video codecs? Is there perhaps a performance difference between the two, or something else that makes addReceiveStreamListener the more suitable choice for live streaming?
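For context, our YUV-based path looks roughly like the sketch below: each frame from addFrameListener is wrapped into a WebRTC I420 buffer and handed to libwebrtc, which then encodes it with whatever codec was negotiated. This is illustrative only; it assumes the frame arrives as tightly packed planar I420, and the class name and the capturerObserver wiring are our own, not anything from the DJI docs.

```java
// Illustrative sketch: forwarding one YUV frame from addFrameListener into
// a WebRTC video source so libwebrtc can encode it (VP8/VP9/...).
// Assumptions (not from the DJI docs): the frame is tightly packed planar
// I420, and `capturerObserver` is an org.webrtc.CapturerObserver obtained
// from our own VideoSource. Strides/formats must be verified in practice.
import java.nio.ByteBuffer;
import org.webrtc.CapturerObserver;
import org.webrtc.JavaI420Buffer;
import org.webrtc.VideoFrame;

public class YuvToWebRtcBridge {

    private final CapturerObserver capturerObserver;

    public YuvToWebRtcBridge(CapturerObserver capturerObserver) {
        this.capturerObserver = capturerObserver;
    }

    // Called from the addFrameListener callback with one decoded frame.
    public void onYuvFrame(byte[] frameData, int width, int height, long timestampNs) {
        int ySize = width * height;
        int uvSize = ySize / 4;

        // Split the packed I420 buffer into its Y, U and V planes.
        ByteBuffer y = ByteBuffer.allocateDirect(ySize);
        ByteBuffer u = ByteBuffer.allocateDirect(uvSize);
        ByteBuffer v = ByteBuffer.allocateDirect(uvSize);
        y.put(frameData, 0, ySize).rewind();
        u.put(frameData, ySize, uvSize).rewind();
        v.put(frameData, ySize + uvSize, uvSize).rewind();

        JavaI420Buffer buffer = JavaI420Buffer.wrap(
                width, height,
                y, width,       // Y plane and stride
                u, width / 2,   // U plane and stride
                v, width / 2,   // V plane and stride
                null);          // release callback

        VideoFrame frame = new VideoFrame(buffer, 0 /* rotation */, timestampNs);
        capturerObserver.onFrameCaptured(frame);
        frame.release();
    }
}
```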
-
Thank you for your detailed description and your suggestions for our documentation. When we say that ReceiveStreamListener can be used for live streaming, we have in mind that most live streaming protocols use H.264, such as RTSP and RTP. Using YUV data for the scenario you mention may indeed be better. There is no performance difference between ReceiveStreamListener and FrameListener.
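To illustrate that point: with an H.264-based protocol, the packets delivered by addReceiveStreamListener can be forwarded without any decode/re-encode step. A rough sketch, where RtpSender is a hypothetical stand-in for whatever packetizer/muxer the application actually uses (it is not part of the SDK):

```java
// Rough sketch of why ReceiveStreamListener fits H.264-based protocols:
// the encoded packets can be forwarded to the streaming layer as-is.
// `RtpSender` is a hypothetical interface standing in for whatever
// packetizer/muxer (RTP, RTSP, RTMP, ...) the app actually uses.
public class EncodedStreamForwarder {

    public interface RtpSender {
        void sendH264(byte[] data, int offset, int length);
    }

    private final RtpSender rtpSender;

    public EncodedStreamForwarder(RtpSender rtpSender) {
        this.rtpSender = rtpSender;
    }

    // Called from the addReceiveStreamListener callback.
    public void onReceiveStream(byte[] data, int offset, int length) {
        // No decode/re-encode step: hand the encoded packet straight on.
        rtpSender.sendH264(data, offset, length);
    }
}
```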