When implementing an RTSP or RTMP player on Windows, there are many points to consider: multi-instance design, compatibility with multiple rendering modes, software and hardware decoding, snapshots, automatic TCP/UDP switching for RTSP, and so on. Below we take a brief look at a few of them.
1. Video rendering mode
When rendering video on Windows, D3D is usually the first choice; if D3D is not available, fall back to having the decoded data called back to the application and drawing it with GDI. A typical implementation, using the 大牛直播SDK (SmartPlayer) demo on Github as an example, first calls NT_SP_IsSupportD3DRender() to check whether D3D rendering is supported. If it is, call NT_SP_SetRenderWindow() to bind the render window, then choose whether to scale proportionally via NT_SP_SetRenderScaleMode().
bool is_support_d3d_render = false;
Int32 in_support_d3d_render = 0;

if (NT.NTBaseCodeDefine.NT_ERC_OK == NTSmartPlayerSDK.NT_SP_IsSupportD3DRender(player_handle_, playWnd.Handle, ref in_support_d3d_render))
{
    if (1 == in_support_d3d_render)
    {
        is_support_d3d_render = true;
    }
}

if (is_support_d3d_render)
{
    is_gdi_render_ = false;

    // D3D rendering is supported, so let the SDK draw into the window
    NTSmartPlayerSDK.NT_SP_SetRenderWindow(player_handle_, playWnd.Handle);

    if (btn_check_render_scale_mode.Checked)
    {
        NTSmartPlayerSDK.NT_SP_SetRenderScaleMode(player_handle_, 1);
    }
    else
    {
        NTSmartPlayerSDK.NT_SP_SetRenderScaleMode(player_handle_, 0);
    }
}
else
{
    is_gdi_render_ = true;
    playWnd.Visible = false;

    // D3D is not supported: have the player call the decoded frames back and draw them with GDI.
    // Video frame callback (YUV/RGB); for the format see NT_SP_E_VIDEO_FRAME_FORMAT,
    // to receive YUV callbacks set NT_SP_E_VIDEO_FRAME_FROMAT_I420
    video_frame_call_back_ = new SP_SDKVideoFrameCallBack(SetVideoFrameCallBack);

    NTSmartPlayerSDK.NT_SP_SetVideoFrameCallBack(player_handle_,
        (Int32)NT.NTSmartPlayerDefine.NT_SP_E_VIDEO_FRAME_FORMAT.NT_SP_E_VIDEO_FRAME_FORMAT_RGB32,
        IntPtr.Zero, video_frame_call_back_);
}
If D3D is not supported, set an RGB data callback:
video_frame_call_back_ = new SP_SDKVideoFrameCallBack(SetVideoFrameCallBack);

NTSmartPlayerSDK.NT_SP_SetVideoFrameCallBack(player_handle_,
    (Int32)NT.NTSmartPlayerDefine.NT_SP_E_VIDEO_FRAME_FORMAT.NT_SP_E_VIDEO_FRAME_FORMAT_RGB32,
    IntPtr.Zero, video_frame_call_back_);
The callback handles the frame data as follows:
public void SetVideoFrameCallBack(IntPtr handle, IntPtr userData, UInt32 status, IntPtr frame)
{
    if (frame == IntPtr.Zero)
    {
        return;
    }

    // To process the RGB data directly, follow the flow below
    NT_SP_VideoFrame video_frame = (NT_SP_VideoFrame)Marshal.PtrToStructure(frame, typeof(NT_SP_VideoFrame));

    NT_SP_VideoFrame pVideoFrame = new NT_SP_VideoFrame();
    pVideoFrame.format_    = video_frame.format_;
    pVideoFrame.width_     = video_frame.width_;
    pVideoFrame.height_    = video_frame.height_;
    pVideoFrame.timestamp_ = video_frame.timestamp_;
    pVideoFrame.stride0_   = video_frame.stride0_;
    pVideoFrame.stride1_   = video_frame.stride1_;
    pVideoFrame.stride2_   = video_frame.stride2_;
    pVideoFrame.stride3_   = video_frame.stride3_;

    Int32 argb_size = video_frame.stride0_ * video_frame.height_;

    pVideoFrame.plane0_ = Marshal.AllocHGlobal(argb_size);
    CopyMemory(pVideoFrame.plane0_, video_frame.plane0_, (UInt32)argb_size);

    if (playWnd.InvokeRequired)
    {
        BeginInvoke(set_video_frame_call_back_, status, pVideoFrame);
    }
    else
    {
        set_video_frame_call_back_(status, pVideoFrame);
    }
}
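The snippet above relies on a Win32 CopyMemory import and on the UI-thread delegate set_video_frame_call_back_, neither of which is shown here. A minimal sketch of both, assuming the delegate simply caches the frame in a cur_video_frame_ field for OnPaint to read (the handler name and field usage are illustrative, not part of the SDK):

// Win32 memory copy used by SetVideoFrameCallBack above
[DllImport("kernel32.dll", EntryPoint = "RtlMoveMemory")]
public static extern void CopyMemory(IntPtr dest, IntPtr src, UInt32 count);

// UI-thread target of set_video_frame_call_back_: free the previous copy,
// cache the new frame and ask the form to repaint (GDI mode only).
private void OnUIVideoFrame(UInt32 status, NT_SP_VideoFrame frame)
{
    if (cur_video_frame_.plane0_ != IntPtr.Zero)
    {
        // release the buffer allocated in SetVideoFrameCallBack
        Marshal.FreeHGlobal(cur_video_frame_.plane0_);
    }

    cur_video_frame_ = frame;

    if (is_gdi_render_ && is_playing_)
    {
        this.Invalidate();  // triggers SmartPlayerForm_Paint with the new frame
    }
}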
Then just draw it in OnPaint():
private void SmartPlayerForm_Paint(object sender, PaintEventArgs e)
{
    if (player_handle_ == IntPtr.Zero || !is_gdi_render_ || !is_playing_)
    {
        return;
    }

    if (cur_video_frame_.plane0_ == IntPtr.Zero)
    {
        return;
    }

    Bitmap bitmap = new Bitmap(cur_video_frame_.width_, cur_video_frame_.height_, cur_video_frame_.stride0_,
        System.Drawing.Imaging.PixelFormat.Format32bppRgb, cur_video_frame_.plane0_);

    int image_width  = cur_video_frame_.width_;
    int image_height = cur_video_frame_.height_;

    Graphics g = e.Graphics; // the form's drawing surface
    g.SmoothingMode = SmoothingMode.HighSpeed;

    int limit_w = this.Width - 60;
    int limit_h = this.Height - playWnd.Top - 60;

    if (btn_check_render_scale_mode.Checked)
    {
        int d_w = 0, d_h = 0;
        int left_offset = 0;
        int top_offset = 0;

        Brush brush = new SolidBrush(Color.Black);
        g.FillRectangle(brush, playWnd.Left, playWnd.Top, limit_w, limit_h);

        GetRenderRect(limit_w, limit_h, image_width, image_height, ref left_offset, ref top_offset, ref d_w, ref d_h);

        // draw the in-memory image onto the form, keeping the aspect ratio
        g.DrawImage(bitmap, playWnd.Left + left_offset, playWnd.Top + top_offset, d_w, d_h);
    }
    else
    {
        // draw the in-memory image onto the form, stretched to the display area
        g.DrawImage(bitmap, playWnd.Left, playWnd.Top, limit_w, limit_h);
    }
}
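The GetRenderRect() helper used above is not part of the SDK; it just computes a centered rectangle that preserves the frame's aspect ratio inside the display area. A possible implementation matching the ref-parameter signature used in SmartPlayerForm_Paint could look like this:

// Fit an image_width x image_height frame into a limit_w x limit_h area,
// preserving the aspect ratio and centering the result (letterbox/pillarbox).
private void GetRenderRect(int limit_w, int limit_h, int image_width, int image_height,
    ref int left_offset, ref int top_offset, ref int d_w, ref int d_h)
{
    if (limit_w < 1 || limit_h < 1 || image_width < 1 || image_height < 1)
    {
        left_offset = 0; top_offset = 0; d_w = limit_w; d_h = limit_h;
        return;
    }

    double scale = Math.Min((double)limit_w / image_width, (double)limit_h / image_height);

    d_w = (int)(image_width * scale);
    d_h = (int)(image_height * scale);

    left_offset = (limit_w - d_w) / 2;   // center horizontally
    top_offset  = (limit_h - d_h) / 2;   // center vertically
}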
2. Hardware decoding on specific machines
On Windows, hardware decoding is mainly useful on lower-end PCs or in scenarios that need to play many streams at once; as long as software decoding has enough headroom, software decoding is still the recommended default. The approach is to first check whether the system supports hardware decoding and only enable it if it does, so that on systems without hardware support playback simply continues with software decoding. Do the check before calling NT_SP_Open(); since NT_SP_Open() creates one player instance per handle, the check only needs to be done once even when multiple instances are created:
is_support_h264_hardware_decoder_ = NT.NTBaseCodeDefine.NT_ERC_OK == NT.NTSmartPlayerSDK.NT_SP_IsSupportH264HardwareDecoder();
is_support_h265_hardware_decoder_ = NT.NTBaseCodeDefine.NT_ERC_OK == NT.NTSmartPlayerSDK.NT_SP_IsSupportH265HardwareDecoder();

if (player_handle_ == IntPtr.Zero)
{
    player_handle_ = new IntPtr();

    UInt32 ret_open = NTSmartPlayerSDK.NT_SP_Open(out player_handle_, IntPtr.Zero, 0, IntPtr.Zero);
    if (ret_open != 0)
    {
        player_handle_ = IntPtr.Zero;
        MessageBox.Show("NT_SP_Open failed..");
        return;
    }
}
Before starting playback, configure hardware decoding:
if (checkBox_hardware_decoder.Checked)
{
    NTSmartPlayerSDK.NT_SP_SetH264HardwareDecoder(player_handle_, is_support_h264_hardware_decoder_ ? 1 : 0, 0);
    NTSmartPlayerSDK.NT_SP_SetH265HardwareDecoder(player_handle_, is_support_h265_hardware_decoder_ ? 1 : 0, 0);
}
else
{
    NTSmartPlayerSDK.NT_SP_SetH264HardwareDecoder(player_handle_, 0, 0);
    NTSmartPlayerSDK.NT_SP_SetH265HardwareDecoder(player_handle_, 0, 0);
}
3. Decoding only key frames
Decoding only key frames is also aimed at multi-stream playback, for example typical surveillance deployments. With many channels on screen and a reasonably small key-frame interval (say one key frame every 1-2 seconds), key-frame-only decoding still gives the operator a macro view of the whole site; when a few channels need closer attention, the option can be switched off in real time to resume full-frame playback. For this to be useful, the key-frame-only switch must be an interface that can be called at any time during playback (see the sketch after the snippet below).
// Set whether to decode only video key frames
if (btn_check_only_decode_video_key_frame.Checked)
{
    NTSmartPlayerSDK.NT_SP_SetOnlyDecodeVideoKeyFrame(player_handle_, 1);
}
else
{
    NTSmartPlayerSDK.NT_SP_SetOnlyDecodeVideoKeyFrame(player_handle_, 0);
}
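Because NT_SP_SetOnlyDecodeVideoKeyFrame can be called while the stream is playing, it is natural to wire it to the checkbox's change event so a channel can be switched between key-frame-only and full-frame decoding on the fly. A minimal sketch, assuming the event handler name below (not part of the SDK sample):

// Toggle key-frame-only decoding at runtime without re-opening the stream
private void btn_check_only_decode_video_key_frame_CheckedChanged(object sender, EventArgs e)
{
    if (player_handle_ == IntPtr.Zero)
        return;

    int only_key_frame = btn_check_only_decode_video_key_frame.Checked ? 1 : 0;
    NTSmartPlayerSDK.NT_SP_SetOnlyDecodeVideoKeyFrame(player_handle_, only_key_frame);
}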
4. Video view rotation
Many field engineers run into this problem: some cameras were not mounted at the right angle, so the picture comes out sideways or upside down and is awkward to watch. When there are many devices on site, re-installing every one of them is not realistic, and this is where real-time view rotation earns its keep. The interface is as follows:
/*
 * Set the rotation (clockwise).
 * degress: only 0, 90, 180 and 270 degrees are valid; other values are ignored.
 * Note: any angle other than 0 costs extra CPU during playback.
 * Returns NT_ERC_OK on success.
 */
[DllImport(@"SmartPlayerSDK.dll")]
public static extern UInt32 NT_SP_SetRotation(IntPtr handle, Int32 degress);
Note that video view rotation consumes some extra CPU.
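A rotation correction is usually exposed as a one-click action in the UI. The sketch below cycles through the four supported angles on each click; the button handler name and the rotate_degrees_ field are assumptions for illustration:

private int rotate_degrees_ = 0;   // current clockwise rotation

// Rotate the view 90 degrees clockwise on each click; only 0/90/180/270 are valid.
private void btnRotate_Click(object sender, EventArgs e)
{
    if (player_handle_ == IntPtr.Zero)
        return;

    rotate_degrees_ = (rotate_degrees_ + 90) % 360;

    UInt32 ret = NTSmartPlayerSDK.NT_SP_SetRotation(player_handle_, rotate_degrees_);
    if (NT.NTBaseCodeDefine.NT_ERC_OK != ret)
    {
        rotate_degrees_ = 0;   // reset local state if the SDK rejected the angle
    }
}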
5. Real-time snapshot
Needless to say, real-time snapshot is a must-have feature for a good RTSP or RTMP player. A snapshot re-encodes the decoded YUV data into a PNG, which costs some CPU, so it should not be triggered too frequently. The implementation looks like this:
if (String.IsNullOrEmpty(capture_image_path_))
{
    MessageBox.Show("Please set the directory for snapshot files first! Click the button to the left of the snapshot button to set it!");
    return;
}

if (player_handle_ == IntPtr.Zero)
{
    return;
}

if (!is_playing_)
{
    MessageBox.Show("Please take snapshots while playing!");
    return;
}

String name = capture_image_path_ + "\\" + DateTime.Now.ToString("hh-mm-ss") + ".png";

byte[] buffer1 = Encoding.Default.GetBytes(name);
byte[] buffer2 = Encoding.Convert(Encoding.Default, Encoding.UTF8, buffer1, 0, buffer1.Length);

byte[] buffer3 = new byte[buffer2.Length + 1];
buffer3[buffer2.Length] = 0;
Array.Copy(buffer2, buffer3, buffer2.Length);

IntPtr file_name_ptr = Marshal.AllocHGlobal(buffer3.Length);
Marshal.Copy(buffer3, 0, file_name_ptr, buffer3.Length);

capture_image_call_back_ = new SP_SDKCaptureImageCallBack(SDKCaptureImageCallBack);

UInt32 ret = NTSmartPlayerSDK.NT_SP_CaptureImage(player_handle_, file_name_ptr, IntPtr.Zero, capture_image_call_back_);

Marshal.FreeHGlobal(file_name_ptr);

if (NT.NTBaseCodeDefine.NT_ERC_OK == ret)
{
    // snapshot request submitted successfully
}
else if ((UInt32)NT.NTSmartPlayerDefine.SP_E_ERROR_CODE.NT_ERC_SP_TOO_MANY_CAPTURE_IMAGE_REQUESTS == ret)
{
    // ask the user to slow down
    MessageBox.Show("Too many capture image requests!");
}
else
{
    // other failures
}
public void SDKCaptureImageCallBack(IntPtr handle, IntPtr userData, UInt32 result, IntPtr file_name)
{
    if (file_name == IntPtr.Zero)
        return;

    // the file name comes back as a null-terminated UTF-8 string: find its length first
    int index = 0;
    while (true)
    {
        if (0 == Marshal.ReadByte(file_name, index))
            break;

        index++;
    }

    byte[] file_name_buffer = new byte[index];
    Marshal.Copy(file_name, file_name_buffer, 0, index);

    byte[] dst_buffer = Encoding.Convert(Encoding.UTF8, Encoding.Default, file_name_buffer, 0, file_name_buffer.Length);

    String image_name = Encoding.Default.GetString(dst_buffer, 0, dst_buffer.Length);

    if (playWnd.InvokeRequired)
    {
        BeginInvoke(set_capture_image_call_back_, result, image_name);
    }
    else
    {
        set_capture_image_call_back_(result, image_name);
    }
}
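The set_capture_image_call_back_ delegate invoked above is the UI-thread half of the snapshot flow and is not shown in the excerpt. A minimal sketch, assuming it only reports the result to the user (handler name assumed):

// UI-thread handler for the snapshot result
private void OnCaptureImageResult(UInt32 result, String image_name)
{
    if (NT.NTBaseCodeDefine.NT_ERC_OK == result)
    {
        MessageBox.Show("Snapshot saved: " + image_name);
    }
    else
    {
        MessageBox.Show("Snapshot failed, error code: " + result);
    }
}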
In follow-up posts we will discuss other aspects of RTSP and RTMP player design in more detail.