Jetson Study Notes (12): Streaming a CSI Camera over RTSP and Exploring Dynamic Multi-Stream Serving

Overview: This is a hands-on tutorial on streaming a CSI camera over RTSP on a Jetson device. It covers installing dependencies, building gst-rtsp-server, testing, a source-code walkthrough, and an RTSP server that serves multiple streams dynamically.

Requirements

gcc version 7.5.0
g++ version 7.5.0
Ubuntu 18.04
gst-rtsp-server 1.8.0

Installing Dependencies

sudo apt-get install gtk-doc-tools 
sudo apt-get install libgstreamer1.0-0 gstreamer1.0-plugins-base 
sudo apt-get install gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly 
sudo apt-get install gstreamer1.0-libav gstreamer1.0-doc gstreamer1.0-tools 
sudo apt-get install gstreamer1.0-x gstreamer1.0-alsa gstreamer1.0-gl gstreamer1.0-gtk3 gstreamer1.0-qt5 gstreamer1.0-pulseaudio
sudo apt-get install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev
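
Before building, it can help to sanity-check the install (these check commands are my addition, assuming gstreamer1.0-tools installed correctly):

gst-inspect-1.0 --version
gst-inspect-1.0 x264enc            # H.264 software encoder used in the examples below
gst-inspect-1.0 nvarguscamerasrc   # Jetson CSI camera source (ships with JetPack, not with the packages above)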

Downloading and Building gst-rtsp-server

git clone git://anongit.freedesktop.org/gstreamer/gst-rtsp-server
or
wget https://github.com/GStreamer/gst-rtsp-server/archive/1.8.zip
cd gst-rtsp-server
git checkout remotes/origin/1.8
# if autogen.sh complains about a missing common/ directory, also fetch it: git clone https://github.com/GStreamer/common.git
./autogen.sh
make -j4
sudo make install
# if the installed library is not found at runtime, sudo ldconfig may be needed

# enter the demo directory
cd examples

Testing

1. Switch to the examples directory:
cd examples
2. Start the RTSP server:
./test-launch "( videotestsrc ! x264enc ! rtph264pay name=pay0 pt=96 )"

To stream straight from an ordinary camera (laptops usually have one built in; on a desktop, plug in a USB camera), use the line below. Note that the old GStreamer 0.10 caps (video/x-raw-yuv) and ffmpegcolorspace have been replaced in 1.x by video/x-raw and videoconvert:
$ ./test-launch "( v4l2src ! video/x-raw,format=YUY2,width=640,height=480 ! videoconvert ! x264enc ! rtph264pay name=pay0 pt=96 )"

Streaming directly from a CSI camera:
./test-launch "( nvarguscamerasrc ! nvvidconv ! clockoverlay ! omxh264enc ! rtph264pay pt=96 name=pay0 )"   # works

./test-launch "( mfw_v4lsrc device=/dev/video0 ! queue ! vpuenc codec=6 ! rtph264pay name=pay0 pt=96 )"   # fails — mfw_v4lsrc and vpuenc are Freescale i.MX plugins and do not exist on Jetson

3. Play the RTSP stream:
gst-launch-1.0 playbin uri=rtsp://127.0.0.1:8554/test
The stream can also be opened directly in VLC, or from OpenCV (cv::VideoCapture accepts the same rtsp:// URL).
# Sender 1
gst-launch-1.0 videotestsrc ! video/x-raw,format=I420 ! omxh264enc ! video/x-h264,stream-format=byte-stream ! rtph264pay mtu=1400 ! udpsink host=127.0.0.1 port=5000

# Receiver 1
gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,encoding-name=(string)H264' ! rtph264depay ! h264parse ! omxh264dec ! nvoverlaysink sync=false async=false

# Sender 2
./test-launch "( videotestsrc ! video/x-raw,format=I420,framerate=25/1 ! x264enc ! video/x-h264,stream-format=byte-stream ! rtph264pay name=pay0 pt=96 )"

# Receiver 2
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! rtph264depay ! h264parse ! omxh264dec ! nvoverlaysink sync=false async=false
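
If end-to-end delay matters: rtspsrc keeps a jitter buffer (its latency property, in milliseconds, defaults to 2000), so it can be shrunk. A variant of receiver 2 under that assumption:

gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test latency=0 ! rtph264depay ! h264parse ! omxh264dec ! nvoverlaysink sync=false async=false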

Source Code Walkthrough

test-readme.c

#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>

int main (int argc, char *argv[]) {
  /* declare the objects we need */
  GMainLoop *loop;
  GstRTSPServer *server;
  GstRTSPMountPoints *mounts;
  GstRTSPMediaFactory *factory;

  /* build the RTSP server */
  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);                 /* create the server's main loop, on the default main context */
  server = gst_rtsp_server_new ();                      /* create the RTSP server object */
  mounts = gst_rtsp_server_get_mount_points (server);   /* get a reference to the server's mount-point collection; */
                                                        /* the mount points are a property of the server */
  factory = gst_rtsp_media_factory_new ();              /* create the media factory that produces the media stream */
  gst_rtsp_media_factory_set_launch (factory, "( videotestsrc is-live=1 ! x264enc ! rtph264pay name=pay0 pt=96 )");
  gst_rtsp_media_factory_set_shared (factory, TRUE);
  gst_rtsp_mount_points_add_factory (mounts, "/test", factory);  /* add the factory to the mount points */
  g_object_unref (mounts);
  gst_rtsp_server_attach (server, NULL);                /* attach the server to the default main context */

  /* run the RTSP server */
  g_print ("stream ready at rtsp://127.0.0.1:8554/test\n");
  g_main_loop_run (loop);

  return 0;
}

The server manages four further objects: GstRTSPSessionPool, GstRTSPMountPoints, GstRTSPAuth, and GstRTSPThreadPool.

GstRTSPSessionPool tracks all active sessions in the server. Usually one session is kept for each client that issued a SETUP request for some media stream. It holds the configuration the client negotiated with the server for receiving a particular stream, i.e. the transport and port pair used for UDP and the state of the stream. The default session-pool implementation is usually sufficient, but the server can be given an alternative one.

GstRTSPMountPoints is more interesting and needs more configuration before the server object is useful. It manages the mapping from request URLs to specific streams and their configuration. How to configure this object is covered in the next section.

GstRTSPAuth is the object that authenticates users and authorizes the actions they perform. By default the server has no GstRTSPAuth object, so it attempts no authentication or authorization.

GstRTSPThreadPool manages the threads used for client connections and media pipelines. The server has a default thread-pool implementation that should suffice in most cases.
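
None of this needs touching for the demos, but as a small sketch of how these handles are used (standard gst-rtsp-server 1.x calls, not from the original post), the listening port and the session limit can be tuned before gst_rtsp_server_attach():

  /* sketch: optional server tuning, placed before gst_rtsp_server_attach() */
  GstRTSPSessionPool *pool;

  gst_rtsp_server_set_service (server, "8555");       /* listen on port 8555 instead of the default 8554 */

  pool = gst_rtsp_server_get_session_pool (server);   /* the session pool described above */
  gst_rtsp_session_pool_set_max_sessions (pool, 16);  /* cap concurrent client sessions at 16 */
  g_object_unref (pool);                              /* get_session_pool returned a new reference */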

A Dynamic Multi-Stream RTSP Server

As with the C files already in examples, add your file name (the .c file) at the corresponding places in Makefile, Makefile.am, and Makefile.in, then run sudo make -j8; the resulting executable can then be run directly with ./<name>. A standalone compile alternative is sketched below.
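
Alternatively, if you would rather not edit the autotools files, a single example can be compiled standalone with pkg-config (a sketch; my-server.c is a placeholder for your file name):

gcc my-server.c -o my-server $(pkg-config --cflags --libs gstreamer-1.0 gstreamer-rtsp-server-1.0)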

Code 1

/* GStreamer
 * Copyright (C) 2008 Wim Taymans <wim.taymans at gmail.com>
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 */

#include <gst/gst.h>

#include <gst/rtsp-server/rtsp-server.h>

int
main (int argc, char *argv[])
{
  GMainLoop *loop;
  GstRTSPServer *server;
  GstRTSPMountPoints *mounts;
  GstRTSPMediaFactory *factory;

  gst_init (&argc, &argv);

  loop = g_main_loop_new (NULL, FALSE);

  /* create a server instance */
  server = gst_rtsp_server_new ();

  /* get the mount points for this server, every server has a default object
   * that is used to map uri mount points to media factories */
  mounts = gst_rtsp_server_get_mount_points (server);

  /* make a media factory for a test stream. The default media factory can use
   * gst-launch syntax to create pipelines. 
   * any launch line works as long as it contains elements named pay%d. Each
   * element with pay%d names will be a stream */
  factory = gst_rtsp_media_factory_new ();
  gst_rtsp_media_factory_set_launch (factory,
      "( nvarguscamerasrc ! video/x-raw(memory:NVMM),width=1280,height=720,framerate=60/1 ! nvvidconv ! clockoverlay halignment=left valignment=top time-format='%Y/%m/%d %H:%M:%S' ! x264enc ! rtph264pay name=pay0 pt=96 )");

  gst_rtsp_media_factory_set_shared (factory, TRUE);

  /* attach the test factory to the /test url */
  gst_rtsp_mount_points_add_factory (mounts, "/test", factory);

  /* don't need the ref to the mapper anymore */
  g_object_unref (mounts);

  /* attach the server to the default maincontext */
  gst_rtsp_server_attach (server, NULL);

  /* start serving */
  g_print ("stream ready at rtsp://127.0.0.1:8554/test\n");
  g_main_loop_run (loop);

  return 0;
}
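
Code 1 mounts a single /test path. To actually serve multiple streams dynamically, the same server can carry one factory per mount point. Below is a minimal sketch under my own assumptions (two CSI sensors selected via nvarguscamerasrc's sensor-id, mounted at /test0 and /test1); it would replace the single-factory block in main():

  /* sketch: one factory per CSI sensor, mounted at /test0 and /test1 */
  for (int i = 0; i < 2; i++) {
    GstRTSPMediaFactory *f = gst_rtsp_media_factory_new ();
    gchar *launch = g_strdup_printf (
        "( nvarguscamerasrc sensor-id=%d ! video/x-raw(memory:NVMM),width=1280,height=720,framerate=60/1 "
        "! nvvidconv ! x264enc ! rtph264pay name=pay0 pt=96 )", i);
    gchar *path = g_strdup_printf ("/test%d", i);

    gst_rtsp_media_factory_set_launch (f, launch);
    gst_rtsp_media_factory_set_shared (f, TRUE);      /* share one pipeline among all clients of this path */
    gst_rtsp_mount_points_add_factory (mounts, path, f);

    g_free (launch);                                  /* set_launch and add_factory copy/take what they need */
    g_free (path);
  }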

Code 2

/* GStreamer
 * Copyright (C) 2008 Wim Taymans <wim.taymans at gmail.com>
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 */

#include <gst/gst.h>

#include <gst/rtsp-server/rtsp-server.h>

typedef struct
{
  gboolean white;
  GstClockTime timestamp;
} MyContext;

/* called when we need to give data to appsrc */
static void
need_data (GstElement * appsrc, guint unused, MyContext * ctx)
{
  GstBuffer *buffer;
  guint size;
  GstFlowReturn ret;

  size = 384 * 288 * 2;         /* 384x288 pixels, RGB16 = 2 bytes each, matching the caps below */

  buffer = gst_buffer_new_allocate (NULL, size, NULL);

  /* this makes the image black/white */
  gst_buffer_memset (buffer, 0, ctx->white ? 0xff : 0x0, size);

  ctx->white = !ctx->white;

  /* increment the timestamp every 1/2 second */
  GST_BUFFER_PTS (buffer) = ctx->timestamp;
  GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale_int (1, GST_SECOND, 2);
  ctx->timestamp += GST_BUFFER_DURATION (buffer);

  g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
}

/* called when a new media pipeline is constructed. We can query the
 * pipeline and configure our appsrc */
static void
media_configure (GstRTSPMediaFactory * factory, GstRTSPMedia * media,
    gpointer user_data)
{
  GstElement *element, *appsrc;
  MyContext *ctx;

  /* get the element used for providing the streams of the media */
  element = gst_rtsp_media_get_element (media);

  /* get our appsrc, we named it 'mysrc' with the name property */
  appsrc = gst_bin_get_by_name_recurse_up (GST_BIN (element), "mysrc");

  /* this instructs appsrc that we will be dealing with timed buffers */
  gst_util_set_object_arg (G_OBJECT (appsrc), "format", "time");
  /* configure the caps of the video */
  g_object_set (G_OBJECT (appsrc), "caps",
      gst_caps_new_simple ("video/x-raw",
          "format", G_TYPE_STRING, "RGB16",
          "width", G_TYPE_INT, 384,
          "height", G_TYPE_INT, 288,
          "framerate", GST_TYPE_FRACTION, 0, 1, NULL), NULL);

  ctx = g_new0 (MyContext, 1);
  ctx->white = FALSE;
  ctx->timestamp = 0;
  /* make sure the data is freed when the media is gone */
  g_object_set_data_full (G_OBJECT (media), "my-extra-data", ctx,
      (GDestroyNotify) g_free);

  /* install the callback that will be called when a buffer is needed */
  g_signal_connect (appsrc, "need-data", (GCallback) need_data, ctx);
  gst_object_unref (appsrc);
  gst_object_unref (element);
}

int
main (int argc, char *argv[])
{
  GMainLoop *loop;
  GstRTSPServer *server;
  GstRTSPMountPoints *mounts;
  GstRTSPMediaFactory *factory;

  gst_init (&argc, &argv);

  loop = g_main_loop_new (NULL, FALSE);

  /* create a server instance */
  server = gst_rtsp_server_new ();

  /* get the mount points for this server, every server has a default object
   * that is used to map uri mount points to media factories */
  mounts = gst_rtsp_server_get_mount_points (server);

  /* make a media factory for a test stream. The default media factory can use
   * gst-launch syntax to create pipelines. 
   * any launch line works as long as it contains elements named pay%d. Each
   * element with pay%d names will be a stream */
  factory = gst_rtsp_media_factory_new ();
  gst_rtsp_media_factory_set_launch (factory,
      "( nvarguscamerasrc ! video/x-raw(memory:NVMM),width=1280,height=720,framerate=60/1 ! nvvidconv ! clockoverlay halignment=left valignment=top time-format='%Y/%m/%d %H:%M:%S' ! x264enc ! rtph264pay name=pay0 pt=96 )");

  /* notify when our media is ready. This is called whenever someone asks for
   * the media and a new pipeline with our appsrc is created */
  g_signal_connect (factory, "media-configure", (GCallback) media_configure,
      NULL);

  gst_rtsp_media_factory_set_shared (factory, TRUE);

  /* attach the test factory to the /test url */
  gst_rtsp_mount_points_add_factory (mounts, "/test", factory);

  /* don't need the ref to the mapper anymore */
  g_object_unref (mounts);

  /* attach the server to the default maincontext */
  gst_rtsp_server_attach (server, NULL);

  /* start serving */
  g_print ("stream ready at rtsp://127.0.0.1:8554/test\n");
  g_main_loop_run (loop);

  return 0;
}
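
One caveat with Code 2: the media-configure callback looks for an appsrc named mysrc, but the nvarguscamerasrc launch line above contains no such element, so gst_bin_get_by_name_recurse_up() returns NULL and the appsrc setup does nothing. For the callback to actually feed buffers, the factory needs an appsrc in its launch line, as in the upstream test-appsrc.c example (sketch below; tune=zerolatency is my addition):

  gst_rtsp_media_factory_set_launch (factory,
      "( appsrc name=mysrc ! videoconvert ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )");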


clockoverlay Time Formats

Overlaying the absolute system time (date and time)

  gst_rtsp_media_factory_set_launch (factory,
      "( nvarguscamerasrc ! video/x-raw(memory:NVMM),width=1280,height=720,framerate=60/1 ! nvvidconv ! clockoverlay halignment=left valignment=top time-format='%Y/%m/%d %H:%M:%S' ! x264enc ! rtph264pay name=pay0 pt=96 )");


Overlaying the system date only

clockoverlay's time-format property takes strftime(3) conversion specifiers, so a date-only overlay simply drops the %H:%M:%S part:

  gst_rtsp_media_factory_set_launch (factory,
      "( nvarguscamerasrc ! video/x-raw(memory:NVMM),width=1280,height=720,framerate=60/1 ! nvvidconv ! clockoverlay halignment=left valignment=top time-format='%Y/%m/%d' ! x264enc ! rtph264pay name=pay0 pt=96 )");


Errors

  • Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvargus
    Fix: sudo systemctl restart nvargus-daemon (restart the nvargus-daemon service; the program then runs normally again)

