Implementing LAN-Based Paperless Meetings | Smart Classrooms | Real-Time Screen Mirroring on Android

Summary: This article describes the technical design of an RTMP-based screen-mirroring solution on the Android platform.

Background

This article describes the technical design of an RTMP-based screen-mirroring solution on the Android platform. The basic architecture is shown below:

[Architecture diagram]

Network Setup Considerations

1. Networking: with a wireless setup, a capable AP is needed to sustain heavy concurrent traffic; ideally the publisher connects to the AP over a wired link;

2. Server deployment: SRS or NGINX; the server can be deployed on the same machine as the Windows-based teacher workstation;

3. Teacher side: if the teacher uses a mobile tablet, it can push directly to the RTMP server and the stream is then shared out;

4. Student side: students simply pull the RTMP stream from the server and play it;

5. Teacher-student interaction: if a student's screen is to be shared with the rest of the class as a demonstration, the student simply requests mirroring, pushes the screen data back to the RTMP server, and the other students view that stream;

6. Extended monitoring: for more advanced needs, such as the teacher monitoring student screens, there are two options: each student device pushes RTMP directly, or each student device runs a built-in RTSP server that the teacher can view on demand (or poll through).

Android Integration

How should the push resolution be set or scaled?

On Android devices, especially those with high-resolution screens, the raw captured width and height can be very large. Pushing at native resolution puts heavy pressure on both encoding and uplink bandwidth, so it is generally advisable to scale down, for example to 2/3 of the original width and height. Scaling should normally keep the aspect ratio, and the scaled width and height should be aligned to 16 bytes.

    private void createScreenEnvironment() {
        sreenWindowWidth = mWindowManager.getDefaultDisplay().getWidth();
        screenWindowHeight = mWindowManager.getDefaultDisplay().getHeight();
        Log.i(TAG, "screenWindowWidth: " + sreenWindowWidth + ",screenWindowHeight: "
                + screenWindowHeight);
        if (sreenWindowWidth > 800)
        {
            if (screenResolution == SCREEN_RESOLUTION_STANDARD)
            {
                scale_rate = SCALE_RATE_HALF;
                sreenWindowWidth = align(sreenWindowWidth / 2, 16);
                screenWindowHeight = align(screenWindowHeight / 2, 16);
            }
            else if(screenResolution == SCREEN_RESOLUTION_LOW)
            {
                scale_rate = SCALE_RATE_TWO_FIFTHS;
                sreenWindowWidth = align(sreenWindowWidth * 2 / 5, 16);
                screenWindowHeight = align(screenWindowHeight * 2 / 5, 16);
            }
        }
        Log.i(TAG, "After adjust mWindowWidth: " + sreenWindowWidth + ", mWindowHeight: " + screenWindowHeight);
        int pf = mWindowManager.getDefaultDisplay().getPixelFormat();
        Log.i(TAG, "display format:" + pf);
        DisplayMetrics displayMetrics = new DisplayMetrics();
        mWindowManager.getDefaultDisplay().getMetrics(displayMetrics);
        mScreenDensity = displayMetrics.densityDpi;
        // 0x1 = PixelFormat.RGBA_8888; keep up to 6 images in the ImageReader queue
        mImageReader = ImageReader.newInstance(sreenWindowWidth,
                screenWindowHeight, 0x1, 6);
        mMediaProjectionManager = (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
    }
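
The code above only shows the 1/2 and 2/5 presets; for the 2/3 equal-ratio scaling mentioned earlier, a minimal sketch reusing the same align() helper and width/height fields might look like this:

    // Sketch only: scale the captured width/height to 2/3 and align both to 16.
    private void applyTwoThirdsScale() {
        sreenWindowWidth = align(sreenWindowWidth * 2 / 3, 16);
        screenWindowHeight = align(screenWindowHeight * 2 / 3, 16);
    }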

Automatic portrait/landscape adaptation

The captured screen width and height differ between portrait and landscape. When the orientation changes, both sides must adapt automatically: for example, when switching from portrait to landscape, the encoder on the publishing side needs to be restarted, and the player needs to detect the new width/height and continue playing without manual intervention.

    public void onConfigurationChanged(Configuration newConfig) {
        try {
            super.onConfigurationChanged(newConfig);
            if (this.getResources().getConfiguration().orientation == Configuration.ORIENTATION_LANDSCAPE) {
                Log.i(TAG, "onConfigurationChanged cur: LANDSCAPE");
            } else if (this.getResources().getConfiguration().orientation == Configuration.ORIENTATION_PORTRAIT) {
                Log.i(TAG, "onConfigurationChanged cur: PORTRAIT");
            }
            if(isPushingRtmp || isRecording || isRTSPPublisherRunning)
            {
                stopScreenCapture();
                clearAllImages();
                createScreenEnvironment();
                setupVirtualDisplay();
            }
        } catch (Exception ex) {
            Log.e(TAG, "onConfigurationChanged exception: " + ex.getMessage());
        }
    }


Frame refill strategy

Many people do not understand why frame refill is needed. When capturing the screen, no new frames are produced while the screen content stays static. A good approach is to cache the last frame and re-send it at a fixed refill interval, so the gap between frames never becomes so large that the player receives no data for several seconds. Of course, if the server can cache a GOP, this problem largely disappears.
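
The article does not include code for this; below is a minimal sketch of one possible refill strategy, assuming a hypothetical postVideoFrame() method that hands RGBA data to the SDK, a cached lastFrame buffer, and an illustrative 300 ms interval (android.os.Handler / Looper and java.nio.ByteBuffer assumed imported):

    // Refill sketch (assumption, not built-in SDK behavior): if no new frame
    // arrives within REFILL_INTERVAL_MS, re-post the last cached frame.
    private static final long REFILL_INTERVAL_MS = 300;   // illustrative value
    private ByteBuffer lastFrame;                          // deep copy of the newest frame
    private final Handler refillHandler = new Handler(Looper.getMainLooper());

    private final Runnable refillRunnable = new Runnable() {
        @Override
        public void run() {
            if (lastFrame != null) {
                postVideoFrame(lastFrame);                 // hypothetical: pushes RGBA data to the SDK
            }
            refillHandler.postDelayed(this, REFILL_INTERVAL_MS);
        }
    };

    private void onNewFrame(ByteBuffer frame) {
        lastFrame = frame;                                 // cache the newest frame
        postVideoFrame(frame);
        refillHandler.removeCallbacks(refillRunnable);     // reset the refill timer
        refillHandler.postDelayed(refillRunnable, REFILL_INTERVAL_MS);
    }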

Handling network anomalies and the event callback mechanism

When pushing over RTMP, network jitter and other network anomalies require a solid reconnection mechanism and a status feedback mechanism, driven by the event callbacks below.

    class EventHandeV2 implements NTSmartEventCallbackV2 {
        @Override
        public void onNTSmartEventCallbackV2(long handle, int id, long param1, long param2, String param3, String param4, Object param5) {
            Log.i(TAG, "EventHandeV2: handle=" + handle + " id:" + id);
            String publisher_event = "";
            switch (id) {
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_STARTED:
                    publisher_event = "Started..";
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_CONNECTING:
                    publisher_event = "Connecting..";
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_CONNECTION_FAILED:
                    publisher_event = "Connection failed..";
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_CONNECTED:
                    publisher_event = "Connected..";
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_DISCONNECTED:
                    publisher_event = "Disconnected..";
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_STOP:
                    publisher_event = "Stopped..";
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_RECORDER_START_NEW_FILE:
                    publisher_event = "Started a new recording file: " + param3;
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_ONE_RECORDER_FILE_FINISHED:
                    publisher_event = "Finished a recording file: " + param3;
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_SEND_DELAY:
                    publisher_event = "Send delay: " + param1 + " frame count: " + param2;
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_CAPTURE_IMAGE:
                    publisher_event = "Snapshot: " + param1 + " path: " + param3;
                    if (param1 == 0) {
                        publisher_event = publisher_event + ", snapshot saved successfully..";
                    } else {
                        publisher_event = publisher_event + ", failed to save snapshot..";
                    }
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_RTSP_URL:
                    publisher_event = "RTSP server URL: " + param3;
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PUSH_RTSP_SERVER_RESPONSE_STATUS_CODE:
                    publisher_event = "RTSP status code received, codeID: " + param1 + ", RTSP URL: " + param3;
                    break;
                case NTSmartEventID.EVENT_DANIULIVE_ERC_PUSH_RTSP_SERVER_NOT_SUPPORT:
                    publisher_event = "The server does not support RTSP push, RTSP URL: " + param3;
                    break;
            }
            String str = "Current callback state: " + publisher_event;
            Log.i(TAG, str);
            Message message = new Message();
            message.what = PUBLISHER_EVENT_MSG;
            message.obj = publisher_event;
            handler.sendMessage(message);
        }
    }
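
Building on these callbacks, one straightforward reconnection strategy is to schedule a delayed restart whenever a connection failure or disconnect is reported. The sketch below is an assumption rather than built-in SDK behavior; restartRtmpPublisher() is a hypothetical helper that stops and restarts the publisher, and the 3-second delay is illustrative:

    // Hypothetical reconnection sketch: retry a few seconds after a failure.
    private static final long RECONNECT_DELAY_MS = 3000;  // illustrative value

    private void handlePublisherEvent(int id) {
        switch (id) {
            case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_CONNECTION_FAILED:
            case NTSmartEventID.EVENT_DANIULIVE_ERC_PUBLISHER_DISCONNECTED:
                // Only retry while the user still intends to push.
                if (isPushingRtmp) {
                    handler.postDelayed(new Runnable() {
                        @Override
                        public void run() {
                            restartRtmpPublisher();        // hypothetical: stop + start the publisher again
                        }
                    }, RECONNECT_DELAY_MS);
                }
                break;
            default:
                break;
        }
    }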

Capturing part of the screen

In many scenarios we have encountered, the teacher's machine dedicates 3/4 of the screen area to content delivered to the students and keeps the remaining 1/4 for commands and other operations. In that case, the captured screen region needs to be cropped:

  /**
   * Post cropped RGBA data
   *
   * @param data: RGBA data
   *
   * @param rowStride: stride information
   *
   * @param width: width
   *
   * @param height: height
   *
   * @param clipedLeft: left; clipedTop: top; clipedWidth: cropped width; clipedHeight: cropped height; make sure the cropped width and height passed down are both even
   *
   * @return {0} if successful
   */
  public native int SmartPublisherOnCaptureVideoClipedRGBAData(long handle,  ByteBuffer data, int rowStride, int width, int height, int clipedLeft, int clipedTop, int clipedWidth, int clipedHeight);
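
As a usage sketch for the interface above, the snippet below pushes only the left 3/4 of a captured frame from the ImageReader, forcing the cropped width and height to be even as the comment requires. The libPublisher / publisherHandle fields are the ones used throughout this article; the helper method itself is illustrative:

    // Sketch: push only the left 3/4 of the captured screen.
    private void pushLeftThreeQuarters(Image image) {
        final Image.Plane plane = image.getPlanes()[0];
        int rowStride = plane.getRowStride();
        int width  = image.getWidth();
        int height = image.getHeight();

        int clipedWidth  = (width * 3 / 4) & ~1;   // cropped width, forced even
        int clipedHeight = height & ~1;            // cropped height, forced even

        libPublisher.SmartPublisherOnCaptureVideoClipedRGBAData(
                publisherHandle, plane.getBuffer(), rowStride,
                width, height,
                0, 0,                              // clipedLeft, clipedTop: start at the top-left corner
                clipedWidth, clipedHeight);

        image.close();                             // release the Image back to the ImageReader
    }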

Text and image watermarks

In many scenarios, the person sharing the screen wants to display a company logo and some text on the pushed stream; this calls for text and image watermark support:

   /**
     * Set text watermark
     * 
     * @param fontSize: it should be "MEDIUM", "SMALL", "BIG"
     * 
     * @param waterPostion: it should be "TOPLEFT", "TOPRIGHT", "BOTTOMLEFT", "BOTTOMRIGHT".
     * 
     * @param xPading, yPading: the distance of the original picture.
     * 
     * <pre> The interface is only used for setting font water-mark when publishing stream. </pre>  
     * 
     * @return {0} if successful
     */
    public native int SmartPublisherSetTextWatermark(long handle, String waterText, int isAppendTime, int fontSize, int waterPostion, int xPading, int yPading);
    /**
     * Set text watermark font file name
     *
     * @param fontFileName: font full file name, e.g: /system/fonts/DroidSansFallback.ttf
     *
     * @return {0} if successful
     */
    public native int SmartPublisherSetTextWatermarkFontFileName(long handle, String fontFileName);
    /**
     * Set PNG picture watermark
     *                      
     * @param picPath: the picture working path, e.g: /sdcard/logo.png
     * 
     * @param waterPostion: it should be "TOPLEFT", "TOPRIGHT", "BOTTOMLEFT", "BOTTOMRIGHT".
     * 
     * @param picWidth, picHeight: picture width & height
     * 
     * @param xPading, yPading: the distance of the original picture.
     * 
     * <pre> The interface is only used for setting picture(logo) water-mark when publishing stream, with "*.png" format </pre>  
     * 
     * @return {0} if successful
     */
    public native int SmartPublisherSetPictureWatermark(long handle, String picPath, int waterPostion, int picWidth, int picHeight, int xPading, int yPading);
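
A hedged usage sketch for the watermark interfaces above: the position and font-size arguments are SDK-defined values whose constant names are not listed in this article, so the local variables below are placeholders that must be replaced with the real SDK values (the file paths follow the examples in the comments above):

    // Watermark usage sketch; the placeholder variables stand in for the SDK's
    // real font-size / position constants and must be replaced accordingly.
    private void setupWatermarks() {
        int fontSizeMedium   = 2;   // hypothetical value for "MEDIUM"
        int positionTopRight = 1;   // hypothetical value for "TOPRIGHT"
        int positionTopLeft  = 0;   // hypothetical value for "TOPLEFT"

        // Text watermark in the top-right corner, with the current time appended.
        libPublisher.SmartPublisherSetTextWatermark(publisherHandle, "Classroom A",
                1 /* isAppendTime */, fontSizeMedium, positionTopRight, 20, 20);

        // Font file used to render the text watermark.
        libPublisher.SmartPublisherSetTextWatermarkFontFileName(publisherHandle,
                "/system/fonts/DroidSansFallback.ttf");

        // PNG logo watermark in the top-left corner, 160x160 pixels.
        libPublisher.SmartPublisherSetPictureWatermark(publisherHandle, "/sdcard/logo.png",
                positionTopLeft, 160, 160, 20, 20);
    }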

Obtaining screen-capture permission | data capture

The screen-capture permission must be obtained before capturing and pushing; a sketch of the Activity-side permission request is shown next, followed by the capture code that obtains the screen data and feeds it to the SDK for pushing or recording:
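
This permission request uses the standard MediaProjectionManager API in the Activity; the minimal sketch below assumes MainActivity stores the result in the static mResultCode / mResultData fields that setupMediaProjection() reads later:

    // In MainActivity: request screen-capture permission and cache the result
    // for the capture service (consumed by setupMediaProjection() below).
    public static int mResultCode;
    public static Intent mResultData;
    private static final int REQUEST_CODE_SCREEN_CAPTURE = 1001;   // arbitrary request code

    private void requestScreenCapturePermission() {
        MediaProjectionManager mpm =
                (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        startActivityForResult(mpm.createScreenCaptureIntent(), REQUEST_CODE_SCREEN_CAPTURE);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_CODE_SCREEN_CAPTURE && resultCode == RESULT_OK) {
            mResultCode = resultCode;
            mResultData = data;
            // Now it is safe to start the screen-capture / push service.
        }
    }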

   @TargetApi(Build.VERSION_CODES.LOLLIPOP)
    private boolean startScreenCapture() {
        Log.i(TAG, "startScreenCapture..");
        setupMediaProjection();
        setupVirtualDisplay();
        return true;
    }
    private int align(int d, int a) {
        return (((d) + (a - 1)) & ~(a - 1));
    }
    @SuppressWarnings("deprecation")
    @SuppressLint("NewApi")
    private void createScreenEnvironment() {
        sreenWindowWidth = mWindowManager.getDefaultDisplay().getWidth();
        screenWindowHeight = mWindowManager.getDefaultDisplay().getHeight();
        Log.i(TAG, "screenWindowWidth: " + sreenWindowWidth + ",screenWindowHeight: "
                + screenWindowHeight);
        if (sreenWindowWidth > 800)
        {
            if (screenResolution == SCREEN_RESOLUTION_STANDARD)
            {
                scale_rate = SCALE_RATE_HALF;
                sreenWindowWidth = align(sreenWindowWidth / 2, 16);
                screenWindowHeight = align(screenWindowHeight / 2, 16);
            }
            else if(screenResolution == SCREEN_RESOLUTION_LOW)
            {
                scale_rate = SCALE_RATE_TWO_FIFTHS;
                sreenWindowWidth = align(sreenWindowWidth * 2 / 5, 16);
                screenWindowHeight = align(screenWindowHeight * 2 / 5, 16);
            }
        }
        Log.i(TAG, "After adjust mWindowWidth: " + sreenWindowWidth + ", mWindowHeight: " + screenWindowHeight);
        int pf = mWindowManager.getDefaultDisplay().getPixelFormat();
        Log.i(TAG, "display format:" + pf);
        DisplayMetrics displayMetrics = new DisplayMetrics();
        mWindowManager.getDefaultDisplay().getMetrics(displayMetrics);
        mScreenDensity = displayMetrics.densityDpi;
        mImageReader = ImageReader.newInstance(sreenWindowWidth,
                screenWindowHeight, 0x1, 6);
        mMediaProjectionManager = (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
    }
    @SuppressLint("NewApi")
    private void setupMediaProjection() {
        mMediaProjection = mMediaProjectionManager.getMediaProjection(
                MainActivity.mResultCode, MainActivity.mResultData);
    }
    @SuppressLint("NewApi")
    private void setupVirtualDisplay() {
        mVirtualDisplay = mMediaProjection.createVirtualDisplay(
                "ScreenCapture", sreenWindowWidth, screenWindowHeight,
                mScreenDensity,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
                mImageReader.getSurface(), null, null);
        mImageReader.setOnImageAvailableListener(
                new ImageReader.OnImageAvailableListener() {
                    @Override
                    public void onImageAvailable(ImageReader reader) {
                        Image image = mImageReader.acquireLatestImage();
                        if (image != null) {
                            processScreenImage(image);
                            //image.close();
                        }
                    }
                }, null);
    }
    private void startRecorderScreen() {
        Log.i(TAG, "start recorder screen..");
        if (startScreenCapture()) {
            new Thread() {
                @Override
                public void run() {
                    Log.i(TAG, "start record..");
                }
            }.start();
        }
    }
    private ByteBuffer deepCopy(ByteBuffer source) {
        int sourceP = source.position();
        int sourceL = source.limit();
        ByteBuffer target = ByteBuffer.allocateDirect(source.remaining());
        target.put(source);
        target.flip();
        source.position(sourceP);
        source.limit(sourceL);
        return target;
    }
    /**
     * Process image data as desired.
     */
    @SuppressLint("NewApi")
    private void processScreenImage(Image image) {
        if(!isPushingRtmp && !isRecording &&!isRTSPPublisherRunning)
        {
            image.close();
            return;
        }
        /*
        final Image.Plane[] planes = image.getPlanes();
        width_ = image.getWidth();
        height_ = image.getHeight();
        row_stride_ = planes[0].getRowStride();
       ByteBuffer buf = deepCopy(planes[0].getBuffer());
       */
       // Log.i("OnScreenImage", "new image");
        pushImage(image);
    }
    @SuppressLint("NewApi")
    private void stopScreenCapture() {
        if (mVirtualDisplay != null) {
            mVirtualDisplay.release();
            mVirtualDisplay = null;
        }
    }
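
pushImage() and the DataRunnable posting thread referenced later in onStart() are not listed in the article. The sketch below shows one possible implementation, handing frames from the ImageReader callback to a worker thread through a bounded queue and pushing each full frame via the cropped-RGBA interface shown earlier; the queue size and drop policy are assumptions:

    // Hypothetical frame queue between the ImageReader callback and the posting thread.
    private final LinkedBlockingQueue<Image> imageQueue = new LinkedBlockingQueue<>(4);

    private void pushImage(Image image) {
        if (!imageQueue.offer(image)) {
            image.close();          // queue full: drop the frame to avoid stalling the ImageReader
        }
    }

    class DataRunnable implements Runnable {
        @Override
        public void run() {
            while (is_post_data_thread_alive) {
                try {
                    Image image = imageQueue.take();
                    Image.Plane plane = image.getPlanes()[0];
                    // Push the whole frame (clip rectangle = full width/height).
                    libPublisher.SmartPublisherOnCaptureVideoClipedRGBAData(
                            publisherHandle, plane.getBuffer(), plane.getRowStride(),
                            image.getWidth(), image.getHeight(),
                            0, 0, image.getWidth(), image.getHeight());
                    image.close();
                } catch (InterruptedException e) {
                    break;
                }
            }
        }
    }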

Basic initialization

   private void InitAndSetConfig() {
        // Whether to capture audio and/or video at start; configure as needed
        publisherHandle = libPublisher.SmartPublisherOpen(this.getApplicationContext(),
                audio_opt, video_opt, sreenWindowWidth,
                screenWindowHeight);
        if ( publisherHandle == 0 )
        {
            return;
        }
        Log.i(TAG, "publisherHandle=" + publisherHandle);
        libPublisher.SetSmartPublisherEventCallbackV2(publisherHandle, new EventHandeV2());
        if(videoEncodeType == 1)
        {
            int h264HWKbps = setHardwareEncoderKbps(true, sreenWindowWidth,
                    screenWindowHeight);
            Log.i(TAG, "h264HWKbps: " + h264HWKbps);
            int isSupportH264HWEncoder = libPublisher
                    .SetSmartPublisherVideoHWEncoder(publisherHandle, h264HWKbps);
            if (isSupportH264HWEncoder == 0) {
                Log.i(TAG, "Great, it supports h.264 hardware encoder!");
            }
        }
        else if (videoEncodeType == 2)
        {
            int hevcHWKbps = setHardwareEncoderKbps(false, sreenWindowWidth,
                    screenWindowHeight);
            Log.i(TAG, "hevcHWKbps: " + hevcHWKbps);
            int isSupportHevcHWEncoder = libPublisher
                    .SetSmartPublisherVideoHevcHWEncoder(publisherHandle, hevcHWKbps);
            if (isSupportHevcHWEncoder == 0) {
                Log.i(TAG, "Great, it supports hevc hardware encoder!");
            }
        }
        if(is_sw_vbr_mode)
        {
            int is_enable_vbr = 1;
            int video_quality = CalVideoQuality(sreenWindowWidth,
                    screenWindowHeight, true);
            int vbr_max_bitrate = CalVbrMaxKBitRate(sreenWindowWidth,
                    screenWindowHeight);
            libPublisher.SmartPublisherSetSwVBRMode(publisherHandle, is_enable_vbr, video_quality, vbr_max_bitrate);
        }
        // For audio-related settings, refer to the SmartPublisher project
        /*
        if (!is_speex)
        {
            // set AAC encoder
            libPublisher.SmartPublisherSetAudioCodecType(publisherHandle, 1);
        }
        else
        {
            // set Speex encoder
            libPublisher.SmartPublisherSetAudioCodecType(publisherHandle, 2);
            libPublisher.SmartPublisherSetSpeexEncoderQuality(publisherHandle, 8);
        }
        libPublisher.SmartPublisherSetNoiseSuppression(publisherHandle, is_noise_suppression ? 1 : 0);
        libPublisher.SmartPublisherSetAGC(publisherHandle, is_agc ? 1 : 0);
        */
        // libPublisher.SmartPublisherSetClippingMode(publisherHandle, 0);
        //libPublisher.SmartPublisherSetSWVideoEncoderProfile(publisherHandle, sw_video_encoder_profile);
        //libPublisher.SmartPublisherSetSWVideoEncoderSpeed(publisherHandle, sw_video_encoder_speed);
        // libPublisher.SetRtmpPublishingType(publisherHandle, 0);
        libPublisher.SmartPublisherSetFPS(publisherHandle, 18);    // frame rate is adjustable
        libPublisher.SmartPublisherSetGopInterval(publisherHandle, 18*3);
        //libPublisher.SmartPublisherSetSWVideoBitRate(publisherHandle, 1200, 2400); // software encoder only; the max bitrate is usually twice the average bitrate
        libPublisher.SmartPublisherSetSWVideoEncoderSpeed(publisherHandle, 3);
        //libPublisher.SmartPublisherSaveImageFlag(publisherHandle, 1);
    }
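
setHardwareEncoderKbps(), CalVideoQuality() and CalVbrMaxKBitRate() are helpers from the demo project that are not listed here. Purely as an illustration, a simple resolution-based bitrate heuristic could look like the sketch below (the real demo may compute these values differently):

    // Illustrative bitrate heuristic (assumption): pick a hardware-encoder
    // bitrate from the pixel count, giving H.264 a higher budget than HEVC.
    private int setHardwareEncoderKbps(boolean isH264, int width, int height) {
        int pixels = width * height;
        int kbps;
        if (pixels <= 1280 * 720) {
            kbps = 1500;
        } else if (pixels <= 1920 * 1080) {
            kbps = 2500;
        } else {
            kbps = 4000;
        }
        return isH264 ? kbps : kbps * 3 / 4;   // HEVC typically needs roughly 25% less
    }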

Preparing to push | record | start the RTSP service

    @SuppressWarnings("deprecation")
    @Override
    public void onStart(Intent intent, int startId) {
        super.onStart(intent, startId);
        Log.i(TAG, "onStart++");
        if (libPublisher == null)
            return;
        clearAllImages();
        screenResolution = intent.getExtras().getInt("SCREENRESOLUTION");
        videoEncodeType = intent.getExtras().getInt("VIDEOENCODETYPE");
        push_type = intent.getExtras().getInt("PUSHTYPE");
        Log.i(TAG, "push_type: " + push_type);
        mWindowManager = (WindowManager) getSystemService(Service.WINDOW_SERVICE); // window manager
        createScreenEnvironment();
        startRecorderScreen();
        // If pushing and recording at the same time, this only needs to be configured once
        InitAndSetConfig();
        if ( publisherHandle == 0 )
        {
            stopScreenCapture();
            return;
        }
        if(push_type == PUSH_TYPE_RTMP)
        {
            String publishURL = intent.getStringExtra("PUBLISHURL");
            Log.i(TAG, "publishURL: " + publishURL);
            if (libPublisher.SmartPublisherSetURL(publisherHandle, publishURL) != 0) {
                stopScreenCapture();
                Log.e(TAG, "Failed to set publish stream URL..");
                if (publisherHandle != 0) {
                    if (libPublisher != null) {
                        libPublisher.SmartPublisherClose(publisherHandle);
                        publisherHandle = 0;
                    }
                }
                return;
            }
        }
        // Start the data-posting thread
        post_data_thread = new Thread(new DataRunnable());
        Log.i(TAG, "new post_data_thread..");
        is_post_data_thread_alive = true;
        post_data_thread.start();
        // Recording-related ++
        is_need_local_recorder = intent.getExtras().getBoolean("RECORDER");
        if(is_need_local_recorder)
        {
            ConfigRecorderParam();
            int startRet = libPublisher.SmartPublisherStartRecorder(publisherHandle);
            if( startRet != 0 )
            {
                isRecording = false;
                Log.e(TAG, "Failed to start recorder..");
            }
            else
            {
                isRecording = true;
            }
        }
        // Recording-related --
        if(push_type == PUSH_TYPE_RTMP)
        {
            Log.i(TAG, "RTMP Pusher mode..");
            // RTMP push-related ++
            int startRet = libPublisher.SmartPublisherStartPublisher(publisherHandle);
            if (startRet != 0) {
                isPushingRtmp = false;
                Log.e(TAG, "Failed to start push rtmp stream..");
                return;
            }
            else
            {
                isPushingRtmp = true;
            }
            // RTMP push-related --
        }
        else if(push_type == PUSH_TYPE_RTSP)
        {
            Log.i(TAG, "RTSP Internal Server mode..");
            rtsp_handle_ = libPublisher.OpenRtspServer(0);
            if (rtsp_handle_ == 0) {
                Log.e(TAG, "创建rtsp server实例失败! 请检查SDK有效性");
            } else {
                int port = 8554;
                if (libPublisher.SetRtspServerPort(rtsp_handle_, port) != 0) {
                    libPublisher.CloseRtspServer(rtsp_handle_);
                    rtsp_handle_ = 0;
                    Log.e(TAG, "创建rtsp server端口失败! 请检查端口是否重复或者端口不在范围内!");
                }
                //String user_name = "admin";
                //String password = "12345";
                //libPublisher.SetRtspServerUserNamePassword(rtsp_handle_, user_name, password);
                if (libPublisher.StartRtspServer(rtsp_handle_, 0) == 0) {
                    Log.i(TAG, "启动rtsp server 成功!");
                } else {
                    libPublisher.CloseRtspServer(rtsp_handle_);
                    rtsp_handle_ = 0;
                    Log.e(TAG, "启动rtsp server失败! 请检查设置的端口是否被占用!");
                    return;
                }
                isRTSPServiceRunning = true;
            }
            if(isRTSPServiceRunning)
            {
                Log.i(TAG, "onClick start rtsp publisher..");
                String rtsp_stream_name = "stream1";
                libPublisher.SetRtspStreamName(publisherHandle, rtsp_stream_name);
                libPublisher.ClearRtspStreamServer(publisherHandle);
                libPublisher.AddRtspStreamServer(publisherHandle, rtsp_handle_, 0);
                if (libPublisher.StartRtspStream(publisherHandle, 0) != 0) {
                    Log.e(TAG, "调用发布rtsp流接口失败!");
                    return;
                }
                isRTSPPublisherRunning = true;
            }
        }
        // If pushing and recording at the same time, audio only needs to be started once
        CheckInitAudioRecorder();
        Log.i(TAG, "onStart--");
    }
    private void stopPush() {
        if(!isPushingRtmp)
        {
            return;
        }
        if (!isRecording && !isRTSPPublisherRunning) {
            if (audioRecord_ != null) {
                Log.i(TAG, "stopPush, call audioRecord_.StopRecording..");
                audioRecord_.Stop();
                if (audioRecordCallback_ != null) {
                    audioRecord_.RemoveCallback(audioRecordCallback_);
                    audioRecordCallback_ = null;
                }
                audioRecord_ = null;
            }
        }
        if (libPublisher != null) {
            libPublisher.SmartPublisherStopPublisher(publisherHandle);
        }
        if (!isRecording && !isRTSPPublisherRunning) {
            if (publisherHandle != 0) {
                if (libPublisher != null) {
                    libPublisher.SmartPublisherClose(publisherHandle);
                    publisherHandle = 0;
                }
            }
        }
    }

Stopping push | recording | the RTSP service

   private void stopRecorder() {
        if(!isRecording)
        {
            return;
        }
        if (!isPushingRtmp && !isRTSPPublisherRunning) {
            if (audioRecord_ != null) {
                Log.i(TAG, "stopRecorder, call audioRecord_.StopRecording..");
                audioRecord_.Stop();
                if (audioRecordCallback_ != null) {
                    audioRecord_.RemoveCallback(audioRecordCallback_);
                    audioRecordCallback_ = null;
                }
                audioRecord_ = null;
            }
        }
        if (libPublisher != null) {
            libPublisher.SmartPublisherStopRecorder(publisherHandle);
        }
        if (!isPushingRtmp && !isRTSPPublisherRunning) {
            if (publisherHandle != 0) {
                if (libPublisher != null) {
                    libPublisher.SmartPublisherClose(publisherHandle);
                    publisherHandle = 0;
                }
            }
        }
    }
    // Stop publishing the RTSP stream
    private void stopRtspPublisher() {
        if(!isRTSPPublisherRunning)
        {
            return;
        }
        if (!isPushingRtmp && !isRecording) {
            if (audioRecord_ != null) {
                Log.i(TAG, "stopRtspPublisher, call audioRecord_.StopRecording..");
                audioRecord_.Stop();
                if (audioRecordCallback_ != null) {
                    audioRecord_.RemoveCallback(audioRecordCallback_);
                    audioRecordCallback_ = null;
                }
                audioRecord_ = null;
            }
        }
        if (libPublisher != null) {
            libPublisher.StopRtspStream(publisherHandle);
        }
        if (!isPushingRtmp && !isRecording) {
            if (publisherHandle != 0) {
                if (libPublisher != null) {
                    libPublisher.SmartPublisherClose(publisherHandle);
                    publisherHandle = 0;
                }
            }
        }
    }
    // Stop the RTSP service
    private void stopRtspService() {
        if(!isRTSPServiceRunning)
        {
            return;
        }
        if (libPublisher != null && rtsp_handle_ != 0) {
            libPublisher.StopRtspServer(rtsp_handle_);
            libPublisher.CloseRtspServer(rtsp_handle_);
            rtsp_handle_ = 0;
        }
    }

Interested developers can use the above as a reference.
