iOS Tips: Custom Camera (Fundamentals)

Summary: commonly used basic features

Introduction

I. Commonly Used Basic Features

1.1 Simulating the Shutter Action

Requires the AudioToolbox framework (#import <AudioToolbox/AudioToolbox.h>).

            // Vibrate the device
            AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
            // Play the "shutter" sound (system sound ID 1108) to simulate taking a photo
            AudioServicesPlaySystemSound(1108);

1.2 Checking Whether the Camera Can Be Switched (Front/Back)

// Switching is possible only when the device has more than one video-capable camera
- (BOOL)canSwitchCameras {
    return [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count] > 1;
}
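
devicesWithMediaType: has been deprecated since iOS 10. A minimal sketch of the same check using AVCaptureDeviceDiscoverySession (the device-type list here is an assumption; extend it if you also want telephoto or ultra-wide cameras):

- (BOOL)canSwitchCameras {
    if (@available(iOS 10.0, *)) {
        AVCaptureDeviceDiscoverySession *discovery =
            [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                                   mediaType:AVMediaTypeVideo
                                                                    position:AVCaptureDevicePositionUnspecified];
        return discovery.devices.count > 1;
    }
    return [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count] > 1;
}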

1.3 Detecting Faces from the Metadata Output

Implement the output's delegate, AVCaptureMetadataOutputObjectsDelegate:

        _metadataOutput = [[AVCaptureMetadataOutput alloc] init];
        [_metadataOutput setMetadataObjectsDelegate:self queue:self.queue];
        // Keep the video-data-output delegate detached for now; it is attached only once a face enters the target region (see below)
        [self.videoDataOutput setSampleBufferDelegate:nil queue:self.queue];
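
For face metadata to be delivered at all, the output must also declare the face object type after it has been added to the session. A minimal sketch of that step (self.session is the same capture-session property used elsewhere in this post):

        if ([self.session canAddOutput:_metadataOutput]) {
            [self.session addOutput:_metadataOutput];
            // metadataObjectTypes may only be set after the output has been added to a session
            _metadataOutput.metadataObjectTypes = @[AVMetadataObjectTypeFace];
        }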

// Face detection is used to obtain the face region so it can be compared with the ID-card portrait frame; only when the former lies inside the latter can a complete ID-card image be captured
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    if (metadataObjects.count) {
        AVMetadataObject *metadataObject = metadataObjects.firstObject;
        
        // Convert the metadata object from capture-device coordinates into the preview layer's coordinate space
        AVMetadataObject *transformedMetadataObject = [self.previewLayer transformedMetadataObjectForMetadataObject:metadataObject];
        CGRect faceRegion = transformedMetadataObject.bounds;
        
        if ([metadataObject.type isEqualToString:AVMetadataObjectTypeFace]) {
            NSLog(@"Face inside frame: %d, facePathRect: %@, faceRegion: %@", CGRectContainsRect(self.faceDetectionFrame, faceRegion), NSStringFromCGRect(self.faceDetectionFrame), NSStringFromCGRect(faceRegion));
            
            // Only when the face region lies entirely inside the portrait frame do we start capturing frames
            if (CGRectContainsRect(self.faceDetectionFrame, faceRegion)) {
                // Attaching the sample-buffer delegate makes videoDataOutput deliver every frame to the delegate method below
                if (!self.videoDataOutput.sampleBufferDelegate) {
                    [self.videoDataOutput setSampleBufferDelegate:self queue:self.queue];
                }
            }
        }
    }
}

1.4 Capturing Every Frame: AVCaptureVideoDataOutputSampleBufferDelegate

Set the delegate:

        _videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
        [self.videoDataOutput setSampleBufferDelegate:self queue:self.queue];

Capture individual image frames from the output data stream:


#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate
#pragma mark Capture individual image frames from the output data stream
// AVCaptureVideoDataOutput delivers live frames; this callback fires very frequently, roughly as fast as the screen refreshes
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if ([self.outPutSetting isEqualToNumber:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]] || [self.outPutSetting isEqualToNumber:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]]) {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        
        if ([captureOutput isEqual:self.videoDataOutput]) {
            // ID-card recognition: +(void)iDCardRecognit:(CVImageBufferRef)imageBuffer WithstopRunningBlcok:(void(^)(id make))stopRunningBlcok finishBlock:(k_finishBlock)finishBlock;
            __weak __typeof__(self) weakSelf = self;
            
            [KNScanCardManage iDCardRecognit:imageBuffer WithstopRunningBlcok:^(id _Nonnull sender) {
                if ([weakSelf.session isRunning]) {
                    [weakSelf.session stopRunning];
                }
            } finishBlock:weakSelf.finishBlock];
            
            // Once ID-card recognition is done, detach the videoDataOutput delegate to stop the flood of AVCaptureVideoDataOutputSampleBufferDelegate callbacks
            if (self.videoDataOutput.sampleBufferDelegate) {
                [self.videoDataOutput setSampleBufferDelegate:nil queue:self.queue];
            }
        }
    } else {
        NSLog(@"Unsupported output format");
    }
}
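
The outPutSetting check above assumes the video data output was configured with one of the bi-planar YUV pixel formats. A minimal configuration sketch (the outPutSetting property itself comes from the code above; the rest is standard AVFoundation):

        self.outPutSetting = @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange);
        _videoDataOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : self.outPutSetting};
        // Drop frames that arrive while the delegate is still busy, to keep latency low
        _videoDataOutput.alwaysDiscardsLateVideoFrames = YES;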

1.5 Tap-to-Focus

Listen for the tap gesture:

    UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(focusGesture:)];
    [self.view addGestureRecognizer:tapGesture];

Focus at the tapped point:

#pragma mark - Tap-to-focus
- (void)focusGesture:(UITapGestureRecognizer *)gesture {
    CGPoint point = [gesture locationInView:gesture.view];
    [self focusAtPoint:point];
}
- (void)focusAtPoint:(CGPoint)point {
    CGSize size = self.view.bounds.size;
    // Convert the view coordinate into the capture device's normalized (0,0)-(1,1) point of interest; this mapping assumes a portrait UI over the sensor's native landscape orientation
    CGPoint focusPoint = CGPointMake(point.y / size.height, 1 - point.x / size.width);
    NSError *error;
    if ([self.device lockForConfiguration:&error]) {
        
        if ([self.device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
            [self.device setFocusPointOfInterest:focusPoint];
            [self.device setFocusMode:AVCaptureFocusModeAutoFocus];
        }
        
        if ([self.device isExposureModeSupported:AVCaptureExposureModeAutoExpose]) {
            [self.device setExposurePointOfInterest:focusPoint];
            [self.device setExposureMode:AVCaptureExposureModeAutoExpose];
        }
        
        [self.device unlockForConfiguration];
        self.focusView.center = point;
        _focusView.hidden = NO;
        
        [UIView animateWithDuration:0.2 animations:^{
            self.focusView.transform = CGAffineTransformMakeScale(1.25f, 1.25f);
        } completion:^(BOOL finished) {
            [UIView animateWithDuration:0.3 animations:^{
                self.focusView.transform = CGAffineTransformMakeScale(1.0f, 1.0f);
            } completion:^(BOOL finished) {
                [self hiddenFocusAnimation];
            }];
        }];
    }
}
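
The manual coordinate conversion above only holds for that specific orientation. A minimal alternative sketch, assuming the preview layer is an AVCaptureVideoPreviewLayer stored in self.previewLayer (the helper name devicePointForViewPoint: is an assumption), lets the layer do the conversion for any orientation and videoGravity:

// Map the tapped view point into the capture device's normalized point-of-interest space
- (CGPoint)devicePointForViewPoint:(CGPoint)point {
    return [self.previewLayer captureDevicePointOfInterestForPoint:point];
}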
- (void)hiddenFocusAnimation {
    [UIView beginAnimations:nil context:NULL];
    [UIView setAnimationDelay:3];
    self.focusView.alpha = 0;
    [UIView setAnimationDuration:0.5f]; // animation duration
    [UIView commitAnimations];
}
- (void)hiddenFoucsView {
    self.focusView.alpha = !self.focusView.alpha;
}

- (void)focusDidFinsh {
    self.focusView.hidden = YES;
    self.focusView.transform = CGAffineTransformMakeScale(1.0f, 1.0f);
}

- (void)startFocusAnimation {
    self.focusView.hidden = NO;
    self.focusView.transform = CGAffineTransformMakeScale(1.25f, 1.25f); // start slightly enlarged
    [UIView beginAnimations:nil context:NULL];
    [UIView setAnimationDelegate:self]; // the did-stop selector only fires when a delegate is set
    [UIView setAnimationDidStopSelector:@selector(hiddenFocusAnimation)];
    [UIView setAnimationDuration:0.5f]; // animation duration
    self.focusView.transform = CGAffineTransformIdentity; // animate back to the normal size
    [UIView commitAnimations]; // start the animation
    // For a small-to-large effect, swap the two scale values
}

Initialize the focus indicator view:

#pragma mark - Focus indicator view
- (UIImageView *)focusView {
    if (_focusView == nil) {
        _focusView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 80, 80)];
        _focusView.backgroundColor = [UIColor clearColor];
//        _focusView.image = [UIImage imageNamed:@"foucs80pt"];
        _focusView.hidden = YES;
        [self.view addSubview:_focusView];
    }
    return _focusView;
}

Add a "tap the screen to focus" hint label:

- (UILabel *)tip4focusLabel {
    if (_tip4focusLabel == nil) {
        UILabel *tmp = [UILabel new];
        _tip4focusLabel = tmp;
        
        tmp.textColor = [UIColor whiteColor];
        tmp.numberOfLines = 0;
        tmp.textAlignment = NSTextAlignmentCenter;
        tmp.font = kPingFangFont(16);
        
        NSMutableAttributedString *xx = [[NSMutableAttributedString alloc] init];
        xx.kn_addString(@" 触摸屏幕对焦").kn_fontColor(rgb(255,255,255)).kn_addString(@"").kn_fontColor(rgb(225,66,66)).kn_addString(@"").kn_fontColor(rgb(255,255,255));
        tmp.attributedText = xx;
        
        // Rotate the label 90° so it reads correctly in the landscape-oriented ID-scan UI
        tmp.transform = CGAffineTransformMakeRotation(M_PI/2);
        
        [self.view addSubview:tmp];
        
        __weak __typeof__(self) weakSelf = self;
        [tmp mas_makeConstraints:^(MASConstraintMaker *make) {
            make.centerX.equalTo(weakSelf.view.mas_left).offset(kAdjustRatio(23));
            make.centerY.offset(kAdjustRatio(0));
        }];
    }
    return _tip4focusLabel;
}


1.6 Aspect Ratios of the ID Card and the Portrait

ID-card aspect ratio (a standard ID-1 card measures 85.6 mm × 54.0 mm, so the long/short-side ratio is roughly 1.59; the constants below are tuned per device class):

    CGFloat width = iPhone5or5cor5sorSE? 240: (iPhone6or6sor7? 270: 300);
    _IDCardScanningWindowLayer.bounds = (CGRect){CGPointZero, {width, width * 1.574}};

Portrait-frame aspect ratio:

    CGFloat facePathWidth = iPhone5or5cor5sorSE? 125: (iPhone6or6sor7? 150: 180);
    CGFloat facePathHeight = facePathWidth * 0.812;

1.7 Adjusting Screen Brightness

A common scenario: raise the brightness when a particular screen is opened, and restore it when that screen is dismissed.

This is similar to how Alipay and WeChat brighten the screen while presenting a QR code to be scanned.

// Get the current screen brightness
CGFloat value = [UIScreen mainScreen].brightness;

// Set the screen brightness (the valid range is 0.0 - 1.0)
[[UIScreen mainScreen] setBrightness:0.5];

// Keep the screen from dimming and auto-locking
[UIApplication sharedApplication].idleTimerDisabled = YES;

// Restore the default idle behavior
[UIApplication sharedApplication].idleTimerDisabled = NO;
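
A minimal sketch of the save-and-restore pattern described above (the savedBrightness property is an assumption, not part of the original code):

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    self.savedBrightness = [UIScreen mainScreen].brightness; // remember the user's brightness
    [[UIScreen mainScreen] setBrightness:1.0];
    [UIApplication sharedApplication].idleTimerDisabled = YES;
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [[UIScreen mainScreen] setBrightness:self.savedBrightness]; // restore on exit
    [UIApplication sharedApplication].idleTimerDisabled = NO;
}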

1.8 Reading the Ambient Light Level Sensed by the iPhone Camera

The larger the value, the brighter the ambient light.

Use case: during OCR with a custom camera, when the scene is detected to be dark, prompt the user to turn on the torch (or turn it on automatically).

#import <ImageIO/ImageIO.h>
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Read the sample buffer's EXIF attachment and extract its BrightnessValue
    CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    NSDictionary *metadata = [[NSMutableDictionary alloc] initWithDictionary:(__bridge NSDictionary *)metadataDict];
    CFRelease(metadataDict);
    NSDictionary *exifMetadata = [[metadata objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
    float brightnessValue = [[exifMetadata objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue] floatValue];
    
    NSLog(@"%f", brightnessValue);
    if (self.delegate && [self.delegate respondsToSelector:@selector(QRCodeScanManager:brightnessValue:)]) {
        [self.delegate QRCodeScanManager:self brightnessValue:brightnessValue];
    }
}

Delegate method for toggling the torch based on the light level:

/** Called with the ambient light level (brightnessValue) so the torch can be toggled */
- (void)QRCodeScanManager:(SGQRCodeScanManager *)scanManager brightnessValue:(CGFloat)brightnessValue;

When the light is bright, hide the flashlight button; when it is dim, show it:

    if (brightness > 0) {
        // Bright enough: hide the button
        [self.maskView hideLightButton];
    } else {
        // Too dark: show the button
        [self.maskView showLightButton];
    }

1.9 Flashlight (Torch)

Declare the property:

/** Flashlight button */
@property (nonatomic, strong) UIButton *flashlight;

Initialize the flashlight button:

    CGFloat flashlight_width = 40;
    CGFloat flashlight_height = 40;
    
    self.flashlight = [UIButton buttonWithType:UIButtonTypeCustom];
    self.flashlight.frame = CGRectMake(0, 0, flashlight_width, flashlight_height);
    // The selected-state icon is icon_shoukuan_shoudian_p
    [self.flashlight setImage:[UIImage imageNamed:@"icon_shoukuan_shoudian"] forState:UIControlStateNormal];
    [self.flashlight addTarget:self action:@selector(flashlightAction:) forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:self.flashlight];

Handle the flashlight on/off action in flashlightAction:

#pragma mark - Turn the flashlight on/off

- (void)flashlightAction:(UIButton *)sender {
    sender.selected = !sender.selected;
    if (sender.selected) {
        [sender setImage:[UIImage imageNamed:@"icon_shoukuan_shoudian_p"] forState:UIControlStateSelected];
        self.flashlightHintLabel.textColor = ZFColor(133, 235, 0, 1);
        
        // Turn on the torch
        AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        NSError *error = nil;
        
        if ([captureDevice hasTorch]) {
            BOOL locked = [captureDevice lockForConfiguration:&error]; // request exclusive access to the hardware
            if (locked) {
                captureDevice.torchMode = AVCaptureTorchModeOn;
                [captureDevice unlockForConfiguration]; // release exclusive access
            }
        }
        
    } else {
        // Restore the normal-state icon
        [sender setImage:[UIImage imageNamed:@"icon_shoukuan_shoudian"] forState:UIControlStateNormal];
        self.flashlightHintLabel.textColor = ZFWhite;
        
        // Turn off the torch
        AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        if ([device hasTorch]) {
            [device lockForConfiguration:nil];
            [device setTorchMode:AVCaptureTorchModeOff];
            [device unlockForConfiguration];
        }
    }
}
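
The two branches above duplicate the lock/configure/unlock dance. A minimal refactoring sketch (the helper name setTorchOn: is an assumption) that keeps the hardware-locking logic in one place:

- (void)setTorchOn:(BOOL)on {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (![device hasTorch]) { return; } // not every device has a torch (e.g. front camera, some iPads)
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) { // request exclusive access to the hardware
        device.torchMode = on ? AVCaptureTorchModeOn : AVCaptureTorchModeOff;
        [device unlockForConfiguration];        // release exclusive access
    }
}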

II. Common Views

2.1 Scan Line

2.1.1 Implementation with Core Animation (CABasicAnimation)

  • Scan-line view
    UIImage *scanLine = [UIImage imageNamed:@"img_shoukuan_red"];
    
    self.scanLineImg = [[UIImageView alloc] init];
    self.scanLineImg.image = scanLine;
    self.scanLineImg.contentMode = UIViewContentModeScaleAspectFit;
    [self addSubview:self.scanLineImg];
    [self.scanLineImg.layer addAnimation:[self animation] forKey:nil];
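
Core Animation removes a layer's animations when the app enters the background, so the scan line stops after returning to the foreground. A minimal sketch (observer placement and lifetime management are assumptions) that re-adds the animation:

    __weak __typeof__(self) weakSelf = self;
    [[NSNotificationCenter defaultCenter] addObserverForName:UIApplicationWillEnterForegroundNotification
                                                      object:nil
                                                       queue:[NSOperationQueue mainQueue]
                                                  usingBlock:^(NSNotification *note) {
        // Layer animations were removed while in the background; add the scan-line animation again
        [weakSelf.scanLineImg.layer addAnimation:[weakSelf animation] forKey:nil];
    }];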

Animation:

/**
 *  Scan-line animation
 */
- (CABasicAnimation *)animation {
    CABasicAnimation *animation = [CABasicAnimation animationWithKeyPath:@"position"];
    animation.duration = 3;
    animation.fillMode = kCAFillModeForwards;
    animation.removedOnCompletion = NO;
    animation.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionLinear];
    animation.repeatCount = MAXFLOAT;
    
    // First rotation (orientation change)
    if (_isFirstTransition) {
        // Landscape
        if ([[UIApplication sharedApplication] statusBarOrientation] == UIInterfaceOrientationLandscapeLeft || [[UIApplication sharedApplication] statusBarOrientation] == UIInterfaceOrientationLandscapeRight) {
            
            animation.fromValue = [NSValue valueWithCGPoint:CGPointMake(self.center.x, (self.center.y - self.frame.size.height * ZFScanRatio * 0.3 + self.scanLineImg.image.size.height * 0.3))];
            animation.toValue = [NSValue valueWithCGPoint:CGPointMake(self.center.x, (self.center.y + self.frame.size.height * ZFScanRatio * 0.3 - self.scanLineImg.image.size.height * 0.3))];
            
        // Portrait
        } else {
            
            animation.fromValue = [NSValue valueWithCGPoint:CGPointMake(self.center.x, self.center.y + self.frame.size.width * ZFScanRatio * 0.1 - self.frame.size.width * ZFScanRatio * 0.9)];
            animation.toValue = [NSValue valueWithCGPoint:CGPointMake(self.center.x, self.center.y + self.frame.size.width * ZFScanRatio * 0.1 - self.scanLineImg.image.size.height * 0.5)];
            
//            animation.fromValue = [NSValue valueWithCGPoint:CGPointMake(self.center.x, (self.topLeftImg.y))];
//            animation.toValue = [NSValue valueWithCGPoint:CGPointMake(self.center.x, self.bottomLeftImg.y)];
        }
        
        _isFirstTransition = NO;
        
    // Not the first rotation
    } else {
        // Landscape
        if ([[UIApplication sharedApplication] statusBarOrientation] == UIInterfaceOrientationLandscapeLeft || [[UIApplication sharedApplication] statusBarOrientation] == UIInterfaceOrientationLandscapeRight) {
            
            animation.fromValue = [NSValue valueWithCGPoint:CGPointMake(self.center.x, (self.frame.size.height - (self.frame.size.width * ZFScanRatio)) * 0.3)];
            animation.toValue = [NSValue valueWithCGPoint:CGPointMake(self.center.x, self.scanLineImg.frame.origin.y + self.frame.size.width * ZFScanRatio - self.scanLineImg.frame.size.height * 0.3)];
            
        // Portrait
        } else {
            
            animation.fromValue = [NSValue valueWithCGPoint:CGPointMake(self.center.x, (self.frame.size.height - (self.frame.size.height * ZFScanRatio)) * 0.3)];
            animation.toValue = [NSValue valueWithCGPoint:CGPointMake(self.center.x, self.scanLineImg.frame.origin.y + self.frame.size.height * ZFScanRatio - self.scanLineImg.frame.size.height * 0.3)];
        }
    }
    
    return animation;
}

Remove the animation:

/**
 *  Remove the scan-line animation
 */
- (void)removeAnimation {
    [self.scanLineImg.layer removeAllAnimations];
}

Set the scan line's frame:

// CGFloat const ZFScanRatio = 0.68f;

        self.scanLineImg.frame = CGRectMake((self.frame.size.width - (self.frame.size.width * ZFScanRatio)) * 0.5,
                                            (self.frame.size.height - (self.frame.size.width * ZFScanRatio)) * tmp3,
                                            self.frame.size.width * ZFScanRatio, scanLine.size.height);
        

2.1.2 Implementing a Horizontal Scan Line by Calling setNeedsDisplay on a Timer

  • Use a timer to drive the horizontal scan line: each tick calls setNeedsDisplay so the view is redrawn.
#pragma mark - Add a timer
- (void)addTimer {
    // Note: scheduledTimerWithTimeInterval:target:... retains its target, so dealloc below will not run until the timer is invalidated elsewhere
    _timer = [NSTimer scheduledTimerWithTimeInterval:0.02 target:self selector:@selector(timerFire:) userInfo:nil repeats:YES];
    [_timer fire];
}

- (void)timerFire:(id)notice {
    [self setNeedsDisplay]; // marks the receiver's entire bounds rectangle as needing to be redrawn
}

- (void)dealloc {
    [_timer invalidate];
}
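
A minimal alternative sketch (iOS 10+) that uses the block-based timer API, so the timer does not retain the view and dealloc can actually run:

- (void)addTimer {
    __weak __typeof__(self) weakSelf = self;
    _timer = [NSTimer scheduledTimerWithTimeInterval:0.02 repeats:YES block:^(NSTimer *timer) {
        [weakSelf setNeedsDisplay]; // trigger a redraw on every tick
    }];
}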

- (void)drawRect:(CGRect)rect {
    rect = _IDCardScanningWindowLayer.frame;
    
    // Portrait hint frame
//    UIBezierPath *facePath = [UIBezierPath bezierPathWithRect:_facePathRect];
//    facePath.lineWidth = 1.5;
//    [[UIColor whiteColor] set];
//    [facePath stroke];
    
    // Horizontal scan line
    CGContextRef context = UIGraphicsGetCurrentContext();
    
    static CGFloat moveX = 0;
    static CGFloat distanceX = 0;
    
    CGContextBeginPath(context);
    CGContextSetLineWidth(context, 2);
    CGContextSetRGBStrokeColor(context, 0.3, 0.8, 0.3, 0.8);
    CGPoint p1, p2; // p1 and p2 form the horizontal scan line
    
    // Bounce the line back and forth across the scanning window
    moveX += distanceX;
    if (moveX >= CGRectGetWidth(rect) - 2) {
        distanceX = -2;
    } else if (moveX <= 2) {
        distanceX = 2;
    }
    
    p1 = CGPointMake(CGRectGetMaxX(rect) - moveX, rect.origin.y);
    p2 = CGPointMake(CGRectGetMaxX(rect) - moveX, rect.origin.y + rect.size.height);
    
    CGContextMoveToPoint(context, p1.x, p1.y);
    CGContextAddLineToPoint(context, p2.x, p2.y);
    
    /*
     // Vertical scan line
     static CGFloat moveY = 0;
     static CGFloat distanceY = 0;
     CGPoint p3, p4; // p3 and p4 form the vertical scan line
     
     moveY += distanceY;
     if (moveY >= CGRectGetHeight(rect) - 2) {
         distanceY = -2;
     } else if (moveY <= 2) {
         distanceY = 2;
     }
     p3 = CGPointMake(rect.origin.x, rect.origin.y + moveY);
     p4 = CGPointMake(rect.origin.x + rect.size.width, rect.origin.y + moveY);
     
     CGContextMoveToPoint(context, p3.x, p3.y);
     CGContextAddLineToPoint(context, p4.x, p4.y);
     */
    
    CGContextStrokePath(context);
}

2.2 iOS 13 Adaptation (Sheet-Style present Issue)

A custom camera is best presented modally, with modalPresentationStyle set to UIModalPresentationFullScreen.

If the iOS 13 sheet-style present is not handled, it can break things such as pull-to-refresh in the presenting list:
https://blog.csdn.net/z929118967/article/details/104477314

On iOS 13, modalPresentationStyle no longer defaults to UIModalPresentationFullScreen, so set it explicitly where needed.
- (void)K_presentViewController:(UIViewController *)viewControllerToPresent animated:(BOOL)flag completion:(void (^)(void))completion {
    if (@available(iOS 13.0, *)) {
        if (viewControllerToPresent.K_automaticallySetModalPresentationStyle) {
            viewControllerToPresent.modalPresentationStyle = UIModalPresentationFullScreen;
        }
        [self K_presentViewController:viewControllerToPresent animated:flag completion:completion];
    } else {
        // Fallback on earlier versions
        [self K_presentViewController:viewControllerToPresent animated:flag completion:completion];
    }
}
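
Calling [self K_presentViewController:...] inside the method only makes sense if it has been exchanged with the system presentViewController:animated:completion:. A minimal swizzling sketch, assuming a UIViewController category (the category name KPresent is an assumption):

#import <objc/runtime.h>

@implementation UIViewController (KPresent)

+ (void)load {
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        // Exchange implementations so every present goes through K_presentViewController:
        Method original = class_getInstanceMethod(self, @selector(presentViewController:animated:completion:));
        Method swizzled = class_getInstanceMethod(self, @selector(K_presentViewController:animated:completion:));
        method_exchangeImplementations(original, swizzled);
    });
}

@end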



iOS 13 adaptation: flexibly controlling the modal presentation style (full screen vs. swipe-down-to-dismiss); a complete demo is provided in:
https://kunnan.blog.csdn.net/article/details/106538442