I am trying to find the accepted formats on an AVFoundation output:
self.theOutput = [[AVCaptureVideoDataOutput alloc] init];
if ([self.theSession canAddOutput:self.theOutput]) {
    [self.theSession addOutput:self.theOutput];
}
I then set a breakpoint right after that and run:
po [self.theOutput availableVideoCVPixelFormatTypes]
and I get this:
(NSArray *) $5 = 0x2087ad00 <__NSArrayM 0x2087ad00>(
875704438,
875704422,
1111970369
)
How do I get the string values of these format types?
Thanks
On an iPhone 5 running iOS 6, here are the AVCaptureVideoDataOutput availableVideoCVPixelFormatTypes:
kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
kCVPixelFormatType_32BGRA
Credit where credit is due: I found the approach for mapping these values here: https://gist.github.com/2327666
A category version for debugging, implemented as a category on NSNumber:
#import <Foundation/Foundation.h>
#import <CoreVideo/CoreVideo.h>

@interface NSNumber (CVPixelFormatType)
- (NSString *)descriptivePixelFormat;
@end

@implementation NSNumber (CVPixelFormatType)

// Maps a boxed CVPixelFormatType value to the name of its constant.
- (NSString *)descriptivePixelFormat
{
    NSString *name = @{
        @(kCVPixelFormatType_1Monochrome): @"kCVPixelFormatType_1Monochrome",
        @(kCVPixelFormatType_2Indexed): @"kCVPixelFormatType_2Indexed",
        @(kCVPixelFormatType_4Indexed): @"kCVPixelFormatType_4Indexed",
        @(kCVPixelFormatType_8Indexed): @"kCVPixelFormatType_8Indexed",
        @(kCVPixelFormatType_1IndexedGray_WhiteIsZero): @"kCVPixelFormatType_1IndexedGray_WhiteIsZero",
        @(kCVPixelFormatType_2IndexedGray_WhiteIsZero): @"kCVPixelFormatType_2IndexedGray_WhiteIsZero",
        @(kCVPixelFormatType_4IndexedGray_WhiteIsZero): @"kCVPixelFormatType_4IndexedGray_WhiteIsZero",
        @(kCVPixelFormatType_8IndexedGray_WhiteIsZero): @"kCVPixelFormatType_8IndexedGray_WhiteIsZero",
        @(kCVPixelFormatType_16BE555): @"kCVPixelFormatType_16BE555",
        @(kCVPixelFormatType_16LE555): @"kCVPixelFormatType_16LE555",
        @(kCVPixelFormatType_16LE5551): @"kCVPixelFormatType_16LE5551",
        @(kCVPixelFormatType_16BE565): @"kCVPixelFormatType_16BE565",
        @(kCVPixelFormatType_16LE565): @"kCVPixelFormatType_16LE565",
        @(kCVPixelFormatType_24RGB): @"kCVPixelFormatType_24RGB",
        @(kCVPixelFormatType_24BGR): @"kCVPixelFormatType_24BGR",
        @(kCVPixelFormatType_32ARGB): @"kCVPixelFormatType_32ARGB",
        @(kCVPixelFormatType_32BGRA): @"kCVPixelFormatType_32BGRA",
        @(kCVPixelFormatType_32ABGR): @"kCVPixelFormatType_32ABGR",
        @(kCVPixelFormatType_32RGBA): @"kCVPixelFormatType_32RGBA",
        @(kCVPixelFormatType_64ARGB): @"kCVPixelFormatType_64ARGB",
        @(kCVPixelFormatType_48RGB): @"kCVPixelFormatType_48RGB",
        @(kCVPixelFormatType_32AlphaGray): @"kCVPixelFormatType_32AlphaGray",
        @(kCVPixelFormatType_16Gray): @"kCVPixelFormatType_16Gray",
        @(kCVPixelFormatType_422YpCbCr8): @"kCVPixelFormatType_422YpCbCr8",
        @(kCVPixelFormatType_4444YpCbCrA8): @"kCVPixelFormatType_4444YpCbCrA8",
        @(kCVPixelFormatType_4444YpCbCrA8R): @"kCVPixelFormatType_4444YpCbCrA8R",
        @(kCVPixelFormatType_444YpCbCr8): @"kCVPixelFormatType_444YpCbCr8",
        @(kCVPixelFormatType_422YpCbCr16): @"kCVPixelFormatType_422YpCbCr16",
        @(kCVPixelFormatType_422YpCbCr10): @"kCVPixelFormatType_422YpCbCr10",
        @(kCVPixelFormatType_444YpCbCr10): @"kCVPixelFormatType_444YpCbCr10",
        @(kCVPixelFormatType_420YpCbCr8Planar): @"kCVPixelFormatType_420YpCbCr8Planar",
        @(kCVPixelFormatType_420YpCbCr8PlanarFullRange): @"kCVPixelFormatType_420YpCbCr8PlanarFullRange",
        @(kCVPixelFormatType_422YpCbCr_4A_8BiPlanar): @"kCVPixelFormatType_422YpCbCr_4A_8BiPlanar",
        @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange): @"kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange",
        @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange): @"kCVPixelFormatType_420YpCbCr8BiPlanarFullRange",
        @(kCVPixelFormatType_422YpCbCr8_yuvs): @"kCVPixelFormatType_422YpCbCr8_yuvs",
        @(kCVPixelFormatType_422YpCbCr8FullRange): @"kCVPixelFormatType_422YpCbCr8FullRange"
    }[self];

    // Fall back to the raw value so an unrecognised format doesn't return nil.
    return name ?: [NSString stringWithFormat:@"Unknown pixel format (%@)", self];
}

@end
Diagnostic output example
// Collect a human-readable name for each supported pixel format type.
NSMutableArray *mutablePixelFormatTypes = [NSMutableArray array];
[captureOutput.availableVideoCVPixelFormatTypes enumerateObjectsUsingBlock:^(NSNumber *formatType, NSUInteger idx, BOOL *stop) {
    [mutablePixelFormatTypes addObject:[formatType descriptivePixelFormat]];
}];

NSString *pixelFormats = [mutablePixelFormatTypes componentsJoinedByString:@",\n"];
NSLog(@"Available pixel formats:\n%@\n", pixelFormats);
When you call availableVideoCVPixelFormatTypes you get the decimal representation of each pixel format's four-character code (an OSType). Converting a value to hex lets you match some of them directly against the constants listed in Apple's documentation; for the rest, interpret the hex bytes as ASCII characters to recover the four-character code and match the label that way.
For example:
(Decimal)  ->  (Hex)       ->  (ASCII)
875704438  ->  0x34323076  ->  420v
875704422  ->  0x34323066  ->  420f
1111970369 ->  0x42475241  ->  BGRA
I found an "ASCII to Hex" converter site useful for doing the conversion by hand.
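If you'd rather not convert by hand, here is a minimal sketch of a helper that does the same conversion in code; the function name fourCCString is my own, and it simply unpacks the big-endian bytes of the OSType into ASCII characters:

#import <Foundation/Foundation.h>
#import <CoreVideo/CoreVideo.h>

// Unpacks an OSType (FourCharCode) into its four ASCII characters,
// e.g. 875704438 -> @"420v", 1111970369 -> @"BGRA".
static NSString *fourCCString(OSType type)
{
    char code[5] = {
        (char)((type >> 24) & 0xFF),
        (char)((type >> 16) & 0xFF),
        (char)((type >> 8) & 0xFF),
        (char)(type & 0xFF),
        '\0'
    };
    return [NSString stringWithCString:code encoding:NSASCIIStringEncoding];
}

// Example with the values from the question:
// fourCCString(875704438)  -> @"420v"
// fourCCString(875704422)  -> @"420f"
// fourCCString(1111970369) -> @"BGRA"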
Source: https://stackoverflow.com/questions/14537897/getting-actual-nsstring-of-avcapturevideodataoutput-availablevideocvpixelformatt