Is there a way to use the iPhone proximity sensor to detect whether the phone is in a room with no light?
This question seems to imply that this is not possible with the proximity sensor...
Here's a much simpler way of using the camera to find out how bright a scene is. (Obviously, it only reads the data that can be "seen" in the camera's field of view, so it's not a true ambient light sensor...)
Using the AVFoundation framework, set up a video input and then, using the ImageIO framework, read the metadata that's coming in with each frame of the video feed (you can ignore the actual video data):
#import <ImageIO/ImageIO.h>

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Copy the metadata attached to this video frame.
    CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL,
        sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    NSDictionary *metadata = [[NSMutableDictionary alloc]
        initWithDictionary:(__bridge NSDictionary *)metadataDict];
    CFRelease(metadataDict);

    // Pull the EXIF dictionary out of the frame metadata.
    NSDictionary *exifMetadata = [[metadata
        objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];

    // Lower values mean a darker scene.
    float brightnessValue = [[exifMetadata
        objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue] floatValue];
}
You now have the Brightness Value for the scene, updated (typically, though you can configure this) 15-30 times per second. Lower numbers are darker.
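As a rough guide to interpreting those readings: the EXIF BrightnessValue is an APEX value, where negative numbers indicate very dark scenes. A small helper can bucket the value into categories; the thresholds below are illustrative assumptions, not Apple-documented cutoffs, so calibrate them against your own device:

```swift
// Classify an EXIF BrightnessValue (APEX units) into a rough lighting bucket.
// The thresholds are illustrative assumptions - tune them for your use case.
enum SceneLight {
    case dark, dim, bright
}

func classifyScene(brightnessValue: Double) -> SceneLight {
    switch brightnessValue {
    case ..<0:       // negative APEX values: very little light reaching the sensor
        return .dark
    case 0..<4:      // assumed range for indoor or dim lighting
        return .dim
    default:         // assumed: daylight or well-lit scenes
        return .bright
    }
}
```

For example, `classifyScene(brightnessValue: -1.5)` returns `.dark`.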
There is a much simpler solution for this if anyone needs it: use the screen brightness to detect the light conditions:
0 - 0.3 (Dark)
0.4 - 1 (Bright)
Tweak as needed:
switch UIScreen.main.brightness {
case 0 ... 0.3:
    print("LOW LIGHT")
default:
    print("ENOUGH LIGHT")
}
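One caveat: `UIScreen.main.brightness` only tracks the ambient light when the user has auto-brightness enabled; otherwise it just reflects the manual slider setting. Factoring the threshold into a small function (a sketch using the 0.3 cutoff from above, which is a tunable assumption) makes it easy to test and adjust:

```swift
// Returns true when the screen brightness (0.0 ... 1.0) suggests low light.
// The 0.3 cutoff comes from the answer above; treat it as a tunable assumption.
func isLowLight(screenBrightness: Double) -> Bool {
    return screenBrightness <= 0.3
}
```

On device you would call it with `Double(UIScreen.main.brightness)`.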
Although it is possible to access the ambient light sensor data through the IOKit framework, Apple discourages developers from using it, and any app that does will be rejected from the App Store.
But it is possible to approximately deduce the luminosity of the environment through the camera: implement the camera via the AVFoundation framework and process the metadata coming through each camera frame. Refer to this answer on the question: How to get light value from AVFoundation
The proximity sensor is not what you should be looking for; the ambient light sensor is. Apparently that API is undocumented, or not available to developers at all. An alternative way of detecting whether the iPhone is in a dark room is to use the camera and obtain the luminosity. Here's a good guide on how to do that:
https://www.transpire.com/insights/blog/obtaining-luminosity-ios-camera/
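The approach in that guide boils down to averaging the luminance of the camera's pixel data. As an illustrative sketch (using the standard Rec. 601 luma weights; the step of extracting pixels from the camera's `CVPixelBuffer` is omitted):

```swift
// Average relative luminance of RGB pixels (components in 0.0 ... 1.0),
// using the Rec. 601 luma weights. Pixel extraction from the camera's
// CVPixelBuffer is omitted; this only shows the averaging step.
func averageLuminance(pixels: [(r: Double, g: Double, b: Double)]) -> Double {
    guard !pixels.isEmpty else { return 0 }
    let total = pixels.reduce(0.0) { sum, p in
        sum + (0.299 * p.r + 0.587 * p.g + 0.114 * p.b)
    }
    return total / Double(pixels.count)
}
```

A value near 0 means an almost black frame, which in practice suggests a dark room.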
Here's a Swift 4.2 version based upon Wildaker's code. Xcode 10 refused to allow the result to be a Float, but Double works.
func getBrightness(sampleBuffer: CMSampleBuffer) -> Double {
    // Grab the metadata attached to this frame (may be nil).
    guard let rawMetadata = CMCopyDictionaryOfAttachments(
        allocator: nil,
        target: sampleBuffer,
        attachmentMode: CMAttachmentMode(kCMAttachmentMode_ShouldPropagate)) else {
        return 0
    }
    let metadata = rawMetadata as NSDictionary
    // "{Exif}" is the string behind kCGImagePropertyExifDictionary.
    let exifData = metadata[kCGImagePropertyExifDictionary as String] as? NSDictionary
    // Avoid the force unwrap: fall back to 0 if the value is missing.
    return exifData?[kCGImagePropertyExifBrightnessValue as String] as? Double ?? 0
}
If you're doing some Augmented Reality stuff using ARKit, you can get the lightEstimate value on each frame of the video feed from the ARSession. See Apple's documentation on this.
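For reference, `ARLightEstimate.ambientIntensity` is reported in lumens, with about 1000 corresponding to a neutral, well-lit scene. A small helper can turn that into a dark-room check; the 200-lumen cutoff below is an assumption to calibrate, not a documented threshold:

```swift
// ARKit reports ARLightEstimate.ambientIntensity in lumens; about 1000
// corresponds to a well-lit (neutral) scene. The 200-lumen dark cutoff
// below is an assumption - calibrate it for your app.
func isDarkRoom(ambientIntensity: Double) -> Bool {
    return ambientIntensity < 200
}
```

On device you would feed it the estimate from the current frame, e.g. `session.currentFrame?.lightEstimate?.ambientIntensity`.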