Detecting if headphones are plugged into iPhone

Submitted by 怎甘沉沦 on 2019-11-27 19:54:57

http://developer.apple.com/iphone/library/samplecode/SpeakHere/Introduction/Intro.html

In this project there is a code snippet that pauses recording when the headphones are unplugged. Maybe you can use it to achieve your result.

Good luck!

(edit)

You will have to study the SpeakHereController.mm file.
I found this code in the awakeFromNib method

// we do not want to allow recording if input is not available
error = AudioSessionGetProperty(kAudioSessionProperty_AudioInputAvailable, &size, &inputAvailable);
if (error) printf("ERROR GETTING INPUT AVAILABILITY! %d\n", error);
btn_record.enabled = (inputAvailable) ? YES : NO;

// we also need to listen to see if input availability changes
error = AudioSessionAddPropertyListener(kAudioSessionProperty_AudioInputAvailable, propListener, self);
if (error) printf("ERROR ADDING AUDIO SESSION PROP LISTENER! %d\n", error);

With this code you can detect the changes between:

  • MicrophoneWired
  • Headphone
  • LineOut
  • Speaker

Detecting when an iOS Device connector was plugged/unplugged
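
The linked sample registers a listener for kAudioSessionProperty_AudioRouteChange and reacts in a callback. Below is a minimal sketch of that pattern, assuming the callback name audioRouteChangeListenerCallback mentioned in the note that follows; the exact handling here is illustrative, not the linked code verbatim.

#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>

// The callback receives a dictionary describing the route change.
void audioRouteChangeListenerCallback(void *inClientData, AudioSessionPropertyID inID, UInt32 inDataSize, const void *inData)
{
    if (inID != kAudioSessionProperty_AudioRouteChange) return;

    CFDictionaryRef routeChangeDictionary = (CFDictionaryRef)inData;

    // Why the route changed:
    // kAudioSessionRouteChangeReason_NewDeviceAvailable  -> e.g. headphones plugged in
    // kAudioSessionRouteChangeReason_OldDeviceUnavailable -> e.g. headphones unplugged
    CFNumberRef reasonRef = CFDictionaryGetValue(routeChangeDictionary, CFSTR(kAudioSession_AudioRouteChangeKey_Reason));
    SInt32 reason = 0;
    CFNumberGetValue(reasonRef, kCFNumberSInt32Type, &reason);

    if (reason == kAudioSessionRouteChangeReason_OldDeviceUnavailable) {
        NSLog(@"Output device (e.g. headphones) was unplugged");
    } else if (reason == kAudioSessionRouteChangeReason_NewDeviceAvailable) {
        NSLog(@"New output device (e.g. headphones) was plugged in");
    }
}

// During session setup (e.g. in awakeFromNib), register alongside the
// input-availability listener shown above:
error = AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange, audioRouteChangeListenerCallback, self);
if (error) printf("ERROR ADDING ROUTE CHANGE LISTENER! %d\n", (int)error);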

Note: Since iOS 5, part of the audioRouteChangeListenerCallback(...) behavior is deprecated, but you can update it with:

// kAudioSession_AudioRouteChangeKey_PreviousRouteDescription -> Previous route
// kAudioSession_AudioRouteChangeKey_CurrentRouteDescription -> Current route

CFDictionaryRef newRouteRef = CFDictionaryGetValue(routeChangeDictionary, kAudioSession_AudioRouteChangeKey_CurrentRouteDescription);
NSDictionary *newRouteDict = (NSDictionary *)newRouteRef;

// RouteDetailedDescription_Outputs -> Output
// RouteDetailedDescription_Outputs -> Input

NSArray *paths = [[newRouteDict objectForKey:@"RouteDetailedDescription_Outputs"] count]
    ? [newRouteDict objectForKey:@"RouteDetailedDescription_Outputs"]
    : [newRouteDict objectForKey:@"RouteDetailedDescription_Inputs"];

NSString * newRouteString = [[paths objectAtIndex: 0] objectForKey: @"RouteDetailedDescription_PortType"];

// newRouteString -> MicrophoneWired, Speaker, LineOut, Headphone
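
With newRouteString in hand, a simple string comparison against the port names listed above tells you whether headphones are the current route (a small sketch, not part of the original snippet):

BOOL headphonesPluggedIn = [newRouteString isEqualToString:@"Headphone"];
if (headphonesPluggedIn) {
    NSLog(@"Headphones are plugged in");
}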

Greetings

Nilesh Kikani

Here is a solution that may be helpful to you.

Before using the method below, add these two lines as well:

UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_None;
AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);

- (void)isHeadsetPluggedIn {
    UInt32 routeSize = sizeof (CFStringRef);
    CFStringRef route;
    AudioSessionGetProperty (kAudioSessionProperty_AudioRoute, &routeSize, &route);

    //NSLog(@"Error >>>>>>>>>> :%@", error); 
    /* Known values of route:
    "Headset"
    "Headphone"
    "Speaker"
    "SpeakerAndMicrophone"
    "HeadphonesAndMicrophone"
    "HeadsetInOut"
    "ReceiverAndMicrophone"
    "Lineout" */

    NSString *routeStr = (NSString *)route; // use (__bridge NSString *)route under ARC

    NSRange headsetRange = [routeStr rangeOfString:@"Headset"];
    NSRange receiverRange = [routeStr rangeOfString:@"Receiver"];

    if(headsetRange.location != NSNotFound) {
        // Don't change the route if the headset is plugged in. 
        NSLog(@"headphone is plugged in "); 
    } else if (receiverRange.location != NSNotFound) {
        // Change to play on the speaker
        NSLog(@"play on the speaker");
    } else {
        NSLog(@"Unknown audio route.");
    }
}
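
A minimal usage sketch, assuming the audio session has been initialised and activated first; placing the call in viewDidLoad is just an example, not part of the original answer:

- (void)viewDidLoad {
    [super viewDidLoad];

    // Initialise and activate the audio session before querying the route.
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    AudioSessionSetActive(true);

    // The two lines mentioned above.
    UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_None;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);

    [self isHeadsetPluggedIn];
}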

To perform a one-off check to determine whether headphones are plugged in (rather than registering a callback for when they're unplugged), I found the following works in iOS 5 and above:

- (BOOL) isAudioJackPlugged
{

// initialise the audio session - this should only be done once - so move this line to your AppDelegate
AudioSessionInitialize(NULL, NULL, NULL, NULL);
UInt32 routeSize;

// oddly, omitting this call caused an error.
AudioSessionGetPropertySize(kAudioSessionProperty_AudioRouteDescription, &routeSize);
CFDictionaryRef desc; // this is the dictionary to contain descriptions

// make the call to get the audio description and populate the desc dictionary
AudioSessionGetProperty (kAudioSessionProperty_AudioRouteDescription, &routeSize, &desc);

// the dictionary contains 2 keys, for input and output. Get output array
CFArrayRef outputs = CFDictionaryGetValue(desc, kAudioSession_AudioRouteKey_Outputs);

// the output array contains 1 element - a dictionary
CFDictionaryRef dict = CFArrayGetValueAtIndex(outputs, 0);

// get the output description from the dictionary
CFStringRef output = CFDictionaryGetValue(dict, kAudioSession_AudioRouteKey_Type);

/**
 These are the possible output types:
 kAudioSessionOutputRoute_LineOut
 kAudioSessionOutputRoute_Headphones
 kAudioSessionOutputRoute_BluetoothHFP
 kAudioSessionOutputRoute_BluetoothA2DP
 kAudioSessionOutputRoute_BuiltInReceiver
 kAudioSessionOutputRoute_BuiltInSpeaker
 kAudioSessionOutputRoute_USBAudio
 kAudioSessionOutputRoute_HDMI
 kAudioSessionOutputRoute_AirPlay
 */

return CFStringCompare(output, kAudioSessionOutputRoute_Headphones, 0) == kCFCompareEqualTo;
}

For those keeping score at home, that's a string in a dictionary in an array in a dictionary.
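
Since the whole C AudioSession API was eventually deprecated (in iOS 7), roughly the same one-off check can be written against AVAudioSession (available since iOS 6). This is a minimal sketch, not part of the original answer; the method name isAudioJackPluggedAV is made up here:

#import <AVFoundation/AVFoundation.h>

// Sketch: report YES if any current output port is wired headphones.
- (BOOL)isAudioJackPluggedAV
{
    AVAudioSessionRouteDescription *route = [[AVAudioSession sharedInstance] currentRoute];
    for (AVAudioSessionPortDescription *output in route.outputs) {
        if ([output.portType isEqualToString:AVAudioSessionPortHeadphones]) {
            return YES;
        }
    }
    return NO;
}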
