Play audio through upper (phone call) speaker

Asked 2020-12-09 00:06

I'm trying to get audio in my app to play through the upper speaker on the iPhone, the one you press to your ear during a phone call. I know it's possible, because I've

4 Answers
  • 2020-12-09 00:15

    You have to initialise your audio session first.

    Using the C API

      AudioSessionInitialize (NULL, NULL, NULL, NULL);
    

    In iOS 6 you can use AVAudioSession methods instead (you will need to import the AVFoundation framework to use AVAudioSession):

    Initialization using AVAudioSession

     self.audioSession = [AVAudioSession sharedInstance];
    

    Setting the audioSession category using AVAudioSession

     [self.audioSession setCategory:AVAudioSessionCategoryPlayAndRecord
                                error:nil];
    

    For further research, if you want better search terms, here are the full names of the constants for the speakers:

    const CFStringRef kAudioSessionOutputRoute_BuiltInReceiver;
    const CFStringRef kAudioSessionOutputRoute_BuiltInSpeaker;
    

    See Apple's docs here.

    But the real mystery is why you are having any trouble routing to the receiver. It's the default behaviour for the playAndRecord category. Apple's documentation of kAudioSessionOverrideAudioRoute_None:

    "Specifies, for the kAudioSessionCategory_PlayAndRecord category, that output audio should go to the receiver. This is the default output audio route for this category."

    Update

    In your updated question you reveal that you are using the MPMusicPlayerController class. This class invokes the global music player (the same player used in the Music app). This music player is separate from your app, and so does not share your app's audioSession. Any properties you set on your app's audioSession will be ignored by the MPMusicPlayerController.

    If you want control over your app's audio behaviour, you need to use an audio framework internal to your app. This would be AVAudioRecorder / AVAudioPlayer or Core Audio (Audio Queues, Audio Units or OpenAL). Whichever method you use, the audio session can be controlled either via AVAudioSession properties or via the Core Audio API. Core Audio gives you more fine-grained control, but with each new release of iOS more of it is ported over to AVFoundation, so start with that.
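
    As a hedged sketch of that advice in modern Swift ("clip.m4a" is a hypothetical bundled file): play through AVAudioPlayer under your app's own session, and the playAndRecord default sends the output to the receiver.

        import AVFoundation

        // Sketch: in-app playback under the app's own audio session.
        // With playAndRecord and no route override, output defaults to the receiver.
        func playThroughReceiver() throws -> AVAudioPlayer {
            let session = AVAudioSession.sharedInstance()
            try session.setCategory(.playAndRecord, mode: .default, options: [])
            try session.setActive(true)
            let url = Bundle.main.url(forResource: "clip", withExtension: "m4a")! // hypothetical file
            let player = try AVAudioPlayer(contentsOf: url)
            player.play()
            return player // keep a strong reference, or playback stops on deallocation
        }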

    Also remember that the audio session provides a way for you to describe the intended behaviour of your app's audio in relation to the total iOS environment, but it will not hand you total control. Apple takes care to ensure that the user's expectations of their device's audio behaviour remain consistent between apps, and when one app needs to interrupt another's audio stream.

    Update 2

    In your edit you allude to the possibility of audio sessions checking other apps' audio session settings. That does not happen [1]. The idea is that each app sets its preferences for its own audio behaviour using its self-contained audio session. The operating system arbitrates between conflicting audio requirements when more than one app competes for an unshareable resource, such as the internal microphone or one of the speakers, and will usually decide in favour of the behaviour most likely to meet the user's expectations of the device as a whole.

    The MPMusicPlayerController class is slightly unusual in that it gives one app some degree of control over another. In this case, your app is not playing the audio; it is sending a request to the Music Player to play audio on your behalf. Your control is limited by the extent of the MPMusicPlayerController API. For more control, your app will have to provide its own implementation of audio playback.
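
    To make that request model concrete, here is a hedged sketch using today's API names (at the time this answer was written the equivalent players were iPodMusicPlayer and applicationMusicPlayer):

        import MediaPlayer

        // Sketch: the app only *asks* the system player to play. The audio is
        // rendered in the Music app's session, so this app's AVAudioSession
        // settings (category, route overrides) do not apply to it.
        // Requires media-library permission (NSAppleMusicUsageDescription).
        let musicPlayer = MPMusicPlayerController.systemMusicPlayer
        musicPlayer.setQueue(with: MPMediaQuery.songs())
        musicPlayer.play()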

    In your comment you wonder:

    Could there be a way to pull an MPMediaItem from the MPMusicPlayerController and then play them through the app-specific audio session, or anything like that?

    That's a (big) subject for a new question. Here is a good starting read (from Chris Adamson's blog): From iPod Library to PCM Samples in Far Fewer Steps Than Were Previously Necessary - it's the sequel to From iPhone Media Library to PCM Samples in Dozens of Confounding and Potentially Lossy Steps - and it should give you a sense of the complexity you will face. This may have got easier since iOS 6, but I wouldn't be so sure!
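
    If you do attempt it, note that MPMediaItem exposes an assetURL property (nil for DRM-protected or cloud-only items) which AVPlayer can often play directly, inside your app's own audio session. A hedged, best-effort sketch:

        import MediaPlayer
        import AVFoundation

        // Sketch: play the first library song in-app, under this app's session.
        // Requires media-library authorization; assetURL is nil for protected items.
        func playFirstLibrarySong() -> AVPlayer? {
            guard let item = MPMediaQuery.songs().items?.first,
                  let url = item.assetURL else { return nil }
            let player = AVPlayer(url: url)
            player.play()
            return player // caller must retain the player
        }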


    [1] There is an otherAudioPlaying read-only BOOL property in iOS 6, but that's about it.

  • 2020-12-09 00:16

    Swift 5.0

    // Requires: import UIKit, import AVFoundation
    func activateProximitySensor(isOn: Bool) {
        let device = UIDevice.current
        device.isProximityMonitoringEnabled = isOn
        if isOn {
            NotificationCenter.default.addObserver(self, selector: #selector(proximityStateDidChange), name: UIDevice.proximityStateDidChangeNotification, object: device)
            let session = AVAudioSession.sharedInstance()
            do {
                // setCategory(_:) with no mode is unavailable in Swift, so pass a mode as well
                try session.setCategory(.playAndRecord, mode: .default, options: [])
                try session.setActive(true)
                try session.overrideOutputAudioPort(.speaker)
            } catch {
                print("\(#file) - \(#function) error: \(error.localizedDescription)")
            }
        } else {
            NotificationCenter.default.removeObserver(self, name: UIDevice.proximityStateDidChangeNotification, object: device)
        }
    }
    
    @objc func proximityStateDidChange(notification: NSNotification) {
        guard let device = notification.object as? UIDevice else { return }
        print(device) // debug: log the proximity state change
        let session = AVAudioSession.sharedInstance()
        do {
            let routePort = session.currentRoute.outputs.first
            // Compare against the typed constant rather than the raw string "Receiver"
            if routePort?.portType == .builtInReceiver {
                try session.overrideOutputAudioPort(.speaker)
            } else {
                try session.overrideOutputAudioPort(.none)
            }
        } catch {
            print("\(#file) - \(#function) error: \(error.localizedDescription)")
        }
    }
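
    A typical call site, assuming these methods live in a UIViewController subclass (a sketch):

        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)
            activateProximitySensor(isOn: true) // proximity changes now toggle the route
        }

        override func viewWillDisappear(_ animated: Bool) {
            super.viewWillDisappear(animated)
            activateProximitySensor(isOn: false)
        }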
    
  • 2020-12-09 00:19

    I was struggling with this for a while too; maybe this will help someone later. You can use the newer methods of overriding ports - many of the methods in your sample code are actually deprecated.

    So, get your audio session's sharedInstance and set it up:

    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    [session setActive:YES error:nil];
    

    The session category has to be AVAudioSessionCategoryPlayAndRecord. You can get the current output by checking this value:

    AVAudioSessionPortDescription *routePort = session.currentRoute.outputs.firstObject;
    NSString *portType = routePort.portType;
    

    And now depending on the port you want to send it to, simply toggle the output using

    if ([portType isEqualToString:AVAudioSessionPortBuiltInReceiver]) {
        [session overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&error];
    } else {
        [session overrideOutputAudioPort:AVAudioSessionPortOverrideNone error:&error];
    }
    

    This should be a quick way to toggle the output between the speakerphone and the receiver.

  • 2020-12-09 00:19

    Swift 3.0 code (a CallKit CXProviderDelegate callback)

    // Requires: import CallKit, import AVFoundation
    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        let routePort: AVAudioSessionPortDescription? = audioSession.currentRoute.outputs.first
        let portType: String? = routePort?.portType
        if portType == "Receiver" {
            try? audioSession.overrideOutputAudioPort(.speaker)
        } else {
            try? audioSession.overrideOutputAudioPort(.none)
        }
    }
    