WebRTC + iOS + FreeSWITCH: Can't hear audio

醉酒成梦 2021-01-31 22:47

I'm trying to implement mod_verto on iOS (calling from iPhone to Desktop). I'm using Google's libjingle library for the RTC side, got it up and running using this excellent t

1 Answer
  • 2021-01-31 23:30

    Update

    It is very difficult to tell what is going wrong with a WebRTC implementation unless you can check all three parts: the media server, the web client and, in our case, the iOS client.

    Your case is an audio call, so no video track needs to be included in your localStream, but if you look closely you will see that you are actually adding a videoTrack to your mobile stream:

        if (mediaOptions?["video"] as? Bool ?? true) {
            for captureDevice in AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo) {
                if (captureDevice.position == mediaOptions?["devicePosition"] as? AVCaptureDevicePosition ?? AVCaptureDevicePosition.Front) {
                    cameraID = captureDevice.localizedName
                    break
                }
            }

            if (cameraID == nil) {
                reject(NSError(domain: "No camera detected", code: 0, userInfo: nil))
                return
            }

            let capturer = RTCVideoCapturer.init(deviceName: cameraID)

            let videoSource = self.factory.videoSourceWithCapturer(capturer, constraints: mediaOptions?["constraints"] as? RTCMediaConstraints)

            if let localVideoTrack = self.factory.videoTrackWithID("Local-Video", source: videoSource) {
                //!!!! THIS IS THE PROBLEMATIC LINE !!!!
                self.localMediaStream?.addVideoTrack(localVideoTrack)
            } else {
                reject(NSError(domain: "No video track", code: 0, userInfo: nil))
                return
            }
        }
    

    So the line causing the trouble is `self.localMediaStream?.addVideoTrack(localVideoTrack)`, because it attaches a video track to the localStream.

    Opinions

    We may face different kinds of trouble, as I mentioned; here are some observations based on my experience when we built a similar system:

    1. Your media server may not be able to route and handle your call successfully, because attaching a video track adds extra lines to the session description (check the SDP you actually send), and the server may simply refuse to create the call.
    2. Even if your media server handles that scenario, both clients (desktop and mobile) must implement its signaling protocol correctly.
    3. Once you pass those tests and are adding both video and audio, you initiate the localStream from mobile, and the same needs to be done on the other side. You then need to handle the events for adding and removing streams, and the rest of the signaling, over WebSockets.
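
    For illustration only (a hand-written sketch, not your actual session description): an audio-only offer carries a single `m=audio` media section, while attaching a video track adds an `m=video` section that the server must also be able to negotiate:

    ```
    v=0
    o=- 0 2 IN IP4 127.0.0.1
    s=-
    t=0 0
    m=audio 9 UDP/TLS/RTP/SAVPF 111
    a=rtpmap:111 opus/48000/2
    m=video 9 UDP/TLS/RTP/SAVPF 96    <-- only present once a video track is added
    a=rtpmap:96 VP8/90000
    ```

    If the server only expects the audio section, the extra `m=video` block is a plausible reason for it to reject or mishandle the call.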

    Solution in this case

    Remove the part that adds the local video track to localStream; then, even if you still see errors, they are not caused by creating your localStream, so this step is ruled out.
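
    As a minimal sketch of the audio-only replacement (assuming the same legacy libjingle Swift API and the `factory`, `localMediaStream` and `reject` names from your snippet; the stream and track IDs mirror the ones in my helper below):

    ```swift
    // Audio-only local stream: no video track is attached, so the
    // offer stays audio-only. `factory`, `localMediaStream` and `reject`
    // are assumed from the snippet above, not defined here.
    self.localMediaStream = self.factory.mediaStreamWithLabel("ARDAMS")

    if let localAudioTrack = self.factory.audioTrackWithID("ARDAMSa0") {
        self.localMediaStream?.addAudioTrack(localAudioTrack)
    } else {
        reject(NSError(domain: "No audio track", code: 0, userInfo: nil))
    }
    ```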


    Original Answer

    Here is a working version of mine, adapted to your needs since you use only audio.

    Creating and setting up the peerConnection (localSide)

    // Connecting to the socket
        .........
    
    // Create PeerConnectionFactory
    self.peerConnectionFactory = [[RTCPeerConnectionFactory alloc] init];
    
    RTCMediaConstraints *constraints = [self defaultPeerConnectionConstraints];
    
    // Initialize peerConnection based on a list of ICE Servers
    self.peerConnection = [self.peerConnectionFactory peerConnectionWithICEServers:[self getICEServers] constraints:constraints delegate:self];
    
    // Create the localStream which contains the audioTrack
    RTCMediaStream *localStream = [self createLocalMediaStream];
    
    // Add this stream to the peerConnection
    [self.peerConnection addStream:localStream];
    
    // Please be aware that I am using blocks here, as I created a wrapper for easier maintenance, but you can use createOfferWithDelegate: which will call back into your delegate
    NSLog(@"Creating peer offer");
    RTCManager *strongSelf = self;
    [strongSelf.peerConnection createOfferWithCallback:^(RTCSessionDescription *sdp, NSError *error) {
        if (!error) {
            dispatch_async(dispatch_get_main_queue(), ^{
                NSLog(@"Success at creating offer, now setting local description");
                [strongSelf.peerConnection setLocalDescriptionWithCallback:^(NSError *error) {
                    if (!error) {
                        dispatch_async(dispatch_get_main_queue(), ^{
                            NSLog(@"Success at setting local description");
                            // At this point my signaling considers the call connected, but yours depends on what your signaling protocol requires
                        });
                    }
    
                } sessionDescription:sdp];
            });
        }
    } constraints:[strongSelf defaultPeerConnectionConstraints]];
    

    Helpers

    // Now here we create the stream which contains the audio (Please note the ID)
    - (RTCMediaStream *)createLocalMediaStream {
        RTCMediaStream *localStream = [self.peerConnectionFactory mediaStreamWithLabel:@"ARDAMS"];
        [localStream addAudioTrack:[self.peerConnectionFactory audioTrackWithID:@"ARDAMSa0"]];
    
        return localStream;
    }
    
    - (RTCMediaConstraints *)defaultPeerConnectionConstraints {
        // DtlsSrtpKeyAgreement is required for Chrome and Firefox to interoperate.
        NSArray *optionalConstraints = @[[[RTCPair alloc] initWithKey:@"DtlsSrtpKeyAgreement" value:@"true"]];
    
        RTCMediaConstraints *constraints = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:nil optionalConstraints:optionalConstraints];
        return constraints;
    }
    

    Please also be aware that your problem might be caused by not making these calls on the main thread.
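
    For example (Swift 2-era GCD to match your snippet; `didAddRemoteStream(_:)` is a hypothetical handler of yours, not from the original code), delegate callbacks from the WebRTC stack usually arrive on a background thread, so hop to the main queue before touching UI or call state:

    ```swift
    // WebRTC delegate callbacks may fire on an internal signaling thread.
    // Dispatch onto the main queue before updating UI or shared call state.
    func peerConnection(peerConnection: RTCPeerConnection!, addedStream stream: RTCMediaStream!) {
        dispatch_async(dispatch_get_main_queue()) {
            self.didAddRemoteStream(stream)
        }
    }
    ```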
