Question
I would like to implement this AntMedia iOS and Android native interface for Codename One:
import com.codename1.system.NativeInterface;
import com.codename1.ui.PeerComponent;
/**
* @deprecated Native Interface, deprecated because you normally should use the
* public API in the AntMediaClient class.
*/
public interface AntMediaNative extends NativeInterface {
/**
* Initializes the connection.
*
* @param serverURL the WebSocket URL to connect to (wss://...)
* @param streamId the stream id on the server to process
* @param mode true means MODE_PUBLISH, false means MODE_PLAY
* @param tokenId a one-time token string
* @return PeerComponent
*/
public PeerComponent createPeer(String serverURL, String streamId, boolean mode, String tokenId);
/**
* Starts the streaming according to the mode.
*/
public void startStream();
/**
* Stops the streaming.
*/
public void stopStream();
/**
* Switches the cameras.
*/
public void switchCamera();
/**
* Toggles the microphone.
*
* @return the current microphone status.
*/
public boolean toggleMic();
/**
* Stops the video source.
*/
public void stopVideoSource();
/**
* Starts or restarts the video source.
*/
public void startVideoSource();
/**
* Gets the error.
*
* @return the error, or null if there is none.
*/
public String getError();
/**
* Camera open order. By default the front camera is attempted first; if this
* is set to false, a camera other than the front camera will be opened.
*
* @param openFrontCamera if true, the front camera will be tried; if false,
* a camera other than the front camera will be tried
*/
public void setOpenFrontCamera(boolean openFrontCamera);
}
I need help on two specific issues.
The first problem is how to get the PeerComponent in which to view the live stream. I don't understand what I have to do in the native Android and iOS code in this case. Could you answer with example code for iOS and Android that returns a PeerComponent? Below are the links to the SDK documentation; I hope they are enough to answer this question.
The second problem is that the SDK for iOS is written in Swift: how do I call the Swift code from a native interface that must be written in Objective-C? Could you answer me with a code example here too?
Thank you for your support.
This is the documentation of the two SDKs:
Android SDK documentation: https://github.com/ant-media/Ant-Media-Server/wiki/WebRTC-Android-SDK-Documentation
iOS SDK documentation: https://github.com/ant-media/Ant-Media-Server/wiki/WebRTC-iOS-SDK-Documentation
Answer 1:
When you use the Generate Native Interface tool in the IDE, it generates matching native code: an implementation stub with native OS methods for each operating system. In the case of Android, the createPeer method will return a View.
So for this case you would need to create an instance of org.webrtc.SurfaceViewRenderer, store it in the class (for the follow-up calls after init), and then return it from the createPeer method.
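Below is a minimal sketch of what that Android implementation class could look like. The package name and the class name AntMediaNativeImpl are assumptions (the Generate Native Interface tool normally produces an Impl class named after the interface); wiring the AntMedia SDK itself to the renderer is omitted and only indicated in comments.
package com.mycompany.myapp; // hypothetical package, assumed for illustration

import android.view.View;
import com.codename1.impl.android.AndroidNativeUtil;
import org.webrtc.SurfaceViewRenderer;

public class AntMediaNativeImpl {

    // Keep the renderer in a field so the follow-up calls
    // (startStream(), stopStream(), switchCamera(), ...) can reuse it.
    private SurfaceViewRenderer renderer;

    public View createPeer(String serverURL, String streamId, boolean mode, String tokenId) {
        // Create the WebRTC rendering surface; on the Codename One side the
        // returned View is wrapped in a PeerComponent for you.
        renderer = new SurfaceViewRenderer(AndroidNativeUtil.getActivity());
        // ...initialize the AntMedia client here with serverURL, streamId,
        // mode (publish/play) and tokenId, and point its video renderer at "renderer".
        return renderer;
    }

    public void startStream() {
        // Start publishing/playing using the client and renderer stored above.
    }

    // The remaining interface methods (stopStream, switchCamera, toggleMic, ...)
    // are omitted from this sketch.

    public boolean isSupported() {
        return true;
    }
}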
Source: https://stackoverflow.com/questions/62738263/antmedia-native-interface-issues