Question
I have a client iOS application that uses GCDWebServer to serve images and videos stored in the NSSearchPathDirectory.DocumentDirectory folder of the app on my device.
Upon app startup, I start an instance of GCDWebServer and add a file response handler for my requests:
self.addDefaultHandlerForMethod("GET", requestClass: GCDWebServerFileRequest.self) { request in
    return GCDWebServerFileResponse(file: self.filePathForResponse(request.URL))
}
I can verify that while the app is running I can successfully download files from my device:
curl -O http://192.168.0.15:8080/files/IMG_1213-1280x720.MOV
My app talks to a Chromecast device by sending URLs like the one above to a media channel, and the Chromecast receiver app loads and plays the videos at the specified URLs - so far all good.
My problem is that I want to implement seeking support for the currently playing video, and as soon as I send a seek request to the media channel, I get a "Broken pipe" error from GCDWebServer and the video playback is interrupted. The log from the server is as follows:
....
[DEBUG] Connection sent 32768 bytes on socket 24
[DEBUG] Connection sent 32768 bytes on socket 24
[ERROR] Error while writing to socket 24: Broken pipe (32)
[DEBUG] Did close connection on socket 24
My best understanding of the problem is that normal playback works because it is the same as downloading a file from beginning to end, and this can be served with a regular GCDWebServerFileResponse; however, seeking is equivalent to 'jumping' to a different part of the file, and I'm not sure that reading a file like this works with my configuration.
- Is there a way that I can configure GCDWebServer to make this work? I know that the problem can be solved because there are several live apps that do this.
- Do I need to use a server that supports other protocols such as HLS or RTSP?
- Do I need to encode my video files in a particular manner?
For reference I have also tried another HTTP server called Swifter but I encountered the same problem.
Answer 1:
iOS' AVPlayer first requests the file size and then asks the server for chunks of data, including the desired range in each request. In this answer, the network activity of AVPlayer is shown, and the response status code is 206, i.e. Partial Content.
So we need to respond with only the desired chunk of data:
webServer?.addDefaultHandlerForMethod("GET", requestClass: GCDWebServerRequest.self, asyncProcessBlock: { (request, completionBlock) in
    // 'path' is the path of the file to serve; byteRange limits the response to the requested bytes
    let response = GCDWebServerFileResponse(file: path, byteRange: request.byteRange)
    completionBlock(response)
})
Please notice that it is important to check whether request.byteRange is defined by invoking request.hasByteRange().
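For completeness, here is a minimal sketch of a handler that folds in that check, assuming the questioner's own filePathForResponse helper for mapping request URLs to file paths:
self.addDefaultHandlerForMethod("GET", requestClass: GCDWebServerRequest.self, asyncProcessBlock: { (request, completionBlock) in
    // filePathForResponse is the questioner's URL-to-file lookup (assumed here)
    let path = self.filePathForResponse(request.URL)
    if request.hasByteRange() {
        // Range header present: serve only the requested bytes (GCDWebServer answers with 206 Partial Content)
        completionBlock(GCDWebServerFileResponse(file: path, byteRange: request.byteRange))
    } else {
        // No Range header: serve the whole file, as in the original handler
        completionBlock(GCDWebServerFileResponse(file: path))
    }
})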
Using the code I've provided, I was able to play a video with AVPlayer or a web browser, and scrubbing / seeking to a time worked perfectly.
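As a quick sanity check from outside the app, a ranged request against the example URL from the question (IP and filename taken from the question above) should now come back with 206 Partial Content rather than 200:
curl -i -H "Range: bytes=0-1023" http://192.168.0.15:8080/files/IMG_1213-1280x720.MOV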
Source: https://stackoverflow.com/questions/36651451/how-to-implement-video-seek-support-with-an-embedded-http-server-on-ios