Question
I have a requirement to stream from a server and display the streamed content on the screen. The streaming itself works fine using NSStream (NSInputStream and NSOutputStream). How can I display it on the screen?
The stream URL looks like @"http://191.168.143.41:1212/".
if (stream == inputStream) {
    uint8_t buf[1024];
    NSInteger len = [inputStream read:buf maxLength:1024];
    if (len > 0) {
        NSMutableData *data = [[NSMutableData alloc] init];
        [data appendBytes:(const void *)buf length:len];
        NSString *s = [[NSString alloc] initWithData:data encoding:NSASCIIStringEncoding];
        [self readIn:s];
        NSLog(@"ss%@", s);
        [self loadMovie:s]; // method for movie player
    }
}
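For context, a minimal sketch of how such a stream pair is typically opened so that the delegate callback above fires. The host and port are taken from the URL above; everything else (the method name, the inputStream/outputStream ivars) is an assumption for illustration:
// Minimal sketch: open an NSInputStream/NSOutputStream pair to the server.
// Assumes the controller adopts NSStreamDelegate and that the handler shown
// above lives in -stream:handleEvent:.
- (void)openStreams {
    NSInputStream *in = nil;
    NSOutputStream *out = nil;
    [NSStream getStreamsToHostWithName:@"191.168.143.41"
                                  port:1212
                           inputStream:&in
                          outputStream:&out];
    inputStream = in;
    outputStream = out;
    inputStream.delegate = self;
    outputStream.delegate = self;
    [inputStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    [outputStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    [inputStream open];
    [outputStream open];
}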
I tried to display this in a movie player as below:
- (void)loadMovie:(NSString *)moviePrefix
{
    NSString *path = [NSString stringWithFormat:@"%@.mjpg", moviePrefix];
    NSURL *url = [NSURL fileURLWithPath:path];
    if (url) {
        _moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
        _moviePlayer.view.frame = CGRectMake(0, 70, 600, 450);
        _moviePlayer.controlStyle = MPMovieControlStyleNone;
        _moviePlayer.scalingMode = MPMovieScalingModeNone;
        [dic setObject:_moviePlayer forKey:path];
    }
    [_moviePlayer prepareToPlay];
    [self.view addSubview:_moviePlayer.view];
    [self.view bringSubviewToFront:_moviePlayer.view];
    [_moviePlayer play];
}
Is NSString *path = [NSString stringWithFormat:@"%@.mjpg", moviePrefix]; the correct way?
This displays a black screen. What is wrong? If this approach is not correct, is there any other way I can display those frames? Can anyone help me solve this?
Answer 1:
MJPEG is just a series of JPEG images sent one after the other.
I worked on this a few years ago.
On one version of iOS (iOS 5?), it could easily be played with a UIWebView, but an iOS update broke all of this, along with all of my work at the time. Maybe a UIWebView could do the trick again today (it may have been fixed).
Anyway, since it's just a bunch of JPEGs, you can detect the start and end of each JPEG in the stream, build the JPEG image from that data, and show it in a UIImageView.
A workaround (not tested), but you should get the general idea:
//Properties
@property (nonatomic, strong) NSMutableData *data;
@property (nonatomic, weak) IBOutlet UIImageView *streamImageView;

//Initialize somewhere
_data = [[NSMutableData alloc] init];

//In the stream delegate method:
//A JPEG starts with FFD8 and ends with FFD9
UInt8 startJPEGBytes[2];
startJPEGBytes[0] = 0xFF;
startJPEGBytes[1] = 0xD8;
NSData *startData = [NSData dataWithBytes:startJPEGBytes length:2];

UInt8 endJPEGBytes[2];
endJPEGBytes[0] = 0xFF;
endJPEGBytes[1] = 0xD9;
NSData *endData = [NSData dataWithBytes:endJPEGBytes length:2];

[_data appendBytes:(const void *)buf length:len];

NSRange startRange = [_data rangeOfData:startData options:0 range:NSMakeRange(0, [_data length])];
if (startRange.location != NSNotFound) //We found the start of a JPEG
{
    NSRange endRange = [_data rangeOfData:endData options:0 range:NSMakeRange(startRange.location, [_data length] - startRange.location)];
    if (endRange.location != NSNotFound) //We found the end of a JPEG
    {
        NSRange imageRange = NSMakeRange(startRange.location, endRange.location + endRange.length - startRange.location);
        NSData *imageData = [_data subdataWithRange:imageRange];
        UIImage *streamImage = [UIImage imageWithData:imageData];
        [_streamImageView setImage:streamImage];
        //Remove everything up to the end of this JPEG frame. Start at 0, since there could be garbage at the start.
        [_data replaceBytesInRange:NSMakeRange(0, imageRange.location + imageRange.length) withBytes:NULL length:0];
    }
}
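For completeness, a minimal sketch (untested, method names are hypothetical) of how the parsing above could be wrapped in a method and driven from the NSStreamDelegate callback:
//Hypothetical wrapper: call this whenever bytes arrive on the input stream.
//It appends the bytes to _data and then runs the start/end-of-JPEG search shown above,
//assumed here to live in a method named -extractAndDisplayJPEGFrames.
- (void)appendStreamBytes:(const uint8_t *)buf length:(NSInteger)len
{
    [_data appendBytes:(const void *)buf length:len];
    [self extractAndDisplayJPEGFrames];
}

- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode
{
    if (aStream == inputStream && eventCode == NSStreamEventHasBytesAvailable) {
        uint8_t buf[1024];
        NSInteger len = [inputStream read:buf maxLength:sizeof(buf)];
        if (len > 0) {
            [self appendStreamBytes:buf length:len];
        }
    }
}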
Answer 2:
You are not adding moviePrefix to the string:
NSString *path = [NSString stringWithFormat:@".mjpg", moviePrefix, @"movie"];
Change it to
NSString *path = [NSString stringWithFormat:@"%@.mjpg", moviePrefix, @"movie"];
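A quick illustration of the difference (arguments without a matching format specifier are simply ignored); the prefix value here is an assumption:
NSString *prefix = @"movie";
NSLog(@"%@", [NSString stringWithFormat:@".mjpg", prefix]);   // prints ".mjpg" — prefix is never inserted
NSLog(@"%@", [NSString stringWithFormat:@"%@.mjpg", prefix]); // prints "movie.mjpg"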
Answer 3:
https://github.com/horsson/mjpeg-iphone/tree/55251a85e2c2489014036ddf5a491783f9b1962d
I used this to get the stream and display it. It works.
Source: https://stackoverflow.com/questions/30481563/getting-frames-though-a-stream-and-display-on-screen