I'm trying to send an image the user takes to the server. I get the JPEG representation, append it to the string needed to upload the photo, and then send the data through NSOutputStream. However, when I try to get the photo back from the server, I only see the top ~10% of it. Any help would be appreciated.
Btw, the socket is open and connected.
Here's my code:
NSString *requestString = [NSString stringWithFormat:@"SubmitPhoto::%@::", userID];
NSData * stringData = [requestString dataUsingEncoding:NSUTF8StringEncoding];
NSData *imgData = UIImageJPEGRepresentation(image, 1.0);
NSMutableData *completeData = [[NSMutableData alloc] initWithBytes:[stringData bytes] length:[stringData length]];
[completeData appendData:imgData];
//sending NSData over to server
[self.outputStream write:[completeData bytes] maxLength:[completeData length]];
It's because the image data exceeds what a single stream write can deliver.
A better way to handle this is to implement the following logic.
Sender
1. Convert the UIImage to NSData
2. Split the NSData into chunks (1024 bytes per chunk is recommended)
3. Send and track each chunk of NSData
Receiver
1. Declare an NSData and store the first received chunk (1024 bytes) into it
2. Receive the next chunks of NSData and append them with the appendData: method
3. Once all the chunks are received, convert the accumulated NSData back into a UIImage
Make sure to design different structures for transferring the data as chunks, such as a structure for the transfer details (total chunks, total size, chunk size, etc.), a structure for the data itself (current chunk size, current chunk number, etc.), and a structure for the response (delivery status, number of the chunk delivered, etc.).
I'd guess you're just trying to write too much data at a time for your buffer. Do something like this to loop over the data and send it in chunks instead:
NSString *requestString = [NSString stringWithFormat:@"SubmitPhoto::%@::", userID];
NSData *stringData = [requestString dataUsingEncoding:NSUTF8StringEncoding];
NSData *imgData = UIImageJPEGRepresentation(image, 1.0);
NSMutableData *completeData = [[NSMutableData alloc] initWithData:stringData];
[completeData appendData:imgData];

NSInteger bytesWritten = 0;
while ( completeData.length > bytesWritten )
{
    // Wait until the stream can accept more bytes
    while ( ! self.outputStream.hasSpaceAvailable )
        [NSThread sleepForTimeInterval:0.05];

    // Send the next chunk, picking up where the last write left off
    NSInteger writeResult = [self.outputStream write:(const uint8_t *)[completeData bytes] + bytesWritten
                                           maxLength:[completeData length] - bytesWritten];
    if ( writeResult == -1 ) {
        NSLog(@"Stream write error: %@", self.outputStream.streamError);
        break;
    }
    else {
        bytesWritten += writeResult;
    }
}
// Both input and output streams should be closed to make the code work in Swift
Source: https://stackoverflow.com/questions/15526977/sending-uiimage-over-nsoutputstream