I am getting the text size of a string with this:
textSize = [[tempDict valueForKeyPath:@"caption.text"] sizeWithFont:[UIFont systemFontOfSize:12] constrainedToSize:...];
This is what I'm using:
func isAllEmoji(aString: String) -> Bool {
    for scalar in aString.unicodeScalars {
        switch scalar.value {
        case 0x1F600...0x1F64F, // Emoticons
             0x1F300...0x1F5FF, // Misc Symbols and Pictographs
             0x1F680...0x1F6FF, // Transport and Map
             0x2600...0x26FF,   // Misc symbols
             0x2700...0x27BF,   // Dingbats
             0xFE00...0xFE0F,   // Variation Selectors
             0x0030...0x0039,   // Digits (keycap bases)
             0x00A9...0x00AE,   // © ... ®
             0x203C...0x2049,   // ‼️ ... ⁉️
             0x2122...0x3299,   // ™ ... enclosed CJK symbols
             0x1F004...0x1F251, // Mahjong tile, playing cards, enclosed ideographs
             0x1F910...0x1F990: // Newer emoticons, food, animals
            break // this scalar is in an emoji range, keep checking the rest
        default:
            return false
        }
    }
    return true
}
I took this, which was missing some emoji ranges, and then used this emoji array to find the missing ranges by iteration. I have not tested it thoroughly.
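For a quick sanity check, the function can be called like this (the sample strings are mine, purely illustrative):

isAllEmoji(aString: "😀🚀")     // true: every scalar falls in one of the ranges above
isAllEmoji(aString: "hello 😀") // false: the letters and the space fall outside the ranges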
As you've probably noticed, all of these emoji-detecting methods break almost every time Apple adds new emoji.
I've created a CG scanning solution which should work for all current and all FUTURE emojis here: https://stackoverflow.com/a/14472163/2057171
To my knowledge it's the only actual answer to this issue posted anywhere online.
- (BOOL)isEmoji:(NSString *)character { // argument can be a single character or an entire string
    // Render the text into a label on a black background; a plain (black) glyph
    // on black leaves no visible pixels, while an emoji draws its own colors.
    UILabel *characterRender = [[UILabel alloc] initWithFrame:CGRectMake(0, 0, 1, 1)];
    characterRender.text = character;
    characterRender.backgroundColor = [UIColor blackColor]; // also removes subpixel-rendering colors
    [characterRender sizeToFit];

    CGRect rect = [characterRender bounds];
    UIGraphicsBeginImageContextWithOptions(rect.size, YES, 0.0f);
    CGContextRef contextSnap = UIGraphicsGetCurrentContext();
    [characterRender.layer renderInContext:contextSnap];
    UIImage *capturedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Copy the rendered pixels into a raw RGBA byte buffer.
    CGImageRef imageRef = [capturedImage CGImage];
    NSUInteger width = CGImageGetWidth(imageRef);
    NSUInteger height = CGImageGetHeight(imageRef);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    unsigned char *rawData = (unsigned char *)calloc(height * width * 4, sizeof(unsigned char));
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * width;
    NSUInteger bitsPerComponent = 8;
    CGContextRef context = CGBitmapContextCreate(rawData, width, height,
                                                 bitsPerComponent, bytesPerRow, colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    CGContextRelease(context);

    // Scan for any pixel with brightness > 0; that can only come from an emoji glyph.
    BOOL colorPixelFound = NO;
    NSUInteger x = 0;
    NSUInteger y = 0;
    while (y < height && !colorPixelFound) {
        while (x < width && !colorPixelFound) {
            NSUInteger byteIndex = (bytesPerRow * y) + x * bytesPerPixel;
            CGFloat red   = rawData[byteIndex]     / 255.0f; // normalize 0-255 bytes to 0.0-1.0
            CGFloat green = rawData[byteIndex + 1] / 255.0f;
            CGFloat blue  = rawData[byteIndex + 2] / 255.0f;
            CGFloat h, s, b, a;
            UIColor *c = [UIColor colorWithRed:red green:green blue:blue alpha:1.0f];
            [c getHue:&h saturation:&s brightness:&b alpha:&a];
            if (b > 0) {
                colorPixelFound = YES;
            }
            x++;
        }
        x = 0;
        y++;
    }
    free(rawData); // release the calloc'd buffer
    return colorPixelFound;
}
You can use it like so:
NSString *myString = @"Hello this contains 😀";
BOOL stringContainsEmoji = [self isEmoji:myString];
Now, with the release of iOS 10, it doesn't work for the following emojis:
(image of the affected emojis)
The following code snippet is tried and tested for both iOS 9 and iOS 10:
extension String {
    var containsEmoji: Bool {
        for scalar in unicodeScalars {
            switch scalar.value {
            case 0x1F600...0x1F64F, // Emoticons
                 0x1F300...0x1F5FF, // Misc Symbols and Pictographs
                 0x1F680...0x1F6FF, // Transport and Map
                 0x2600...0x26FF,   // Misc symbols
                 0x2700...0x27BF,   // Dingbats
                 0xFE00...0xFE0F,   // Variation Selectors
                 0x1F910...0x1F918, // New Emoticons
                 0x1F1E6...0x1F1FF, // Flags (regional indicators)
                 0x1F980...0x1F984, // New animals (crab ... unicorn)
                 0x1F191...0x1F19A, // Squared CL ... VS
                 0x1F201...0x1F202, // Squared Katakana
                 0x1F232...0x1F23A, // Squared CJK ideographs
                 0x1F250...0x1F251, // Circled ideographs
                 0x23E9...0x23F3,   // Media controls, clocks
                 0x23F8...0x23FA,   // Pause, stop, record
                 0x1F170...0x1F171, // 🅰️ 🅱️
                 0x1F17E,           // 🅾️
                 0xA9,              // ©
                 0xAE,              // ®
                 0x2122,            // ™
                 0x2328,            // ⌨️
                 0x3030,            // 〰️
                 0x1F0CF,           // 🃏
                 0x1F18E,           // 🆎
                 0x1F9C0:           // 🧀
                return true
            default:
                continue
            }
        }
        return false
    }
}
Create a String extension in your app as shown above. It can be used like this:
if string.containsEmoji {
    // Do operations here
}
A simple Swift solution: check whether any scalar in unicodeScalars belongs to the set CharacterSet.symbols.
extension String {
    var containsEmoji: Bool {
        for scalar in unicodeScalars {
            if CharacterSet.symbols.contains(scalar) {
                return true
            }
        }
        return false
    }
}
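A quick illustrative check of this version (the sample strings are mine, not from the original answer):

"Hello".containsEmoji    // false: letters are not in CharacterSet.symbols
"Hello 😀".containsEmoji // true: U+1F600 has Unicode category So, which .symbols covers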
But I found that some Emoji 1.0 characters, such as ℹ️, are not classified as symbols by that set. So I created this checker:
extension Unicode.Scalar {
    var isEmojiMiscSymbol: Bool {
        switch self.value {
        case 0x2030...0x329F: // Misc symbols (includes ℹ U+2139)
            return true
        default:
            return false
        }
    }
}
And this is the checker which can detect ℹ️:
extension String {
    var containsEmoji: Bool {
        for scalar in unicodeScalars {
            if CharacterSet.symbols.contains(scalar) {
                return true
            } else if scalar.isEmojiMiscSymbol {
                return true
            }
        }
        return false
    }
}
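An illustrative check showing that the extra range closes the gap described above:

"ℹ️".containsEmoji // true: U+2139 falls inside 0x2030...0x329F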
This issue may occur if there are any characters past 0x00FF. In other words, there are many Unicode characters besides emoji that you may need to account for. To see whether a string contains any Unicode characters beyond the extended ASCII range, use the following:
extension String {
    var containsUnicodeCharacters: Bool {
        for scalar in unicodeScalars {
            if scalar.value > 0x00FF {
                return true
            }
        }
        return false
    }
}
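For example (the sample strings are mine):

"Hello".containsUnicodeCharacters    // false: every scalar is <= 0x00FF
"Héllo".containsUnicodeCharacters    // false: precomposed é is U+00E9
"Hello 😀".containsUnicodeCharacters // true: U+1F600 lies beyond 0x00FF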
Again and again, people use valueForKeyPath instead of objectForKey, and none of them can ever explain why. Read the documentation. If, after reading it, you still cannot explain why you are using valueForKeyPath ("I copied it from somewhere" is not an explanation), change it to objectForKey.
The problem you have has nothing to do with emojis at all. Any attempt to detect emojis in the string will fail, for the simple reason that you don't have a string in the first place: you have [NSNull null]. The problem might be fixed by using objectForKey; you might get nil instead, which behaves in a much more forgiving way. Or you might still get [NSNull null].
Find out why you are getting [NSNull null]. Somebody is putting it there. If you can't prevent it from being there, then you need to handle it.
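A minimal sketch of that handling, assuming tempDict holds a nested caption dictionary as the key path in the question suggests (the variable names here are mine):

id caption = [tempDict objectForKey:@"caption"];
id captionText = [caption isKindOfClass:[NSDictionary class]] ? [caption objectForKey:@"text"] : nil;

if ([captionText isKindOfClass:[NSString class]]) {
    // A real string: safe to measure or display.
} else {
    // Missing key or [NSNull null]: decide on a fallback, e.g. an empty string.
}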