Question

I have seen that since Lollipop, Android has built-in Emoji flags for different countries. Is it possible to use the device's locale to retrieve the Emoji flag for that country?
I want to insert the Emoji flag into a TextView which contains the user's location.
Answer 1:

I was looking for that too, but I don't think it's possible yet.
Have a look here: http://developer.android.com/reference/java/util/Locale.html
There is no mention of flags.
Alternatively, you can check the answer here:
Android Countries list with flags and availability of getting iso mobile codes
That might help you.
Answer 2:

Emoji are Unicode symbols. According to the Unicode character table, Emoji flags are composed from 26 regional indicator symbols, alphabetic Unicode characters (🇦–🇿) intended to encode ISO 3166-1 alpha-2 two-letter country codes (wiki).
That means it is possible to take the two-letter country code and convert each A–Z letter into the corresponding regional indicator symbol letter:
private String localeToEmoji(Locale locale) {
    String countryCode = locale.getCountry();
    // Offset each ASCII capital letter into the regional indicator block.
    int firstLetter = Character.codePointAt(countryCode, 0) - 0x41 + 0x1F1E6;
    int secondLetter = Character.codePointAt(countryCode, 1) - 0x41 + 0x1F1E6;
    return new String(Character.toChars(firstLetter)) + new String(Character.toChars(secondLetter));
}
Where 0x41 is the uppercase letter A and 0x1F1E6 is REGIONAL INDICATOR SYMBOL LETTER A in the Unicode table.
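To see the arithmetic concretely, here is a small self-contained sketch (the class name FlagOffsetDemo is ours, not part of the answer) that walks the country code "US" through the offset:

```java
public class FlagOffsetDemo {
    public static void main(String[] args) {
        String countryCode = "US";
        // 'U' is 0x55; 0x55 - 0x41 ('A') = 20, which lands on
        // REGIONAL INDICATOR SYMBOL LETTER U (0x1F1FA).
        int first = Character.codePointAt(countryCode, 0) - 0x41 + 0x1F1E6;
        // 'S' is 0x53, so the second letter maps to 0x1F1F8.
        int second = Character.codePointAt(countryCode, 1) - 0x41 + 0x1F1E6;
        System.out.println(Integer.toHexString(first));  // 1f1fa
        System.out.println(Integer.toHexString(second)); // 1f1f8
        // Concatenating the two regional indicators renders as one flag glyph.
        System.out.println(new String(Character.toChars(first))
                + new String(Character.toChars(second))); // 🇺🇸
    }
}
```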
Note: this code example is simplified and does not include the required checks on the country code, which may not be available in the locale.
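One way to add those checks, as a sketch (the class and method names SafeFlag and localeToEmojiOrNull are ours; it returns null instead of throwing when no valid two-letter code is present):

```java
import java.util.Locale;

public class SafeFlag {
    // Returns the emoji flag for the locale's country, or null when the
    // country code is not a two-letter A-Z code (e.g. a language-only locale).
    static String localeToEmojiOrNull(Locale locale) {
        String cc = locale.getCountry();
        if (cc.length() != 2) return null;
        char a = cc.charAt(0), b = cc.charAt(1);
        if (a < 'A' || a > 'Z' || b < 'A' || b > 'Z') return null;
        return new String(Character.toChars(a - 'A' + 0x1F1E6))
             + new String(Character.toChars(b - 'A' + 0x1F1E6));
    }

    public static void main(String[] args) {
        System.out.println(localeToEmojiOrNull(Locale.US));        // 🇺🇸
        System.out.println(localeToEmojiOrNull(new Locale("en"))); // null
    }
}
```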
Answer 3:

Based on this answer, I wrote a Kotlin version below using an extension function.
I also added some checks to handle unknown country codes.
/**
 * Converts a country code like "us" into 🇺🇸.
 * Adapted from https://stackoverflow.com/a/35849652/75579
 * 1. It first checks that the string consists of exactly 2 characters: an ISO 3166-1 alpha-2 two-letter country code (https://en.wikipedia.org/wiki/Regional_Indicator_Symbol).
 * 2. It then checks that both characters are letters.
 * The string is returned unchanged if it fails either check.
 * Caveat: an invalid 2-letter code such as "XX" passes both checks but produces an undefined result.
 */
fun String.toFlagEmoji(): String {
    // 1. Must be exactly 2 characters (ISO 3166-1 alpha-2 country code).
    if (this.length != 2) {
        return this
    }
    val countryCodeCaps = this.toUpperCase() // upper case is important because we are calculating offset
    // 2. Both characters must be letters.
    if (!countryCodeCaps[0].isLetter() || !countryCodeCaps[1].isLetter()) {
        return this
    }
    val firstLetter = Character.codePointAt(countryCodeCaps, 0) - 0x41 + 0x1F1E6
    val secondLetter = Character.codePointAt(countryCodeCaps, 1) - 0x41 + 0x1F1E6
    return String(Character.toChars(firstLetter)) + String(Character.toChars(secondLetter))
}
You can try it out with:

fun main() {
    println("us".toFlagEmoji())
    println("AF".toFlagEmoji())
    println("BR".toFlagEmoji())
    println("MY".toFlagEmoji())
    println("JP".toFlagEmoji())
}
Answer 4:

When I first wrote this answer, I somehow overlooked that I've only worked on Android via React Native!
Anyway, here is my JavaScript solution, which works with or without ES6 support.
function countryCodeToFlagEmoji(country) {
  return typeof String.fromCodePoint === "function"
    ? String.fromCodePoint(...[...country].map(c => c.charCodeAt() + 0x1f185))
    : [...country]
        .map(c => "\ud83c" + String.fromCharCode(0xdd85 + c.charCodeAt()))
        .join("");
}
console.log(countryCodeToFlagEmoji("au"));
console.log(countryCodeToFlagEmoji("aubdusca"));
If you want to pass in the country codes as capital letters instead, just change the two offsets to 0x1f1a5 and 0xdda5.
Answer 5:

I use this approach; it is very easy. Get the Unicode values from here.
For the Bangladesh flag they are U+1F1E7 U+1F1E9.
Now:

{
    ...
    String flag = getEmojiByUnicode(0x1F1E7) + getEmojiByUnicode(0x1F1E9) + " Bangladesh";
}

public String getEmojiByUnicode(int unicode) {
    return new String(Character.toChars(unicode));
}

It will show the Bangladeshi flag followed by "Bangladesh".
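Rather than looking the code points up by hand, you can also derive them from the country code, as in answer 2. A minimal sketch (the class name BdFlag is ours):

```java
public class BdFlag {
    public static void main(String[] args) {
        String cc = "BD"; // ISO 3166-1 alpha-2 code for Bangladesh
        StringBuilder flag = new StringBuilder();
        for (int i = 0; i < cc.length(); i++) {
            // 'B' -> U+1F1E7, 'D' -> U+1F1E9
            flag.appendCodePoint(cc.charAt(i) - 'A' + 0x1F1E6);
        }
        System.out.println(flag + " Bangladesh");
    }
}
```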
Source: https://stackoverflow.com/questions/30494284/android-get-country-emoji-flag-using-locale