Question
I need to encode/decode an integer which is up to 9 digits long but most often 7 digits long. I'd like to make it easier to communicate/memorise - it will be communicated by phone, copied & pasted, keyed in from a card/memory/label, hand-written on labels and generally abused!
I'm looking to reduce the number of digits (whilst adding a checksum) using a base 32 scheme. I'm most in favour of z-base-32 (over the RFC4648 implementation) because of some of the design goals (e.g. handwriting) and choice of alphabet (lower-case, permuted to favour characters that are easier to read, write, speak, and remember). However, I can't find a C# implementation and I'm concerned about porting from the existing Python implementation.
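For the checksum part, what I have in mind is something as simple as appending one extra character derived from the base-32 digit values, roughly like the sketch below (a Luhn mod N style check would catch transposition errors better, so treat this as illustrative only; the method name is just mine):
public static string AppendCheckCharacter(string encoded, string alphabet)
{
    // Appends a single check character: the sum of the digit values mod 32.
    // Illustrative only; it will not catch adjacent transpositions.
    var sum = 0;
    foreach (var c in encoded)
    {
        sum += alphabet.IndexOf(c);
    }
    return encoded + alphabet[sum % alphabet.Length];
}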
Does anyone have a C# implementation? Alternatively, does anyone have a set of test cases (other than the examples in the spec) that I can use to validate a port?
I'm open to suggestions about alternative encoding schemes.
Answer 1:
I started with the code provided by Dead account and made some changes based on my tests. I hope this is useful.
/// <summary>
/// Author: Ben Maddox
/// </summary>
public class ZBase32Encoder
{
    /*
     * Accepted characters based on code from:
     * http://www.codeproject.com/KB/recipes/Base32Encoding.aspx?display=Print
     */
    public const string AcceptedCharacters = "ybndrfg8ejkmcpqxot1uwisza345h769";

    public static string Encode(int input)
    {
        string result = "";
        if (input == 0)
        {
            result += AcceptedCharacters[0];
        }
        else
        {
            while (input > 0)
            {
                //Must make sure result is in the correct order
                result = AcceptedCharacters[input % AcceptedCharacters.Length] + result;
                input /= AcceptedCharacters.Length;
            }
        }
        return result;
    }

    public static int Decode(string input)
    {
        var inputString = input.ToLower();
        int result = 0;
        for (int i = 0; i < inputString.Length; i++)
        {
            result *= AcceptedCharacters.Length;
            var character = inputString[i];
            result += AcceptedCharacters.IndexOf(character);
        }
        return result;
    }

    public static int Decode(char data)
    {
        return Decode(data.ToString());
    }
}
And here are the tests I used (MS Test, with the Should assertions library).
[TestClass]
public class ZBase32EncoderTests
{
    [TestMethod]
    public void Encoding_0_ReturnsFirstCharacter()
    {
        var result = ZBase32Encoder.Encode(0);
        result.ShouldEqual(ZBase32Encoder.AcceptedCharacters[0].ToString());
    }

    [TestMethod]
    public void Encoding_1_ReturnsSecondCharacter()
    {
        var result = ZBase32Encoder.Encode(1);
        result.ShouldEqual(ZBase32Encoder.AcceptedCharacters[1].ToString());
    }

    [TestMethod]
    public void Encoding_32_ReturnsSecondAndFirstValues()
    {
        var result = ZBase32Encoder.Encode(32);
        result.ShouldEqual(ZBase32Encoder.AcceptedCharacters[1].ToString() + ZBase32Encoder.AcceptedCharacters[0].ToString());
    }

    [TestMethod]
    public void Encoding_64_ReturnsThirdAndFirstValues()
    {
        var result = ZBase32Encoder.Encode(64);
        result.ShouldEqual(ZBase32Encoder.AcceptedCharacters[2].ToString() + ZBase32Encoder.AcceptedCharacters[0].ToString());
    }

    [TestMethod]
    public void Encoding_65_ReturnsThirdAndSecondValues()
    {
        var result = ZBase32Encoder.Encode(65);
        result.ShouldEqual(ZBase32Encoder.AcceptedCharacters[2].ToString() + ZBase32Encoder.AcceptedCharacters[1].ToString());
    }

    [TestMethod]
    public void Decoding_FirstCharacter_Returns_0()
    {
        var inputCharacter = ZBase32Encoder.AcceptedCharacters[0];
        var result = ZBase32Encoder.Decode(inputCharacter);
        result.ShouldEqual(0);
    }

    [TestMethod]
    public void Decoding_SecondCharacter_Returns_1()
    {
        var inputCharacter = ZBase32Encoder.AcceptedCharacters[1];
        var result = ZBase32Encoder.Decode(inputCharacter);
        result.ShouldEqual(1);
    }

    [TestMethod]
    public void Decoding_SecondAndFirstValues_Shows_32()
    {
        var inputCharacters = ZBase32Encoder.AcceptedCharacters[1].ToString() + ZBase32Encoder.AcceptedCharacters[0];
        var result = ZBase32Encoder.Decode(inputCharacters);
        result.ShouldEqual(32);
    }

    [TestMethod]
    public void Decoding_ThirdAndFirstCharacters_Shows_64()
    {
        var inputCharacters = ZBase32Encoder.AcceptedCharacters[2].ToString() + ZBase32Encoder.AcceptedCharacters[0];
        var result = ZBase32Encoder.Decode(inputCharacters);
        result.ShouldEqual(64);
    }
}
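Beyond these single-character and two-character cases, a round-trip check over a spread of values (including a typical 7-digit input) is a cheap extra safeguard. Here is a minimal sketch of one more test method to drop into the class above, in the same MS Test / Should style; adjust the assertion call if your version of the library names it differently:
[TestMethod]
public void Encoding_Then_Decoding_RoundTrips_For_Sample_Values()
{
    // Values chosen to cover the zero, boundary, and 7-to-9-digit cases.
    var samples = new[] { 0, 1, 31, 32, 1024, 1234567, 999999999 };
    foreach (var value in samples)
    {
        var encoded = ZBase32Encoder.Encode(value);
        ZBase32Encoder.Decode(encoded).ShouldEqual(value);
    }
}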
Answer 2:
You might want to use your own encode / decode routine?
Encode:
string acceptedChar = "ABCDEFGHJKLMNPQRSTUWXZ0123456789";
int yourNumber = 12345678;
string response = "";
while (yourNumber > 0)
{
    response += acceptedChar[yourNumber % acceptedChar.Length];
    yourNumber /= acceptedChar.Length;
}
Decode:
string acceptedChar = "ABCDEFGHJKLMNPQRSTUWXZ0123456789";
string inputStr = "ABCD";
int yourNumber = 0;
// Walk from the last character (the most significant digit, since Encode
// emitted the least significant digit first) back to the first.
for (int i = inputStr.Length - 1; i >= 0; i--)
{
    yourNumber *= acceptedChar.Length;
    yourNumber += acceptedChar.IndexOf(inputStr[i]);
}
(Untested code)
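Folding those two snippets into methods makes the round trip easy to verify. A sketch using the same alphabet, with class and method names of my own choosing:
using System;

public static class CustomBase32
{
    private const string AcceptedChar = "ABCDEFGHJKLMNPQRSTUWXZ0123456789";

    // Emits the least significant digit first, as in the encode loop above;
    // the do-while means 0 still produces one character.
    public static string Encode(int number)
    {
        var response = "";
        do
        {
            response += AcceptedChar[number % AcceptedChar.Length];
            number /= AcceptedChar.Length;
        } while (number > 0);
        return response;
    }

    // Walks from the last character (most significant digit) back to the first.
    public static int Decode(string input)
    {
        var number = 0;
        for (int i = input.Length - 1; i >= 0; i--)
        {
            number = number * AcceptedChar.Length + AcceptedChar.IndexOf(input[i]);
        }
        return number;
    }

    public static void Main()
    {
        Console.WriteLine(Decode(Encode(12345678)) == 12345678); // prints True
    }
}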
Answer 3:
If you look at your cell phone keypad, each digit maps to a small group of letters -- 2 to (a,b,c), 9 to (w,x,y,z), 8 to (t,u,v), and so on -- so a number such as 1985239 becomes an ordered sequence of those letter sets. Try to find an algorithm that generates more-or-less English-like words from an ordered set of unordered sets of characters -- these will be easier to memorize.
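To make the enumeration step concrete, here is a minimal sketch assuming the standard keypad letter mapping; the word-list filtering that would pick out pronounceable candidates is left out, and the class and method names are only illustrative:
using System;
using System.Collections.Generic;
using System.Linq;

public static class KeypadCandidates
{
    // Standard phone keypad mapping; 0 and 1 have no letters, so they pass through as digits.
    private static readonly Dictionary<char, string> Keypad = new Dictionary<char, string>
    {
        ['0'] = "0", ['1'] = "1", ['2'] = "abc", ['3'] = "def", ['4'] = "ghi",
        ['5'] = "jkl", ['6'] = "mno", ['7'] = "pqrs", ['8'] = "tuv", ['9'] = "wxyz"
    };

    // Expands a digit string into every letter string it could represent.
    public static IEnumerable<string> Expand(string digits)
    {
        IEnumerable<string> results = new[] { "" };
        foreach (var digit in digits)
        {
            var letters = Keypad[digit];
            // ToList forces evaluation each pass so the loop variable is not captured lazily.
            results = results.SelectMany(prefix => letters.Select(c => prefix + c)).ToList();
        }
        return results;
    }
}
Filtering Expand("1985239") against a dictionary of real words or syllable patterns is the part that would make the results memorable.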
Answer 4:
A slight improvement on BenMaddox's post (using do-while), here in Java:
public static String encode(int num, String base) {
    String response = "";
    do {
        response = base.charAt(num % base.length()) + response;
        num /= base.length();
    } while (num > 0);
    return response;
}
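Since the question asks for C#, the same do-while shape ported back looks like this (parameter names are mine):
// The do-while means even 0 produces one character (the first in the alphabet),
// so no special case for zero is needed.
public static string Encode(int num, string alphabet)
{
    var response = "";
    do
    {
        response = alphabet[num % alphabet.Length] + response;
        num /= alphabet.Length;
    } while (num > 0);
    return response;
}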
Answer 5:
This project looks like what you're after:
Base 36 type for .NET (C#)
Source: https://stackoverflow.com/questions/729268/encoding-a-number-c-sharp-implementation-of-z-base-32-or-something-else