Question
A project I'm working on requires serializing a data structure before shutting down and restoring its state from that serialized data when it starts up again.
Last year, we were building for .NET 1.1, and ran into a tricky issue where
- our code ran on .NET 2.0
- a customer upgraded with some software that somehow set 1.1 as default
- our code ran on .NET 1.1 and was unable to deserialize its stored state
This particular issue was "resolved" by barring that software upgrade, and it shouldn't be a problem now that we're targeting the .NET 2.0 framework (so we can't possibly run on 1.1).
What is the chance that this serialization could again change incompatibly between 2.0 and newer frameworks? If we use <supportedRuntime>
to pin our code to 2.0.50727, what are the chances of changes between 2.0.50727.1434 and 2.0.50727.nnnn (some future release)? The data structures being serialized are arrays, maps, strings, et cetera from the standard class libraries.
Additionally, is it guaranteed that a 2.0.50727 framework will always be installed even after further .NET upgrades? Pointers to Microsoft documentation welcome.
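(For reference, pinning the application to a specific runtime is done with the <supportedRuntime> element in the application configuration file; a minimal sketch, with an illustrative version value:)

```xml
<!-- app.config: ask the loader to run this executable on the 2.0 CLR.
     The version string is the standard 2.0 moniker; adjust as needed. -->
<configuration>
  <startup>
    <supportedRuntime version="v2.0.50727" />
  </startup>
</configuration>
```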
Answer 1:
The chances are low (but not zero!) that there will be changes between framework versions. The intention would be that you should be able to use binary serialization and remoting to communicate between a client and a server running different framework versions. The incompatibility between .NET 1.x and 2.0 is a bug for which a patch is available.
However, binary serialization has other issues, especially poor support for versioning of the structure you're serializing. For the use case you've described, XML serialization is the obvious choice: DataContractSerializer is more flexible than XmlSerializer if you don't mind the dependency on .NET 3.x.
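For illustration, a minimal DataContractSerializer sketch for saving and restoring state as XML; the AppState type and the file path are invented for the example:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization;

// Hypothetical state type, purely for illustration.
[DataContract]
public class AppState
{
    [DataMember] public List<string> Items { get; set; }
    [DataMember] public Dictionary<string, int> Counters { get; set; }
}

public static class StatePersistence
{
    static readonly DataContractSerializer Serializer =
        new DataContractSerializer(typeof(AppState));

    public static void Save(AppState state, string path)
    {
        using (var stream = File.Create(path))
            Serializer.WriteObject(stream, state);   // writes XML to the file
    }

    public static AppState Load(string path)
    {
        using (var stream = File.OpenRead(path))
            return (AppState)Serializer.ReadObject(stream);
    }
}
```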
You can't guarantee that the .NET framework 2.0 will always be installed on future versions of Windows. But I'm sure Microsoft will work hard to ensure that most .NET 2.0 apps will run unchanged on .NET 4.x and later versions. I don't have any references for this: any such commitment would in any case only really apply to the next version of Windows (Windows 7).
Answer 2:
The general rule of thumb: XML serialization should be able to survive new framework versions and can therefore be stored long-term, but binary serialization cannot (and should therefore only ever be transient).
Answer 3:
What serializer are you using? In many ways, a serializer like XmlSerializer or DataContractSerializer buffers you from many of the details and provides simpler extensibility options. At some point a new CLR version will undoubtedly be necessary, so I don't think anybody can make any guarantees about 2.0.50727; you should be safe short-term, though. And I would hope for fewer breaking changes...
[updated following note on another reply]
If you want a binary format for space/performance reasons, then another option is to use a different binary serializer. For example, protobuf-net works on all .NET variants*, but the binary format (devised by Google) is cross-platform compatible (Java, C++, etc.), making it very portable, quick, and small.
*=I haven't tried it on micro framework, but CF, Silverlight, Mono, .NET 2.0 etc are all supported.
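A minimal protobuf-net sketch, assuming the protobuf-net library is referenced; the StoredState type is invented for the example, and the field numbers are what the format actually persists, so they should stay stable across versions:

```csharp
using System.IO;
using ProtoBuf;

// Hypothetical state type; keep the field numbers stable between releases.
[ProtoContract]
public class StoredState
{
    [ProtoMember(1)] public string[] Names { get; set; }
    [ProtoMember(2)] public int Revision { get; set; }
}

public static class ProtoPersistence
{
    public static void Save(StoredState state, string path)
    {
        using (var stream = File.Create(path))
            Serializer.Serialize(stream, state);
    }

    public static StoredState Load(string path)
    {
        using (var stream = File.OpenRead(path))
            return Serializer.Deserialize<StoredState>(stream);
    }
}
```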
Answer 4:
If compatibility is a concern, the ISerializable interface might be the cure you're looking for. This interface gives you more control over how items are serialized. For more info, try this article on MSDN.
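A minimal ISerializable sketch for a hypothetical Settings type, showing the GetObjectData method and the deserialization constructor that the pattern requires:

```csharp
using System;
using System.Runtime.Serialization;

[Serializable]
public class Settings : ISerializable
{
    public string Name { get; private set; }
    public int Version { get; private set; }

    public Settings(string name, int version)
    {
        Name = name;
        Version = version;
    }

    // Called during serialization: we decide exactly what goes into the stream.
    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("name", Name);
        info.AddValue("version", Version);
    }

    // Called during deserialization: we decide how to interpret what we find,
    // which is where compatibility shims for older payloads can live.
    protected Settings(SerializationInfo info, StreamingContext context)
    {
        Name = info.GetString("name");
        Version = info.GetInt32("version");
    }
}
```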
Answer 5:
I have two things to add to the other answers...
First, making use of a custom SerializationBinder can get you round lots of difficulties importing legacy serialized data.
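A minimal SerializationBinder sketch; the old type name and the LegacyState class are placeholders invented for the example:

```csharp
using System;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;

// Placeholder for the type that old payloads should map onto today.
[Serializable]
public class LegacyState
{
    public string Data;
}

// Redirects a type name found in old serialized data to the current type.
// "OldCompany.LegacyState" stands in for whatever your old assemblies contained.
public class LegacyStateBinder : SerializationBinder
{
    public override Type BindToType(string assemblyName, string typeName)
    {
        if (typeName == "OldCompany.LegacyState")
            return typeof(LegacyState);

        // Default resolution for everything else.
        return Type.GetType(string.Format("{0}, {1}", typeName, assemblyName));
    }
}

// Usage: attach the binder to the formatter before deserializing, e.g.
//   var formatter = new BinaryFormatter { Binder = new LegacyStateBinder() };
//   var state = (LegacyState)formatter.Deserialize(stream);
```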
Second, I consider it mandatory to write extensive unit tests for any persisted data. I always do two tests in particular:
- Round-trip test - can you serialize and deserialize your objects and get exactly the same thing back? (A minimal sketch follows this list.)
- Legacy import test - make sure you have versions of serialized data exported from every released version of your app. Import the data and check that everything comes back in as expected.
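A minimal round-trip test sketch using BinaryFormatter and a plain assertion rather than any particular test framework; the AppSnapshot type and its values are invented for the example:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

[Serializable]
public class AppSnapshot
{
    public string Name;
    public int[] Values;
}

public static class RoundTripTest
{
    public static void Run()
    {
        var original = new AppSnapshot { Name = "demo", Values = new[] { 1, 2, 3 } };

        // Serialize to an in-memory stream, then deserialize it again.
        var formatter = new BinaryFormatter();
        var stream = new MemoryStream();
        formatter.Serialize(stream, original);
        stream.Position = 0;
        var copy = (AppSnapshot)formatter.Deserialize(stream);

        // Verify the round trip preserved the data.
        if (copy.Name != original.Name || copy.Values.Length != original.Values.Length)
            throw new Exception("Round-trip test failed");
    }
}
```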
Answer 6:
You do not have to use XML in order to gain flexibility and versioning support.
I have used Simon Hewitt's open-source library (see Optimizing Serialization in .NET - part 2) instead of the default .NET serialisation. It offers some automation, but essentially you can control the stream of information that is serialised and deserialised. For versioning, the (file) version can be serialised first, and at deserialisation time the way the rest of the stream is interpreted depends on that version.
It is quite simple to do, although somewhat tedious due to the explicit serialisation / deserialisation.
As a bonus, it is 20-40 times faster and takes up less space for large data sets (though that may be unimportant in your case).
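A minimal sketch of the version-first pattern described above, using plain BinaryWriter/BinaryReader for illustration rather than the library itself; the SampleSet type is invented for the example:

```csharp
using System.IO;

public class SampleSet
{
    public string Label;
    public double[] Samples;
    public bool Compressed;   // field added in version 2

    private const int CurrentVersion = 2;

    public void Save(BinaryWriter writer)
    {
        writer.Write(CurrentVersion);        // the version goes first
        writer.Write(Label ?? string.Empty);
        writer.Write(Samples.Length);
        foreach (var sample in Samples) writer.Write(sample);
        writer.Write(Compressed);
    }

    public static SampleSet Load(BinaryReader reader)
    {
        var result = new SampleSet();
        int version = reader.ReadInt32();    // read the version first...
        result.Label = reader.ReadString();
        int count = reader.ReadInt32();
        result.Samples = new double[count];
        for (int i = 0; i < count; i++) result.Samples[i] = reader.ReadDouble();

        // ...and interpret the rest of the stream according to it.
        result.Compressed = version >= 2 && reader.ReadBoolean();
        return result;
    }
}
```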
Source: https://stackoverflow.com/questions/203694/stability-of-net-serialization-across-different-framework-versions