DER encoding - How to convert implicit tag to explicit tag

Submitted by 可紊 on 2019-12-22 12:39:32

Question


I have an X.509 certificate that contains a set of data with the following IMPLICIT [0] tag:

A0 81 C6 (value)...

And I have this excerpt from a standards document:

The IMPLICIT [0] tag is not used for the DER encoding, rather an EXPLICIT SET OF tag is used. That is, the DER encoding of the EXPLICIT SET OF tag, rather than of the IMPLICIT [0] tag, MUST be included along with the length and content octets of the value.

I've done a lot of searching around, but I can't figure out exactly what the standard is calling for. I'm looking for a bit of clarification.

EDIT: Here is the standard I am following: http://tools.ietf.org/html/rfc3852

I am trying to verify the X.509 signature and I need to calculate the message digest to do this. This certificate includes the optional SignedAttributes in the SignerInfo type. I have hashed the signed content and verified that the message digest in the SignedAttributes is correct. The standard says that if the SignedAttributes is present, it should be hashed and encrypted to create the cert's signature. The standard also says that the tag of the SignedAttributes should be changed as discussed in the original question.

Here is the ASN.1 grammar for SignerInfo:

SignerInfo ::= SEQUENCE {
        version CMSVersion,
        sid SignerIdentifier,
        digestAlgorithm DigestAlgorithmIdentifier,
        signedAttrs [0] IMPLICIT SignedAttributes OPTIONAL,
        signatureAlgorithm SignatureAlgorithmIdentifier,
        signature SignatureValue,
        unsignedAttrs [1] IMPLICIT UnsignedAttributes OPTIONAL }

SignerIdentifier ::= CHOICE {
        issuerAndSerialNumber IssuerAndSerialNumber,
        subjectKeyIdentifier [0] SubjectKeyIdentifier }

SignedAttributes ::= SET SIZE (1..MAX) OF Attribute

UnsignedAttributes ::= SET SIZE (1..MAX) OF Attribute

Attribute ::= SEQUENCE {
        attrType OBJECT IDENTIFIER,
        attrValues SET OF AttributeValue }

AttributeValue ::= ANY

SignatureValue ::= OCTET STRING

Answer 1:


I'm not sure how to interpret that comment. What standard are you reading? Do you have the ASN.1 grammar for the structure?

An explicit tag is like a wrapper around some underlying type. For example, the underlying type might be a SEQUENCE. It is encoded with the universal SEQUENCE tag, 0x30. But to avoid ambiguity in how the SEQUENCE should be interpreted in the enclosing structure, it is wrapped in an EXPLICIT structure with a context-specific tag. It's not clear from the snippet above what that tag is.

I'm guessing what they mean is a syntax like [0] EXPLICIT SET OF foo, which (using the example from the original question as a value) would be encoded as (hex) A0 81 C9 31 81 C6 (value) ...

Note that the value originally tagged with the context-specific zero (A0) gets its universal SET OF tag (31) back, and the explicit [0] (A0) now wraps the whole thing.
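
To make the wrapping concrete, here is a rough Python sketch of explicit tagging (the function name is mine, and only the DER length forms needed for this example are handled):

def explicit_tag_0(inner_der: bytes) -> bytes:
    # Wrap an already-encoded DER value in a constructed EXPLICIT [0] tag (0xA0).
    n = len(inner_der)
    if n < 0x80:                      # short-form length octet
        length = bytes([n])
    elif n <= 0xFF:                   # one-byte long-form length (0x81 prefix)
        length = bytes([0x81, n])
    else:
        raise NotImplementedError("longer length forms omitted in this sketch")
    return b"\xa0" + length + inner_der

# A SET OF with 0xC6 content octets encodes as 31 81 C6 <content>; wrapping it
# yields A0 81 C9 31 81 C6 <content>, matching the hex above (content zeroed here).
wrapped = explicit_tag_0(bytes([0x31, 0x81, 0xC6]) + bytes(0xC6))
assert wrapped[:6] == bytes([0xA0, 0x81, 0xC9, 0x31, 0x81, 0xC6])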


Okay, in this case I think what they mean is that when you sign the attributes, the signature is computed over the encoding that uses the SET OF tag rather than the implicit tag. If that's what they mean, throwing in "EXPLICIT" really muddied the waters, but whatever. In that case the encoding to digest is simply 31 81 C6 (value) ...; that is, replace the context-specific 0xA0 with the universal SET OF 0x31.
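
In code the conversion is a one-byte tag swap, since both A0 and 31 are single identifier octets and the length and content octets stay the same. A minimal sketch (SHA-256 assumed here; in practice use whatever algorithm the digestAlgorithm field names):

import hashlib

def signed_attrs_signature_input(implicit_der: bytes) -> bytes:
    # Replace the context-specific [0] tag (0xA0) with the universal SET OF
    # tag (0x31); the length and content octets are reused unchanged.
    assert implicit_der[0] == 0xA0, "expected a [0] IMPLICIT constructed tag"
    return b"\x31" + implicit_der[1:]

# With the bytes from the question (content zeroed for illustration):
implicit = bytes([0xA0, 0x81, 0xC6]) + bytes(0xC6)
to_hash = signed_attrs_signature_input(implicit)   # 31 81 C6 <content>
digest = hashlib.sha256(to_hash).digest()          # the digest the signature covers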



Source: https://stackoverflow.com/questions/3638495/der-encoding-how-to-convert-implicit-tag-to-explicit-tag
