byte-order-mark

Is it possible to redirect output to a file using the redirection operator without writing a byte-order mark in PowerShell?

Submitted by 谁说我不能喝 on 2019-12-23 00:48:12
Question: Is there any way to omit the byte-order mark when redirecting the output stream to a file? For example, if I want to take the contents of an XML file and replace a string with a new value, I have to create a new encoding and write the new output to a file like the following, which is rather ham-handed:

    $newContent = ( Get-Content .\settings.xml ) -replace 'expression', 'newvalue'
    $UTF8NoBom = New-Object System.Text.UTF8Encoding( $false )
    [System.IO.File]::WriteAllText( '.\settings.xml', …
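For comparison, here is a minimal Python sketch of the same idea (file names are illustrative): the plain "utf-8" codec never writes a BOM, while "utf-8-sig" prepends one, so choosing the codec explicitly gives the same control as constructing UTF8Encoding($false) in .NET.

```python
import os
import tempfile

# Write the same text with and without a BOM and compare the raw bytes.
text = "<settings>newvalue</settings>"

with tempfile.TemporaryDirectory() as d:
    no_bom_path = os.path.join(d, "no_bom.xml")
    with_bom_path = os.path.join(d, "with_bom.xml")

    # The 'utf-8' codec never emits a byte-order mark...
    with open(no_bom_path, "w", encoding="utf-8") as f:
        f.write(text)

    # ...while 'utf-8-sig' prepends the three bytes EF BB BF.
    with open(with_bom_path, "w", encoding="utf-8-sig") as f:
        f.write(text)

    with open(no_bom_path, "rb") as f:
        raw_no_bom = f.read()
    with open(with_bom_path, "rb") as f:
        raw_with_bom = f.read()

print(raw_no_bom.startswith(b"\xef\xbb\xbf"))    # False
print(raw_with_bom.startswith(b"\xef\xbb\xbf"))  # True
```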

How can I remove any UTF-8 BOM that exists within some text, not just at the start of it?

Submitted by 筅森魡賤 on 2019-12-22 11:45:54
Question: We receive some files which have been concatenated by another party. In the middle of these files are some BOM characters. Is there a way we can detect these three bytes and remove them? I've seen plenty of examples of how to remove the BOM from the start of a file, but not from the middle.

Answer 1: Assuming that your file is small enough to hold in memory, and that you have an Enumerable.Replace extension method for replacing subsequences, you could use:

    var bytes = File.ReadAllBytes…
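Since the C# answer is truncated above, here is a Python sketch of the same approach: read the whole file as bytes and drop every occurrence of the three-byte UTF-8 BOM, not just a leading one. It assumes, as the answer does, that the file fits in memory.

```python
BOM = b"\xef\xbb\xbf"  # the UTF-8 encoding of U+FEFF

def strip_all_boms(data: bytes) -> bytes:
    """Remove every UTF-8 BOM sequence, wherever it appears in the data."""
    return data.replace(BOM, b"")

# Two concatenated files, each of which originally started with its own BOM:
concatenated = BOM + b"first part" + BOM + b"second part"
cleaned = strip_all_boms(concatenated)
print(cleaned)  # b'first partsecond part'
```

One caveat: EF BB BF decodes to U+FEFF (zero-width no-break space), so blind replacement is only safe when that character is known not to be intended content, which is usually the case for concatenated text files.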

C++: How to inspect a file's byte-order mark to determine whether it is UTF-8?

Submitted by 断了今生、忘了曾经 on 2019-12-22 05:28:17
Question: How can I inspect a file's byte-order mark in C++ to determine whether it is UTF-8?

Answer 1: In general, you can't. The presence of a byte-order mark is a very strong indication that the file you are reading is Unicode. If you are expecting a text file, and the first four bytes you receive are:

    0x00, 0x00, 0xFE, 0xFF : the file is almost certainly UTF-32BE
    0xFF, 0xFE, 0x00, 0x00 : the file is almost certainly UTF-32LE
    0xFE, 0xFF, XX, XX : the file is almost certainly UTF-16BE
    0xFF, 0xFE, XX, XX : …
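The decision table above can be sketched as a small function (shown in Python for brevity; the same prefix checks translate directly to C++ memcmp calls). The ordering matters: the UTF-32LE mark FF FE 00 00 begins with the UTF-16LE mark FF FE, so the longer prefixes must be tested first.

```python
def detect_bom(first_bytes):
    """Classify an encoding from the first bytes of a file, if a BOM is present."""
    # Longer marks first: FF FE 00 00 (UTF-32LE) would otherwise be
    # misclassified as the two-byte UTF-16LE mark FF FE.
    marks = [
        (b"\x00\x00\xfe\xff", "UTF-32BE"),
        (b"\xff\xfe\x00\x00", "UTF-32LE"),
        (b"\xef\xbb\xbf",     "UTF-8"),
        (b"\xfe\xff",         "UTF-16BE"),
        (b"\xff\xfe",         "UTF-16LE"),
    ]
    for mark, name in marks:
        if first_bytes.startswith(mark):
            return name
    return None  # no BOM: the encoding must be guessed by other means

print(detect_bom(b"\xef\xbb\xbf<xml>"))        # UTF-8
print(detect_bom(b"\xff\xfe\x00\x00rest"))     # UTF-32LE
print(detect_bom(b"plain ascii text"))         # None
```

Note the None case: a missing BOM proves nothing, which is why the answer opens with "in general, you can't".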

Bad UTF-8 without BOM encoding

Submitted by ◇◆丶佛笑我妖孽 on 2019-12-21 15:00:53
Question: I converted all my files to UTF-8 without BOM using Notepad++. I have no problem with BOMs anymore, but the UTF-8 without BOM encoding is simply not working; it's as if my site were encoded in ANSI. All special characters display as Â, Ú, or á. What can be the reason for this and how can I fix it? http://chusmix.com/?ciudad=Pilar Thanks

Answer 1: You have to tell the browser to interpret the page as UTF-8 so it will properly parse multibyte characters. Add this meta tag inside your <head> tag…
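The Â/Ã characters in the question are classic mojibake: UTF-8 bytes being decoded as a single-byte encoding such as Latin-1, which is consistent with the browser falling back to a default charset because none was declared. A quick Python sketch reproduces the symptom, which can help confirm the diagnosis:

```python
original = "Máquina"  # intended text containing a multibyte character

# 'á' (U+00E1) encodes in UTF-8 as the two bytes C3 A1.
utf8_bytes = original.encode("utf-8")

# What the browser shows when it decodes those UTF-8 bytes as Latin-1:
garbled = utf8_bytes.decode("latin-1")
print(garbled)  # MÃ¡quina

# Decoding with the correct charset recovers the text:
print(utf8_bytes.decode("utf-8"))  # Máquina
```

The fix on the server side is exactly what the answer says: declare UTF-8 explicitly, via the meta tag and/or the Content-Type response header.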

Dealing with UTF-8 numbers in Python

Submitted by 雨燕双飞 on 2019-12-21 03:48:20
Question: Suppose I am reading a file containing three comma-separated numbers. The file was saved with an unknown encoding; so far I am dealing with ANSI and UTF-8. If the file was in UTF-8 and it had one row with the values 115,113,12, then:

    with open(file) as f:
        a, b, c = map(int, f.readline().split(','))

would throw:

    invalid literal for int() with base 10: '\xef\xbb\xbf115'

The first number is always mangled with these '\xef\xbb\xbf' bytes. For the remaining two numbers the conversion works fine. If I…
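In Python, the "utf-8-sig" codec handles this case transparently: it strips a leading BOM when one is present and behaves like plain UTF-8 otherwise, so it is a safe choice when the input may or may not carry a BOM. (The traceback in the question is from Python 2, where the BOM shows up as raw bytes; in Python 3 it appears as the single character '\ufeff'.) A sketch using the numbers from the question:

```python
import io

raw = b"\xef\xbb\xbf115,113,12\n"  # file content as saved with a UTF-8 BOM

# Plain 'utf-8' leaves U+FEFF attached to the first field...
bad_first_field = raw.decode("utf-8").split(",")[0]
print(repr(bad_first_field))  # '\ufeff115'

# ...while 'utf-8-sig' strips it:
f = io.StringIO(raw.decode("utf-8-sig"))
a, b, c = map(int, f.readline().split(","))
print(a, b, c)  # 115 113 12

# And it is harmless when no BOM is present:
no_bom = b"115,113,12\n"
a2, b2, c2 = map(int, no_bom.decode("utf-8-sig").split(","))
print(a2, b2, c2)  # 115 113 12
```

With a real file, the same effect comes from open(path, encoding="utf-8-sig").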

Web server response generates UTF-8 (BOM) JSON

Submitted by 僤鯓⒐⒋嵵緔 on 2019-12-20 05:55:05
Question: I have a ZF2 application with a method which returns a JSON-formatted array using:

    $response->setContent(json_encode($reponse));
    return $response;

The request is sent via Ajax using jQuery 1.10.2, and when I intercept the response body (using developer tools or Fiddler) I can see on http://jsonlint.com/ that the JSON is not valid. As a result, my Ajax success callback is triggered with IE8, but with more recent browsers such as Firefox or Chrome it goes directly to the error…
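A likely culprit with symptoms like these is a BOM emitted before the JSON payload, often from a PHP source file saved as "UTF-8 with BOM". Strict JSON parsers reject a leading U+FEFF, which is why validators flag the response as invalid. A Python sketch of the symptom and a defensive consumer-side fix (the real fix is to re-save the PHP files without a BOM):

```python
import json

payload = '\ufeff{"status": "ok"}'  # JSON text with a leading BOM character

# A strict parser rejects the BOM...
try:
    json.loads(payload)
    parse_failed = False
except json.JSONDecodeError:
    parse_failed = True
print(parse_failed)  # True

# ...but stripping it first makes the payload valid again:
data = json.loads(payload.lstrip("\ufeff"))
print(data)  # {'status': 'ok'}
```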

PHP Streaming CSV always adds UTF-8 BOM

Submitted by 别来无恙 on 2019-12-20 03:23:39
Question: The following code gets a 'report line' as an array and uses fputcsv to transform it into CSV. Everything is working great except that, regardless of the charset I use, it puts a UTF-8 BOM at the beginning of the file. This is exceptionally annoying because (a) I am specifying ISO, and (b) we have lots of users using tools that show the UTF-8 BOM as garbage characters. I have even tried writing the results to a string, stripping the UTF-8 BOM, and then echoing it out, and…
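For comparison, a CSV writer by itself typically adds no BOM; when one appears, it is usually injected by an earlier layer (for PHP, commonly a BOM in an included source file before any output buffering). A Python sketch showing a BOM-free CSV buffer plus a defensive strip of the output, loosely mirroring the streaming setup in the question:

```python
import csv
import io

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["id", "name"])
writer.writerow([1, "Pilar"])

out = buf.getvalue()
# The csv module itself never prepends a BOM:
print(out.startswith("\ufeff"))  # False

# Defensive strip in case some earlier layer prepended one anyway:
out = out.lstrip("\ufeff")
print(out.splitlines()[0])  # id,name
```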

Modify a JSON file with PowerShell without writing BOM

Submitted by 泄露秘密 on 2019-12-20 02:42:23
Question: I need to modify an existing UTF-8 encoded JSON file with PowerShell. I tried the following code:

    $fileContent = ConvertFrom-Json "$(Get-Content $filePath -Encoding UTF8)"
    $fileContent.someProperty = "someValue"
    $fileContent | ConvertTo-Json -Depth 999 | Out-File $filePath

This adds a BOM to the file and also encodes it in UTF-16. Is it possible to have ConvertFrom-Json and ConvertTo-Json not do the encoding / BOM?

Answer 1: This has nothing to do with ConvertTo-Json or ConvertFrom-Json.…
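The truncated answer points at the output side of the pipeline: in Windows PowerShell, Out-File's default encoding is UTF-16LE with a BOM, which explains both symptoms; the JSON cmdlets only deal in strings. For comparison, here is the same round-trip in Python, where the encoding is named explicitly on both read and write (the file path and property name are taken from the question):

```python
import json
import os
import tempfile

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "settings.json")

    # Seed a UTF-8 JSON file to modify.
    with open(path, "w", encoding="utf-8") as f:
        json.dump({"someProperty": "oldValue"}, f)

    # Read ('utf-8-sig' tolerates a BOM if one is present), modify,
    # then write back as plain UTF-8, which carries no BOM.
    with open(path, encoding="utf-8-sig") as f:
        content = json.load(f)
    content["someProperty"] = "someValue"
    with open(path, "w", encoding="utf-8") as f:
        json.dump(content, f, indent=2)

    with open(path, "rb") as f:
        raw = f.read()

print(raw.startswith(b"\xef\xbb\xbf"))   # False
print(json.loads(raw)["someProperty"])   # someValue
```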
