<0xEF> characters showing up in files. How do I remove them?

甜味超标 2020-11-29 18:08

I am compressing JavaScript files, and the compressor is complaining that my files have a  character in them.

How can I search for these characters so that I can remove them?
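
The character in question is the UTF-8 byte order mark, the three bytes EF BB BF. As a starting point, here is a minimal sketch for finding the affected files, assuming GNU grep and a bash shell (for the $'...' quoting):

    # list the .js files that contain the raw BOM bytes EF BB BF
    grep -rl --include='*.js' $'\xEF\xBB\xBF' .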

13 answers
  • 2020-11-29 18:25

    Using tail might be easier:

    tail --bytes=+4 filename > new_filename
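    # --bytes=+4 starts output at the 4th byte, skipping the 3-byte UTF-8 BOM (EF BB BF)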
    
  • 2020-11-29 18:26

    I suggest using the dos2unix tool; try running dos2unix ./thefile.js.

    If necessary, try something like this for multiple files:

    # run dos2unix on every regular file; -exec ... + handles filenames with spaces safely
    find . -type f -exec dos2unix {} +
    

    Regards.

  • 2020-11-29 18:27

    @tripleee's solution didn't work for me, but changing the file encoding to ASCII and then back to UTF-8 did the trick :-)
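
    A rough command-line equivalent, using a different technique (stripping the BOM bytes directly, assuming GNU sed for the \xHH escapes and in-place -i editing):

    # delete the 3-byte UTF-8 BOM from the start of the first line
    sed -i '1s/^\xEF\xBB\xBF//' thefile.js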

  • 2020-11-29 18:32

    I've used vimgrep for this:

    :vim "[\uFEFF]" *
    

    and also the normal Vim search command:

    /[\uFEFF]
    
  • 2020-11-29 18:33
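
    # -pi~ edits the listed files in place, keeping "~" backups; -CSD reads and writes
    # the streams as UTF-8, so the BOM decodes to the single character U+FEFF that the
    # substitution removes from the start of the line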
    perl -pi~ -CSD -e 's/^\x{FEFF}//' file1.js path/to/file2.js
    

    I would assume the tool will break if you have other UTF-8 in your files, but if not, perhaps this workaround can help you. (Untested ...)

    Edit: added the -CSD option, as per tchrist's comment.

  • 2020-11-29 18:34

    The 'file' command shows if the BOM is present:

    For example, 'file myfile.xml' displays: "XML 1.0 document, UTF-8 Unicode (with BOM) text, with very long lines, with CRLF line terminators"

    dos2unix will remove the BOM.
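
    Putting the two together (a sketch; myfile.js is a stand-in for your own file):

    file myfile.js       # reports "... UTF-8 Unicode (with BOM) text ..." when the BOM is present
    dos2unix myfile.js   # rewrites the file without the BOM (and converts CRLF line endings to LF)
    file myfile.js       # should no longer say "(with BOM)"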
