Get all lines containing a string in a huge text file - as fast as possible?

Backend · Unresolved · 5 answers · 856 views
伪装坚强ぢ 2021-02-05 09:04

In PowerShell, what is the fastest way to read a huge text file (about 200,000 lines / 30 MB) and return the last line (or all of the lines) containing a specific string?

5 Answers
  • 2021-02-05 09:20

    Have you tried using [System.IO.File]::ReadAllLines()? This method is more "raw" than the idiomatic PowerShell approach, since it calls directly into the Microsoft .NET Framework types. Note that ReadAllLines() requires a file path, and it returns an array of strings you can then filter:

    $Lines = [System.IO.File]::ReadAllLines('myfile.txt')
    $Lines | Where-Object { $_ -match 'my_string_pattern' }
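    Since PowerShell's `-match` operator, applied to an array, returns the matching elements, the last matching line can also be pulled out directly. A minimal sketch, assuming the file name and pattern from the answer above:

    ```powershell
    # Read every line into memory at once (fine for a ~30 MB file)
    $Lines = [System.IO.File]::ReadAllLines('myfile.txt')

    # -match on an array returns the subset of lines that match the pattern
    $matched = $Lines -match 'my_string_pattern'

    # Index -1 gives the last matching line (empty result if no match)
    $matched[-1]
    ```

    ($matched is used instead of $matches because $matches is an automatic variable in PowerShell.)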
    
  • 2021-02-05 09:34
    $reader = New-Object System.IO.StreamReader("myfile.txt")

    $lines = @()

    if ($reader -ne $null) {
        while (!$reader.EndOfStream) {
            $line = $reader.ReadLine()
            if ($line.Contains("my_string")) {
                $lines += $line
            }
        }
        $reader.Close()
    }

    $lines | Select-Object -Last 1
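    On the same .NET theme, [System.IO.File]::ReadLines() enumerates the file lazily instead of reading it all up front, which keeps memory usage flat on large files. A sketch, assuming the same file name and search string:

    ```powershell
    # ReadLines returns a lazy IEnumerable[string]; lines are read on demand
    $last = $null
    foreach ($line in [System.IO.File]::ReadLines('myfile.txt')) {
        if ($line.Contains('my_string')) { $last = $line }
    }
    $last   # last matching line, or $null if there was no match
    ```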
    
  • 2021-02-05 09:36

    Have you tried:

    gc myfile.txt | % { if($_ -match "my_string") {write-host $_}}
    

    Or, you can create a "grep"-like function:

    function grep($f,$s) {
        gc $f | % { if ($_ -match $s) { write-host $_ } }
    }
    

    Then you can just issue: grep myfile.txt "my_string"

  • 2021-02-05 09:38

    I wanted to extract the lines that contained "failed" and also write these lines to a new file; here is the full command for this:

    get-content log.txt -ReadCount 1000 |
        foreach { $_ -match "failed" } | Out-File C:\failes.txt
    
  • 2021-02-05 09:40

    Try this:

    get-content myfile.txt -ReadCount 1000 |
     foreach { $_ -match "my_string" }
    

    That will read your file in chunks of 1,000 records at a time and find the matches in each chunk. This gives you better performance because you aren't wasting a lot of CPU time on memory management, since there are only 1,000 lines at a time in the pipeline.
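    If only the last matching line is wanted, the same chunked approach can be combined with Select-Object. A sketch, assuming the file and string names used in this thread:

    ```powershell
    # Process the file 1,000 lines at a time; -match filters each chunk,
    # and Select-Object -Last 1 keeps only the final matching line.
    Get-Content myfile.txt -ReadCount 1000 |
        ForEach-Object { $_ -match "my_string" } |
        Select-Object -Last 1
    ```

    The built-in Select-String cmdlet covers the same ground, e.g. Select-String -Path myfile.txt -Pattern "my_string" | Select-Object -Last 1.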
