Question
These days I'm working on different scripts to delete lines from various log files. But there is still one file I can't handle, because its structure is a bit more complex than the other files I have purged. To give you an idea, here are some example lines from that log file:
[ 30-10-2017 16:38:07.62 | INFO | Some text
[ 30-10-2017 16:38:11.07 | INFO | Some text
[1];Erreur XXXX non-gérée : Some text.
Merci de communiquer some text :
- Some text again
- Identifiant : XXXXXXXX-1789-XXXXX-b41b-XXXX. >> Some text
[ 02-11-2017 16:38:11.10 | INFO | Some text
[ 02-11-2017 16:38:11.10 | INFO | Some text
[2];88852228343 / some text
[ 03-11-2017 16:38:11.10 | INFO | Some text
[ 03-11-2017 16:38:11.10 | INFO | Some text
Other text here
And other one
[ 04-11-2017 16:38:11.10 | INFO | Some text
[ 04-11-2017 16:38:11.10 | INFO | Some text
I have tried something, but I think it was not the right way to do it -> How to catch terminating error and still continue my PowerShell script
The file is 4 MB and my current code does not work because I go about it the wrong way: I first take a substring to extract the date and then compare it, but when I hit a line that does not start with a date I get an error and the script stops.
try
{
    (Get-Content $Fichier) |
        Where-Object { $_ } |
        Where-Object { [datetime]::ParseExact(([string]$_).Substring(2,19), $Format, $Culture) -ge (Get-Date).AddDays(-$Jours) } |
        Set-Content $Fichier
    LogMessage -Message "Fichier $Fichier purgé des toutes les lignes datant de plus de $Jours jours"
}
catch
{
    $ErrorMessage = $_.Exception.Message
    $FailedItem = $_.Exception.ItemName
    LogMessage -Message "$FailedItem - $ErrorMessage"
}
I think it would be better to read the file, delete each line, and then stop deleting the rest of the file once I find a date that is greater than or equal to (today - 20 days). But right now I can't find a proper way to achieve this in PowerShell.
Answer 1:
Your suggested approach of parsing each line until you find a date that's 20 days or less old would probably work. I'd do something like this:
# Placeholder for the dates we're about to parse
$DT = Get-Date
# Define the threshold once
$Limit = (Get-Date).AddDays(-$Jours)
# We'll use this flag to keep track of whether we've reached the limit or not
$Keep = $false
(Get-Content $Fichier | ForEach-Object {
    if ($Keep) {
        # The limit was already reached, keep every remaining line
        $_
    }
    elseif ($_.Length -ge 21 -and
            [datetime]::TryParseExact($_.Substring(2,19), $Format, $Culture, [System.Globalization.DateTimeStyles]::None, [ref]$DT) -and
            $DT -ge $Limit) {
        # First line at or after the limit: keep it and flip the flag
        $Keep = $true
        $_
    }
    # Any other line is dropped (no output)
}) | Set-Content $Fichier
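Note that TryParseExact returns $false on lines that don't start with a date instead of throwing, which is what avoids the terminating error from the original script. The snippet also assumes $Fichier, $Format, $Culture and $Jours are already defined earlier in your script; since the question doesn't show them, the values below are only an assumption that matches the sample lines (the 19 characters extracted by Substring(2,19) look like dd-MM-yyyy HH:mm:ss):
# Assumed values only - adjust to your environment
$Fichier = 'C:\Logs\application.log'                                # hypothetical path
$Format  = 'dd-MM-yyyy HH:mm:ss'                                    # matches "30-10-2017 16:38:07"
$Culture = [System.Globalization.CultureInfo]::InvariantCulture
$Jours   = 20                                                       # keep the last 20 days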
Answer 2:
Another idea could be to generate a separate file for each day, i.e. start a new log file for the current day.
Exporting simultaneously to a log window and to a file can protect you from log loss and also lets you keep the log at a reasonable size without stressing the CPU.
I ran into the same problem with logs, especially when using RTF and replacing certain strings with icons to beef up the log and make it easy to export.
There are several ways to generate logs, but for my needs I chose to write one log file per day and purge the log window once the day switches over (00:00:00), as sketched below.
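A minimal sketch of that per-day idea, assuming a log directory and file-name pattern of my own choosing (none of these names come from the original answer):
# Append to a file named after the current date; a new file starts automatically after midnight
$LogDir  = 'C:\Logs'                                                  # hypothetical directory
$LogFile = Join-Path $LogDir ("app_{0:yyyy-MM-dd}.log" -f (Get-Date))
"[ {0:dd-MM-yyyy HH:mm:ss.ff} | INFO | Some text" -f (Get-Date) | Add-Content $LogFile

# Purging then means deleting whole files instead of rewriting one big file line by line
Get-ChildItem $LogDir -Filter 'app_*.log' |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-$Jours) } |
    Remove-Item
With per-day files the purge never has to parse timestamps inside the file at all.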
Source: https://stackoverflow.com/questions/48903904/purge-with-powershell-a-big-log-file-by-deleting-line-by-line-and-stopping-it-wh