[powershell] How to print a certain line of a file with PowerShell?

I don't have a decent text editor on this server, but I need to see what's causing an error on line 10 of a certain file. I do have PowerShell though...

This question is related to powershell

The answer is


You can use the -TotalCount parameter of the Get-Content cmdlet to read the first n lines, then use Select-Object to return only the nth line:

Get-Content file.txt -TotalCount 9 | Select-Object -Last 1;

Per the comment from @C.B. this should improve performance by only reading up to and including the nth line, rather than the entire file. Note that you can use the aliases -First or -Head in place of -TotalCount.
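
For example, to see line 10 from the question (a sketch; the -First alias is equivalent to -TotalCount):

Get-Content file.txt -First 10 | Select-Object -Last 1;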


Here's a function that uses .NET's System.IO classes directly:

function GetLineAt([String] $path, [Int32] $index)
{
    [System.IO.FileMode] $mode = [System.IO.FileMode]::Open;
    [System.IO.FileAccess] $access = [System.IO.FileAccess]::Read;
    [System.IO.FileShare] $share = [System.IO.FileShare]::Read;
    [Int32] $bufferSize = 16 * 1024;
    [System.IO.FileOptions] $options = [System.IO.FileOptions]::SequentialScan;
    [System.Text.Encoding] $defaultEncoding = [System.Text.Encoding]::UTF8;
    # FileStream(String, FileMode, FileAccess, FileShare, Int32, FileOptions) constructor
    # http://msdn.microsoft.com/library/d0y914c5.aspx
    [System.IO.FileStream] $stream = New-Object `
        -TypeName 'System.IO.FileStream' `
        -ArgumentList ($path, $mode, $access, $share, $bufferSize, $options);
    # StreamReader(Stream, Encoding, Boolean, Int32) constructor
    # http://msdn.microsoft.com/library/ms143458.aspx
    [System.IO.StreamReader] $reader = New-Object `
        -TypeName 'System.IO.StreamReader' `
        -ArgumentList ($stream, $defaultEncoding, $true, $bufferSize);
    # Left untyped so it can hold the $null that ReadLine() returns at end of stream
    # (a [String] constraint would coerce $null to '')
    $line = $null;
    [Int32] $currentIndex = 0;

    try
    {
        while (($line = $reader.ReadLine()) -ne $null)
        {
            if ($currentIndex++ -eq $index)
            {
                return $line;
            }
        }
    }
    finally
    {
        # Closing $reader also closes the underlying $stream
        $reader.Close();
    }

    # The file has fewer than ($index + 1) lines
    return $null;
}

GetLineAt 'file.txt' 9;
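
Note that $index is zero-based here, so GetLineAt 'file.txt' 9 returns the tenth line of the file.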

Tweaking the $bufferSize variable might affect performance. A more concise version that uses default buffer sizes and doesn't provide optimization hints could look like this:

function GetLineAt([String] $path, [Int32] $index)
{
    # StreamReader(String, Boolean) constructor
    # http://msdn.microsoft.com/library/9y86s1a9.aspx
    [System.IO.StreamReader] $reader = New-Object `
        -TypeName 'System.IO.StreamReader' `
        -ArgumentList ($path, $true);
    $line = $null; # untyped so the end-of-stream $null from ReadLine() is preserved
    [Int32] $currentIndex = 0;

    try
    {
        while (($line = $reader.ReadLine()) -ne $null)
        {
            if ($currentIndex++ -eq $index)
            {
                return $line;
            }
        }
    }
    finally
    {
        $reader.Close();
    }

    # The file has fewer than ($index + 1) lines
    return $null;
}

GetLineAt 'file.txt' 9;
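
Whether the raw .NET reader actually beats the pipeline depends on the file and the disk, so it is worth a rough timing on your own machine. A sketch, assuming the GetLineAt definition above and an existing file.txt, that retrieves the tenth line both ways:

(Measure-Command { GetLineAt 'file.txt' 9 }).TotalMilliseconds
(Measure-Command { Get-Content file.txt -TotalCount 10 | Select-Object -Last 1 }).TotalMilliseconds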

Just for fun, here are some tests:

# Added this for @Graimer's request ;) (not the same computer, but one with a somewhat faster hard drive...)

> measure-command { Get-Content ita\ita.txt -TotalCount 260000 | Select-Object -Last 1 }

Days              : 0
Hours             : 0
Minutes           : 0
Seconds           : 28
Milliseconds      : 893
Ticks             : 288932649
TotalDays         : 0,000334412788194444
TotalHours        : 0,00802590691666667
TotalMinutes      : 0,481554415
TotalSeconds      : 28,8932649
TotalMilliseconds : 28893,2649


> measure-command { (gc "c:\ps\ita\ita.txt")[260000] }


Days              : 0
Hours             : 0
Minutes           : 0
Seconds           : 9
Milliseconds      : 257
Ticks             : 92572893
TotalDays         : 0,000107144552083333
TotalHours        : 0,00257146925
TotalMinutes      : 0,154288155
TotalSeconds      : 9,2572893
TotalMilliseconds : 9257,2893


> measure-command { ([System.IO.File]::ReadAllLines("c:\ps\ita\ita.txt"))[260000] }


Days              : 0
Hours             : 0
Minutes           : 0
Seconds           : 0
Milliseconds      : 234
Ticks             : 2348059
TotalDays         : 2,71766087962963E-06
TotalHours        : 6,52238611111111E-05
TotalMinutes      : 0,00391343166666667
TotalSeconds      : 0,2348059
TotalMilliseconds : 234,8059



> measure-command {get-content .\ita\ita.txt | select -index 260000}


Days              : 0
Hours             : 0
Minutes           : 0
Seconds           : 36
Milliseconds      : 591
Ticks             : 365912596
TotalDays         : 0,000423509949074074
TotalHours        : 0,0101642387777778
TotalMinutes      : 0,609854326666667
TotalSeconds      : 36,5912596
TotalMilliseconds : 36591,2596

The winner is: ([System.IO.File]::ReadAllLines( path ))[index]
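
If you want to reuse the winning approach, a minimal wrapper could look like this (Get-LineFast is just an illustrative name; note that ReadAllLines loads the whole file into memory and resolves relative paths against the process working directory rather than the current PowerShell location):

function Get-LineFast([String] $path, [Int32] $index)
{
    # Read every line into an array, then index into it (zero-based)
    [String[]] $lines = [System.IO.File]::ReadAllLines($path);
    if ($index -lt $lines.Length) { return $lines[$index]; }
    return $null;
}

Get-LineFast 'c:\ps\ita\ita.txt' 260000;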


To reduce memory consumption and to speed up the search, you can use the -ReadCount parameter of the Get-Content cmdlet (https://technet.microsoft.com/ru-ru/library/hh849787.aspx).

This can save hours when you're working with large files.

Here is an example:

$n = 60699010
$src = 'hugefile.csv'
$batch = 100
$timer = [Diagnostics.Stopwatch]::StartNew()

$count = 0
Get-Content $src -ReadCount $batch -TotalCount $n | % {
    # $_ is an array of up to $batch lines; track how many lines have been seen so far
    $count += $_.Length
    if ($count -ge $n) {
        # The $n'th line falls inside this batch; index into it (zero-based)
        $_[($n - $count + $_.Length - 1)]
    }
}

$timer.Stop()
$timer.Elapsed

This prints the $n'th line and the elapsed time.
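
Because -ReadCount $batch sends arrays of $batch lines down the pipeline instead of one object per line, far fewer objects cross the pipeline, and -TotalCount $n stops reading as soon as line $n has been delivered, so the rest of the file is never touched.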


It's as easy as using Select-Object with -Index, which is zero-based (i.e. the line number minus one):

Get-Content file.txt | Select-Object -Index (line - 1)

E.g. to get line 5:

Get-Content file.txt | Select-Object -Index 4

Or you can use:

(Get-Content file.txt)[4]
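
Keep in mind that both forms read the entire file into memory before indexing. On the plus side, plain array indexing also gives you negative indices and ranges:

(Get-Content file.txt)[-1]    # last line
(Get-Content file.txt)[4..6]  # lines 5 through 7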