Excel does not correctly render CSV files created by redirecting the standard output of PowerShell or the Command Prompt
You have a PowerShell script that prints a comma-separated values (CSV) file. For example:
Write-Output "L0,L1,L2,L3,L4,Sizes"
Write-Output "A,AA,AAA,AAAA,AAAAA,10"
Write-Output "B,BB,BBB,BBBB,BBBBB,10"
Write-Output "C,CC,CCC,CCCC,CCCCC,10"
Write-Output "D,DD,DDD,DDDD,DDDDD,10"
It is common to print the script's output directly to the console so we can debug it as we work. Once the script is ready, we can simply redirect the output into a file, like this:
.\Make-Csv.ps1 > .\Book5.csv
Bad news! When you open the CSV file in Excel, it is not displayed correctly: each row appears in a single column, with the whole line in one cell, instead of being split on every comma into multiple columns. The reason is that the redirection operator encoded the file as UCS-2 Little Endian (UTF-16 LE), an encoding Excel does not treat as a delimited file. Why does Excel do this? Who knows! But there are two solutions to this problem.
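You can see this for yourself by inspecting the first bytes of the file (its byte order mark). A minimal sketch, assuming the Book5.csv file created above:
# A UTF-16 LE (UCS-2 LE) file starts with the byte order mark FF FE.
# -AsByteStream requires PowerShell 6+; on Windows PowerShell 5.1 use -Encoding Byte instead.
Get-Content .\Book5.csv -AsByteStream -TotalCount 2 | ForEach-Object { '{0:X2}' -f $_ }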
The first solution is to add Write-Output "sep=,"
at the beginning of the script. The sep=, directive tells Excel which character to treat as the column separator.
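For example, the start of the script would change like this (a sketch; the remaining data rows stay the same):
# sep=, is recognized only by Excel; it declares the column separator.
Write-Output "sep=,"
Write-Output "L0,L1,L2,L3,L4,Sizes"
Write-Output "A,AA,AAA,AAAA,AAAAA,10"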
The second solution is to encode the file as UTF-8. For example:
.\Make-Csv.ps1 | Out-File .\Book6.csv -Encoding utf8
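This mainly matters in Windows PowerShell 5.1, where > writes UTF-16 LE; in PowerShell 7 and later, redirection already defaults to UTF-8 without a byte order mark. If your script builds objects rather than pre-formatted strings, Export-Csv is another way to get a UTF-8 file; a minimal sketch, where Book7.csv is a hypothetical output name:
# Emit an object per row and let Export-Csv handle quoting and the encoding.
# -NoTypeInformation suppresses the #TYPE header that Windows PowerShell 5.1 adds.
[PSCustomObject]@{ L0 = 'A'; L1 = 'AA'; L2 = 'AAA'; L3 = 'AAAA'; L4 = 'AAAAA'; Sizes = 10 } |
    Export-Csv .\Book7.csv -Encoding utf8 -NoTypeInformation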
The second option requires a little more work, but it has an advantage: avoiding the line that contains the sep=,
string makes the CSV file compatible with more tools. For example, Power BI does not ignore the sep=,
row the way Excel does, so omitting the line is the better option if you plan to use both tools!