I am bulk copying records from a CSV file to a SQL table. The SQL table has columns that are varchar, and columns that are of the real data type (based on the CSV attributes we are given).
Out-DataTable is inspecting the properties of the first input object...
foreach($property in $object.PsObject.get_properties())
{
if ($first)
{
...to determine the DataType of the corresponding DataColumn
...
if ($property.value -isnot [System.DBNull]) {
$Col.DataType = [System.Type]::GetType("$(Get-Type $property.TypeNameOfValue)")
}
The problem is, the input objects are produced by Import-Csv
...
$CSVDataTable = Import-Csv $csvFile | Out-DataTable
...which doesn't do any conversion of the CSV fields; every property will be of type [String] and, therefore, every DataColumn will be, too.
The .NET equivalent of real is Single, so you either need to hard-code which columns (by name or ordinal) should be of type [Single]...
$objectProperties = @($object.PSObject.Properties)
for ($propertyIndex = 0; $propertyIndex -lt $objectProperties.Length; $propertyIndex++)
{
$property = $objectProperties[$propertyIndex]
if ($propertyIndex -lt 7) {
$columnDataType = [String]
$itemValue = $property.Value
}
else {
$columnDataType = [Single]
$itemValue = if ($property.Value -match '^\s*-\s*$') {
[Single] 0
} else {
[Single]::Parse($property.Value, 'Float, AllowThousands, AllowParentheses')
}
}
if ($first)
{
$Col = new-object Data.DataColumn
$Col.ColumnName = $property.Name
$Col.DataType = $columnDataType
$DT.Columns.Add($Col)
}
$DR.Item($property.Name) = $itemValue
}
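As an aside, the 'Float, AllowThousands, AllowParentheses' style string used above lets the parse accept formatted numeric text; a quick illustration (the literals here are just examples):

```powershell
$style = [System.Globalization.NumberStyles] 'Float, AllowThousands, AllowParentheses'
$culture = [System.Globalization.CultureInfo]::InvariantCulture

# Thousands separators are accepted...
[Single]::Parse('1,234.5', $style, $culture)   # 1234.5
# ...and parentheses denote a negative value (common in financial exports)
[Single]::Parse('(42)', $style, $culture)      # -42
```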
...or augment your detection logic...
foreach($property in $object.PSObject.Properties)
{
$singleValue = $null
$isSingle = [Single]::TryParse($property.Value, [ref] $singleValue)
if ($first)
{
$Col = new-object Data.DataColumn
$Col.ColumnName = $property.Name
$Col.DataType = if ($isSingle) {
[Single]
} else {
[String]
}
$DT.Columns.Add($Col)
}
$DR.Item($property.Name) = if ($isSingle) {
$singleValue
} else {
$property.value
}
}
To comply with the column DataType, this code substitutes the [Single] value for the original property [String] value when parsing succeeds. Note that I've removed the checks for [DBNull] and IsArray because they would never evaluate to $true since, again, Import-Csv will only produce [String] properties.
The above assumes that if a property's value from the first input object can be parsed as a [Single], then the same is true for every input object. If that's not guaranteed, you can do one pass through all input objects to determine the appropriate column types and a second pass to load the data...
function Out-DataTable
{
End
{
$InputObject = @($input)
$numberStyle = [System.Globalization.NumberStyles] 'Float, AllowThousands, AllowParentheses'
$DT = New-Object Data.DataTable
foreach ($propertyName in $InputObject[0].PSObject.Properties.Name)
{
$columnDataType = [Single]
foreach ($object in $InputObject)
{
$singleValue = $null
$propertyValue = $object.$propertyName
if ($propertyValue -notmatch '^\s*-?\s*$' `
-and -not [Single]::TryParse($propertyValue, $numberStyle, $null, [ref] $singleValue))
{
# Default to [String] if not all values can be parsed as [Single]
$columnDataType = [String]
break
}
}
$Col = new-object Data.DataColumn
$Col.ColumnName = $propertyName
$Col.DataType = $columnDataType
$DT.Columns.Add($Col)
}
foreach ($object in $InputObject)
{
$DR = $DT.NewRow()
foreach($property in $object.PSObject.Properties)
{
$DR.Item($property.Name) = if ($DT.Columns[$property.Name].DataType -eq [Single]) {
if ($property.Value -match '^\s*-?\s*$') {
[Single] 0
} else {
[Single]::Parse($property.Value, $numberStyle)
}
} else {
$property.value
}
}
$DT.Rows.Add($DR)
}
# The unary comma wraps the DataTable in a one-element array so it is not enumerated into its rows
Write-Output @(,($DT))
}
} #Out-DataTable
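Tying this back to the question, the resulting DataTable can then be handed to SqlBulkCopy. A hedged sketch, assuming a Windows PowerShell session where System.Data.SqlClient is available; the connection string, destination table name, and $csvFile path are placeholders you'd substitute:

```powershell
# Placeholders: adjust the connection string, destination table, and CSV path
$connectionString = 'Server=.;Database=MyDb;Integrated Security=True'
$CSVDataTable = Import-Csv $csvFile | Out-DataTable

$bulkCopy = New-Object Data.SqlClient.SqlBulkCopy($connectionString)
try {
    $bulkCopy.DestinationTableName = 'dbo.MyTable'
    # Map columns by name so the CSV column order need not match the table
    foreach ($column in $CSVDataTable.Columns) {
        [void] $bulkCopy.ColumnMappings.Add($column.ColumnName, $column.ColumnName)
    }
    $bulkCopy.WriteToServer($CSVDataTable)
}
finally {
    $bulkCopy.Dispose()
}
```

Because the two-pass Out-DataTable above types the numeric columns as [Single], they will line up with the real columns in the destination table without further conversion.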
I had a similar challenge after importing XML data via .ReadXml(), because the XML included empty strings instead of DBNull. I ran a lot of tests to get this converted as fast as possible, and for me this worked best: