SqlMapper.Settings.cs: New option "StrictBinding"
HugoRoss opened this issue · 3 comments
I would like to see a new option bool StrictBinding (or similar) in SqlMapper.Settings.cs that, when set to true, solves the following issues (a rough sketch of the proposed option follows after the examples):
Issue 1: Null assignments to non-nullable (value) types
Currently, the following applies:
int i = connection.ExecuteScalar<int>("select cast(null as int)");
Assert.Equal(0, i);
which is not the behavior I was hoping for. I would have liked Dapper to slap an exception in my face, saying "Are you OK? You are attempting to assign a null value to an Int32!" Or, less prosaically:
Action a = () => connection.ExecuteScalar<int>("select cast(null as int)");
Assert.ThrowsAny<Exception>(a);
Reason: I made a mistake; the data type in my code does not match the data type returned by my query, and I should have used one of the following options instead:
int? i = connection.ExecuteScalar<int?>("select cast(null as int)");
Assert.False(i.HasValue);
or
int i = connection.ExecuteScalar<int?>("select cast(null as int)") ?? 0;
Assert.Equal(0, i);
Issue 2: Narrowing casts (e.g. Int64 -> Int32)
Currently, the following applies:
int i = connection.ExecuteScalar<int>("select cast(123 as bigint)");
Assert.Equal(123, i);
which looks fine at first glance (why not cast it, since the value fits into the range), but no, again I would prefer that Dapper slaps an exception in my face, saying "Are you OK? You are attempting to assign an Int64 value to an Int32!" Or, less prosaically:
Action a = () => connection.ExecuteScalar<int>("select cast(123 as bigint)");
Assert.ThrowsAny<Exception>(a);
Reason: I made a mistake; it is not safe to cast an Int64 into an Int32. That could be a sleeping bug that strikes once my code has long been in production, e.g. when the record id exceeds Int32.MaxValue. Again, the data type in my code does not match the data type returned by my query, and I should have used one of the following options instead:
long i = connection.ExecuteScalar<long>("select cast(123 as bigint)");
Assert.Equal(123L, i);
or
int i = (int)connection.ExecuteScalar<long>("select cast(123 as bigint)");
Assert.Equal(123, i);
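To make the proposal a bit more concrete, here is roughly what I picture the option to look like. This is only a sketch; the name StrictBinding and where exactly it lives are of course open for discussion, and nothing like this exists in Dapper today:
// Sketch of the proposed addition to SqlMapper.Settings.cs (not an existing API).
public static partial class SqlMapper
{
    public static class Settings
    {
        /// <summary>
        /// When true, mapping NULL to a non-nullable value type or performing a
        /// narrowing conversion (e.g. Int64 -> Int32) throws instead of succeeding silently.
        /// </summary>
        public static bool StrictBinding { get; set; } // default false, to stay backwards compatible

        // ... existing settings ...
    }
}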
What do you think?
Hello, at a glance your issue looks like a legitimate concern. However, the type systems of SQL Server and C# are related but distinct, so the connection library can only take a stab at converting the value to the requested form. If you look at the implementation of the ExecuteScalar<T> method you will see that the method below is called to parse the value into the correct form. Notice that T is what you, as the caller, passed into ExecuteScalar<T>. As you can see, it just (rightfully so) attempts to parse the value. Take a look at it and see:
private static T Parse<T>(object? value)
{
    // NULL or DBNull silently becomes default(T): 0 for int, null for reference types.
    if (value is null || value is DBNull) return default!;
    // If the runtime type already matches T, hand it back untouched.
    if (value is T t) return t;
    var type = typeof(T);
    type = Nullable.GetUnderlyingType(type) ?? type;
    if (type.IsEnum)
    {
        if (value is float || value is double || value is decimal)
        {
            value = Convert.ChangeType(value, Enum.GetUnderlyingType(type), CultureInfo.InvariantCulture);
        }
        return (T)Enum.ToObject(type, value);
    }
    if (typeHandlers.TryGetValue(type, out ITypeHandler? handler))
    {
        // A custom type handler registered for the target type takes precedence.
        return (T)handler.Parse(type, value)!;
    }
    // Last resort: Convert.ChangeType, which happily narrows e.g. Int64 -> Int32.
    return (T)Convert.ChangeType(value, type, CultureInfo.InvariantCulture);
}
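To see why that leads to the two behaviours you describe, here is a small standalone illustration; this is plain .NET, not Dapper code:
using System;
using System.Globalization;

object fromDb = 123L; // value read from a bigint column
// Convert.ChangeType happily performs the narrowing Int64 -> Int32 conversion:
int narrowed = (int)Convert.ChangeType(fromDb, typeof(int), CultureInfo.InvariantCulture);
Console.WriteLine(narrowed);     // prints 123, no exception
// And NULL / DBNull short-circuits to default(T), which is 0 for int:
Console.WriteLine(default(int)); // prints 0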
That code shows me that it is relatively easy to implement:
private static T ParseStrict<T>(object? value)
{
    Type targetType = typeof(T);
    if (value is null || value is DBNull)
    {
        if (IsReferenceType(targetType) || IsNullableValueType(targetType))
        {
            return default!;
        }
        throw new InvalidCastException($"Unable to cast value NULL to a '{targetType.FullName}'!");
    }
    if (value is T t) return t;
    targetType = Nullable.GetUnderlyingType(targetType) ?? targetType;
    Type sourceType = value.GetType();
    if (targetType.IsEnum)
    {
        Type underlyingEnumType = Enum.GetUnderlyingType(targetType);
        if (IsWideningConversion(sourceType, underlyingEnumType))
        {
            return (T)Enum.ToObject(targetType, value);
        }
    }
    else
    {
        if (IsWideningConversion(sourceType, targetType))
        {
            if (typeHandlers.TryGetValue(targetType, out ITypeHandler? handler))
            {
                return (T)handler.Parse(targetType, value)!;
            }
            return (T)Convert.ChangeType(value, targetType, CultureInfo.InvariantCulture);
        }
    }
    throw new InvalidCastException($"Unable to cast a value of type '{sourceType.FullName}' to '{targetType.FullName}'!");
}
The only thing left to do is to define the method IsWideningConversion, but I cannot imagine that this would cause any major difficulties...
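For completeness, here is a rough sketch of the helpers left undefined above (IsReferenceType, IsNullableValueType and IsWideningConversion). Which conversions count as "widening" is of course a design decision that would need to be agreed on; the table below simply lists the lossless numeric widenings:
private static bool IsReferenceType(Type type) => !type.IsValueType;

private static bool IsNullableValueType(Type type) =>
    Nullable.GetUnderlyingType(type) is not null;

// Lossless numeric widenings only (deliberately excludes e.g. long -> double);
// requires System.Collections.Generic for the dictionary.
private static readonly Dictionary<Type, Type[]> WideningTargets = new()
{
    [typeof(sbyte)]  = new[] { typeof(short), typeof(int), typeof(long), typeof(float), typeof(double), typeof(decimal) },
    [typeof(byte)]   = new[] { typeof(short), typeof(ushort), typeof(int), typeof(uint), typeof(long), typeof(ulong), typeof(float), typeof(double), typeof(decimal) },
    [typeof(short)]  = new[] { typeof(int), typeof(long), typeof(float), typeof(double), typeof(decimal) },
    [typeof(ushort)] = new[] { typeof(int), typeof(uint), typeof(long), typeof(ulong), typeof(float), typeof(double), typeof(decimal) },
    [typeof(int)]    = new[] { typeof(long), typeof(double), typeof(decimal) },
    [typeof(uint)]   = new[] { typeof(long), typeof(ulong), typeof(double), typeof(decimal) },
    [typeof(long)]   = new[] { typeof(decimal) },
    [typeof(ulong)]  = new[] { typeof(decimal) },
    [typeof(float)]  = new[] { typeof(double) },
};

private static bool IsWideningConversion(Type sourceType, Type targetType)
{
    if (sourceType == targetType) return true; // identical types are trivially safe
    return WideningTargets.TryGetValue(sourceType, out Type[]? targets)
        && Array.IndexOf(targets, targetType) >= 0;
}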
Problem solved. Why don't you go ahead and implement this yourself? This is the open-source world, after all.
Create a pull request and send it in. If you are able to implement it in a way that:
- it works
- it doesn't break other existing functionality
- it is an extensible way of achieving this functionality
- it is in line with what Dapper-like ORMs do
- it has good coding patterns implemented
I am sure there won't be any issues and your request will be approved.