What does null! statement mean?



























I've recently seen the following code:



public class Person
{
    // line 1
    public string FirstName { get; }
    // line 2
    public string LastName { get; } = null!;
    // assigning null is possible
    public string? MiddleName { get; } = null;

    public Person(string firstName, string lastName, string middleName)
    {
        FirstName = firstName;
        LastName = lastName;
        MiddleName = middleName;
    }

    public Person(string firstName, string lastName)
    {
        FirstName = firstName;
        LastName = lastName;
        MiddleName = null;
    }
}


Basically I'm trying to dig into the new C# 8 features. One of them is nullable reference types. There are already a lot of articles and information about it; e.g. this article is quite good.
But I didn't find any information about this new expression null!.
Can someone explain it to me? Why would I need to use it?
And what is the difference between line 1 and line 2?




























  • ! is the null-forgiving operator, telling the compiler that, even though it normally wouldn't allow it, it should look the other way and allow it anyway, because we know better. null! itself has little practical use, as it all but negates the usefulness of nullable reference types. It's more useful when you know an expression can't be null, but the compiler doesn't. – Jeroen Mostert, Feb 16 at 14:58

  • @JeroenMostert So something like force? I mean, even if it is unusual, let's do it anyway, forced. – isxaker, Feb 16 at 15:06

  • Yes, except it's more than unusual -- because string, under the new rules, is not a nullable reference type, and so should never be null. Assigning null! effectively says "I know this should never be null, but guess what, I'm doing it anyway". There's almost no program where that would make sense -- the only reason to do it would be because you know you're going to assign a non-null value before anyone could get a NullReferenceException, and want to signal that you haven't forgotten to assign it. Possible, but unlikely, so not very good as an example. – Jeroen Mostert, Feb 16 at 15:26


















c# .net c#-8.0 nullablereferencetypes






asked Feb 16 at 14:53









isxaker








2 Answers






































What is the ! operator when applied to an expression?



The ! operator, when applied to an expression, is called the null-forgiving operator [docs]. It was introduced in C# 8.0.





Technical Explanation



Typical usage



Assuming this definition:



class Person
{
    public string? MiddleName;
}


The usage would be:



void LogPerson(Person person)
{
    Console.WriteLine(person.MiddleName.Length);  // WARNING: may be null
    Console.WriteLine(person.MiddleName!.Length); // No warning
}


This operator basically turns off the compiler's null checks for the expression it is applied to.



Inner workings



Using this operator tells the compiler that something that could be null is safe to access. You express the intent to "not care" about null safety in this instance.



When talking about null-safety, a variable can be in one of two states:




  • Nullable - can be null.

  • Non-nullable - cannot be null.


Since C# 8.0, all reference types are non-nullable by default.
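Note that this default only applies where the feature is opted in, e.g. with a #nullable enable directive or the project-level Nullable setting. A minimal sketch (the class and method names here are made up for illustration):

```csharp
#nullable enable
using System;

public static class NullabilityDefaults
{
    // A plain reference type is non-nullable by default under #nullable enable.
    public static int NameLength()
    {
        string name = "Ada";   // assigning null here would produce a compiler warning
        return name.Length;
    }

    // The ? annotation opts the variable back into nullability.
    public static bool MaybeIsNull()
    {
        string? maybe = null;  // no warning
        return maybe == null;
    }

    public static void Main()
    {
        Console.WriteLine(NameLength());  // 3
        Console.WriteLine(MaybeIsNull()); // True
    }
}
```

Keep in mind these warnings are purely compile-time; the generated code is the same either way.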



The "nullability" can be modified by these 2 new type-operators:





  • ! = From Nullable to Non-Nullable


  • ? = From Non-Nullable to Nullable


These operators are basically counterparts to one another.
The Compiler uses the information - you define with those operators - to ensure null-safety.




? Operator usage.




  1. Nullable: string? x;

     • x is a reference type, so by default non-nullable.
     • We apply the ? operator, which makes it nullable.
     • x = null works fine.


  2. Non-nullable: string y;

     • y is a reference type, so by default non-nullable.
     • y = null generates a warning, since you assign a null value to something that is not supposed to be null.





! Operator usage.



string x;
string? y = null;




  1. x = y

     • Illegal! Warning: "y" may be null.
     • The right side of the assignment is nullable, but the left side is non-nullable.


  2. x = y!

     • Legal!
     • Both sides of the assignment are non-nullable.
     • Works because y! applies the ! operator to y, which makes it non-nullable.





WARNING The ! operator only turns off the compiler-checks at a type-system level - At runtime, the value may still be null.




This is an Anti-Pattern.



You should try to avoid using the ! null-forgiving operator.



There are valid use-cases (outlined in detail below), like unit tests, where this operator is appropriate. But in 99% of cases, you are better off with an alternative solution. Please don't slap dozens of !'s into your code just to silence the warnings; think about whether your case really warrants the use.




Use it - but with care. If there is no concrete purpose or use-case, prefer not to use it.




It negates the null-safety guarantees you get from the compiler.



Using the ! operator can create very hard-to-find bugs. If you have a property that is marked non-nullable, you will assume you can use it safely. But at runtime, you suddenly run into a NullReferenceException and scratch your head, because the value actually became null after the compiler checks were bypassed with !.
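A minimal sketch of that failure mode, using a hypothetical Person class that mirrors the question's code: the declaration promises "never null", yet the value really is null at runtime.

```csharp
#nullable enable
using System;

public class Person
{
    // The compiler accepts this initializer without warning,
    // but the value really is null at runtime.
    public string LastName { get; set; } = null!;
}

public static class NullForgivingDemo
{
    // The declared type says "never null", yet this returns true.
    public static bool IsActuallyNull() => new Person().LastName == null;

    public static void Main()
    {
        Console.WriteLine(IsActuallyNull()); // True
        // Accessing new Person().LastName.Length here would throw
        // NullReferenceException, despite the non-nullable declaration.
    }
}
```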



Why does this operator exist then?




  • In some edge cases, the compiler is not able to detect that a nullable value is actually non-nullable.

  • Easier legacy code-base migration.

  • In some cases, you just don't care if something becomes null.

  • When working with Unit-tests you may want to check the behavior of code when a null comes through.
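The unit-test case from that list can be sketched like this (hypothetical Greeter method, not from the question): the test deliberately passes null to exercise the runtime argument check, and ! silences the compiler warning without changing behavior.

```csharp
#nullable enable
using System;

// Hypothetical method with runtime argument validation.
public static class Greeter
{
    public static string Greet(string name) =>
        name is null ? throw new ArgumentNullException(nameof(name))
                     : $"Hello, {name}!";
}

public static class GreeterTests
{
    // In a test we *want* to pass null to trigger the runtime check;
    // null! only suppresses the compile-time warning.
    public static bool NullArgumentThrows()
    {
        try
        {
            Greeter.Greet(null!);
            return false; // validation did not fire
        }
        catch (ArgumentNullException)
        {
            return true;  // validation fired as expected
        }
    }

    public static void Main() => Console.WriteLine(NullArgumentThrows()); // True
}
```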




Answering your question specifically.



So what does null! mean?



It tells the compiler that null is not a null value. Sounds weird, doesn't it?



It is the same as y! from the example above. It only looks weird since you apply the operator to the null literal. But the concept is the same.



Picking apart what is happening.



public string LastName { get; } = null!;


This line defines a non-nullable class property named LastName of type string.
Since it is non-nullable, you technically cannot assign null to it - obviously.



But you do just that by using the ! operator, because null! is not null as far as the compiler's null-safety analysis is concerned.
































  • A string could be null. You can assign a null value like this: string str = null; I'm using C# 6 / .NET 4.6.1. I think your statement is wrong: "This line defines a non-nullable class property named LastName of type string. Since it is non-nullable you can technically not assign null to it - obviously." – canbax, Feb 20 at 6:09

  • @canbax Nullability checks are only supported by C# 8 and above. – Patrick Hollweck, Feb 20 at 6:42

  • @canbax "a string could be null" - not any more. I mean, if you use C# 8 and enable the NullableReferenceTypes feature, VS immediately gives you a warning if you try to assign null to a string. On the other hand, you are able to introduce a nullable string (string? s = null). No warning in that case. – isxaker, Feb 20 at 10:10

  • "You should try to never use the ! Null-Forgiving-Operator. It negates the effects of null-safety you get guaranteed by the compiler." How are you going to write unit tests for argument validation? That's my most frequent use of ! in Noda Time. – Jon Skeet, 13 hours ago

  • I would definitely recommend editing later - currently anyone who is widely using the null-forgiving operator in their tests could easily feel bad about it given your current text, which doesn't give any indication of limitations to the advice. I think "never" is likely to be unachievable in several code bases - for example, in Noda Time we have an invariant in parse results that means I never expose a null value, but I still end up with one where the parse itself has failed. I think "try hard to avoid" has a more positive tone. Just my opinion though. – Jon Skeet, 10 hours ago

































When the "nullable reference types" feature is turned on, the compiler tracks which values in your code it thinks may be null or not. There are times where the compiler could have insufficient knowledge.



For example, you may be using a delayed initialization pattern, where the constructor doesn't initialize all the fields with actual (non-null) values, but you always call an initialization method which guarantees the fields are non-null. In such case, you face a trade-off:




  • if you mark the field as nullable, the compiler is happy, but you have to unnecessarily check for null whenever you use the field,

  • if you leave the field as non-nullable, the compiler will complain that it is not initialized by the constructors (you can suppress that warning with null!); the field can then be used without a null check.


Note that by using the ! suppression operator, you are taking on some risk. Imagine that you are not actually initializing all the fields as consistently as you thought. Then the use of null! to initialize a field covers up the fact that a null is slipping in. Some unsuspecting code can receive a null and therefore fail.
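The delayed-initialization trade-off above can be sketched as follows; the ReportBuilder class and its members are made up for illustration, assuming the convention that Initialize always runs before any use.

```csharp
#nullable enable
using System;

// Hypothetical delayed-initialization pattern: the constructor leaves the
// field unset; Initialize is guaranteed (by convention) to run before use.
public class ReportBuilder
{
    // The dummy null! initializer suppresses the compiler's
    // "non-nullable field is uninitialized" warning.
    private string _title = null!;

    public void Initialize(string title) => _title = title;

    // Safe only if the Initialize-before-use invariant actually holds.
    public string Render() => $"Report: {_title}";
}

public static class DelayedInitDemo
{
    public static void Main()
    {
        var builder = new ReportBuilder();
        builder.Initialize("Q1 sales");
        Console.WriteLine(builder.Render()); // Report: Q1 sales
    }
}
```

If Initialize is ever skipped, Render happily interpolates a null, which is exactly the "null slipping in" risk described above.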



More generally, you may have some domain knowledge: "if I checked a certain method, then I know that some value isn't null":



if (CheckEverythingIsReady())
{
    // you know that `field` is non-null, but the compiler doesn't; the suppression can help
    UseNonNullValueFromField(this.field!);
}


Again, you must be confident of your code's invariant to do this ("I know better").






share|improve this answer























    Your Answer






    StackExchange.ifUsing("editor", function () {
    StackExchange.using("externalEditor", function () {
    StackExchange.using("snippets", function () {
    StackExchange.snippets.init();
    });
    });
    }, "code-snippets");

    StackExchange.ready(function() {
    var channelOptions = {
    tags: "".split(" "),
    id: "1"
    };
    initTagRenderer("".split(" "), "".split(" "), channelOptions);

    StackExchange.using("externalEditor", function() {
    // Have to fire editor after snippets, if snippets enabled
    if (StackExchange.settings.snippets.snippetsEnabled) {
    StackExchange.using("snippets", function() {
    createEditor();
    });
    }
    else {
    createEditor();
    }
    });

    function createEditor() {
    StackExchange.prepareEditor({
    heartbeatType: 'answer',
    autoActivateHeartbeat: false,
    convertImagesToLinks: true,
    noModals: true,
    showLowRepImageUploadWarning: true,
    reputationToPostImages: 10,
    bindNavPrevention: true,
    postfix: "",
    imageUploader: {
    brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
    contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
    allowUrls: true
    },
    onDemand: true,
    discardSelector: ".discard-answer"
    ,immediatelyShowMarkdownHelp:true
    });


    }
    });














    draft saved

    draft discarded


















    StackExchange.ready(
    function () {
    StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fstackoverflow.com%2fquestions%2f54724304%2fwhat-does-null-statement-mean%23new-answer', 'question_page');
    }
    );

    Post as a guest















    Required, but never shown

























    2 Answers
    2






    active

    oldest

    votes








    2 Answers
    2






    active

    oldest

    votes









    active

    oldest

    votes






    active

    oldest

    votes









    37














    What is the ! operator when used on a type?



    The ! operator, when used on a type, is called the Null Forgiving Operator [docs]. It was introduced in C# 8.0





    Technical Explanation



    Typical usage



    Assuming this definition:



    class Person
    {
    public string? MiddleName;
    }


    The usage would be:



    void LogPerson(Person person)
    {
    Console.WriteLine(person.MiddleName.Length); // WARNING: may be null
    Console.WriteLine(person.MiddleName!.Length); // No warning
    }


    This operator basically turns off the compiler null checks.



    Inner workings



    Using this operator tells the compiler that something that could be null, is safe to be accessed. You express the intent to "not care" about null safety in this instance.



    There are 2 states a variable can be in - when talking about null-safety.




    • Nullable - Can be null.

    • Non-Nullable - Can not be null.


    Since C# 8.0 all reference types are Non-nullable by default.



    The "nullability" can be modified by these 2 new type-operators:





    • ! = From Nullable to Non-Nullable


    • ? = From Non-Nullable to Nullable


    These operators are basically counterparts to one another.
    The Compiler uses the information - you define with those operators - to ensure null-safety.




    ? Operator usage.





    1. Nullable string? x;





      • x is a reference type - So by default non-nullable.

      • We apply the ? operator - which makes it nullable.


      • x = null Works fine.




    2. Non-Nullable string y;





      • y is a reference type - So by default non-nullable.


      • x = null Generates a warning since you assign a null value to something that is not supposed to be null.





    ! Operator usage.



    string x;
    string? y = null;




    1. x = y




      • Illegal! - Warning: "y" may be null

      • The Right side of the assignment is non-nullable but the left side is nullable.




    2. x = y!




      • Legal!

      • The right and left side of the assignment is non-nullable.

      • Works since y! Applies the ! operator to y which makes it non-nullable.





    WARNING The ! operator only turns off the compiler-checks at a type-system level - At runtime, the value may still be null.




    This is an Anti-Pattern.



    You should try to avoid using the ! Null-Forgiving-Operator.



    There are valid use-cases ( outlined in detail below ) like unit-tests where this operator is appropriate to use. In 99% of the cases, you are better off with an alternative solution. Please do not slap dozens of !'s in your code, just to silence the warnings. Think if your cause really warrants the use.




    Use - but with care. If there is no concrete purpose / use-case prefer not to use it.




    It negates the effects of null-safety you get guaranteed by the compiler.



    Using the ! operator will create very hard to find bugs. If you have a property that is marked non-nullable, you will assume you can use it safely. But at runtime, you suddenly run into a NullReferenceException and scratch your head. Since a value actually became null after bypassing the compiler-checks with !.



    Why does this operator exist then?




    • In some edge cases, the compiler is not able to detect that a nullable value is actually non-nullable.

    • Easier legacy code-base migration.

    • In some cases, you just don't care if something becomes null.

    • When working with Unit-tests you may want to check the behavior of code when a null comes through.




    Answering your question specifically.



    So what does null! mean?



    It tells the compiler that null is not a null value. Sounds weird, doesn't it?



    It is the same as y! from the example above. It only looks weird since you apply the operator to the null literal. But the concept is the same.



    Picking apart what is happening.



    public string LastName { get; } = null!;


    This line defines a non-nullable class property named LastNameof type string.
    Since it is non-nullable you can technically not assign null to it - obviously.



    But you do just that by using the ! operator. Because null! is not null - as far as the compiler is concerned about null-safety.






    share|improve this answer


























    • a string could be null. You can assign null value like this string str = null; I'm using c# 6 .net 4.6.1. I think your statement is wrong "This line defines a non-nullable class property named LastNameof type string. Since it is non-nullable you can technically not assign null to it - obviously."

      – canbax
      Feb 20 at 6:09








    • 1





      @canbax Nullabilty checks are only supported by c#8 and above

      – Patrick Hollweck
      Feb 20 at 6:42






    • 1





      @canbax a string could be null not any more. I mean if you use c# 8 and enable NullableReferenceTypes feature. VS immediately gives you a warning if you try assign null to string. But from the other hand you're able to introduce nullable string(string? s = null). No warning in that case.

      – isxaker
      Feb 20 at 10:10













    • "You should try to never use the ! Null-Forgiving-Operator. It negates the effects of null-safety you get guaranteed by the compiler." How are you going to write unit tests for argument validation? That's my most frequent use of ! in Noda Time.

      – Jon Skeet
      13 hours ago






    • 1





      I would definitely recommend editing later - currently anyone who is widely using the null-forgiving operator in their tests could easily feel bad about it given your current text, which doesn't give any indication of limitations to the advice. I think "never" is likely to be unachievable in several code bases - for example, in Noda Time we have an invariant in parse results that means I never expose a null value, but I still end up with one where the parse itself has failed. I think "try hard to avoid" has a more positive tone. Just my opinion though.

      – Jon Skeet
      10 hours ago
















    37














    What is the ! operator when used on a type?



    The ! operator, when used on a type, is called the Null Forgiving Operator [docs]. It was introduced in C# 8.0





    Technical Explanation



    Typical usage



    Assuming this definition:



    class Person
    {
    public string? MiddleName;
    }


    The usage would be:



    void LogPerson(Person person)
    {
    Console.WriteLine(person.MiddleName.Length); // WARNING: may be null
    Console.WriteLine(person.MiddleName!.Length); // No warning
    }


    This operator basically turns off the compiler null checks.



    Inner workings



    Using this operator tells the compiler that something that could be null, is safe to be accessed. You express the intent to "not care" about null safety in this instance.



    There are 2 states a variable can be in - when talking about null-safety.




    • Nullable - Can be null.

    • Non-Nullable - Can not be null.


    Since C# 8.0 all reference types are Non-nullable by default.



    The "nullability" can be modified by these 2 new type-operators:





    • ! = From Nullable to Non-Nullable


    • ? = From Non-Nullable to Nullable


    These operators are basically counterparts to one another.
    The Compiler uses the information - you define with those operators - to ensure null-safety.




    ? Operator usage.





    1. Nullable string? x;





      • x is a reference type - So by default non-nullable.

      • We apply the ? operator - which makes it nullable.


      • x = null Works fine.




    2. Non-Nullable string y;





      • y is a reference type - So by default non-nullable.


      • x = null Generates a warning since you assign a null value to something that is not supposed to be null.





    ! Operator usage.



    string x;
    string? y = null;




    1. x = y




      • Illegal! - Warning: "y" may be null

      • The Right side of the assignment is non-nullable but the left side is nullable.




    2. x = y!




      • Legal!

      • The right and left side of the assignment is non-nullable.

      • Works since y! Applies the ! operator to y which makes it non-nullable.





    WARNING The ! operator only turns off the compiler-checks at a type-system level - At runtime, the value may still be null.




    This is an Anti-Pattern.



    You should try to avoid using the ! Null-Forgiving-Operator.



    There are valid use-cases ( outlined in detail below ) like unit-tests where this operator is appropriate to use. In 99% of the cases, you are better off with an alternative solution. Please do not slap dozens of !'s in your code, just to silence the warnings. Think if your cause really warrants the use.




    Use - but with care. If there is no concrete purpose / use-case prefer not to use it.




    It negates the effects of null-safety you get guaranteed by the compiler.



    Using the ! operator will create very hard to find bugs. If you have a property that is marked non-nullable, you will assume you can use it safely. But at runtime, you suddenly run into a NullReferenceException and scratch your head. Since a value actually became null after bypassing the compiler-checks with !.



    Why does this operator exist then?




    • In some edge cases, the compiler is not able to detect that a nullable value is actually non-nullable.

    • Easier legacy code-base migration.

    • In some cases, you just don't care if something becomes null.

    • When working with Unit-tests you may want to check the behavior of code when a null comes through.




    Answering your question specifically.



    So what does null! mean?



    It tells the compiler that null is not a null value. Sounds weird, doesn't it?



    It is the same as y! from the example above. It only looks weird since you apply the operator to the null literal. But the concept is the same.



    Picking apart what is happening.



    public string LastName { get; } = null!;


    This line defines a non-nullable class property named LastNameof type string.
    Since it is non-nullable you can technically not assign null to it - obviously.



    But you do just that by using the ! operator. Because null! is not null - as far as the compiler is concerned about null-safety.






    share|improve this answer


























    • a string could be null. You can assign a null value like this: string str = null; I'm using c# 6 .net 4.6.1. I think your statement is wrong: "This line defines a non-nullable class property named LastName of type string. Since it is non-nullable you can technically not assign null to it - obviously."

      – canbax
      Feb 20 at 6:09

    • @canbax Nullability checks are only supported by C# 8 and above.

      – Patrick Hollweck
      Feb 20 at 6:42

    • @canbax "a string could be null" - not any more. I mean if you use C# 8 and enable the NullableReferenceTypes feature, VS immediately gives you a warning if you try to assign null to a string. But on the other hand you're able to introduce a nullable string (string? s = null). No warning in that case.

      – isxaker
      Feb 20 at 10:10

    • "You should try to never use the ! Null-Forgiving-Operator. It negates the effects of null-safety you get guaranteed by the compiler." How are you going to write unit tests for argument validation? That's my most frequent use of ! in Noda Time.

      – Jon Skeet
      13 hours ago

    • I would definitely recommend editing later - currently anyone who is widely using the null-forgiving operator in their tests could easily feel bad about it given your current text, which doesn't give any indication of limitations to the advice. I think "never" is likely to be unachievable in several code bases - for example, in Noda Time we have an invariant in parse results that means I never expose a null value, but I still end up with one where the parse itself has failed. I think "try hard to avoid" has a more positive tone. Just my opinion though.

      – Jon Skeet
      10 hours ago






















    edited 8 hours ago

    answered Feb 16 at 15:18

    Patrick Hollweck



























    When the "nullable reference types" feature is turned on, the compiler tracks which values in your code it thinks may be null or not. There are times where the compiler could have insufficient knowledge.



    For example, you may be using a delayed initialization pattern, where the constructor doesn't initialize all the fields with actual (non-null) values, but you always call an initialization method which guarantees the fields are non-null. In such a case, you face a trade-off:




    • if you mark the field as nullable, the compiler is happy, but you have to unnecessarily check for null whenever you use the field,

    • if you leave the field as non-nullable, the compiler will complain that it is not initialized by the constructors (you can suppress that with null!), and the field can then be used without a null check.


    Note that by using the ! suppression operator, you are taking on some risk. Imagine that you are not actually initializing all the fields as consistently as you thought. Then the use of null! to initialize a field covers up the fact that a null is slipping in. Some unsuspecting code can receive a null and therefore fail.
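
    The delayed-initialization trade-off can be sketched like this (the Service type, field, and Initialize method are illustrative names, not from the question):

    ```csharp
    #nullable enable
    public class Service
    {
        // Not set in the constructor; null! suppresses warning CS8618 because
        // we promise Initialize() always runs before any use.
        private string _connection = null!;

        public void Initialize(string connection)
        {
            _connection = connection;
        }

        public int ConnectionLength()
        {
            // No null check needed - but if Initialize() was skipped,
            // this throws NullReferenceException at runtime.
            return _connection.Length;
        }
    }
    ```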



    More generally, you may have some domain knowledge: "if a certain check method returned true, then I know that some value isn't null":



    if (CheckEverythingIsReady())
    {
        // you know that `field` is non-null, but the compiler doesn't. The suppression can help
        UseNonNullValueFromField(this.field!);
    }


    Again, you must be confident of your code's invariant to do this ("I know better").








    answered Feb 16 at 18:08

    Julien Couvreur





























