Be a language designer...

I started writing a normal blog post - well, as normal as any blog post of mine ever is - and then I decided to let you do the hard work. So here's the situation.

You're part of the C# language design team thinking about the next version of C# (ie the version after VS 2005). You get the following email:

When I'm writing code, I often come across a situation where my code throws an exception because another component called with a null parameter. I'd like a way to prevent that from happening.

What are the pros and cons of such a feature? What are the ramifications of adding it to the language? What would have to change? What is your recommendation? Exactly what would such a feature look like?

Comments

  • Anonymous
    August 16, 2004
How about a [NullNotAccepted] attribute on a parameter, enforced by the compiler wherever possible and otherwise by the runtime when the method is invoked. Well, that's a pretty bad idea, but then I think the whole concept is bad.

  • Anonymous
    August 16, 2004
    Instead of introducing a change to the language, why not create a NotNullAttribute that would be recognized at compile time and have the compiler expand that to do a simple check at the beginning of the method (when applied to a parameter) and at any assignment when it is declared “ref” or “out”?
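
    For concreteness, here is a minimal sketch of that idea (the attribute, the Formatter class, and the expansion are all hypothetical; nothing in the shipped compiler recognizes them today):

    using System;

    // Hypothetical marker: by itself it does nothing -- the compiler
    // (or a post-compile step) would have to recognize and expand it.
    [AttributeUsage(AttributeTargets.Parameter)]
    public sealed class NotNullAttribute : Attribute
    {
    }

    public class Formatter
    {
        // What the developer would write:
        public string Indent([NotNull] string text)
        {
            return "    " + text;
        }

        // What the compiler would effectively expand it to:
        public string IndentExpanded(string text)
        {
            if (text == null)
                throw new ArgumentNullException("text");
            return "    " + text;
        }
    }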

  • Anonymous
    August 16, 2004
An attribute is definitely the way to go. However, there's a lot of room for flexibility. I'd argue against only creating a NotNullAttribute, and instead give attributes a lot more "magic", such as the ability to add prefix and/or suffix code to a method, with access to the method's parameter list -- or, if that's a perf issue, just a specified parameter.

    That would allow the implementation of NotNullAttribute to be done in C# rather than magically within the compiler like ObsoleteAttribute is (which I've wanted to subclass with my own warning message many times but can't because it's "magic"), and would also allow users the power to write their own validation attributes for whatever other constraints they might want to place on the parameter data.

  • Anonymous
    August 16, 2004
    Well, let's see, what does the code look like?

    if (null == (object)param1)
        throw new ArgumentNullException("param1");

    // repeat for further arguments

    So it's clearly achievable within the language and library at present. This pattern can easily be reduced to a function:

    static void EnsureArgNotNull(object o, string name)
    {
        if (null == o)
            throw new ArgumentNullException(name);
    }

    EnsureArgNotNull( param1, "param1" );
    EnsureArgNotNull( param2, "param2" );
    // etc

    I'd say that it isn't much of a hardship to keep typing this, although the repetition of the name is a little annoying.

    What syntax might we use? If we're going to implement it as a language feature, I'd prefer a keyword (albeit a context-sensitive one) to a compiler-interpreted attribute:

    void MyFunc( object notnull o );

    although you could use a pseudo-custom attribute:

    void MyFunc( [NotNull] object o );

    Implementation-wise, you could either add it to the compiler, generating a prologue in the method body, or to the metadata and have the JIT generate the code for you. If you go the metadata route you probably would want to use an attribute instead - this makes it easy for developers to opt in regardless of which language they're using.

Enhanced compilers could read the metadata and produce an error if the code passes null for that argument, or a warning if one or more of the possible paths might pass null.

  • Anonymous
    August 16, 2004
I think the point of the feature would be to extend the semantics to disallow nulls and make that apparent in code from both sides - basically, to shift these runtime errors to compile time. Whether it's a good idea or not is another question, but just saying that checking in code is the same thing is pretty wide of the mark, IMHO.

  • Anonymous
    August 16, 2004
    Simple...

    Just remove the exception feature ;-)

    Seriously though, I think you have to ask why someone can call your function with a null parameter and what does "prevent that from happening" mean.

    It's possible that the calling code was compiled before the called code, thus making the notnull attribute less effective.

    Does making the CLR do the null check satisfy the requirement? I don't think so. Is the motivation laziness or something else?

    It's hard to answer without knowing the specific requirements and motivation.

  • Anonymous
    August 16, 2004
>Basically, to shift these runtime errors to compile time. Whether it's a good idea or not is another question, but just saying that checking in code is the same thing is pretty wide of the mark, IMHO.

    Null references are only occasionally detectable at compile-time. The code to check and throw still must exist.

  • Anonymous
    August 16, 2004
You guys at Microsoft should talk to each other a bit more. :) Have you seen this? http://blogs.msdn.com/cyrusn/archive/2004/06/03/147778.aspx

  • Anonymous
    August 16, 2004
Are you guys thinking about pre/post-condition assertions à la Eiffel? That's a fairly heavyweight feature to add if the goal is simply to add runtime null checks in a declarative fashion. However, I truly believe that this would be a very useful feature to have in C#.

During my AOP days, many of the useful cases that I implemented in my experimental runtime aspect weaver were exactly those: pre/post-condition assertions. This would be an excellent addition to C#.

  • Anonymous
    August 16, 2004
    I'm not sure I understand exactly what the user wants.
    I recommend contacting him/her and refining it a little bit.

    Does he/she want to:

1. Automate the parameter validation and exception generation?

2. Use a NonNullable type like Cyrus mentions?

3. Specify a default value for the parameter that will be used if a null is passed?


1 and 2 seem similar in this context.
I could see using #3 (as long as it is not compiled into the calling code), but I can do that now... most of my overloaded functions end up calling a version that has all the parameters and understands nulls and default values, along the lines of the sketch below.
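
    As a minimal sketch of that overload pattern (Logger and its default values are invented for illustration):

    using System;

    public class Logger
    {
        // The convenience overload forwards to the "full" version...
        public void Write(string message)
        {
            Write(message, null);
        }

        // ...which understands nulls and supplies the defaults.
        public void Write(string message, string category)
        {
            if (message == null)
                message = string.Empty;
            if (category == null)
                category = "General";
            Console.WriteLine("[" + category + "] " + message);
        }
    }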


  • Anonymous
    August 16, 2004
    If you are asking about having a data-type that would never support a null reference, I would suggest "Type!" syntax, which would expand to "NonNullable<Type>".

However, I'm not sure that I completely agree with adding this to C#, as it's really not useful without support in the framework. It's bad enough that System.Data isn't being updated to support Nullable<T>, but it's not hard to write a wrapper to convert DBNull to null and vice versa. Unless breaking changes were introduced throughout the BCL, I don't see this happening.

On the other hand, if you're asking about a way to get rid of the need to check for null arguments, I'm all for decorating parameters with attributes, but I definitely wouldn't stop at checking for null values.

I'm all for allowing a wide range of custom parameter validations that would inject a prologue at the beginning of the method. There are several attributes I would define, such as [NotNull], [NotNullOrEmpty], [RestrictRange], etc. (I'm sure there are better names).

I already have a set of static methods for validating arguments, but this would move the check from the top of the method to the parameter definition, which would clear up the body of the method for describing what the method DOES, not what the caller should be passing.
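
    As a rough sketch of that scheme (all names here are invented, and only the static-helper half is something you can write today; the attribute half assumes a compiler or IL weaver that expands it into the corresponding prologue call):

    using System;

    // Hypothetical attribute a compiler or weaver would expand into
    // a call to ArgumentValidation.NotNullOrEmpty at method entry.
    [AttributeUsage(AttributeTargets.Parameter)]
    public sealed class NotNullOrEmptyAttribute : Attribute
    {
    }

    // The kind of static validation helpers described above.
    public static class ArgumentValidation
    {
        public static void NotNull(object value, string name)
        {
            if (value == null)
                throw new ArgumentNullException(name);
        }

        public static void NotNullOrEmpty(string value, string name)
        {
            if (value == null)
                throw new ArgumentNullException(name);
            if (value.Length == 0)
                throw new ArgumentException("Argument must not be empty.", name);
        }

        public static void InRange(int value, int min, int max, string name)
        {
            if (value < min || value > max)
                throw new ArgumentOutOfRangeException(name);
        }
    }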

  • Anonymous
    August 16, 2004
I would suggest adding something along the lines of Eiffel's precondition/postcondition system; in addition, I would suggest a way of defaulting a parameter to a specific value if a certain value is passed for the parameter in question.

    Regards,
    Kristoffer Sheather.

  • Anonymous
    August 16, 2004
When I have a method that takes a reference, in most cases I don't want to be passed null. In very few places is null allowed. So whatever the syntax, it should err toward less code decoration to achieve the effect (should the developer want it).

    So maybe the class should have the attribute and the exceptions can override this method by method [1].

    Other options on the ReferencesNotNullAttribute could be scope, i.e. do you want/need checks on public, private, protected, etc. You would of course want the inverse attributes where passing null is the norm.

    Having such attributes begs the question, "do we only want to test for null"? If we want the compiler to read and understand the attributes then should we have additional checks [2]?

You could have pre- and post-conditions for single arguments (only post-conditions make any sense for ref/out) or for the whole method. Whether the argument list is <generic>, or object, or matches the method signature is up for discussion. I have used what I hope is the .NET 2.0 short-cut for delegates as arguments to the attribute c'tors; without it the code would get very messy.

There is obviously static/non-static-ness to consider. I'm not sure what arguments would go into the post-condition; the class instance object, maybe?

    Just some thoughts.

    adam

[1]
[ReferencesNotNull]
public class Class1
{
    public void fn1(object o) {}
    public void fn2([AllowNull] object o) {}
}

[2]

public class Class2
{
    public void fn3([PreCondition(Validate)] DateTime dtm) {}

    void Validate(DateTime dtm)
    {
        // condition
    }

    [PreCondition(MethodValidate)]
    public void fn4(int a, int b) {}

    void MethodValidate(int a, int b)
    {
        // condition
    }
}

  • Anonymous
    August 16, 2004
Having seen the Comega compiler preview on http://research.microsoft.com/comega/ I'd say that's the way to go. Most people here seem to agree that an attribute is the way to go, and that the check can be done at compile time in a way similar to definite-assignment checks; so, instead of "[NotNull] object o" or "notnull object o", why not use "object! o"? :)

  • Anonymous
    August 16, 2004
Non-nullable types appeal rather more than expanded attributes (which are sounding suspiciously like preprocessor macros in some comments above).

    The debate reminded me of another recurring chore - implementing IDisposable. The following is taken from http://msdn.microsoft.com/library/default.asp?url=/library/en-us/cpguide/html/cpconImplementingDisposeMethod.asp

// Allow your Dispose method to be called multiple times,
// but throw an exception if the object has been disposed.
// Whenever you do something with this class,
// check to see if it has been disposed.
public void DoSomething()
{
    if (this.disposed)
    {
        throw new ObjectDisposedException(GetType().FullName);
    }
}

    The consequences of forgetting to do the check could well be an attempt to access a null member variable (post-Dispose, that is). I guess this situation points towards pre-condition/post-condition checks.

    Am not familiar with Eiffel, but I imagine these behave similarly to NUnit's SetUp/TearDown attributes, right?
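
    To make the parallel concrete, the disposed check is just another hand-written precondition. Here is the guard factored into a helper (the Connection class is invented for illustration):

    using System;

    public class Connection : IDisposable
    {
        private bool disposed;

        public void DoSomething()
        {
            CheckDisposed();   // the precondition, written by hand today
            // ... real work ...
        }

        public void Dispose()
        {
            // release members here, possibly nulling them out
            disposed = true;
        }

        private void CheckDisposed()
        {
            if (disposed)
                throw new ObjectDisposedException(GetType().FullName);
        }
    }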

  • Anonymous
    August 16, 2004
    I agree with Daniel O'Connell that attributes are not the way to go, though they were my first reaction.

After reading his entry I thought about something really cool: mark a type as non-nullable by tacking a '!' onto it, just as a '?' makes value types nullable. So, declare your method, for instance, as 'void MyMethod(string! aargh){}' to indicate that the argument aargh must not be null.

    Well OK, Cyrus of course came up with that idea already two months ago, as was mentioned above, see http://blogs.msdn.com/cyrusn/archive/2004/06/03/147778.aspx

  • Anonymous
    August 16, 2004
And why not generalize a little?
Like in Eiffel ( http://www.eiffel.com ) I want to have pre- and post-conditions on my arguments and internal state.
I think the problem is not only [NullNotAccepted]; I also want, for example:

void MethodValidate(int a)
{
    [precond]
    a > 0 && a < 10
    ......

  • Anonymous
    August 16, 2004
    Null reference exceptions are by far the greatest source of bugs I experience. A solution that moved some of the handling of nulls into compile-time checking, while deferring any undecidable cases to runtime would be a boon.

The attribute approach seems ugly to me, and the approach using type! notation seems the most attractive.

In fact, the Nice language approach of making all reference types non-nullable by default is very appealing. It's quite rare that I actually want a nullable reference type, and it would be better if I explicitly declared it as such (using the type? notation).

I realise, however, that making reference types non-nullable by default would be a backward-compatibility nightmare.

  • Anonymous
    August 16, 2004
In my experience, now that array bounds are checked, NullReferenceException is the most frequent cause of program failure. So it definitely needs to be tackled. And there is only one effective way: at the type level, so that things can be caught at compile-time.
    Whatever the syntax (though I like "string!" to balance "int?") things to be considered would be:
    - type compatibility: string! is assignable to string, the inverse is not true and needs a cast (runtime check)
- definite assignment for fields: at the end of every constructor, all the fields of non-nullable types should be assigned; inside a constructor, a non-nullable field is not usable unless/until it is assigned (on all paths)
    - the "new" operator: returns a non nullable type
    - optionally (but this is a point which is more delicate to formalize): when the compiler can prove that a variable of regular type contains a non-null value in a particular region of code, promote the variable to a non-nullable type in this region. Example
string s;
if (s != null)
{
    // s is considered of type string! here
}
else
{
    // s is of type string here
}
    I clearly realize that proving non-nullability is not feasible in the general case, but it is at least in cases of the form "cond1 && cond2 && cond3" where one of the conds is of the form "x != null", inside if, while and do statements.
- optionally 2 (if the former is implemented): emit a warning (one that can be turned off) when accessing a member (through the "." notation) on a variable of regular type (an error that can be turned off would be even better, but I am afraid people are going to throw stones at me :). Example:

if (s != null)
{
    int i = s.Length; // OK
}
else
{
    int i = s.Length; // Warning
}

Is it worth the effort? IMO, yes, definitely. Many, many bugs would be caught at compile-time.
Will existing code be broken? No, if the syntax is carefully chosen, it is a pure addition.
Will existing code benefit from it? No, but future code will, and this means code written for the ten years or so to come.
Does it have to be in the CLS? Preferably.
Does it mean that all languages should be modified? Not necessarily. Not all languages need compile-time type safety. For example, VB.NET could emit run-time checks when dealing with imported non-nullable types.
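
    A library-only approximation gives a feel for the conversion rules described above (NonNullable<T> here is a sketch, not a BCL type; and, as a later comment points out, a struct wrapper has a hole: a default-initialized NonNullable<T> still holds null inside):

    using System;

    public struct NonNullable<T> where T : class
    {
        private readonly T inner;

        public NonNullable(T value)
        {
            if (value == null)
                throw new ArgumentNullException("value");   // the runtime check
            inner = value;
        }

        public T Value
        {
            get { return inner; }
        }

        // "string!" -> string is always safe, so widening is implicit.
        public static implicit operator T(NonNullable<T> wrapper)
        {
            return wrapper.inner;
        }

        // string -> "string!" can fail, so narrowing is an explicit cast.
        public static explicit operator NonNullable<T>(T value)
        {
            return new NonNullable<T>(value);
        }
    }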

  • Anonymous
    August 16, 2004
Having object as a parameter is quite a risk anyway, so the developer should be prepared to handle the risk. What if a caller sends another kind of object than the function expects? I guess a cast somewhere would break the code. So saying object is like saying "I'm prepared to expect strange things", and IMHO this includes null objects. If the parameter is strongly typed, I'd conclude quite easily that a null is not a good idea, since the function declares that it expects that kind of object. I understand this is quite strict, but a strongly typed system should be.

Automatic parameter validation is another story. I like Eiffel's pre/postconditions, and Ada's declarative range checking. But these basically just "write" code for the developer, unless the compiler can do static checking (which is not usually the case). Often daunting, granted. To help developers on both sides, it would be good to have declarative parameter validation that can be seen on both sides. It's no good if the caller cannot see (for example, with IntelliSense) what parameter checks are made by the method.

For this, I would see the checking as a qualifier on the parameter, not a template or attribute, just as out and ref already are. Something like an 'in' qualifier (the opposite of out) would mean required/must exist (just like out). I don't see any substantial changes to the CLS/CLR in this case (looking at the SSCLI), since to me it looks like the out qualifier. It would not break the existing code base. For interoperability, a host language might not even know about 'in'; if it is outside the language's semantics, it could still ignore it and pass null.

For value types, it could get more complicated.
My dream would be to implement three-valued logic (and also to get rid of <type>?, which is, IMHO, a kludge that creates another kind of logic semantics). For value types/arrays you need range checking, validation lists... but this is off topic.

  • Anonymous
    August 16, 2004
You can have [NotNull] today if you use Extensible C#: http://www.resolvecorp.com/

  • Anonymous
    August 17, 2004
    How would you implement null argument checking

  • Anonymous
    August 17, 2004
I'm not fully convinced about how many of these errors could be caught at compile-time. At the very least, some (and, I believe, most) will still need to be caught at run-time, which leads us to what the writer is really asking for.

    Presently, his code is throwing an exception when a caller uses a null object.

    If this feature is implemented, his code will throw an exception when a caller uses a null object. So, what's the difference??

    Basically, he wants us to build into the language a way of saying "This is YOUR fault, not MINE!".

    Is that really an important goal?

  • Anonymous
    August 17, 2004
    James:
It aids in debugging and gives code semantic nullability statements instead of documented ones. While this feature will still cause exceptions, it helps by pushing the error to the caller. By doing that you increase the likelihood that the caller will understand what went wrong. For NullReferenceException, many think that you shouldn't issue an ArgumentNullException and should instead simply let the NullReferenceException bubble up. Because of this, it's possible the NRE will occur sometime after the object construction, making it far less clear what happened. In essence, instead of his code throwing the exception, the code at the call site would throw the exception instead, making it clear that it is the caller's fault. That makes life easier all around. It's better when the compiler can tell the caller that he might have to deal with an error (by virtue of a cast).

Most code can be null-proof if written after this feature is added, which is why Eric makes the comment in his follow-up post that it might have been a good idea for v1 instead of v3. It would be difficult to do without null guarantees in the BCL, but as I've suggested, parallel methods would help (potentially with compiler support, potentially not). It could still be quite the mess, though.

Also, a side effect of this would be definite not-null variables; you could have

string! x = "marvin";

and the compiler knows that x will never be null. And since string is sealed, the compiler would be free to emit call instructions instead of callvirt, or any other optimizations it can manage using that knowledge (assuming, of course, that call is faster than callvirt; one would suspect so, but it's certainly not guaranteed, and that's something to look into). It also means you never have to bother with a null check (it's minor, but removing every null check from your code could add up to a significant change).

But the real value is that it says "this cannot be null", period. Without this type of support the language has no way of saying that expressly. You can debate the actual value quite a lot, but I don't think it is as simple as just moving the blame.

  • Anonymous
    August 17, 2004
    Remove null from languages entirely.

    Null is used in several cases:

    Base case for recursive types
    -----------------------------
For instance, null is used to mark the tail of linked lists. This is a mistake. Use inheritance thus:

    class Node {}
    class Tail {}
    class Element { Node next; }

    Blank fields in a database
    --------------------------
This case is way overused. You should be absolutely sure that your field really should be nullable (nulls occur much too often, even in professional databases) and mark it explicitly with something like Nullable<int> so that those who use the value are clued in directly.

Temporary values to appease dumb compilers
    ----------------------------------------
    I'm sure you've seen the following:

Var a;
if (condition)
    a = 1;
else
    a = 2;
print(a);

    Many compilers report that a "may not have been initialized", but any human can see that this usage is fine. More sophisticated compilers can easily explore every path and decide if a variable really might be uninitialized.

    EVERYBODY - check out languages like SML, OCaml, and Haskell, where NULL DOESN'T EXIST. You can accomplish a lot more, and have fewer errors.

    If anyone has a situation in which null is absolutely necessary, email me, I'd love to see it.

    -Geoff
    http://gweb.org

  • Anonymous
    August 17, 2004
    The first case should read:

    abstract class Node {}
    class Tail: Node {}
    class Element : Node { Node next; }
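
    To see what the pattern buys you, here is a sketch of the list in use (the Length method is invented for illustration); dispatching on Tail replaces the usual null test:

    abstract class Node
    {
        public abstract int Length();
    }

    class Tail : Node
    {
        // Base case: no null check anywhere.
        public override int Length() { return 0; }
    }

    class Element : Node
    {
        private readonly Node next;
        public Element(Node next) { this.next = next; }
        public override int Length() { return 1 + next.Length(); }
    }

    // new Element(new Element(new Tail())).Length() == 2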

  • Anonymous
    August 17, 2004
    Daniel:
re: parachute - I see what you're saying, but don't agree at all. Suppose we add nonnull to C# but not to VB, and we define this function in C# -

    > public nonnull string Concat( nonnull string a, nonnull string b )

    If you call that from C#, the compiler will prevent you ever calling with a null parameter. Yay.

If you call it from VB, it's exactly as if you'd defined the function the way you do at the moment, i.e.

> public string Concat( string a, string b )

without having a guard on a or b inside. So, if you send a null to it, you'll get an exception - i.e., it will work EXACTLY as it currently works (without people putting guards on each parameter).

    So - you have to be careful going across languages.

    Big Deal.

99.99999% of shops out there will use exactly one language - they'll be a C# shop, or a VB shop.

Occasionally, for various reasons, people will take advantage of the multi-language abilities. However, any shop that says "write code in anything you want" deserves every problem it gets.

Most people will either a) design in layers, and say this particular layer is in this language, or b) design in subsystems, and say this subsystem is in this language. So your coding standards just have to remember to check for nulls before sending them (or you'll get an exception somewhere deep in the code).

    So - net effect - in the very rare case where we use multiple languages in the same development - we have the situation where an exception will eventually be thrown if we send a null variable to a nonnull function - exactly as most people code right now (because most people DON'T explicitly put guards on every single parameter on every single function)

So, in 99.9999% of cases, we get a positive benefit, and in the remaining 0.0001% of cases we are stuck with the same horrible situation that we have now. Bummer.

  • Anonymous
    August 17, 2004
Oh, and the bit I forgot: the duplication means that libraries that don't use non-null, due to their being written in VB or whatever, take the feature entirely away from C#. It's unfortunate, but it is something that has to be considered.

  • Anonymous
    August 17, 2004
    I see Mitch Denny hasn't posted here yet; see his blog for his analysis of this problem:

    http://notgartner.com/posts/525.aspx

  • Anonymous
    August 18, 2004
    The whole concept is bad! You'd just be masking the problem. Why are you calling with a null parameter in the first place? Get at the root of your problems!

Though I LOVE C#, indiscriminate use of exception handling and the ability to pass objects around without worrying about "who owns the memory" lead to spaghetti code and messy practices!

    Some obsolete languages, like Objective-C, won't complain if you dereference a NULL pointer! That's really, really bad.

  • Anonymous
    August 18, 2004
    Robert: The core of the concept, IMHO, is getting the compiler to the point where it will error out if you could be passing a null. Having a semantic rule that says you cannot pass null would be the fix you want, wouldn't it?

  • Anonymous
    August 18, 2004
    Just a thought...and this doesn't solve the problem of the BCL not having a concept of non-nullable throughout the framework...

    Rather than implementing such a feature as NonNullable<T>, which has the uninitialized-struct problem I mentioned here ( http://weblogs.asp.net/ericgu/archive/2004/08/17/215779.aspx#216078 ), could this be implemented if C# and/or .NET supported subtypes?

Languages such as Pascal, Delphi, and Ada (and other non-Pascalesque languages too, I'm sure) allow you to define one type to be a subtype of another, and the compiler can check for several problems rather than relying on checks at runtime.

NonNullable would just be a subtype of a reference type where the value null is not allowed. It would be similar to having a UInt32 with its range restricted to between 1 and UInt32.MaxValue.

I have often wished I could restrict primitives to a subrange, but I can see how this would be very hard to make CLS-compliant. How would languages that automatically promote numbers to bigger types when they overflow work with languages that require the subset type? There are a lot of problems with requiring all CLS languages to support this functionality.
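
    As a sketch of how a subrange could be approximated as a library type today (the names are invented; a real subtype feature would let the compiler check much of this statically instead):

    using System;

    // A UInt32 restricted to 1..UInt32.MaxValue, checked at runtime.
    public struct PositiveUInt32
    {
        private readonly uint value;

        public PositiveUInt32(uint value)
        {
            if (value == 0)
                throw new ArgumentOutOfRangeException("value");
            this.value = value;
        }

        // Narrowing can fail, so it is an explicit cast.
        public static explicit operator PositiveUInt32(uint value)
        {
            return new PositiveUInt32(value);
        }

        // Widening back to uint is always safe.
        public static implicit operator uint(PositiveUInt32 p)
        {
            return p.value;
        }
    }

    // Note the same hole as a NonNullable<T> struct would have:
    // default(PositiveUInt32) silently holds 0, outside the declared range.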

  • Anonymous
    August 18, 2004
Primitive subranges are an interesting idea... I wonder what kind of typing could be used to achieve that...

A single-language system could probably handle it more easily than .NET can. Trying to design it so that VB et al. can use it safely as well is tricky. I wouldn't want double checking, after all ;).

  • Anonymous
    August 19, 2004
    In this case, fixing it in one language doesn't change the fact that the framework is still littered with methods that don't indicate whether null is a valid value or not.

And with all the work being put into moving APIs to the .NET Framework, it is already much too late to make such a drastic change.

  • Anonymous
    August 19, 2004
    Marcus:
I think people would be rather crabby about having to create null objects constantly. It's simply not a good use of my time in most cases.

  • Anonymous
    August 24, 2004
    Robert:

    I don't know what you imagine yourself to be saying when you call Objective-C "obsolete" but it's obvious you don't know Objective-C.

Objective-C is a strict superset of C. If you dereference a null pointer, you're using the "C" part of Objective-C. You'll get the same effect as if you were doing it in a plain C program.

    Objective-C adds a Smalltalk-style dynamic object model on top of C. One interesting feature is that a message sent to a nil object returns nil -- it doesn't dereference a null pointer and it doesn't cause a runtime error.

    That's a language design feature that Smalltalkers envy. You might want to read Nevin Pratt's "A Generalized Null Object Pattern" in the Smalltalk Chronicles to learn more about it.

    You might want to invest some time gaining minimal competence in Objective-C. You won't understand the benefits until you've put in the work, if then. (See Paul Graham on the "Blub Paradox".) But at least you'd be able to explain why dereferencing a null pointer produces a runtime error in Objective-C and why sending a message to nil does not.

  • Anonymous
    August 24, 2004
    Robert:

    I posted my previous comment before reading the link to your page. There you mention you've worked for several large corporations. You give names (IBM, Adobe, Disney, Apple, Olivetti, ...) and you post links. All of the links point to the home page for your past employers with one exception: Apple is linked to "Jerkcity.com".

    Would it be fair to say you didn't leave there on good terms?

A few paragraphs below you describe Visual Studio in glowing terms and offer a comparison with another development environment. Interestingly, the comparison is not with IBM's Eclipse or IDEA from JetBrains, both of which are widely used, widely admired, and widely taken to represent best-of-breed development environments not from Microsoft. The single comparison you make is between Visual Studio and "Apple's crude system for its antiquated Objective-C language".

That's so odd. If you want to compare Visual Studio to the competition, you'd be better off comparing it to other development environments for Windows, or to the best development environments out there on any platform. Were you really thinking the top priority was to win over all the Objective-C programmers using OS X? And did you think you'd win them over just by using the words "crude", "antiquated", and "obsolete" without any further discussion?

    I don't know what transpired between you and Apple, but you sure seem bitter about it. And you express that bitterness in very specific ways -- it's all directed towards Apple's use of the "antiquated/obsolete" Objective-C language and the "crude tools" Apple provides for working with it.

It would be laughable if it weren't so sad. Were you let go because you couldn't program in Objective-C and the new talent saw you as less than ideal for the pace and direction of their future efforts? That would explain your bitterness, the form in which you express it, and your ignorance of the language. But it would also single you out as the world's most unreliable source on the very topics you seem most interested to discuss.

And it's embarrassing. For example, you go on to say:

    "I never fail to make Apple programmer's jaws drop when I demonstrate Visual Studio .NET to them (and I usually get excuses like `that'll be there in the next release')"

    But you don't provide a single example, leaving a reader to guess what you have in mind. Hoping to learn more, I looked at the Visual Studio 2005 Beta home page at

    http://lab.msdn.microsoft.com/vs2005/

    There I found that the #1 suggestion from beta testers is "edit and continue support for C#". The response from Microsoft is "We are actually targetting implementing Edit and Continue for C# for future release. So keep your figures crossed :-)"

    Talk about irony. Apple has provided edit and continue support for Objective-C since 2003. Microsoft is "targeting" edit and continue support for C# in a release subsequent to Visual Studio 2005, and even then there's no commitment -- you have to keep your fingers crossed.

    Your conclusion: "Microsoft's system looks years ahead."

    Can you see how that looks to a disinterested reader?

    If you were badly treated by Apple, and you blame it on their adoption of Objective-C with OSX, I'm sorry to hear it and I wish you well writing raves for Visual Studio.

    But your uninformed and unsubstantiated expressions of negative sentiments towards Objective-C undermine your credibility and embarrass you. If you can do better, do yourself a favor and try.

  • Anonymous
    September 02, 2004
    As Damien said, the Nice programming language has solved this problem:

    http://nice.sourceforge.net/safety.html#id2429032

    Since it's based so closely on Java, they of course needed to provide interop with existing Java libraries:

    http://nice.sourceforge.net/manual.html#optionTypesJava

    Would sure be nice if someone would write a Nice# compiler... :)
