Faster .NET? Monster post by Microsoft software engineer shows serious improvements

The forthcoming .NET 6 will be significantly faster than its predecessors, according to a monster post by Microsoft Partner Software Engineer Stephen Toub. Toub's 45,000-word post follows similar ones for earlier versions since .NET Core 2.0, and is based on an analysis of all merged pull requests (PRs – changes to the code) …

  1. werdsmith Silver badge

    It's good stuff and when I've had need to use it I have been impressed how easy it is to get your head round.

    But I still don't like stuff that is mostly tied to one OS. Mono or official Linux version or not. I'm less inclined to invest, same with Swift and Objective-C. Just my personal 2p for what it's worth. Probably less than 2p.

    1. Irongut

      I have C# code running on Windows, Linux, Android, iOS and serverless Functions (on Linux). All the same C# code with no changes. It is not running on a web server but it could do that as well.

      Tell me again how it is tied to one OS.

      1. Warm Braw

        Arguably, it's the long-standing Windows users who have most to complain about. As a consequence of reinventing .NET for multiple platforms (including macOS) it's they who have work to do.

        1. AndrueC Silver badge

          it's they who have work to do.

          Yeah - how many iterations of .NET have there been? And MS have all but killed off WPF so now we're in danger of having to learn yet-another-UI-framework.

          At times it's felt like standing on a moving vehicle with someone (Visual Studio mostly) randomly sneaking up behind you and giving you a push.

      2. This post has been deleted by its author

      3. karlkarl Silver badge

        Not necessarily one OS but certainly a limited selection of architectures:

        You have ARM*, s390x and x86*

        Admittedly Mono has a lot more (almost catching up to Java) but many of these are in no way as well tested as the tier 1 architectures and I probably wouldn't use them in production.

        So is this a problem? Not really, .NET developers are mainly focused on x86* anyway. However I do prefer the "comfort" of the free architecture support offered by the native tools (almost always C and C++).

        The .NET native stuff is closer but the GC still makes assumptions (growing / shrinking stack?) and thus is still architecture specific.

        The best example I can refer to is Unity3D getting WebAssembly / asm.js support (for plugin-less web) a full 3 years after Unreal Engine 4 did. Porting .NET (Mono) was a PITA!

      4. werdsmith Silver badge

        Tell me again how it is tied to one OS.

        Tell me its origin.

        I guess you are one of those who are invested in it.

      5. AndrueC Silver badge

        We have Windows code (client and server), iOS and Android.

        I'd estimate that at least 80% of the client code is shared across platforms. The only bits that are platform specific are:

        * Client UI - Mainly the result of Xamarin v. WPF differences but also the inevitable differences between desktop and mobile presentations. Even then we inject a lot of common code as services so it's mostly just glue logic.

        * Client download - Each platform does it slightly differently. iOS was particularly different (quelle surprise) with NSURLDownload and latterly URLSession.

        * Client audio - We have a class hierarchy with most of it shared but ultimately you have to branch off to platform specific code at the bottom.

        It's not a perfect situation though. Developing for Windows isn't too bad, but programming for mobile can be frustrating, and debugging iOS apps can be very irritating at times.

    2. JDX Gold badge

      >>But I still don't like stuff that is mostly tied to one OS. Mono or official linux version or not

      Mono? When was the last time you looked at this stuff?

      Not only are there versions for Linux and Mac and probably others out there, both .NET and the Roslyn compiler are open source. You might as well say Java is mostly tied to one OS because most people use it on Linux.

  2. Aleph0

    1.07 MB for a "Hello World"?

    And in the best case... Am I the only one thinking it's a little wasteful?

    Okay, these days the emphasis is on reducing developers' time instead of resource consumption, but for such a size the program had better write itself in under a millisecond.

    1. Brewster's Angle Grinder Silver badge

      Re: 1.07 MB for a "Hello World"?

      org 100h                              ; DOS .COM file: code is loaded at offset 100h

      mov dx, message                       ; DS:DX -> '$'-terminated string
      mov ah, 9                             ; DOS function 09h: write string to stdout
      int 21h                               ; call DOS
      sub ah, ah                            ; AH = 0 (function 00h: terminate program)
      int 21h                               ; call DOS, never returns

      message: db "Hello world",13,10,"$"

      1. Herring`

        Re: 1.07 MB for a "Hello World"?

        Not a complete implementation. Text should be "Hello, world!"

        1. Brewster's Angle Grinder Silver badge

          Re: 1.07 MB for a "Hello World"?

          What you're forgetting is that the exclamation mark caused a sequence of beeps to emerge from the PC's speaker - kinda like it was laughing. And three in a row would cause a long, continuous tone which locked the PC. So, all in all, it was best avoided!!!

      2. Pirate Dave Silver badge

        Re: 1.07 MB for a "Hello World"?

        Feck, now I feel old. My assembly skills never surpassed what could handle, but damn, I remembered and understood most of what you just wrote.

        1. Brewster's Angle Grinder Silver badge

          Our web dev can code x64 asm

          Do you know what's worse? I didn't have to look up the numbers. That's a bit of my brain I'm never getting back. :/

          1. Pirate Dave Silver badge

            Re: Our web dev can code x64 asm

            Yeah, I hear ya. Not sure why I still need to remember that video memory starts at B800:0000, but it's burned into my brain.

    2. Warm Braw

      Re: 1.07 MB for a "Hello World"?

      It might surprise you to learn - it certainly opened my eyes - that the average web "page" requires 2MB of total download.

      The actual HTML is negligible. About half the traffic is images, and roughly half of the remainder is JavaScript.

      In that context, the 1.07MB isn't as horrendous as first appears.

      Of course, it might be better to download the complete runtime if you could cache it and use it on multiple pages, but I'm not sure all the bits are there yet.

      Or do the work on the server, like in the old days...

      1. Pascal Monett Silver badge

        Here's a hint : use NoScript and an ad blocker and that page size is down to the text and the images that have been included by the author.

        Makes the web a lot faster. Try it, you'll like it.

        1. AndrueC Silver badge

          Ad blocker, yes. uBlock Origin for me. But NoScript? Forget it. I tried, honestly I did, but so many websites don't work properly (or flat out don't render usefully) that I've surrendered on that front.

          1. AMBxx Silver badge

            It takes a bit of tuning. Allow all top-level sites, then gradually allow other bits. Combine with Decentraleyes to prevent clever trackers. Job done.

          2. Pascal Monett Silver badge

            It's called security.

            Security is not there to be user-friendly, it's there to protect you.

            Yes, many, many websites do not render properly without JS enabled. The question you need to ask yourself is : do I wish to enable JS on this website ? With NoScript, you have the choice before a catastrophe happens.

            Obviously, the websites you visit regularly will make you enable JS for them.

            It's the websites you go check out that you can control. If you click on a link and nothing shows up, you need to ask yourself : do I really need to see the content on this page if JS needs to be enabled ? Is there no other way I can get that information without putting my system at risk ? If not, then you can enable JS temporarily, check the site and forget it when you're done.

            Let's be clear : JavaScript is the root cause of malware infections in 99.9% of all cases.

            If you don't protect yourself, well, you can't complain when things go wrong.

          3. picturethis

            "I've surrendered on that front."

            I haven't yet. As previous posts have stated, it takes a little bit of tuning. And I've also found that preventing access to Twitter, Facebook and DoubleClick almost never causes any issues with displaying pages for the majority of websites that I visit. These are the top websites on my "no-go" list. NoScript is actually fairly flexible for tuning.

            The biggest issue I have is that I generally have to temporarily disable NoScript while I am purchasing something. E-commerce websites get very gnarly if they can't access something, and having NoScript active in the middle of a transaction can result in double charges... Ask me how I know...

    3. RobLang

      Re: 1.07 MB for a "Hello World"?

      That includes the entire .NET framework. In most real-world scenarios the framework (React, Vue, Angular) will be cached in the browser via a CDN. Also, 1.07MB isn't very large in the world of the web. JavaScript for even a medium-size site usually comes in at around 2MB.

      1. Tom Chiverton 1

        Re: 1.07 MB for a "Hello World"?

        Not in Firefox any more; every site has its own cache, so a CDN is pointless except for saving some bandwidth bills at the hoster.

    4. This post has been deleted by its author

  3. thames

    Beware of Benchmarks

    I've been doing a fair bit of optimization work on a set of libraries lately and so have been writing a lot of benchmarks. One major problem you can run into is one that is mentioned in the story. Optimizing compilers can break the benchmark. A lot of benchmarks do no useful work beyond exercising the CPU. The compiler may spot this and decide to be "helpful" by deleting the benchmark. You end up thinking you are getting amazing performance improvements when in fact you didn't, it's just that your benchmark is no longer working. Writing a good benchmark can be as much work as creating the optimization you are trying to measure.

  4. Mike 137 Silver badge

    "Microsoft chose instead to bring over smaller improvements piece by piece to the .NET Runtime"

    Thereby creating the equivalent of the old "DLL hell" for anyone who doesn't keep permanently and automatically "updated". I can't count the applications that fail to install or run because they can't find "dot net runtime version xxxx" or some component of it.

    Any organisation with a serious security posture (i.e. one that doesn't just posture about security) requires their systems to be stable and known. Constant change (even supposedly for the better) results in permanent unknowns which are a nightmare for risk management.

    In the face of constant change, all one can do is reactively and blindly trust to updates from vendors who so far haven't even managed to get any application or OS completely debugged before it's superseded by the next major version.

    1. Filippo Silver badge

      Re: "Microsoft chose instead to bring over smaller improvements piece by piece to the .NET Runtime"

      DLL hell is not applications refusing to run without the right library version.

      DLL hell is applications attempting to run with an incorrect library version, and failing in unpredictable ways as a result. Java did that a lot; I don't know if it's still the case. Native Win32 apps with DLLs will still happily attempt to use any DLL that happens to have the right file name.

      The fact that a .NET 3 application won't run if you only have .NET 4 installed is exactly the opposite of DLL hell.

      1. Mike 137 Silver badge

        Re: "Microsoft chose instead to bring over smaller improvements piece by piece to the .NET Runtime"


        I actually said "the equivalent of...", meaning that from the user's perspective you keep having to install/"upgrade" components of what the user assumes is the OS in order to run the programs you want to run. Your interpretation, clearly based on knowledge of system internals, is in the minority, and ultimately, because computers are quite rightly just tools to most people, the user perspective really should take precedence.

        I know it's hard for the expert to put themselves in the shoes of the inexpert, but we've commoditised computing to such an extent that the inexpert has become the important party.

  5. bsimon

    Too little too late too cumbersome?

    After more than 15 years developing with VS and C# / .NET, I appreciate that MS is finally interested in supporting its technology on multiple platforms and is embracing open source. However, MS tech has become so cumbersome and bloated that it is almost impossible to keep yourself updated. I am seriously considering moving to a leaner and fresher technology like Rust; perhaps it is a better investment of my time and energy. One of my concerns is the maturity of libraries and availability of tools.

    1. Ilsa Loving

      Re: Too little too late too cumbersome?

      Rust is the first language I've seen in decades that actually gave me a sense of delight. I agree that there's a lot of maturing to do, and I'm not sure I agree with all the design decisions, but on the whole it's the closest I've seen to being a true replacement for C.

      Their variable ownership mechanism alone is revolutionary.

    2. martyn.hare

      I hope that's sarcasm

      Rust is more of a maintenance burden than the competition for the following reasons:

      * Rust has no ABI stability, meaning every app you deploy will have its own private runtime (aka. bloat)

      * Static linking is encouraged over dynamic in 2021? That means more rebuilds/updates as a developer

      * Every dependency you include in an app YOU have to recompile/maintain (not the OS or Linux distro)

      * Created by Mozilla. They change things constantly, even when they're already decent, look at Firefox!

      If I choose to use the managed APIs included with .NET then I only have to worry about my own code being secure, as the runtime itself will be centrally updated with the system, just like with Java, Python, PHP and many other options.

      However, if I create apps using Rust, I will forever have to track CVEs in every dependency and am now responsible for rebuilding the entire project every single time there's a relevant fix. That's a downgrade even compared to C/C++ where I can choose to link to system-provided libraries which are managed by the OS vendor or Linux distribution, reducing the maintenance burden greatly.

      It seems that in 2021 the hipsters are running the asylum and everyone has forgotten about how long-term software maintenance works.

  6. JimmyPage

    What's going on with .NET Core?

    That, I seem to recall, was going to be the future of .NET?

    1. Filippo Silver badge

      Re: What's going on with .NET Core?

      AFAIK, that is correct, and at some point they just stopped calling it Core (because the old Framework is no longer getting updates, so there is no such thing as a .NET Framework 6 that you could get confused about). So it's just .NET 6 now.

      That said, I haven't much followed the last year or so of development, so I might be wrong.

    2. Anonymous Coward

      Re: What's going on with .NET Core?

      They renamed .NET Core to .NET starting with v5 (which would otherwise have been v4, but was called v5 to prevent confusion with .NET Framework 4.x).

      1. AndrueC Silver badge

        Re: What's going on with .NET Core?

        to prevent confusion

        It didn't really work. Or perhaps there's just too much confusion already :)

  7. JDX Gold badge

    Performance on non-Windows platforms?

    I haven't had a chance to read 45k words, but since the comparison involves .NET Framework 4.8, presumably these tests are Windows 10-specific.

    Quite a few of the examples listed - file systems, etc. - are very OS-specific, so if anyone has seen reports on how .NET compares between Windows/Linux/Mac and how .NET 6 compares to .NET 5 (or Core 3.x) on non-Windows platforms, or has even done tests themselves, I would be very interested.

    Performance on Linux servers is surely pretty important given the focus on this area.

    1. Nafesy

      Re: Performance on non-Windows platforms?

      There are Unix-specific improvements listed (where the change is OS-specific).

      Why comment asking for more detail if even the original is too much for you?
