China reveals home-grown supercomputer chips after Intel x86 ban

Computer design guru Yutong Lu has outlined her blueprints for China's homegrown supercomputer chips following a US ban on Intel processor exports to the Middle Kingdom. In April it emerged that Uncle Sam had forbidden Intel from shipping high-end Xeon and Xeon Phi parts to China's defense labs and other areas of its …

  1. Anonymous Coward
    Anonymous Coward

    Interesting side effects of this development..

    What can happen now:

    1 - the Chinese have just developed their own chips (well, OK, planned to develop, but given the speed with which this showed up, I think it has been in the making for quite some time). This means that the only people who understand what happens inside them are the Chinese, which is a serious own goal for spy agencies.

    2 - the Chinese are no longer held back in international trade by the patents that have turned chip development into a two-horse race between Intel and AMD.

    3 - they will develop their own ecosystem around those chips. They're not short of volume, which helps.

    The US restricted tech going to Russia: the Russians got a LOT better at squeezing every possible bit of performance out of what they had (just imagine what they can now do with the same resources instead of lesser ones).

    The US restricted the number of IP addresses for Japan: the country is now close to a decade ahead in IPv6 use and deployment.

    This was always going to have a predictable result, wasn't it?

    1. Anonymous Coward
      Anonymous Coward

      Re: Interesting side effects of this development..

      It's possible that one of the motives for banning the export of Intel parts to China was to promote the development of alternatives to Intel. Parts of the US government are worried about Intel getting a supercomputer monopoly. Despite some of the posturing aimed at the general public, the forces that determine US policy probably regard China as a fairly reliable partner.

      Interesting, though hardly relevant, fact: China has fewer nuclear weapons than France.

    2. naive

      Re: Interesting side effects of this development..

      No Intel -> No Microsoft desktops.

      Interesting to see where this will lead. The best case in five years or so is that the "West" is still stuck with Wintel on the desktop with no escape route in sight, since AMD's future does not look so bright at the moment, while Russia and China will by then have developed lively ecosystems around various competing processor designs, running hardware-independent applications based on Linux.

      1. CheesyTheClown

        Re: Interesting side effects of this development..

        Let me toss in some ideas/facts :)

        Windows NT was never x86/x64 only. It wasn't even originally developed on x86. Windows has been available for multiple architectures for the past 25 years. In fact, it supported multiple architectures long before any other single operating system did. In the old days, when BSD or System V were ported to a new architecture, they were renamed as something else, and there was generally a lot of drift between code bases due to hardware differences. The result was that UNIX programs were riddled with #ifdef statements.

        The reason Windows on other architectures never really took off was that we couldn't afford them. The DEC Alpha AXP, the closest to succeeding, cost thousands of dollars more than a PC... of course, it was 10 times faster in some cases, but we simply couldn't afford it. Once Intel eventually conquered the challenge of working with RAM and system buses running at different frequencies from the CPU core, they were able to ship DEC Alpha-speed processors at x86 prices.

        There was another big problem. There was no real Internet at the time, and no remote desktop for Windows either. The result was that developers didn't have access to DEC Alpha machines to write code on. So we wrote code on x86 and said, "I wish I had an Alpha. If I had an Alpha, I'd make my program run on it." Instead of making a much cheaper DEC Alpha which could be used to seed small companies and independent developers, DEC, in collaboration with Intel, decided to make an x86 emulator for Windows on AXP.

        The emulator they made was too little, too late. The performance was surprisingly good, though, and they employed technology similar in design to Apple's Rosetta. Dynamic recompilation is not terribly difficult if you think about it. Every modern program has fairly clear boundaries. It calls functions either in the kernel via system calls, which are easy to translate... or in other libraries, which are loaded and linked via 2-5 functions (depending on how they are loaded). When the libraries are from Microsoft, Microsoft knows exactly what the APIs are... and if there are compatibility problems between the system-level ABIs, they can easily be corrected. Some libraries can easily be instrumented with an API definition interface, though C programmers will generally reject the extra work involved... instead just porting their code. And then there's the option that, if an API is unknown, the system can simply recompile the library as well... and keep doing this until the boundaries between the two architectures are known.
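
        To make the caching trick at the heart of dynamic recompilation concrete, here's a toy sketch in Python. Everything in it is invented for illustration (a dict stands in for guest memory, closures stand in for emitted host code); it resembles nothing of FX!32's or Rosetta's actual internals:

        ```python
        # Toy dynamic binary translator: translate each guest basic block
        # once, cache it keyed by guest address, reuse it on every re-entry.
        guest_blocks = {
            0x1000: [("add", "r0", 5), ("jmp", 0x2000)],
            0x2000: [("mul", "r0", 2), ("halt",)],
        }

        translation_cache = {}  # guest address -> "compiled" host function

        def translate(addr):
            """Turn one guest basic block into a host-callable function."""
            ops = guest_blocks[addr]
            def compiled(regs):
                next_addr = None  # None means halt
                for op in ops:
                    if op[0] == "add":
                        regs[op[1]] += op[2]
                    elif op[0] == "mul":
                        regs[op[1]] *= op[2]
                    elif op[0] == "jmp":
                        next_addr = op[1]
                return next_addr
            return compiled

        def run(entry):
            regs = {"r0": 1}
            addr = entry
            while addr is not None:
                if addr not in translation_cache:       # translate once...
                    translation_cache[addr] = translate(addr)
                addr = translation_cache[addr](regs)    # ...execute many times
            return regs

        print(run(0x1000))  # {'r0': 12}
        ```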

        Here's the problem. In 1996, everyone coded in C, and even if you were programming in C++, you were basically writing C in C++. It wasn't until around 1999, when Qt became popular, that C++ started being used properly. This was a problem because we were also making use of things like inline assembler, and we were bypassing normal system call interfaces to hack hardware access. There were tons of problems.

        Oh... let's not forget that before Windows XP, about 95% of the Windows world ran Windows 3.1, 95, 98 or ME. As such, about 95% of all code was written on something other than Windows NT and used system interfaces which weren't compatible with Windows NT. This meant that programmers would have to at least install Windows NT or 2000 to port their code. That would have been great, but before Windows 2000 there weren't device drivers for... well, anything. Most of the time you had to buy special hardware just to run Windows NT. Then consider that Microsoft Visual Studio didn't work nearly as well on Windows 2000 as it did on Windows ME, because most developers were targeting Windows ME and Microsoft therefore focused debugger development on ME instead.

        So... running code emulated on Alpha did work AWESOME!!!! ...provided the code worked on Windows NT or Windows 2000 on x86 first. Sadly, there was no real infrastructure around Windows NT for a few more years.

        That brings us to the point of this rant. Microsoft has quite publicly stated its intent to make an x86/x64 emulator for ARM, and has demoed it on stage as well. The technology is well known and well understood. I expect x86/x64 code to regularly run faster on the emulator than as native code, because most code is compiled once for a generic architecture, while dynamic recompilers can optimize for the specific chip they are executing on and constantly improve the way the code is compiled as it's running. This is how things like JavaScript can be faster than hand-coded assembly: it adapts to the running system appropriately. In fact, Microsoft should require native code on x64 to run the same way... it would be amazing.
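
        The adaptive half of that story can be sketched just as simply. This is purely illustrative (the threshold, the functions and the "recompile" step are all made up); real JavaScript engines do the same thing with interpreter tiers and optimizing JITs:

        ```python
        # Toy tiered execution: run the slow path while cold, swap in an
        # "optimized" version once the call count crosses a hotness threshold.
        HOT_THRESHOLD = 100

        def make_tiered(slow_fn, fast_fn):
            state = {"calls": 0, "fn": slow_fn}
            def call(*args):
                state["calls"] += 1
                if state["calls"] == HOT_THRESHOLD:
                    state["fn"] = fast_fn  # pretend we recompiled with profile data
                return state["fn"](*args)
            return call

        def interp_sum(xs):                # tier 0: naive "interpreted" loop
            total = 0
            for x in xs:
                total += x
            return total

        tiered_sum = make_tiered(interp_sum, sum)  # tier 1: native built-in

        data = list(range(1000))
        for _ in range(200):
            tiered_sum(data)   # first 99 calls run slow, the rest run fast
        ```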

        So, the emulator should handle about 90% software compatibility. Not more. For example, I've regularly written code which makes use of special "half-documented" APIs from Microsoft listed as "use at your own risk", because I needed to run code in kernel space instead of user space to get better control over the system scheduler and achieve more real-time results. That code will never run in an emulator. Nearly everything else will, though.

        Then there's the major programming paradigm shift which has occurred. The number of people coding in system languages like C, C++ and assembler has dropped considerably. On Linux, people code in languages like Python where possible. It's slow as shit, but it works well enough, and with advances like Python compiler technology it's actually not even too pathetically slow anymore. On Windows, people program in .NET. You'd be pretty stupid not to in most cases. We don't really care about the portability. What's important is that the .NET libraries are frigging beautiful compared to legacy coding techniques. We don't need things like Qt, and we don't have to diddle with horrible things like the standard C++ library, which was designed by blind monkeys more excited about using every feature of the language than actually writing software.
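
        (If you doubt the "Python compiler technology" point, here's a minimal sketch using the real Numba library, which JIT-compiles a numeric loop via LLVM for whatever CPU it happens to be running on; the function itself is just a placeholder:)

        ```python
        # pip install numba numpy
        # Numba compiles this loop to machine code tuned to the host CPU on
        # first call; later calls run at native speed. Same "optimize for
        # the chip you're actually on" idea as a dynamic recompiler.
        import numpy as np
        from numba import njit

        @njit
        def dot(a, b):
            total = 0.0
            for i in range(a.shape[0]):
                total += a[i] * b[i]
            return total

        x = np.random.rand(1_000_000)
        y = np.random.rand(1_000_000)
        print(dot(x, y))  # first call pays the compile cost, the rest don't
        ```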

        The benefit of this is that .NET code runs unchanged on other architectures such as ARM or MIPS. Code optimized on x86 will remain optimized on ARM. It also gets the benefit of JavaScript-like dynamic compiler technology, since they are basically the same thing.
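
        (Python works the same way, and makes it easy to see: the compiler emits architecture-neutral bytecode, and only the runtime underneath is per-architecture. A quick illustration:)

        ```python
        # CPython compiles source to portable bytecode; the interpreter maps
        # it to the host CPU. The disassembly below is identical on x86, ARM
        # or MIPS builds of the same CPython version, just as .NET CIL is
        # identical across architectures.
        import dis

        def scale(xs, k):
            return [k * x for x in xs]

        dis.dis(scale)
        ```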

        Linux really never had much in the way of hardware-independent applications. Linux still has a silly amount of code being written in C when C is simply the wrong tool for the job. Linux has the biggest toolbox on the planet, yet the Linux world still treats C as if it's a hammer and every single problem looks like a nail. Application development should never be done in system-level languages anymore. It's slower... really, it is... C and C++ make slower code for applications than JavaScript or C#. Having to compile source code on each platform for an application is horrifying. Even having to consider the structure of the ABI at all is terrifying.

        Linux applications have slowly gotten better since people started using Python and C# to write them. Now developers are more focused on function and quality, as opposed to untangling #ifdefs and makefiles.

        Now... let's talk supercomputing. This is probably not what you think it is. The CPU has never really meant much on supercomputers. The first thing to understand is that programmers write code in a high-level language which has absolutely no redeeming traits from a computer science perspective. For example, they can use Matlab, Mathematica, Octave, Scilab... many other languages. The code they write will generally be formulas containing complex math designed to work on gigantic flat datasets lacking any structure. They could of course use simulation systems as well, which generate this kind of code in the background... it's irrelevant. The code is then distributed to tens of thousands of cores by a task scheduler. Often, the distributed code will be compiled locally for the local system, which could be any processor from any architecture. Then, using message passing, the different tasks are executed, and the results are collected back to a system which sorts through them.
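
        The scatter/compute/gather shape of those jobs is only a few lines in any MPI binding. Here's a minimal sketch with mpi4py; the flat dataset and the per-chunk "math" are placeholders:

        ```python
        # Run with: mpiexec -n 4 python scatter_gather.py
        # Rank 0 splits a flat dataset, every rank crunches its own chunk,
        # rank 0 gathers and merges the partial results.
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        size = comm.Get_size()

        if rank == 0:
            data = np.arange(1_000_000, dtype=np.float64)
            chunks = np.array_split(data, size)
        else:
            chunks = None

        chunk = comm.scatter(chunks, root=0)     # hand each rank its share
        partial = float(np.sum(np.sqrt(chunk)))  # placeholder "complex math"
        results = comm.gather(partial, root=0)   # pull partial results back

        if rank == 0:
            print(sum(results))
        ```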

        It never really mattered what operating system or platform a supercomputer runs on. In fact, I think you'd find that nearly 90% of all the tasks which will run on this beast of a machine would run faster on a quad-SLI PC under a desk, with code written with far less complexity. I've worked on genetic sequencing code for a prestigious university in England... very fancy math, very cool algorithm. It was sucking up 1.5 megawatts of power 24/7 crunching out genomes on a big fat supercomputer, and the lab was looking for a bigger budget so they could expand to 3 megawatts for their research.

        I spent about three days just untangling their code... removing stupid things which made no sense at all, moving work back to being done locally instead of distributed whenever it would take less time to calculate than to delegate... etc.

        The result was 9 million times better performance. What used to require a 1.5 megawatt computer could now run on a laptop with an nVidia GPU... and do it considerably faster. Sadly, my optimizations were not supercomputer-friendly, so they ended up selling the machine for pennies on the dollar to another research project.
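
        That local-versus-distributed call is, at bottom, just a cost comparison. A hedged sketch of the heuristic; every number in it is invented for illustration:

        ```python
        # Toy version of the decision: only farm a task out when the
        # estimated local compute time dwarfs the cost of shipping it
        # over the network.
        HANDOFF_COST_S = 0.005  # assumed round-trip to delegate one task

        def local_cost_s(n_elements, flops_per_elem, local_flops=1e10):
            """Rough local compute time in seconds."""
            return n_elements * flops_per_elem / local_flops

        def should_distribute(n_elements, flops_per_elem):
            return local_cost_s(n_elements, flops_per_elem) > HANDOFF_COST_S

        print(should_distribute(1_000, 10))        # False: cheaper locally
        print(should_distribute(100_000_000, 50))  # True: worth delegating
        ```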

        People get super excited about supercomputers, but they are almost always misused, and they are almost always utterly wasted resources. It's a case of "Well, I have a supercomputer. It doesn't work unless I message-pass... so let me write the absolute worst code EVER!!!! and who gives a fuck about data structure, let's just make that baby work!!!!"

        There are rare exceptions to this... but I'd bet that most supercomputer applications could have been done far better if labs bought programmer hours instead of supercomputer hours.

    3. Steelted

      Re: Interesting side effects of this development..

      Now, they just need to find a replacement for the Xeons, maybe homegrown ARMs?

  2. Griffo

    Talk About an Own Goal

    So not only did the US screw Intel out of playing in the world's biggest market, they helped to accelerate China's development of its own chips.

    Sounds like a plot line straight out of "The Pentagon Wars"

    1. Anonymous Coward
      Anonymous Coward

      Re: Talk About an Own Goal

      They've even invested a hell of a lot of effort into an enormous global marketing campaign for China's IT industry!

      http://www.theregister.co.uk/2015/06/25/cisco_in_single_ssh_key_security_stuffup/

      http://www.theregister.co.uk/2015/05/20/us_export_controls_0days/

      http://www.theregister.co.uk/2015/03/19/cansecwest_talk_bioses_hack/

      http://www.theregister.co.uk/2015/03/12/nsas_on_drugs_infosec_bods_unveil_space_grade_malware/

      http://www.theregister.co.uk/2015/01/14/nsa_sorry_we_borked_nist_encryption_well_sorry_we_got_caught/

      http://www.theregister.co.uk/2014/10/17/intel_subsidiary_crypto_export_fine/

      http://www.theregister.co.uk/2014/04/07/internet_inception_security_vint_cerf_google_hangout/

      etc...

  3. Eddy Ito
    Facepalm

    Hopeless

    The US Government is bound and determined to force other nations to leave it behind. It won't learn from this blind, ham-fisted foray, and it won't learn from its other arse-hatted but likely successful attempts to shoot its own feet in a barrel.

  4. This post has been deleted by its author

  5. Anonymous Coward
    Anonymous Coward

    For fuck's sake

    The idiots in the Pentagon think this will hold China back, but it will only speed China's development and make it more independent in its capabilities, while damaging US companies both today and in the future as China becomes better able to compete with them.

    Anyway, China is hardly the military threat the USSR was back in the day (and the threat the USSR posed to the US and western Europe was way overblown by the fear-mongers trying to boost defense spending). China doesn't have a history of invading other countries, unlike the US. If anything, they should be afraid of us, not the other way around.

    Holding back technology didn't even work all that well with the USSR, despite them having to get off their asses and come over here to steal our technology, instead of hacking from their comfy office chairs like today. We wouldn't sell them VAXes, but they had VAXes... http://micro.magnet.fsu.edu/creatures/pages/russians.html

    1. Michael Habel

      Re: For fuck's sake

      Anyway, China is hardly the military threat the USSR was back in the day

      Well, duh! It's not like China has any direct grievance with anyone, unlike, say, North Korea. The threat is largely directed towards Taiwan, which has been known to get rather uppity at times. While the rest of the world treats Taiwan as a de facto state in its own right, it is still officially claimed by Communist China, and the threat here is that the Taiwanese might end up provoking mother China into invading, which would then involve the US and likely a WWIII scenario. That, or the NORKs actually manage to nuke someone.

      Not to mention that the Chinese would likely have little problem selling such tech to the NORKs, as long as they could foot the bill. But I wonder if this might pave the way for Chinese chipmakers to deliver smaller, faster, cheaper and more powerful chips for PC use, to rival what both AMD and Intel are currently pumping out.

    2. Anonymous Coward
      Anonymous Coward

      Re: For fuck's sake

      China is hardly the military threat the USSR was back in the day

      Yes and no: nothing like a war to dispose of a gender imbalance of 500k or more men...

  6. Anakin
    Facepalm

    Ruin your own country

    It looks just as if the three-letter agencies are doing their best to take out US biz.

    First they destroy the software companies with backdoors, and now they give China a reason to continue its R&D in chip making.

    A year from now, China will be the leader of the chipzillas, flogging low-cost, high-performance chips all over the world.

    Back in the Cold War, China was so close to taking over the chip market that Intel and AMD were allowed to start selling China chips to prevent or delay it.

    The opinion was that Intel and AMD could end up filing for Chapter 11 if China developed its own high-performance chip.

    Today, China is even better placed to put on both the yellow and the green jersey.

    No lesson learned here.

  7. Anonymous Coward
    Anonymous Coward

    Surely it's better to sell China compromised chips

    than risk them developing their own tech with unknown capabilities?

    Although, with the alleged level of cyber-industrial-espionage going on ...

    1) China could probably access Intel designs anyway

    2) If they could access them, could they change them?

  8. LaeMing
    Facepalm

    It's so cute.

    It's so cute how the US thinks the rest of the world can't possibly get on without its input.

  9. John Savard

    Only in China, you say?

    I thought I saw a statement, either in the article or the comments, to the effect that DSP chips only support 32-bit floating-point, so this new Chinese chip that is a DSP with double-precision floats is unique. However, a brief Google search turned up the Texas Instruments TMS320C674x series as a counterexample to that.

  10. N13L5

    I find it funny, but I welcome the results.

    Intel will have to try harder to keep up with or exceed Moore's Law.

    I will be able to buy Chinese chips if I like them.

    A Chinese government backdoor won't affect me as much as the Five Eyes backdoor we likely already have in Intel's chips, ever since their insecure hardware security implementation.

    As for the guy who said China doesn't have a history of invading other countries... gee, get a clue?

    Remember Tibet? That WAS a country... The Chinese also tried to take Vietnam, but the Vietnamese kicked their collective asses out of there, just like they did with the French and the Americans. Yes, the Vietnamese had to suffer grievously, and the US, having fought a completely unjust war against them, even hamstrung them with trade sanctions for decades afterwards. But after a recent trip to Vietnam, I can say their streets are cleaner and prettier than those of Los Angeles or London. Surprising what effort can do, even in the absence of money. And the garish colors of stupid corporate logos and advertising don't tower over everything either. (Yet...)
