Deep inside AMD's master plan to topple Intel

AMD's new graphics architecture isn't merely about painting prettier pictures. It's about changing the way computers compute. As first revealed last month at AMD's Fusion Developer Summit, the chip designer has gone out of its way to ensure that its future APUs – accelerated processing units, which is what the company calls it …


This topic is closed for new posts.
  1. Anonymous Coward


    As long as it runs DOOM5 at 100FPS, I couldn't give a toss



      As long as it decodes blurays and flash video, I couldn't give a toss.

  2. rav


    So now all they need is a benchmark to compare AMD's APU to Intel's, or the world will say that Intel is better because AMD does not approve of SYSmark.

    Seems to me that AMD needs to support some OPEN benchmark.

    1. Piro Silver badge

      Huh? What are you getting at?

      It's commonly known that Intel's top end CPUs are a decent bit faster than AMD's top end offerings...

      But Intel doesn't even compete in the same universe as the top end AMD GPU offerings...

      So I could tell you exactly how those benchmarks would look. Anything that is heavier on the GPU, AMD would win, anything that is heavier on the CPU, Intel is likely to win. Shock!

  3. tempemeaty

    I wonder....

    I wonder if they will be unable to support Linux-based OSes with this.

    1. Zeb


      This comment made me lol!!!

    2. Ammaross Danan

      You must not be aware of how things are done in Open Source....

      First, it is not "they" that need to support Linux-based OSes.

      Second, Linux may very well get support (or at least better support) for IOMMUs before Windows.

      If it's open, it's usually available. That's why nVidia drivers exist for Linux, but Radeon drivers are still "experimental": ATI didn't support an open driver interface.

      1. John Savard

        When It Is They

        It would be "they" if they chose to keep the critical details of the hardware secret, so that no one could program it themselves.

        Fortunately, that's pretty unlikely if they intend the thing to be used for general-purpose programming, rather than just for one Direct X driver to be written for it, with everyone else just calling it (or complaining that there isn't also an OpenGL driver).

        But since it's derived from video card technology, I can understand the fears, even if they are wholly unwarranted in this case, as you note.

  4. Anonymous Coward

    English language + techno-babble

    So in summary....

    in AMD's FSA the VLIW-based SIMD is replaced with a GPGPU architecture, created with a CU + GPU by Scaling tasks across its Vectors on a per Wavefront basis. This gives a great QoS for the GUI, and will be called GCN, according to the AMD CTO.


    1. Aaron Em

      Dammit cut that out

      I haven't had coffee.

    2. redniels


      that's why your name is jesus: nobody except your father understands what you're saying.

      you win by default.

      Jesus Wins. Fatality!

      1. God Puncher

        Are you kidding?

        Even I don't understand him most of the time!

        *(Heh. God Puncher. The militant atheist superhero...)

      2. Doug Glass

        And like a good data backup guru ...

        ... Jesus saves.

        1. E 2


          Marchand scores!

  5. Anonymous Coward

    No matter how good AMD chips are...

    ...Intel have a bigger marketing budget, so will ALWAYS be better in the public's eyes...

    1. jason 7

      That's always the odd thing.

      AMD moans about Intel but Intel does one thing that AMD never does...Advertise.

      So when folks go to buy a PC they get a choice of Intel or AMD.

      Well, they've never heard of AMD, but they get to hear the Intel jingle at least three times a day on TV, so they buy from the one brand they have heard of.


      AMD needs to sack their marketing team, get a new one and start spending on some jingles etc. After all, there is only one other company doing it in their field, so how hard can it be?

      Even Acer has adverts in the UK.

      As for no-one wanting AMD, I build my budget PC boxes with AMD CPUs in them. Why? The saving of using AMD enables me to put a 60GB SSD in the box. Customers don't notice the difference between an Athlon II and an i5, but they notice the SSD. They also get USB 3.0/HDMI/usable integrated graphics, which they don't get on the budget Intel motherboards.

      1. Danny 14

        not strictly true anymore

        The new Intel Celerons (E3400) are absolute crackers. Not pricey, and faster than a bunch of Athlons in similar brackets.

        1. jason 7

          Trouble is.....

          budget Intel boards are not as well equipped in most cases as AMD-based ones.

          A lot of them still have serial and parallel ports on them. To get the modern ports you have to spend £15 or more.

          On a budget box that's eating into my profit for no benefit whatsoever.

  6. Citizen Kaned


    they can design the best hardware in the world but there drivers are garbage. i sent my last ATI card back as im not waiting 8 weeks for them to fix drivers.

    1. Anonymous Coward

      and how many decades ago was that?

      Since AMD took over ATI the drivers have improved a vast amount, and since the days of the 3000-series cards they have been just as good as Nvidia on Windows; recently they have been better at just about everything.

      In the last year or so they have provided Linux drivers on par with, or maybe even better than, NV's. True, their latest cards may need the Linux crowd to wait a few weeks, but that is because they are NEW chips which need NEW drivers, as opposed to NV, which has done naff all apart from re-naming old chips with the occasional speed bump due to process maturity.

    2. Baskitcaise

      "there" + "i"?

      I could go further but why bother.


  7. Anonymous Coward

    Yes you, boy, at the back of the class!

    Oh dear, I've really got my mortarboard in a tizzy this morning...

    "You program it in a scaler way" - that would be "scalar".

    On a more relevant note, a company that has become all but irrelevant tries to claw its way back by designing things that nobody wants and that simply don't perform.

    (Now sits quietly and waits to receive the ceremonial Inverted Thumb of Death from the AMD fanbois)

    1. Destroy All Monsters Silver badge

      Well, duh!

      You, sir, are of the highest intelligence because you happen to have, with the unusual sharpness of a Moriarty-like mind, recognized, in a flash as it were, what capitalism is all about.

      Except for the "all but irrelevant", "things that nobody wants" and "simply don't perform" parts.

      Have a cookie. It's a bit mouldy.

  8. Ilgaz

    any windows developer?

    I heard game and multimedia developers choose the Intel compiler for its excellent optimization characteristics, and the resulting executables have a tendency to look for GenuineIntel to perform well. Otherwise, on AMD, it is horrible.

    So, if it's true, how will AMD manage to convince developers to use their CPU/GPU optimization while they sit idle with thousands of high-end apps ignoring their old-fashioned CPU?

    As a user hating the Intel brand itself, I am almost convinced to go with an i5 + Nvidia GPU on Windows 7 because of the sad fact above. Oh, I also decided to boycott ATI because it refuses a very easy Win7 recompile of its Vista driver.

    1. E 2

      WRT Compilers

      There are many compilers that produce excellent code for both Intel and AMD. Devs do not have to use Intel compilers.

  9. rav

    We should all thank AMD.......

    for without AMD, Intel would still be selling 500MHz Pentiums at $1,000 per unit.

    AMD is far from irrelevant. Now VIA is irrelevant.

    AMD is taking RADEON and changing the core of computing.

    Intel has Sandy Bridge, which is a Core Duo glued to crummy graphics.

    Intel has no graphics capability. RADEON was an expensive purchase but just maybe they knew what they were doing when AMD overpaid for it.

    1. Citizen Kaned


      They revolutionised CPUs. I remember how great my first Athlon was compared to the Intel chips of the time, and cheap too!

      1. Danny 14


        but let's ignore the K6s, though. They were baaaaaad (as opposed to my OC'd Celeron, the mighty 300 -> 450)

  10. bertino

    Said it before, will say it again.

    All that is needed for many tasks is an FPGA co-processor. Load an FPGA image that does exactly what you want; the CPU sends whatever data is needed to the FPGA, and the FPGA squirts the result out of the other end. For stuff like transcoding video this would be a PoP. You could have FPGA images in a repository, just like you do with Linux packages. This is not difficult to implement. You can have several versions in the repos, one for each vendor (the synthesis and place-and-route for each vendor is different), just like you do with i386, x64, ARM, MIPS, PowerPC.........
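    Purely as a sketch of that workflow (the repository layout, task names and "images" below are all hypothetical stand-ins, with plain Python callables in place of real bitstreams and a real vendor driver):

    ```python
    # Hypothetical sketch: look up a pre-built FPGA image for a
    # (task, vendor) pair in a repository, "load" it, then stream data
    # through it. Real offload would go through a vendor toolchain and
    # driver; here the images are stood in for by Python callables.

    FPGA_IMAGE_REPO = {
        ("video_transcode", "xilinx"): lambda frame: frame.lower(),  # stand-in bitstream
        ("video_transcode", "altera"): lambda frame: frame.lower(),
    }

    def load_image(task, vendor):
        """Fetch the pre-built image for this task on this vendor's part."""
        try:
            return FPGA_IMAGE_REPO[(task, vendor)]
        except KeyError:
            raise RuntimeError(f"no {task} image for {vendor} in the repo")

    def offload(image, chunks):
        """CPU streams data in; the 'FPGA' squirts results out the other end."""
        return [image(chunk) for chunk in chunks]

    image = load_image("video_transcode", "xilinx")
    print(offload(image, ["FRAME1", "FRAME2"]))  # → ['frame1', 'frame2']
    ```

    The per-vendor keys mirror the point above: one repository entry per (task, vendor) pair, just as distros ship one package per architecture.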

    1. Tim Parker


      "All that is needed for many tasks is an FPGA co-processor."

      Indeed. I've wanted a motherboard (or add-in board on a fast bus) with a few smallish FPGAs to play with for a while too... load a transcoding image into these two, a decrypt into that one, code-generation ones into this guy - swap 'em around when you've finished one job etc etc. It wouldn't cost pennies, but shouldn't be too pricey either - there's probably more work in getting the memory access sorted and the IO traces properly matched than anything else on the hardware side, and that's just fiddly rather than expensive.

      Obviously this would be mostly, though not entirely, just to play with - but surely there's nothing wrong with that!

      1. Anonymous Coward


        Plenty of eval boards with CPU+FPGA. Tend to support Linux OotB but you didn't want any other OS did you?

        1. The BigYin


          You'll find the Linux kernel in many, many OSs. Certainly way more than Darwin or NT.

          What was your point?

          1. Anonymous Coward

            @the big yin

            Errrrrrr.... that there are a lot of the hardware evaluation boards that are designed for people to experiment with CPU and FPGA hardware architectures - which sounds very much like the thing Tim is expressing an interest in. These boards usually ship with a Linux kernel and a GNU toolchain as these tools, and open source in general, are a very powerful and effective way of enabling the intended experimentation and evaluation.

            Is that ok with you?

  11. Simbu

    Catch 22?

    AMD's problem is getting major software developers to code for this new architecture. The vast majority of developers code with Intel/Nvidia in mind, as they have the lion's share of the market.

    Without market share and the marketing power that comes with it, AMD can build an amazing processor and still fail, as no-one will be interested in taking advantage of it, for fear of marginalising their software.

    1. Jacqui


      I am doing work with AMD GPUs, and OpenCL is what I currently use.

      There are pros and cons, but it provides me with the ability to run my code on/across a wide range of compatible systems.

      My devbase runs on the big servers at work and on a ~200UKP desktop system with a ~70UKP GPU at home.

    2. Ken Hagan Gold badge

      No, there is no catch-22

      "The vast majority of developers code with Intel / NVidia in mind, ..."

      You what? The only people who code with Nvidia in mind are the folks who write drivers for their cards. A high-nines proportion of the programming community *can't* write Nvidia-specific code in their usual coding environment. Similarly, although it is slightly easier to write CPU code that runs on Intel but not AMD hardware, hardly anyone does it, because most of the compilers for most languages won't let you, and for the rest it is fairly tedious.

      There is no catch-22 for this or any other radical architecture. Here's why...

      If AMD can create a C compiler that generates decent code for the heterogeneous system (and by decent, I mean faster than just using the homogeneous part, even for algorithms that aren't embarrassingly parallel) then the Linux kernel will be ported within a month of the systems being available in shops. Others will provide cfront-style front-ends for all common languages. In many cases, this is already how compilers for such languages are written, so this will be trivial. Once you have a complete Linux distro running, faster than is possible on a homogeneous offering, Microsoft will announce a Windows port, because if they don't then they'll spill oceans of blood in the server market, where Linux already has enough market share to be taken seriously.

      If AMD can't create such a compiler, then their new architecture just isn't as great as they are claiming and no-one (outside AMD) will care.

      1. JEDIDIAH

        Vast majority of developers...

        There is Nvidia-specific kit in the high-end desktop computation market. These are expensive machines with multiple special-purpose GPU cards. They utilize a proprietary Nvidia API. However, they are by no means general-purpose machines. They are very specialized, much like a computing cluster.

        This is a small niche. It seems to do well for particular problems. Not at all general purpose.

        AMD seems to be trying to take that approach onto the desktop. Dunno if it will really matter there, though. Between Doom and XBMC, most people seem to be pretty well set with the current kit.

      2. DanFarfan

        Timid doesn't topple

        "If AMD can create a C compiler ..."

        I agree with everything you said, however, playing along with the nonsensically exaggerated title of the blog post, your recipe can't/won't topple nuttin'.

        It's high time AMD goes all in. Start from scratch. Throw out everything everyone assumes about how software and systems "have to" be developed and operate - forget the bad stuff, improve on the best stuff to create a top-to-bottom re-imagined stack and life cycle - complete with OS (not another clone of a clone of a clone of Unix pretending to be special), GUI and software development tools (compilers? There's a dinosaur), apps, and an unprecedented "contract" with the user.

        Make it run everywhere, but run best on AMD chips. Focus on mobile, but support old-fashioned non-mobile devices too. :-) "Sure, Android devices are good, but have you seen Android+ ?"

        ~$100,000,000 <5 years. Completely underground.

        Risky, but toppling isn't for the timid.


      3. Ken Hagan Gold badge

        Re: No, there is no catch-22

        If I can just reply to myself...

        "Microsoft will announce a Windows port because if they don't then they'll spill oceans of blood in the server market, ..."

        I'm *assuming* that this heterogeneous thingy is fully virtualisable. Current GPUs are not, which is why you don't get bleeding edge games performance in a VM. However, it is obvious (to me!) that if you are depending on the GPGPU for performance on general benchmarks then you won't be considered AT ALL in the modern server market unless you preserve that performance advantage under virtualisation.

        So, um, AMD could block their main route to market if they choose to omit virtualisation support in the first release.

    3. ZeroGravitas

      @simbu - Huh?

      AMD currently has a slightly larger market share than Nvidia for desktop/laptop graphics doesn't it? So is Intel / NVidia "the lion's share of the market"? Just as likely to be Intel / AMD I'd have thought... unless I'm missing the point and you're talking about something other than graphics.

      Agree with the basic issue you raise, though - the nuances of the dominant CPU (Intel) are far more likely to get an optimised code path than a major re-write to support a radical improvement (AMD) with a smaller installed base. AMD need to make it very easy for developers to get a decent advantage from this new hardware.

  12. George 24


    Let us hope that AMD will produce a great processor and give Intel a kick up the butt. If the processor is good enough, it will become popular.

  13. Peter Gathercole Silver badge

    Round and round we go, where we stop, nobody knows!

    Aren't we at the Itanium/x86_64 point again?

    Surely the problem with all of these APU or GPGPUs is that suddenly we will have processors that are no longer fully compatible, and may run code destined for the other badly, or possibly not at all!

    The only thing that x86 related architectures have really had going for them was the compatibility and commodity status of the architecture. For a long time, things like Power, PA, Alpha, MIPS, Motorola and even ARM processors were better and more capable than their Intel/AMD/Cyrix counterparts of the same generation, but could not run the same software as each other and thus never hit the big time.

    Are we really going to see x86+ diverging until either AMD or Intel blink again?

    1. Giles Jones Gold badge

      Not quite

      Transmeta were ahead of their time.

      In their design the CPU or processing engine could be anything and the instruction set be a sort of software running on it.

      But obviously the future of x86 is a little hazy given ARM based devices may start to dominate the personal computer market.

  14. Dunstan Vavasour


    And that, boys and girls, is why you should use a long password. Modern GPUs can calculate hashes at an extraordinary rate, making brute-force attacks on 6-, 7- and even 8-character passwords eminently doable.
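    Rough arithmetic backs that up. A quick sketch - the rate of one billion hashes per second is purely illustrative (real GPU rigs vary wildly by hash algorithm), as is the 95-character printable-ASCII alphabet:

    ```python
    # Back-of-the-envelope exhaustive-search times for a fast, unsalted hash.
    # Both constants are assumptions for illustration, not measurements.
    ALPHABET = 95              # printable ASCII characters per position
    RATE = 1_000_000_000       # candidate hashes per second (assumed)

    def seconds_to_exhaust(length):
        """Time to try every password of the given length, in seconds."""
        return ALPHABET ** length / RATE

    for n in (6, 7, 8):
        print(f"{n} chars: {seconds_to_exhaust(n) / 86400:.2f} days")
    ```

    Each extra character multiplies the work by 95, which is why passwords go from "doable" to "impractical" so quickly as they get longer.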

    1. Ian Yates

      A pinch

      I'd be pretty pissed if my password was being stored hashed without a salt. I'm not naive, I know it happens, but it should be illegal under the DPA for lack of due care.

      Not saying this is an infallible solution, but I'm always amazed when a service warns users that their passwords have been compromised.
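      For what it's worth, "hashed with a salt" can be as simple as this stdlib-only sketch (the PBKDF2 iteration count here is illustrative, not a recommendation):

      ```python
      import hashlib
      import hmac
      import os

      ITERATIONS = 200_000  # illustrative; tune to your hardware

      def hash_password(password, salt=None):
          """Return (salt, digest) using PBKDF2-HMAC-SHA256 from the stdlib."""
          if salt is None:
              salt = os.urandom(16)  # random per-password salt, stored with the hash
          digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
          return salt, digest

      def verify(password, salt, expected):
          """Recompute the digest and compare in constant time."""
          candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
          return hmac.compare_digest(candidate, expected)

      salt, digest = hash_password("correct horse battery staple")
      assert verify("correct horse battery staple", salt, digest)
      assert not verify("tr0ub4dor&3", salt, digest)
      ```

      The per-password salt defeats precomputed rainbow tables, and the deliberately slow KDF drags even a GPU rig's hash rate down by orders of magnitude.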

  15. Kevin Pollock

    GCN is not a TLA as defined by those idiots at PCMag

    Since I couldn't understand most of the rest of the article I dropped into Pedant Mode.

    I know the Reg writers don't have time for things like correct spelling, grammar or semantics; but when you give links to definitions like TLA could you please use a correct (or more correct) definition?

    In this case the Wikipedia definition allows for the fact that in most instances of TLA, the "A" stands for abbreviation, not acronym as PCMag seems to think.

    GCN is a three letter abbreviation, not a three letter acronym.

    1. Anonymous Coward


      FFS! Since when has Wikipedia been the definitive guide to English? Surely Wikipedia is only for lazy students.

    2. Anonymous Coward


      TLA has always stood for "Three Letter Acronym" for as long as I can remember.

      Wikipedia isn't necessarily the best authority on this.

      1. Kevin Pollock

        TLA - don't like Wikipedia? How about a DICTIONARY?

        LOL - you may not like Wikipedia - that's irrelevant.

        My point is that the author of the article (and apparently you and the AC) does not understand the difference between an abbreviation and an acronym. They are not synonyms.

        Here is a link you can use...

        Like I said - I'm in pedant mode, but TLA is not a TLA by your definition.

        1. Anonymous Coward

          Well, duh

          " not understand the difference between an abbreviation and an acronym. They are not synonyms."

          What gave you that impression?

          An acronym is a pronounceable abbreviation.

          I always assumed that TLA *incorrectly* stood for "Three Letter Acronym".

          The dictionaries may have corrected that oversight but it doesn't make them right in terms of their popular usage, and the usage I have seen and heard for the past 20 years contradicts your assertion.

        2. Ian Yates

          acronym / abbreviation

          "TLA is not a TLA by your definition"

          And why not? Are you saying that "TLA" is not an acronym?

          Using your dictionary link, "acronym" means: "a word formed from the initial letters or groups of letters of words in a set phrase or series of words"

          So regardless of what "A" stands for, "TLA" is an acronym; and a three-letter one at that. Granted, acronyms are also abbreviations (but not vice versa).

          I really don't understand what you're being pedantic about.

    3. Tom Maddox Silver badge

      . . . and other misconceptions

      Are you sure there's not an "i" instead of an "o" in your last name?

  16. Anonymous Coward


    There are already some ARM systems (CPU+GPU on die) available that use similar if not the same techniques, and we are due many more. Available from Nvidia now, I believe.

    However, these are a bit pants; but by 2013 there will be quad-core ARMs with built-in GPUs (with vector cores and associated HW to help with encode/decode, 2D/3D etc.), running at tiny wattage, that will blow away Intel and AMD chips, with their high power consumption (and in some cases LESS performance overall).

    And these will be mobile devices with the performance we currently see on the desktop.

    Just use the Khronos APIs to code for them (OpenGL, OpenCL, OpenVG etc.), and it doesn't matter whether you are on ARM or x86.

    Should be fun!

  17. Neil 7

    This new CU design

    sounds vaguely CELL-ish...

  18. ByeLaw101

    Kevin... oh dear, Kevin..

    I've always known TLA stood for "Three Letter Acronym"; if you stop being a pedant for a minute it actually makes sense for it to be called "Three Letter Acronym" instead of "Three Letter Abbreviation"!

    TLA is actually a Three Letter Acronym itself, and in no sense an abbreviation!

    1. Kevin Pollock

      Sorry skelband...I'm confused.

      It doesn't matter what the expansion of TLA is - although I have to question the logic of your remark...

      "The dictionaries may have corrected that oversight but it doesn't make them right in terms of their popular usage, and the usage I have seen and heard for the past 20 years contradicts your assertion."

      ...that's just silly talk. But if you're doing modern English GCSEs I can imagine how you came to that conclusion.

      My original point, which appears to have completely passed you by, is that the author of the article (Rik Myslewski) indicated that GCN is a TLA (see the bottom of the first page of the article).

      But since the link that Rik used points to a definition of TLA as "Three Letter Acronym" - then GCN is not an acronym, it's an abbreviation.

      @ByeLaw101 - what on Earth are you on about? How can you "pronounce" GCN? Are you Polish? They're the only people I know that seem to be able to pronounce words without vowels.

    2. Nigel 11


      TLA stands for Three Letter Abbreviation, nothing to do with acronym. I'll cite FLA, ETLA and YAFFLA in support.

  19. MD Rackham

    Wow! The Program Counter is...

    ...a general purpose register so they can do direct computation on it.

    Let's see, 1973, DEC PDP-11. Maybe the patents finally expired?

  20. John Savard

    Wonderful News

    In the past, ATI GPUs, although they could be used for computing, didn't seem to me to have the kind of support for that application that Nvidia GPUs had.

    I'm really happy to hear news like this; being able to put multiple cores on a chip meant that it was possible to design chips with significantly improved floating-point capability, and so I've been waiting for that to happen.

  21. markusgarvey

    what's an Intel?..

    If you ask me, all Intel does is blow hot air... I have been building computers for 15 years, and in the real world AMD has always worked better for me, seemed faster and more reliable... not to mention a hell of a lot cheaper... CPU+GPU is gonna be big... would be a good time to buy some stock...

  22. mhenriday

    Thanks, Rik,

    for an excellent article which does in fact deal with the subject at hand and manages to keep the obligatory Reg sarcasm to a reasonable minimum!


  23. Zog The Undeniable

    AMD have been the innovative one for years

    The original Athlon scared Intel and Opteron/x86-64 really made Itanium look stupid. Intel, though, have always caught up quickly and crushed them with brand strength (like no-one ever got fired for buying IBM or Microshaft). I suspect Intel's fabrication plants also have far better economies of scale. AMD frequently makes losses whereas Intel keeps on coining it.

    1. Pperson

      Re: AMD innovative

      Uh, sorry to be a denier, Zog, but while it is true that AMD really had a very good chip with the Athlon compared to Intel's Pentium 4, they then did exactly what Intel had done and rested on their laurels to maximise profit. Because facilitating good research is obviously a waste of money if your current chip is better than your rival's. Hey, can't blame 'em: Intel does the same. Funnily enough, Intel only got back in it by accident with Core (from the mobile guys) and the tables were turned. And we haven't moved since (4GHz, anyone?) and are now in hype mode trying to flog parallel processing.

      1. Nigel 11


        Silicon CPUs won't go much above 4GHz for good physical reasons. Assuming otherwise was the big mistake Intel made with the P4 design, which let AMD get ahead for a while.

        Parallel processing of one sort or another is the only way we're ever going to get chips 10x or 100x the power of a current Intel or AMD CPU.

  24. Anonymous Coward

    Doesn't help that Intel compilers disable optimizations on non-Intel CPUs.

    Doesn't help that Intel compilers disable optimizations on non-Intel CPUs. Thus many of the benchmarks are invalid if Intel compilers were used to build the executables.
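    For illustration only, here is a toy model of the dispatch pattern being complained about - keying optimized code paths on the CPU vendor string rather than on feature flags. Nothing here is Intel's actual code; the function and flag names are made up:

    ```python
    # Toy model of vendor-string dispatch. A feature-based dispatcher asks
    # "does this CPU support SSE2?"; a vendor-based one asks "is this CPU
    # GenuineIntel?" and routes every other vendor to the slow baseline
    # path, even when the hardware supports the same features.

    def pick_code_path(vendor, has_sse2, vendor_based=True):
        """Return which code path a (hypothetical) dispatcher would select."""
        if vendor_based:
            return "sse2" if (vendor == "GenuineIntel" and has_sse2) else "baseline"
        return "sse2" if has_sse2 else "baseline"

    # An AMD CPU with SSE2 gets the slow path under vendor dispatch...
    assert pick_code_path("AuthenticAMD", True, vendor_based=True) == "baseline"
    # ...but the fast path when dispatch keys on features instead.
    assert pick_code_path("AuthenticAMD", True, vendor_based=False) == "sse2"
    ```

    That second policy is all a benchmark needs to be fair: same features, same code path, regardless of the vendor string.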

    People need to remember, too, that Intel bribed and threatened PC makers with high chip pricing if they did any business with AMD. It did a lot of damage to AMD, but now, after losing in court and after AMD has released their APUs, bully Intel is beginning to pee in its pants.

    Intel is, as always, spending millions of dollars on misleading advertising for things that they haven't released yet. Similar to the misleading and failed efforts: Larrabee, Itanium, etc.

    The 3D transistors that they advertised so much, as the first ones to have them, are all hot air, because they still don't have them, and TSMC will have them right before or close to when Intel does.

  25. This post has been deleted by its author

  26. Anonymous Coward

    I doubt AMD has plans to topple Intel

    I believe AMD's philosophy is to provide the products that consumers want and let the market decide what makes them happy. Intel is fully capable of inflicting plenty of damage on themselves without AMD's help.

  27. Cyberius

    WOW AMD!

    Real geeks buy AMD. Remember that.

  28. Will Godfrey Silver badge

    TLA or not TLA

    Any decent dictionary is regularly updated to match a word's current popular usage. This is perhaps the only time when public opinion is never wrong! Wikipedia is pissing in the sea if it seriously thinks it can tell everyone to change their understanding of an expression's meaning.

    Indeed, for me (and just about everyone I know) it was 'Three Letter Acronym' several decades before there WAS a Wikipedia. Whether we should call it that is irrelevant - the fact is we do!

  29. Denarius Silver badge

    anyone remember Itanic internals ?

    with VLIW. Is this HP's secret master plan for VMS, HP-UX et al?

    Buy AMD to get some differentiation, migrate/merge with Itanic chippery, and save some IP from the chipwreck (sic)?

  30. ByeLaw101


    "@ByeLaw101 - what on Earth are you on about? How can you "pronounce" GCN? Are you Polish? They're the only people I know that seem to be able to pronounce words without vowels"

    Ah... I see your point, my bad. Although I suppose you could pronounce TLA as "Ta...LAAAAAA!" ;)

    Also, you should note that everyone really loves a pedant ;)

  31. Kevin Pollock

    ByeLaw101 - you are a gentleman!

    Unless you are a lady, in which case you are a...lady. Have a pint!

    But I see that Will Godfrey is now completely missing the point. I knew it was a mistake to quote Wikipedia!

    Will... please put aside the actual meaning of TLA (as you point out, you consider semantics to be a matter for popular vote). Forget that people don't give a crap about the documented meaning of words these days. Blimey - I hope you don't write mission-critical code for a living!

    In the article the author pointed to a reference that said TLA stands for "Three Letter Acronym". He was referring to GCN, which is not an acronym, it is an abbreviation.

    Ironically, if he had pointed to the Wikipedia entry instead of the PCMag entry he would have been fine, because Wikipedia allows both expansions of the abbreviation TLA (which is, by the way, consistent with the OED definition of TLA).

    Flippin' heck! It was only supposed to be a mildly pedantic comment in the first place! Now it's turning into Vogon Poetry!

  32. Will Godfrey Silver badge

    @Kevin - such a friendly sounding name

    Actually, the code I write is so obfuscated that even I don't understand it while I'm writing it! However, it always seems to work... although I'm never totally certain what it does

