The 'third era' of app development will be fast, simple, and compact

At the annual Hot Chips symposium on high-performance chippery on Sunday, the assembled chipheads were led through a four-hour deep dive into the latest developments on marrying the power of CPUs, GPUs, DSPs, DMA engines, codecs, and other accelerators through the development of an open source programming model. The tutorial …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward

    "Having multiple different but equivalent ways of doing the same thing really doesn't benefit anyone."

    That's only true if 'equivalent' == 'optimal'. In almost all real-world cases the outcome of a committee-led design is sub-optimal, because it has to address competing drivers from different members of the alliance.

    There's room for different interpretations and implementations.

    1. Destroy All Monsters Silver badge
      Trollface

      How is Ada doing?

      1. Anonymous Coward

        Re: "How is Ada doing?"

        It's doing OK in its intended market, as far as I can see. It has cornered the market as an intermediate language in safety-critical applications.

        Whether that is a good thing or a bad thing needs further discussion, probably on a separate occasion.

        But since we're here...

        Ada in this picture is often an intermediate language because complex system design is typically done in a connect-the-boxes GUI tool (several are available), which duly produces what the GUI tool vendor describes as Ada. It is a matter of faith that this complex and largely unprovable transformation process is obviously entirely trustworthy in every respect.

        The "Ada" is then compiled and linked to produce an executable binary. Again, it is a matter of faith that this complex and unprovable transformation process is entirely trustworthy in every respect.

        If there's time and budget, the resulting trustworthy executable may be subject to some testing, though in any realistic system the testing can never be exhaustive. So a great deal is taken on trust: if the design meets the spec (the design tool tested that), the executable must be OK too, right?

        If we're talking recent avionics (e.g. Dreamliner), it appears the resulting system is then subject to a regulatory approval process involving ticking a few process-related boxes (no subject matter knowledge necessary).

        How do you reckon Ada's doing?

        1. Destroy All Monsters Silver badge

          Re: "How is Ada doing?"

          It exists, but it's nowhere near where it was originally INTENDED to be: a do-all language for military (and of course other high-reliability) applications, and the more-or-less single language to use in that case. That idea was doomed from the start.

          Yes, it should be used more often. Instead of C++ hackarounds.

  2. Gene Cash Silver badge
    Facepalm

    Babble & doublespeak

    "pass a pointer to another processor, and execute on that data"

    Sounds like a hardware guy that doesn't know how code really works. Or a sales mouth.

    "we move the compute rather than the data"

    Yup, you just confirmed that. You have no idea what you're talking about.

    1. Destroy All Monsters Silver badge
    2. Steve Knox
      Facepalm

      Re: Babble & doublespeak

      "pass a pointer to another processor, and execute on that data"

      Sounds like a hardware guy that doesn't know how code really works. Or a sales mouth.

      Sounds like a high-level code monkey who doesn't understand how hardware and the intermediate compilers/virtual machines/os/firmware can make very complicated changes to exactly how their "code" actually runs.

      "we move the compute rather than the data"

      Yup, you just confirmed that. You have no idea what you're talking about

      Confirmed.

      1. Harry Kiri
        FAIL

        Re: Babble & doublespeak

        No - read the article.

        "Unified memory addressing across all processor types, for example, is a key feature of HSA. "It's fundamental that we can allocate memory on one processor," Rogers said, "pass a pointer to another processor, and execute on that data – we move the compute rather than the data.""

        So a comment that you're slagging off had a perfectly reasonable rider on it, which was omitted.

        CUDA works in a not-too-distant way, utilising shared memory.

        Have you ever worked on a multi-processor system? Unless you bind your process/program/app (whatever it's called this week) it will happily run wherever. If you're running virtual memory, which is pretty likely, a pointer will not equate to a physical address, and whatever processor ends up being scheduled in the next time-slice will use the same pages, pointers and data - moving the compute, not the data. Yes, internally the CPU will need to recache and sort out its pipelines, but that's internal to the compute unit.

        There's more to a brand-new advanced idea than von Neumann.
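
        As an aside, the "pass a pointer, move the compute" idea can be demonstrated on perfectly ordinary hardware. Here's a minimal C++ sketch - just standard threads and virtual memory, nothing HSA- or CUDA-specific assumed, and the buffer size is arbitrary: one thread allocates, a worker thread (quite possibly scheduled on a different core) dereferences the very same virtual address and does its work in place, and nothing gets copied.

        #include <cstddef>
        #include <cstdio>
        #include <thread>
        #include <vector>

        int main() {
            // The main thread allocates and initialises the buffer.
            std::vector<double> data(1 << 20, 1.0);
            double *p = data.data();                 // the pointer we hand over

            // A worker thread - possibly running on another core - executes on that
            // data through the same virtual address: the compute moves, not the data.
            std::thread worker([p, n = data.size()] {
                for (std::size_t i = 0; i < n; ++i) p[i] *= 2.0;
            });
            worker.join();

            printf("data[0] = %.1f\n", data[0]);     // 2.0, visible to the main thread
            return 0;
        }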

  3. Anonymous Coward

    You are, sir, a moron.

    Ah, so easy to criticize, so hard to understand.

  4. Philius
    Facepalm

    Metaphorical abstraction, anyone?

    It sounds like they're aiming at a system that allows declarative programming, as each system has its own procedural approaches.

    I recently thought about how I could use OpenGL to rotate the bits in an array of bytes (swap rows and columns) to speed up some code - it made my head hurt.
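
    For what it's worth, the row/column swap itself doesn't need OpenGL. Below is a minimal, unoptimised C++ sketch of an 8x8 bit-matrix transpose - it treats 8 bytes as an 8x8 grid of bits and swaps rows for columns on the CPU; a production version would use the usual bit-twiddling tricks rather than the double loop, and the test pattern is just for illustration.

    #include <array>
    #include <cstdint>
    #include <cstdio>

    // Bit j of input row i becomes bit i of output row j.
    static std::array<std::uint8_t, 8> transpose8(const std::array<std::uint8_t, 8> &rows) {
        std::array<std::uint8_t, 8> cols{};
        for (int i = 0; i < 8; ++i)
            for (int j = 0; j < 8; ++j)
                if (rows[i] & (1u << j))
                    cols[j] |= static_cast<std::uint8_t>(1u << i);
        return cols;
    }

    int main() {
        std::array<std::uint8_t, 8> m = {0xFF, 0, 0, 0, 0, 0, 0, 0};  // one solid row
        std::array<std::uint8_t, 8> t = transpose8(m);
        printf("0x%02X\n", t[0]);  // 0x01: the solid row became bit 0 of every row
        return 0;
    }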

  5. Destroy All Monsters Silver badge
    Holmes

    Java sure is COBOL naow

    I'm sure there is a lot of effort and excellent engineering going into forcing the square peg of Java into every hole (body orifices excepted, I hope), but -- did they even mention languages more appropriate to this "third era" than one that has been designed squarely in the "first era" and is being controlled by Godfather Larry? How will the JVM be applied to this heterogeneous computing? Won't other VMs be more appropriate? Where is Erlang for example? What if I want to run ancient PARLOG code from the 5th generation project?

  6. Pete 2 Silver badge

    Add another one to the pile

    I'm sure I've heard this rhetoric, wrapped around different "developments", every few years for the last few decades.

    Whether it was said about high-level languages, application generators, 3G and 4G languages, Object-Orientated, Agile, Multi-threading or whatever fad is getting taught in tertiary education these days.

    The point is that adding more layers just hides the underlying stuff. It's still there: in all its flawed implementations, assumptions and incompatibilities.

    Programming is hard. Mainly because people who do it don't have complete familiarity with the tools they are using, place showing off their intelligence above writing clear code, only have a vague idea of what the final goal should look like, or haven't been given the time to do a proper job because they keep getting new and conflicting changes added to whatever specifications they started with. None of these problems is helped by having a new programming fashion, toolset or language foisted on them every few years (or university generation: the time between a new intake and them graduating).

    Simply coming up with a new set of buzzwords and more hoopla won't fix the underlying difficulties. It certainly won't help inexperienced and time-pressured programmers make better applications.

    1. Matt Bryant Silver badge
      Thumb Up

      Re: Pete 2 Re: Add another one to the pile

      Agreed. If anything, all it will do is make things worse by giving PHBs the unrealistic expectation that this will make all phases of a project run faster, and the KEY phase - proper requirements analysis and design - will be under even more pressure to be rushed. Makes no difference if you have an old screwdriver or an electric powerdriver when what the job actually requires is a wrench.

  7. John Smith 19 Gold badge
    Unhappy

    Pass pointers to data rather than data itself...

    Erlang?

    Interestingly, now that world + dog has standardised on the IEEE floating point standard, and given that most of these systems are back end (minimal user interaction), it should be fairly easy to write portable code.

    Efficient portable code is a different question......
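
    A small C++ illustration of why the "efficient" half is the hard part, assuming nothing beyond standard IEEE 754 doubles: each individual operation is bit-reproducible across platforms, but addition is not associative, so any optimisation or parallel reduction that reorders a sum can legitimately change the result.

    #include <cstdio>

    int main() {
        // Same three IEEE 754 doubles, different association, different answer.
        double a = 1e16, b = -1e16, c = 1.0;
        printf("(a + b) + c = %.1f\n", (a + b) + c);  // prints 1.0
        printf("a + (b + c) = %.1f\n", a + (b + c));  // prints 0.0
        return 0;
    }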

  8. Stephen Channell
    Meh

    The 'third era' of app development will be: immutable, functional, fast, simple, and compact

    The trick to parallel optimisation is to treat data structures as immutable and then pack them into blocks that vector instructions can use and multiple cores can work on without cache contention. That's not easy with Java, because it has no struct type. To avoid multiple threads using the same indexers it is best to take a functional approach and not have "variables" either.

    Sure, you can get Java to work on a GPGPU/MIC (just like Fortran or COBOL), but will the amount of change justify the effort? Attempts at a Java BLAS (Basic Linear Algebra) point to Java being an EPIC FAIL or evolving into a very different language like X10.
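
    Roughly what that looks like in C++ (a sketch under the assumptions above, not a benchmark - thread count and block split are arbitrary): immutable structure-of-arrays inputs, each worker writing only its own contiguous block of the output, so there are no shared "variables" to contend over and the inner loop is easy for the compiler to vectorise.

    #include <cstddef>
    #include <cstdio>
    #include <thread>
    #include <vector>

    // out[i] = a * x[i] + y[i] over one contiguous block; x and y are never written.
    static void saxpy_block(const float *x, const float *y, float *out,
                            std::size_t begin, std::size_t end, float a) {
        for (std::size_t i = begin; i < end; ++i)
            out[i] = a * x[i] + y[i];
    }

    int main() {
        const std::size_t n = 1 << 20, nblocks = 4;
        const std::vector<float> x(n, 1.0f), y(n, 2.0f);  // treated as immutable
        std::vector<float> out(n);

        std::vector<std::thread> pool;
        for (std::size_t b = 0; b < nblocks; ++b)
            pool.emplace_back(saxpy_block, x.data(), y.data(), out.data(),
                              b * n / nblocks, (b + 1) * n / nblocks, 3.0f);
        for (auto &t : pool) t.join();

        printf("out[0] = %.1f\n", out[0]);  // 3*1 + 2 = 5.0
        return 0;
    }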

  9. BornToWin

    AMD and HSA are the clear choice

    That, however, won't stop Nvidia or InHell from doing their own thing and suffering the consequences as both try to find a viable business plan now that their empires are crumbling.
