Mythic bet big on analog AI but has run out of cash

A startup that bet on the old concept of analog chips to provide energy-efficient AI computing has run out of capital. Texas-based Mythic "ran out of runway with the investors before we could get to revenue," said Ty Garibay, the startup's vice president of engineering, in a Tuesday LinkedIn post. Garibay and several other …

  1. Zola

    10 years old?

    When is a startup no longer a startup?

    Is IBM still a startup? Or HP?

    But 10 years is a lot of runway... maybe they were trying to make something nobody wanted to buy.

    1. jonfr

      Re: 10 years old?

      Because technology is all about what is fashionable today. Actual stuff that actually works is not, as this company just proved. For development like this, I don't think ten years is that long. But they clearly needed some other revenue source than just investment into their development.

      Developing new technologies is also really hard and expensive. I am sure this kind of basic technology for androids (think Star Trek) is going to get used sometime in the future.

    2. andrewj

      Re: 10 years old?

      Because some things are hard and take more than 10 years to get to work? Just a thought.

      1. Zola

        Re: 10 years old?

        They did make it, and apparently it worked.

        The problem is that nobody wanted to buy it, or at least not in big enough numbers to keep them afloat.

  2. Anonymous Coward

    Now that they are bust, presumably any IP they did have can be gobbled up at bargain basement prices.

    Or, more likely, they are a vapourware outfit and don't actually have anything to show.

    1. Michael Wojcik Silver badge

      RTFA.

      "... AI Hardware Summit in September, where the company showed off analog chips that could run the YOLOv5 object detection algorithm on high-resolution video at 60 frames per second while only consuming 3.5 watts."

      So they have, in fact, shown something (a rough per-frame energy figure from those numbers is sketched below).

      Power consumption is gradually becoming a bigger and bigger concern for large-model ML. But hardware choices move a lot slower than software in the field, and the big consumers mostly already have a lot of still-usable hardware in data centers, so they wouldn't be in the market for something like this yet. And rewriting ML software to work with a different architecture could be expensive.

      Really low-power ML solutions are more tempting for embedded and "edge" (ugh) applications, so there is a potential market for something like this in applications such as self-driving vehicles. (Whether that's desirable at all is another question.) But again the cost of switching to a new architecture is a big barrier.
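
      For scale, here is the per-frame energy the quoted demo implies. The 3.5 W and 60 fps values come from the quote above; only the division is mine, and it assumes the chip draws that power continuously while inferencing:

        # Energy per inference frame implied by the quoted demo figures.
        power_watts = 3.5          # claimed power draw
        frames_per_second = 60     # claimed YOLOv5 throughput
        millijoules_per_frame = 1e3 * power_watts / frames_per_second
        print(f"{millijoules_per_frame:.0f} mJ per frame")  # ~58 mJ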

      1. Anonymous Coward

        Tech demo is not product.

        A company running for ten years without selling anything has forgotten the number one priority.

  3. John Smith 19 Gold badge

    *if* that demo was what it appeared to be

    Not the first time a HW supplier has played that game, is it?

    But they're right, though. IIRC the actual estimate for one human brain is 1 petaflop using 400W of power. That works out to about 0.4 x 10^-12 J per operation (quick check below).

    Time will tell if their IP was worth anything or not.

    If they were trying to develop an analog manufacturing process, then 10 years is not that long; but if it was adapting a high-density CMOS process to do this, then yes, that was a lot of time.
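
    A quick check on that figure. The 1 petaflop and 400 W numbers are the commenter's estimate, not verified; only the division is mine (note the unit is joules per operation, since watts divided by operations per second gives energy per operation):

      # Energy per operation implied by the brain estimate above.
      ops_per_second = 1e15      # 1 petaflop
      power_watts = 400.0        # estimated power draw
      joules_per_op = power_watts / ops_per_second
      print(f"{joules_per_op:.1e} J per operation")  # 4.0e-13, i.e. 0.4 x 10^-12 J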

  4. Kevin McMurtrie Silver badge

    Maybe not critical mass

    The barrier to adoption would be that it's different. Assuming they're using flash cells, the precision is going to be low and it can drift over time. It may have more than enough precision for AI, but it's not going to match digital models and it's not going to have exactly reproducible results. It might need a feedback loop to keep it trained (a toy sketch of the effect follows below).

    On the flip side, an analog calculator is exactly what you'd want to take the 'A' out of 'AI' in the future.
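
    A toy numpy sketch of that reproducibility point: an analog matrix-vector multiply with quantized, drifting cells versus an exact digital one. The 6-bit cell precision and the drift/noise magnitudes are illustrative guesses, not Mythic's actual specs:

      import numpy as np

      rng = np.random.default_rng()  # deliberately unseeded: results vary per run

      # Weights as they'd be stored in analog cells, plus an input vector.
      W = rng.standard_normal((64, 128))
      x = rng.standard_normal(128)

      digital = W @ x  # exact, bit-reproducible reference

      # Analog version: quantize weights to ~6-bit cell levels...
      levels = 2 ** 6
      w_min, w_max = W.min(), W.max()
      W_q = np.round((W - w_min) / (w_max - w_min) * (levels - 1))
      W_q = W_q / (levels - 1) * (w_max - w_min) + w_min

      # ...then apply per-cell drift and per-read noise before the multiply.
      drift = 1.0 + 0.02 * rng.standard_normal(W.shape)
      noise = 0.01 * rng.standard_normal(64)
      analog = (W_q * drift) @ x + noise

      rel_err = np.linalg.norm(analog - digital) / np.linalg.norm(digital)
      print(f"relative error vs digital: {rel_err:.3f}")  # nonzero, varies per run

    The "feedback loop" suggested above would amount to periodically re-measuring those drift factors and compensating for them.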

  5. Anonymous Coward
    Anonymous Coward

    Other related research

    From New Scientist - ‘Artificial synapse’ could make neural networks work more like brains. Networks of nanoscale resistors that work in a similar way to nerve cells in the body could offer advantages over digital machine learning.

    ... have created a nanoscale resistor that transmits protons from one terminal to another. This functions a bit like a synapse, a connection between two neurons, where ions flow in one direction to transmit information. But these “artificial synapses” are 1000 times smaller and 10,000 times faster than their biological counterparts. “We are doing somewhat similar things [to biology], like ion transport, but we are now doing it so fast, whereas biology couldn’t,” says Onen, whose device is a million times faster than previous proton-transporting devices.
