Nvidia offers Intel-thrashing netbook GPU tech

Nvidia is clearly in with Intel: it's announced a netbook integrated-graphics chipset designed to work with the Atom CPU. Enter the GeForce 9400 - codenamed 'Ion' - which essentially knocks the socks off the Intel integrated graphics Atom-based netbooks have thus far shipped with. For instance, Nvidia touted its part's …


This topic is closed for new posts.
  1. Anonymous Coward

    Now where is Jobs

    This could be a real neat Mac Mini...

  2. Brian Bush
    Jobs Halo

    Apple on the Brain

    Smells like an Atom/GeForce Apple TV is in the cards.

  3. Another Anonymous Coward


    I guess the high-definition output isn't intended for the netbook itself, but for hooking it up to a bigger TV. Could be a handy way to bring movies over to someone else's place without lugging a stack of DVDs.

    Good small-and-silent media centre potential for this board too. At the moment, choosing between the expense (and size) of separate graphics cards and the utterly mediocre performance of integrated graphics is a huge barrier to the thing that everyone seems to want - a dinky little box that isn't noisy and can handle HD content.

    Looks pretty good; now to wait and see if it cuts the mustard and does what Nvidia is claiming, without any playback jitters as soon as the movie has an explosion/motorbike chase/anything fast.

  4. Anonymous Coward
    Paris Hilton


    Now netbooks will be as powerful as notebooks, just smaller and more energy efficient and... who am I kidding? Although I revel in the fact that netbooks will be able to handle HD video, I can't help but think HW vendors aren't exactly cheering to adopt such a chipset, since it clearly eats into existing product lines and profits - no wonder no one was on board at the announcement. Given the current economic climate, innovation which requires additional investment in R&D to integrate (then build and advertise), while at the same time negatively impacting existing product lines, = just plain bad timing for a good innovation.

    Paris, because I feel like it.

  5. Anonymous Coward

    "minimal affect (sic) on battery life"

    In other words it will be just as inefficient as the current chipset, which lets down the Atom so badly? Give us longer run-times before mostly pointless graphics acceleration.

  6. Charlie van Becelaere
    Gates Horns

    "minimal affect on battery life"

    That's affect as in Seasonal Affect, one assumes. I didn't realise that batteries were susceptible, but perhaps there's a Second Battery Life. Perhaps this chipset is optimised for viewing flying penises then?

  7. MYOFB
    Dead Vulture

    SCC with HD (Graphics) . . . Bollocks!!

    Try getting your SCC reviews up to date with the latest and greatest BEFORE espousing this crap!!

    Or have El Reg fallen out with Samsung for some inexplicable reason?

    Look, I know times are hard at the moment but do you not think it a reasonable thing to do, every now and then, to focus on the here and now, instead of this 'Pie in the sky' cod piece of fluffy technology??

    Let's face it, if the SCC community doesn't take off further than it has done, then DreamWare (tm) such as this will never see the light of day. Why?

    Because the manufacturers haven't recouped their Manufacturing/Marketing costs, let alone the R & D costs!!

    So I put this to you El Reg: It's nice to hear and read about the innovations "Just around the corner" but sometimes, just sometimes, we have to settle for what's on the plate laid before us, eat it all up and trust that there will be better fayre on the menu in times to come.

    If we don't, then all we can expect is more of the past and nothing of the future.

    THAT icon? . . . Self-explanatory if you think about it!!

  8. Anonymous Coward


    is the key here. Not just mobility. Although usable in mobile units, this platform forms an excellent base for HTPCs and other multimedia devices. Together with themeable desktops, it can form a seamless environment that integrates all these types of devices regardless of resolution and size. From a software perspective this is already evident in GUI initiatives on mobiles like Moblin and Android, which are making their way onto media devices, with elements from these even being adopted in general-purpose desktops like KDE and GNOME.

  9. E_Nigma


    "In other words it will be just as inefficient as the current chipset, which lets down the Atom so badly? Give us longer run-times before mostly pointless graphics acceleration."

    No, not really. According to some websites, the Nvidia part can suck up to 18W, while Intel's current solution for Atom (the 945 plus the accompanying southbridge) uses a mere 7.5W in the worst-case scenario. So it won't be just as inefficient, it'll be a whole lot more inefficient. :D On a more serious note, while it won't do for a netbook with a 4-cell battery, it may be quite OK for the 6-cell ones, and it will certainly make for interesting, small, cheap and quiet media-center PCs.
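    To put that power gap in perspective, here is a back-of-the-envelope runtime sketch. The 18W and 7.5W worst-case chipset figures come from the comment above; the whole-system base load and the battery pack capacities are illustrative assumptions, not measured values.

```python
# Rough runtime estimate: how much a hungrier chipset shortens battery life.
# Chipset figures (18 W GeForce 9400M, 7.5 W Intel 945 + southbridge) are
# from the comment above; base load and pack capacities are assumed.

def runtime_hours(battery_wh, system_watts):
    """Idealised runtime: capacity divided by average draw."""
    return battery_wh / system_watts

BASE_LOAD_W = 8.0          # assumed CPU + screen + rest of system
INTEL_CHIPSET_W = 7.5      # worst case, per the comment
NVIDIA_CHIPSET_W = 18.0    # worst case, per the comment

for cells, wh in ((4, 31.0), (6, 47.0)):   # assumed pack capacities
    t_intel = runtime_hours(wh, BASE_LOAD_W + INTEL_CHIPSET_W)
    t_ion = runtime_hours(wh, BASE_LOAD_W + NVIDIA_CHIPSET_W)
    print(f"{cells}-cell ({wh} Wh): Intel ~{t_intel:.1f} h, Ion ~{t_ion:.1f} h")
```

    On these assumed numbers, the Ion roughly halves runtime at full chipset load, which matches the comment's point: painful on a 4-cell pack, tolerable on a 6-cell one, and irrelevant for a mains-powered media-center box.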

  10. Anonymous Coward

    Oh, it does eat batteries...

    According to Nvidia's web site, the 9400M "eats" 18W, compared to the 945, which "eats" 6W.

    AC: the companies *will* sell such netbooks, since it's a competitive market. They didn't want to sell netbooks, but then Asus came out with the Eee, which sold millions, and then everyone started creating and selling netbooks like crazy. So they know that if they don't sell a model with the 9400M, a competitor will, and that competitor will make money. This happens all the time in this market.

  11. E


    I was pretty disgusted by the way netbook makers bought into Atom: "ooooh, Intel has done all the engineering for us, we just have to wrap Intel's work in slightly different plastic and go to market!"

    We can buy about 50 different netbook products - and about 48 of them have essentially identical specs. Such a success for competition and innovation! Expletive expletive expletive. Even the HP 2133 - a really superior screen, superior keyboard and very good case - was killed off because HP could sell the same Atom box as everyone else. F*ck.

    You know what Intel's action is called? It is called monopoly power excluding competition from the market. Stick that in your pipes and smoke it, all you Reg and Inq hacks and all you uncritical review-site Atom fanbois. You heard it here first.

    ARM, AMD (ha! maybe in a past age, when the company could execute) and Via all have tech that could make netbooks that beat Atom on some or many metrics, but no computer company has had the balls to make something different.

    Nvidia might change this state of stasis. If so, then good for Nvidia.

    There really needs to be a truly vituperative icon to adorn a post - for those cases where the poster wants to make absolutely clear his level of sincere anger/disgust/contempt for the status quo.

  12. Anonymous Coward


    Until Nvidia confess just what chips are and are not affected by their current BGA "quality" problems, I'll avoid all of them (just in case).

  13. Patrick

    It's all gone wrong lol

    I was just reading this and thinking: I have a Dell D400 with a Pentium M CPU. It's just about the right size for me as a second PC, as I use it for looking at the internet etc. while watching TV. The battery is only 3800mAh, but this still gives 2 hours and 41 minutes in desktop mode, and that could easily be boosted to over three hours by adjusting the screen brightness and power management settings - which is what I read some netbooks manage.

    I am able to load web pages at a decent turn of speed (faster than the Atom - I know, I've had three netbooks), and there's Wi-Fi and Bluetooth built in. I've been able to play games like Midtown Madness 2, San Andreas and The Simpsons Hit & Run, just in case I need to, lol. Most at 640x480 res, but as it's a small screen and it does a very good job of resizing the image, it's still more than playable - up to 25FPS on GTA San Andreas, which is not too bad (honest).

    I understand that most netbooks are around 1kg, while this is 1.7kg, but I am sure it would be possible today to lower that.

    The point is, there is older stuff out there that works loads better than the very slow Atom, and if they used the Centrino/Pentium M setup with the 4800/6600mAh batteries, they could get the over-four-hours battery life people want.

    But at least you could do something on it. And they could use the latest die size for the CPU, and it's sorted...

    Or, to cut a long story short: the tech is already out there to do what they need....

