Intel forms graphics lab to make games look more real

With Intel starting to get serious about the discrete GPU market, the chipmaker has put together a research group that is pledging to improve the "entire field" of graphics, and that includes making games look even more realistic. Intel announced Wednesday it has formed the appropriately named Graphics Research Organization …

  1. Lucy in the Sky (with Diamonds)

    Tetris comes to mind...

    In Tetris, the biggest thing I was missing was photo-realistic mould growing on the bricks in real time...

    In Zork I have always felt that the text could have been smoother, with more rounded edges...

    Then again, in Crysis I just enjoyed shooting trees in half...

    1. vekkq

      Re: Tetris comes to mind...

      Sorry, but splitting Crysis trees is about physics, not graphics. We don't do that here. Photorealism in this lab only - spell pho-to.

  2. vekkq

    Thanks, but graphics are already as realistic as wanted

    Game CGI is deliberately shinier than real life. Making it more realistic would be a downgrade.

    1. Pascal Monett Silver badge

      Re: Thanks, but graphics are already as realistic as wanted

      I would tend to agree. It's a game. It doesn't need to be 100% photorealistic.

      It does, however, need to be optimised to not create lag simply because there's too much to render.

      When I'm playing Minecraft, or Diablo III, I'm not interested in realistic. When I'm playing 7 Days to Die, I find the world quite realistic enough.

      We're good on realism, I think. Let's get this thing optimised even more to up the framerates. There are never enough of those.

      1. Version 1.0 Silver badge
        Happy

        Re: Thanks, but graphics are already as realistic as wanted

        Adventure (aka "Colossal Cave Adventure") was the first game I ever played on a computer. I just loved reading and interacting with the game on my VT100 connected to the PDP-11, and writing this comment brings back so many memories of the glorious images I interacted with while creating a map of the cave on a pad of paper. I wonder if the new imaging system will compete with my memory?

  3. Plest Silver badge
    Headmaster

    Hyperrealism is not good for us

    I was watching a vid the other day where the guy was arguing that modern games aren't fun anymore for one reason: the graphics are too good. He said our brains aren't capable of taking in so much graphical information at once, so they struggle in the tense situations games create, and we're constantly getting distracted by amazing details. The reason older games and styles appeal is that the lack of detail lets you concentrate on having fun and enjoying the experience of a game.

    Modern games force us to slow right down, which not only ruins the fun but also lets game studios fudge the gameplay and stretch it out, which is one reason those who can ignore the details often find ways to complete modern games in an hour or two while the rest of us take around 30 hours to finish most modern titles. He added that the modern gaming industry is obsessed with graphical detail over everything else; indie studios are scoring big because they don't have the budgets, but they make up for it with better experiences that aren't graphics-obsessed.

  4. Stuart Castle Silver badge

    I like good graphics. I like a good story. I like interesting and varied gameplay. Not a big fan of the battle royale games, or anything multiplayer.

    While I would love to see more realistic graphics (particularly human faces, which most games don't get quite right yet), there needs to be some balance. I've played games where the main characters are absolutely beautifully animated, but some of the NPCs are not. I know why it is: even with today's computer hardware and game sizes nudging half a terabyte, studios reduce animations on minor characters to save space. They may also do it for budgetary reasons, to reduce the time spent creating the animations (and possibly the number of animators).

    It's a bit simplistic to say modern games aren't fun. They're like any other games: some are, some aren't. I've enjoyed hundreds of hours of gameplay in some games, and have completed the storyline of some of them many times. I also enjoy the odd arcade-style game I can just pick up and play without devoting hours to completing a mission.

    As for speedrunning games, I've never seen the point. When I buy a game, I want more than a few hours' entertainment. But then I've never been interested in notoriety or fame, and don't have (or want) a YouTube/Twitch channel I'm trying to sell, so the fame resulting from speedrunning any game does not interest me.

    1. Filippo Silver badge

      > It's a bit simplistic to say modern games aren't fun. They're like any other games: some are, some aren't.

      Yup. And, just like for anything else, we tend to remember the best ones and forget the dross.

      I think it is true, however, that top-quality graphics are overrated as a factor in a game's success.

  5. BGatez

    lipstick on pigs

    Better-looking crap games with endless $ buy-ins... yay?

Other stories you might like

  • Lenovo reveals small but mighty desktop workstation
    ThinkStation P360 Ultra packs latest Intel Core processor, Nvidia RTX A5000 GPU, support for eight monitors

    Lenovo has unveiled a small desktop workstation in a new physical format that's smaller than previous compact designs, but which it claims still has the type of performance professional users require.

    Available from the end of this month, the ThinkStation P360 Ultra comes in a chassis that is less than 4 liters in total volume, but packs in 12th Gen Intel Core processors – that's the latest Alder Lake generation with up to 16 cores, but not the Xeon chips that we would expect to see in a workstation – and an Nvidia RTX A5000 GPU.

    Other specifications include up to 128GB of DDR5 memory, two PCIe 4.0 slots, up to 8TB of storage using plug-in M.2 cards, plus dual Ethernet and Thunderbolt 4 ports, and support for up to eight displays, the latter of which will please many professional users. Pricing is expected to start at $1,299 in the US.

  • Nvidia wants to lure you to the Arm side with fresh server bait
    GPU giant promises big advancements with Arm-based Grace CPU, says the software is ready

    Interview 2023 is shaping up to become a big year for Arm-based server chips, and a significant part of this drive will come from Nvidia, which appears steadfast in its belief in the future of Arm, even if it can't own the company.

    Several system vendors are expected to push out servers next year that will use Nvidia's new Arm-based chips. These consist of the Grace Superchip, which combines two of Nvidia's Grace CPUs, and the Grace-Hopper Superchip, which brings together one Grace CPU with one Hopper GPU.

    The vendors lining up servers include American companies like Dell Technologies, HPE, and Supermicro, as well as Lenovo in Hong Kong and Inspur in China; ASUS, Foxconn, Gigabyte, and Wiwynn in Taiwan are also on board. The servers will target application areas where high performance is key: AI training and inference, high-performance computing, digital twins, and cloud gaming and graphics.

  • Intel delivers first discrete Arc desktop GPUs ... in China
    Why not just ship it in Narnia and call it a win?

    Updated Intel has said its first discrete Arc desktop GPUs will, as planned, go on sale this month. But only in China.

    The x86 giant's foray into discrete graphics processors has been difficult. Intel has baked 2D and 3D acceleration into its chipsets for years but watched as AMD and Nvidia swept the market with more powerful discrete GPU cards.

    Intel announced it would offer discrete GPUs of its own in 2018 and promised shipments would start in 2020. But it was not until 2021 that Intel launched the Arc brand for its GPU efforts and promised discrete graphics silicon for desktops and laptops would appear in Q1 2022.

  • Intel is running rings around AMD and Arm at the edge
    What will it take to loosen the x86 giant's edge stranglehold?

    Analysis Supermicro launched a wave of edge appliances using Intel's newly refreshed Xeon-D processors last week. The launch itself was nothing to write home about, but a thought occurred: with all the hype surrounding the outer reaches of computing that we call the edge, you'd think there would be more competition from chipmakers in this arena.

    So where are all the AMD and Arm-based edge appliances?

    A glance through the catalogs of the major OEMs – Dell, HPE, Lenovo, Inspur, Supermicro – returned plenty of results for AMD servers, but few, if any, validated for edge deployments. In fact, Supermicro was the only one of the five vendors that even offered an AMD-based edge appliance – which used an ageing Epyc processor. Hardly a great showing from AMD. Meanwhile, just one appliance from Inspur used an Arm-based chip from Nvidia.

  • TSMC may surpass Intel in quarterly revenue for first time
    Fab frenemies: x86 giant set to give Taiwanese chipmaker more money as it revitalizes foundry business

    In yet another sign of how fortunes have changed in the semiconductor industry, Taiwanese foundry giant TSMC is expected to surpass Intel in quarterly revenue for the first time.

    Wall Street analysts estimate TSMC will grow second-quarter revenue 43 percent quarter-over-quarter to $18.1 billion. Intel, on the other hand, is expected to see sales decline 2 percent sequentially to $17.98 billion in the same period, according to estimates collected by Yahoo Finance.

    The potential for TSMC to surpass Intel in quarterly revenue is indicative of how demand has grown for contract chip manufacturing, fueled by companies like Qualcomm, Nvidia, AMD, and Apple that design their own chips and outsource manufacturing to foundries like TSMC.

  • Nvidia taps Intel’s Sapphire Rapids CPU for Hopper-powered DGX H100
    A win against AMD as a much bigger war over AI compute plays out

    Nvidia has chosen Intel's next-generation Xeon Scalable processor, known as Sapphire Rapids, to go inside its upcoming DGX H100 AI system to showcase its flagship H100 GPU.

    Jensen Huang, co-founder and CEO of Nvidia, confirmed the CPU choice during a fireside chat Tuesday at the BofA Securities 2022 Global Technology Conference. Nvidia positions the DGX family as the premier vehicle for its datacenter GPUs, pre-loading the machines with its software and optimizing them to provide the fastest AI performance as individual systems or in large supercomputer clusters.

    Huang's confirmation answers a question we and other observers have had since the system was announced in March: which next-generation x86 server CPU the new DGX would use.

  • Will optics ever replace copper interconnects? We asked this silicon photonics startup
    Star Trek's glowing circuit boards may not be so crazy

    Science fiction is littered with fantastic visions of computing. One of the more pervasive is the idea that one day computers will run on light. After all, what’s faster than the speed of light?

    But it turns out Star Trek’s glowing circuit boards might be closer to reality than you think, Ayar Labs CTO Mark Wade tells The Register. While fiber optic communications have been around for half a century, we’ve only recently started applying the technology at the board level. Despite this, Wade expects that, within the next decade, optical waveguides will begin supplanting the copper traces on PCBs as shipments of optical I/O products take off.

    Driving this transition are a number of factors and emerging technologies that demand ever-higher bandwidths across longer distances without compromising on latency or power.

  • Arm jumps on ray tracing bandwagon with beefy GPU design
    British chip designer’s reveal comes months after mobile RT moves by AMD, Imagination

    Arm is beefing up its role in the rapidly evolving (yet long-standing) hardware-based real-time ray tracing arena.

    The company revealed on Tuesday that it will introduce the feature in its new flagship Immortalis-G715 GPU design for smartphones, promising to deliver graphics in mobile games that realistically recreate the way light interacts with objects.

    Arm is promoting the Immortalis-G715 as its best mobile GPU design yet, claiming that it will provide 15 percent faster performance and 15 percent better energy efficiency compared to the currently available Mali-G710.

  • Intel withholds Ohio fab ceremony over US chip subsidies inaction
    $20b factory construction start date unchanged – but the x86 giant is not happy

    Intel has found a new way to voice its displeasure over Congress' inability to pass $52 billion in subsidies to expand US semiconductor manufacturing: withholding a planned groundbreaking ceremony for its $20 billion fab mega-site in Ohio that stands to benefit from the federal funding.

    The Wall Street Journal reported that Intel was tentatively scheduled to hold a groundbreaking ceremony for the Ohio manufacturing site with state and federal bigwigs on July 22. But, in an email seen by the newspaper, the x86 giant told officials Wednesday it was indefinitely delaying the festivities "due in part to uncertainty around" the stalled Creating Helpful Incentives to Produce Semiconductors (CHIPS) for America Act.

    That proposed law authorizes the aforementioned subsidies for Intel and others, and so its delay is holding back funding for the chipmakers.

  • Intel demos multi-wavelength laser array integrated on silicon wafer
    Next stop – on-chip optical interconnects?

    Intel is claiming a significant advancement in its photonics research with an eight-wavelength laser array that is integrated on a silicon wafer, marking another step on the road to on-chip optical interconnects.

    This development from Intel Labs will enable the production of an optical source with the required performance for future high-volume applications, the chip giant claimed. These include co-packaged optics, where the optical components are combined in the same chip package as other components such as network switch silicon, and optical interconnects between processors.

    According to Intel Labs, its demonstration laser array was built using the company's "300-millimetre silicon photonics manufacturing process," which is already used to make optical transceivers, paving the way for high-volume manufacturing in future. The eight-wavelength array uses distributed feedback (DFB) laser diodes, which apparently refers to the use of a periodically structured element or diffraction grating inside the laser to generate a single frequency output.

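    For context, and as general laser physics rather than anything stated in Intel's announcement, the single-frequency output of a DFB laser follows from the Bragg condition of its built-in grating, which feeds light back into the cavity only near one wavelength:

    \[
    \lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda
    \]

    where \(\Lambda\) is the grating period and \(n_{\mathrm{eff}}\) is the effective refractive index of the guided mode; because only wavelengths close to \(\lambda_B\) receive strong feedback, the laser locks onto a single longitudinal mode.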
