In a recent interview, Raja Koduri, the head of Intel’s Core and Visual Computing Group, explains what convinced him to take the plunge and move away from AMD’s Radeon Technologies Group to new pastures at Intel. And no, it’s not just for the money. Although Intel does have a lot of that, too.
Raja Koduri has a star-studded CV. The one-time Apple graphics director turned AMD RTG boss took a sabbatical from the red team's service shortly after the AMD Vega architecture launched back in 2017. And it wasn't long afterwards that it was announced Koduri would be joining Intel to head up its efforts in the discrete graphics card game.
“When I took a break from AMD my thinking was what am I going to do for the next 10 years?” Koduri says to Barron’s. “Where I landed was we’re in the middle of a data explosion. The amount of data that is being generated in the world is way more than our ability to analyze, understand, and process. What are the technologies that are required for us to be able to keep up with the data and actually do some beautiful, amazing things with this data?
“As I was thinking through that and the elements that are required, the core pieces of technology required, and which company has these assets, these people, these resources, the only company that checked my list was Intel.”
But Intel doesn’t have a totally clean slate when it comes to graphics tech. And Koduri is keen to fend off naysayers harking back to the days of Larrabee, Intel’s first venture into discrete graphics, which ended up on the scrapheap before ever seeing the light of day.
“…one of the things that most people externally don’t realise is that Intel has been doing graphics for a long, long time,” Koduri continues. “We have 4,500 people. As you know, graphics is as much about software as it is about hardware.
“I want to set the record straight that Intel has a world-class design team sitting here. What I’m doing is helping them figure out how to build products that scale up from the low power, mobile domain up to petaflops—the big data center GPUs.”
Intel has confirmed its upcoming Xe graphics will be built upon the foundations laid by its integrated Gen11 graphics, set to arrive for the first time within 10nm Ice Lake CPUs. Koduri’s lofty task is to somehow shape that silicon into a power-efficient discrete graphics product and modernise Intel’s graphics driver efforts to unleash that power effectively on clients’ machines.
Koduri’s confidence is echoed by Intel’s head of gaming, Frank Soqui.
“We’re leaders in integrated graphics, so we’re not new to graphics,” Soqui says to PCGamesN at IEM Katowice. “It’s not like we woke up and said ‘oh my gosh, if only we could learn graphics.’ So how do you apply what you do with integrated graphics and discrete graphics? I don’t think that’s going to be the challenge.”
But will Intel be able to stack up against AMD and Nvidia in its debut heavyweight bout? Chipzilla is as confident as ever, and the industry hopes for a positive outcome that energises the market, but benchmark results speak louder than words, and Intel still has a lot to prove.