Intel is going to start selling discrete GPUs in 2020

With Intel creating its own line of GPUs, gamers should have more choice when it comes to buying the best graphics card for their rigs.

Intel also followed up with a tweet featuring Raja's lovely, beaming face and a reassertion that its first discrete GPU was coming in 2020. Back in November, the company announced the hiring of Raja Koduri - formerly of AMD - and stated that he would "expand Intel's leading position in integrated graphics for the PC market with high-end discrete graphics solutions for a broad range of computing segments".

After hiring AMD's top Radeon architect, Raja Koduri, last year, the chipmaker flagged plans to deliver its own high-end discrete graphics cards.

Koduri now serves as Intel's chief architect and SVP of the newly created Core and Visual Computing Group. The news was revealed by Intel CEO Brian Krzanich during an analyst event last week. Intel may still just about hold sway in the CPU market, but it remains a long way behind in the graphics game. Navin Shenoy, who heads Intel's Data Center Group, acknowledged that the company will be introducing GPU products for both the data center and client markets.

Analyst Ryan Shrout notes that Intel's 2020 target is ambitious, given the three-year development cycle typical of a complex chip design and the need to build the GPU from scratch.

That likely means Intel can't take a particularly radical approach with its new GPUs, because they'll need to be as familiar as possible to encourage developers to code for them.

Though 2020 is a long way off, the above news should put AMD and Nvidia on notice.

Intel confirmed via Twitter that its first discrete GPU will arrive in 2020.

Intel has tried discrete graphics before, however. Its i740, a 350nm GPU co-designed with Lockheed Martin spinoff Real3D in the late 1990s, was ultimately a failure because of poor performance and soon disappeared from sight.

Separately, Intel's new display technology promises to slash the amount of power that laptop displays consume.

The chip, which is smaller than a pencil eraser, could lend itself to dramatic scalability for future computing scenarios. We often hear Nvidia and AMD talk about similar use cases for GPUs.