It shouldn't concern AMD on the tech side of things, but financially and in terms of planning it may be a real pain. Especially if it has a knock-on effect with the likes of mobo manufacturers. The theory being: if the tale is true and there isn't that great a market for AMD CPUs, we could see mobo manufacturers deciding it isn't viable to produce boards for them either.
I wouldn't hope for it to be true, but the sales of Intel i5/i7 CPUs can't be great, and although they're decent enough CPUs, I can't imagine AMD's figures being comparable. The problem here is the lack of advancement in the tech.
nVidia, with this CPU-on-GPU theory going on, is doing exactly what I hoped AMD were going to do when they acquired ATI. They had/have the ideal in-house relationship in place covering the whole backbone of a gaming rig (with maybe a little help from Realtek and any mobo manufacturer). They can either break away from standards or develop technical links, so that an AMD CPU and AMD GPU can do things the competition have absolutely no chance of competing with, purely based on architecture. I questioned the chances of them doing this the moment they acquired ATI, but they seriously don't appear to want to go down that road.
With the likes of game developers having nVidia travel to their company to assist in the dev of gaming's future, which costs nVidia millions each year - apparently their only department that is designed specifically not to make money - AMD are being invited by the game devs but declining. I can't see that being good for them, as these things create ongoing relationships. (To google this, you need to look for nVidia using stiff-arm corporate tactics to BLOCK AMD from any gaming dev! - which of course turned out not to be the case when the game devs were interviewed, but that's par for the course.)
Not good for the industry as a whole, or the enthusiast, as it could lead to the two camps building bigger and bigger walls between them. A situation where AMD vs nVidia(Intel) PCs would be like Xbox vs Sony. AMD's bread and butter being CPUs, would they even keep ATI as an interest? Very speculative and very forward-thinking.
nVidia could well be at a stage now where a 'game' could be developed that only needs the mobo's CPU to handle launching the OS and I/O to components. Further on, I can envisage this being less and less required. Spend big bucks on your nVidia GPU, and the CPU doesn't really matter because it doesn't impact the game at all. I can see it now, in geek bedrooms all over the world: people trying to run *nix on the likes of a GT300 in the future. Before the Amiga ideal died its popular death, this was very close to what happened to its tech. You launched the Amiga with its 68020 CPU, then the add-on PPC 603/604 card kicked in and took over. (This is of course decades ago, but even with 32-bit back then it was doable.) Myself, I was able to run Red Hat and the Mac flavour of the time, 8/9.x I think.
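Just to sketch what I mean by the GPU carrying the game while the CPU only handles launch and I/O - this is purely a made-up illustration, not anything nVidia actually ship - with CUDA today you can already push essentially all the per-frame number-crunching onto the GPU and leave the host CPU doing nothing but allocation, a kernel launch and a copy back:

// Minimal CUDA sketch (hypothetical example): the CPU only allocates, copies
// and launches; all the real work (a trivial physics-style update) runs on the GPU.
#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

__global__ void update_positions(float *pos, const float *vel, float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        pos[i] += vel[i] * dt;   // per-element work done entirely on the GPU
}

int main(void)
{
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);

    float *h_pos = (float *)malloc(bytes);
    float *h_vel = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { h_pos[i] = 0.0f; h_vel[i] = 1.0f; }

    float *d_pos, *d_vel;
    cudaMalloc(&d_pos, bytes);
    cudaMalloc(&d_vel, bytes);
    cudaMemcpy(d_pos, h_pos, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_vel, h_vel, bytes, cudaMemcpyHostToDevice);

    // The CPU's only job from here on: launch the kernel and wait.
    update_positions<<<(n + 255) / 256, 256>>>(d_pos, d_vel, 0.016f, n);
    cudaDeviceSynchronize();

    cudaMemcpy(h_pos, d_pos, bytes, cudaMemcpyDeviceToHost);
    printf("pos[0] after one frame: %f\n", h_pos[0]);

    cudaFree(d_pos); cudaFree(d_vel);
    free(h_pos); free(h_vel);
    return 0;
}

Obviously a real engine is vastly more involved, but it shows the division of labour I'm talking about: the faster the GPU, the less the host CPU matters to the frame itself.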
I think the biggest problem for the industry came around 2006 - the lack of advancements. It's not leading to the mass of upgrades we used to experience. There's no ~real~ need for upgrades other than ePeen. 20 more fps in your already-80fps game? Why, and for hundreds of ££££?
The upgrade for the experienced enthusiast now, I think, comes from curiosity. That's not going to produce massive sales though.