AMD Raven Ridge Ryzen 3 2200G and Ryzen 5 2400G Review

What is the test bench setup?

HDMI 2.0? I have an HDMI-only 4K TV. The 2200G will be by far the cheapest solution that supports HDMI 2.0. The only native alternative is a GT 1030/RX 550, which on its own costs almost as much as the 2200G.
 
No video, Tom?

The Video is now in the review.

Test bench setup is on page 2 at the bottom.

HDMI 2.0 will be down to what your motherboard supports, as some AM4 motherboards only have HDMI 1.4 as far as I am aware.
 
Thanks. The test bench listing probably has a typo that confused me about the motherboard:

"AMD Ryzen 3 2200G & Ryzen 5 2400G
ASUS Prime Z370-PLUS
G.Skill Flare X 3200MHz memory
Corsair RM1000i
Corsair MP500 512GB
Corsair H110i GT
Windows 10"

That probably should be X370-PLUS rather than Z370-PLUS? Anyway, I missed it down there, so thanks for answering.

As for what the motherboards support, the specs are a mess:

X370 XPOWER GAMING TITANIUM

"-1 x HDMI™ 2.0 port, supports a maximum resolution of 4096x2160@60Hz (1)
-1 x DisplayPort, supports a maximum resolution of 4096x2160@60Hz (2)

(1) Only support when using a 7th Gen A-series/ Athlon™ processors
(2) Maximum shared memory of 2048 MB"


What does that mean?

GA-AX370-Gaming K7

" Integrated Graphics Processor:
1. 1 x HDMI port, supporting a maximum resolution of 4096x2160@24 Hz
* Support for HDMI 1.4 version.
2. Maximum shared memory of 2 GB

* Actual support may vary by CPU."


These specs could mean a lot of different things. They may simply have been written for the Athlon CPUs available at the motherboard's launch, or, as with the K7, they may (though it isn't clearly stated) only support HDMI 1.4. It is clearly CPU-dependent, and the old Athlons only support HDMI 1.4, so what is the story with the MSI board? According to its spec sheet, the HDMI port is after all only available when using a last-gen Athlon.

Nobody seems to know, but a lot of people are opinionated. I would really appreciate it if you could try it out.

As for HDMI 1.4, the maximum refresh rate at 4096x2160 is 24 Hz.
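A rough sanity check on those motherboard numbers (figures taken from the HDMI/CTA-861 specs, not from the review): the binding limit is the maximum TMDS pixel clock, roughly 340 MHz for HDMI 1.4 versus 600 MHz for HDMI 2.0, and the clock a mode needs is set by the total frame timing (visible pixels plus blanking), not the visible resolution alone:

```python
# Sketch: why HDMI 1.4 tops out around 4K@24Hz while HDMI 2.0 reaches 4K@60Hz.
# Pixel clock limits and total timings below are from the HDMI / CTA-861 specs,
# not from the review itself.

HDMI_1_4_MAX_MHZ = 340  # max TMDS pixel clock, HDMI 1.4
HDMI_2_0_MAX_MHZ = 600  # max TMDS pixel clock, HDMI 2.0

def pixel_clock_mhz(total_width, total_height, refresh_hz):
    """Pixel clock in MHz for a given total (visible + blanking) frame timing."""
    return total_width * total_height * refresh_hz / 1e6

# Standard CTA-861 total timings for 3840x2160 (visible):
uhd_24 = pixel_clock_mhz(5500, 2250, 24)   # 2160p24: 297 MHz
uhd_60 = pixel_clock_mhz(4400, 2250, 60)   # 2160p60: 594 MHz

print(f"4K@24Hz needs {uhd_24:.0f} MHz -> fits HDMI 1.4: {uhd_24 <= HDMI_1_4_MAX_MHZ}")
print(f"4K@60Hz needs {uhd_60:.0f} MHz -> fits HDMI 1.4: {uhd_60 <= HDMI_1_4_MAX_MHZ}, "
      f"fits HDMI 2.0: {uhd_60 <= HDMI_2_0_MAX_MHZ}")
```

So 4K@24Hz squeezes under the HDMI 1.4 clock ceiling, while 4K@60Hz needs roughly double the clock, which is exactly what HDMI 2.0 adds.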
 
Sorry, that was a typo: we used the ASUS Prime B350-Plus.

I will get Tom to send a few questions AMD's way and see if we can get some answers on the HDMI port stuff, alongside a few other things we want answered.
 
Very impressive. I don't see much reason to recommend the 1300X or 1500X anymore. You pay basically the same amount and get a free iGPU plus better memory support.

For 720p gaming this is perfect.
For 1080p esports titles it will shred. It still isn't really AAA-capable for most games, but it would be around console level, which says a lot about how far we have come.

AMD's Ryzen architecture is really turning the company around.
 
Great CPU, not so great APU. I know even *I* said not to expect too much, but I have to admit I was expecting more than this. Yes, the GPU on it kills Intel's, but IMO not by enough. Someone pointed out today that you can get a Pentium and a 1030 for around the same price that do the same thing. Sure, the Pentium won't be as good going into the future.

IDK man, AMD has all of that technology and we get this... It's barely an APU.
 
I think you should do more research, because it definitely is better than anything else at this price.
 
I didn't say it wasn't, dude. I just said that with the technology they have, it could have been miles better. Is that not a fair assumption? They have Vega all the way up to Vega 64, and we get about a tenth of that. I guess my point is that they really should be pushing on now, i.e. making viable gaming APUs that don't suffer in any shape or form. They have the power on both fronts to make a very good APU, but instead we get these budget-bin things.

Is the price great? Yes, definitely. However, they barely manage 1080p at the lowest settings, and some people have even mentioned 720p, which is utter blasphemy in this day and age. I would rather the thing cost £300 but do the job of a £500 CPU/GPU combo. What they released today is a mere taster of what they could be doing.
 
They look sweet. Overwatch being playable is CRAZY for integrated graphics. The laptop version of the 2200G is going to be a really good cheap option for a low-tier "gaming" laptop. By the way, if you check eBay there are some tower coolers that use the latch mechanism AM3 and AM4 share, so they should be compatible. They aren't too expensive; performance isn't going to be Epyc, but I do believe they will fare better than the stock cooler.
 
So you're expecting a 65 W TDP die to deliver basically 60 FPS at better-than-low settings for only $180?

It is miles ahead of everything else; that's what you are overlooking. Instead of looking at what it is, you are holding it to an unrealistic expectation of what should exist now rather than in the future, which is where those parts will be.

The fact that it gives a GT 1030 a run for its money while costing less than a 1500X, and beats the 1500X while offering better memory speed and support, is massive.
If you don't think so, then really you're just being critical for the sake of it. It's more than twice as fast as the APU it replaced, probably close to 500%. Not sure how much more it needs to be to impress you.
 
Nope, I am saying crank the TDP to 120 W and give us something useful.

We all know that TDP figures are BS. As soon as you move the multiplier up even one notch (and why wouldn't you? They are unlocked) you basically tear up that spec and throw it out of the window.

The i7 950 ate 200 W at full pelt at 4 GHz. Modern Kaby Lake-X is the same. No matter what the TDP of an overclockable CPU is, you are going to blow straight past it as soon as you rag on it, which of course you will, because you paid for it.

Maybe it is time AMD took a risk, changed the format to something in between ITX and mATX, used laptop memory, and gave us an APU that kicks butt?

They have the technology, and they know how to use it (IF), so why are they making lame things like this? Surely they can see the market for an APU that is good enough to make you not even want to bother with a discrete GPU.
 
It doesn't even seem like you did any research into the reviews for it.
If you also think TDP is useless, then that's your issue too. We are talking about TDP for AMD's architecture, not comparing to Intel, and in AMD's terms, even looking at other reviews online, these consume little power even overclocked.
There's really no point talking about it because you are so clearly set on it being lame.
 
Re the last sentence: no, not at all, man. I do think it's kinda lame (note: kinda), but I expected it to be that way anyway, just not this lame.

However! After spending about two hours last night watching videos, it's rather clear that the whole thing is a bit of a dog's dinner at the moment and could actually be a lot better than I think it is, given how buggy it is. The Jay video is like a horror show: BSODs, hard locks, freezing, massive input lag, etc.

So it may get better over the coming weeks. I'm only disappointed because I want one. I'm going to build something soon. I'm not quite sure what yet, but it's a toss-up between a Sega AM dev unit that runs all of the AM3 games (Daytona 2, Star Wars, etc.) on a TV, or "The G.E.C.K." (it's from Fallout 3, a part that sorts the filters in a vault, but also known as the Garden of Eden Creation Kit for making mods), which will be a portable computer with a flip-up screen (not a laptop, for the obvious reasons). I've had this lovely little 7" touchscreen sitting here for what feels like forever; it would be amazing to make a portable computer I can play FO4 on :D
 
Remember, guys, that the Raven Ridge die is almost as large as Summit Ridge, the 8-core Ryzen die.

While AMD could theoretically go bigger, it would drive costs up significantly and make the use of system DRAM as VRAM problematic. A larger APU would need HBM on the package, which would probably make it too big for AM4, never mind the problems with the required interposer.

The only way to work around DRAM bandwidth limitations without HBM is to go the console route and use an insanely wide bus, the Xbox One using what is equivalent to quad-channel DDR3 memory. The Xbox One X goes even further. This would require a new socket, which is again a problem.

While AMD could make a bigger APU, that doesn't mean that they should.
 
AMD's plan at one point was to get Fusion (i.e. APUs) to the point where you would not need a GPU at all. That was their future, and a good one IMO. Think about all of the material on a GPU, beyond just the core, that you pay for every time you buy one.

And talking of consoles? Same again. AMD's aim was to get into all of the consoles and then make it so that their GPUs, and eventually APUs, would be enough to run any game at the highest settings, because if a console can do it, why can't an APU twice as powerful as that?

I still think it is the future. Maybe at some point Nvidia will invest in a CPU architecture and do it themselves; they've had plenty of practice with Tegra.
 