AlienALX
Well-known member
https://www.youtube.com/watch?v=AavuWT17X48&t=850s
Well worth a watch. One of his best videos to date.
It doesn't really matter, does it? All that will happen is the pig miners will just buy them all cheaper. It actually makes no sense, because now if someone thinks "Hey, I can afford Vega and it's worth that much," they can't buy one anyway.
Seriously, Vega is like Cartmanland..
"So much fun at Cartmanland, but you can't come !"
I love you for this. I properly laughed.
haha one of my all time faves that ep. So many Cartman classics..
You bet your fat clown ass it does! Oh wait, no, the fat clown ass was from Ladder to Heaven.
A where were yow? when dey built that ladder to heaven? did you feel like cryin'? or did you think it was kinda gay?
hahahahahaha !!!
Good video from Gamers Nexus talking about HBM vs GDDR5.
https://youtu.be/p9ih5vmcDEk
Apparently GDDR5 would be about $120 cheaper per card for the same amount of RAM, and it looks like AMD was kinda forced to stick with HBM for Vega because of power draw. HBM draws significantly less power than GDDR5, and Vega would suck down way too much juice without it.
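To put rough numbers on the power argument, here's a back-of-envelope sketch using AMD's own HBM1-era marketing figures (~35 GB/s per watt for HBM vs ~10.66 GB/s per watt for GDDR5). These efficiency numbers are assumptions pulled from AMD's old slides, not measured Vega data, so treat the result as illustrative only:

```python
# Back-of-envelope memory power comparison at Vega 64's rated bandwidth.
# Efficiency figures (GB/s per watt) are AMD's HBM1 marketing numbers,
# not measured Vega data -- purely illustrative assumptions.

TARGET_BANDWIDTH_GBPS = 484  # Vega 64's rated HBM2 bandwidth

def memory_power_watts(bandwidth_gbps, gbps_per_watt):
    """Estimate memory subsystem power for a given bandwidth target."""
    return bandwidth_gbps / gbps_per_watt

gddr5_watts = memory_power_watts(TARGET_BANDWIDTH_GBPS, 10.66)  # GDDR5
hbm_watts = memory_power_watts(TARGET_BANDWIDTH_GBPS, 35.0)     # HBM

print(f"GDDR5: ~{gddr5_watts:.0f} W, HBM: ~{hbm_watts:.0f} W")
```

Even with generous error bars, that's tens of watts saved at the same bandwidth, which is the "too much juice" point in a nutshell.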
I do find it a bit nauseating watching the reviewers get all high and mighty about pricing when they don't personally pay for any of the nice kit they play with. Anyone with multiple 1080s lying around on shelves is taking the P massively tbh (thinking J2C for example)
The main thing that Gamers Nexus misses here is that while they are looking for capacity parity with HBM2, what they should be looking for is bandwidth parity. I.e., they need enough bandwidth to allow their GPU to run well.
Just costing the capacity is not good enough when it is the bandwidth that is the limiting factor. There is a reason why Nvidia used GDDR5X over GDDR5. Then also consider the recent reports of memory overclocking on RX Vega performance, i.e., AMD likes good memory bandwidth.
Long story short is that pricing needs to be done using a configuration that matches bandwidth, not capacity.
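To sketch why bandwidth parity changes the costing, here's a rough calculation of how many plain GDDR5 chips it would take to match Vega 64's HBM2 bandwidth. The figures are public spec-sheet values (484 GB/s for Vega 64, 8 Gb/s per pin for fast GDDR5, 32-bit chip interfaces), not anything from the video, so the exact chip count is an assumption-driven estimate:

```python
# Rough sketch of the "bandwidth parity" argument: how many plain GDDR5
# chips would it take to match Vega 64's HBM2 bandwidth? Input figures
# are public spec-sheet values, assumed here for illustration.

import math

TARGET_GBPS = 484     # Vega 64 HBM2 bandwidth (GB/s)
GDDR5_PIN_GBPS = 8    # fast GDDR5: 8 Gb/s per pin
CHIP_BUS_BITS = 32    # one GDDR5 chip = 32-bit interface

# Per-chip bandwidth in GB/s: 8 Gb/s/pin * 32 pins / 8 bits per byte
chip_gbps = GDDR5_PIN_GBPS * CHIP_BUS_BITS / 8

chips_needed = math.ceil(TARGET_GBPS / chip_gbps)  # chips to hit target
bus_width = chips_needed * CHIP_BUS_BITS           # total bus width

print(f"{chips_needed} GDDR5 chips on a {bus_width}-bit bus")
```

On those assumed numbers you land at a 512-bit bus, which is a much bigger (and pricier) memory subsystem than costing GDDR5 purely by capacity would suggest.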
Tom, you've hit the nail on the head with this one, I couldn't agree more.
This year was an upgrade year for me, needing to replace my i5-2500K I pre-ordered a Ryzen 7 1800X which meant I suffered for a few months but it was nice seeing AMD stick it to Intel and lead to a complete shift in the market.
I decided to stick with my Crossfire Sapphire R9 290 Tri-X OC until AMD went ahead with Vega, and possibly upgrade to a single card since multi-card setups aren't getting love any longer. Unfortunately I had to take another road, and since Vega didn't deliver on any front when compared to the competition, I went ahead and ordered myself a GTX 1080 Ti. I wasn't even interested in waiting for aftermarket cards. At the prices AMD are charging for reference cards, the aftermarket pricing will be atrocious at best, giving Nvidia a clear edge.
Sorry AMD but I don't think you've managed this one correctly. As a neutral buyer on all fronts, I'm passing on this one.
This perfectly encapsulates the general consensus of the 'average' tech-fiend and gamer right now who are neutral in the fight between Nvidia and AMD. Good comment.
After watching Gamers Nexus' video on the cost of Vega, it's clear that the decisions made years ago were detrimental to AMD, both in the past, right now, and for the foreseeable future. They need a huge change, or else Fiji and Vega are the kind of GPUs we'll continue to get from AMD: hot, power hungry, expensive to design and produce, slow to the market, only scraping by in raw FPS. I have high hopes for Navi only because of Ryzen's Infinity Fabric. And even then I still think Volta will be more powerful and more efficient, because Intel are only behind by a few months (according to Coffee Lake leaks) and likely had the hardware to beat Ryzen but chose to milk consumers for another generation or two.
Indeed, Intel got too greedy. I was in the market to upgrade for a year, around when Broadwell-E was released. I need the threads for content creation, but the prices Intel wanted were just too steep for me. For the time being I will be sticking to AM4; it's working quite well for me.
In my eyes, Intel got lazy and figured they had this under control. I think there were even some official statements by Intel saying they weren't worried about the situation. Once Ryzen was released it seems like all hell broke loose at Intel, and they're still in a continuous scramble, with an increased release rate, to recover. That's despite the release issues with Ryzen; it took me a good couple of months to bring the system to stability.
Nvidia on the other hand kept delivering with progress year after year instead of waiting for the competition to deliver something.
Time will tell what Navi will bring, but for the time being the only fair argument I hear in favour of AMD is "Freesync is cheaper", which it is, but other than that, why bother?
The Freesync vs Gsync value debate is no longer as valid in my opinion. Gsync monitors may be €200 more expensive than their Freesync counterparts, but Nvidia's GPUs (in the high-end sector) are significantly better. Not only that, but they come out quicker and with fewer teething problems; they come out exactly when the consumers want them. That makes Gsync the safer bet. If you have the money, buy Gsync/Nvidia. Don't bank on Radeon/Freesync. That's my personal opinion. I wish I had bought a 980/Gsync setup and then upgraded to a GTX 1080 instead of a Fury/Freesync setup. At the time the Fury/Freesync setup was slightly better value for money and offered more performance, but the 980 Ti offered more performance for the money than the Fury X, and the 980 Ti has been replaced for months now while the Fury X has been a good card, but not a good purchase. That's a lot of potential time enjoying a graphics card gone to waste. Sadly I couldn't afford the 980 Ti/Gsync setup, but if I could have, I would have.
From the majority of people I've spoken to, Gsync and Freesync are pretty much the same for the average user. Gsync is obviously better in every way besides cost, connectivity (Freesync works off of HDMI), and the fact that Intel supports it (not much of a selling point for us), but for the most part they're a very similar experience. If you have a high-end graphics card (1070/980Ti/Fury X/Vega 56 or above) and a 1440p/144Hz adaptive refresh rate monitor, you'd be hard pressed to tell the two apart.