AMD RX Vega Lineup
  • AMD is expected to unveil its upcoming Vega-based graphics cards at this year’s Computex on 31 May.

    The first card will be called the RX Vega Core, starting at $399. It is said to deliver performance on par with or better than Nvidia’s GeForce GTX 1070.

    The RX Vega Eclipse will be priced at $499 and will compete head-to-head with the GTX 1080.

    The RX Vega Nova will be the big Vega, retailing at $599 and rivaling the GTX 1080 Ti.

    http://digiworthy.com/2017/05/14/amd-rx-vega-lineup-fastest-vega-nova/

  • Some more details

    image

    image

  • It's quite a jump to go from the $200 RX580 to the $400 'Core'. I think they should target the $325-350 price point if they want to shift more mainstream users to the enthusiast segment. If they can deliver GTX 1080 or better performance for $399, that would also be interesting.

  • @Tron

    Well, the RX 580 is $260-280 for 8GB versions. And AMD has supply issues due to a lot of idiots producing heat and mining various cryptocurrencies.

  • According to their investor slide, they generally see Polaris serving the $200 MSRP market, and Vega seems primed to serve $200+ (it should probably be $300+).

    I'm gonna take a guess we'll see 'Core' at $299-$329, Eclipse at $399, Nova at $499-$529 and Duo (Binary?) at $999.

    Sorry, I'm not sure how to shrink this slide but here it is -

    Margin Potential

  • More on Vega

    image

  • Apple today revealed that Vega will deliver 2x performance in half-precision (FP16) mode compared to 32-bit single-precision mode.

    This could affect how video editors are designed, as until now it has been pointless to use anything below 32 bits for GPU-accelerated effects.

  • According to the Mac Pro option list, there will be a choice of either Radeon Pro Vega 56 @ 11 TF (8 cut down or disabled CUs) or the full Radeon Pro Vega 64 @ 13 TF.

    If the 56 CU GPU turns out to be the 'budget' enthusiast card that comes to the retail market, it could have a lot of bang for the buck.
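The quoted TFLOPS figures line up with simple peak-throughput arithmetic. A quick sanity check (the boost clocks used here are assumptions for illustration, not from the option list):

```python
def peak_tflops(cus, clock_ghz, sp_per_cu=64, flops_per_sp=2):
    """Peak FP32 TFLOPS: CUs x stream processors per CU x 2 FLOPs
    per cycle (fused multiply-add) x clock in GHz."""
    return cus * sp_per_cu * flops_per_sp * clock_ghz / 1000.0

# Assumed boost clocks: ~1.53 GHz for the 56-CU part, ~1.6 GHz for 64 CUs.
print(round(peak_tflops(56, 1.53), 1))  # 11.0
print(round(peak_tflops(64, 1.6), 1))   # 13.1
```

Which matches the 11 TF / 13 TF split between the two Radeon Pro Vega options.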

  • image

    For many years GPUs have been optimized to process 32-bit floating point and integer data types, since these were best suited for standard 3D graphics rendering tasks. However as GPU workloads and rendering techniques have become more diverse, this one-size-fits-all approach is no longer always the best one. The processing units at the core of the “Vega” GPU architecture have been updated to address this new reality.

    Next-Gen Compute Units (NCUs) provide super-charged pathways for doubling processing throughput when using 16-bit data types. In cases where a full 32 bits of precision is not necessary to obtain the desired result, they can pack twice as much data into each register and use it to execute two parallel operations. This is ideal for a wide range of computationally intensive applications including image/video processing, ray tracing, artificial intelligence, and game rendering.

    New intelligent power management technologies adjust to your workload, providing thermal headroom to optimize peak performance and system acoustics. Advanced power agility features enable optimized performance during bursty workloads that are common in professional applications.

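The packed-FP16 idea in the quote above can be illustrated on the CPU side. The 2x throughput itself needs the GPU hardware; this sketch (using NumPy as a stand-in) only shows the storage and precision trade-off involved:

```python
import numpy as np

# Two FP16 values fit in the same 32 bits as one FP32 value, which is
# what lets an NCU pack a pair into one register and run two ops at once.
a32 = np.array([1.5, 2.25], dtype=np.float32)
a16 = a32.astype(np.float16)
print(a16.nbytes, a32.nbytes)  # 4 8 -> half the storage per element

# The catch: FP16 keeps only ~11 bits of mantissa, so precision runs out
# fast -- e.g. integers above 2048 are no longer all representable.
print(np.float16(2048) + np.float16(1))  # 2048.0
```

That precision cliff is exactly why 32-bit pipelines have been the safe default for video effects until now.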
  • this looks wise

  • image

  • I'd pay the $1k, I don't care about the noise, even $2k.

  • Sources say that the heat produced can exceed the famous 290X.

    Gaming cards will consume 300-370W at stock, and a small overclock pushes that above 400W.

  • I hope their reference gaming cards don't have the crappy blower-style fan... will have to wait for the custom models. I'll stay nice and warm in the winter, either way.

  • @Tron

    Who knows.

    One thing that I do know is that guys will need to add another 20 nodes of extreme blur and such to DaVinci benchmarks :-) I mean, to simulate "real" projects.

  • @Vitaliy_Kiselev Ha... that does seem to be the new standard in Resolve benchmarks - Give us more blur FPS or that card is total shit. More blur and more grain!

  • No surprises

    image

  • I watched the PC Perspective Vega benchmark livestream and it was interesting to see how closely FE scales with last gen Fury X... practically clock for clock. There doesn't seem to be any noticeable architecture gains associated with the gaming benchmarks - unless AMD are holding back some amazing set of gaming drivers just for the RX Vega reveal.

    The clocks hang at around 1440 MHz due to throttling, on an open test bench in an air-conditioned room. Power draw is just under 300 watts. By comparison, the GTX 1080 posts near-identical FPS numbers while using only 180W.

    Vega also doesn't appear to be particularly good at mining... barely faster than Polaris 10/12 but with exactly twice the power requirement.

    A "4K card for $400" sales pitch seems like the best way to salvage this thing from a marketing standpoint. Otherwise folks without freesync displays may opt to pay the $499 for a GTX 1080 knowing they'll make up the difference in power savings in the long run.

    AMD is lucky in that many gamers are now in a bind, because essentially no capable new or used cards are available for less than $400. They will be forced to buy an inefficient GPU if it is all that is available on the market. You couldn't engineer a better set of circumstances to help bail them out of this blunder.
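The "make up the difference in power savings" point is easy to put to numbers. A rough sketch (the $100 price gap, the 120W draw gap from the benchmarks above, and the electricity price are assumptions):

```python
def breakeven_hours(price_gap_usd, watt_gap, usd_per_kwh=0.12):
    """Hours of use before the hungrier card's extra electricity cost
    eats the purchase-price difference."""
    cost_per_hour = watt_gap / 1000.0 * usd_per_kwh  # kW * $/kWh
    return price_gap_usd / cost_per_hour

# $100 cheaper Vega vs GTX 1080, drawing ~120W more (300W vs 180W),
# at an assumed $0.12/kWh:
print(round(breakeven_hours(100, 120)))  # 6944
```

Roughly 7,000 hours of load, so the "long run" really is long; the argument matters more for miners running 24/7 than for gamers.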

  • @Tron

    If you ask me, looking at GPU sales charts it is quite clear that both companies have been very interested in the whole mining craze.

    And I am sure they provided clear instructions to sponsored sites and channels (around 80% of all the big ones) to cover the crypto topic in a specific way.

  • Yeah, it's definitely gaining notice from managers and investors alike. I'm a little worried we could end up with a sharp divergence between gaming and mining cards in the future that results in the best silicon going toward mining.

    Here's the PC PER review for the Frontier Edition w/benchmarks. https://www.pcper.com/reviews/Graphics-Cards/Radeon-Vega-Frontier-Edition-16GB-Air-Cooled-Review

  • image

    image

    Seems exactly like a scaled-up 580; performance per ACE is the same.

  • Power consumption

    image

    Games

    image

  • More news before announcement

    image

    AMD will be forced to drop clock rates, and the main cards will not be the top XTX.

  • I assume the custom cards will be 2.5+ slots thick and have longer coolers in order to push higher clocks, or at least to maintain the boost clocks on the air-cooled models. Kind of hard to cheat physics with those TDPs. Maybe it's worth just going for the AIO model.