8
lxmcf
6y

Holy fuck, AMD really has made a comeback... NVIDIA is admitting they lost (that's how I interpret it, no flame), and then we have Intel literally running their already fragmented CPU range into a ditch of mismatched specs, just brute-forcing higher core counts on a nearly outdated manufacturing process...

And now AMD is also releasing new mobile chipsets with integrated Vega graphics (on the Zen-based chips) that are already outperforming the closest Intel competitor in both thermals and raw performance...

What the fuck!!!

The way they're going, AMD might actually become Intel and NVIDIA rolled into one, in my eyes...

Comments
  • 0
    The main reason I was considering AMD instead of Nvidia was FreeSync; now it's a tougher choice...
  • 3
    Nvidia is still light years ahead for neural net GPU stuff.
  • 1
    I was looking to build a PC using the Athlon 200GE.
  • 1
    I don't get it. IIRC FreeSync was AMD's attempt, three years or so ago, to compete with Nvidia's G-Sync, right?

    And G-Sync-enabled displays came out in 2014 or so?

    Ahh! Just looked it up on Wikipedia. Now I get it!
    FreeSync is royalty-free, and there are already a lot more displays supporting it than G-Sync ever had!

    Now the excitement makes sense! 😊
  • 0
    @irene only if you consider machine learning a niche.
  • 1
    @irene The hype is touting AI and shit, but what actually works is pattern recognition. That's where NNs have been used for a long time.

    And there are a lot of actual applications in combination with big data; just think of predictive maintenance. It's quite expensive when an aircraft gets grounded because something doesn't work, or when production stops unexpectedly. But a human can't sift through data from hundreds or thousands of sensors to check whether damage is coming.

    So the second ingredient for success is also there: valid business objectives. (Rough sketch of that kind of sensor-data pattern recognition below.)
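    An illustration of that kind of multi-sensor pattern recognition: a tiny autoencoder that learns what "healthy" readings look like and flags readings it can't reconstruct. Everything here (sensor count, data, threshold) is a made-up assumption for the sketch, not anyone's production setup.

    ```python
    # Sketch: flag anomalies across many sensor channels with a small
    # autoencoder. All numbers below are illustrative assumptions.
    import numpy as np
    import tensorflow as tf

    n_sensors = 200  # assume a few hundred sensor channels
    rng = np.random.default_rng(0)
    healthy = rng.normal(0.0, 1.0, size=(10_000, n_sensors))  # "normal" readings

    # Compress to a small latent code and reconstruct; healthy data should
    # reconstruct well, readings from a damaged machine should not.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(32, activation="relu", input_shape=(n_sensors,)),
        tf.keras.layers.Dense(8, activation="relu"),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(n_sensors),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(healthy, healthy, epochs=5, batch_size=256, verbose=0)

    # Alert when reconstruction error exceeds what we saw on healthy data.
    errors = np.mean((model.predict(healthy, verbose=0) - healthy) ** 2, axis=1)
    threshold = np.percentile(errors, 99)

    def looks_damaged(reading: np.ndarray) -> bool:
        recon = model.predict(reading[None, :], verbose=0)[0]
        return float(np.mean((recon - reading) ** 2)) > threshold
    ```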
  • 1
    @Fast-Nop pattern recognition is *the* thing, yes.
    I almost worked for a company producing reference systems for automated driving and camera-fed driver-assistance systems.
    They currently get several TB of data from each test drive, and with new 3D laser sensors they expect over 2 PB instead.
    You need a lot of hardware to make use of that much data in any feasible time... (back-of-the-envelope numbers below)
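    To put "a lot of hardware" in perspective, a quick back-of-the-envelope for a single 2 PB test drive; the bandwidth figures are rough assumptions, not from the comment above.

    ```python
    # How long does it take just to read 2 PB once? Bandwidths are rough,
    # assumed figures for illustration.
    PB = 10 ** 15  # bytes (SI petabyte)
    data_bytes = 2 * PB

    for label, bytes_per_s in [
        ("single SATA SSD, ~0.5 GB/s", 0.5e9),
        ("single NVMe SSD, ~3 GB/s", 3.0e9),
        ("100 Gbit/s link, ~12.5 GB/s", 12.5e9),
    ]:
        hours = data_bytes / bytes_per_s / 3600
        print(f"{label}: about {hours:,.0f} hours for one pass")
    ```

    Even a saturated 100 Gbit/s link needs roughly two days for a single pass over the data, which is why the work has to be spread across many machines.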
  • 1
    @Fast-Nop FPGAs at a similar cost are 10x faster than an Nvidia Tesla GPU. GPU neural nets are niche. More and more dedicated inference accelerators like the Intel Neural Compute Stick are coming to market, supporting Caffe, TensorFlow, and others. (Rough usage sketch below.)
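    For context, a hedged sketch of what running a net on a Neural Compute Stick looks like through Intel's OpenVINO Python API (the older, pre-2022 style); the model file names are placeholders, and the network has to be converted from TensorFlow/Caffe beforehand with OpenVINO's Model Optimizer. Treat the exact calls as an assumption from memory, not gospel.

    ```python
    # Hedged sketch: inference on an Intel Neural Compute Stick via the
    # older OpenVINO Python API. "model.xml"/"model.bin" are placeholder
    # names for a network already converted with the Model Optimizer.
    import numpy as np
    from openvino.inference_engine import IECore

    ie = IECore()
    net = ie.read_network(model="model.xml", weights="model.bin")
    exec_net = ie.load_network(network=net, device_name="MYRIAD")  # the NCS

    input_name = next(iter(net.input_info))
    shape = net.input_info[input_name].input_data.shape  # e.g. [1, 3, 224, 224]
    dummy = np.zeros(shape, dtype=np.float32)            # stand-in input

    result = exec_net.infer(inputs={input_name: dummy})  # dict of output arrays
    ```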
  • 0
    The more GPUs you buy, the more money you save! You never need a G-Sync monitor for high refresh rates; any 144 Hz display will work on an Nvidia card.
  • 0
    @Noobish I think you're just unlucky. Currently Ryzen and Vega are literally plug and play; they work together fantastically and won't cost you your soul and half your children's kidneys.
  • 0
    @irene Crypto mining is no longer profitable, so Nvidia has to market its cards for machine learning to attract interest.
  • 1
    I see that as an Nvidia power play: they remove a selling point for AMD while also showing that FreeSync covers a wide range of quality levels, that only the top 5% of FreeSync monitors compete with G-Sync visually, and that G-Sync is generally superior to the majority of FreeSync monitors anyway. I don't really see how this is good news for AMD. FreeSync is a royalty-free open standard, so AMD won't make any money off the additional sales of FreeSync displays, while more people will pair those displays with Nvidia GPUs.