3
wicho
4y

3000 series or 6000 series?

Comments
  • 6
    Won't know until after post-launch benchmarks.
  • 0
    @SortOfTested What if they have the same perf/price?!
  • 1
    @wicho
    They won't. It'll be a question of how far off and at what price.
  • 0
    @SortOfTested Just tell me! The red one or the green one?!?!
  • 3
    @wicho
    Nope, not interested in the fanboy battles. Have fun 😋
  • 0
    @SortOfTested I'll let you know my decision 😉
  • 7
    Neither.
    I don’t care about keeping up with wallet-bleeding tech.
  • 1
    Good question. I've had some issues with NVIDIA drivers that are causing me to seriously consider AMD for my next upgrade even though I've only had NVIDIA cards for the past decade.
  • 2
    @EmberQuill they both have driver issues 🤷‍♂️ a new product always has problems.
  • 2
    I ordered a 1660 Super yesterday. I won't need anything new for the next year or two at least. By then it will be clear which one is more interesting, and there will be fewer problems with the newest drivers.

    I hope AMD will cause prices to fall. And I hope miners will burn in hell where they belong.
  • 2
    @wicho for me that decision totally depends on what they can offer as a plus. RTX already has RTX Voice, and RTX Broadcast is on the horizon; AMD, on the other hand, has... nothing that I consider a plus for my use case. I work from home (that's my use case), so that's a huge selling point for me to stay on the green side. If AMD throws something like that out in the future, I would gladly switch, because their prices are so sweet
  • 2
    Intel HD 3000
  • 6
    I'm an AMD fangirl so I choose 6000 series!
  • 1
    I'm definitely getting one of the new cards because Cyberpunk 2077 is going to melt my GTX 1070. Still undecided on which one, but it will almost certainly be one of the new cards unless they're lying about the performance advantage over current-gen cards.
  • 1
    @EmberQuill most probably Cyberpunk won't melt your card at all, and it will play decently even on an older generation. Just don't be dumb and set everything to useless ultra.
  • 1
    @iiii but... but... it's so pretty on ultra... 😢

    Actually, the real problem is my 4k monitor. The 1070 is just a little too weak to run modern games at 4k even with graphics turned down a bit. Even just upgrading to a 16-series card might be enough to close that gap, but I might as well wait a couple months, get a brand-new card, and keep it for 4-5 years again.
  • 1
    @EmberQuill high settings rarely differ visibly from ultra, but cost much less. A reasonable medium (slightly lowered particles and antialiasing) looks about as good as high, too.

    Heck, Horizon runs well on an rx580. On a damn rx580.
  • 1
    @iiii when I upgraded from a 1440p monitor to 4k, I discovered that my 1070 is just not good for 4k gaming. Games that ran smoothly at ultra quality on my old monitor barely run on medium settings at 4k.
  • 1
    @EmberQuill oh, 4k... nothing runs well enough at 4k. That's just reality.
  • 1
    I exclusively use a MacBook and don’t even play games, so good luck
  • 1
    It's hard to say without seeing any benchmarks and checking whether the drivers have been updated.
  • 3
    1. Consider what games you play
    2. When available, check benchmarks
    3. Pick a card which can render your favorite games, at your monitor's resolution, at whatever "acceptable framerate" means to you.

    Also, consider that "price of card" is basically added to your "cost of access to game".

    If you buy 2 games during the lifespan of a card which really need that card, and the card costs $800, the cost of being able to play each of those games is $800/2 + $60 = $460. Still worth it?

    Not saying it isn't: if you plan to play Assassin's Creed Valhalla for a thousand hours, the price per hour will be low even with an expensive card — but just consider the proposition.
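
    Here's that math as a quick TypeScript sketch, in case you want to plug in your own numbers (the function names and the inputs are just made-up examples):

    // Back-of-the-envelope "cost of access" calculator.
    function costPerGame(cardPrice: number, demandingGames: number, gamePrice: number): number {
      // Split the card's price across the games that actually need it,
      // then add the game's own price on top.
      return cardPrice / demandingGames + gamePrice;
    }

    function costPerHour(cardPrice: number, gamePrice: number, hoursPlayed: number): number {
      // Total spend divided by hours actually played.
      return (cardPrice + gamePrice) / hoursPlayed;
    }

    console.log(costPerGame(800, 2, 60)); // 460 -- the $800 card example above
    console.log(costPerHour(800, 60, 1000)); // 0.86 -- a thousand hours of Valhalla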

    I think in general, people who panic that their 2080 is "trash" because the 3080 will be faster are just victims of a very warped thought process.

    I will be upgrading from a GTX 960 to whatever team produces the best card in the $200-$300 range.

    That might very well be AMD, considering they are mass-producing a "budget ray tracing SoC" for both major console brands this year as well.

    Also, forget about the channels and blogs that declare the "winning team" based on which company manages to create the least affordable nuclear-powered, triple-slot, fuse-tripping PCB.

    Check Steam hardware stats: the top 3 consists of 1050 and 1060 cards.
    https://store.steampowered.com/hwsu...

    The winning company is the one delivering the best value. The last few rounds that was Team Green, but not because of the number of TFLOPS delivered by a 2080 Ti; it was because they consistently delivered reasonable performance-per-dollar on their budget cards.

    I think November/December will be a good time to upgrade if you want to get the full picture.
  • 2
    @bittersweet I like your approach. JayZ also made a video about panic-selling 2080s with the same idea as yours.
  • 2
    @iiii Yeah he tends to be sensible.

    Apart from upgrade fever, too many people also have preorder fever.

    Cards will sell out fast on release. Don't elbow your way to the front of the line. If you get a bad/dead card, what do you think customer service is going to be like?

    Better to wait 2 months after release. You get more settled prices, better reviews, and a better choice of AIB variants.
  • 3
    @bittersweet That's a very good thought process.
    But I'm always financially broke, so for me an acceptable frame rate is always 30 fps 😂
  • 2
    @bittersweet I'll probably wait for another year at least. The last AMD release was hopeful, but the driver issues and local price gouging were too bad (where I live a 5700xt costs the same as an rtx 2070 super, which is way too much). I would have used my rx580 for a year or two more, but my card is a piece of crap, so I'm replacing it with a sensible gtx 1660 super for now, as it seems like the best bang for the buck available right now.
  • 3
    @scorpionk get a 1660 and you'll get a nice 1080p 60fps experience for a pretty low cost. Or even an rx580 (not an MSI ARMOR one; it's damn trash with no cooling capacity). Or even a used 1080ti is a valid option
  • 2
    @scorpionk Well, even the 3090 is only 150x a $10 truckstop blowjob. You can provide at least 15 blowjobs per day, so that's 5 weekends.
  • 1
    @bittersweet or even faster if it's more than a blowjob 🤔

    OnlyFans anyone???
  • 2
    @iiii Currently I have a gtx 1050 ti zotac oc edition.
    Will wait to upgrade till the fps drop is noticeable :P
  • 2
    @bittersweet I'm good with my gtx 1050ti
  • 2
    @scorpionk oh, in that case the upgrades I proposed aren't that substantial. You're right

    well, except the used 1080ti. That one may be great
  • 3
    6000 > 3000

    AMD wins by default, jk

    One thing is for sure though: the 6000 series is gonna be cheaper than the 3000 series
  • 2
    @cabbybaby I think rumored console performance/pricing bodes pretty well for AMD for the whole "medium performance" segment.

    I suspect they'll be like: "Who cares about competing against 3080/3090? We'll have a raytracing enabled chip which beats the 3060 by 30%, for $50 less"
  • 1
    I'm eager to get a 3080 at release because prices look tempting (and my 1080 won't be totally devalued), but at the same time I want to wait for the 20 GB 3080 that is said to be coming after Radeon 6000. The 3090 seems too far off for me.
  • 2
    I have an rx460 and I don't plan on upgrading anytime soon lol
  • 2
    Assuming the 6000-series from AMD won't be turds I'm saying the AMD one...

    Mainly because Nvidia has too much proprietary bullshit going on...

    Been saying this for years, and I'll keep repeating it:

    I'm afraid that in the near future, it'll start becoming less about making the better GPU and more about creating vendor lock-ins using proprietary "features" (like RTX and DLSS).
  • 2
    @FinlayDaG33k Fair point, although "saying it for years" and "afraid of the near future" are kind of a paradox: it never actually happens.

    This generation, AMD cards will support some form of ray tracing at an "acceptable" resolution & framerate -- otherwise they couldn't have showcased something like Ratchet & Clank on PS5, for example. On Xbox Series X, it seems like games are using DXR + DirectML to achieve a "DLSS-like" feature, and I assume GNMX & PSSL for PS5 have added machine learning features which can fully utilize whatever AI-accelerator cores are stuffed into RDNA2 as well.

    And it has always been like this. I remember being "scared for the future" back in 2002, with DirectX 7.0 features being so far ahead of OpenGL 1.3, and GeForce 3 & 4 getting a monopoly. Then the GeForce 6 got slammed by the ATI X800. In 2010, people overhyped Nvidia's CSAA, until ATI released the same thing as EQAA.

    Every single invention is eventually adopted by the other team -- it might just take a generation.
  • 2
    I mean.. it looks good on paper but fuck Nvidia tho
  • 1
    @bittersweet Well... it's not a paradox since it's already starting to happen...

    I know plenty of people who won't buy AMD simply because they "require" (read: just don't want to move away from what they're used to) G-Sync for their monitor and CUDA for their software (often stuff like TensorFlow or Adobe After Effects).

    Yet it's pretty easy to move from AMD to Nvidia, because AMD chooses to open up their goodies (which Nvidia can then adopt to make their own cards more attractive); an example of this is FreeSync...

    You can run both G-Sync and FreeSync monitors on Nvidia, but only FreeSync monitors work on AMD...

    Also, every single invention will get adopted by the other team at some point, which only proves my point: who will have the better support for it?

    Either side can build a great new feature, but it'd be pointless without proper support.
  • 1
    @FinlayDaG33k But that has been true for almost two decades: Nvidia focuses on closed source, protected IP delivering specialty shiny candy features, and ATI/AMD on raw performance-per-dollar until they can release similar features under their own brand name.

    Granted, AMD somewhat failed even on that premise a few times, but they always claw their way back with some great value card as well.

    The fact that AMD will power 100-150 million consoles, plus their pile of Ryzen pocket money, means they have plenty of room to faceslap Nvidia a few times.

    Yes, of course from the position of underdog, but that's OK.
  • 2
    @bittersweet And that's exactly where I'm pointing at, those closed features.

    Like I've said, if this keeps up, AMD could potentially bring out a really nice new card that will be made nearly irrelevant by the candy on the Nvidia cards.

    Intel has tried the same with stuff like RealSense, but (mostly) failed at it because RealSense was too much of a niche...

    RTX-based things like RTX Voice, ray tracing and DLSS are a lot less of a niche, sadly.

    I hope that the influx of console royalties and Ryzen pocket money will produce a card that can bitchslap Nvidia nice and hard, like they did with Intel, before it is too late D:
  • 1
    @bittersweet underdog or not, but RED GOES FASTA!!!
  • 2
    @FinlayDaG33k

    But ray tracing and most likely some "DLSS-like" feature are pretty much confirmed for Radeon 6000 — by virtue of their presence in the announced consoles.

    There's also a whole bunch of features which are "neat" but not must-haves for most (casual) gamers. Stuff like Shadowplay and RTX Voice are amazing for streamers — but the vast majority of gamers aren't streamers.

    Then there are things like CUDA, which is interesting for professional users, but again, not required for most "mid tier" gamers.

    I think if AMD positions an attractive $200-$300 card to compete with the upcoming 3050/3060, they could reconquer quite a bit of market share.
  • 2
    @bittersweet They are confirmed, yeah, but will they get much support *outside* the consoles? That is the question atm.

    ShadowPlay is used by quite a lot of non-streamers as well; heck, it's ideal for non-streamers, since streamers would most likely have OBS (or similar) running in the background anyway.

    RTX voice is mainly appealing for streamers, I agree.

    CUDA indeed is mainly for professionals (or amateurs getting into stuff like TF or AAE), but "it's not for gamers, so it doesn't matter much since these are 'gaming' cards" shouldn't be a major argument either.

    I think that $200-$300 cards going up against the RTX 3050 and RTX 3060 later would need to be really sweet, or nobody is gonna bother (assuming, of course, AMD doesn't bork their drivers again).

    Also, it would mean that AMD still hasn't solved another issue they have: they can't compete in the high-end market, yet they try to market their cards as "high-end" :\
  • 2
    @FinlayDaG33k

    I do think AMD missed a chance with the RTX 20xx generation — Nvidia had marginal performance gains, and AMD should have slam-dunked that shit with a relatively cheap, barebones, raw-power, non-RTX 2080-killing card, leaning hard into the widespread ray tracing scepticism at the time.

    Now they are kind of up against a difficult challenge where the 6000 series has to prove itself as a more appetizing deal on both fronts: raster and ray tracing performance.
  • 1
    @bittersweet Agreed, back when they announced the 5700XT after all the hype surrounding Navi, I was stoked, thinking this would be the Ryzen of the GPU world...

    You can probably understand how much disappointment I felt...

    And yes, they now really need to make it work with the 6000 series...

    The only way they can really punch back at Nvidia:

    - have something that competes on performance

    - make it not suck 300W+ of power

    - make it for a decent price

    I am a little excited for the AMD 6000 series, but I don't expect much of it, sadly...
  • 0
    I think I'll be able to buy a 1050ti next year; I heard its price/performance is good
  • 0
    @melezorus34 not really great price to performance. Though it depends on what you want to achieve in the end.
  • 0
    @iiii vscode on 1080p ultra
  • 0
    @melezorus34 wait, what? 😄 You don't need any fancy card for vscode
  • 1
    @melezorus34 @iiii No joke, at one point VSCode had some performance issues (I think stemming from Electron) and it couldn't run at 60 fps. It was noticeable to me when scrolling or sometimes even when just typing.
  • 0
    @kamen dafuq? Intel built-in graphics work well with vscode...
  • 1
    @iiii I don't know, it wasn't like insufficient performance (a GTX 1080 should be powerful enough to run a text editor, eh?); it was more like something in the acceleration pipeline causing choppiness and FPS dips (you can debug this using VSCode's built-in DevTools). I can't remember anymore; it might've been VSCode itself, it might've been Windows (I've been on the Insider Fast ring for years)...
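
    For what it's worth, here's a minimal sketch of how you could eyeball those dips from that DevTools console (requestAnimationFrame and performance.now are standard browser APIs; the 120-frame window is an arbitrary choice):

    // Logs the average FPS over ~120 frames; paste into the console from
    // Help > Toggle Developer Tools, then scroll or type in the editor.
    let frames = 0;
    const start = performance.now();

    function tick(now: number): void {
      frames++;
      if (frames >= 120) {
        const fps = frames / ((now - start) / 1000);
        console.log(`average FPS: ${fps.toFixed(1)}`);
      } else {
        requestAnimationFrame(tick);
      }
    }
    requestAnimationFrame(tick);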
  • 0
    @kamen yes, 1080 is way more than enough.