one of them is not like the others

  1. So what you're telling me is, Nvidia big-brained us with the 4080 12GB name into thinking it's a rebranded xx70-series card, to draw attention away from the fact that it's supposed to be a xx60-series card. That's a scummy move right there.

  2. It gets worse: the other 4080 (16GB) is basically a xx70-class card itself. There was a graph showing the CUDA core percentage breakdown between the three cards that were announced, and the performance-to-CUDA-count ratio is bananas.

  3. The 4080 12GB is real. The reason is most likely that they would get a lot of shit for selling a mid-range card (basically the xx60 of the new generation) at the price of an xx80 card. On top of that, the new MSRPs for all the cards are basically last year's scalper prices.

  4. Nvidia is straight up scamming their customers at this point. The naming conventions have been subject to inflation for some time (Ti, Super, the x50 jumping to x60 at one point, IIRC), but this is Nvidia testing the waters on taking it to the next level and straight up selling an x60 or x70 card under the x80 name. Watch them make some 'Titan Ti Super Duper' bullshit to mark up the top card when they run out of numbers.

  5. They classed all their cards as 80-series this time and sell them by their VRAM, so a 6GB would basically be a 4060, the 12GB a 4070, and the 16GB a 4080.

  6. There are two 4080 cards, the 4080 16GB and the 4080 12GB. The latter is a 70-class card rebranded as a "4080 12GB" because people are more likely to buy it if it has 4080 in the name.

  7. Wonder if AMD will take the opportunity to release reasonable cards that would easily compete, or if they'll edge up against this nonsense too.

  8. Nvidia are launching the 40xx series with two different 4080 models which they want us to believe don't vary much besides RAM size (12GB vs 16GB), but the 12GB one actually features a variety of reduced specifications and compares to the 16GB one much like xx60 cards have compared to xx80 cards in previous generations.

  9. ELI5: "Bits" = how many lanes on the road. The more lanes, the more traffic can move on it at the same time. Nvidia are selling a 192-lane road in the middle of LA as if it's a 256- to 384-lane interstate in Nevada.

  10. They rebranded an actual 4060-class card as the 4080 12GB and have the audacity to charge more money than last year's 3080.

  11. Memory bus width (measured in bits) dictates how much information you can move into and out of VRAM at once; at higher resolutions a wider bus is preferable.

  12. Imagine Ford is still selling the Focus. It's been a solid low-end sedan for a decade, then one year Ford slaps the Mustang badge on it and raises the price to Mustang prices, but at best it's still a fucking Focus SES.

  13. I'm really tempted to jump to team Red next time... I feel like I've been fucked in the ass one too many times without consent...

  14. They pretty much already have it right if you don't mind reduced RT performance. Just got myself a 6700XT on sale for about 430€ with all taxes included and I'm super happy with the performance at 1440p

  15. As I predicted, the 3000 series was the last somewhat consumer-friendly generation. And the 4000 series is just an overclocked, overpriced hot mess.

  16. And I thought the £700 I paid for my 3080 was a lot of money; it's insane that the 4080 (the actual one) is literally double that. I really hope they don't sell well. Unlikely, I know, but...

  17. MSRP-wise, yes, but it's kinda ironic that in reality the 30-series cards ended up being the most expensive GPU lineup ever.

  18. To make things worse, this chart only shows half the story... they also switched to the 104-class die that the xx70 cards usually use.

  19. That's not really what's bothering me. With an upgraded cache like AMD's, it won't have much of a performance impact. What really bothers me is just how vast the difference in core design between the two versions is. On average there's a 25% difference between the specs of the two cards, which is the same gap as between an 80 and an 80 Ti, or between a 70 and an 80 card.

  20. Each GPU has its own memory, or VRAM, connected over a memory bus. The width of this bus and the speed of the memory determine the memory bandwidth of the GPU. GPUs constantly update the frame buffer, which is the image you see on the screen, and all the textures and game assets also sit in this memory.
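
(For a rough sense of scale, a minimal sketch with illustrative numbers: a single uncompressed frame buffer is small next to 12-16 GB of VRAM, but it and the far larger texture data are read and written constantly, which is why bandwidth matters as much as capacity.)

```python
# Rough, illustrative arithmetic: size of one uncompressed frame buffer.
# Assumes 4 bytes per pixel (8-bit RGBA); real engines keep several buffers
# (depth, HDR render targets, etc.), but the order of magnitude holds.

def framebuffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one frame buffer in MiB."""
    return width * height * bytes_per_pixel / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: {framebuffer_mib(w, h):.1f} MiB per frame")
# 4K comes out around 32 MiB -- tiny next to the VRAM pool, but it gets
# rewritten every frame alongside far larger texture traffic.
```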

  21. Nvidia are greedy fucking swine. Don't buy any 30-series and don't buy any 40-series; they're doing this shit on purpose. Let it rot on the shelves.

  22. I'd be willing to bet DLSS 3 is exclusive to the 40 series solely as a crutch, and had it been backwards compatible, it would've made the new gen obsolete. Some shady-ass bullshit they've pulled with the 40xx.

  23. Haha, me too... I thought even my 1080 Ti has nearly double that, so surely it's wrong or they're doing some weird magic (yes, I know bus width isn't really all that, but it's still a substantial part).

  24. You know how the automotive industry is dropping small cars and flooding the market with pointless crossovers and FWD SUVs that are actually bloated hatchbacks built on those same small-car platforms, and selling them as aspirational products for much higher prices than the small cars they stopped making?

  25. So Nvidia releasing a subpar product disguised as a high end GPU... and selling it for $1.2k-ish... give or take. Nope, I'm done. No more buying Nvidia cards from here on out. Nvidia has become too cocky and greedy. Team Red from now on.

  26. The only upgrade is the process node. The 3080 12GB is on Samsung 8 nm while the 4000 series is on TSMC N4 (an enhanced 5 nm node). That's a significant uplift in performance, but the specs are still really deceptive...

  27. Too many non-tech/gaming investors. Demand is a HUGE question mark, so "we're increasing prices" is usually a "win" for the stock price, and NVDA needs a lot of help; nuances like EVGA's customer service or the Sapphire alternative are simply not widely known to most shareholders.

  28. So grab my 3090 now then? Was going to shoot for a 4080, but not at these prices. Or with this kind of asshattery going on...

  29. For the first time in my life, I am seriously thinking of going AMD for my next GPU. This is unacceptable from Nvidia.

  30. 3090 early adopter here. It's funny how some people try to justify the 4080 (Fake Edition) 12GB, like it's okay to pay almost $1K for a 70-class card. Nvidia has done wonders winning the mindshare of many people, but this is the result of it...

  31. Much of pricing comes down to what people are used to or willing to pay. The pandemic proved to Nvidia that the market would bear higher prices, and they'll pretend it's because TSMC is just so mean to them, making them pay 40% more.

  32. Nvidia's latest naming convention reminds me of BMW slapping the M badge onto all their low-tier cars with bigger wheels and 2 more hp than the base model... the M badge used to be exclusive to their top-of-the-range, finely tuned performance line.

  33. The craziest thing to me is that 80% of the improvements have nothing to do with the cards themselves but with the machine learning tech they're implementing, things like DLSS 3.

  34. Even my old GTX 760 had a 256-bit memory bus, but that was the 2 GB model. There were 1.5 GB and 3 GB VRAM models that were 192-bit, but I dunno how common those were.

  35. I think it's more about the naming than the performance. There are 2 completely different products with the same name. And in the past, they always had different names.

  36. AMD uses a narrower bus because Infinity Cache makes up for the rest. They did consider a variant with a wider bus for RDNA2 but abandoned it because the wider bus didn't give the expected extra performance when the Infinity Cache was in use.

  37. I completely expect the first run of these to sell; there are always going to be people with more money than sense. But nobody should buy these outside of that. If we do, it normalizes these prices, and that's what you'll pay forever.

  38. 80 series has gone from flagship to the step down of the step down of the step down, yet they charge for it like a flagship

  39. Okay, this really puts the comments I saw about the 4080 only having a 192-bit bus in a different perspective.

  40. This is going to bite Nvidia in the ass. I get that the cost to produce the cards has increased for them. But instead of cutting into their profits, they are passing that cost on to their customers, and then some. It's time for AMD to launch their cards at a loss and get a metric shit ton of new customers.

  41. So, I'm guessing they're legit gonna make a $599 4070 that probably only has 10GB and is only as fast as a 3080. Basically a rebrand of the 3080 itself, lol wtf.

  42. Does anyone mind taking the time to explain this to someone who has zero idea what this is about? The only things I know are that these are graphics cards, I have a GTX 1060, and the bigger the number, the shinier my game looks.

  43. 192-bit is how wide the lane connecting the GPU chip to the VRAM is; the wider that lane, the more data can go into and out of memory at once.

  44. I'm dead-ass starting to think all they're gonna release is a 4090 and a 4080, and if you want a cheaper card, you just buy the 3000 series. I feel like if they do make a 4070 or 4060, it might just be a rebrand of a 3080 or 3070 lol.

  45. Just don't buy the 4000 series at all. Get a good deal on a 3000 series card or switch to AMD. We should never support these types of business practices.

  46. Wider bus = more memory bandwidth. It's like adding more lanes to a highway to reduce traffic. Memory bandwidth = memory speed x bus width. The memory speed hasn't changed from the past generation, but the bus width has been reduced significantly compared to last gen, meaning the overall bandwidth is going to be lower.
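
(A minimal worked example of that formula. The 21 Gbps and 19 Gbps effective memory speeds below are commonly cited figures for the 4080 12GB and the 3080; treat them as assumptions rather than spec-sheet quotes.)

```python
# Peak bandwidth = effective speed per pin (Gbps) x bus width (bits) / 8 bits per byte.
# Memory speeds here are assumed/representative, not official spec quotes.

def bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return gbps_per_pin * bus_width_bits / 8

print(bandwidth_gbs(21, 192))  # ~504 GB/s: 192-bit bus (4080 12GB-style)
print(bandwidth_gbs(19, 320))  # ~760 GB/s: 320-bit bus (3080-style)
# Even with slightly faster memory, the much narrower bus leaves the new
# card well short of last gen's 80-class bandwidth.
```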

  47. Either wait until November for the RDNA3 announcement, or if you have deep pockets, get the 4090. These rebranded 4060 and 4070 Ti cards are just Nvidia's marketing strategy to shift all the 3000-series stock they're holding on to.

  48. Everyone thinks Nvidia doesn't know what they're doing, but they don't do this stuff lightly. They know people will pay, and they know they'll rake it in.

  49. It's because they're using 2GB modules instead of 1GB. Each module is 32-bit. I agree though, it's like buying a Walmart PC: fancy name, eye-catching specs, and then every corner possible has been cut underneath.
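
(A quick sketch of the module arithmetic described above; the 32-bit-per-module width is standard for GDDR6/GDDR6X, and the capacities are the ones mentioned in the comment.)

```python
# Total bus width = 32 bits x number of memory modules.
def bus_config(total_vram_gb: int, module_gb: int, bits_per_module: int = 32):
    modules = total_vram_gb // module_gb
    return modules, modules * bits_per_module

print(bus_config(12, 2))  # (6, 192): 12 GB from six 2 GB modules -> 192-bit bus
print(bus_config(12, 1))  # (12, 384): the same 12 GB from 1 GB modules would need a 384-bit bus
```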

  50. Relatively new to the GPU field. I don't play a lot of games, and many that I do play are 5+ years old. Is anything beyond a 1000 series going to make a difference for me? I've been running integrated graphics on an Intel 12th-gen i3 and it runs smoothly on medium settings for a few games; I highly doubt I need a 3080, let alone a 4080. I honestly wonder if even an 800- or 900-series card would do. I'm literally playing older games at 1080p.

  51. I'm of the opinion that we shouldn't jump to conclusions too much before seeing performance. Most of us aren't computer scientists or electrical engineers; who knows what might be behind this. If the performance is what we wanted and the price is good (I know it's Nvidia, so good luck with that), then who cares what memory bus it has?

  52. It should be noted that bus width is not everything. AMD cards used to regularly have 512-bit-wide buses, but Nvidia would have faster memory, resulting in roughly the same effective maximum throughput.
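
(For example, with rough numbers in the spirit of that era rather than exact specs of any particular pair of cards:)

```python
# Two ways to reach similar peak bandwidth; figures are illustrative.
def bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

print(bandwidth_gbs(5.0, 512))  # 320 GB/s: wide 512-bit bus, slower 5 Gbps memory (R9 290X-style)
print(bandwidth_gbs(7.0, 384))  # 336 GB/s: narrower 384-bit bus, faster 7 Gbps memory (GTX 780 Ti-style)
```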

  53. Side note, love my GTX 1060. Got it for $200 a few years ago and saw it being sold for $500 during the treason.

  54. The funniest thing about this is that when the cards release, you'll see countless flex posts showing off new cards and benchmark results. We can bitch all we want, but when this shit comes out, people here are still gonna buy them.

  55. They're powerful, yes, but their prices increased by about the same amount as their performance did. I was thinking of changing teams, but nope, I'm good with Red. Nvidia's attitude is basically "you want it, you pay for it," and people will pay, so don't expect them to get their shit together.

  56. So they're banking on people not doing their homework and buying the shittier card so they make major profits? Fuck you forever, Nvidia. Building an SFF rig with AMD and moving over to Mac for film/photo work.

  57. I love how Nvidia keeps calling the 60 a "mid-range" card despite it being nowhere near the middle. How do you put seven GPUs above your "mid-range" card and one beneath it while still getting away with calling it mid-range? A buddy of mine said it best: the 3080 and 3070 are essentially the mid-range cards now. If they bring back the Titan, then the 3080 will be the mid-range GPU. All the while they keep charging more and more, effectively redefining what mid-range means by adding more GPU tiers above it.
