"Sure our latest two generations of our main product have irreparable issues, but at least we got to hire another two dozen fat black dykes and monkeypox infested trannies."
I switched to my first AMD in like 12 years in 2022. Feels good.
Yup, I did as well, roughly around the same time. It's so much cheaper (outside of certain use cases where Intel/Nvidia may be useful) for what you can get. I don't know why, aside from habit and good marketing, normies keep getting Intel/Nvidia. AMD at present provides everything most people need, at much lower costs. For my video card especially, I would have had to step multiple tiers down if I'd wanted Nvidia.
The normies just hear "Muh raytracing" then see the nvidia cards outperforming amd. so they will then go buy an nvidia card instead.
The 7800xt card amd recently released is a beast for its price tbh. and if you dont really care about rtx, then its one of the best buys you can do. unless you really need that rtx 4090 ti. But if you do, theres no helping you anyway.
The 7800xt card amd recently released is a beast for its price tbh.
To be fair, that's true for most of their modern cards.
unless you really need that rtx 4090 ti. But if you do, theres no helping you anyway.
I hope those people are either industry professionals of some sort, or really filthy rich. Any normie gamer who gets a 4090 because it's the top of the line is retarded. You can get even a 7900 XTX for around half the price. It will do almost anything a 4090 can do, almost as well, in real world gaming applications.
Yeah I had a GTX 970 as well, for years. I upgraded it last winter for a used Radeon XT 5600 - still an old card but it does about 50% better in benchmarks and it cost me almost nothing. I'm so glad I switched, too - I run lunix and nVidia drivers are unbelievably shit.
It's something to do with certain conditions in their boost code causing the CPU to request extra voltage it doesn't need.
Having seen how much chaos a 0.1 increase in voltage can do while trying to figure out overclocking, I can definitely believe that requesting X% too much voltage constantly would fuck up the CPU quickly.
Estimated 50% of the chips will either die or have significant performance problems, but up to 100% can't meet the advertized performance without being unstable.
Intel says their manufacturing process had some contamination problems causing corrosion and thus degrading the chips.
But their chips were also pushed way too much to try and score higher than the competition, and they have an upcommming ''micro-code update'' that will ''fix the issues''... and significantly lower performances ( in other words, even if your Intel CPU dosen't die, it won't perform as advertised ).
It is unclear which CPUs among the 13th and 14th gens have these problems, and Intel is not being transparent at all, which is unsurprising but still frustrating.
No matter what, Intel will have to compensate costumers. Either willingly, or after a class-action lawsuit they are guaranteed to lose.
They just failed to meet the new NOx emission standarts and set-up a fraudulent scheme to fool testing.
Intel's 13th and 14th gen CPUs don't ''drive'' as fast as advertised ( best case scenario ), the ''car'' shuts-down randomly and dosen't roll straight or steady ( 50%-100% pre code update), or it just fucking dies after a short while. ( 25% - 50% ).
Will it last weeks? Months? Two years? Nobody knows with the Intel Corrosion Roulette!
InB4 Intel saying it's normal for a CPU to die after 2 years. I never had a CPU die on any of my computers since I got one. 10 years +, if it hasen't been sent for recycling yet, it still works but I don't use it anymore because it's too outdated.
It'd also be a good way to boost their sales of Arrow Lake. Oh hey there corporate customerino's we'll offer you this 10% discount to trade in your old 13-14th gen processors for our brand spanking new 15th gen CPUs that don't have this problem!
Not sure what to do. I have a 9900k and its showing its age. I use my computer for work, 3d modeling/animation/video editing. I plan to get a new cpu soon.. but with intel 13th and 14th gen being a ticking time bomb. I game too, but that is now secondary. Trying to get a new cpu because i want to bake simulations faster and also faster rendering between frames.
Im planning on waiting for 15th gen/arrowlake from intel which will be allegedly released this fall. But im gonna assume its paper launch and mobo will be very over priced for first few months or half a year. I need something in the next few months. I looked up amd and they are releasing new cpus in a few weeks. Seems x3d is very good for certain games, but lower clocks. And non x3d variants are alright, but have higher idle wattage than intel cpus and less stable memory controllers.
Im not sure its useful or not.. but intel has quicksync. Not even sure if its used. It apparently use the igpu to help with video editing. But currently i have a rtx 4080. Does amd have an equivalent for their cpus?
If AMD is an option, they plan to release their new high-end CPUs mid-August.
But what's already out is good. Just avoid Intel 13th and 14th gens, and I wouldn't trust the next one before it's been out and got some mileage either.
Got an 5800X3D not long agoand feels like it punches above it's price in performance. It doesn't make sense for you as a new build since I was just trying to eke another use out of my am4 socket mobo, but I can at least attest one of the X3D offerings is a good performer, noticeable improvements in games like X4 and total war and like triple the performance on cinebench compared to my old Ryzen.
This is an issue for both AMD and Intel. To summarize it, there used to be a lot of headroom left in processors for safety and efficiency. Why gain an extra 5% of performance if it costs 30% more power (and heat) and doubles the failure rate? But in today's world, both AMD and Intel need every last drop of performance to show continued improvement and out-sell each other. On top of that, motherboard manufacturers are incentivized to do the same thing, so they push voltage (power) even further, right out of the box. Combined, this causes modern processors to balance on a knife-edge between maximum performance and failure.
If you buy current hardware, the safe bet is to adjust the voltage and other settings yourself. I have an Intel 13900k myself (paired with an RTX 4090 too), but I have both undervolted from their factory settings. Because we live in a backwards world, reducing power from the factory settings actually results in better performance than the average for my hardware.
I was considering getting something like a 13/14700T whenever I did a GPU upgrade, with those being low power models I assume it's still the same silicon anyway so the same problem. Oh well, they haven't said a word about RX 8000 series in a while and I'm not sure I need an upgrade anyway. Kick the can down the road.
Enshittification of everything
But at least Intel had a diverse enough team of engineers to get more DEI investment dollars from BlackRock.
Have a hard time imagining there's DIE designing the silicon; maybe in manufacturing?
Nah, they're in the 'assault workers with baseball bats department'.
Reminder of how they got here:
Step 1: https://i.imgur.com/FAcd6Rw.png
Step 2: CPU design takes 4 years, so around 2019 they start getting their shit kicked in by AMD.
Step 3: To win back some benchmarks they crank the clock speeds and voltages to the moon.
Step 4: Chips explode.
DEI really seems like it's meant to crash the global economy so we're all begging for the 2030 agenda.
DEI ruins everything!
"Sure our latest two generations of our main product have irreparable issues, but at least we got to hire another two dozen fat black dykes and monkeypox infested trannies."
Looks like I picked a good time to switch, last build. Dodged a bullet.
I switched to my first AMD in like 12 years in 2022. Feels good.
Yup, I did as well, roughly around the same time. It's so much cheaper (outside of certain use cases where Intel/Nvidia may be useful) for what you can get. I don't know why, aside from habit and good marketing, normies keep getting Intel/Nvidia. AMD at present provides everything most people need, at much lower costs. For my video card especially, I would have had to step multiple tiers down if I'd wanted Nvidia.
The normies just hear "muh raytracing," then see the Nvidia cards outperforming AMD, so they go buy an Nvidia card instead.
The 7800 XT card AMD recently released is a beast for its price, tbh. And if you don't really care about RTX, it's one of the best buys you can make. Unless you really need that RTX 4090 Ti. But if you do, there's no helping you anyway.
To be fair, that's true for most of their modern cards.
I hope those people are either industry professionals of some sort or really filthy rich. Any normie gamer who gets a 4090 because it's the top of the line is retarded. You can get even a 7900 XTX for around half the price. It will do almost anything a 4090 can do, almost as well, in real-world gaming applications.
I still use an i5-6600K that I got nearly a decade ago, but when I finally upgrade I'm definitely going AMD.
I used, I think it was an i7-6700K, and an Nvidia GTX 970, for like twelve years.
Finally upgraded to a much more beastly and modern machine, a Ryzen 7 7800X3D and a 7900 XTX. Hopefully that also lasts me a decade plus.
Yeah, I had a GTX 970 as well, for years. I upgraded it last winter for a used Radeon RX 5600 XT - still an old card, but it does about 50% better in benchmarks and it cost me almost nothing. I'm so glad I switched, too - I run Linux and Nvidia drivers are unbelievably shit.
Just checked, yeah, I'm on 12th gen. Good
Are they just running too hot?
What else would cause a functioning CPU to destroy itself?
It's something to do with certain conditions in their boost code causing the CPU to request extra voltage it doesn't need.
Having seen how much chaos a 0.1 V increase can cause while trying to figure out overclocking, I can definitely believe that constantly requesting X% too much voltage would fuck up the CPU quickly.
Apparently the CPU's internal software was requesting too high a voltage, which is what's doing the damage.
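If you want to eyeball what your chip is actually being fed, here's a minimal sketch, assuming a Linux box whose hwmon drivers expose voltage inputs at all - sensor names and availability vary wildly by board, and many systems expose nothing useful:

```python
import glob
import time

# Hedged sketch: poll whatever voltage inputs the kernel's hwmon
# subsystem exposes and report the peak seen, to eyeball whether a
# rail spikes under light load. Per the hwmon sysfs ABI, in*_input
# files report millivolts. Which sensors exist (if any) depends
# entirely on your board and drivers.
def poll_voltage_peaks(seconds=30, interval=0.5):
    peaks = {}
    deadline = time.time() + seconds
    while time.time() < deadline:
        for path in glob.glob("/sys/class/hwmon/hwmon*/in*_input"):
            try:
                with open(path) as f:
                    millivolts = int(f.read().strip())
            except (OSError, ValueError):
                continue  # sensor vanished or returned junk; skip it
            peaks[path] = max(peaks.get(path, 0), millivolts)
        time.sleep(interval)
    for path, mv in sorted(peaks.items()):
        print(f"{path}: peak {mv / 1000:.3f} V")

if __name__ == "__main__":
    poll_voltage_peaks()
```

If a core rail peaks way above anything you'd ever set manually, that's consistent with the over-request story.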
An estimated 50% of the chips will either die or have significant performance problems, but up to 100% can't meet the advertised performance without being unstable.
Intel says their manufacturing process had some contamination problems causing corrosion and thus degrading the chips.
But their chips were also pushed way too hard to score higher than the competition, and they have an upcoming "microcode update" that will "fix the issues"... and significantly lower performance (in other words, even if your Intel CPU doesn't die, it won't perform as advertised).
It is unclear which CPUs among the 13th and 14th gens have these problems, and Intel is not being transparent at all, which is unsurprising but still frustrating.
No matter what, Intel will have to compensate customers. Either willingly, or after a class-action lawsuit they are guaranteed to lose.
Sounds like “dieselgate” but for chips
https://en.m.wikipedia.org/wiki/Volkswagen_emissions_scandal
Similar, but Volkswagen's cars still worked fine.
They just failed to meet the new NOx emission standards and set up a fraudulent scheme to fool testing.
Intel's 13th and 14th gen CPUs don't "drive" as fast as advertised (best case scenario), the "car" shuts down randomly and doesn't roll straight or steady (50%-100% pre-microcode update), or it just fucking dies after a short while (25%-50%).
Will it last weeks? Months? Two years? Nobody knows with the Intel Corrosion Roulette!
Inb4 Intel says it's normal for a CPU to die after 2 years. I've never had a CPU die on any of my computers since I got one. Even at 10+ years, if it hasn't been sent for recycling yet, it still works; I just don't use it anymore because it's too outdated.
30+ years here. Never once.
Entirely fake scandal. Nobody cares about muh emisshuns, but they do care about their expensive computer cooking itself to death.
My 4th gen Intel CPU is still chugging along nicely, lol.
Knocks wood vigorously
A sacred artifact from the age of the Gods. Take good care of it.
Between Intel paying vendors not to stock AMD and Intel's frequent leftist crap, I haven't touched them in decades.
Awesome. I just built my new PC with a 13700KF processor. Good to know it's going to shit the bed at random.
F
See, I knew fucking Intel couldn't be trusted, despite the hype that they'd changed and were producing better products.
It'd also be a good way to boost their sales of Arrow Lake. Oh hey there, corporate customerinos, we'll offer you this 10% discount to trade in your old 13th/14th gen processors for our brand spanking new 15th gen CPUs that don't have this problem!
''Trust us bro''
Mmmmmmno.
Not sure what to do. I have a 9900K and it's showing its age. I use my computer for work: 3D modeling/animation/video editing. I plan to get a new CPU soon, but Intel's 13th and 14th gens are a ticking time bomb. I game too, but that is now secondary. Trying to get a new CPU because I want to bake simulations faster and get faster rendering between frames.
I'm planning on waiting for 15th gen/Arrow Lake from Intel, which will allegedly be released this fall. But I'm gonna assume it's a paper launch and mobos will be very overpriced for the first few months or half a year. I need something in the next few months. I looked up AMD and they are releasing new CPUs in a few weeks. Seems X3D is very good for certain games, but with lower clocks. And the non-X3D variants are alright, but have higher idle wattage than Intel CPUs and less stable memory controllers.
I'm not sure if it's useful or not, but Intel has Quick Sync. Not even sure if it gets used. It apparently uses the iGPU to help with video editing. But currently I have an RTX 4080. Does AMD have an equivalent for their CPUs?
You can keep using your RTX 4080 on an AMD motherboard. Resizable BAR works there too (AMD brands it Smart Access Memory), so it should run fine.
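On the Quick Sync question: AMD's rough equivalent is the VCN encode block on their chips with graphics, exposed through AMF (or VAAPI on Linux), but with an RTX 4080 your editor will almost certainly use NVENC instead, so the CPU choice matters less for encoding. A rough sketch of how that selection looks from ffmpeg's side, assuming ffmpeg is installed; the encoder names are real ffmpeg encoders, but whether each one actually works depends on your hardware, drivers, and build:

```python
import shutil
import subprocess

# Illustrative only: pick whichever hardware H.264 encoder this ffmpeg
# build knows about. A build listing an encoder doesn't guarantee your
# hardware/driver supports it at runtime.
HW_ENCODERS = [
    "h264_nvenc",  # Nvidia NVENC - what an RTX 4080 would use
    "h264_qsv",    # Intel Quick Sync
    "h264_amf",    # AMD VCN via AMF (mainly Windows; Linux AMD is usually h264_vaapi)
]

def transcode(src, dst):
    if not shutil.which("ffmpeg"):
        raise RuntimeError("ffmpeg not found on PATH")
    listed = subprocess.run(["ffmpeg", "-hide_banner", "-encoders"],
                            capture_output=True, text=True).stdout
    for enc in HW_ENCODERS:
        if enc in listed:
            print(f"trying {enc}")
            subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", enc, dst],
                           check=True)
            return
    raise RuntimeError("no hardware H.264 encoder in this ffmpeg build")

# transcode("input.mp4", "output.mp4")  # hypothetical file names
```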
If AMD is an option, they plan to release their new high-end CPUs mid-August.
But what's already out is good. Just avoid Intel 13th and 14th gens, and I wouldn't trust the next one before it's been out and got some mileage either.
Got a 5800X3D not long ago and it feels like it punches above its price in performance. It doesn't make sense for you as a new build, since I was just trying to eke more use out of my AM4 socket mobo, but I can at least attest that one of the X3D offerings is a good performer: noticeable improvements in games like X4 and Total War, and like triple the performance in Cinebench compared to my old Ryzen.
Nothing to do with voltage, just jewish design. God please smite all jews off the face of earth. Amen.
Probably changed the particle spec on their ion implants to up their yield, since I've heard they run dirty.
Dang, if true, it makes me feel better about going AMD for my last build.
https://www.youtube.com/watch?v=OVdmK1UGzGs
https://www.youtube.com/watch?v=DznKg1IjVs0
https://www.youtube.com/watch?v=_zTX26Qjzs8
This is an issue for both AMD and Intel. To summarize it, there used to be a lot of headroom left in processors for safety and efficiency. Why gain an extra 5% of performance if it costs 30% more power (and heat) and doubles the failure rate? But in today's world, both AMD and Intel need every last drop of performance to show continued improvement and out-sell each other. On top of that, motherboard manufacturers are incentivized to do the same thing, so they push voltage (power) even further, right out of the box. Combined, this causes modern processors to balance on a knife-edge between maximum performance and failure.
In AMD's case, processors were overvolting so severely that they damaged the socket. Intel also overvolted their processors, but to a slightly lesser degree, one that causes gradual permanent damage (erosion) ultimately resulting in failure.
If you buy current hardware, the safe bet is to adjust the voltage and other settings yourself. I have an Intel 13900K myself (paired with an RTX 4090, too), but I have both undervolted from their factory settings. Because we live in a backwards world, reducing power from the factory settings actually results in better-than-average performance for my hardware.
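If you want to sanity-check that an undervolt actually cut package power, here's a minimal sketch, assuming a Linux machine that exposes Intel RAPL counters - the path below is typical but not guaranteed, and reading it usually needs root:

```python
import time

# Rough package-power readout from Intel's RAPL energy counter.
# Assumes /sys/class/powercap/intel-rapl:0 exists (Linux + Intel CPU);
# this sketch ignores the counter wrap at max_energy_range_uj, so
# keep the measurement interval short.
RAPL_COUNTER = "/sys/class/powercap/intel-rapl:0/energy_uj"

def package_watts(seconds=5):
    with open(RAPL_COUNTER) as f:
        e0 = int(f.read())
    time.sleep(seconds)
    with open(RAPL_COUNTER) as f:
        e1 = int(f.read())
    return (e1 - e0) / 1e6 / seconds  # microjoules -> watts

if __name__ == "__main__":
    print(f"average package power: {package_watts():.1f} W")
```

Run it under the same load before and after changing the voltage offset; if the undervolt took, the number should drop.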
Refurbished server prices are going to go through the roof. Might start looking at AMD for servers as we're looking to do refresh soon.
Last time I looked, they seemed pretty lacking in the reasonably priced mid-range. It was all $$,$$$ "Threadripper".
Well then, I've got 12th gen at least.
I was considering getting something like a 13/14700T whenever I did a GPU upgrade; those being low-power models, I assume it's still the same silicon anyway, so the same problem. Oh well, they haven't said a word about the RX 8000 series in a while and I'm not sure I need an upgrade anyway. Kick the can down the road.
It's good to be AMD these days.
Bad time to invest.