SpyderTracks
We love you Ukraine
I appreciate the embedded link 😂
The dead video scenario, whilst annoying, was the perfect opportunity to troll the forum. I’m not even sorry 🤪
Just got my new 3090 build a couple of days ago and now I'm reading about this; it's not good for my heart!
Seriously though, what can I do to make a bricking as unlikely as possible?
I mean, I'm not touching New World with a bargepole, but Anno 1800 is my bag.
Is putting VSync on in games enough? Should I go into the GeForce control panel and set the max frame rate instead? Do both?
Can someone calm an old man down?
To prevent the bricking, make sure you have an FPS cap on, either globally via RivaTuner, using the in-game cap, or by turning on VSync.
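For anyone wondering what a frame cap actually does under the hood, here's a minimal sketch (renderFrame() is just a hypothetical placeholder, not any real engine call): if a frame finishes early, the loop sleeps away the spare time instead of immediately starting the next one.

```cpp
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    // Target 60 FPS: each frame gets ~16.7 ms of wall time.
    const auto frame_budget = std::chrono::microseconds(1'000'000 / 60);

    for (int frame = 0; frame < 600; ++frame) {   // run ~10 seconds, then exit
        const auto frame_start = clock::now();

        // renderFrame();  // hypothetical stand-in for the game's actual work

        const auto elapsed = clock::now() - frame_start;
        if (elapsed < frame_budget) {
            // Frame finished early: idle out the rest of the budget instead of
            // immediately starting the next frame at an uncapped rate.
            std::this_thread::sleep_for(frame_budget - elapsed);
        }
    }
    return 0;
}
```

With the cap removed, that sleep never happens, and a trivially cheap scene (a menu or loading screen, say) will render as many frames per second as the card can physically produce.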
It's extremely strange, especially for an MMO!
FWIW I've passed this on to PCS via the moderator's backchannel. It seems utterly bonkers to me that hardware can be damaged by commercial software.
One thing I will say though is that it appears to be EVGA cards that were mostly affected (not sure if PCS ever even supplied any of those anyway - afaik they don't normally), and EVGA have been organising RMAs for their broken cards very quickly. It also looks like Amazon have made a fix as well, so do not panic.
There was only a super limited batch of EVGA cards a month or so ago - fewer than 20.
We would follow the standard returns procedure if any card sold to a customer or in a build was affected or needed to be returned as faulty. I have passed this on to our technical manager in case he was not aware already, as it is something to hold on to for future reference - as I understand it, EVGA in retail are working directly with customers to replace cards; hopefully they are doing advance replacements.
We have had a larger allocation of 3070 Ti cards from EVGA (XC3) recently, but my understanding is that we haven't had any other SKUs from EVGA.
So it appears that even after full release, this game is still causing major issues on certain GPUs.
Not good, but explained brilliantly as ever, and no punches pulled.
One surprise was his summarisation of how dreadful Gigabyte RMAs are. I'm not sure if that's a geographical issue related to the States in particular, but I've preferred Gigabyte GPUs and motherboards for some time because of how good their RMA service was. Maybe things have changed; I must admit I've not had to RMA anything for a long time, and I moved to Asus boards on the last build.
Silly question, sorry, but isn't stress testing and even benchmarking software written to push processors to the extreme to see what the best performance you can get is?
As such, how come a game “breaks” it but not a benchmark or stress test program?
Sent from my iPhone using Tapatalk
That's what he's saying though. FurMark was so well known for causing GPU failures that running it would void your warranty on the GPU for a long time, so what FurMark became famous for was highlighting poor GPU cards.
Certain programs push an unreasonable load that you'd never find in any real-world use.
Who is to say what 'real world' use is? Whilst it's unreasonable to use a graphics card as a hammer, and you should expect it to break if you do, any graphics card should be able to execute any combination of its own instruction set, for any length of time. You simply can't say that instructions X, Y, and Z cannot be executed sequentially (for example) or the card will break. If that scenario is thought unlikely ever to happen but catastrophic if it does, then the graphics card should not permit the execution of instructions X, Y and Z internally. It should fail safe.
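To make the "fail safe" idea concrete, here's a purely illustrative sketch of firmware-style self-protection - simulated sensors and made-up limits, not based on any vendor's actual firmware: whatever the software asks for, the card throttles itself back into spec rather than letting itself be damaged.

```cpp
#include <algorithm>
#include <cstdio>

// Simulated telemetry: power and temperature rise with the clock limit.
// The relationships and thresholds below are entirely made up.
struct Telemetry { double watts; double celsius; };

Telemetry readSensors(int clockMhz) {
    const double watts = clockMhz * 0.25;
    const double celsius = 30.0 + watts * 0.12;
    return {watts, celsius};
}

int main() {
    constexpr double kPowerLimitW = 450.0;
    constexpr double kTempLimitC = 95.0;
    int clockLimitMhz = 2100;

    for (int tick = 0; tick < 10; ++tick) {
        const Telemetry t = readSensors(clockLimitMhz);
        if (t.watts > kPowerLimitW || t.celsius > kTempLimitC) {
            // Out of spec: throttle the clock rather than keep running the
            // workload at full speed and risk damage.
            clockLimitMhz = std::max(300, clockLimitMhz - 100);
        }
        std::printf("tick %d: %d MHz, %.1f W, %.1f C\n",
                    tick, clockLimitMhz, t.watts, t.celsius);
    }
    return 0;
}
```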
He's not saying it's not a hardware issue; what he's saying is that it's highlighting flaws in various GPUs.
In blaming the allegedly inefficient code (the 1+1+1 to get 50 example), he's saying that if the code had been more efficient the 'flaw' would not have been revealed. My argument is that if any product can be used in some legitimate way that will break it, then the product should protect itself from that legitimate use internally. It should not just break - whether the legitimate use is 'real world' or not.
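For anyone who didn't watch the video, the "1+1+1 to get 50" point is just this sort of thing (a hypothetical illustration, not the game's real code): both paths produce the same answer, but one burns fifty times the work to get there.

```cpp
#include <iostream>

int main() {
    // The wasteful path: fifty separate additions to reach 50.
    int slow = 0;
    for (int i = 0; i < 50; ++i) {
        slow += 1;
    }

    // The efficient path: the same result directly.
    const int fast = 50;

    std::cout << slow << " == " << fast << "\n";
    return 0;
}
```

Scaled up across a whole frame, that kind of waste means far more GPU load for exactly the same picture - which is the sense in which inefficient code "revealed" a fault that efficient code might never have tripped.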
I don't think anyone is arguing that, but in the real world, bugs are found on a daily basis. It's about how those bugs are addressed, and it takes cooperation from all parties.
A software bug is a purely software issue. It should simply not be possible to execute any code that breaks hardware. Developers debugging software should never have to worry about whether a bug might damage hardware.
Again, that's not being argued.
There is a big distinction between software and hardware that the video above seems to deliberately try to muddle - perhaps because those involved are trying to muddle it.
If you build a processor (CPU/GPU or whatever) that has a published set of instructions then I should be able to execute any and all of those instructions in my software without being in the slightest bit concerned about how they might affect the hardware.