Of the R7 1700, R7 1700X and 1800X, the R7 1700 gives the best bang per buck by a huge margin. The best-performing product should also be the best value for money in my eyes; otherwise there's no real incentive to buy it other than bragging rights, followed by crying because you realised what you just did to your bank account.
I wasn't able to afford one, but I did get a 5TB USB HDD for £110 and keep the movies on that. I just plug it in and have the PC run a Plex server on nights when I want to use the content on it. I do have a NAS, but my smart TV insists on dropping the connection to it at random for no reason when I try streaming from it. For anyone with only one drive slot, get a NAS. I have all my movies and TV series on one of mine and can even sling Plex over t'interwebz.
Yep, that's the point we're making. Fantastic progress in IT (not transport technology) for decades, but then a slowing of the RATE of change that started (depending on which area of technology) about 2010-ish.
Take HDDs... in 2007, there was the 1TB, in 2009, the 2TB, in 2011, the 4TB... so far so good. You then expect the 8TB to come in 2013, but that came out in 2014. And there's no 16TB despite it being 2017 now. The 12TB isn't even out for consumers yet, and when it arrives later in the year it'll be ludicrously expensive again. The 10TB has been out for years, and it's still around £400 depending on the type.
It doesn't take much deviation from a doubling to notice a dramatic difference. Had the 'doubling every two years' rate continued, then 2013 = 8TB, 2015 = 16TB, and we'd now be waiting for the 32TB. No need for a NAS!
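Just to spell out the back-of-the-envelope arithmetic behind that, here's a rough sketch, assuming a strict doubling every two years from the 1TB drive in 2007 (the years and starting capacity are taken from the post above; the rest is just the hypothetical trend continuing):

```python
# Rough sketch: HDD capacity if the "double every two years" trend
# from 1TB in 2007 had simply carried on.
start_year, start_tb = 2007, 1

for year in range(2007, 2019, 2):
    doublings = (year - start_year) // 2
    print(year, f"{start_tb * 2 ** doublings}TB")

# Prints: 2007 1TB, 2009 2TB, 2011 4TB, 2013 8TB, 2015 16TB, 2017 32TB
```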
And have you noticed it's an eternity now between when a company announces a new piece of kit and when it's actually available to consumers? The process goes: rumours... press release... CES mock-up or prototype... enterprise roll-out... consumers... firmware upgrade to iron out the bugs... the whole process (Samsung is a bad offender at this) can take 2-3 years.
Frankly I'm amazed that we've been able to keep up the rate of improvement for as long as we have; there has to be a limit to the chip technology we're using now. What will we all do when that's been reached, I wonder?
Yeah, Quantum Computing. Bring it on. We've all been reading articles about that for many years, as well as room temperature super-conductors, fusion power, supersonic passenger travel, trips to asteroids, landing on Mars, personalised genomic medicine, robot servants, smart toilets that give you medical diagnoses, fridges that order your groceries, maglev trains, AI that can pass the Turing Test, and so on.
Some of these are bound to come sooner or later, but... you know... others are "always 20 years away"...
No, try reading the chart in such a way that you're not just looking to pick a few numbers to support your thesis: http://hexus.net/tech/news/graphics...price-history-high-end-nvidia-gpus-tabulated/
Interesting graph - thanks for the link. So from a low point in 2009, the price is steadily zig-zagging up.
IBM have a working quantum computer in New York or somewhere which is online and used by professors around the world for research and testing. It went live in the middle of last year, I think, and is stable. They've got around the cooling problem now.
No, try reading the chart in such a way that you're not just looking to pick a few numbers to support your thesis
The common accusation levelled against Nvidia is that they put their prices up every generation (I'm sure they would if they could). However, the far-right column of that table, which has prices adjusted for inflation, shows that the supposed trend isn't true and that prices are highest when they dominate the high end. Prices are quite high, then low, then even higher, then lower, then higher again. But a top-end GPU now costs as much in USD as a top-end GPU did in 2000 or 2008. That's what the table shows.
Picking a "low point of 2009" is a bit ridiculous. Look at the price the year before. And the price in 2013, which is even higher than now.
D-Wave have the world's first commercially available quantum computer, just announced a couple of months ago. There was a bit of controversy over whether it was actually using quantum computing (and not some fancy algorithm) and whether it's a universal computer and not just for certain specific tasks, but it seems that's been resolved now and it actually is a genuine quantum computer. That is good news. https://www.scientificamerican.com/...-rsquo-s-most-controversial-quantum-computer/
OK, I'll be the first to ask then....
When will PCS stock a quantum computer and will I be able to cool it with a little hand fan?
:tt2:
I must see Weird Science again