Samsung SSD 850 Evo 4TB

Tony1044

Prolific Poster
For anyone with only one drive slot, get a NAS. I have all my movies and TV series on one of mine and can even sling Plex over t'interwebz.

Price change - weakened pound. Most peeps are blaming Brexit.
 

Oussebon

Multiverse Poster
The best-performing product should also be the best value per unit of performance in my eyes, otherwise there's no real incentive to buy it other than bragging rights, followed by crying because you realise what you just did to your bank account.
Of the R7 1700, R7 1700X and R7 1800X, the R7 1700 gives the best bang per buck by a huge margin.

For anyone with only one drive slot, get a NAS. I have all my movies and TV series on one of mine and can even sling Plex over t'interwebz.
I wasn't able to afford one, but I did get a 5TB USB HDD for £110 and keep the movies on that. I just plug it in and have the PC run a Plex server on the nights when I want to use the content. I do have a NAS, but my smart TV insists on dropping the connection to it at random when I try streaming from it.
 

Parramatta

Silver Level Poster
With the high-end SSDs, the cost seems to be pulling away from HDDs. A widening gap. £1350 for a 4TB! Crazy. SSDs have been around for 10 years now. I remember reading articles back in 2006/7 saying they'd reach price parity with HDDs within 10 years. Now it's another 10 years. We'll see. I hope so.

With the NVMe tech you mentioned, the drives are faster (although real-world performance isn't as quick as the headline MB/s throughput suggests), but heat is a downside. Sustained transfers need good cooling on the storage to prevent thermal throttling. I remember buying systems in the early 90s that had just one case fan. Then the CPU needed a fan. Then the GPU needed a fan. On some motherboards, even the chipset needed a fan. Then the power supply needed its own fan. Then more case fans. Then the GPU needed two or more fans or third-party cooling solutions. Now some people need to cool their bloody NVMe storage!

The other problem with NVMe PCIe drives is smaller capacities and higher prices. So yes, you get a speed improvement, but there's a downside (heat, capacity, price). Even with high-end HDDs, many of the big-capacity ones run at slower RPMs.

Once again, in the 1990s, you'd buy a new HDD after 18 months and it'd be twice the capacity, for less money and faster speeds (due to more cache, etc). There'd be no downside. Today's new tech often has a downside coupled with the improvement.

The starkest example of stagnation is vertical resolution. In 1992 I bought a 1280x1024 monitor. 25 years later, most monitors have 1080 vertical lines. Virtually no increase in a quarter of a century! Only now with 4K are we improving on that. I hope it doesn't take another 25 years to go beyond that.

I don't know if you were buying systems in the mid-90s, but the pace of change was extraordinary back then. You'd buy a custom high-end gaming rig one year, and in two years, some old grandma would buy an entry-level off-the-shelf from Currys that was better and cheaper in many respects. That can't happen now. High end CPUs from 5 years ago still cut it today.

To be clear, I'm not saying things have stopped, merely that compared to the 1990s, the rate of change appears to be slowing.
 

ubuysa

The BSOD Doctor
I fully realise that this thread is about current prices, but for a bit of comparison I Googled the historic prices of HDDs and RAM per GB (see http://www.statisticbrain.com/average-cost-of-hard-drive-storage/ and http://www.statisticbrain.com/average-historic-price-of-ram/).

In 1980, when I was working in large IBM mainframe system software support, the HDD price was $437,500 per GB (it's $0.019 today) and the RAM price was $6,328,125 per GB (it's $4.37 today), so I don't think we're doing too badly... :)
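
Just for fun, a rough back-of-the-envelope sketch in Python of how big that drop is, using only the per-GB figures above and assuming 1980 to 2017 as the interval:

[CODE]
# Per-GB prices quoted above: 1980 vs. today (the 37-year interval is my assumption).
YEARS = 2017 - 1980

prices = {
    "HDD": (437_500.00, 0.019),
    "RAM": (6_328_125.00, 4.37),
}

for name, (then, now) in prices.items():
    factor = then / now                        # total reduction, i.e. how many times cheaper
    annual = 1 - (now / then) ** (1 / YEARS)   # average compound price drop per year
    print(f"{name}: ~{factor:,.0f}x cheaper per GB, "
          f"roughly {annual * 100:.0f}% cheaper every year on average")
[/CODE]

In other words the price per GB has been falling by roughly a third every year, compounded, for nearly four decades.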
 

Parramatta

Silver Level Poster
Yep, that's the point we're making. Fantastic progress in IT (not transport technology) for decades, but then a slowing of the RATE of change that started (depending on which area of technology) about 2010-ish.

Take HDDs... in 2007 there was the 1TB, in 2009 the 2TB, in 2011 the 4TB... so far so good. You'd then expect the 8TB to arrive in 2013, but it came out in 2014. And there's no 16TB despite it being 2017 now. The 12TB isn't even out for consumers yet, and when it arrives later in the year it'll be ludicrously expensive again. The 10TB has been out for years, and it's still around £400 depending on the type.

It doesn't take much deviation from a doubling to notice a dramatic difference. Had the doubling-every-two-years rate continued, then 2013 = 8TB, 2015 = 16TB, and we'd now be waiting for the 32TB. No need for a NAS!
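
To put numbers on that extrapolation, here's a quick Python sketch; the two-year doubling from a 1TB baseline in 2007 is the assumption being tested, and the "actual" milestones are just the ones mentioned above:

[CODE]
# Extrapolate HDD capacity under a "doubles every two years" rule starting from 1TB in 2007,
# then compare with the rough milestones above (8TB arrived in 2014, 12TB is enterprise-only in 2017).
BASE_YEAR, BASE_TB = 2007, 1

def expected_tb(year):
    """Capacity predicted by doubling every two years from the 2007 baseline."""
    return BASE_TB * 2 ** ((year - BASE_YEAR) / 2)

for year in (2007, 2009, 2011, 2013, 2015, 2017):
    print(f"{year}: doubling rule predicts ~{expected_tb(year):.0f}TB")
# Prints 1, 2, 4, 8, 16 and 32TB - against an actual 8TB in 2014 and 12TB (enterprise) in 2017.
[/CODE]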

And have you noticed it's an eternity now between when a company announces a new piece of kit and when it actually is available to consumers? The process goes: rumours... press release.... CES mock-up or prototype.... enterprise roll-out.... consumers.... firmware upgrade to iron out the bugs.... the whole process (Samsung is a bad offender at this) can take 2-3 years.
 
Last edited:

ubuysa

The BSOD Doctor
Yep, that's the point we're making. Fantastic progress in IT (not transport technology) for decades, but then a slowing of the RATE of change that started (depending on which area of technology) about 2010-ish.

Take HDDs... in 2007 there was the 1TB, in 2009 the 2TB, in 2011 the 4TB... so far so good. You'd then expect the 8TB to arrive in 2013, but it came out in 2014. And there's no 16TB despite it being 2017 now. The 12TB isn't even out for consumers yet, and when it arrives later in the year it'll be ludicrously expensive again. The 10TB has been out for years, and it's still around £400 depending on the type.

It doesn't take much deviation from a doubling to notice a dramatic difference. Had the doubling-every-two-years rate continued, then 2013 = 8TB, 2015 = 16TB, and we'd now be waiting for the 32TB. No need for a NAS!

And have you noticed it's an eternity now between when a company announces a new piece of kit and when it actually is available to consumers? The process goes: rumours... press release.... CES mock-up or prototype.... enterprise roll-out.... consumers.... firmware upgrade to iron out the bugs.... the whole process (Samsung is a bad offender at this) can take 2-3 years.

Frankly I'm amazed that we've been able to keep up the rate of improvement for as long as we have; there has to be a limit to the chip technology we're using now. What will we all do when that's been reached, I wonder?
 

SpyderTracks

We love you Ukraine
Frankly I'm amazed that we've been able to keep up the rate of improvement for as long as we have; there has to be a limit to the chip technology we're using now. What will we all do when that's been reached, I wonder?

IBM are working on a CPU with "learning nodes" that mimics human brain pathways and can basically store the most-used processes for immediate retrieval. https://www.extremetech.com/extreme/93060-ibm-creates-learning-brain-like-synaptic-cpu

And then there's quantum computing, which has come on leaps and bounds in the last two years or so, again led really by IBM in the mainstream market.
 

Parramatta

Silver Level Poster
Yeah, Quantum Computing. Bring it on. We've all been reading articles about that for many years, as well as room temperature super-conductors, fusion power, supersonic passenger travel, trips to asteroids, landing on Mars, personalised genomic medicine, robot servants, smart toilets that give you medical diagnoses, fridges that order your groceries, maglev trains, AI that can pass the Turing Test, and so on.

Some of these are bound to come sooner or later, but... you know... others are "always 20 years away"... :)
 

SpyderTracks

We love you Ukraine
Yeah, Quantum Computing. Bring it on. We've all been reading articles about that for many years, as well as room temperature super-conductors, fusion power, supersonic passenger travel, trips to asteroids, landing on Mars, personalised genomic medicine, robot servants, smart toilets that give you medical diagnoses, fridges that order your groceries, maglev trains, AI that can pass the Turing Test, and so on.

Some of these are bound to come sooner or later, but... you know... others are "always 20 years away"... :)

IBM have a working quantum PC in New York or somewhere which is online and used by professors around the world for research and testing. It went live in the middle of last year, I think, and is stable. They've got around the cooling problem now.
 

Oussebon

Multiverse Poster
http://hexus.net/tech/news/graphics...price-history-high-end-nvidia-gpus-tabulated/

Interesting graph - thanks for the link. So from a low point in 2009, the price is steadily zig-zagging up.
No, try reading the chart in such a way that you're not just looking to pick a few numbers to support your thesis :)

The common accusation levelled against Nvidia is that they put their prices up every generation (I'm sure they would if they could). However, the far-right column of that table, with prices adjusted for inflation, shows that the supposed trend isn't true; if anything, prices are highest when Nvidia dominate the high end. Prices go quite high, then low, then even higher, then lower, then higher again. But a top-end GPU now costs about as much in USD as a top-end GPU did in 2000 or 2008. That's what the table shows.

Picking a "low point of 2009" is a bit ridiculous. Look at the price the year before. And the price in 2013, which is even higher than now.
 
Last edited:

Parramatta

Silver Level Poster
IBM have a working quantum PC in New York or somewhere which is online and used by professors around the world for research and testing. It went live in the middle of last year, I think, and is stable. They've got around the cooling problem now.

D-Wave have the world's first commercially available quantum computer, just announced a couple of months ago. There was a bit of controversy over whether it was actually using quantum computing (and not some fancy algorithm) and whether it's a universal computer and not just for certain specific tasks, but it seems that's been resolved now and it actually is a genuine quantum computer. That is good news. https://www.scientificamerican.com/...-rsquo-s-most-controversial-quantum-computer/
 

Parramatta

Silver Level Poster
No, try reading the chart in such a way that you're not just looking to pick a few numbers to support your thesis :)

The common accusation levelled against Nvidia is that they put their prices up every generation (I'm sure they would if they could). However, the far-right column of that table, with prices adjusted for inflation, shows that the supposed trend isn't true; if anything, prices are highest when Nvidia dominate the high end. Prices go quite high, then low, then even higher, then lower, then higher again. But a top-end GPU now costs about as much in USD as a top-end GPU did in 2000 or 2008. That's what the table shows.

Picking a "low point of 2009" is a bit ridiculous. Look at the price the year before. And the price in 2013, which is even higher than now.

Well, let's hope there are some more low price points in the future, and that the memory shortage, thermal issues, and the increasing difficulty of dropping down to another die size don't mean the generally upward zig-zag will continue. I like your optimism! And it's fantastic that AMD are joining the game again - that should help things along. In a year or two, hopefully even low- to mid-range GPUs will be able to run 4K gaming smoothly, without exotic cooling required.
 

Oussebon

Multiverse Poster
With Nvidia GPUs, the GTX 780 Ti ($699 at release, Nov 2013) was rendered obsolete by the GTX 970 ($329, Sept 2014), which offered basically the same or better performance.
The 980 Ti ($649, June 2015) was more or less replaced by the GTX 1070 ($379, June 2016), offering better performance albeit at a somewhat inflated cost. AMD had only the Fury X to try to match the 1070 and nothing to match the 1080 at that stage.

So I wouldn't be shocked if we saw the GTX 1170/2070 come in at 1080 ti performance, delivering single card 4k gaming somewhere around $350 (assuming Vega challenges Nvidia enough to keep the prices honest).

If that turns out to be the case, we'd have gone from the ~70 level card offering 1080p gaming to 4k gaming in the space of 3-4 years. Not too bad for progress.

As for thermal issues, a GTX 285 (2009) would run at about 80 degrees with the stock cooler. A 480 (2010) would run at over 90. A GTX 980 would run at around 80 under full load with the stock cooler. High-end Pascal GPUs target 83/84. It doesn't seem like anyone's struggling to cool the new cards yet, even with the stock coolers. Unlike with CPUs, where Intel's move from Haswell-E (22nm) to Broadwell-E (14nm) saw temps get a good deal more toasty.

I'm not overly worried about GPU beefiness slowing down just yet :)
 
Last edited:

ubuysa

The BSOD Doctor
D-Wave have the world's first commercially available quantum computer, just announced a couple of months ago. There was a bit of controversy over whether it was actually using quantum computing (and not some fancy algorithm) and whether it's a universal computer and not just for certain specific tasks, but it seems that's been resolved now and it actually is a genuine quantum computer. That is good news. https://www.scientificamerican.com/...-rsquo-s-most-controversial-quantum-computer/

OK, I'll be the first to ask then....

When will PCS stock a quantum computer and will I be able to cool it with a little hand fan?

:tt2:
 