New SSD

A couple of Fridays ago, a solid state drive I’d been keeping an eye on — the 256GB Crucial M4 — became available again at a reasonable price, so within a few minutes I placed an order. It arrived on Monday, swaddled in bubble wrap (a useful material to keep around for future eventualities — which inevitably turn out to be the fun of popping bubbles).

It came in a small box, sealed with a sticker but with no accompanying material beyond the packing foam. This seemed to indicate either that the drive was so simple and idiot-proof that no instructions were needed, or that it would be so complicated that prior experience was necessary and a mere few pages of instructions just weren’t going to suffice. It transpires that the drive was not quite idiot-proof after all, as demonstrated in the following account, which I will punctuate with some of the fundamental lessons I’ve learned in the last two weeks.

Although I’d been watching it for months and anticipating upgrading to it, I realised I hadn’t properly planned what I was going to do with it. I only had the basic idea that I would use it as my system drive, and that I’d relocate some things previously kept on C: to my secondary, non-SSD, drive (a plain 2TB Western Digital HDD).

First I installed Windows on it. On booting into an ugly low-res mode and finding myself unable to change the graphics mode or connect to a network (and having had all my driver disks and manuals thrown out with the boxes last year :<), I had a sinking feeling about painstakingly finding and installing everything to get it working again.

Lesson 1. Have a plan for how you want to set up the new drive, and take into account that reinstalling Windows means reinstalling drivers and reconfiguring everything.

Instead, I optimistically chose to transfer my system from the old drive to the new one. The first step was to clean up the partition, moving anything that wouldn’t reside on the SSD onto other storage. The next issue was that the old partition was 300GB (yes, an odd and unaesthetic size for a disk partition), but the new disk is only 256GB. So, shrink the partition in Windows until it’s smaller than the target drive (it can later be expanded to fill the target). Shrinking can only reclaim unused space at the end of the partition, so deleting as much as possible and defragmenting will help. This took many hours.
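
For reference, the same shrink can be done from an elevated command prompt with diskpart; the volume number and the amount below are just examples for my layout, so check the listing first:

    rem inside diskpart, started from an elevated command prompt
    list volume
    rem volume 1 is assumed to be C: here; confirm against the listing above
    select volume 1
    rem report the maximum shrink available, in MB
    shrink querymax
    rem shrink by roughly 100GB (the value is in MB); adjust to suit
    shrink desired=100000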

Then make a system image of the drive onto removable media. Surprisingly, it took only a few hours to transfer 200GB onto my external HDD over USB. It was much quicker when using the USB3 port, even though the drive itself doesn’t support USB3. That was a bit strange, but I wasn’t complaining. (Though, when a data operation takes a suspiciously long or a suspiciously short amount of time to complete, it’s always a good idea to check that it went according to plan — witness such examples as a backup to /dev/null, or an inadvertent comment in DELETE FROM accounts -- WHERE id = 123.)
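
For the record, this is the Windows 7 “Create a system image” feature; the wbadmin command-line equivalent is roughly the following, with E: standing in for whatever letter the external drive happens to get:

    rem image C: (plus the hidden system partition) onto the external drive
    wbadmin start backup -backupTarget:E: -include:C: -allCritical -quiet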

Boot using the Windows setup disc, choose recovery, and restore system image. Select the target and begin restoring it. After a few seconds, this error appears:

No disk that can be used for recovering the system disk can be found.

It turned out that the image was still just slightly too large, taking into account a hidden 100MB partition (the System Reserved one) used for Windowsy stuff. I tried restoring without that, using the recovery command line. This worked. But the resulting drive did not boot (which was not surprising; leaving out the hidden partition had been a long shot). It could not be repaired, so I was back to square one.
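
For anyone curious, the command-line restore attempt went roughly like this; the backup target letter is again an assumption, the version identifier is whatever the listing reports, and wbadmin start sysrecovery is only available from within the recovery environment:

    rem list the images available on the external drive
    wbadmin get versions -backupTarget:E:
    rem restore a specific image, using a version identifier from the listing above
    wbadmin start sysrecovery -version:MM/DD/YYYY-HH:MM -backupTarget:E: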

I booted back into the old system and deleted some more stuff. The partition shrank a bit more. A partition can only be shrunk where there is unused space at its end, so deleting and defragmenting can help provide that. Unfortunately some things, such as the MFT (NTFS’s master file table), cannot be defragmented or even reduced once they become bloated. Eventually, shrinking reached its limit at around 200GB.
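
For what it’s worth, two built-in commands are handy at this stage: fsutil shows how big the MFT has grown, and defrag can at least try to consolidate free space before another shrink attempt (the /X switch; how much it helps varies):

    rem show NTFS metadata sizes for C:, including the MFT
    fsutil fsinfo ntfsinfo C:
    rem consolidate free space on C:, with progress and verbose output
    defrag C: /X /U /V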

Make another system image. This took way longer than last time for some reason, using the same drive and USB port, even though there was less data to back up. I stayed up till it was done and went to bed.

The next evening, I tried restoring it again. This time there was a different and clearer (but misleading) error:

The disk that is set as active in the BIOS is too small to recover the original system disk.

I gave up the idea of reusing the existing Windows installation. Speculation on the internet has it that the first cryptic error is caused by the image being too large for the target; the second error is caused by the original complete drive (2TB) being too large, even though only a relatively small partition at the beginning of it is being copied.

Lesson 2. Don’t bother trying to reuse a Windows image unless the source and destination are the same size.

So, having installed Windows, deleted it, installed a complete disk image, and abandoned it (how many write-cycles did I burn up doing that?), I installed Windows again. I copied the basic drivers from the old installation (a slightly dodgy approach, but one which seems to have worked). I installed all available updates.

Then I benchmarked it. It was slightly slower than the HDD it was replacing.

It turns out I’d set it up in IDE mode instead of AHCI (I don’t know what the difference is). There is an option in the BIOS to set a SATA port to AHCI, which should be enabled before installing the operating system. Oops. Fortunately there was a registry hack that could fix it without requiring a reinstall.
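
The hack is essentially the widely circulated one for Windows 7: tell Windows to load the standard AHCI driver at boot before flipping the BIOS setting, so the next boot doesn’t blue-screen. Something along these lines, where the iaStorV entry only applies if the Intel storage driver is installed:

    rem enable the Microsoft AHCI driver at boot
    reg add HKLM\SYSTEM\CurrentControlSet\services\msahci /v Start /t REG_DWORD /d 0 /f
    rem likewise for the Intel driver, if present
    reg add HKLM\SYSTEM\CurrentControlSet\services\iaStorV /v Start /t REG_DWORD /d 0 /f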

Lesson 3. Set the BIOS to AHCI mode before installing the operating system on your new SSD.

I changed the registry and the BIOS, and rebenchmarked. It was much improved.
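
I won’t reproduce the full numbers here, but for a quick before-and-after check Windows’ own WinSAT tool is enough (it is also what feeds the Windows Experience Index mentioned further down):

    rem run the built-in disk assessment against the C: drive (elevated prompt)
    winsat disk -drive c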

The Crucial M4’s performance weakness is in small random reads. I don’t recall noticing this when I read all the reviews and benchmarks before buying it, but it seems obvious now. This is somewhat frustrating, because the main performance advantage of an SSD is (or should be) in random access times, which is exactly what ought to make defragmentation and prefetching unnecessary. And since defragmentation is actively harmful to SSDs, having to go without it would ideally be compensated for by fast random access making it pointless in the first place.

Random writes are less of a problem. Although writing is generally a more expensive operation than reading, it’s not that surprising that the SSD handles random writes relatively well. The writes are sent to the drive, and the drive accepts them immediately; but it retains them in its own memory and coordinates them into bulk updates, rather than sprinkling them in random places on the flash. (So the writes are to random places in the file system, but are implemented as a single block write in the hardware.) This excellent article at ArsTechnica, which I read as part of my research, provides fascinating insight into how SSDs really store data.

Although SSDs present approximately the same interface for data storage as a HDD, that interface is a leaky abstraction which will sometimes reveal vast differences in how they behave internally. One such difference is that blocks are not merely overwritten; they are marked as “no longer used” by a sufficiently sophisticated operating system, and the SSD will internally reuse them for writes to other pieces of data. So some basic operations, like defragmenting or formatting a disk, have completely different ramifications for SSDs. Defragmenting is almost certainly harmful, and formatting should be more a matter of notifying the SSD that blocks aren’t in use than of zeroing them. (My concern with the frequent reinstallations is that the intervening formats and reimages may not have fully returned the blocks to the SSD, leaving it with less room to manoeuvre when rearranging data internally. But I guess there is no way to tell.)

Lesson 4. Configure the operating system for efficient and safe SSD use: disable scheduled defragmentation for that drive, and ensure that DisableDeleteNotify = 0 (using fsutil behavior ...; this is the setting that lets TRIM commands reach the SSD). Optionally set DisableLastAccess if it’s not already set, and move often-churned but performance-insensitive data to a HDD (I moved the temporary directories to the secondary drive, since for all intents and purposes that data will live in the cache anyway, yet it still causes file system churn).
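
Concretely, the commands involved look something like this, run from an elevated prompt; D:\Temp is just where I happen to have pointed the temp directories:

    rem check whether delete notifications (TRIM) are enabled; 0 means enabled
    fsutil behavior query DisableDeleteNotify
    fsutil behavior set DisableDeleteNotify 0
    rem stop NTFS updating last-access timestamps; 1 means disabled
    fsutil behavior set DisableLastAccess 1
    rem turn off the scheduled defragmentation task
    schtasks /Change /TN "\Microsoft\Windows\Defrag\ScheduledDefrag" /Disable
    rem point the user temp directories at the secondary drive
    setx TEMP D:\Temp
    setx TMP D:\Temp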

Still being vaguely dissatisfied with it, I’ve since reinstalled with AHCI mode, and put fresh drivers on. Annoyingly it has crashed the system a few times, which seems to be a moderately endemic problem with a variety of SSDs. Probably a driver or firmware needs to be upgraded somewhere, but that will have to wait until I’ve recovered from the initial installation.
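
When I do get round to it, the first step will be checking what firmware revision the drive currently reports, which WMI can do without any extra tools:

    rem show each disk's model and firmware revision
    wmic diskdrive get Model,FirmwareRevision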

On the positive side, the drive is noticeably faster when reading large files.

I have not yet noticed whether it boots more or less quickly. That will be harder to fairly assess, since any new Windows install seems a lot snappier than an old crufty one. From the first Windows boot screen to being logged in takes 19 seconds.

There is some good news for bragging rights, in that the Disk component of the Windows Experience Index has gone from 5.6 to 7.9. So Windows noticed a difference, at least.

More specific benchmarks will have to wait till I’m properly set up on it again, and in the mood for testing.

Update. Upon formatting the old HDD and restoring all my files to the new partitions, I found that my backup of C:\Users had failed to copy anything in C:\Users\Edmund.  So let me add:

Lesson 5. When you back up a drive before reformatting it, check that your backup actually contains the stuff you want to restore.
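
One cheap way to do that check is robocopy in list-only mode, which reports anything in the source that is missing from (or differs from) the backup without copying a thing; the paths here are just examples:

    rem list what would still need copying from C:\Users to the backup, without copying
    rem /XJ skips junction points, which otherwise confuse the comparison
    robocopy C:\Users E:\Backup\Users /E /L /XJ /NJH /NJS /NP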

 
