
BF: Wear Leveling Count / Total LBAs Written



#1 Umfriend

    Advanced Member

  • Members
  • 372 posts

Posted 24 February 2015 - 12:04 PM

Hi,

 

So I started to play around with BitFlock on my lappy. This one has a Plextor M5-PRO 512GB SSD. There are two statistics I am worried about:

1. Wear Leveling Count: 81702

2. Total LBAs Written: 210,728,448 B (201 MB)

 

After running an I/O intensive SQL script, these changed to:

1. Wear Leveling Count: 81898 (+196)

2. Total LBAs Written: 211,411,968 B (202 MB)

 

Could it be that the interpretation rules are not fully correct ("Using Plextor / Marvell SSD specific rules.")? Should I be worried? I doubt the SSD is rated for 80K P/E cycles, and the idea that only ~1 MB was written during this operation, let alone that just 202 MB was written over two years, is absurd.

 

In a grey-ish font, every statistic says "Value: 100 (Was 100)"...

 

 




#2 Christopher (Drashna)

    Customer and Technical Support

  • Administrators
  • 8,196 posts
  • Location: San Diego, CA, USA

Posted 25 February 2015 - 01:09 AM

It sounds like an interpretation issue (in regard to LBAs written). A lot of SSDs use non-standard formatting for their SMART values, and that can throw our information off.
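
For example, here's the same raw counter under three different unit conventions (these units are guesses for illustration, not anything Plextor documents), in PowerShell:

$raw = 210728448   # the raw "Total LBAs Written" counter from your first reading
"{0:N2} GB if the raw value counts bytes"         -f ($raw / 1GB)
"{0:N2} GB if the raw value counts 512 B sectors" -f ($raw * 512 / 1GB)
"{0:N2} TB if the raw value counts 64 KB units"   -f ($raw * 64KB / 1TB)

The drive only reports the bare number; which unit it means is up to the vendor, and that's exactly what the interpretation rules have to encode.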

 

If you could, let us know what your BitFlock ID (nest) is. PM me the info, or post it here.


Christopher Courtney

aka "Drashna"

Microsoft MVP for Windows Home Server 2009-2012

Lead Moderator for We Got Served

Moderator for Home Server Show

 

This is my server

 

Lots of "Other" data on your pool? Read about what it is here.


#3 Umfriend

    Advanced Member

  • Members
  • 372 posts

Posted 25 February 2015 - 06:42 AM

BitFlock ID is...surprise...Umfriend!

 

And what about the Wear Leveling Count? Or maybe I simply misunderstood this statistic entirely?



#4 Christopher (Drashna)

    Customer and Technical Support

  • Administrators
  • 8,196 posts
  • Location: San Diego, CA, USA

Posted 25 February 2015 - 10:12 PM

Sorry, that looks like it's probably off as well, though it's harder to tell.

 

Out of curiosity, what does the manufacturer tool say (if it lists the values)?

 

 

But for the most part, you should be fine here.

Though, I've flagged the issues for Alex, and we'll look into correcting this on our end when we can.

https://stablebit.co...eAnalysis/13502




#5 Umfriend

    Advanced Member

  • Members
  • 372 posts

Posted 26 February 2015 - 09:21 AM

It only came with Plextool (completely different from the legendary Plextools). It does not say much other than SSD Health 098%. It has shown that since about two years ago, and that can't be right. For which SSDs do Scanner/BitFlock have good interpretation rules? I'm not buying Plextor again for a long time after the kerfuffle with the two M6S drives I had, and I'm not too keen on the Samsung EVOs either, given the read-speed issue that has recurred. I am lucky the store I got my server's SSD from only had the 850 Pro in stock.

 

I do not understand it: why enable SMART on a device and keep statistics if you do not document what the statistics mean? (Yeah, this is aimed at you, Plextor!)



#6 Christopher (Drashna)

    Customer and Technical Support

  • Administrators
  • 8,196 posts
  • Location: San Diego, CA, USA

Posted 26 February 2015 - 09:57 PM

Ah, ouch. Unfortunately, some tools expose ... well, nothing.

 

And it's not just Plextor that does stuff like this. A majority of SSD manufacturers use non-standard values. But that's part of why we have BitFlock. 

 

As for good rules, since we have to add the rules ourselves, newer drives may be lacking...

But most popular/well known brands should have good rules. 

 

If you're asking for a specific recommendation, I can't really give you one. There are a lot of drives out there, and we do have a LOT of rules for the disks.




#7 Umfriend

    Advanced Member

  • Members
  • 372 posts

Posted 10 March 2015 - 03:03 PM

So how do you get new rules to add? Is that something we users can help with?



#8 Christopher (Drashna)

    Customer and Technical Support

  • Administrators
  • 8,196 posts
  • Location: San Diego, CA, USA

Posted 12 March 2015 - 04:29 AM

Alex and I do it manually.

We go through the "undefined" devices periodically, and update the SMART data and the interpretation rules.

 

As for helping, submitting the data to BitFlock helps. :)

 

 

And Scanner updates these rules every 7 days automatically, but running the "update" option forces an immediate refresh.




#9 Christopher (Drashna)

    Customer and Technical Support

  • Administrators
  • 8,196 posts
  • Location: San Diego, CA, USA

Posted 04 April 2015 - 09:08 PM

Quoting Umfriend: "So how do you get new rules to add? Is that something we users can help with?"

Rules updated.

It should appear in the next seven days, automatically. Or you can open up the SMART Details and manually update the interpretation.

 

 

Also, Alex has publicly posted information about this disk and what you're seeing, so you can take a look if you want:

https://stablebit.co...alysis?Id=13502




#10 Umfriend

    Advanced Member

  • Members
  • 372 posts

Posted 07 April 2015 - 06:55 AM

Thanks for this; nice try, but not sure about the cigar! ;-)

 

It now reports 12.9 TB written after 192 days of power-on time. I develop SQL Server DBs on this machine and basically rebuild them over and over to test T-SQL scripts. That's GBs at a time, inserted and updated multiple times in one script. Put another way, my server's OS SSD has 3.33 TB written after about 120 days, and it runs nothing except Rosetta@Home.

 

What I could do to see if I am wrong is write a little script that would copy and delete a 1 GB file 1,024 times and see if it rises to 13.9 TB. It should, right?



#11 Christopher (Drashna)

    Customer and Technical Support

  • Administrators
  • 8,196 posts
  • Location: San Diego, CA, USA

Posted 07 April 2015 - 07:21 PM

Yeah, you could definitely do that. :)

 

Also, it may not be exact. Remember that it's not just the raw data that's being written; it's the file system entries as well.

And VSS snapshots, etc.

 

But regardless, please do, and see how far off it is. That way, we have a good idea of where the math goes wrong.
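
If you'd rather not wait on VBA, here's a quick PowerShell sketch of the same test (the paths and file names are placeholders):

# Copy a ~1 GB seed file 1,024 times, deleting the previous copy each round
$src = "D:\seed.bak"   # placeholder: any ~1 GB file
for ($i = 1; $i -le 1024; $i++) {
    $dst = "D:\Copy-{0:D4}.bak" -f $i
    Copy-Item $src $dst
    if ($i -gt 1) { Remove-Item ("D:\Copy-{0:D4}.bak" -f ($i - 1)) }
}

Each pass forces a fresh ~1 GB write, so Total LBAs Written should climb by roughly 1 TB if the interpretation is right.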




#12 Umfriend

    Advanced Member

  • Members
  • 372 posts

Posted 15 April 2015 - 10:56 AM

Not sure what to make of this. I ran BitFlock first and it stated Total LBAs Written 14,287,846,244,352 B (13.0 TB).

 

Then I ran a small VBA program:

Option Base 0
Option Explicit

' Copy a ~2 GB source file to the SSD 500 times, deleting the previous copy
' after each round so disk usage stays flat while writes accumulate.
Sub TestCopy()
    Dim sSrc As String, STgt1 As String, STgt As String, i As Long, cnt As Long
    cnt = 500
    sSrc = [secret file name]
    FileCopy sSrc, "D:\Copy - 000.bak"   ' seed copy onto the SSD
    sSrc = "D:\Copy - 000.bak"           ' later copies read from the SSD itself
    For i = 1 To cnt
        Application.StatusBar = "Copying file " & i & " of " & cnt & "."
        STgt = "D:\Copy - " & Format(i, "00#") & ".bak"
        FileCopy sSrc, STgt
        If STgt1 <> "" Then
            Kill STgt1                   ' delete the previous round's copy
        End If
        STgt1 = STgt
    Next
    Application.StatusBar = False
End Sub

 

The source file is 1,990,475,776 B and was copied to the SSD 501 times (500 of those reading from the SSD itself). Assuming 2^10 (1024) as the base, that should add, I think, about 0.91 TB.
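
A quick check of that arithmetic in PowerShell:

# 501 copies of a 1,990,475,776 B file, in binary TB
501 * 1990475776 / 1TB   # = 0.907, i.e. about 0.91 TB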

 

I ran BitFlock afterwards and it came back with 15,005,709,762,560 B (13.6 TB) written. Amazingly, Total LBAs Read has hardly changed (about +3.5 GB).

With 2^10 as base, the increase of 717,863,518,208 B equates to 0.65 TB according to BitFlock.

 

What I had expected was to be off by a factor of 2 or 4 (if, e.g., the interpretation of the SSD-reported values depended on SSD size), but I did not find that either.

 

One other value changed between the two BitFlock runs: Wear Leveling Count increased from 86,013 to 89,869...

 

Any ideas what's up with this or what I could do to test/investigate further?



#13 Christopher (Drashna)

    Customer and Technical Support

  • Administrators
  • 8,196 posts
  • Location: San Diego, CA, USA

Posted 15 April 2015 - 07:13 PM

I've let Alex know, and we'll see what he says.

 

However, I wouldn't be surprised if the SSD is doing something at the firmware level to "optimize" this.

 

A better way would be to create files with random data. The above uses the same data each time, and it is absolutely possible that the SSD firmware is "cheating" and doing some sort of optimization. Using random data would make that impossible (or very hard).




#14 Umfriend

    Advanced Member

  • Members
  • 372 posts

Posted 15 April 2015 - 08:04 PM

Well, I did consider that, but honestly, aside from potentially compressing (which these SSDs do not do), I do not see how the SSD firmware could recognise that it is the same data over and over again. I assume that to the SSD, each file-copy instruction looks independent. Also, the source was a compressed SQL Server backup file, so there isn't much left to be gained by further compression.

 

But sure, I can create a random file. I just need to think of how to cleanly write 2 GB of random data through VBA, in a simple, efficient and elegant way. For now, I do not think it will change anything.



#15 Christopher (Drashna)

    Customer and Technical Support

  • Administrators
  • 8,196 posts
  • Location: San Diego, CA, USA

Posted 15 April 2015 - 10:23 PM

Why not try PowerShell? It's much more powerful. :)

# Build a small file of random bytes (note: -Maximum is exclusive, so 256 covers the full byte range)
[Byte[]]$out = @(); 0..2047 | %{ $out += Get-Random -Minimum 0 -Maximum 256 }
[System.IO.File]::WriteAllBytes("myrandomfiletest", $out)



#16 Umfriend

    Advanced Member

  • Members
  • 372 posts

Posted 16 April 2015 - 05:10 AM

Yeah, I've been meaning to look into PS for years now. That piece of code writes a 2,048 B file? Can the first parameter of WriteAllBytes be a string containing a filename?

 

I could do with a quick tutorial on syntax, assignment, declaration, conditional statements, loops and I/O. Any you can recommend?

 

And of course, PS is not more powerful than VBA. It is just more suited for the task at hand. :D

 

Edit: OK, that does not seem too hard and is a good way to get started with PS, thanks for the idea. But boy, is PS slow. It takes ages to generate 2 GB of random bytes... Going to see if it gets quicker using random bigints or something.



#17 Christopher (Drashna)

    Customer and Technical Support

  • Administrators
  • 8,196 posts
  • Location: San Diego, CA, USA

Posted 16 April 2015 - 07:58 AM

That was the first thing I could find, actually. :)

 

But yeah, it's pretty powerful. :)
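
(And yes, the first parameter of WriteAllBytes is just a path string.)

If the byte-by-byte loop is too slow, filling a reusable buffer through .NET is far faster. A sketch (the file name and size are made up):

# Write 2 GB of pseudo-random data in 1 MB chunks
$rng = New-Object System.Random
$buf = New-Object byte[] (1MB)
$fs  = [System.IO.File]::OpenWrite("D:\random2GB.bin")
for ($i = 0; $i -lt 2048; $i++) {
    $rng.NextBytes($buf)            # refill the buffer with new random bytes
    $fs.Write($buf, 0, $buf.Length)
}
$fs.Close()

System.Random isn't crypto-grade, but for defeating any firmware dedup/compression it should be plenty.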




#18 Umfriend

    Advanced Member

  • Members
  • 372 posts

Posted 16 April 2015 - 09:41 AM

OK, I reverted to VBA; much easier for me and not that slow either. I plan to:

1. Run BitFlock

2. Run VBA

3. Run BitFlock

 

The VBA should write 128 x 1 GB files, filled randomly enough that I can't see any SSD optimisation saving on actual writes. Sure, there is repetition here, but it would require the SSD firmware to recognise randomly placed fixed 4 KB patterns and then do some sort of deduplication within files. I don't think so.

 

1 TB seemed overly ambitious; it would take about 13.5 hrs, mostly because random number generation is slow (PS seems no better at that, BTW).

 

I'll report shortly after running this:


Sub tst()
    Dim x() As String
    Dim i As Long, j As Long, f As Long, cnt As Long, c As Long
    Dim sPath As String, sFile As String, sFile_previous As String
    sPath = "D:\"
    For f = 1 To 128
        Sheets("Log").Range("A1").Offset(f, 0) = f
        Sheets("Log").Range("A1").Offset(f, 1) = Now()
        ReDim x(32)
        sFile = sPath & "RndFile " & Format(f, "0000#") & ".fil"
        cnt = 2 ^ 12
        For j = 1 To 32 'Fill 32 4K blocks with random bytes
            For i = 1 To cnt
                c = Int(255 * Rnd)
                x(j) = x(j) & Chr(c)
            Next i
        Next j
        Sheets("Log").Range("A1").Offset(f, 2) = Now()
        Open sFile For Output As #1
        cnt = 2 ^ 18 '262K blocks of 4K -> ~1 GB per file
        For i = 1 To cnt
            c = Int(32 * Rnd) + 1 'Select 1 of the 32 4K blocks randomly
            Write #1, x(c)
            If i Mod 10000 = 0 Then
                Application.StatusBar = "File: " & f & ". Did " & Format(i, "#,###") & ", " & Format(4096 * i, "#,###") & " Bytes."
                DoEvents
            End If
        Next
        Close #1

        If sFile_previous <> "" And f <> 2 Then
            Kill sFile_previous 'Keep only the current file on disk
        End If
        sFile_previous = sFile
        Sheets("Log").Range("A1").Offset(f, 3) = Now()
    Next f
    Application.StatusBar = False
End Sub



#19 Umfriend

    Advanced Member

  • Members
  • 372 posts

Posted 16 April 2015 - 11:54 AM

Well what do you know....

 

Running this resulted in Total LBAs Written increasing by 131.2 GB (base 2), slightly more than the 128 GB I tried to write, but of course other things were running as well, so I'll accept that. Based on this I can only conclude that:

1. Copying the same file over and over does something weird; perhaps the SSD firmware really does save on writes somehow (which I can hardly imagine, and oh how I hate not knowing); and,

2. I no longer see any reason to doubt the BitFlock rules & interpretation for this drive. Many thanks for that.

 

So, with 13.8 TB written, I should still be good for at least 10 times the activity so far. Amazing and very nice.

 

Chris, Alex, many thanks.



#20 Christopher (Drashna)

    Customer and Technical Support

  • Administrators
  • 8,196 posts
  • Location: San Diego, CA, USA

Posted 16 April 2015 - 11:43 PM

Well, a lot of modern drives (especially those with SandForce-based controllers) do some level of compression/trickery with the data.

I actually just had a nice discussion about this on the HSS forums, in regard to BitLocker.

 

So, at the very least, keep both scripts. :)

You can use them to test whether a drive does this kind of compression! :)

 

 

As for the more than 128 GB... that's not unexpected. Remember that it's not just the raw data being written: each file you create also gets an NTFS file system entry, and space is allocated in whole clusters (the allocation unit size, typically 4 KB, depending on how the disk was formatted). So, with that many files being created and deleted, some overhead is... well, perfectly normal.
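
If you want to check the cluster size on a volume, fsutil (built into Windows; run it from an elevated prompt) will show it:

fsutil fsinfo ntfsinfo D:

Look for "Bytes Per Cluster" in the output.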

 

Also, I'm not surprised by this at all. Considering that SSDs have a limited number of writes, anything that can be done to minimize those writes will extend the life of the drive.
 






