+1 for R-Studio

therealcrazy8

So I currently have 2 hard drives (my brother's and a friend's) that I have the privilege of playing with some data recovery software on. I ran R-Studio and 2 other programs that I sadly don't recall the names of ATM. Anyway, my brother has 1000+ pictures of him and his son that he REALLY doesn't want to lose. His drive took a dump, which may or may not have been from a download that came from questionable sources. Out of all 3 programs I used, R-Studio was the only one that was able to find those pictures. I will be purchasing R-Studio this weekend. I can say that R-Studio was one I was hoping to test anyway, based on all of the reviews I have seen on here. Now I feel very pleased and satisfied to give them my money. :)

A huge +1 for R-Studio

I look forward to future cases of using this software and seeing what it can recover, once I am officially open for business.
 
That is great! However, always be sure to get a full sector-by-sector clone of the patient drive, run R-Studio against the clone and copy the recovered files to a separate known healthy drive.
 

I will be sure to do that with future cases. I did just come across that procedure yesterday, after doing everything I did already, of course. I have one of those nice dual-slot StarTechs. I suppose it would work pretty slick to have a "clean" drive in slot 2 of that and the problematic drive in slot 1. That way I can do a clone with the StarTech and then, once completed, run R-Studio against the clone. I have to say, when I saw the estimated time of 19 hours in R-Studio, I was assuming and hoping it was going to be a really deep scan. I'll have to verify that the StarTech does a sector-by-sector clone. I thought it did.
 
It likely does do sector-by-sector copying, but those docks don't do well with bad sectors... which may not be a bad thing. This way, if the drive has issues, it will just stop and you will know that it might be wiser to outsource than to take any unnecessary risks.

If you want to be serious with the data recovery side of things, you should work toward getting a RapidSpar.
 
I did find out now that it does sector-by-sector. As far as bad sectors go, I thought I saw in the forum somewhere that R-Studio (maybe it was some other program) skips bad sectors and will continue running and [try to] recover data. Is that correct? RapidSpar would be cool. Not sure if it will get that serious though.

One thing I am curious about: I did a bit of reading yesterday about data recovery, some of the things that can cause hard drive issues, and the different approaches to recovering the data. My take on what I got from that is, if it's an internal issue (motor, heads, etc.), or the drive doesn't have an 8-pin ROM on the PCB (WD drives and some others), send it to the pros. If it's a diode or bad sectors, then it can be done by a tech. I may have left something out, but what I am wondering is, when it comes time to send to the pros, do techs "partner up" with one of the pros in the area and get some perk for bringing them a customer, or is it usually a "hey, call these guys, it'll cost $x,xxx.xx+ but they will be able to do it" type of ordeal?
 

It scans sector by sector (as does almost all data recovery software) and it can handle skipping a few bad sectors. However, what Luke is talking about, getting a sector-by-sector clone, is different. When R-Studio is scanning, it's not creating a backup of the data anywhere, just finding the start/end locations of the files and recording the file system structure. So if the drive dies halfway through scanning, even with R-Studio you've lost 100% of the data. Whereas if you image first using a program such as ddrescue in Linux, then you'll at least have a backup of what was imaged. And FYI, ddrescue handles bad sectors 10x better than any Windows-based tool ever could.
 

I do plan on doing ddrescue also. I'm debating on using an old (5 years or so) system that's sitting around doing nothing (perfect for linux) or just making a linux VM on my current machine. I am no stranger to the linux command line, but I am thinking of using a GUI. Aren't gparted and parted magic essentially ddrescue with a GUI, or is that not accurate?
 
A VM is very unlikely to get the desired result because it won't have direct access to the drive. It'll be tunneled through a Windows driver and thus face all the limitations that Windows has. Why not just use a Linux live USB/CD/DVD? Will be much easier than having to install it on a machine. Just follow the guide I linked to above to set it up.

And no, gparted and parted magic have nothing to do with ddrescue. Those are partition programs and only edit your partition tables, while ddrescue is a drive cloning tool. There really is no useful GUI for ddrescue, but you only have to type in one command and then just let it go to work. It's all in the guide.
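For anyone curious what that one command actually looks like: here's a sketch, written as a small Python helper only so the flags can be commented, that builds the typical two-pass ddrescue invocation. The device names /dev/sdX, /dev/sdY and the mapfile name are placeholders; identify your real devices with lsblk before running anything.

```python
import shlex

def ddrescue_commands(src="/dev/sdX", dst="/dev/sdY", mapfile="rescue.map"):
    """Build the usual two-pass ddrescue invocation.

    -f   force writing to dst (needed when dst is a block device)
    -n   first pass: copy the easy areas, skip anything slow or bad
    -r3  second pass: go back and retry the bad areas up to 3 times
    """
    first_pass = ["ddrescue", "-f", "-n", src, dst, mapfile]
    retry_pass = ["ddrescue", "-f", "-r3", src, dst, mapfile]
    return first_pass, retry_pass

for cmd in ddrescue_commands():
    print(shlex.join(cmd))
```

The mapfile is what lets ddrescue resume an interrupted clone and remember which sectors were bad, so keep it on a third (healthy) drive, not on the patient or the clone.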
 
Awesome. Yeah, the VM issues make perfect sense. I do like the idea of the live CD/USB/DVD. Ill be sure to look at the guide. Thanks a lot for the info, I greatly appreciate it.
 
When R-Studio is scanning, it's not creating a backup of the data anywhere, just finding the start/end locations of the files and recording the file system structure. So if the drive dies halfway through scanning, even with R-Studio you've lost 100% of the data. Whereas if you image first using a program such as ddrescue in Linux, then you'll at least have a backup of what was imaged.
I'm in complete agreement with you there and I always clone drives before attempting any data recovery, but your comparison raises a point that always concerns me: If a drive is going to die while scanning for files (read only) wouldn't you say it's just as likely, if not more likely, that it's going to die during cloning? I suppose the head isn't working as hard during cloning, due to sequential reading, but I would imagine that's still a significant strain to put on a potentially failing drive. I wonder if we should be condoning anything but professional data recovery services for the recovery of important data ...

But rather than reducing strain, I think the main reason I would always clone a drive before attempting data recovery would be to insure against problems with the recovery process and any unintentional writing to the drive that may occur. The amount of work the drive has to do seems unavoidable either way.

So what does a professional recovery lab like yours use to recover data in a way that doesn't put any strain on a potentially failing drive? (just curious -- tell me to mind my own business if that's a trade secret ;) ).
 
Yes, it's absolutely possible that it could die while cloning. However think about this:

Same dying HDD in two different parallel universes

Universe 1: Tech decides to just directly scan the drive in R-Studio and it dies completely at the 80% mark.
Amount of data actually recovered: 0%. The scan only found out where the files were; it didn't save them anywhere.

Universe 2: Tech decides to clone in ddrescue first. Drive fails at the 80% mark, same as in Universe 1.
Amount of data recovered: 80% or more (depending on how full the drive was) because the sectors were being copied as they were read

The principle is simple: if you scan and recover entirely from the original drive, you have to read all the sectors twice. Once to scan, and once to recover. If you clone first, you only read them once, and the subsequent reading/scanning is all done on the new healthy drive. And yes, all the jumping around reading files is much more likely to damage a drive than the sequential reading from cloning.
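The arithmetic of the two universes fits in a few lines of Python (a toy model, not any real tool's behavior):

```python
SECTORS = 100   # treat the drive as 100 equal chunks
DIE_AT = 80     # the drive survives 80 chunks' worth of reads, then dies

# Universe 1: scan first (reads everything), recover afterwards. The drive
# dies during the scan pass, before a single file has been copied out.
universe1_recovered = 0 if SECTORS > DIE_AT else SECTORS

# Universe 2: clone as you read -- every sector read is also written
# to the healthy drive, so whatever was read before death is saved.
universe2_recovered = min(DIE_AT, SECTORS)

print(universe1_recovered, universe2_recovered)  # 0 80
```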

As to what we use to recover data without killing the drives, it's all hardware imaging tools. Mostly PC-3000 and DeepSpar Disk Imager 4. The advantage of hardware imaging is that we can control the read timeouts to minimize the strain on drives from trying over and over to read the same bad sector. On the first pass we'll use a really low timeout, where the drive will skip any sector that takes longer than, say, 300ms to read (what we actually use varies by brand). On the next pass we'll use 500ms, and on the final pass we'll try at 850ms. That way we have the bulk of the sectors read already before we try reading the damaged areas more intensively.
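That escalating-timeout schedule can be modeled in a few lines. This is purely a toy simulation: the made-up per-sector read times stand in for sector health, and the real tools do all of this at the hardware level.

```python
def image_with_passes(read_times_ms, timeouts_ms=(300, 500, 850)):
    """Return, per pass, which sectors were newly recovered.

    read_times_ms: hypothetical time to read each sector; slow reads
    indicate damage. Sectors slower than the final timeout are never
    recovered by this schedule.
    """
    recovered = set()
    per_pass = []
    for timeout in timeouts_ms:
        got = sorted(s for s, t in enumerate(read_times_ms)
                     if s not in recovered and t <= timeout)
        recovered.update(got)
        per_pass.append(got)
    return per_pass

# Six sectors; sector 2 is slow (700ms) and sector 4 is dead (2000ms).
print(image_with_passes([10, 12, 700, 15, 2000, 11]))
# -> [[0, 1, 3, 5], [], [2]]  (sector 4 exceeds every timeout)
```

The point the passes make visible: the healthy bulk of the drive is secured first, and only then does the tool spend time (and wear) on the damaged areas.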

Ddrescue does something similar, though it can't control read timeouts. When it hits a bad sector, it'll jump ahead a certain block of sectors then try reading backwards until it hits another bad sector. Then it leaves that section of "questionable sectors" alone and continues reading ahead (preventing head damage). Only after it finishes the whole drive this way does it go back and then try reading sectors in between the bad ones by splitting the blocks in half and reading both forward and backward between the known bad ones. It's a similar concept to what the hardware imagers do, minus the timeout control.
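A very rough sketch of that skip-ahead strategy, greatly simplified: real ddrescue also trims and splits the skipped areas, reads backwards, and tracks everything in its mapfile, none of which is modeled here.

```python
def rescue(disk, skip=4):
    """disk: list of booleans, True = sector reads fine.
    First phase reads sequentially and jumps ahead on error;
    a later phase comes back to the skipped areas."""
    n = len(disk)
    recovered, skipped = set(), []
    i = 0
    while i < n:
        if disk[i]:
            recovered.add(i)
            i += 1
        else:
            # bad sector: leave this whole area alone for now
            skipped.append((i, min(i + skip, n)))
            i += skip
    # second phase: retry the questionable areas one sector at a time
    for start, end in skipped:
        recovered.update(j for j in range(start, end) if disk[j])
    return recovered

disk = [True] * 10 + [False] * 2 + [True] * 10
print(len(rescue(disk)))  # -> 20 (every good sector; the 2 bad ones stay bad)
```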
 
gparted and parted magic have nothing to do with ddrescue. Those are partition programs

Quick note: Parted Magic is a Linux distribution focused on drive management and data recovery, which includes both parted and gparted, along with ddrescue and a variety of other things.
 

Very interesting. Thanks for that. :)

I suppose I was assuming software like R-Studio would recover the files as it reads them, rather than requiring a second pass. Perhaps that should be an option. But yeah, given that using recovery software increases drive activity, even by a small amount, that could of course potentially reduce the chances of recovering the data.

I can understand why professional hardware imaging tools are vastly superior to software recovery tools, and probably much more successful at recovering data, especially if they allow full control of the read process, but how do you know if it's even safe to spin-up the disk in the first place?

Imagine this hypothetical (but somewhat typical) situation: Customer has an external drive containing thousands of files that are so important that losing even one of them would result in the end of the world as we know it. He has ensured that his files are very safe by writing the words "do not delete" in capital letters (with a few exclamation marks, just to be sure) on a pink Post-It note attached to the drive. No backups of course, but hey, he knows if the worst happens he can rely on his IT guys to work their witchcraft.

So the customer asks me to collect a new computer and the external drive containing his priceless data, which he wants me to transfer to his new computer. As I arrive at the customer's premises, he walks towards me carrying both the computer and the drive, and I witness him drop the external drive and watch helplessly as it bounces down a small flight of concrete stairs.

Now at this point, wanting to avert global annihilation, I suggest to him that we immediately send the drive off to a professional data recovery lab since there's a good chance that even spinning the drive up to check it could result in data loss due to mechanical damage. Let's assume the platters survived the fall but the head's arm is damaged/dislocated in such a way that it is likely to scrape across the surface of the platter the moment power is applied. What procedures would a professional lab use to prevent data loss in this scenario? Also, would that procedure always be used to avoid data loss whenever the cause of failure is unknown (just in case)?
 
He has ensured that his files are very safe by writing the words "do not delete" in capital letters (with a few exclamation marks, just to be sure) on a pink Post-It note attached to the drive. No backups of course, but hey, he knows if the worst happens he can rely on his IT guys to work their witchcraft.
Most such users are hopeless and deserve to lose their "priceless" data.

I'm not a data recovery pro (just a novice) but when I learn that a drive has been dropped, I always open it and remove and inspect the heads and check spindle free-play without powering it up. The customer has probably been trying and failed to access the data already or they wouldn't be consulting you in the first place. So, if they say it hasn't been dropped, powering it on to assess it is unlikely to do more harm than has already been caused by the failure or the customer trying to read/recover the data. Opening the drive to inspect the heads/spindle will add $100 to the recovery if they take it elsewhere after your efforts, so I avoid doing that unless I'm pretty sure it's necessary.
 
I'm not a data recovery pro (just a novice) but when I learn that a drive has been dropped, I always open it and remove and inspect the heads and check spindle free-play without powering it up.
Wouldn't that increase the risk of further damage and/or data loss, unless you have a clean-room and the right calibration/alignment equipment?
 
Thanks! Hadn't seen that.

I keep forgetting there's a 'resources' area here.

So, as I understand it, you would always disassemble the drive and assess whether it's safe to spin it up before doing so?

Not that I'm planning to get into data recovery myself. Quite the opposite in fact; I'm looking for reasons to justify professional data recovery to clients. If a drive always requires inspection in a clean-room, prior to attempting any sort of data recovery, that's reason enough for me to recommend sending the drive to somewhere that has the necessary equipment.
 
Wouldn't that increase the risk of further damage and/or data loss, unless you have a clean-room and the right calibration/alignment equipment?
I was also wondering the same thing. Every time I read something about opening the hard drive, "clean room" is usually mentioned.
 