

data recovery using: r-studio, getdataback, nucleus kernel, stellar phoenix — 8 Comments

  1. Whenever this happens, I don’t even try booting
    to run chkdsk, safe mode, etc.

    Plug the drive straight into another PC and
    get an image. The less work the suspect hard drive
    has to do, the better.

    I use a custom PE disc with a few imaging tools,
    and also a few different Linux bootable forensics discs which can grind out an image while ignoring bad sectors, faulty partition tables, etc. As you say, XP is not really happy booting with a faulty disc in the system.

    The whole key is getting an image before the
    suspect drive collapses.

    And of course, with hard drives constantly
    increasing in size, there’s a lot more data to image.

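    The “grind out an image ignoring bad sectors” behaviour those forensics discs rely on is easy to picture: read the drive block by block, and when a block fails, zero-fill it in the image and keep going. Below is a minimal Python sketch of that idea; the `FlakySource` class just simulates a drive with one unreadable block, and a real job would of course use a dedicated imager such as GNU ddrescue rather than anything like this:

```python
import io

BLOCK = 4096  # read granularity; real imagers retry failed spots at smaller sizes


def image_drive(src, dst, size):
    """Copy `size` bytes from src to dst, zero-filling unreadable blocks.

    Returns the list of offsets that could not be read, so a later pass
    (or a recovery tool working on the image) knows where the holes are.
    """
    bad = []
    offset = 0
    while offset < size:
        n = min(BLOCK, size - offset)
        src.seek(offset)
        try:
            data = src.read(n)
        except OSError:            # bad sector: log it and carry on
            data = b"\x00" * n
            bad.append(offset)
        dst.seek(offset)
        dst.write(data)
        offset += n
    return bad


class FlakySource(io.BytesIO):
    """Simulated drive where one block always fails to read."""

    def __init__(self, data, bad_offset):
        super().__init__(data)
        self.bad_offset = bad_offset

    def read(self, n=-1):
        if self.tell() == self.bad_offset:
            raise OSError("I/O error (simulated bad sector)")
        return super().read(n)


src = FlakySource(b"x" * (BLOCK * 3), bad_offset=BLOCK)  # middle block is bad
dst = io.BytesIO()
holes = image_drive(src, dst, BLOCK * 3)   # holes == [4096]
```

    The point of returning the hole list is the same reason the imaging-first approach works at all: the suspect drive is read exactly once, and everything else happens against the copy.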

  2. Forgot to mention: getting an image successfully
    allows you the luxury of recovering all the data at
    your leisure, because you are not waiting for the
    drive to die.

    In the Windows world, Easy Recovery Pro is
    the best data recovery software I have.
    Lots of options.

    Expensive though.

  3. I’ve had some success with chkdsk in the past, but it obviously depends on the kind of failure.

    Now that I know there are better alternatives available, I’ll take a look around (at the time, I didn’t have time to do much searching around).

    I actually don’t have much time at the moment either, but I’ll eventually evaluate a few more recovery tools (like ERP).

    I totally agree with imaging the drive as quickly as possible, and then working on the image.

  4. Another reason to run imaging software (depending
    on the work you undertake):

    I once had a hard drive die unexpectedly while
    working on a customer’s PC.

    Threatened to sue me and caused a few stressful days.

    Run images and charge the customer. It beats getting
    caught in a disaster.

    It only happened once, but once is enough!

  5. Hmmm, I’ve wondered what I would do if a customer’s PC had a catastrophic failure of a component while I was working on it.

    Yep, if it involves the risk of data loss, then tell the customer the backups will cost extra (I’d say some will take the risk and save some money, others won’t).

    Thanks for the tip mate.

  6. Quite a few shops have signs up and/or on the
    invoices write:

    “We will not be responsible for data loss and all backups are the customer’s responsibility”.

    That’s going to get messy if the PC is newish
    and wasn’t brought in with hard drive problems.

    Especially if the clause is not specifically
    signed by the customer.

    For any machine used in business, or holding important data, I won’t do a thing until I have a
    known good image.

    Last year I had a 30-day-old 80GB drive totally collapse. It was replaced under warranty,
    but all the data was gone.

    Luckily I was only called to diagnose it.

    But it could have happened while I was working
    on it.

    The one disaster has made me a bit paranoid,
    but better safe than sorry.

  7. I just tried Phoenix, R-Studio and GetDataBack after a massive failure (one new hard drive, motherboard and processor) that lost me a 250GB drive. My backup strategy failed, so I am going to buy better backup software, and I shall do the reinstalls as a test to make sure it all works.

    GetDataBack was best.

    They all worked, but not all worked fully.

    Phoenix recovered far too many empty files (e.g. old versions that had been deleted) – that could be my fault for not getting the settings right. Very simple software, but too simple if you want any power.

    R-Studio was OK, better than Phoenix… you can get it to do more.

    But both had problems above the 128GB mark. I had set up a 192GB partition with a 64GB partition above it. Only GetDataBack could find the high partition (the 64GB one). It missed it at first, but I set up a partial scan starting at the 190GB mark, and it found the lot. All my source code (I am a programmer) – it saved me from going back three weeks to my monthly CD safety net.

    Phoenix simply couldn’t go above 128GB.

    R-Studio blue-screened and the machine switched off once it got above 128GB.
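    (The “128GB mark” both tools hit is very likely the well-known 28-bit LBA limit: older ATA drivers and tools address sectors with a 28-bit number, so they can only reach 2^28 sectors of 512 bytes, which is exactly 128 GiB, or about 137 “decimal” GB. Anything stored above that offset is simply invisible to them. The arithmetic:

```python
SECTOR_BYTES = 512
LBA28_SECTORS = 2 ** 28           # largest sector count a 28-bit address can reach

limit = LBA28_SECTORS * SECTOR_BYTES
print(limit)                      # 137438953472 bytes
print(limit // 2 ** 30)           # 128 (GiB) -- the "128GB mark"
print(round(limit / 10 ** 9, 1))  # 137.4 (decimal GB, as drive makers count)
```

    Tools and drivers with 48-bit LBA support, standard since ATA-6, don’t have this ceiling, which is presumably why GetDataBack could reach the high partition at all.)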

    I found your comments here very useful, so I am trying to return the favour by commenting back. (Even if you did just take us apart in the Ashes.)

  8. Hi Andrew,

    Thanks for the comment.

    I have since found SelfImage great for making an image of a drive, and then using GetDataBack to work on the generated image.

    It sounds like your setup corrupted the 250GB drive… which is a bit different to recovering a drive that was once fine, but has started to break down.

    Anyway, glad I could help.

    I don’t know anything about cricket. I’m more of a soccer guy, but only during the World Cup (Viva Italia!!!).