Tuesday, October 18, 2011

Hopping between hard disks with Clonezilla

Hard disks are often the first component to die in a computer, so it's important to be able to quickly and easily migrate your data from one disk to another without reinstalling your operating system, reapplying OS updates, reinstalling all of your applications, restoring your files, and so on.

Between hard drives that are about to fail or are just out of space, I've been reaching for my Clonezilla CD an awful lot lately.  It is a bootable CD that lets you image a drive or partition, or clone a drive or partition directly to another (much like Norton Ghost).  It can also add a Clonezilla bootloader to whatever you clone your disk to, so the clone itself is bootable and ready to install your disk image to any number of drives.  This is super handy if you want to make a custom restore DVD for, say, a new laptop.

It uses an ncurses interface, so you will have to tab through the fields and use the space bar to press buttons and select options, but it does a good job of walking you through the process and is actually quite intuitive.  It can and will save you a ton of time if your hardware is failing but your software setup is just fine.

I had a somewhat unorthodox use of Clonezilla this evening, which prompted me to finally blog about it. I run Windows XP in a virtual machine using VirtualBox, and when I initially set it up (a few years ago) I only allocated 10 gigs to the virtual disk.  This evening I went to install some development tools on the XP VM and ran out of space.  VirtualBox didn't seem to have an interface for expanding the disk (that I could find; someone please set me straight on this if I am wrong), so I was in a bit of a pickle until I remembered Clonezilla.
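As an aside, there may now be a command-line answer to this: VirtualBox 4.0 and later ship a --resize option on VBoxManage modifyhd that should grow a dynamically allocated VDI in place. I haven't tried it myself, so treat this as a sketch (the disk filename is a placeholder):

    # Grow a dynamically allocated VDI to 100 GB (the size is given in MB); untested by me
    VBoxManage modifyhd WindowsXP.vdi --resize 102400

Even then you would still have to grow the partition inside the guest afterward, so the Clonezilla approach below isn't wasted effort.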

I created a brand-new 100 gig VirtualBox virtual hard disk and attached it to the XP VM.  Then I mounted the Clonezilla ISO file (which I happened to have lying around) as the CD-ROM drive and booted the VM into Clonezilla.  Clonezilla saw the old 10 gig hard drive and the new 100 gig hard drive, and I simply directed it to clone the entire old disk to the new disk.  It took about 15 minutes, and when it was done I removed the Clonezilla ISO and the old drive from the VM, fired the VM back up, and I was back in business.  (I did have to resize the partition after I got back into Windows XP, but EaseUS Partition Master will take care of that for you without even rebooting, and it's free.)
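If you prefer the command line to the VirtualBox GUI, the disk-and-ISO setup above can be scripted with VBoxManage. This is just a sketch: the VM name, controller name, and file names are assumptions you would need to adjust for your own setup.

    # Create a new 100 GB dynamically allocated virtual disk (size in MB)
    VBoxManage createhd --filename XP-big.vdi --size 102400

    # Attach the new disk to the VM ("IDE Controller" is the usual default for XP guests)
    VBoxManage storageattach "WindowsXP" --storagectl "IDE Controller" --port 1 --device 0 --type hdd --medium XP-big.vdi

    # Mount the Clonezilla ISO in the virtual CD-ROM drive
    VBoxManage storageattach "WindowsXP" --storagectl "IDE Controller" --port 1 --device 1 --type dvddrive --medium clonezilla-live.iso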

Thursday, May 5, 2011

Creating the home video, revisited

For Mother's Day this year my wife and I decided to take all of the best, cutest videos of our daughter and compile them into a DVD for our mothers and grandmothers.  So we gathered together all of our favorite moments, each of which was a few minutes long and taken on our cell phones (she has a BlackBerry and I have an Android phone).

I used a USB cable to transfer the files to a folder called "raw_footage" on my desktop machine running Linux Mint 10.  That's when I noticed that every single video had a .3gp extension.  I was able to play them in Totem, so no problems so far.

I fired up PiTiVi, which has come a long way since the last time I blogged about it.  After adding files to the timeline at the bottom, I noticed that the video and audio tracks each have a horizontal red line running through them (across the top of the video track, and through the middle of the audio track).  Double-clicking this line sets a fixed point, and dragging the line up or down fades the video or audio in or out.  I used this feature to create fade-in and fade-out effects for each shot, and to turn up the volume on one of the clips that was too quiet to hear.  Sweet!

Everything was coming along swimmingly until I went to render the project.  PiTiVi just kept crashing!  Either it would say it was rendering, hang there, and go unresponsive to input, or it would start to render but the progress bar would never appear while the estimated time remaining climbed indefinitely.  Super frustrating!  I tried changing output formats but nothing worked.  Finally I removed all of the videos from the timeline and rendered each video file individually to Ogg Theora.  After that, PiTiVi would render to whatever format I wanted.  Whew!
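If PiTiVi's renderer misbehaves for you too, transcoding the clips outside of PiTiVi may work just as well. I did the conversion through PiTiVi, but something like this ffmpeg invocation should produce comparable Ogg Theora files, assuming your ffmpeg build has libtheora and libvorbis enabled (the filename is a placeholder):

    # Transcode a phone clip to Ogg Theora video with Vorbis audio
    ffmpeg -i clip01.3gp -vcodec libtheora -acodec libvorbis clip01.ogv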

At this point I'm ready to be done with this project, but I was about to become the next Beowulf, just having defeated Grendel only to have to descend to the underworld to face Grendel's mother.

This next challenge came when I tried to add subtitles and create the DVD.  I used Jubler to create the file containing the subtitles, and I used DeVeDe to create the DVD image file itself (along with menus etc).  Both Jubler and DeVeDe use MPlayer as a back-end for multimedia support, which makes a lot of sense since MPlayer has been around forever and has a reputation for supporting every video and audio format known to man.  So I should have been safe since all of my files are in Ogg Theora now, right?  I mean, there's no way MPlayer is going to have difficulty with the most popular free and open source video format, right?

Wrong!

For reasons totally beyond my comprehension, the default Gnome MPlayer installation/configuration on Ubuntu 10.10 and Linux Mint 10 is totally incapable of playing Ogg Theora out of the box. It failed with a generic error ("MPlayer interrupted by signal 11 in module: decode_video"), which I Googled for several hours without turning up anything useful.  To this day I have no idea how to get MPlayer to play Ogg Theora files on two of the most popular Linux distros.  Totem was able to play them, which tells me this has nothing to do with the availability of codecs.  I think it's totally insane that there is nothing I can do to get one of the most popular media players, which serves as a back-end for a lot of A/V editing tools, to handle Ogg Theora.  It's Ogg Theora, people!  I thought Windows Media Player was the only utility that still couldn't play it!
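For what it's worth, a couple of diagnostics worth trying on a setup like this: listing MPlayer's known video codecs to see whether a Theora decoder is present at all, and playing the file verbosely to see which decoder it picks before dying. Both lines below are sketches (the filename is a placeholder):

    # Check whether MPlayer's codec list includes a Theora entry
    mplayer -vc help | grep -i theora

    # Play verbosely to see which decoder gets selected before the crash
    mplayer -v clip01.ogv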

For all that, the workaround I found wasn't all that bad: I simply used PiTiVi to re-render my videos as MP4 (with the x264 video codec and the LAME audio codec), which MPlayer was more than happy to play.  Once that was done it was pretty much smooth sailing: I fired up Jubler and created the .ass files that tell DeVeDe where to put each subtitle and how long to display it (I used this post as a guide), then DeVeDe created the ISO without a hitch.
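The same re-render can be done from the command line if you'd rather not round-trip through PiTiVi; here's a sketch, again assuming an ffmpeg build with libx264 and libmp3lame enabled:

    # Re-encode an Ogg Theora clip as MP4 with x264 video and LAME MP3 audio
    ffmpeg -i clip01.ogv -vcodec libx264 -acodec libmp3lame clip01.mp4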

Saturday, April 23, 2011

Backing up with rdiff-backup

I recently tried out rdiff-backup for my backups and I like it a lot. It is a command-line utility, written in Python, that can operate locally or remotely over SSH. The first time it runs, it copies all of your files to the backup directory. On subsequent backups it copies only what has changed since the most recent backup, updates the mirror, and stores the changes it made to the mirror. The end result is that you always have a fully up-to-date mirror of your files, but you can restore from any previous backup at any point. The backup directory consumes minimal disk space, and the backup process is very fast since only the changes are copied.

The syntax is similar to the "cp" command: the command itself, followed by the source directory, then the destination (backup) directory, like so:

     rdiff-backup /home/jizldrangs /usr/backups

When backing up over SSH, give the remote side as the server name, followed by a double colon and the absolute path, like this:

     rdiff-backup /home/jizldrangs fileServerName::/home/jizldrangs/backups

As with many command-line tools, there are a lot of options, the most important being the ability to include or exclude certain paths or files. See the examples and man page for details on how to fine-tune your backup.
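For instance, to skip a scratch directory while backing up everything else, pass --exclude before the source and destination (the paths here are just examples):

     # Back up the home directory, leaving out a temp folder
     rdiff-backup --exclude /home/jizldrangs/tmp /home/jizldrangs /usr/backups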

I put this on my netbook and desktop, which are both running Ubuntu Maverick, but my wife's machine had only the sporadic backups I had made to our USB hard drive, and I wanted a more consistent plan. Fortunately rdiff-backup, being a Python application, also has a Windows version. If you are backing up to a local directory or mounted network drive, you are good to go.
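In the simplest case the Windows build is invoked just like the Linux one. Here is a sketch of a local backup, where the destination drive letter and folder name are assumptions:

     REM Back up My Documents to an external drive (E: is just an example)
     "C:\Program Files\rdiff-backup.exe" "C:\Documents and Settings\Mrs. Jizldrangs\My Documents" E:\my-docs-backup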

If you want to back up over SSH it gets a little sticky, but it can be done. Following the instructions in this post, I downloaded plink.exe (the command-line version of PuTTY) and created a batch file with the following:

"C:\Program Files\rdiff-backup.exe" -v5 --no-hard-links --exclude-symbolic-links --remote-schema "plink.exe -i rsa.ppk %%s rdiff-backup --server" "C:\\Documents and Settings\Mrs. Jizldrangs\My Documents" mrsjizldrangs@myfileserver::/home/mrsjizldrangs/backups-my-docs

This batch file resides in the same directory as plink.exe, which is why the full path isn't specified. Here is a breakdown of the rdiff-backup arguments:
  • --no-hard-links and --exclude-symbolic-links: necessary on Windows machines, per the blog post above
  • --remote-schema: the method of contacting the remote server (in our case, an SSH server via plink.exe)
  • The last two arguments are the source directory and the destination (i.e. backup) directory
Plink takes some arguments as well; here is the breakdown:
  • -i: the SSH key to use for authentication. I created an SSH keypair using PuTTYgen, which generates two files, a public key and a private key. I added the contents of the public key to the authorized_keys file on the server; the file named in this argument is the private key, which also lives in the same directory as plink.exe and the batch file (you can sanity-check the whole arrangement with the test command after this list)
  • The %%s (an escaped %s, since percent signs must be doubled inside batch files) is a placeholder that rdiff-backup fills in with the remote host name taken from the destination argument
  • rdiff-backup --server: this is executed on the remote machine, and all it does is start rdiff-backup in server mode
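Before relying on the batch file, it's worth confirming that plink can log in with the key non-interactively and that rdiff-backup is reachable on the server. A quick sanity check, using the same names as above, might look like this:

     REM Should print the server's rdiff-backup version without asking for a password
     plink.exe -i rsa.ppk mrsjizldrangs@myfileserver rdiff-backup --version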
All right! All of the machines are backed up and everything is peaches and cream. In fact, my newfound confidence gave me a case of Linux Distro Wanderlust, and I got it in my head that I wanted to switch my desktop from Ubuntu Maverick to Linux Mint 10. There was no reason not to, especially since I could just do an automated restore and all my files would come rushing back.

So I wiped the disk, installed Mint 10, and began the restore. After a long time I received an error that permission was denied on ~/.gvfs. Fortunately, rdiff-backup lets you include or exclude as many folders as you want with the --exclude argument, so I excluded it and tried again. I got the same error on ~/.local and ~/.subversion, so I ended up excluding those directories as well, with the final command looking like this:

     rdiff-backup -v5 --force -r now --exclude '**/.subversion/**' --exclude '**/.gvfs/**' --exclude '**/.local/**' myFileServer::/home/jizldrangs/vengeance-backup ~

Here's a breakdown of the arguments:
  • -v5: verbosity level 5. Levels run from 1 (the quietest) to 9 (which outputs so much information that it is impossible to read); 5 is a happy medium, listing each file as it is processed.
  • --force: necessary when restoring into a directory that already contains some version of the files you are restoring. In my case, the home directory Linux Mint created for me already had some default folders, so I had to force rdiff-backup to overwrite them with the versions from my backup.
  • -r now: requests a restore as of a given time; "now" means the most recent backup (see the man page, and the example after this list, for restoring from an earlier point in time)
  • --exclude: tells rdiff-backup that these folders exist in the backup but should not be restored
  • The last two arguments specify where to restore from (i.e. the backup directory) and where to restore to (in my case the home directory; point it somewhere else to keep multiple versions of your files side by side)
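For instance, to pull the state of the files from ten days ago into a scratch directory instead of overwriting your home directory, -r accepts time offsets (the destination path here is just a placeholder):

     # Restore the backup as it existed 10 days ago into a separate directory
     rdiff-backup -r 10D myFileServer::/home/jizldrangs/vengeance-backup ~/restore-10-days-ago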
After several days of trial and error figuring out which directories would cause problems when copied back, I was able to do a full restore of my files, which is a relief, because you never know whether your backups are any good until you've done a restore. Happy backing up!

Saturday, January 15, 2011

Getting an HP LaserJet 1018 to work under Ubuntu Server

I just rebuilt my old Slackware 11 machine into a shiny new Ubuntu Server 10.10 (Maverick) machine.  Wireless printing was the most important function of the old server, so I had to get it working on the new one.

The LaserJet 1018 uses the foo2zjs driver, which works great when installed from the repos (I believe it is part of the default installation, so there is no new package to install).  However, the 1018 requires the computer to provide the firmware.  Here is how to do that:

First, get the firmware on your machine.  Run:

   wget http://foo2zjs.rkkda.com/firmware/sihp1018.tar.gz

Then unpack it, and run a utility to convert the image to a firmware file:

   tar zxf sihp1018.tar.gz
   arm2hpdl sihp1018.img > sihp1018.dl

Copy it into the foo2zjs firmware directory for safekeeping:
  
   sudo cp sihp1018.dl /usr/share/foo2zjs/firmware/

Run the following to send the firmware to the printer (you may need root privileges to write to the device node):

   cat sihp1018.dl > /dev/usb/lp0

That's it, you should be able to print!  Of course, the printer will only hold on to the firmware as long as it is powered up; when the printer loses power, it loses the firmware as well.  You don't want to have to run that command every time, so push the task off onto udev by creating a new rule.  cd into /etc/udev/rules.d, create a new file called 80-printing.rules, and put the following inside (note the /bin/sh -c wrapper: udev's RUN executes the program directly rather than through a shell, so a bare redirect wouldn't work):

   ACTION=="add", ATTRS{idVendor}=="03f0", ATTRS{idProduct}=="4117", RUN+="cat /usr/share/foo2zjs/firmware/sihp1018.dl > /dev/usb/lp0"

Save and close the file, then restart udev by running:

   sudo service udev restart
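If the rule doesn't seem to fire, double-check that the vendor and product IDs in it actually match your printer; lsusb will show them (03f0:4117 is what my 1018 reports):

   # The ID column is idVendor:idProduct
   lsusb | grep -i 03f0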

And you should be good to go!  A big shout-out goes out to the folks on this Ubuntu Forums thread, who filled in the gaps in my knowledge.  :)