[OPLINTECH] PC Image Broadcast Techniques

Ed Liddle eliddle at marysvillelib.org
Fri Dec 4 16:21:05 EST 2009


Depending on how you want to broadcast images to computers, I have used
udpcast to broadcast one computer's hard drive out to several others
during closed hours.
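
For anyone who has not used it, the basic udpcast workflow looks roughly
like this (device names are only examples; the source and targets are all
booted from a Linux live CD on the same LAN):

  # On the machine whose drive is being copied (the sender):
  udp-sender --file /dev/sda

  # On each machine receiving the copy:
  udp-receiver --file /dev/sda

  # The sender waits for receivers to connect; a keypress (or the
  # --min-receivers/--max-wait options) starts the multicast to all
  # of them at once.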

One thing to check out for imaging is http://fogproject.org
The FOG server lets computers PXE boot from it. It can build an inventory
of your machines, you can group machines together and tie them all to the
same image file, and you can copy an image out one machine at a time or
multicast it. It has a client app that runs on each computer to check for
tasks you assign to that machine; the client app can also rename computers
after the imaging is done. FOG keeps track of computers by MAC address and
associates each one with its computer name. It also has a feature to
deploy printers, but I have not used it.

I have been using FOG here and there, and Clonezilla for imaging as well.

I find that imaging computers helps me a lot when deploying new ones. I
can unpack one machine, set it up the way I want it (the same setup for
all the computers), and create an image of it. I store the image on either
a network share or a portable USB hard drive using Clonezilla.

I then start setting up new machines one at a time. I unpack the new
machine and put it in the place of the old machine I am replacing. I boot
off the Clonezilla CD and copy my image file onto the machine. When it's
done, I go through all the computer-specific settings: the computer name,
whatever software is specific to that machine, joining it to the domain,
etc. When I am done, that machine is ready to go and I move on to the
next one.

In my opinion, this saves me the time of finding a place to unpack all
the new machines, connecting them all up, prepping them for deployment,
and then unhooking them all and deploying them.

Below is the handout I made up for the presentation I did at the last
Techconnections. I hope it helps! I don't think there is a reason to run
NewSID on them; read this for more info about it:
http://blogs.technet.com/markrussinovich/archive/2009/11/03/3291024.aspx

-----------------------------------------------------------------------
Imaging Computers using free tools 
Submitted by Ed Liddle on Tue, 02/24/2009 - 23:43 
Imaging computers can save a lot of time when deploying machines and
reinstalling Windows. There are many tools that can be used to image or
clone computers. I am going to focus on simple, freely available ones.

There are several open source tools that can be used to image computers.
Some create a bit-for-bit copy of the hard drive, compress it, and store
the resulting file. Others keep track of the partition table and the
amount of used space on the drive, and store only the data in the used
space. Some use a network; some do not require one. The one I have been
using lately is the Clonezilla live CD. I use it with a portable USB hard
drive to store images, along with Parted Magic to resize and create
partitions. I have also used G4U (ghost 4 unix), G4L (Ghost for Linux),
udpcast, and a couple of other free imaging tools.
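
To make that distinction concrete, here is a rough sketch of the two
approaches using generic Linux tools rather than the programs above
(device and file names are only examples):

  # Bit-for-bit: copies every sector, used or not, then compresses it.
  dd if=/dev/sda bs=1M | gzip -c > /mnt/usb/whole-disk.img.gz

  # Used-space only: saves just the blocks the NTFS filesystem is using.
  ntfsclone --save-image --output /mnt/usb/windows-part.img /dev/sda1

  # Restoring the used-space image back onto a partition:
  ntfsclone --restore-image --overwrite /dev/sda1 /mnt/usb/windows-part.img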

Personal experiences:

I started out using G4U and an SFTP server on the network. This worked
well since it does a bit-for-bit copy of the hard drive; I was able to
deploy the images onto many different hard drives, varying in manufacturer
and size (smaller to bigger). The downside is the time it takes to create
and deploy an image this way. It would take a couple of hours to create an
image and about 30 minutes to an hour to deploy it. Not bad if you don't
have a lot of machines to do or a lot of different images to keep track
of. I had about 2 or 3 main images I would use: one for circ machines, one
for public machines, and one for staff machines. For multiple machines I
found udpcast to be very fast at copying one machine's hard drive to many
others. I have used it mainly on the laptops we use for classes and on the
public computers when I set new ones up.

More recently I have been using Clonezilla with a USB hard drive for
creating and deploying images, in combination with storing a
machine-specific image on a Linux partition on each machine. At my library
I often find myself setting up one machine at a time. In order to image
many staff machines at one time, I would have to set them all up in a
room, network them together with a separate switch, image them with a
basic image, then manually set the computer names, generate new SIDs, and
join them to the domain. After all that is done, I have to unhook each
one, label it with where it will go, and swap it out with an existing
computer. Once the new machines are deployed, occasionally I have to
reinstall or re-image one to get rid of some of the goodness that many
Windows nasties will give a user. If I re-image one, I use my basic image
and afterwards go through the computer-specific settings. This is
definitely faster than doing a fresh install the tried-and-true manual
way. Creating and deploying images from a USB drive with Clonezilla proved
to be much faster than sending a bit-for-bit image to or from an SFTP
server on the network.
My Current Method:

Desktop hard drives at the library I work at have a lot of wasted space
on them. Computers are coming with 80 GB hard drives, a typical Windows
installation takes up 15 to 20 GB, and staff users save all their files to
a file server, which leaves 50 or 60 GB of unused space. How can some of
that unused space be put to work? I decided to use part of it to store a
computer-specific image on a Linux partition.

      * Step 1:
        On the machine I use to create my master image, I boot a Linux
        live CD called Parted Magic. Parted Magic is a lightweight
        Linux distro that includes some useful tools, among them
        GParted. GParted can be used to manipulate existing partitions
        and create new ones. I start out by shrinking the partition
        Windows is installed on to make room for a roughly 20 GB Linux
        partition, then create and format the ext3 Linux partition
        using GParted. (A rough command-line sketch of these steps
        follows the list.) After this is done I boot Windows, install
        software and updates, and make any non-computer-specific
        settings that need to be tweaked. I also copy NewSID and any
        other utilities I will need later, when I make the
        computer-specific settings, into a folder on the C: drive.
      * Step 2:
        I then use Clonezilla to create an image of the entire hard
        drive on my USB hard drive. This takes about 15 to 40 minutes
        depending on hard drive size and used space.
      * Step 3:
        I unpack a new computer and set it up at the location it is
        going to be in. I plug in my USB hard drive and boot the
        Clonezilla CD to copy the hard drive image onto the new
        machine. This takes about 15 minutes or so.
      * Step 4:
        I boot into Windows, run NewSID, and tell it to rename the PC.
        When it's done, I install additional software if needed. I then
        join the PC to the domain and make sure everything is working
        correctly.
      * Step 5:
        I boot the Clonezilla CD again and image just the Windows
        partition to the Linux partition on the computer's internal
        hard drive. This provides a safe place to store the image file:
        when Windows is booted it cannot read or write the Linux
        partition, so the image file will not get deleted or infected
        by any virus that runs in Windows.
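
For reference, here is a rough command-line sketch of the same workflow.
Clonezilla and GParted are normally driven through their menus, so these
are not the exact commands they run, and the partition layout, device
names, and image names are only examples:

  # Step 1: shrink the NTFS filesystem, then the partition, and add ext3.
  ntfsresize --size 60G /dev/sda1
  # (resize /dev/sda1 itself with fdisk or parted, then:)
  parted /dev/sda mkpart primary ext3 60GB 80GB
  mkfs.ext3 /dev/sda2

  # Step 2: save the whole disk to the USB drive (mounted by Clonezilla
  # as its image repository, /home/partimag):
  ocs-sr -j2 -z1 -p true savedisk master-image sda

  # Step 3: restore that image onto a new machine:
  ocs-sr -g auto -e1 auto -e2 -j2 -p true restoredisk master-image sda

  # Step 4 happens in Windows: run NewSID, rename the PC, join the domain.

  # Step 5: save just the Windows partition to the local ext3 partition,
  # mounted as the image repository this time:
  mount /dev/sda2 /home/partimag
  ocs-sr -j2 -z1 -p true saveparts local-image sda1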

It has been my experience that I will have to reinstall Windows on a
computer one or more times before the hard drive in the machine fails.
Having a computer-specific image stored locally on the computer lets me
quickly reinstall everything and perform Windows updates afterwards in a
very timely manner. I estimate re-imaging the computer takes about 15-20
minutes; add to that the time it takes to pull down and install the
updates it will need, roughly another half hour. Total time to get the
computer back in operation is about 45 minutes. I have no images taking up
space on a file server waiting to be used. Re-imaging does not affect the
local network at all, since the images are on locally connected devices. I
can take my external hard drive to our branch library and deploy the same
images there. Worst-case scenario, if my portable USB hard drive dies, I
have several computers I can create a new image from and tweak accordingly
for reuse.

Clonezilla also has an optional server edition that can be set up to
image over the network. Clonezilla can use a Windows file server, FTP
server, or NFS server to store images, and it can also create a restore
DVD/CD.
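
As a rough example, pointing Clonezilla live at a Windows share amounts to
mounting that share as its image repository; the menus do this for you,
but the underlying step is something like the following (server, share,
and user names are placeholders):

  # Clonezilla stores and looks for images under /home/partimag:
  mount -t cifs //fileserver/images /home/partimag -o username=imaguser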

Another network-based imaging solution is the FOG project,
http://www.fogproject.org/. FOG is a Linux-based, free and open source
computer imaging solution for Windows XP and Vista that ties together a
few open-source tools with a PHP-based web interface. FOG doesn't use any
boot disks or CDs; everything is done via TFTP and PXE. With FOG, many
drivers are built into the kernel, so you don't really need to worry about
drivers (unless there isn't a Linux kernel module for your hardware). FOG
also supports putting an image that came from a computer with an 80 GB
partition onto a machine with a 40 GB hard drive, as long as the data is
less than 40 GB.
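
Getting clients to PXE boot from a FOG server is mostly a DHCP question.
With an ISC dhcpd server, the relevant options look something like this
(the address is a placeholder, and the boot file name can vary with the
FOG version):

  # dhcpd.conf fragment: point PXE clients at the FOG box
  next-server 192.168.1.10;     # TFTP server running on the FOG machine
  filename "pxelinux.0";        # boot file served by FOG's tftpd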

Cost:
$0 - NewSID
http://technet.microsoft.com/en-us/sysinternals/bb897418.aspx
$0 - Clonezilla http://clonezilla.org
$0 - Parted Magic http://partedmagic.com
$120 - 500 GB USB hard drive
Priceless - the short period of time it takes to get a PC back in service.


I hope this helps.

-Ed Liddle 

On Fri, 2009-12-04 at 15:43 -0500, Eric Maynard wrote:
> Jim,
> 
> 
> As always, I appreciate your insights and advice.  I will check into
> the items you suggested especially the Altiris Deployment Solution.
> 
> 
> Not sure if this is just a demo, but our new public PCs appear to have
> a disc for Altiris Client Manager version 6.
> 
> 
> It doesn't say anything on it about being a demo, but it doesn't have
> any licensing info either, so I'm betting it is.
> 
> 
> thanks
> 
> Eric Maynard
> Head of Information Technology,
> Holmes County District Public Library
> Millersburg, OH  44654
> Email [emaynard at holmeslib.org]
> Phone [330.674.5972 x.224]
> Fax   [330.674.1938] 
> 
> "Failure is only the opportunity to begin again more intelligently"
> 
> On Fri, Dec 4, 2009 at 1:19 PM, JKENZIG <JKENZIG at cuyahogalibrary.org>
> wrote:
>         Hi Eric,
>         
>         I have looked into a lot of different possible solutions. You
>         hit the caveat when you ask about printer/workstation
>         management software.  It depends on what you are using, but
>         most of the products out there use the workstation name to
>         configure themselves. So you need a product that, at imaging
>         time, is able to assign a specific machine name to the imaged
>         computer and not some random one.
>         
>          
>         
>         We have always used Altiris Deployment Solution and it has
>         served us well. 
>         
>          
>         
>         The problem, of course, with these types of broadcast imaging
>         software to multiple devices is bandwidth.  You usually can
>         only do a few at a time, and it is a long process. 
>         
>          
>         
>         I've spoken of it before, but I will suggest you look at it
>         again: Citrix Provisioning Services for Desktops.  This product
>         uses a single virtual disk (VHD) image that all computers PXE
>         boot from. 
>         
>         It has an admin console that lets you assign a computer a name
>         and IP address, if necessary, based on its MAC address.  This
>         will resolve the printer/workstation management dilemma. 
>         
>          
>         
>         You pick the image from the console that you want the computer
>         to have and on boot it will load it.  You could easily switch
>         a computer from XP to Vista to Windows 7 to Linux just by
>         changing to a different VHD virtual disk image at the console
>         and sending a reboot.  You can have multiple images on the
>         server and switch at will. 
>         
>          
>         
>         The image stays on the server, and it can be set up so that no
>         changes are made to it, eliminating the need for products like
>         Deep Freeze and even antivirus.  A reboot reloads the image. 
>         
>         http://www.citrix.com/english/ps2/products/product.asp?contentID=1297541
>         
>          
>         
>         I seriously believe that this is the future of delivering
>         desktops in organizations. The product once owned by Ardence
>         and purchased by Citrix has matured quite well over the years.
>         
>         Search Ardence or Citrix Provisioning on Youtube and you will
>         see some amazing videos.
>         
>         This one being my favorite.
>         
>         http://www.youtube.com/watch?v=moIuHqIc-PQ
>         
>          
>         
>          
>         
>         Regards, 
>         
>         Jim Kenzig
>         Network Manager
>         Cuyahoga County Public Library 
>         
>          
>         
>         From: oplintech-bounces at oplin.org
>         [mailto:oplintech-bounces at oplin.org] On Behalf Of Eric Maynard
>         Sent: Friday, December 04, 2009 12:20 PM
>         To: oplintech at oplin.org
>         Subject: [OPLINTECH] PC Image Broadcast Techniques
>         
>         
>         
>          
>         
>         Looking for input from the list on any imaging
>         products/techniques that might be in use out there for
>         supporting public (and staff) computers.  More specifically, I
>         am interested in  the potential for using "broadcast imaging"
>         or some other semi-auto means of updating whole PC images.
> 
>         I personally have tried a variety of imaging tools using
>         everything from open source utils to Ghost, but I have settled
>         on Acronis True Image for the last few years.  I would like to
>         consider taking this a step further and be able to just blast
>         a periodic update to all of our public PCs.  Acronis' Snap
>         Deploy sounds like the perfect product for this, but I was
>         hoping others might have some input to offer on this or
>         similar products.
> 
>         Questions I am interested in exploring:
> 
>         How often to broadcast updates?
> 
>         Is it practical to do this once a month for OS updates and AV?
> 
>         What impact does configuration play into the frequency?
> 
>         What impact does re-creating a PC have on other management
>         software such as time or print management?
> 
>         What advantages/disadvantages does this have over central
>         update control via a thin client solution?
> 
>         Thanks in advance for any insight or comments you might have
>         to offer on the subject.
>         
>         
>         
>         Eric Maynard
>         Head of Information Technology,
>         Holmes County District Public Library
>         Millersburg, OH  44654
>         Email [emaynard at holmeslib.org]
>         Phone [330.674.5972 x.224]
>         Fax   [330.674.1938] 
>         
>         "Failure is only the opportunity to begin again more
>         intelligently"
> 

-- 
-Ed Liddle

Technology Assistant 
Marysville Public Library 
231 S Plum Street, Marysville, Ohio 43040
937-642-1876 ext. 45


