Bandwidth Concerns Boost Source Dedupe’s Popularity

March 29, 2010

As organizations look for alternatives to tape for backups, one that’s attracting their attention is source deduplication.

Unlike target dedupe, where full conventional backups are sent to a central location to be deduped, source dedupe happens before the data leaves the client. Software installed on each client machine breaks data into chunks and fingerprints them, then checks with a backup server running deduplication software; only chunks the server hasn’t already seen are sent over the network.
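The client-side half of that process can be sketched in a few lines. This is a minimal illustration, not any vendor’s implementation: it assumes fixed-size chunks and SHA-256 fingerprints, and the `server_index` set stands in for the backup server’s chunk catalog that a real client would query over the network.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunking; real products often use variable-size chunks


def dedupe_upload(data: bytes, server_index: set) -> list:
    """Split data into chunks, fingerprint each with SHA-256, and return
    only the chunks whose hashes the server has not seen before --
    simulating the client side of source deduplication."""
    new_chunks = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in server_index:
            server_index.add(digest)   # server records the new fingerprint
            new_chunks.append(chunk)   # only unique data crosses the network
    return new_chunks


index = set()
# 8 KB of repeated data yields two identical chunks: only one is sent.
first = dedupe_upload(b"A" * 8192, index)
# Backing up the same data again sends nothing at all.
second = dedupe_upload(b"A" * 8192, index)
```

The bandwidth saving falls out directly: repeated or unchanged data never leaves the client, which is why the technique pays off most over slow WAN links.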

Source dedupe offers the same storage benefits as target dedupe but consumes less bandwidth, since large backup files never have to cross the network to reach the dedupe location. Target dedupe, on the other hand, can reuse legacy backup software, while source dedupe requires new software purchases.

Source dedupe also has the advantage of working close to where data is created. Although still rare, it can be implemented on iron as small as notebook computers. “The closer you are to the source of the data, the more efficient you’re going to be in moving data around,” Lauren Whitehouse, senior analyst at Milford, Mass.-based Enterprise Strategy Group, told Data Backup News.

“Deduplicating information closer to the source will decrease network traffic and reduce the storage footprint, helping to make backups more successful,” maintained Symantec, which is known for its malware-fighting products but is also in the network backup space.

“Client deduplication removes redundant data at the source, which leads to lower CPU, I/O, and memory utilization compared to a traditional backup, thereby freeing up more client resources for production services in both physical and virtual environments,” it added.

Organizations with many dispersed locations have been especially attracted to source dedupe because of its lower bandwidth requirements.

“Deduplication has the power to transform information management: it is great for backup; it is great for archiving; and it even makes server virtualization manageable,” Symantec asserted. “Deduplication should live in every part of the information architecture.”
