Archive for Data Storage

The Costs of Data Loss Extend beyond Financial

By Tom Fedro

Companies today have become more and more dependent on the efficiency and security of their data.  It's amazing how often, though, I come across people with absolutely no data protection strategy in place.  Any data loss hurts a business. A large data loss interrupts the flow of work and causes a loss of both profit and productivity.  I tell people all the time that protection of data should be a top priority.

Of course, they counter by pointing out that I’m a data storage expert and focus my work on data.  Maybe that does make me a more adamant advocate, but look at ways data loss can hurt you:

  1. Productivity.  You use your data in the course of your day.  How much work would you or your employees get done if you didn’t have access to it?
  2. Customer Loyalty.  How many times can you lose your customers' information before they stop being your customers?  Have you ever had that sinking feeling when a company you've used for a long time can't locate your information?
  3. Intellectual Property.  What if your data is your product?  Do you want to lose code, ad copy, images, or publications?  How much is lost when an intellectual asset isn't protected intelligently?

These are just three quick examples of data loss costs.  There are a great many more that are possible and even probable.  Your data is a tool and an asset, and you’re not serving your organization well if you’re failing to plan correctly for protection.

Of course, in the end, it might be impossible to calculate the complete financial loss that is possible when data protection fails.  Consequences can vary from mere annoyance to devastation, and there is no way to cope with the risk without working out a contingency plan. What has your company done in this area?  How safe is your data?  How are you managing your data storage?  Acting on these questions now can save a lot more than a few dollars in the future.

Data Access on Android Tablets

By Tom Fedro

The prevailing standard in hard drive formats is Windows NTFS. This has presented some challenges for technologies built on other platforms (like Linux), since data and media stored in the NTFS format cannot be viewed or used. The Paragon Universal File System Driver (UFSD) technology is here to bridge the gap.

Paragon Software's unique UFSD Technology provides full access, including read, write, format, delete, and copy operations. It works within nearly all of the major kernels, including Linux, VxWorks, eCos, Mac, Windows, and DOS, and it offers access to NTFS, HFS+, ext2/ext3, FAT16, and FAT32 where those file systems would not otherwise be supported. Its small size and fast performance make it easy to manage and operate. There's no need for other libraries, and it supports all NTFS features.

Acer recently chose UFSD for its Iconia Tab A500 and Iconia Tab A501 lines, allowing these Android products, which are based on a Linux kernel, to access data and media stored in the NTFS format. Ultimately, end users will have full access to file systems that would otherwise not be supported. Paragon's UFSD-based NTFS for Android driver will enhance the capability of the tablets, since most hard drives are formatted with NTFS. Without the driver, a user would have to reformat to the older FAT file system in order to gain access.

With the driver, a user can attach a hard drive formatted for Windows via the tablet's USB port, making it a value-added feature for Acer. Users can download and stream multimedia products to their hard drive but transfer only immediately desired files via USB storage. UFSD Technology has been designed to provide the highest possible read/write performance as well, which means transferring data is fast and effortless. Importantly, the read/write performance also ensures that media playback is top-notch, which is critical for quality output of high-definition and Blu-ray media.

We’re excited Acer chose this product to push their tablets a step above the competition, and we’re happy to see our efforts working in the marketplace.

Disaster Recovery Software Decision-Making Criteria

By Tom Fedro

Sales of disaster recovery software have shown dramatic growth over the last several years, as just about every company has come to rely on systems and data management for continued operation. Although the first mention of this kind of disaster recovery occurred in the 1970s, it wasn't for decades that its importance was fully realized. Back then, technology really wasn't intertwined with a company's operations the way it is now. Today, most companies would find it strange to think of technology and business as independent, the way we find it strange when we watch a TV show from the 1980s and don't see cell phones.

Although technology is still advancing at breakneck speed, the data protection industry is essentially mature, and a number of companies vie for market share. When a company decides to determine which solution is correct, there are some important and critical considerations that need to go into the decision-making process. First and foremost, what is the need?

Too often, this step is skipped. Companies tend to examine what's available and make choices based on the four or five alternatives they come across. That's the wrong way to do business. The smartest people in the world make mistakes like this one, but they shouldn't! There are a couple of cardinal rules about shopping at the grocery store that come to mind. First, never shop hungry. You end up overbuying, and usually buying unhealthy food at that. In the same manner, don't wait for a crisis to buy your software. You'll end up buying more than you need in most cases, and the pain of the urgency will get the better of you.

The second rule? Shop with a list. Without it, you end up buying food you don’t need and you forget food you do need. In the world of technology, your list is called a needs assessment. Sit down with your tech department and your operations and figure out what you need. Here are some conversation starters.

    1. How much data can we afford to lose in a given period of time? One week? One day? One hour? This answer will tell you how regular your backups will need to be, and thus how important the ease of backup and the interruption the procedures cause will become to your decision making.
    2. How reliant on the systems is each department? It's possible your inside sales department could handle a few hours of downtime. On the other hand, it might cripple your accounting department. When you've got all the information, you not only have criteria to determine purchasing based on restore times but also a blueprint for which departments should receive first attention from your IT department in the event of catastrophic failure.
    3. Which particular elements of the system or the data are most critical? If your employees have a dramatic need for email but not other documents, you'll want software that can provide tools for partial and immediate restoration of that critical information (commonly called granular restore) while the rest of the system comes on line.
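The answers to these questions translate into what practitioners call a recovery point objective (how much data you can afford to lose) and a recovery time objective (how long you can afford to be down). Here's a minimal sketch of how a needs assessment might be turned into a backup schedule and restore-priority order; the department names and tolerances are hypothetical examples, not recommendations:

```python
# Sketch of a needs assessment: map each department's tolerance for
# data loss and downtime to a backup schedule and a restore order.
# Department names and numbers are hypothetical.

departments = {
    # name: (max acceptable data loss in hours, max acceptable downtime in hours)
    "accounting":   (1,  2),
    "inside_sales": (24, 8),
    "operations":   (4,  4),
}

def backup_interval_hours(max_data_loss_hours):
    """Back up at least twice as often as the acceptable loss window,
    so that a failed or in-progress backup still leaves a usable copy."""
    return max(max_data_loss_hours / 2, 0.5)

# Restore the most downtime-sensitive departments first.
restore_order = sorted(departments, key=lambda d: departments[d][1])

for name in restore_order:
    loss, downtime = departments[name]
    print(f"{name}: back up every {backup_interval_hours(loss)}h, "
          f"restore within {downtime}h")
```

The sorted list doubles as the blueprint mentioned above: it tells IT which departments get first attention after a catastrophic failure.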

Don’t fall into the “Ready, Fire, Aim!” trap. Make your technology decisions like you make other business decisions. Identify the correct solution first. Then go out and get it.

 

Why It’s So Hard to Quantify Losses from Data Storage Problems

By Tom Fedro

Some time ago, I read Out of the Crisis, by Dr. W. Edwards Deming, the man generally credited as the father of the Total Quality Management movement.  This book (first published in 1982) is just chock full of brilliant business insight, but I've carried one phrase with me throughout my career.  He wrote that the effects of some decisions were "unknown and unknowable."

So much of his book was an exhortation to run companies using statistical process control that the line seemed out of place.  It wasn't, though.  He was illustrating that effects that may not have a visible dollar sign attached to them are still worthy of management consideration.  We tend to push aside concerns that we can't label easily, and this is especially true in data protection and restoration.

Anyone in the tech sector hears about the nightmare situations where companies lose or compromise data, leading to gigantic expenses by way of lost contracts, fines, or settlements.  For most companies, though, data loss doesn't represent something that can be negotiated and pinned down.  The truth is, several consequences of data loss have Dr. Deming's unknown impacts.  Today, I want to focus on two.  There are a whole lot more, but those will come in later posts.

  • Rework.  If you bought motherboards from an overseas provider and they were delivered with flaws, you'd send them back.  The manufacturer would take the defects and put them into "rework."  This is pretty straightforward.  "Re" is the Latin root that means "again."  The work is done again.  Everybody gets that this is a bad thing in manufacturing.  Somehow, when it's done with soft work like typing, accounting, and marketing collateral, management tends to forget that it's still doing work again.  Let me rephrase that: it's doing work twice and getting the benefit once.  The problem is that a manufacturer will show the rework in its accounting reports.  There's simply no way to do that effectively in other industries; the cost is unknown.
  • Inefficiency.  This is one of my pet peeves.  I like to compare it to writing with a pencil and a piece of paper.  Imagine you had to handwrite a five-page document and you could get about two pages done per hour.  No problem: two and a half hours, right?  Okay, let's add a curveball.  The pencil lead is going to break.  You're going to need to wait a few minutes because only one or two people know how to operate the sharpener, and they're sharpening other pencils right now.  Let that happen a time or two and your two-and-a-half-hour job has taken an extra hour or two.  Essentially, it's a hidden and unknown expense.  You can measure how many times the pencil breaks.  You can even measure how long it takes to be sharpened, but you can't really measure how it impacts an employee's ability to write.  It would be naïve to believe that the stopping and restarting doesn't cause delay above the actual downtime.
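The pencil example can be made concrete with a toy model. The numbers below are illustrative assumptions, not measurements; the point is that each interruption carries a hidden refocus penalty on top of the visible waiting time:

```python
# Toy model of the pencil example: interruptions cost more than the
# visible downtime, because each restart carries a refocus penalty.
# All numbers are illustrative assumptions, not measurements.

PAGES = 5
PAGES_PER_HOUR = 2.0
BASE_HOURS = PAGES / PAGES_PER_HOUR          # 2.5 hours uninterrupted

def hours_with_interruptions(breaks, wait_hours=0.25, refocus_hours=0.25):
    """Each break adds visible waiting time plus a hidden refocus
    penalty before productive writing resumes."""
    return BASE_HOURS + breaks * (wait_hours + refocus_hours)

print(hours_with_interruptions(0))   # 2.5 -- the plan
print(hours_with_interruptions(3))   # 4.0 -- the hidden reality
```

You can measure the waiting time; the refocus penalty is the "unknown and unknowable" part, and it's what turns a two-and-a-half-hour job into a four-hour one.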

HDD to SSD System Migration

By Tom Fedro

We’ve spoken with a lot of organizations that are looking at upgrading to the new solid state drive (SSD) storage technology. These SSDs provide much faster read and write speeds than traditional hard disk drives (HDDs), and with no moving parts they are much more rugged than their mechanical HDD cousins – a crucial consideration for mobile users. The big stumbling block, however, has been the migration process.

Because SSDs generally offer smaller capacity than the HDDs they’re replacing, many users understandably hesitate to make the change. Up until now the only solution was to first re-partition the HDD, and then perform the migration using a special utility to separate the system and data – a time consuming process which carried an inherent risk of data loss. Not an appealing prospect!
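The core of the fit-to-smaller-drive problem can be sketched as a simple selection exercise: the system always migrates, and data is excluded largest-first until everything fits. The paths and sizes below are hypothetical, and this is not how Paragon's utility is implemented; it just illustrates the decision the migration has to make:

```python
# Sketch of the fit-to-smaller-drive problem: keep the system, then
# exclude the largest non-essential data until the remainder fits
# the SSD. Paths and sizes (in GB) are hypothetical examples.

ssd_capacity_gb = 120

volumes = {
    "C:\\Windows":       35,   # system: must migrate
    "C:\\Program Files": 40,   # system: must migrate
    "C:\\Users\\media":  150,  # data: may be excluded
    "C:\\Users\\docs":   20,   # data: may be excluded
}
system = {"C:\\Windows", "C:\\Program Files"}

def plan_migration(volumes, system, capacity):
    """Return (migrate, excluded): system items always migrate; data
    items are considered largest-first and dropped when they don't fit."""
    migrate = {k: v for k, v in volumes.items() if k in system}
    used = sum(migrate.values())
    excluded = []
    for name, size in sorted(
            ((k, v) for k, v in volumes.items() if k not in system),
            key=lambda kv: -kv[1]):
        if used + size <= capacity:
            migrate[name] = size
            used += size
        else:
            excluded.append(name)
    return migrate, excluded

migrate, excluded = plan_migration(volumes, system, ssd_capacity_gb)
print(sorted(migrate))   # system volumes plus the docs folder
print(excluded)          # the 150 GB media folder doesn't fit
```

In practice a migration tool lets the user make this choice interactively, but the constraint is the same: system plus selected data must fit within the new drive's capacity.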

If migrating to an SSD is an issue in your organization, we’d love to hear from you and learn from your experience. To find out how Paragon’s technology tackles this issue, take a look at our Product Spotlight below.

Product Spotlight: Easy Migration to SSD

Our Product Spotlight this month is Paragon Migrate OS to SSD. With an intuitive one-step wizard, this solution makes it easy to upgrade existing HDDs to a new SSD, even of smaller capacity.  In a single operation, you can now migrate Windows (any version since XP) to a larger or smaller storage device, with ensured safety for your system and data. If your new drive has less capacity than the current one (generally the case with the new high-performance SSDs), you’ll be able to specify which data to exclude from the migration. Transfer a live system with no impact to your work, and have your partitions automatically aligned in the process if needed.

Why Can’t Windows See the Whole 3TB Drive?

By Tom Fedro

It was not too long ago that a 5 MB file was considered "large". Fast forward to today and you'll find HD video, archived images, and music files stored in lossless compression formats exceeding 100-700 MB. It's not unusual for home computer users to have 1 TB of data, or for small businesses to have several terabytes. High-capacity backup and storage devices are more important than ever.

Keeping up with demand, the capacity of data storage devices has been increasing exponentially while costs have plummeted. A good-quality 3TB drive can now be purchased for under $200. But with over 51% of PCs worldwide still running Windows XP, actually using those 3TB of space is an issue. XP does not support drives larger than 2.2TB; as far as XP is concerned, the remaining 0.8TB of space, almost a third of the drive, doesn't exist.
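The 2.2TB figure isn't arbitrary. The legacy MBR partitioning scheme that XP relies on stores sector addresses as 32-bit values, and a classic hard drive sector holds 512 bytes. A quick calculation shows where the ceiling comes from:

```python
# Why the MBR limit is about 2.2 TB: MBR stores sector addresses as
# 32-bit values, and a classic hard-drive sector holds 512 bytes.

SECTOR_BYTES = 512
MAX_SECTORS = 2 ** 32            # largest count a 32-bit field can hold

max_bytes = MAX_SECTORS * SECTOR_BYTES
print(max_bytes)                 # 2199023255552
print(max_bytes / 10 ** 12)      # ~2.2 (decimal terabytes)

# A 3 TB drive therefore has ~0.8 TB beyond what MBR can address:
print((3 * 10 ** 12 - max_bytes) / 10 ** 12)   # ~0.8
```

GPT, by contrast, uses 64-bit sector addresses, which pushes the addressable ceiling far beyond any drive on the market; that is the capability a GPT-aware driver brings to XP.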

Paragon tackles this problem with new technology: Paragon GPT Loader, which includes a special driver that augments Windows XP with additional partition support, plus a utility that initializes ultra-high-capacity drives.

If you have encountered problems accessing space on a large-capacity drive under Windows XP, we'd love to hear from you and learn from your experience. To find out more about this type of data storage technology, take a look at our Product Spotlight below.

Product Spotlight: Migrate to 3TB with Paragon GPT Loader

Hardware manufacturers are now producing relatively inexpensive disk drives with 3TB+ capacity, broadening their appeal and functionality for a wide spectrum of end users. Windows XP users, however, have not been able to take advantage of this increased storage space, as XP does not support drives larger than 2.2TB.

Paragon GPT Loader is a specially designed, cost- and resource-effective driver that solves this problem. Paragon GPT Loader enables Windows XP to support the GPT partitioning scheme, granting full, native access to all 3TB+ drives as secondary drives in the XP system, either as external data storage devices or installed inside workstations and home computers. For more information, visit Paragon GPT Loader.