Archive for Tom Fedro

Membership in the SD Association Ensures the Ability to Engage in Standards Development for Embedded Technology Critical in Today's Smart Phones, Set Top Boxes and TVs

By Tom Fedro

Over the last few years, demand for cross-platform drivers has exploded with the proliferation of Smart Devices like Smart Phones, Smart Set Top Boxes and Smart TVs.  And, more importantly, the trend shows no evidence of slowing down. As such, software developers involved in the technology that is critical to these devices should be active members of the SD Association.

Another important aspect of membership is that it validates, to OEM partners, the developer's commitment to meeting industry standards across the SD card industry. SD standards apply to a wide range of peripheral consumer electronics beyond Smart Devices; they also apply to storage media for mobile phones, digital audio players, car navigation systems and electronic books. Technology like Paragon's embedded exFAT, NTFS and HFS+ driver technology for Android and Linux enables compatibility for mobile devices across a variety of operating systems, e.g., Windows™, Mac, Linux, Android™, etc.

Cross-Platform Drivers Ensure Read/Write Operability

OEMs' use of SD card technology continues to expand as the consumer electronics device markets cross boundaries and merge. For example, Smart TVs have SD card readers so that consumers can view videos and photos on their televisions without needing a cable to connect the device to the TV.

Consumers expect their Smart devices to recognize external media regardless of operating system and to perform at full speed, allowing recording and playback of full HD and 3D video content. Exhibiting industry leadership by actively participating in the SD Association not only ensures that you are at the top of your game, but also assures OEMs that your products provide compatibility and protect the integrity of the consumer's stored data, a critical OEM requirement.

State Governor’s Office Ensures Reliable Backup in Case of Disaster

By Tom Fedro

After Hurricane Katrina hit the U.S. Gulf Coast in 2005, IT departments in southern U.S. states became particularly sensitive to the potential loss of their critical data.

When the director of technical services started his new position in the governor's office, one of his first tasks was to replace the old tape backup system with a more reliable, and cost-effective, image-based backup solution. After a lengthy and comprehensive evaluation process, Paragon Software's Hard Disk Manager (HDM) Server was selected to ensure that the office's files would be safe if disaster struck.

Not long after the selection was made, the office had its first test of the new backup system. The RAID controller and the backplane on one of its servers failed; thankfully, Paragon's HDM solution rose to the challenge and not only ensured that no data was lost, but also had the office back up and running in record time. To read the case study in its entirety along with others, search by product or by market.

To view a video demonstration of our Drive Backup Server software (bundled with HDM for Servers), check us out on YouTube.


Multicast Image Deployment Management Software in an Education Environment

By Tom Fedro

Over the last few weeks, we have seen an uptick in inquiries from the educational sector for our deployment management software. To help IT professionals evaluate the best software for their needs, we published a situational case study about a typical secondary educational setting and how our software, Deployment Manager, speeds up deployments of new machines and simplifies desktop refreshes so that classroom staff can perform a refresh without the help of the IT department.

The school's small IT staff of 15 spent a significant amount of time refreshing desktops in its computer training classrooms, as well as preparing hundreds of machines simultaneously for staff use. One requirement was ensuring that deployments could be conducted to bare-metal, dissimilar hardware. The second was an efficient method of processing new deployments quickly and easily. The school needed to automate the installation/refresh process and wanted expert tools and advice to conduct the deployments quickly and cost-effectively.

Paragon Software's Deployment Manager re-imaging software was selected in part because of its feature list, including:

  • Automated or manual deployment of individual systems or hundreds at once
  • Multicast and unicast support (see the sketch after this list)
  • Customizable Linux and WinPE-based boot media
  • ConstantCast – cyclic multicast deployment sessions; systems can be added to a session while it's running
  • Deployment initiated directly from the shop floor
  • Adaptive Imaging Tools – deploy one image to several dissimilar systems
  • Pre/post deployment configuration options
  • Scripting
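
Multicast is the feature doing the heavy lifting in a classroom rollout: the image is transmitted once and every subscribed machine receives the same stream, rather than one unicast copy per target. Below is a minimal Python sketch of that idea using a standard UDP multicast group; the group address and port are hypothetical, and a real deployment product layers sessions, ordering and retransmission on top of this.

```python
import socket
import struct

GROUP = "239.1.2.3"  # hypothetical multicast group address
PORT = 5007          # hypothetical port

def send_block(data: bytes) -> None:
    """Sender: one transmission reaches every machine that joined the group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # TTL of 1 keeps the traffic on the local network segment.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    sock.sendto(data, (GROUP, PORT))
    sock.close()

def receive_block() -> bytes:
    """Receiver: each target machine joins the group and reads the same stream."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    membership = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)
    data, _sender = sock.recvfrom(65535)
    sock.close()
    return data
```
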
To read the case study, visit our Case Study page at www.Paragon-Downloads.com.

Hard Disk Management

By Tom Fedro

I've often considered how rare it is to find an IT department that takes management of storage media as seriously as it should.  Ultimately, the media on which data is written and read has seen the same dramatic and revolutionary advancement as the rest of computer and software technology, but this core item is often overlooked by the industry.  In early computing, floppy drives evolved from the 8-inch giants to 3.5-inch "micro-discs" and finally to obsolescence as optical storage and flash drives provided better mobile options.

It was, however, the introduction of the hard disk drive that really allowed for the computer revolution.  This little stack of platters on a flywheel spool with head after head reading and writing is really a miraculous bit of technology.  When you stop and consider for a moment that the speed of a hard drive is measured in milliseconds — 1000ths of a second — you can get an idea about how remarkable the device really is.  Today, we take for granted the speed of disk operations, but we should stop for a moment and consider the real advantages the technology has delivered.

Storage capacity is one such advantage.  Throughout the 1980s, hard disk drives grew in capacity by about 25 percent per year.  In the 1990s, capacity grew at about 60 percent per year, and by 1999 capacity was growing at a rate of 130 percent per year.  Now, these components double in capacity every nine months.  By contrast, processors double in processing speed about every eighteen months.  Although physical limits will eventually slow the continued capacity increase, for now storage technology advances faster than the rest.
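
To see how quickly those two doubling rates diverge, here is a quick back-of-the-envelope calculation; the doubling periods are the ones quoted above, and the three-year horizon is just an illustration:

```python
# Capacity doubles every 9 months; processor speed every 18 months.
months = 36  # a three-year horizon
disk_growth = 2 ** (months / 9)    # 16x capacity in three years
cpu_growth = 2 ** (months / 18)    # 4x processor speed in three years
print(disk_growth, cpu_growth, disk_growth / cpu_growth)  # 16.0 4.0 4.0
```

After just three years, the storage curve is four times further along than the processor curve, which is why the disk tends to be the most advanced component in the box.
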

What does that mean to individuals and IT departments?  Perhaps most critically, it means that your hard disks are very likely to be more advanced than any of the other components in your desktops and servers.  This can lead to dramatic slowdowns in efficiency without appropriate instructions to the hard drive and the rest of the computer.  (For example, the brilliant partitioning method in advanced format hard drives actually causes all but the most recent Windows systems to perform redundant read/write operations — unfortunately, you get a slowdown instead of a speedup.  See Paragon's white paper and solution description regarding this phenomenon.)

One of the software products we've developed at Paragon Software is "Hard Disk Manager 11" or HDM 11.  It's designed to let an IT professional focus on the technology from the perspective of increasing the performance of the machine.  (And of course, protecting data — we're Paragon after all.)  It has advanced defragmentation techniques, partition management, and several levels of data elimination security.  As technology continues to advance, it's the hard disk that's outpacing all else.  It's about time we focus on the hard drive!

Fixing Partition Problems

By Tom Fedro

The new Advanced Format, high capacity, 4K hard drives can suffer from misalignment when a user's operating system is Windows XP or an earlier version. This phenomenon occurs in both physical and virtual environments, affecting both servers and workstations. Essentially, the misalignment makes the computer perform redundant read/write operations.  In other words, the computer does twice as much work for the same task, which obviously makes for slow processing and poor performance.
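
The arithmetic behind the problem is simple: the drive exposes 512-byte logical sectors but physically writes 4,096-byte sectors, so a partition that does not start on a 4K boundary straddles physical sectors on every operation. A minimal check, using the classic Windows XP default start sector as the misaligned example:

```python
LOGICAL = 512    # logical sector size the OS sees
PHYSICAL = 4096  # physical sector size of an Advanced Format drive

def is_aligned(start_sector: int) -> bool:
    """True if the partition's first byte falls on a 4K physical boundary."""
    return (start_sector * LOGICAL) % PHYSICAL == 0

print(is_aligned(63))    # False -- Windows XP's default start sector, misaligned
print(is_aligned(2048))  # True  -- the 1 MiB boundary modern installers use
```
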

At Paragon, we developed the Partition Alignment Tool (PAT), which eliminates the misalignment problems in advanced drives by identifying any sector misalignments and then fixing them. The software can automatically determine whether a drive is misaligned.  Its automated realignment of all existing partitions, including the boot partition, can substantially increase performance, sometimes by as much as 300 percent.  Versions of PAT are shipping with a number of high-end advanced format drives like those from Toshiba America Information Systems, Inc., which selected the Paragon Alignment Tool for use with the Toshiba Advanced Format (AF) hard disk drives. The product is also available from several very large PC manufacturers, such as HP and Dell, who have licensed the technology from Paragon.  The software not only corrects alignment but ensures, with the data protection elements within the program, that the alignment persists even after an unforeseen power interruption.  The alignment will thus remain in place even if the computer fails to boot.

When you head up a software company like Paragon, it’s always exciting to have the opportunity to see well-established industry giants choose your product.  We’ve been fortunate enough to see our software time and time again in the hands of consumers and businesses that got it pre-installed or shipped along with products from one of the big guys.

If you’re experiencing substantial performance issues and you operate with an older version of Windows, stop by Paragon’s download website for a comprehensive white paper and all the information you need to fix the problem, including a trial version.  Don’t let slow performance continue to hurt your output.  A tiny investment in a simple solution can pay great dividends.

The Costs of Data Loss Extend beyond Financial

By Tom Fedro

Companies today have become more and more dependent on the efficiency and security of their data.  It's amazing how often, though, I come across people with absolutely no data protection strategy in place.  Any data loss hurts business. A large data loss interrupts the flow of work and causes a loss of both profit and productivity.  I tell people all the time that protection of data should be a top priority.

Of course, they counter by pointing out that I’m a data storage expert and focus my work on data.  Maybe that does make me a more adamant advocate, but look at ways data loss can hurt you:

  1. Productivity.  You use your data in the course of your day.  How much work would you or your employees get done if you didn’t have access to it?
  2. Customer Loyalty.  How many times can you lose your customers' information before they stop being your customers?  Have you ever had that sinking feeling when a company you've used for a long time can't locate your information?
  3. Intellectual Property.  What if your data is your product?  Do you want to lose code, ad copy, images, or publications?  How much loss is there when an intellectual asset isn't protected intelligently?

These are just three quick examples of data loss costs.  There are a great many more that are possible and even probable.  Your data is a tool and an asset, and you’re not serving your organization well if you’re failing to plan correctly for protection.

Of course, in the end, it might be impossible to calculate the complete financial loss that is possible when data protection fails.  Consequences can vary from mere annoyance to devastation, and there is no way to cope with the risk without figuring out a contingency plan. What has your company done in this area?  How safe is your data?  How are you managing your data storage?  Action on these questions now can save a lot more than a few dollars in the future.

Data Access on Android Tablets

By Tom Fedro

The prevailing standard in hard drive format is Windows NTFS. This has presented challenges for technology built on other platforms (like Linux), in that data and media stored in the NTFS format are incompatible for viewing and use. The Paragon Universal File System Driver (UFSD) technology is here to bridge the gap.

Paragon Software's unique UFSD Technology provides full access, including read, write, format, delete and copy operations. It works within nearly all of the major kernels, including Linux, VxWorks, eCos, Mac, Windows, and DOS. It offers access to NTFS, HFS+, ext2/3, FAT16, and FAT32 where those file systems would not otherwise be supported. Its small size and fast delivery make it easy to manage and operate. There's no need for other libraries, and it supports all NTFS features.
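
To make the interoperability problem concrete, here is a small, hedged sketch of how software can recognize a volume's format by reading the published on-disk signatures (NTFS's boot-sector OEM ID, FAT's file-system-type strings, ext2/3's superblock magic). Recognizing a format is the easy part; a driver like UFSD is what actually makes the volume readable and writable:

```python
def guess_filesystem(device_path: str) -> str:
    """Peek at a volume's first 2 KB and match well-known signatures."""
    with open(device_path, "rb") as dev:
        head = dev.read(2048)
    if head[3:7] == b"NTFS":            # OEM ID in the NTFS boot sector
        return "NTFS"
    if head[82:87] == b"FAT32":         # BS_FilSysType field, FAT32 layout
        return "FAT32"
    if head[54:59] == b"FAT16":         # BS_FilSysType field, FAT12/16 layout
        return "FAT16"
    if head[1080:1082] == b"\x53\xef":  # ext2/3 superblock magic (0xEF53)
        return "ext2/3"
    return "unknown"

# e.g. guess_filesystem("/dev/sdb1") on a Linux host with read permission
```
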

Acer recently chose UFSD for its Iconia Tab A500 and Iconia Tab A501 lines, allowing the Android products, based on a Linux kernel, to access data and media stored in the NTFS format. Ultimately, end users of the product have full access to file systems that otherwise would not be supported. Paragon's UFSD-based NTFS for Android driver enhances the capability of the tablets, since most hard drives are formatted for NTFS. Without the driver, a user would have to reformat to the older FAT file system in order to gain access.

With the driver, a user can attach a hard drive formatted for Windows via the tablet's USB port, making it a value-added feature for Acer. Users can download and stream multimedia products to their hard drive but transfer only the immediately desired files via USB storage. UFSD Technology has been designed to provide the highest possible read/write performance as well, which means transferring data is fast and effortless. Importantly, the read/write performance also ensures that media playback is top-notch, which is critical for quality output of high-definition and Blu-ray media.

We’re excited Acer chose this product to push their tablets a step above the competition, and we’re happy to see our efforts working in the marketplace.

The Importance of the Recovery Time

By Tom Fedro

The goals of data backup and recovery can be summarized with two metrics. The Recovery Point is the term that describes the point in time up to which a system and its data are protected. The metric might be expressed as a time value in days or hours. If a system is backed up nightly, all data is recoverable to the previous night. Data altered between the backup and the crash represents data at risk. Some organizations will attempt to create a recovery point that approaches continuous backup, so that data at risk is minimized.
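
As a worked example of the metric (the timestamps are hypothetical), the data-at-risk window is simply the gap between the last completed backup and the failure:

```python
from datetime import datetime

# Nightly backup at 02:00, crash mid-afternoon the same day.
last_backup = datetime(2011, 6, 1, 2, 0)
failure = datetime(2011, 6, 1, 16, 30)

data_at_risk = failure - last_backup
print(data_at_risk)  # 14:30:00 -- up to 14.5 hours of changes could be lost
```
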

The Recovery Time is concerned with restoration rather than backup. This metric represents the length of time it takes for data and systems to be made available after an interruption. Unfortunately, this particular element of storage management is often relegated to the back burner. The level of distress in a catastrophic failure is usually great enough that the relief associated with the final return of the data overshadows the interruption in availability. The fact that critical data was recovered becomes more important than the loss of productivity and business operational efficiency prior to its recovery. This kind of thinking, though, is short-sighted and based on reactionary management rather than proactive business management.

Data is important to a company for its use, not just for its existence. When the data is not available for company operations, there are hard costs as well as opportunity costs involved. The hard costs are obvious. Employees sitting at a desk unable to work still generate payroll expenses. A building filled with computers not in use still has a lease cost per square foot. In short, company overhead continues but the revenue that overhead should generate is lacking. No business would willingly continue to spend with no expectation of return, yet when critical data and systems are unavailable to assist in converting company efforts into profits, this is exactly what occurs.
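
A quick illustration of the hard-cost side, with entirely made-up figures:

```python
# Hypothetical downtime: 40 employees idled for 6 hours.
employees = 40
loaded_rate = 55.0    # assumed payroll + overhead, dollars per hour per employee
downtime_hours = 6.0

hard_cost = employees * loaded_rate * downtime_hours
print(f"${hard_cost:,.0f}")  # $13,200 of overhead with no output against it
```
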

Opportunity costs are also generated during an interruption in system availability. Orders cannot be processed. Sales cannot be made. Customer interactions (and customer relationship management is one of the most critical aspects of ongoing profitability) are hampered, meaning continued monetization of the customer base is impossible. Sadly, these losses are hidden. They'll never appear on a company's financials and will likely never be noticed. Still, the losses are real, and companies fail because they ignore the vague but real impacts of opportunity loss.

Hard costs and opportunity costs mandate that companies examine their recovery time objectives with the same attention given to their recovery point objectives. Data is not an amorphous idea that needs protection. It is the very foundation by which most companies operate and continue as going concerns. Availability of that data is as critical to a company’s success as the existence of the data, and until both aspects of the company’s reliance on its systems are addressed, the company has no effective data protection strategy.


Disaster Recovery Software Decision-Making Criteria

By Tom Fedro

Sales of disaster recovery software have shown dramatic growth over the last several years, as just about every company has come to rely on systems and data management for continued operation. Although the first mention of this kind of disaster recovery occurred in the 1970s, it took decades for its importance to be fully realized. Back then, technology really wasn't intertwined with a company's operations the way it is now. Now, most companies would find it strange to think of technology and business as independent, the way we find it strange when we watch a TV show from the 1980s and don't see cell phones.

Although technology is still advancing at breakneck speed, the data protection industry is essentially mature, and a number of companies vie for market share. When a company sets out to determine which solution is correct, there are some critical considerations that need to go into the decision-making process. First and foremost, what is the need?

Too often, this step is skipped. Companies tend to examine what's available and make choices based on the four or five alternatives they come across. That's the wrong way to do business. The smartest people in the world make mistakes like this one, but they shouldn't! There are a couple of cardinal rules about shopping at the grocery store that come to mind. First, never shop hungry. You end up overbuying, and typically buying unhealthy food. In the same manner, don't wait for a crisis to buy your software. You'll end up buying more than you need in most cases, and the pain of the urgency will get the better of you.

The second rule? Shop with a list. Without it, you end up buying food you don’t need and you forget food you do need. In the world of technology, your list is called a needs assessment. Sit down with your tech department and your operations and figure out what you need. Here are some conversation starters.

    1. How much data can we afford to lose in a given period of time? One week? One day? One hour? This answer will tell you how frequent your backups will need to be, and thus how much weight ease of backup and the disruption the procedures cause should carry in your decision making.
    2. How reliant on the systems is each department? It's possible your inside sales department could handle a few hours of downtime. On the other hand, it might cripple your accounting department. When you've got all the information, you not only have criteria to determine purchasing based on restore times but also a blueprint for which departments should receive first attention from your IT department in the event of catastrophic failure.
    3. Which particular elements of the system or the data are most critical? If your employees have a dramatic need for email but not other documents, you'll want software that can provide tools for partial and immediate restoration of that critical information (commonly called granular restore) while the rest of the system comes online. (A sketch of how these answers roll up into purchase criteria follows the list.)
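
Here is a minimal sketch of consolidating a needs assessment; the departments and tolerance figures are hypothetical. The strictest tolerances become the recovery point and recovery time requirements any purchased solution must meet:

```python
# Per-department tolerances from the conversation starters above:
# (maximum acceptable data loss, maximum acceptable downtime), in hours.
requirements = {
    "inside_sales": (24.0, 4.0),
    "accounting": (1.0, 0.5),
    "email": (4.0, 1.0),
}

# The tightest numbers drive the decision for the shared systems.
required_rpo = min(loss for loss, _ in requirements.values())
required_rto = min(down for _, down in requirements.values())
print(f"Required recovery point: {required_rpo}h, recovery time: {required_rto}h")
```
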

Don’t fall into the “Ready, Fire, Aim!” trap. Make your technology decisions like you make other business decisions. Identify the correct solution first. Then go out and get it.


Optimizing the Recovery Point

By Tom Fedro

What's the optimal recovery point in data backup?  Most tech professionals immediately jump to shout out as loudly as possible "continuous!" or "on the fly!"  Believe it or not, that's just not correct.

Okay.  Take a deep breath.  I know it sounds like I’ve just committed techno-heresy, but I’m speaking from an operational standpoint here.  The reality is this—on the fly continuous backup is disruptive to most businesses and—brace yourself—unnecessary to most businesses.  The disruptive nature is fairly easy to understand.  Constant image-based backup uses resources and stops users from making changes.  System resource use alone would create a dramatic slowdown.

Does that make sense for a business that doesn’t deal with dynamically changing data?  What about businesses that regularly use but don’t regularly alter data?  Excessive data backup will sometimes cause more of a slowdown than minor data loss.

While nearly every business relies on data nowadays, not every business changes the data with enough frequency to justify the expense or the irritation of attempting to reach near-continual backup.  Companies ought to search for the optimal backup solution.  This solution will be based on the amount of data that needs protection and the frequency of modification in that data.  In some cases, a single backup procedure a few times per week is all that’s needed.  Some businesses will need daily backup, and some businesses will need consistent image-based backup with file-based backup at regular intervals.

There's an optimal choice, and it's different for different organizations.  Don't make the mistake of buying and implementing a solution that makes a whole lot of sense—for someone else's company.  Determine your real risks and real needs.  Then, consider the impacts of the following (a toy cost model follows the list):

  • The cost of the backup solution.
  • The costs of implementing the backup solution. (I’m talking about tech department payroll, here.)
  • The costs to the business operations of the implementation.
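
One way to make that trade-off concrete is a toy cost model: total cost is backup overhead plus the expected value of the changes lost between backups. Every figure below is a hypothetical placeholder; the point is the shape of the comparison, not the numbers:

```python
def total_monthly_cost(interval_hours: float,
                       cost_per_backup: float = 25.0,       # admin + system overhead per run
                       change_value_per_hour: float = 8.0,  # value of data written per hour
                       failure_prob_per_hour: float = 0.0005) -> float:
    hours_per_month = 720
    backup_cost = (hours_per_month / interval_hours) * cost_per_backup
    # On average, a failure loses about half an interval's worth of changes.
    expected_loss = (hours_per_month * failure_prob_per_hour
                     * (interval_hours / 2) * change_value_per_hour)
    return backup_cost + expected_loss

for hours in (1, 8, 24, 72, 168):
    print(f"every {hours:>3}h: ${total_monthly_cost(hours):,.0f}/month")
# The minimum moves as the assumptions change -- which is exactly the point.
```
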

You may come to the conclusion that backup approaching on-the-fly constancy may indeed be what you need.  Don't reach that conclusion because it's the best available, though.  Reach that conclusion because it makes the most operational sense for the business.  Somewhere between regular backup and constant backup is the right interval for most businesses.  Find out where on that timeline yours belongs and act accordingly.  Don't fall into the trap of acting first.