By: Kevin Craine on June 8th, 2021
Content Migration - Evaluating Your Options
The amount of data organizations must manage today is truly mind-boggling. Research shows that 2.5 quintillion bytes of data are created each and every day. Over the last two years alone, 90% of the data in the world was generated.
It’s no wonder that many organizations struggle to simply keep pace. And moving mountains of data from older legacy systems to modern cloud-based repositories can seem out of reach for most, regardless of the potential advantages of modernization.
But what can you do when you need to migrate? If leaving your data in place is not an option, and moving it makes you lose sleep at night, you can quickly feel overwhelmed by the chaos. Thankfully, there are some new approaches to data migration that may provide an answer.
Which is the best approach for your project? Let's take a look at three common approaches to migration to compare them.
Lift and Shift vs. Extensive Analysis vs. Intelligent Migration
Here's a quick comparison to get us started: lift and shift is quick and inexpensive but moves data largely as-is; extensive analysis is thorough but slow and labor-intensive; intelligent migration automates classification with Machine Learning and AI.
Now, let's look into each of these approaches in greater detail.
Lift and Shift
“Lift and shift” is a familiar, straightforward approach to migrating data and applications from an on-premises system to the cloud. It is attractive because it is (potentially) inexpensive and quick to implement. Because re-hosting, as it is also known, involves no change to application architecture and little or no change to the data, a lift and shift strategy can make sense.
Extensive Analysis
The opposite approach is to meticulously evaluate every system, workflow, and data repository. A significant amount of time is spent identifying sensitive data, re-architecting the information structure, and determining what data is still relevant to the business. Teams work diligently to perform a complete assessment of on-premises storage, archives of inactive data, and backup data held for redundancy and disaster recovery.
Because of the exhaustive nature of the approach, however, many organizations get caught up in “analysis paralysis.” They often find that they do not have the people-power or budget to get the job done. After all, it’s hard enough to keep up with the ongoing tide of information created each day, much less pause to review hundreds of terabytes of existing data by hand. And the results are often prone to human error, which can cause cascading problems for the business down the road.
Intelligent Migration
There is a more elegant approach to data migration that provides a number of important advantages. The key is to use a migration engine with built-in data classification. Fueled by Machine Learning and AI, these tools automate the process of dynamically routing sensitive data to secured locations while transferring other content to appropriate destinations based on data type. This can even include automatically sending low-value data to archival storage platforms.
Think of it as a modern coin sorter for your content. The system takes in unstructured data, then works automatically to identify, sort, classify, and output the right content groupings in neat, orderly ways. Organizations cash in with the ability to automatically evaluate their vast stores of data, identify specific content types on the fly, flag sensitive and protected information, and then classify the content using intelligent textual algorithms and analysis.
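To make the coin-sorter idea concrete, here is a minimal sketch of the identify-classify-route flow. Everything in it is a simplified assumption: the destination names are hypothetical, and a single regex stands in for the ML-driven classifier a real migration engine would use.

```python
import re

# Hypothetical destination prefixes; a real tool maps these to cloud repositories.
SECURE_STORE = "secure/"
ARCHIVE_STORE = "archive/"
GENERAL_STORE = "general/"

# Stand-in for an ML classifier: a regex that flags US-SSN-like identifiers.
# A production engine would use trained models, not a single pattern.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def route_document(name: str, text: str, last_accessed_days: int) -> str:
    """Return a destination path, mimicking the identify -> classify -> route flow."""
    if SSN_PATTERN.search(text):
        return SECURE_STORE + name       # sensitive data goes to a secured location
    if last_accessed_days > 365 * 3:
        return ARCHIVE_STORE + name      # low-value, stale data goes to archival storage
    return GENERAL_STORE + name          # everything else goes to the general repository

print(route_document("hr_record.txt", "SSN: 123-45-6789", 10))   # secure/hr_record.txt
print(route_document("old_notes.txt", "meeting notes", 2000))    # archive/old_notes.txt
```

The design point is simply that routing decisions are made per document, from the content itself, rather than by copying whole folders wholesale.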
Tips for ANY Migration Approach
Whichever approach you take, there are some key considerations that will help with any migration project.
In a lot of ways, migration can be compared to moving. When you leave your apartment and buy a new house, you probably don’t just throw everything you own into a van. It’s more likely that you take the time to see what you have and make some decisions about what to keep. You’ll bubble wrap your most precious possessions, and then trash or donate what’s left.
In the same way, a smarter data migration plan must “look in the boxes,” ideally in an automated way.
- Start with unstructured content.
  - What data do you have?
  - Which compliance rules must you meet?
  - How much content is redundant, obsolete, or trivial?
- Develop your classification policies.
  - Classify content types.
  - Identify sensitive or risky data.
- Make decisions based on risk.
- Build a plan for migration.
- Apply change management.
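The "look in the boxes" inventory step above can be sketched in a few lines. This is an illustrative assumption, not a prescribed tool: it buckets files as redundant (duplicate content), obsolete (untouched for years), or trivial (too small to matter), with thresholds chosen arbitrarily for the example.

```python
import hashlib
import os
import time

def rot_report(root: str, stale_days: int = 365 * 3, trivial_bytes: int = 64) -> dict:
    """Walk a directory tree and bucket files as redundant, obsolete,
    trivial, or worth keeping -- an automated 'look in the boxes' pass."""
    seen_hashes = set()
    report = {"redundant": [], "obsolete": [], "trivial": [], "keep": []}
    cutoff = time.time() - stale_days * 86400
    for dirpath, _, filenames in os.walk(root):
        for fname in sorted(filenames):
            path = os.path.join(dirpath, fname)
            with open(path, "rb") as fh:
                data = fh.read()
            digest = hashlib.sha256(data).hexdigest()
            if digest in seen_hashes:
                report["redundant"].append(path)      # exact duplicate of earlier content
            else:
                seen_hashes.add(digest)
                if os.path.getmtime(path) < cutoff:
                    report["obsolete"].append(path)   # untouched for years
                elif len(data) < trivial_bytes:
                    report["trivial"].append(path)    # too small to matter
                else:
                    report["keep"].append(path)
    return report
```

Even a rough report like this gives you the numbers needed to make risk-based decisions before building the migration plan.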
About Kevin Craine
Kevin Craine is a professional writer, an internationally respected technology analyst, and an award-winning podcast producer. He was named the #1 Enterprise Content Management Influencer to follow on Twitter and has listeners and readers worldwide. Kevin creates strategic content for the web, marketing, social media, and more. He is the written voice for some of North America's leading brands and his interviews feature today's best thought leaders.

His client list includes many well-known global leaders like IBM, Microsoft and Intel, along with a long list of individuals and start-ups from a wide variety of industries. Kevin's podcasts have been heard around the world, including the award-winning weekly business show "Everyday MBA". He is also the host and producer of "Bizcast" on C-Suite Radio and the producer behind podcasts for Epson, Canon, IBM and AIIM International, among others.

Prior to starting Craine Communications Group, Kevin was Director of Document Services for Regence BlueCross BlueShield, where he managed high-volume document processing operations in Seattle, Portland and Salt Lake City. He also spent time at IKON as an Enterprise Content Management consultant working with national and major accounts. He was the founding editor of Document Strategy magazine. Kevin has also been, at one point or another, an adjunct university professor, a black belt martial artist, and a professional guitarist. Kevin holds an MBA in the Management of Science and Technology as well as a BA in Communications and Marketing.