Am I making a big mistake migrating this client to SP/Teams?

thecomputerguy

I have a client I've been desperately trying to get off their current 7-year-old server running 2012 R2, which has two drives (in RAID 10) in pre-failure. We finally decided we're done with an on-prem server since their main management software is cloud based and their file server basically just stores docs, spreadsheets, pics, and PDFs. OneDrive sync through Teams seemed like the best option since they'll also have access to all this stuff through the Teams app, we can do away with the VPN (which has been spotty), and being hardware independent is something I'm aiming for here.

Great right?

Total usage right now is about 1.6TB

I'm aware of the M$ tenant storage limitation of 1TB + 10GB per license. That brought us up to a total of about 1.2TB of storage. I sold them on another $200 a month for 1TB of additional O365 storage (trying to avoid another physical server for basic data), bringing our total up to 2.2TB, which is enough for now. Massive growth isn't expected because their behavior has changed. They are no longer storing photos locally, as their cloud management system allows them to upload their photos there, and that's the bulk of their data.

So I fire up the good ol' SharePoint Migration Tool (SPMT): https://docs.microsoft.com/en-us/sharepointmigration/introducing-the-sharepoint-migration-tool

I begin uploading their first chunk of data from one of their shares, which totals 676GB. After about 30GB the SPMT fails and stops, saying that the temporary local location used as a staging point (the C: drive) doesn't have enough space for the staging, which it doesn't. The C: drive only has about 88GB free, and apparently that isn't enough. So I delete all of the data that was uploaded, empty the recycle bin on SharePoint, then empty the second-stage recycle bin. I drop off a 5TB external drive to use as the staging point, change the "Working Location" in the SPMT to this external, and begin the process again.

All seems to be going well, but after about 40GB uploaded the tool says, "There is a scan issue. Please refer to the Scan Summary report for more details." The upload continues despite this. I dig into the scan summary log and find this error:

"File count '389052' is too large to create index." I'm aware now that M$ has an indexing limitation of 5000 files (or maybe 20,000 now?)

So they have a lot of files.

The upload is continuing despite this error, and I'm not even sure what it means. Indexing the data is not necessarily important (I think? Maybe?) since we won't be using SharePoint for ANYTHING except a file repository to store this data and access it through Teams/OneDrive; we won't be using anything else SharePoint does.

The next upload will be another large bulk of data totaling about 900GB and I expect to get this same error about indexing.

Is indexing even important for what we are doing here? Am I doing this all wrong? I just need a cloud location to store this stuff that's easily accessible through a desktop and a mobile app. I know services like Dropbox and ShareFile are alternatives (cheaper?), but they're OK with the cost, and being able to keep everything under their current O365 subscription is a big plus. Introducing a new service would come with a whole new set of challenges.

I need this to be a simple file repository where they can access the data in a normal Windows folder tree, because these people are your typical computer dummies who freak out when their icons move.

I'm trying to get most, if not all, of this done over the holiday, and I'm just afraid that after hours of babysitting this I'm going to be met with a failure and have to restart or go a different route altogether.

This is the largest SP migration I've done. I've had a few clients with less than 20GB of data, and I just used OneDrive to push that data into the document library by syncing the library locally through Teams. I'd never used the SPMT, but I felt that using OneDrive to sync 1.6TB was just asking for disappointment, not to mention I'd have to use an external drive to sync the data since the server is nearly maxed out on storage. UGHHHH HELP!

@YeOldeStonecat
 
Lol... Client had a power outage at 400GB/676GB transferred after 36 hours... Wonderful... Back to square one.
 
Client had a power outage at 400GB/676GB transferred after 36 hours... Wonderful... Back to square one.

You'd think that, these days, virtually all software involved in huge data transfers would, at the very least, have checkpointing built in so that when this occurs it could pick up gracefully. It might have to repeat whatever happened after the prior checkpoint, but not everything.

Alas, much of it does not do this . . .
 
It does have checkpointing... I'm not sure what's going on here. It does take some time to go through the entire fileset to determine what made it and what still needs to be copied, though.

As for the client, I'd say they have too much crap. Data has a lifecycle, and if they want an infinite archive... well, that's what storage blobs are for. But those need Azure AD Premium Plan 1 to do permissions, so it's M365 Business Premium or better.
 
As for the client, I'd say they have too much crap. Data has a lifecycle,

Yup. And there are data hoarders just like there are for objects in the physical world.

It's a sickness, and one that's almost impossible to cure. Getting hoarders to admit they hoard, and actually understand what that means, is almost never achieved, either.
 
Even for large migrations I still prefer to use a "middle man" computer (of substantial horsepower)....create the Teams.....sync each Team...and use File Explorer to migrate. I've just not enjoyed the migration tool.

Extra storage, yeah, just buy it. Easy enough.

Key thing....now is the time to work with the client on "re-structuring" that big bucket of files on the old server. Most businesses have had a file server of some sort for...well, over 30 years now, with some businesses going back to the Novell or NT 4 server days or older. Sadly, that big old "S" drive was just copied from old server to new server as they got new servers every 7 or so years since then, and it's just grown in bloat.

Now is the time to work with the client, wipe the slate clean, and have that conversation with them: "If you had a chance to re-organize all this data, put it nice and neatly into new file cabinets.....what would you do?"

Create a Team for each...department, or project, or ...however the client wants to compartmentalize the data, permissions, etc.

You'll often find each Team will end up having much less data in the file library.

If the migration will take time, and the old file server will be used concurrently...even if less and less each day, what I do anyways is....as I move folders from the old server, into a new Team or spread across various Teams, I rename the source folders...so they're like.... \hr\applications changes to \hr\applicationsMOVEDTOTEAMS

Oh yeah, as you do the file copy (it's very manual), you'll find those "file names" which are too long. Gives you a chance to go in and rename them. Yeah, you can run tools to do pre-flights on this to find those names that end up over the 255-character limit.
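If you'd rather script that pre-flight than eyeball it, a rough Python sketch along these lines (the UNC path and the 255 threshold are just placeholders to adjust) will dump the over-long paths to a file you can work through before the copy:

import os

SOURCE = r"\\oldserver\S-Drive"   # placeholder - the share being migrated
LIMIT = 255                       # placeholder threshold - flag anything longer than this

with open("long_paths.txt", "w", encoding="utf-8") as report:
    for root, dirs, files in os.walk(SOURCE):
        for name in files:
            full_path = os.path.join(root, name)
            if len(full_path) > LIMIT:
                report.write(str(len(full_path)) + "\t" + full_path + "\n")

print("Done - see long_paths.txt for names to shorten before migrating.")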
 
Even for large migrations I still prefer to use a "middle man" computer (of substantial horsepower)....create the Teams.....sync each Team...and use File Explorer to migrate. I've just not enjoyed the migration tool.
Really? You log in to OneDrive on the middle man computer and let it just sync that way? That seems odd. OneDrive on this server just got stuck downloading a 2MB file and never made it past that.
 
It does have checkpointing... I'm not sure what's going on here. It does take some time to go through the entire fileset to determine what made it and what still needs to be copied, though.

As for the client, I'd say they have too much crap. Data has a lifecycle, and if they want an infinite archive... well, that's what storage blobs are for. But those need Azure AD Premium Plan 1 to do permissions, so it's M365 Business Premium or better.
It definitely did not save a checkpoint... when I reopened the tool my only option was to start a new migration, and I don't want to risk duplication... so I suppose I'll just have to delete 400GB of data and start over.
 
Really? You log in to OneDrive on the middle man computer and let it just sync that way? That seems odd. OneDrive on this server just got stuck downloading a 2MB file and never made it past that.

And....nowwwww.... you see one of the reasons why I choose a middle man computer.
Leave servers to doing their background network services....doing massive file copies via File Explorer and plopping them into apps....well, you see what happens.

Learned that trick a few years ago from one of the early Teams experts from Europe we hired for some training.
 
The doc library actually shows 670GB of data, but I just don't trust that it actually finished since I can't confirm it.
 
I also rely on a machine somewhere and OneDrive to do the sync. The tool is really not for moving stuff into SharePoint from what I've seen; it's more for moving stuff into a blob... which is entirely different.
 
And....nowwwww.... you see one of the reasons why I choose a middle man computer.
Leave servers to doing their background network services....doing massive file copies via File Explorer and plopping them into apps....well, you see what happens.

So what is the actual process, when you do this? Are you using like a robocopy script to get the files to that dedicated computer? That way you can true-up things in prep for the actual cutover by just rerunning that script? I mean, bandwidth is bandwidth, so if you're in the same network, you're not really saving bandwidth this way, you're just assigning the sync task to a computer dedicated to that, right? I feel like a quick set of instructions (dare I say TN Resource?) would be useful.
 
No robocopy; for migrations I hover over the process. Middleman computer with an adequate hard drive. OneDrive just handles the "sync on demand".

Remember, server operating systems aren't designed for "foreground apps"; they're designed for "background services". Resource-wise, they're inefficient to use as a desktop. I use a "healthy, powerful" desktop for the file copy process and OneDrive migration. Desktop operating systems are designed to pour resources into foreground apps.

[Attachment: foregroundapps.JPG]

I sync the Teams folder libraries so those folders are pinned to File Explorer. So the migration really just entails...copying folders via File Explorer.
 
The desktop doesn't even really need to be powerful; it just needs enough hard disk space to hold the files.

Though you don't exactly want to use an old junker either, because CPU/RAM limits will constrain how many files OneDrive can process. But this also depends on how fast the Internet connection is onsite. Faster means you need more machine.

Robocopy can be part of the process to sync up changes from the local server into what OneDrive is syncing, but sometimes you can also simply configure OneDrive to sync the share directly. I'm more partial to the former myself, but usually it's not that big of a deal. It takes forever to get the initial push done; once that's uploaded, a second sync of the source files to the desktop in question usually uploads within a few minutes. So leaving that to cutover is easily done.
 
Another one who lets the OneDrive client do the sync. I guess I'd say "cutover migration" describes my method best. I'll have a bash at explaining it from memory.


Step 1 - Data cleanup stage (probably the most important stage)
We use TreeSize Pro for a lot of this work. Firstly, as you would expect, to provide a general data map of the largest folders. Also get reports on things like data type (images, videos, Office files, etc.) and data age.

File count is also important. The hard limit is 30 million files, but Microsoft says: "For optimum performance, we recommend storing no more than 300,000 files in a single OneDrive or team site library." So bear that in mind and consider breaking things up into multiple sites if necessary.
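If you want a rough script equivalent of that file count check, something like this Python sketch (the share path is a placeholder) tallies files per top-level folder so you can spot which ones push past that 300,000 recommendation and plan separate sites/libraries:

import os

SOURCE = r"\\oldserver\S-Drive"   # placeholder - the share being sized up

for entry in os.scandir(SOURCE):
    if entry.is_dir():
        # count every file underneath this top-level folder
        count = sum(len(files) for _, _, files in os.walk(entry.path))
        note = "  <-- consider its own site/library" if count > 300000 else ""
        print(entry.name + ": " + str(count) + " files" + note)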

Often we use TreeSize's file operations feature to move anything not modified in the last five years to an archive folder. This either remains an offline archive or will be merged into SharePoint at a later date after the initial migration is complete.

Another TreeSize Pro feature - Bulk Rename. You can use regex to search file names for invalid characters, then replace them with a valid character. They have a guide here - https://www.jam-software.com/blog/treesize-bulk-rename-tips.shtml
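For a scripted version of that same idea, a dry-run pass like this Python sketch (the path is a placeholder and the character list is approximate - check Microsoft's current list of restricted names) just prints anything that would need renaming, without touching it:

import os
import re

SOURCE = r"\\oldserver\S-Drive"        # placeholder - the share being cleaned up
# approximate set of characters SharePoint Online rejects, plus leading/trailing spaces
INVALID = re.compile(r'["*:<>?|]|^\s|\s$')

for root, dirs, files in os.walk(SOURCE):
    for name in dirs + files:
        if INVALID.search(name):
            print(os.path.join(root, name))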

Yet another TreeSize Pro feature - search for file paths over a given length. The limit is 400 characters for SharePoint Online, so we search for anything over 350 just to be safe.


Step 2 - Background sync and prep
I'll usually install OneDrive on a server or spare workstation with enough storage to hold a copy of the entire file share. Set up your new sites etc. in SharePoint and sync them down to this machine.

Then I use robocopy to sync the data from the file server to the synced folders:
robocopy "source" "destination" /MIR /R:0 /W:0 /LOG:C:\logfile.log

Usually done in stages, so I'll copy, say, 100GB and wait for it to sync up. Then I'll do another 100GB... repeat until everything is copied over and the sync is healthy.
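One rough way to confirm a chunk has fully landed before starting the next is to compare file count and total size on both sides - a quick Python sketch, with both paths as placeholders (the second being the locally synced library folder on the staging machine):

import os

def tally(path):
    count, size = 0, 0
    for root, _, files in os.walk(path):
        for name in files:
            try:
                size += os.path.getsize(os.path.join(root, name))
                count += 1
            except OSError:
                pass   # skip files that vanish or can't be read mid-scan
    return count, size

source = tally(r"\\oldserver\S-Drive\HR")            # placeholder - source folder on the file server
synced = tally(r"C:\SyncStaging\HR - Documents")     # placeholder - synced library folder on the staging PC
print("source:", source, "synced:", synced)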

At this point end users are still 100% using the file server. I'll make sure they have OneDrive installed and signed in with the correct account, but I don't give them permission to access the new SharePoint sites yet.

I'll continue running robocopy daily to pick up new changes until the cutover date.


Step 3 - Cutover
- Disable the file share(s) on the file server.
- Run robocopy one more time and check the logs to make sure there are no errors
- Give end users permission to the SharePoint Sites
- Use GPO, Intune or just manual intervention to sync the new sites to end users' devices


Step 4 - Begin to despise people
When they call in with the most basic issues of not being able to find the Z: drive any more even though you sent out multiple explainers and tutorials in advance.
 
Step 4 - Begin to despise people
When they call in with the most basic issues of not being able to find the Z: drive any more even though you sent out multiple explainers and tutorials in advance.

I had one client that was all old stubborn people who had done the "S drive" thing for like...30+ years. They were used to a shortcut on their desktop going right to the mapped drive, called S-Drive. In the planning stages...they really wanted us to mimic, as best we could, what they were used to.

So, I made a Team called "S-Drive". It was not a complex setup, and their folder structure wasn't bad to begin with. Not many folders nested within folders nested within folders nested within folders (ya know...those setups that just KILL the OneDrive copy process with >255-character names, and kill Teams searching).

Synced that Team library...so it's pinned to File Explorer.
Went to File Explorer, right-clicked that pinned Team...send to desktop. There's a shortcut named "S-Drive". So...didn't have to change their habits a single bit!
 
So...didn't have to change their habits a single bit!

Which is, unquestionably, the one, only, and best way to handle things when it is possible.

I honestly don't know how anyone who frequents this site could fail to understand just how set in our own ways we get, and when those ways span years and years, it's way easier, when possible, for the technician to do something that makes continuation of those ways possible rather than trying to change deeply ingrained user behavior.

It's that "trying to change deeply ingrained user behavior" that's the root problem, and if you ever attempt to do it, expect a very long, shallow learning curve where various users will pop up, months after the initial transition, because it's their first time trying to do something they've only done occasionally when needed, but have had no issues with for years. They're never going to remember any notices from heaven knows how long ago by the time they're going for that data again for the first time.

We sometimes create our own problems by insisting that end users conform to our expectations rather than us conforming to theirs when it's reasonably possible to do just that.
 
If you really want to mimic a traditional setup, you can mount the local OneDrive folder as a drive.

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\DOS Devices
Create a new string named Z:
Value: \??\C:\Users\Username\OneDriveFolder

After a reboot they will have a local drive Z: pointing to C:\Users\Username\OneDriveFolder.
Added bonus: any shortcuts they already had to Z:\ will continue working.

It's not something I recommend doing unless forced; however, it works surprisingly well.

EDIT:
Just adding a caveat to this which I forgot to mention: Explorer shell integration does not work in the Z:\ drive. Windows treats it like any other folder, so you don't get the right-click options for sharing, version history, selective sync, etc.
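For reference, if you ever need to push that DOS Devices value out with a script rather than regedit, a rough Python sketch (run elevated; the folder path is the same placeholder as above, and a reboot is still needed) would be:

import winreg

TARGET = r"\??\C:\Users\Username\OneDriveFolder"   # placeholder - the user's synced folder

key = winreg.OpenKey(
    winreg.HKEY_LOCAL_MACHINE,
    r"SYSTEM\CurrentControlSet\Control\Session Manager\DOS Devices",
    0,
    winreg.KEY_SET_VALUE,
)
winreg.SetValueEx(key, "Z:", 0, winreg.REG_SZ, TARGET)   # string value named Z:, per the steps above
winreg.CloseKey(key)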
 