Offsite Backup---Read as Site to Site

ohio_grad_06

So at work, we are looking to get a better plan together for Disaster Recovery. That is to say, we want to start doing backups offsite.

We have some older VMs in Hyper-V that house some financial information, and we may wish to upload those into Azure.

However, the other part of the discussion came up, file backup specifically. We have some data on our servers here that we would like to make sure there are copies created offsite somewhere. Keep in mind we are a Windows shop.

That said, my boss and I were talking, here is what we have.

We are a church organization, so we have a nice headquarters building, but we want to be prepared for something like, let's say, a tornado hitting our building and destroying our data center. We are in Missouri, so it's possible.

Our organization has a few Bible Colleges, one of which is moving to a new facility about 20-30 minutes away from us. The thought my boss and I have is to pick up a couple of NAS devices. I was doing some checking and thinking about something like this for each site.

https://www.amazon.com/Synology-5ba...=1556139520&s=gateway&sr=8-19#customerReviews

The idea is that we would like to set up a permanent site-to-site VPN between our site and the college, then place at least 1 NAS device at each site, get some drives to provide a decent amount of storage, then sync our critical data to the NAS device at their end, and have them sync their critical data back to the NAS at our site.

With this college, we actually have a support contract; their tech guy left, so we are basically their IT department now, except for one part-time guy they have there. I think they are going to have a 500mbps business line from Charter Spectrum. We currently have a 250mbps connection through Windstream via AT&T (I think it's switching to Spectrum soon, however).

I'm guessing we have 4-5 servers we would need to send data from to their site. Probably under 50TB for now, but that may go up.
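For what it's worth, the initial seed over the wire is the painful part here. A rough back-of-envelope sketch (the 250mbps line and ~50TB figures are from above; the sustained-utilization factor is just a guess):

```python
def transfer_days(data_tb: float, link_mbps: float, efficiency: float = 0.7) -> float:
    """Rough days needed to push data_tb terabytes over a link_mbps line.

    efficiency accounts for protocol overhead and the line not being
    dedicated to the sync (0.7 is an assumption, not a measurement).
    """
    bits = data_tb * 1e12 * 8                      # decimal TB -> bits
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 86400

# Seeding 50 TB over the 250 Mbps line at ~70% utilization:
print(f"{transfer_days(50, 250):.0f} days")        # roughly 26 days
```

So the first full copy is close to a month of saturating the link, which is why people either seed the NAS locally and drive it over, or only sync the truly critical subset.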

I saw a thread from 2016 on here that was planning to use DeltaCopy. I've not done this type of project before, so I'm trying to get some advice from more experienced minds.

Thanks!
 
What I never got too happy with was good performance with backup apps on NAS units, as far as finding a good balance of delta/inverse chains for efficiency...well, I just never found that. Haven't looked lately. Most backup apps that seemed to work well were more of the full-image type. And mirroring a full image offsite on a daily basis is bandwidth hungry.

Perhaps a different approach to offsite this "size" thing...what about moving a lot of the file storage to O365 SharePoint/Teams? A church would qualify for the deep discounts of the non-profit editions, like E3 for non-profits; toss in Advanced Threat Protection for non-profits too...still come out at a very low cost.
 
I will present these options to my IT Director. He asked me to pursue the idea of the NAS devices prior to this thread.

StoneCat, that suggestion is very good. Maybe we can do a hybrid, as we already have O365 E3, we do utilize SharePoint, and we are starting to learn more about Teams.

The catch on that might be that the college was previously set up with GSuite, so all of their email etc goes through that.
 
We use two Synology NASs to keep redundant copies in two places. But rather than using a nightly backup, ours is all done on the fly using Synology's Drive function, which is basically like a Dropbox the size of your NAS. When a file is created on a workstation, it's instantly backed up to the local NAS, and in a minute or two it is backed up to the remote site, which is a target of the Drive as well.

It's great too for files that need to be shared locally. We don't use any mapped network drives anymore. Everyone has a local copy of the files right on their station so they open instantly and don't have to wait for the network to send the file. Changes get synced between everyone very quickly and backups are instant.
 
As of now, we do have network drives on various stations. We also map users' documents folders to a server with a folder for them on it. We've got many of them trained that if they save to their docs, it's on the server. So I could see us doing the same scenario with a unit like this and then have it sync with the other NAS at night or something.
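Since it's a Windows shop, a nightly sync of those server folders to a NAS like this could be as simple as a scheduled robocopy mirror. A minimal sketch that just builds the command (the share paths are made up; note that `/MIR` deletes files at the destination that no longer exist at the source, so test it on a throwaway share first):

```python
import subprocess

# Hypothetical paths -- substitute your real server share and NAS share.
SOURCE = r"\\fileserver\userdocs"
DEST = r"\\nas01\backup\userdocs"

def build_robocopy_cmd(source: str, dest: str) -> list[str]:
    """Build a robocopy mirror command: copy only changed files,
    retry briefly on errors, and append to a log file."""
    return [
        "robocopy", source, dest,
        "/MIR",            # mirror source -> dest, including deletions
        "/Z",              # restartable mode, useful over a WAN/VPN link
        "/R:2", "/W:5",    # 2 retries, 5 s apart, instead of the huge defaults
        "/LOG+:C:\\logs\\nas-sync.log",
    ]

cmd = build_robocopy_cmd(SOURCE, DEST)
# subprocess.run(cmd)   # uncomment on the actual server / in Task Scheduler
print(" ".join(cmd))
```

Dropped into Task Scheduler on the file server, that's effectively what DeltaCopy-style setups do, minus rsync's sub-file delta transfers.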

Another question, I think I've read this, but on this unit, does it allow you to daisy chain 2 or more units together? That is something I think we'd be interested in for the future.

Also, we looked, and our sites are about 13 miles apart. Do you think this is far enough away? A tornado theoretically should not hit both sites, but an earthquake could be a possibility, I suppose.
 
There is a reason big companies like Amazon have data centers geographically located around the globe.

If 13 miles is all you have, then work with it; if you can go further, go for it.

 
What is your budget? That drives a lot of the decision, I would expect. You could do an entire server setup at the second site and replicate everything. You could use something like RapidRecovery (used to be AppAssure). I agree with @YeOldeStonecat that you have a ton of data, so I wouldn't expect a NAS to be the target. Depends on the bandwidth at the other end, I suppose, but still.

How much of that data changes every day? You could set up a computer onsite to do the sync and monitor the process to find out. 50TB with only a couple of gigs of delta each day is easy; 50TB with 20TB of delta each day is...impossible, or at the very least very expensive.
 
We use two Synology NASs to keep redundant copies in two places. But rather than using a nightly backup, ours is all done on the fly using Synology's Drive function, which is basically like a Dropbox the size of your NAS. When a file is created on a workstation, it's instantly backed up to the local NAS, and in a minute or two it is backed up to the remote site, which is a target of the Drive as well.

It's great too for files that need to be shared locally. We don't use any mapped network drives anymore. Everyone has a local copy of the files right on their station so they open instantly and don't have to wait for the network to send the file. Changes get synced between everyone very quickly and backups are instant.

I should set this up to play with it a bit. Can you tell me how it handles simultaneous access attempts to the same file?
 
As of now, we do have network drives on various stations. We also map users' documents folders to a server with a folder for them on it. We've got many of them trained that if they save to their docs, it's on the server. So I could see us doing the same scenario with a unit like this and then have it sync with the other NAS at night or something.


With an update to the OneDrive4Biz sync client last year, they added the default user library folders of Docs/Desktop/Pics as an option under a new 5th tab. You put a check in there and now you have OneDrive4Biz doing the standard "Folder Redirections" for workstations/laptops.

That can relieve a server and backup from all of that space. Users can log into different computers now and have the same experience as old folder redirection..all their stuff is available no matter where they log in.
If you already have O365.....just sayin'.....here's another option to probably remove a good handful of TB's from your backup worries...as O365 does 1TB per user.
 
Ok. My director is out today, but when he's back, possibly Monday, I'll run this by him. We tried OneDrive a couple of years ago, but what I seem to remember running into was a limit of 20,000 files. We have some with more than that, so in that instance at least, Dropbox was a better solution for us. But that may have been 4 years or so ago now.
 
I should set this up to play with it a bit. Can you tell me how it handles simultaneous access attempts to the same file?

If two people simultaneously open the same file, it'll create a duplicate file so it keeps both versions, and it'll rename the duplicate and add on the user name of the user who last saved the file that conflicted. It's probably not the best setup if you're expecting to have a file opened by multiple people at the same time. But we only have a few office docs that are regularly used by multiple people, and we keep those in OneDrive instead.
 
Ok. My director is out today, but when he's back, possibly Monday, I'll run this by him. We tried OneDrive a couple of years ago, but what I seem to remember running into was a limit of 20,000 files. We have some with more than that, so in that instance at least, Dropbox was a better solution for us. But that may have been 4 years or so ago now.

Here's the current FAQ for OD.

https://support.office.com/en-us/ar...8f5-b3d2-eb39e07630fa?ui=en-US&rs=en-US&ad=US
 
I was going to suggest something online, like AWS..... but the more I hear about O365 supporting "folder redirection" support for the basic locations, the more interested I become in that. Sounds like a great way to get the best of both worlds. Users are no longer tied to any particular workstation, folder redirection just works, and you have your offsite backup (not offline though).

Does the O365 one drive support file versioning? Retention configuration options?
 
Resurrecting an old thread here, with a little different discussion. One of our divisions wants to do offsite backup of existing files. They also have a lot of other data that they would like to migrate from books/DVDs/CDs/cassettes, etc. over to digital format and back up. I think they have stated before they would like to expand to about 100TB of space, possibly. Their current server is a Dell PowerEdge R540 with, I think, 2 SSDs and 6 8TB drives, so they have a total data area of about 30TB. However, more 8TB drives via Dell are close to $1,100.00 EACH, which seems high to me, and I think they'd only get to about 64TB if memory serves.

Talked to a Dell rep, and they are recommending a SAN device, I think: specifically a Dell EMC ME4012, it appears, with 4 480GB SSDs and 8 12TB drives. But the price is $18,950.

I don't see this division spending that kind of money. Although who knows. Just wanting to give another option, I wonder about something like this.

https://www.amazon.com/Synology-Bay...teway&sprefix=synology+ds,aps,178&sr=8-4&th=1

And then for storage, maybe 12 of these

https://www.amazon.com/Red-10TB-NAS...western+digital+red+10,electronics,213&sr=1-3

That would give them 100TB on that box.
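Worth double-checking the usable number: 12 x 10TB is 120TB raw, and what's left depends on the redundancy level. A quick sketch (assuming SHR-2 on equal-size drives behaves roughly like RAID 6, i.e. two drives' worth of parity, and ignoring filesystem overhead):

```python
def usable_tb(drives: int, size_tb: float, parity_drives: int = 2) -> float:
    """Raw capacity minus parity drives, ignoring filesystem overhead."""
    return (drives - parity_drives) * size_tb

print(usable_tb(12, 10))       # RAID 6 / SHR-2: 100.0 TB
print(usable_tb(12, 10, 1))    # RAID 5 / SHR-1: 110.0 TB
```

So the 100TB figure holds with two-drive redundancy, which is what you'd want on an array that size.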

As far as offsite backups are concerned, we are actually meeting with them tomorrow. A consultant is coming in also. It appears the consultant will push the idea of going to Amazon Glacier, seeding the initial backup with what they call the AWS Snowball, which from what I read is effectively a large encrypted storage device that they send you, you back your data up to it, and mail it back. The consultant previously suggested via email that this could be done about 4 times or so per year.
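For the meeting, it may help to have rough Glacier numbers in hand. A sketch of the monthly storage math (the $/GB-month rate below is a placeholder assumption; check current AWS pricing for the region and Glacier tier, and remember retrieval and request fees are extra):

```python
def monthly_storage_cost(data_tb: float, usd_per_gb_month: float) -> float:
    """Monthly archive storage cost, ignoring requests and retrievals."""
    return data_tb * 1000 * usd_per_gb_month

# Assumed rate of $0.004/GB-month -- verify against the AWS price list.
print(f"${monthly_storage_cost(100, 0.004):,.2f}/month for 100 TB")
```

Even at archive-tier rates, 100TB is a recurring line item, which is worth putting next to the one-time hardware quotes when comparing options.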

Thoughts on the NAS device or are there better choices? Since I know they want to expand storage and we are already meeting about offsite backups, it seems like a good time to bring up options for expanding their storage as well.
 
You might consider looking at /r/datahoarder.

I was going to suggest looking into the Backblaze-based pods, but the company that was selling the 60-drive versions appears to no longer exist (backuppods.com). https://www.45drives.com/ has prebuilt setups with their older design, and of course the plans are all available to make/have made yourself.

One relevant comment from datahoarder a couple of years ago is along the lines of "overpriced": https://www.reddit.com/r/DataHoarder/comments/6747zo/backblaze_storage_pod/ It notes that they were able to get an HGST 60-bay enclosure for only a few hundred; my search for "60 bay hgst" on eBay looks more like $1,250 shipped without drives, or $2,700-3,700 with drives installed (age of drives not specified). The ~$3,700 listing only has one left, prepopulated with 60x Seagate 4TB SAS drives.

There are a bunch of other 24-60 bay enclosures that turned up as related results on that search as well.
 
Ok. This at least gives an idea. I know we have a meeting tomorrow with a couple of folks and some consultants. So it may be they just decide any data they have for long term just to pop into Glacier. Should be interesting if nothing else.
 
The PowerVault is undoubtedly the better option - it comes down to whether it's worth the additional cost.

This will be determined mainly by 2 things:

Performance
The PowerVault will wipe the floor with the Synology. But do you need it? Dell reps can run DPACK (now called Live Optics) to assess your current environment and how many IOPS you require. If it's just a big file share... probably not a lot.

Reliability
PowerVaults are built for high uptime. Dual controllers with MPIO mean you can have a controller fail and it keeps running. You can have a switch or NIC fail and it keeps running. You can have a PSU fail and it keeps running.

Dell can also offer an extended warranty on the PowerVault, up to 7 years with 4-hour NBD. Although that's going to cost you.


PS.
Yes, I typed half an answer and posted it by accident. But nobody seen that, right :cool:
 
Very interesting. Thank you for all of the info. I know we typically get a 3-year warranty on standard Dell systems. The big thing will be the offsite backup, as I think that is the big push at the moment; we are in Missouri, so a tornado wiping out the building is a real possibility.
 
I'd start with management providing a wishlist and a wishful-thinking budget. Then you can build options with the price and technology tradeoffs.
 