After reading this post I remembered that I tested Data Protection Manager back when it was still in beta. I never thought much more about it; I use DFS-R for my backup/data collection needs now.
How, you ask? Well, I guess I was thinking a bit outside the box when I designed this "backup" solution.
Let me start with the production server: I have roughly 800GB of data that I have to back up daily and keep for at least six months — five months in weekly snapshots, and the most recent 30 days in 12-hour snapshots.
So I set up shadow copies on the production server, one at 11:30am and one at 04:00am. That gives me my 30 days of fine-grained "backups".
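If you prefer scripting this over clicking through the Shadow Copies tab, the two daily snapshots can be scheduled with `schtasks` calling `vssadmin` — a sketch, assuming the data lives on `D:` (the drive letter and task names are mine, adjust to taste):

```shell
:: Take a shadow copy of the data volume right now, as a sanity check
vssadmin create shadow /for=D:

:: Schedule the two daily snapshots at 04:00 and 11:30
:: (task names and the D: volume are assumptions for this example)
schtasks /create /tn "VSS-Daily-0400" /tr "vssadmin create shadow /for=D:" /sc daily /st 04:00 /ru SYSTEM
schtasks /create /tn "VSS-Daily-1130" /tr "vssadmin create shadow /for=D:" /sc daily /st 11:30 /ru SYSTEM
```

With the VSS limit of 64 snapshots per volume, two a day still fits inside a 30-day window (60 snapshots) with a little room to spare.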
For obvious reasons I wanted the long-term backup to be available at all times and easily restorable.
This is where DFS-R comes in. I replicate over Gigabit Ethernet to my backup server, throttled to 128 Mbit/s during work hours and using the full pipe during off-peak hours.
The backup server creates a shadow copy once a week, giving me my 6+ months of backups.
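To verify that the backup server actually keeps up with the daily change rate, `dfsrdiag` can report the replication backlog between the two members. A sketch — the replication group, folder, and server names here are placeholders:

```shell
:: Show how many files are still waiting to replicate from PROD to BACKUP
:: (rgname, rfname, and member names are placeholders for this example)
dfsrdiag backlog /rgname:BackupRG /rfname:Data /smem:PROD /rmem:BACKUP
```

If the backlog keeps growing instead of draining overnight, the off-peak window or the throttle needs adjusting.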
Things to consider:
The DFS-R staging folders on both servers are on their own physical RAID 10 arrays, as are the shadow copies.
The staging quota is set to 300GB; shadow copy storage is set to 300GB on the production server and 500GB on the backup server (which accumulates more changes over time).
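Both quotas can be set from the command line as well. Something like the following, assuming the data sits on `D:` with the shadow copy diff area on a dedicated `E:` array, and the same placeholder DFS-R names as above (flag details may vary by Windows version, so treat this as a sketch):

```shell
:: Cap the shadow copy diff area at 300GB on the production server
:: (use /maxsize=500GB on the backup server instead)
vssadmin resize shadowstorage /for=D: /on=E: /maxsize=300GB

:: Raise the DFS-R staging quota to 300GB (the value is in MB;
:: group/folder/member names are placeholders)
dfsradmin membership set /rgname:BackupRG /rfname:Data /memname:PROD /stagingsize:307200
```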
All in all I can say I really trust this solution. Yes, it's expensive but I know it's there when I need it 🙂
Time to see if DPM can catch up with that level of convenience.