DR copy - Best way to move a large set of data to DR environment

  • DR copy - Best way to move a large set of data to DR environment
    Posted: 11-13-2015, 4:40 PM

    What I am trying to achieve is this:

    I have a primary copy that takes the backup of what we call "production" data. It covers a production SQL database and the associated Windows-based files (documents such as PDFs).

    I then have an aux copy I call the DR copy. It uses the media agent in our DR site to perform the aux copy and stores the data on the DR library.

    The production SQL data is 198 GB and the production files are 2.2 TB. The SQL copy took about a week, with me setting the job to kill and restart every 2 hours. (The reason for the 2-hour restarts was that we take hourly incrementals of these items, and I wanted to keep up with the incrementals at the same time so the copy did not get backlogged and take days to weeks.) That worked for the production SQL; it is now up to date.

    The production documents, on the other hand, are too much to handle this way. I am guessing it would take months, and I wanted to know if there is a better solution.
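
    A back-of-the-envelope estimate shows why the document copy backlogs over the WAN. A minimal sketch, assuming a 50 Mbps link at 70% utilization (both figures are illustrative, not the actual line):

        # Rough transfer-time estimate for the two datasets over a WAN link.
        # Link speed and utilization are illustrative assumptions.

        def transfer_days(data_tb, link_mbps, utilization=0.7):
            """Days to push data_tb terabytes over a link_mbps link."""
            data_bits = data_tb * 1e12 * 8               # TB -> bits
            effective_bps = link_mbps * 1e6 * utilization
            return data_bits / effective_bps / 86400     # seconds -> days

        print(f"SQL, 198 GB:   {transfer_days(0.198, 50):.1f} days")  # ~0.5 days
        print(f"Files, 2.2 TB: {transfer_days(2.2, 50):.1f} days")    # ~5.8 days

    Even that ignores the hourly incrementals competing for the same link, which is what stretches days into weeks.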

     

    Next week I plan to make a trip to the DR facility for other maintenance, and I figured this would be a good time to see if there is an option.

    Something like making an aux copy to a USB HDD and restoring it to the library, but I am just not sure how to go about something like that.

     

    I do take weekly backups of this to tape (USB HDD) already, on Sundays at 4 AM, so I may already have what I need.

     

    Any suggestions?

    Thanks,

    Ian

  • Re: DR copy - Best way to move a large set of data to DR environment
    Posted: 11-15-2015, 11:57 PM

    Hi Ian,

     

    What you're looking for is the manual seeding process. Please see the general steps here:

    http://documentation.commvault.com/commvault/v10/article?p=features/deduplication/manualseedingprocess.htm

     

    Essentially, you add a USB HDD as a new library, DASH copy your data onto it as a new SP copy under the same SP, then take it to your DR location.

     

    In the DR location, you want to do the following, in order:

    1. Add a new primary sharing folder from the DR MA to this USB HDD library.

    2. Set your DR SP copy to use the USB HDD copy as the source copy.

    3. Run the aux copy, setting the source MA as the DR MA (although it should default to this anyway due to the LAN-free code).

     

    This will read the data on the USB HDD and locally copy it into the DR SP copy. Once this aux copy is done (it should be much faster, since you don't have the WAN link bottleneck), change the source copy of the DR copy back to the existing production Primary copy.

     

    New aux copies will now be able to run faster due to DASH copy optimizations in place.
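
    For readers wondering what the DASH optimization actually buys: a DASH copy compares block signatures against the destination DDB and ships only blocks the destination does not already hold. A minimal sketch of the idea in Python, with an assumed SHA-256 signature and an in-memory set standing in for the real DDB:

        import hashlib

        def dash_copy(source_blocks, destination_signatures, wan_send):
            """Ship only blocks whose signature the destination lacks."""
            sent = skipped = 0
            for block in source_blocks:
                sig = hashlib.sha256(block).hexdigest()  # stand-in signature
                if sig in destination_signatures:
                    skipped += 1       # destination has it: send a reference only
                else:
                    wan_send(block)    # only unique data crosses the WAN
                    destination_signatures.add(sig)
                    sent += 1
            return sent, skipped

        # After seeding, the destination already knows most signatures:
        ddb = {hashlib.sha256(b"A" * 1024).hexdigest()}
        blocks = [b"A" * 1024, b"B" * 1024, b"A" * 1024]
        print(dash_copy(blocks, ddb, wan_send=lambda b: None))  # (1, 2)

    Once the USB-seeded data has populated the DR DDB, almost every signature lookup hits, which is why the steady-state incrementals become cheap.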

     

    Let me know if this is the solution for you.

     

    Thank you

  • Re: DR copy - Best way to move a large set of data to DR environment
    Posted: 11-16-2015, 7:44 AM

    Thank you for this information. I will look into this and will probably perform these actions on Wed/Thu this week, because that is the day I travel to our DR environment.

     

    I will keep you posted as to how this goes.

    Ian

  • Re: DR copy - Best way to move a large set of data to DR environment
    Posted: 11-16-2015, 11:36 AM

    Just be sure to get fast disks. We tried this method with a 6 TB drive over USB 3 (http://www.amazon.com/Seagate-Desktop-External-Storage-STDT6000100/dp/B00R1P2WDK) and the throughput was so slow that it actually ran slower than our DASH copies to the DR site. We ended up relocating our DR MA to a site with a faster link to get it caught up, and then took it back. :)
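
    The break-even is easy to estimate: seeding over USB only wins if the drive's sustained throughput beats the DASH copy's effective logical throughput. A rough check, where the link speed, dedupe ratio, and drive speeds are all illustrative assumptions:

        # DASH ships only unique blocks, so its logical throughput is roughly
        # link speed times the dedupe ratio. All figures are assumptions.
        wan_mbps = 50
        dedupe_ratio = 8           # logical bytes per unique byte actually sent
        dash_logical_mbs = wan_mbps / 8 * dedupe_ratio   # = 50 MB/s logical

        for drive_mbs in (30, 60, 120):  # assumed sustained drive speeds
            verdict = "wins" if drive_mbs > dash_logical_mbs else "loses"
            print(f"USB at {drive_mbs} MB/s {verdict} vs DASH at {dash_logical_mbs:.0f} MB/s")

    A consumer external drive that sustains only 30 MB/s loses that race, which matches the experience above.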

  • Re: DR copy - Best way to move a large set of data to DR environment
    Posted: 11-16-2015, 3:56 PM

    Will I want to make my USB-based copy deduped, or will it not matter for this particular case?

     

    I also want to throw out there that the source data I wish to copy gets a daily full/synthetic full (depending on whether it is SQL data or files) at midnight, and hourly incrementals thereafter.

    I was going to take a manual full/synthetic full before making the aux copy to the USB library; I just wanted to know how to handle the incrementals. For seeding a few of these items, I would view the jobs on the DR copy and mark the incrementals "do not copy" until I got the full seed over.

     

    Any suggestions? I read in the documentation to stop all backups until you can seed your DR copy, but that is not really an option for me. It may also be helpful to know that my DR location is about 30-40 minutes away.

     

    I will be making this copy over USB 3.0 because that is the only option I have at this time. The option to bring the box from DR to my primary facility has potential, but that has to be a last resort.

    Ian

  • Re: DR copy - Best way to move a large set of data to DR environment
    Posted: 11-16-2015, 4:07 PM

    Hi Ian,

     

    If your total dataset is not that large and fits on a USB HDD, then it may be worthwhile not using dedupe: it is much faster to read non-deduped data off disk because of its contiguous large-block IO profile, versus the more random small-block IO profile required for dedupe data reads.
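
    A rough model of that IO difference on a spinning USB drive: every random block read pays a seek, so dedupe-style reads can run an order of magnitude slower than a sequential scan. Seek time, block size, and sequential rate below are illustrative assumptions:

        # Sequential (non-dedupe) vs. random (dedupe) read throughput estimate.
        seq_mbs = 120          # assumed sequential read rate of the USB HDD
        seek_ms = 12           # assumed average seek + rotational latency
        block_kb = 128         # assumed dedupe block size

        block_time_s = seek_ms / 1000 + (block_kb / 1024) / seq_mbs
        random_mbs = (block_kb / 1024) / block_time_s

        print(f"Sequential read: {seq_mbs} MB/s")                     # 120 MB/s
        print(f"Random {block_kb} KB reads: {random_mbs:.0f} MB/s")   # ~10 MB/s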

    Thank you

  • Re: DR copy - Best way to move a large set of data to DR environment
    Posted: 11-16-2015, 4:14 PM

    Should I make the DDB local to the USB disk, or local to one of the media agents?

  • Re: DR copy - Best way to move a large set of data to DR environment
    Posted: 11-16-2015, 4:51 PM

    If you're going to dedupe to the USB, then leaving the DDB on the fast disks of the production MA is best. During the copy from USB to DR, the source DDB will not be utilized; only the DR DDB will be used.

  • Re: DR copy - Best way to move a large set of data to DR environment
    Posted: 11-17-2015, 10:32 AM

    Seeding an Aux copy with CacheDB

    Copies in the storage policy:

    1. Primary

    2. Seed Aux Copy with deduplication

    3. Aux Copy (destination)

    Please follow the steps below:

    1. Insert the UseCacheDB and UseAuxcopyReadlessPlus settings on the source media agent for the primary copy.

    2. Create a library for the seed copy using the portable storage device, or whatever device will be moved to the remote location for seeding. (This could also be cloud storage.)

    3. Create a Seed Aux copy with deduplication. It will use the source media agent and the seed library created in step 2 for the data path. This copy can also be set to disk-read optimized for faster transfer.

    Note: Creating the seed copy with deduplication is the preferred method, because it generates the CV_CLDB folder on the source media agent, which will be used for moving the cache later on.

    4. Run the seed copy until completion.

    5. Configure the Aux Copy (destination) with deduplication, using the destination media agent and destination library in the data path. Also set the copy policy to use the seed copy as the source.

    6. Move the portable storage to the remote site.

    7. Once the storage is ready for access, run the secondary aux copy until complete.

    8. Go back to the copy policy on the Aux Copy (destination) and uncheck "specify source" on the copy policy.

    9. Do not run any new jobs yet.

    10. Go into the Job Results directory and copy the aux copy folder from CV_CLDB back to the source media agent in the same directory. (Stop all services before doing this.)

    11. Change the destination copy, on the deduplication tab, to use network optimized.

    12. Test-run the secondary copy again to check performance.

    13. If there are no cache errors, delete the seed copy, then deconfigure and remove the seed library (if it is not needed for future use).

    Considerations

    1. If the seed copy is configured WITHOUT deduplication, the CV_CLDB folder will not be created in the source media agent's Job Results directory.

    2. Based on past incidents, if the seed copy does not have deduplication but the destination copy does, the CV_CLDB folder can still be created and used manually: it is located in the same directory on the destination and can be copied over to the source directory.
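
    To make the cache's role concrete: with a client-side signature cache like CV_CLDB, the source media agent can answer most "does the destination already have this block?" questions locally instead of paying a WAN round trip per signature. A minimal sketch of the idea; the class and method names are illustrative, not the actual CV_CLDB format:

        class SignatureCache:
            """Local cache of signatures known to exist at the destination."""
            def __init__(self):
                self._known = set()

            def exists_remotely(self, signature, remote_lookup):
                if signature in self._known:
                    return True                 # answered locally, no WAN round trip
                if remote_lookup(signature):    # fall back to querying the DR DDB
                    self._known.add(signature)
                    return True
                return False

    Copying the seeded cache back to the source media agent (step 10) pre-populates that local set, so the first network-optimized aux copy is not throttled by per-signature lookups against the remote DDB.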
