Deduplication databases are too sensitive and difficult to manage

Last post 09-13-2010, 11:01 PM by AaronA. 8 replies.
  • Deduplication databases are too sensitive and difficult to manage
    Posted: 08-10-2010, 10:49 AM

    Please change the dedupe databases to be more resilient to accidental shutdowns and process crashes.

     

    Also, add the ability to schedule dedupe maintenance jobs (compress and re-index).

  • Re: Deduplication databases are too sensitive and difficult to manage
    Posted: 08-12-2010, 10:42 AM

    Good morning Bolsen,

    Customer Modification Requests have been created to work on the accidental shutdown issues and the ability to compress and re-index the deduplication database.

    In the meantime, please disable automatic updates and use the procedure in Books Online (BOL) on how to reboot a MediaAgent with deduplication.

    Rebooting a MediaAgent

    You might reboot a MediaAgent for installing updates or maintenance purposes. For MediaAgents controlling the deduplication database, you will have to ensure that all the deduplication transactions in the memory are completed before rebooting. Failure to follow the recommendations might result in sealing of the deduplication store, which will increase the amount of storage space consumed in the primary disk library. See Rebooting a MediaAgent Hosting the Deduplication Store for step-by-step instructions.

    http://documentation.commvault.com/commvault/release_8_0_0/books_online_1/english_us/features/single_instance/single_instance.htm#Rebooting_a_MediaAgent

    AaronA
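The precaution described above boils down to draining in-memory deduplication transactions before shutting down, so the store does not get sealed. A minimal sketch of that idea in Python follows; this is a hypothetical model for illustration, not Commvault's actual shutdown procedure, and all names (`DedupDB`, `flush`, etc.) are invented:

```python
# Hypothetical model of the "flush before reboot" precaution described above.
# Not Commvault code; class and method names are illustrative only.

class DedupDB:
    def __init__(self):
        # Transactions held in memory that have not yet been committed to disk.
        self.pending_transactions = ["tx1", "tx2"]
        self.sealed = False

    def flush(self):
        """Safe path: commit every in-memory transaction before reboot."""
        while self.pending_transactions:
            self.pending_transactions.pop(0)   # commit each tx to disk

    def shutdown(self):
        # Rebooting with pending transactions loses them, so the store must
        # be sealed and dedupe starts over -- consuming more primary storage.
        if self.pending_transactions:
            self.sealed = True

db = DedupDB()
db.flush()          # drain in-memory transactions first
db.shutdown()       # nothing pending, so the store stays open
print(db.sealed)    # False
```

Skipping the `flush()` call in this model is what an "accidental shutdown" corresponds to: the pending transactions are lost and the store gets sealed.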

  • Re: Deduplication databases are too sensitive and difficult to manage
    Posted: 08-18-2010, 9:45 AM

    Hi Aaron,

    Is deduplication of data limited to data located within a single store? For example, if I configure my SP to create a new store every 60 days but retain data for one year, does that mean the dedupe data will ONLY be within each store?

    If so, then the above-mentioned accidental "seal" would really hose up storage forecasting in some instances - with any luck the fix will be released soon.

  • Re: Deduplication databases are too sensitive and difficult to manage
    Posted: 08-24-2010, 9:10 AM

    Yes Jim,

    When a store is open, all backups associated with the storage policy use that store to find common blocks. When the store seals, it will no longer be referenced by backups and is only used for data aging. A new store will be created and used to find common blocks.

    AaronA.
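The store lifecycle described in that answer can be sketched as a small model: open stores are consulted for common blocks, sealed stores only age out, and a new store starts empty. This is purely illustrative Python, not Commvault code; the class and method names are assumptions:

```python
# Hypothetical sketch of the dedupe store lifecycle described above.
# Not Commvault code; DedupStore and StoragePolicy are illustrative names.

class DedupStore:
    def __init__(self, store_id):
        self.store_id = store_id
        self.sealed = False
        self.blocks = {}           # block hash -> reference count

    def lookup_or_add(self, block_hash):
        """Open stores are consulted to find common blocks for new backups."""
        if self.sealed:
            raise RuntimeError("sealed stores are only used for data aging")
        self.blocks[block_hash] = self.blocks.get(block_hash, 0) + 1
        return self.blocks[block_hash] > 1   # True if the block deduplicated

class StoragePolicy:
    def __init__(self):
        self.stores = [DedupStore(1)]

    @property
    def active_store(self):
        return self.stores[-1]

    def seal_active_store(self):
        # After a seal, backups start over with an empty store, so previously
        # common blocks are written again -- more space on the primary library.
        self.active_store.sealed = True
        self.stores.append(DedupStore(len(self.stores) + 1))

sp = StoragePolicy()
sp.active_store.lookup_or_add("blockA")          # first copy: stored in full
print(sp.active_store.lookup_or_add("blockA"))   # True: deduplicated
sp.seal_active_store()
print(sp.active_store.lookup_or_add("blockA"))   # False: written again in new store
```

The last line is exactly the storage-forecasting problem raised earlier in the thread: every seal, accidental or scheduled, resets which blocks count as "common".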

     

  • Re: Deduplication databases are too sensitive and difficult to manage
    Posted: 09-08-2010, 8:08 PM

    bolsen:

    Please change the dedupe databases to be more resilient to accidental shutdowns and process crashes.

     

    Also, add the ability to schedule dedupe maintenance jobs (compress and re-index).

    Would love to see these changes in V9 also :)

  • Re: Deduplication databases are too sensitive and difficult to manage
    Posted: 09-09-2010, 10:32 PM

    Bolsen,

    I created a CMR for this a while ago, as I feel it's a good idea. Unfortunately it's a major code change and will be slated for a future release.

    AaronA

  • Re: Deduplication databases are too sensitive and difficult to manage
    Posted: 09-10-2010, 11:10 AM

    When you say "code change" and "slated for a future release" are you saying Simpana 9 or 10?

  • Re: Deduplication databases are too sensitive and difficult to manage
    Posted: 09-13-2010, 10:55 PM

    Hi there,

    Well, it must be a major release then. It does not appear to have changed in Version 9, although with Global Deduplication now being used it is potentially less of an issue.

    http://documentation.commvault.com/commvault/release_9_0_0/books_online_1/english_us/prod_info/features.htm?var1=http://documentation.commvault.com/commvault/release_9_0_0/books_online_1/english_us/features/single_instance/single_instance.htm

    Above is the Version 9 deduplication documentation; it is very similar to Version 8.

    From what I understand of Version 9, a sub-copy of the deduplication database is copied out to all of the agents that have agent-level dedupe enabled (cached dedupe). The agent then uses that reference database, so shutting down a MA during this process would have less impact. But I guess we will see next month when V9 is released.
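A rough sketch of how such an agent-side cache might behave follows. This is speculation in code form, matching the speculative description above: the names (`AgentDedupeCache`, `is_duplicate`) and behavior are assumptions for illustration, not the actual Simpana 9 implementation:

```python
# Purely illustrative sketch of agent-side cached dedupe as described above.
# Names and behavior are assumptions, not the actual Simpana 9 design.

class AgentDedupeCache:
    """Local reference sub-copy of the dedupe database held on a client."""

    def __init__(self, seed_hashes):
        # Sub-copy of known block hashes pushed out from the MediaAgent.
        self.known_hashes = set(seed_hashes)

    def is_duplicate(self, block_hash):
        # The agent consults its local cache first, so a brief MediaAgent
        # outage does not stall every single lookup.
        return block_hash in self.known_hashes

    def record(self, block_hash):
        self.known_hashes.add(block_hash)

cache = AgentDedupeCache(seed_hashes={"h1", "h2"})
print(cache.is_duplicate("h1"))   # True: resolved locally, no MA round-trip
print(cache.is_duplicate("h9"))   # False: block would still be sent
```

In a model like this, the MediaAgent's database remains the authority; the cache only reduces how often the client must ask it, which is why a MediaAgent shutdown would be less disruptive.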

    Andreas

  • Re: Deduplication databases are too sensitive and difficult to manage
    Posted: 09-13-2010, 11:01 PM

    Bolsen,

    I'm unable to provide specific details regarding future releases.

     

    AaronA

     
