HDFS integration failed

Last post 10-08-2019, 11:38 AM by lokes. 10 replies.
  • HDFS integration failed
    Posted: 10-07-2019, 10:10 AM

    Hello All,

    I'm trying to integrate HDFS (a Cloudera distribution) at one of our customers. There are 3 data nodes and 1 edge node in the configuration. We installed the Hadoop agent on all the clients and configured a pseudo-client for HDFS. When we try to browse, we don't see anything. We saw some errors in the log files, but we could not interpret them.

    Is it possible for an HDFS expert to check the log files?

    Best Regards.

    Attachment: hdfs.txt
  • Re: HDFS integration failed
    Posted: 10-07-2019, 11:51 AM

    Hello Omer,

    Please send the error from the log files; I can check and help.

    Thanks,
    Lokes

  • Re: HDFS integration failed
    Posted: 10-07-2019, 1:08 PM
    Hi Lokes,

    Thank you for your interest. I have attached the error log that I saw on the client. Could you please check it?

    Best Regards.
  • Re: HDFS integration failed
    Posted: 10-07-2019, 2:48 PM

    Hello Omer,

    From the logs, it looks like the username is empty in the Hadoop instance properties in the CommCell GUI.

    Please check that the HDFS user is set correctly in the instance properties. Also set the HDFS URI back to its default, so it is picked up automatically from the machine's configuration files.

    (Attached a screenshot of the instance properties screen for reference.)

    Attachment: hadoop.JPG
  • Re: HDFS integration failed
    Posted: 10-07-2019, 2:50 PM

    Also, please restart the Commvault services on the configured master node after updating the properties.

  • Re: HDFS integration failed
    Posted: 10-07-2019, 3:41 PM

    Hello Lokes,

    Actually, we tried different users but got the same error every time. In addition, according to the documentation the user field is optional, so we left it empty on the last attempt. I will try again with your suggestion and report the result. If we get the error again, I will share screenshots of the current log and configuration.

    Best Regards.

  • Re: HDFS integration failed
    Posted: 10-08-2019, 4:45 AM

    Hi Lokes,

    We changed the HDFS URI as you suggested. After that we tried to browse the content again, but got the same result: we couldn't see anything. I have attached a fresh log cut. Could you please check it?

    Best Regards.

    Attachment: hdfs.txt
  • Re: HDFS integration failed
    Posted: 10-08-2019, 8:20 AM

    Hi Lokes,

    I have attached the error cuts from the latest logs. Could you please check them?

    Best Regards.

    Attachment: hdfs.txt
  • Re: HDFS integration failed
    Posted: 10-08-2019, 10:59 AM

    Hello Omer,

    It seems the user error is resolved now. The next problem is that we are unable to get the proper keytab file for the hdfs user. Please make sure the keytab file is present at that location and that we are able to log in as the hdfs user using that keytab.

    http://documentation.commvault.com/commvault/v11/article?p=30706.htm

    Step 4: In Secure Hadoop Environments, Provide the Keytab File Location in the Configuration File on Data Access Nodes

    For Kerberos authentication, a keytab file is used to authenticate to the Key Distribution Center (KDC). Add the keytab file location as a property in the hdfs-site.xml configuration file on all data access nodes, including the master node. The hdfs-site.xml file is located under the hadoop_installation_directory/conf/ directory.

    Example:

    <property>
      <name>hadoop.user.keytab.file</name>
      <value>/etc/krb5.keytab</value>
    </property>
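    If you want to confirm which keytab path a given hdfs-site.xml actually declares, a small parser can be sketched. This is only an illustrative helper (the function name is made up, and the default config path assumes a Cloudera-style layout; adjust for your distribution):

    ```python
    # Hypothetical helper: report which keytab path the
    # hadoop.user.keytab.file property points at in a given hdfs-site.xml.
    # The default conf path is an assumption (Cloudera-style layout).
    import xml.etree.ElementTree as ET

    def keytab_location(conf_path="/etc/hadoop/conf/hdfs-site.xml"):
        """Return the value of hadoop.user.keytab.file, or None if the
        property is missing from the file."""
        root = ET.parse(conf_path).getroot()
        for prop in root.iter("property"):
            if prop.findtext("name") == "hadoop.user.keytab.file":
                return prop.findtext("value")
        return None
    ```

    Run against a file containing the example property above, this would return /etc/krb5.keytab; a None result means the property was never added on that node.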

  • Re: HDFS integration failed
    Posted: 10-08-2019, 11:30 AM
    Hi Lokes,

    According to the Hadoop admin, the keytab file is present in that location, but we are still getting this error. Is there a way to see which location the Commvault agent is checking?
  • Re: HDFS integration failed
    Posted: 10-08-2019, 11:38 AM

    Please check that the property hadoop.user.keytab.file is present, with the correct keytab file location, in the configuration file below.

    /etc/hadoop/conf/hdfs-site.xml

    Example:
    <property>
      <name>hadoop.user.keytab.file</name>
      <value>/etc/krb5.keytab</value>
      <description>hadoop user keytab file</description>
    </property>

    After that, run the kinit command against that keytab and make sure you can log in as the hdfs user.

    Example:

    kinit -kt /etc/krb5.keytab hdfs
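    If this check has to be repeated across several data access nodes, the kinit step can be scripted. A minimal sketch, where the keytab path and principal are just the example values from this thread:

    ```python
    # Minimal sketch: build the kinit command from the example above and run
    # it only if kinit is actually installed on this host. The keytab path
    # and principal are the example values from this thread; adjust as needed.
    import shutil
    import subprocess

    def kinit_cmd(keytab="/etc/krb5.keytab", principal="hdfs"):
        """Return the kinit invocation that logs in as `principal` using `keytab`."""
        return ["kinit", "-kt", keytab, principal]

    cmd = kinit_cmd()
    print(" ".join(cmd))

    if shutil.which("kinit"):  # skip gracefully on hosts without Kerberos tools
        result = subprocess.run(cmd, capture_output=True, text=True)
        print("login OK" if result.returncode == 0 else result.stderr.strip())
    ```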
     
     
    If you are still getting errors, I can troubleshoot live to resolve this faster.