Case 2: Secure MapR and HDFS Clusters

Describes how to configure access between a secure MapR cluster and a secure HDFS cluster.

About this task

If the MapR and HDFS clusters are both secure, complete the following steps:

Procedure

  1. Verify that the hadoop.rpc.protection configuration parameter is set to the same value on the MapR and HDFS clusters. This parameter takes one of three values: authentication, integrity, or privacy. The default value for a MapR cluster with security features enabled is privacy. Configure this value in the $HADOOP_HOME/conf/core-site.xml file. After changing this value, you must restart the Hadoop services on the node where you changed the configuration.
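    For example, a minimal core-site.xml entry that sets this parameter to its default secure value looks like the following; confirm that the value matches the setting on the HDFS cluster:
    <property>
      <name>hadoop.rpc.protection</name>
      <!-- Must be identical on the MapR and HDFS clusters -->
      <value>privacy</value>
    </property>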
  2. On each node in your MapR cluster that runs the JobTracker or JobClient, create the file $HADOOP_HOME/conf/mapred-site.xml with the following properties:
    <property> 
      <name>hadoop.security.authentication</name> 
      <value>kerberos</value> 
    </property> 
    <property> 
      <name>hadoop.security.authorization</name> 
      <value>true</value> 
    </property> 
    <property> 
      <name>dfs.namenode.kerberos.principal</name> 
      <!-- NameNode principal in HDFS cluster -->
      <value>hdfs/_HOST@<KERBEROS_REALM></value> 
    </property> 
    <property> 
      <name>dfs.web.authentication.kerberos.principal</name> 
      <value>HTTP/_HOST@<KERBEROS_REALM></value> 
    </property>
  3. For each JobTracker node in the MapR cluster:
    1. Create the mapr/<FQDN_OF_JOBTRACKER>@<KERBEROS_REALM> Kerberos principal in the same KDC as the one used by the HDFS cluster. Note that the username for the principal must be mapr.
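      For example, with an MIT Kerberos KDC the principal can be created with kadmin as sketched below; <ADMIN_PRINCIPAL> is a placeholder for your KDC administrator principal, and the exact commands depend on your KDC tooling:
      # Create the JobTracker principal in the KDC used by the HDFS cluster.
      # The username portion of the principal must be mapr.
      kadmin -p <ADMIN_PRINCIPAL> -q "addprinc -randkey mapr/<FQDN_OF_JOBTRACKER>@<KERBEROS_REALM>"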
    2. Generate a keytab file for this principal. Copy the keytab to the /opt/mapr/conf/ directory on the JobTracker node. Set the keytab file's ownership to mapr:mapr and its permission mode bits to 600.
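      For example, using MIT Kerberos kadmin and standard shell commands; this is a sketch, and the temporary path shown is a placeholder:
      # On the kadmin host: export a keytab for the principal.
      kadmin -p <ADMIN_PRINCIPAL> -q "ktadd -k /tmp/mapr.keytab mapr/<FQDN_OF_JOBTRACKER>@<KERBEROS_REALM>"
      # Copy the keytab to the JobTracker node.
      scp /tmp/mapr.keytab <FQDN_OF_JOBTRACKER>:/opt/mapr/conf/mapr.keytab
      # On the JobTracker node: set the required ownership and permissions.
      chown mapr:mapr /opt/mapr/conf/mapr.keytab
      chmod 600 /opt/mapr/conf/mapr.keytab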
    3. Add the following configuration parameters to the $HADOOP_HOME/conf/hdfs-site.xml file:
      <property>
        <!-- JobTracker principal in MapR cluster -->
        <name>mapreduce.jobtracker.kerberos.principal</name>  
        <value>mapr/_HOST@<KERBEROS_REALM></value>
      </property>
      <property>
        <name>mapreduce.jobtracker.keytab.file</name>
        <value>/opt/mapr/hadoop/hadoop-0.20.2/conf/mapr.keytab</value> 
      </property>
    4. Modify the HADOOP_JOBTRACKER_OPTS variable in /opt/mapr/conf/env.sh by replacing ${MAPR_LOGIN_OPTS} with ${HYBRID_LOGIN_OPTS}.
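      One way to make this substitution is with GNU sed, as sketched below; editing the file directly in a text editor works equally well:
      # Back up env.sh, then replace the MapR login options with the hybrid login options.
      cp /opt/mapr/conf/env.sh /opt/mapr/conf/env.sh.bak
      sed -i 's/${MAPR_LOGIN_OPTS}/${HYBRID_LOGIN_OPTS}/g' /opt/mapr/conf/env.sh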
    5. Restart the JobTracker and TaskTracker services.
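      For example, the services can be restarted with maprcli as sketched below; the exact maprcli node services syntax depends on your MapR version, and <TASKTRACKER_NODES> is a placeholder for the TaskTracker node names:
      maprcli node services -jobtracker restart -nodes <FQDN_OF_JOBTRACKER>
      maprcli node services -tasktracker restart -nodes <TASKTRACKER_NODES>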
  4. Launch any commands that need to access the HDFS cluster with the hadoop.login Java property set to hybrid by including -Dhadoop.login=hybrid in the HADOOP_OPTS environment variable.
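    For example, to list a directory on the remote HDFS cluster; the NameNode host, port, and path shown are placeholders:
    HADOOP_OPTS="-Dhadoop.login=hybrid" hadoop fs -ls hdfs://<NAMENODE_FQDN>:<PORT>/<PATH>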