I want to write down how to install Hadoop on a single node.
Before installing Hadoop, there are some settings that need to be changed at the OS level.
The OS version used here is CentOS 7.x.
When operating infrastructure, the root account should not be used.
Instead, create a separate account, grant it sudo privileges, and carry out the installation with that account as much as possible.
Before installing Hadoop?
Bring down the firewall, set SELinux to disabled, and add a user account, so that the installation can proceed without using the root account wherever possible.
For Java, install OpenJDK 1.8 and add the environment variables.
The details of these steps will be added later.
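Until that write-up is added, the OS preparation described above can be sketched roughly as follows on CentOS 7. The account name `hadoop` is a placeholder; adjust it to your environment.

```shell
# Stop and disable firewalld (CentOS 7)
sudo systemctl stop firewalld
sudo systemctl disable firewalld

# Set SELinux to disabled; takes effect after a reboot
sudo sed -i 's/^SELINUX=enforcing/SELINUX=disabled/' /etc/selinux/config

# Create a non-root account and grant it sudo via the wheel group
sudo useradd hadoop
sudo passwd hadoop
sudo usermod -aG wheel hadoop

# Install OpenJDK 1.8 (the -devel package includes tools such as jps)
sudo yum install -y java-1.8.0-openjdk java-1.8.0-openjdk-devel
```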
Installing Hadoop
Install version: Apache Hadoop 3.1.1
Downloaded the tar.gz release and extracted it.
The environment was configured by referring to the pages below.
When installing in distributed mode, set the environment variables by referring to the same pages.
```shell
$ sudo yum install openssh*
$ wget https://archive.apache.org/dist/hadoop/common/hadoop-3.1.1/hadoop-3.1.1.tar.gz
$ sudo tar xvzf hadoop-3.1.1.tar.gz -C /home/hadoop
$ cd /home/hadoop
$ vi .bash_profile

#JAVA
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk
CLASSPATH=$JAVA_HOME/lib/*:$CLASSPATH

#HADOOP
export HADOOP_HOME=/data/platform/hadoop-3.1.1
PATH=$PATH:$HOME/.local/bin:$HOME/bin:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
export PATH CLASSPATH

$ source .bash_profile
```
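Since `openssh` is installed above, it is also worth setting up passwordless SSH to localhost at this point; the `start-dfs.sh`/`start-yarn.sh` scripts log in over SSH even on a single node. A minimal sketch:

```shell
# Generate a key pair without a passphrase (skip if ~/.ssh/id_rsa already exists)
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa

# Authorize the key for login to this same machine
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys

# Should print the hostname without asking for a password
ssh localhost hostname
```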
After installing Hadoop, set up the Hadoop config files. [Single-node basis // for a cluster, ports, hostnames, and so on must be taken into account.]
hadoop-env.sh
```shell
$ vi $HADOOP_HOME/etc/hadoop/hadoop-env.sh

export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk

# To prevent accidents, shell commands be (superficially) locked
# to only allow certain users to execute certain subcommands.
# It uses the format of (command)_(subcommand)_USER.
#
# For example, to limit who can execute the namenode command,
# export HDFS_NAMENODE_USER=hdfs

# export HADOOP_CLASSPATH=

export HDFS_NAMENODE_USER="user_name"
export HDFS_DATANODE_USER="user_name"
export HDFS_SECONDARYNAMENODE_USER="user_name"
```
yarn-env.sh
```shell
$ vi $HADOOP_HOME/etc/hadoop/yarn-env.sh

#YARN USER SETTING
export YARN_RESOURCEMANAGER_USER="user_name"
export YARN_NODEMANAGER_USER="user_name"
```
core-site.xml
```xml
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
```
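Once `core-site.xml` is in place, the value Hadoop actually resolves can be checked with `hdfs getconf` (assuming `$HADOOP_HOME/bin` is on the PATH as set earlier):

```shell
# Prints the effective value of fs.defaultFS, e.g. hdfs://localhost:9000
hdfs getconf -confKey fs.defaultFS
```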
hdfs-site.xml
```xml
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.namenode.rpc-bind-host</name>
        <value>0.0.0.0</value>
    </property>
    <property>
        <name>dfs.namenode.servicerpc-bind-host</name>
        <value>0.0.0.0</value>
    </property>
    <property>
        <name>dfs.namenode.http-bind-host</name>
        <value>0.0.0.0</value>
    </property>
    <property>
        <name>dfs.namenode.https-bind-host</name>
        <value>0.0.0.0</value>
    </property>
    <property>
        <name>dfs.client.datanode-restart.timeout</name>
        <value>30</value>
    </property>
</configuration>
```
mapred-site.xml
```xml
<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
    <property>
        <name>mapreduce.map.memory.mb</name>
        <value>1024</value>
    </property>
    <property>
        <name>mapreduce.reduce.memory.mb</name>
        <value>2560</value>
    </property>
    <property>
        <name>mapreduce.application.classpath</name>
        <value>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*</value>
    </property>
    <property>
        <name>mapreduce.jobhistory.address</name>
        <value>hostname:10020</value>
    </property>
</configuration>
```
yarn-site.xml
```xml
<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.nodemanager.env-whitelist</name>
        <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
    </property>
    <property>
        <name>yarn.resourcemanager.address</name>
        <value>hostname:8032</value>
    </property>
    <property>
        <name>yarn.resourcemanager.scheduler.address</name>
        <value>hostname:8030</value>
    </property>
    <property>
        <name>yarn.resourcemanager.resource-tracker.address</name>
        <value>hostname:8031</value>
    </property>
</configuration>
```
The namenode and datanode port numbers set here turned out to be values that must be configured identically later when installing other Hadoop ecosystem components.
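With the configuration files above in place, the daemons can be brought up roughly as follows. This is a sketch assuming the environment variables from `.bash_profile` above; note that `hdfs namenode -format` must only be run once, on first setup, since it wipes HDFS metadata.

```shell
# One-time only: initialize the namenode metadata
hdfs namenode -format

# Start HDFS (NameNode, DataNode, SecondaryNameNode) and YARN (ResourceManager, NodeManager)
start-dfs.sh
start-yarn.sh

# Check that the daemons are running
jps

# Smoke test: create a home directory and list the root of HDFS
hdfs dfs -mkdir -p /user/$(whoami)
hdfs dfs -ls /
```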
* 1 GB = 1000 MB
* 1 GiB = 1024 MiB