{"id":51737,"date":"2024-10-08T21:08:16","date_gmt":"2024-10-08T13:08:16","guid":{"rendered":"https:\/\/server.hk\/cnblog\/51737\/"},"modified":"2024-11-14T10:42:00","modified_gmt":"2024-11-14T02:42:00","slug":"ubuntu-18-04-%e4%b8%8b%e6%90%ad%e5%bb%ba%e5%96%ae%e6%a9%9f-hadoop-%e5%92%8c-spark-%e9%9b%86%e7%be%a4%e7%92%b0%e5%a2%83","status":"publish","type":"post","link":"https:\/\/server.hk\/cnblog\/51737\/","title":{"rendered":"Ubuntu 18.04 \u4e0b\u642d\u5efa\u55ae\u6a5f Hadoop \u548c Spark \u96c6\u7fa4\u74b0\u5883"},"content":{"rendered":"<h1 id=\"ubuntu-18-04-%e4%b8%8b%e6%90%ad%e5%bb%ba%e5%96%ae%e6%a9%9f-hadoop-%e5%92%8c-spark-%e9%9b%86%e7%be%a4%e7%92%b0%e5%a2%83-hURKvTCEWM\">Ubuntu 18.04 \u4e0b\u642d\u5efa\u55ae\u6a5f Hadoop \u548c Spark \u96c6\u7fa4\u74b0\u5883<\/h1>\n<p>\u5728\u5927\u6578\u64da\u6642\u4ee3\uff0cHadoop \u548c Spark \u6210\u70ba\u4e86\u6578\u64da\u8655\u7406\u548c\u5206\u6790\u7684\u5169\u5927\u4e3b\u6d41\u6280\u8853\u3002\u672c\u6587\u5c07\u4ecb\u7d39\u5982\u4f55\u5728 Ubuntu 18.04 \u7cfb\u7d71\u4e0a\u642d\u5efa\u4e00\u500b\u55ae\u6a5f\u7684 Hadoop \u548c Spark \u96c6\u7fa4\u74b0\u5883\uff0c\u5e6b\u52a9\u7528\u6236\u5feb\u901f\u4e0a\u624b\u9019\u4e9b\u6280\u8853\u3002<\/p>\n<h2 id=\"%e7%92%b0%e5%a2%83%e6%ba%96%e5%82%99-hURKvTCEWM\">\u74b0\u5883\u6e96\u5099<\/h2>\n<p>\u5728\u958b\u59cb\u4e4b\u524d\uff0c\u78ba\u4fdd\u4f60\u7684 Ubuntu 18.04 \u7cfb\u7d71\u5df2\u7d93\u66f4\u65b0\u5230\u6700\u65b0\u7248\u672c\u3002\u53ef\u4ee5\u4f7f\u7528\u4ee5\u4e0b\u547d\u4ee4\u9032\u884c\u66f4\u65b0\uff1a<\/p>\n<pre><code>sudo apt update\nsudo apt upgrade<\/code><\/pre>\n<p>\u63a5\u4e0b\u4f86\uff0c\u5b89\u88dd Java\uff0c\u56e0\u70ba Hadoop \u548c Spark \u90fd\u4f9d\u8cf4\u65bc Java \u74b0\u5883\u3002\u53ef\u4ee5\u4f7f\u7528\u4ee5\u4e0b\u547d\u4ee4\u5b89\u88dd OpenJDK\uff1a<\/p>\n<pre><code>sudo apt install openjdk-8-jdk<\/code><\/pre>\n<p>\u5b89\u88dd\u5b8c\u6210\u5f8c\uff0c\u6aa2\u67e5 Java \u662f\u5426\u5b89\u88dd\u6210\u529f\uff1a<\/p>\n<pre><code>java -version<\/code><\/pre>\n<h2 id=\"%e5%ae%89%e8%a3%9d-hadoop-hURKvTCEWM\">\u5b89\u88dd Hadoop<\/h2>\n<p>\u9996\u5148\uff0c\u4e0b\u8f09 Hadoop \u7684\u6700\u65b0\u7248\u672c\u3002\u53ef\u4ee5\u5f9e Apache Hadoop \u7684\u5b98\u65b9\u7db2\u7ad9\u7372\u53d6\u6700\u65b0\u7684\u7a69\u5b9a\u7248\u672c\u3002\u4ee5\u4e0b\u662f\u4e0b\u8f09\u548c\u89e3\u58d3\u7684\u547d\u4ee4\uff1a<\/p>\n<pre><code>wget https:\/\/downloads.apache.org\/hadoop\/common\/hadoop-3.3.1\/hadoop-3.3.1.tar.gz\ntar -xzvf hadoop-3.3.1.tar.gz\nsudo mv hadoop-3.3.1 \/usr\/local\/hadoop<\/code><\/pre>\n<p>\u63a5\u4e0b\u4f86\uff0c\u8a2d\u7f6e\u74b0\u5883\u8b8a\u91cf\u3002\u5728 ~\/.bashrc \u6587\u4ef6\u4e2d\u6dfb\u52a0\u4ee5\u4e0b\u5167\u5bb9\uff1a<\/p>\n<pre><code>export HADOOP_HOME=\/usr\/local\/hadoop\nexport PATH=$PATH:$HADOOP_HOME\/bin\nexport JAVA_HOME=\/usr\/lib\/jvm\/java-8-openjdk-amd64<\/code><\/pre>\n<p>\u4f7f\u74b0\u5883\u8b8a\u91cf\u751f\u6548\uff1a<\/p>\n<pre><code>source ~\/.bashrc<\/code><\/pre>\n<p>\u63a5\u4e0b\u4f86\uff0c\u914d\u7f6e Hadoop\u3002\u7de8\u8f2f Hadoop \u7684\u914d\u7f6e\u6587\u4ef6\uff0c\u8a2d\u7f6e\u6838\u5fc3\u914d\u7f6e\u548c\u4f3a\u670d\u5668\u914d\u7f6e\uff1a<\/p>\n<pre><code>cd $HADOOP_HOME\/etc\/hadoop\nnano core-site.xml<\/code><\/pre>\n<p>\u5728 core-site.xml \u4e2d\u6dfb\u52a0\u4ee5\u4e0b\u914d\u7f6e\uff1a<\/p>\n<pre><code>&lt;configuration&gt;\n    &lt;property&gt;\n        &lt;name&gt;fs.defaultFS&lt;\/name&gt;\n        &lt;value&gt;hdfs:\/\/localhost:9000&lt;\/value&gt;\n    &lt;\/property&gt;\n&lt;\/configuration&gt;<\/code><\/pre>\n<p>\u7136\u5f8c\u7de8\u8f2f hdfs-site.xml\uff1a<\/p>\n<pre><code>nano hdfs-site.xml<\/code><\/pre>\n<p>\u6dfb\u52a0\u4ee5\u4e0b\u914d\u7f6e\uff1a<\/p>\n<pre><code>&lt;configuration&gt;\n    &lt;property&gt;\n        &lt;name&gt;dfs.replication&lt;\/name&gt;\n        &lt;value&gt;1&lt;\/value&gt;\n    &lt;\/property&gt;\n&lt;\/configuration&gt;<\/code><\/pre>\n<p>\u63a5\u4e0b\u4f86\uff0c\u683c\u5f0f\u5316 HDFS \u6587\u4ef6\u7cfb\u7d71\uff1a<\/p>\n<pre><code>hdfs namenode -format<\/code><\/pre>\n<p>\u555f\u52d5 Hadoop \u670d\u52d9\uff1a<\/p>\n<pre><code>start-dfs.sh<\/code><\/pre>\n<h2 id=\"%e5%ae%89%e8%a3%9d-spark-hURKvTCEWM\">\u5b89\u88dd Spark<\/h2>\n<p>\u4e0b\u8f09 Spark \u7684\u6700\u65b0\u7248\u672c\uff0c\u4e26\u89e3\u58d3\u5230\u6307\u5b9a\u76ee\u9304\uff1a<\/p>\n<pre><code>wget https:\/\/downloads.apache.org\/spark\/spark-3.1.2\/spark-3.1.2-bin-hadoop3.2.tgz\ntar -xzvf spark-3.1.2-bin-hadoop3.2.tgz\nsudo mv spark-3.1.2-bin-hadoop3.2 \/usr\/local\/spark<\/code><\/pre>\n<p>\u540c\u6a23\uff0c\u8a2d\u7f6e Spark \u7684\u74b0\u5883\u8b8a\u91cf\u3002\u5728 ~\/.bashrc \u6587\u4ef6\u4e2d\u6dfb\u52a0\u4ee5\u4e0b\u5167\u5bb9\uff1a<\/p>\n<pre><code>export SPARK_HOME=\/usr\/local\/spark\nexport PATH=$PATH:$SPARK_HOME\/bin<\/code><\/pre>\n<p>\u4f7f\u74b0\u5883\u8b8a\u91cf\u751f\u6548\uff1a<\/p>\n<pre><code>source ~\/.bashrc<\/code><\/pre>\n<p>\u63a5\u4e0b\u4f86\uff0c\u555f\u52d5 Spark \u4f3a\u670d\u5668\uff1a<\/p>\n<pre><code>start-master.sh\nstart-slave.sh spark:\/\/localhost:7077<\/code><\/pre>\n<h2 id=\"%e6%b8%ac%e8%a9%a6%e9%9b%86%e7%be%a4-hURKvTCEWM\">\u6e2c\u8a66\u96c6\u7fa4<\/h2>\n<p>\u5728\u700f\u89bd\u5668\u4e2d\u8a2a\u554f <a href=\"http:\/\/localhost:8080\" rel=\"nofollow noopener\" target=\"_blank\">http:\/\/localhost:8080<\/a>\uff0c\u53ef\u4ee5\u770b\u5230 Spark \u7684\u7ba1\u7406\u754c\u9762\uff0c\u9019\u610f\u5473\u8457 Spark \u5df2\u7d93\u6210\u529f\u555f\u52d5\u3002\u4f60\u53ef\u4ee5\u901a\u904e\u63d0\u4ea4\u7c21\u55ae\u7684 Spark \u61c9\u7528\u4f86\u6e2c\u8a66\u96c6\u7fa4\u7684\u904b\u884c\u72c0\u6cc1\u3002<\/p>\n<h2 id=\"%e7%b8%bd%e7%b5%90-hURKvTCEWM\">\u7e3d\u7d50<\/h2>\n<p>\u672c\u6587\u4ecb\u7d39\u4e86\u5982\u4f55\u5728 Ubuntu 18.04 \u4e0a\u642d\u5efa\u55ae\u6a5f\u7684 Hadoop \u548c Spark \u96c6\u7fa4\u74b0\u5883\u3002\u9019\u4e9b\u6280\u8853\u5728\u5927\u6578\u64da\u8655\u7406\u548c\u5206\u6790\u4e2d\u626e\u6f14\u8457\u91cd\u8981\u89d2\u8272\uff0c\u638c\u63e1\u5b83\u5011\u5c07\u6709\u52a9\u65bc\u63d0\u5347\u6578\u64da\u8655\u7406\u7684\u6548\u7387\u548c\u80fd\u529b\u3002\u5982\u679c\u4f60\u9700\u8981\u7a69\u5b9a\u7684 <a href=\"https:\/\/server.hk\">VPS<\/a> \u4f86\u904b\u884c\u9019\u4e9b\u61c9\u7528\uff0cServer.HK \u63d0\u4f9b\u591a\u7a2e\u9078\u64c7\uff0c\u9069\u5408\u4e0d\u540c\u9700\u6c42\u7684\u7528\u6236\u3002<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Learn how to set up a standalone Hadoop and Spark cluster on Ubuntu 18.04 for efficient data processing and analysis in this comprehensive guide.<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[4924],"tags":[],"class_list":["post-51737","post","type-post","status-publish","format-standard","hentry","category-setup-tutorials"],"_links":{"self":[{"href":"https:\/\/server.hk\/cnblog\/wp-json\/wp\/v2\/posts\/51737","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/server.hk\/cnblog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/server.hk\/cnblog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/server.hk\/cnblog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/server.hk\/cnblog\/wp-json\/wp\/v2\/comments?post=51737"}],"version-history":[{"count":1,"href":"https:\/\/server.hk\/cnblog\/wp-json\/wp\/v2\/posts\/51737\/revisions"}],"predecessor-version":[{"id":51738,"href":"https:\/\/server.hk\/cnblog\/wp-json\/wp\/v2\/posts\/51737\/revisions\/51738"}],"wp:attachment":[{"href":"https:\/\/server.hk\/cnblog\/wp-json\/wp\/v2\/media?parent=51737"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/server.hk\/cnblog\/wp-json\/wp\/v2\/categories?post=51737"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/server.hk\/cnblog\/wp-json\/wp\/v2\/tags?post=51737"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}