
How to do it...

There are five major steps we will undertake to install Spark from sources (check the highlighted portions of the code):

  1. Download the sources from Spark's website
  2. Unpack the archive
  3. Build
  4. Move to the final destination
  5. Create the necessary environment variables

The skeleton for our code looks as follows (see the Chapter01/installFromSource.sh file):

#!/bin/bash
# Shell script for installing Spark from sources
#
# PySpark Cookbook
# Author: Tomasz Drabas, Denny Lee
# Version: 0.1
# Date: 12/2/2017
_spark_source="http://mirrors.ocf.berkeley.edu/apache/spark/spark-2.3.1/spark-2.3.1.tgz"
_spark_archive=$( echo "$_spark_source" | awk -F '/' '{print $NF}' )
_spark_dir=$( echo "${_spark_archive%.*}" )
_spark_destination="/opt/spark"
...
checkOS
printHeader
downloadThePackage
unpack
build
moveTheBinaries
setSparkEnvironmentVariables
cleanUp
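The three helper variables at the top of the skeleton are derived entirely from the source URL: `awk -F '/' '{print $NF}'` splits on `/` and prints the last field (the archive name), and the `${_spark_archive%.*}` parameter expansion strips the shortest trailing `.suffix` to get the directory name the archive unpacks into. The following standalone snippet demonstrates that parsing on its own:

```shell
#!/bin/bash
# Standalone demo of the filename parsing used in the skeleton above.
_spark_source="http://mirrors.ocf.berkeley.edu/apache/spark/spark-2.3.1/spark-2.3.1.tgz"

# awk with '/' as the field separator; $NF is the last field, i.e. the archive name
_spark_archive=$( echo "$_spark_source" | awk -F '/' '{print $NF}' )

# ${var%.*} removes the shortest match of ".*" from the end, dropping the .tgz suffix
_spark_dir=$( echo "${_spark_archive%.*}" )

echo "$_spark_archive"   # spark-2.3.1.tgz
echo "$_spark_dir"       # spark-2.3.1
```

Because everything is computed from `_spark_source`, pointing the script at a newer Spark release only requires changing that one URL.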
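To give a feel for what the elided function bodies look like, here is a minimal sketch of what a `setSparkEnvironmentVariables` function might do. This is an illustrative assumption, not the book's actual implementation: it takes the profile file to modify as a parameter (the real script may target `~/.bash_profile` or `~/.bashrc` directly) and appends the exports Spark needs:

```shell
#!/bin/bash
# Hypothetical sketch of setSparkEnvironmentVariables (not the book's code):
# appends SPARK_HOME and PATH exports to the given shell profile file.
setSparkEnvironmentVariables() {
    local profile="$1"               # e.g. ~/.bash_profile; passed in for testability
    local destination="/opt/spark"   # mirrors _spark_destination in the skeleton

    {
        echo "export SPARK_HOME=$destination"
        # single quotes keep $SPARK_HOME unexpanded until the profile is sourced
        echo 'export PATH=$SPARK_HOME/bin:$PATH'
    } >> "$profile"
}
```

After the profile is sourced in a new shell, `spark-submit` and `pyspark` resolve from `$SPARK_HOME/bin` without typing the full path.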