  • PySpark Cookbook
  • Denny Lee Tomasz Drabas

There's more...

Instead of using Spark's make-distribution.sh script, you can use Maven directly to compile the sources. For instance, if you want to build the default version of Spark, you can simply type (from the _spark_dir folder):

./build/mvn clean package

This defaults to Hadoop 2.6. If your version of Hadoop is 2.7.2 and is deployed over YARN, you can run the following:

./build/mvn -Pyarn -Phadoop-2.7 -Dhadoop.version=2.7.2 -DskipTests clean package
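Compiling Spark is memory-hungry, and Maven's default JVM settings are often too small for it. The Spark build documentation recommends raising the heap via the MAVEN_OPTS environment variable before building; the exact values below follow the Spark 2.x build guide and may need adjusting for your version, so treat them as a sketch rather than required settings:

```shell
# Assumption: these sizes match the Spark 2.x build guide; newer
# versions may recommend different values.
export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"

# Then rerun the same build command with the enlarged heap:
./build/mvn -Pyarn -Phadoop-2.7 -Dhadoop.version=2.7.2 -DskipTests clean package
```

Note that the ./build/mvn wrapper script sets reasonable defaults itself if MAVEN_OPTS is unset, so this is mainly needed when invoking a system-wide mvn directly.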

You can also use sbt (the Scala build tool) to build Spark:

./build/sbt package