
How to do it...

To configure your session in a Spark version lower than 2.0, you would normally have to create a SparkConf object, set all your options to the right values, and then build the SparkContext (plus an SQLContext if you wanted to use DataFrames, and a HiveContext if you wanted access to Hive tables). Starting from Spark 2.0, you just need to create a SparkSession, as in the following snippet:

from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .master("local[2]") \
    .appName("Your-app-name") \
    .config("spark.some.config.option", "some-value") \
    .getOrCreate()
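For comparison, the pre-2.0 setup described above would look roughly like the following sketch; the `spark.some.config.option` key is the same placeholder used in the snippet above, not a real configuration property:

```python
from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext

# Build the configuration object with the same settings as above
conf = (SparkConf()
        .setMaster("local[2]")
        .setAppName("Your-app-name")
        .set("spark.some.config.option", "some-value"))

# Create the low-level entry point
sc = SparkContext(conf=conf)

# SQLContext for DataFrames; use HiveContext(sc) instead for Hive table access
sqlContext = SQLContext(sc)
```

SparkSession wraps all three of these contexts behind a single builder, which is why the 2.0+ snippet is so much shorter.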