- PySpark Cookbook
- Denny Lee, Tomasz Drabas
How to do it...
To configure your session in a Spark version earlier than 2.0, you would normally have to create a SparkConf object, set all your options to the right values, and then build the SparkContext (a SQLContext if you wanted to use DataFrames, and a HiveContext if you wanted access to Hive tables). Starting with Spark 2.0, you just need to create a SparkSession, as in the following snippet:
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .master("local[2]") \
    .appName("Your-app-name") \
    .config("spark.some.config.option", "some-value") \
    .getOrCreate()