
  • Learning Apache Spark 2
  • Muhammad Asif Abbasi

Passing functions to Spark (Python)

Python provides a simple way to pass functions to Spark. The Spark programming guide available at spark.apache.org suggests there are three recommended ways to do this:

  • Lambda expressions are ideal for short functions that can be written as a single expression
  • Local defs inside the function calling into Spark for longer code
  • Top-level functions in a module

While we have already looked at lambda functions in some of the previous examples, let's look at local definitions of functions. We can encapsulate our business logic, splitting lines into words and counting them, into two separate functions, as shown below.

def splitter(lineOfText):
    words = lineOfText.split(" ")
    return len(words)

def aggregate(numWordsLine1, numWordsLineNext):
    totalWords = numWordsLine1 + numWordsLineNext
    return totalWords

Let's see the working code example:

Figure 2.15: Code example of Python word count (local definition of functions)
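As a rough sketch of what the figure shows (the `count_words` wrapper and the `sc`/`path` parameters are assumptions, not the book's exact code), the two named local functions can be passed to map and reduce just like the lambdas in the earlier examples:

```python
def splitter(lineOfText):
    # Split a line on spaces and return the number of words in it.
    words = lineOfText.split(" ")
    return len(words)

def aggregate(numWordsLine1, numWordsLineNext):
    # Combine two per-line word counts into a running total.
    totalWords = numWordsLine1 + numWordsLineNext
    return totalWords

def count_words(sc, path):
    # sc is an active SparkContext; path points at a text file.
    # The local defs are passed by name, no lambda required.
    return sc.textFile(path).map(splitter).reduce(aggregate)
```

Because `splitter` and `aggregate` are plain functions with no enclosing state, Spark only needs to serialize the functions themselves when shipping the tasks to the cluster.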

Here's another way to implement this by defining the functions as a part of a UtilFunctions class, and referencing them within your map and reduce functions:

Figure 2.16: Code example of Python word count (Utility class)
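A minimal sketch of this variant (the class layout here is an assumption; the figure's code may differ in detail) wraps the same two functions as methods on a utility class and references the bound methods inside map and reduce:

```python
class UtilFunctions(object):
    """Utility class holding the word-count logic."""

    def splitter(self, lineOfText):
        # Return the number of space-separated words in a line.
        return len(lineOfText.split(" "))

    def aggregate(self, numWordsLine1, numWordsLineNext):
        # Sum two per-line word counts.
        return numWordsLine1 + numWordsLineNext

# Usage against an RDD of text lines (sketch):
#   utils = UtilFunctions()
#   total = rdd.map(utils.splitter).reduce(utils.aggregate)
```

Note that `utils.splitter` is a bound method: passing it to `map` means the whole `UtilFunctions` instance travels with it, which is exactly the behaviour discussed next.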

You may want to be a bit cheeky here and try to add a countWords() method to UtilFunctions, so that it takes an RDD as input and returns the total number of words. This approach has potential performance implications, as the whole object will need to be sent to the cluster. Let's see how this can be implemented, and the results, in the following screenshot:

Figure 2.17: Code example of Python word count (Utility class - 2)

This can be avoided by copying the referenced data field into a local variable inside the method, rather than accessing it through the object.
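The two closure styles can be sketched as follows (method and field names are hypothetical; `rdd` stands for any RDD of text lines):

```python
class UtilFunctions(object):
    def __init__(self):
        self.separator = " "

    def count_words_whole_object(self, rdd):
        # The lambda closes over self, so the entire UtilFunctions
        # instance is serialized and shipped with every task.
        return rdd.map(lambda line: len(line.split(self.separator))) \
                  .reduce(lambda a, b: a + b)

    def count_words_local_copy(self, rdd):
        # Copy the field into a local variable first: only this small
        # string is captured by the closure, not the whole object.
        separator = self.separator
        return rdd.map(lambda line: len(line.split(separator))) \
                  .reduce(lambda a, b: a + b)
```

Both methods return the same total; the difference is only in how much data Spark has to serialize and send to the executors.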

Now that we have seen how to pass functions to Spark, and have already used some transformations and actions in the previous examples, including map, flatMap, and reduce, let's look at the most common transformations and actions used in Spark. The list is not exhaustive, and you can find more examples in the Apache Spark documentation in the programming guide section (http://bit.ly/SparkProgrammingGuide). If you would like a comprehensive list of all the available functions, you might want to check the following API docs:

Table 2.1 - RDD and PairRDD API references
