
NRT – high-level system view

The previous section of this chapter introduced the basic building blocks of an NRT application and gave a logical overview of them. The next step is to understand the functional and systems view of the NRT architectural framework. The following figure outlines the various architectural blocks and cross-cutting concerns:

Describing the system horizontally from left to right, the process starts with data ingestion and transformation in near real-time using low-latency components. The transformed data is passed on to the next logical unit, which performs highly optimized and parallel operations on the data; this unit is the near real-time processing engine. Once the data has been aggregated and correlated and actionable insights have been derived, it is passed on to the presentation layer, which, along with real-time dashboarding and visualization, may include a persistence component that retains the data for long-term deep analytics.
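The left-to-right flow described above can be sketched as a chain of stages. This is a minimal illustration, not the framework's actual API; the `Event`, `Pipeline`, and stage names are hypothetical:

```python
from dataclasses import dataclass
from typing import Callable, List

# Hypothetical event record flowing through the pipeline.
@dataclass
class Event:
    key: str
    value: float

class Pipeline:
    """Chains NRT stages left to right: ingest -> transform -> process -> present."""
    def __init__(self) -> None:
        self.stages: List[Callable[[List[Event]], List[Event]]] = []

    def add_stage(self, stage: Callable[[List[Event]], List[Event]]) -> "Pipeline":
        self.stages.append(stage)
        return self

    def run(self, events: List[Event]) -> List[Event]:
        # Each stage consumes the previous stage's output.
        for stage in self.stages:
            events = stage(events)
        return events

# Transformation stage: normalize keys.
def transform(events: List[Event]) -> List[Event]:
    return [Event(e.key.lower(), e.value) for e in events]

# Processing stage: aggregate values per key.
def aggregate(events: List[Event]) -> List[Event]:
    totals: dict = {}
    for e in events:
        totals[e.key] = totals.get(e.key, 0.0) + e.value
    return [Event(k, v) for k, v in totals.items()]

result = Pipeline().add_stage(transform).add_stage(aggregate).run(
    [Event("A", 1.0), Event("a", 2.0), Event("B", 3.0)]
)
# Events with keys "A" and "a" are merged after normalization.
```

In a real deployment each stage would be a distributed, low-latency component (for example, a message bus feeding a stream-processing engine), but the sequential hand-off of data is the same.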

The cross-cutting concerns that exist across all the components of the NRT framework, as depicted in the previous figure, are:

  • Security
  • System management
  • Data integrity and management

Next, we introduce four basic streaming patterns, so that you become familiar with the common flavors that streaming use cases take and their optimal solutions (covered in later sections):

  • Stream ingestion: Here, all we are expected to do is persist the events to stable storage, such as HDFS, HBase, Solr, and so on. So all we need are low-latency stream collection, transformation, and persistence components.
  • Near real-time (NRT) processing: This application design allows for an external context and addresses complex use cases such as anomaly or fraud detection. It requires filtering, alerting, de-duplication, and transformation of events based on specific sophisticated business logic. All these operations are required to be performed at extremely low latency.
  • NRT event partitioned processing: This is very close to NRT processing, but with a variation: it derives additional benefit from partitioning the data, for instance by keeping the external information most relevant to each partition in memory. This pattern also operates at extremely low latencies.
  • NRT and complex models/machine learning: This pattern mostly requires us to execute very complex models or operations over a sliding window of time on the set of events in the stream. These are highly complex operations that require micro-batching of data and must still operate at very low latency.
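The sliding-window idea underlying the last pattern can be sketched in a few lines. This is an illustrative toy, not a production engine: the `SlidingWindow` class, its time-based eviction policy, and the micro-batch shape `(timestamp, value)` are all assumptions made for the example.

```python
from collections import deque

class SlidingWindow:
    """Keeps events from the last `window_seconds` and computes a simple
    statistic (the mean) over them, micro-batch by micro-batch."""
    def __init__(self, window_seconds: float) -> None:
        self.window = window_seconds
        self.events: deque = deque()  # (timestamp, value) pairs, oldest first

    def add_batch(self, batch) -> None:
        # Ingest a micro-batch of (timestamp, value) events.
        for ts, value in batch:
            self.events.append((ts, value))
        # Evict events that have fallen out of the window,
        # measured against the newest timestamp seen.
        newest = self.events[-1][0]
        while self.events and self.events[0][0] <= newest - self.window:
            self.events.popleft()

    def mean(self) -> float:
        values = [v for _, v in self.events]
        return sum(values) / len(values) if values else 0.0

w = SlidingWindow(window_seconds=10.0)
w.add_batch([(0.0, 5.0), (4.0, 7.0)])
w.add_batch([(12.0, 9.0)])
# The event at t=0.0 is evicted (it is older than 12.0 - 10.0);
# the window now holds the events at t=4.0 and t=12.0.
print(w.mean())  # 8.0
```

A real implementation would replace `mean()` with the complex model scoring each window (for example, an anomaly detector), but the micro-batch ingest plus time-based eviction structure is the essence of the pattern.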