- Practical Big Data Analytics
- Nataraj Dasgupta
The core modules of Hadoop
The core modules of Hadoop consist of:
- Hadoop Common: Libraries and other common helper utilities required by Hadoop
- HDFS (Hadoop Distributed File System): A distributed, highly available, fault-tolerant filesystem that stores data
- Hadoop MapReduce: A programming paradigm for distributed computation across commodity servers (or nodes); see the word-count sketch after this list
- Hadoop YARN: A framework for job scheduling and resource management
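To make the MapReduce paradigm concrete, here is a minimal sketch of the classic word-count job written against Hadoop's Java MapReduce API. This is illustrative rather than part of the chapter's own material: it assumes the hadoop-client libraries are on the classpath, and the input and output paths supplied on the command line are placeholders.

```java
// Minimal word-count sketch illustrating the MapReduce paradigm.
// Assumes Hadoop client libraries are available; paths are hypothetical.
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Map phase: runs in parallel across nodes, emitting a (word, 1)
    // pair for every token in the input split.
    public static class TokenizerMapper
            extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Reduce phase: receives all counts for a given word and sums them.
    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values,
                           Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class); // local pre-aggregation
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // In practice these paths would point into HDFS.
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Packaged into a JAR, a job like this is typically submitted with `hadoop jar WordCount.jar WordCount <input> <output>`, with HDFS providing the storage for both paths and (in Hadoop 2 onward) YARN scheduling the map and reduce tasks across the cluster.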
Of these core components, YARN was introduced in 2012 to address shortcomings in the first release of Hadoop, which relied on HDFS and MapReduce as its main components. As Hadoop grew in popularity, the need for processing models beyond those provided by MapReduce became increasingly important; this, along with other technical considerations, led to the development of YARN.
Let's now look at the salient characteristics of Hadoop as itemized previously.