
Exporting HDFS data to a local machine

In this recipe, we are going to export/copy data from HDFS to the local machine.

Getting ready

To perform this recipe, you should already have a running Hadoop cluster.

How to do it...

Performing this recipe is as simple as copying data from one folder to another. There are a couple of ways to export data from HDFS to the local machine.

  • Using the copyToLocal command:
    hadoop fs -copyToLocal /mydir1/LICENSE.txt /home/ubuntu
    
  • Using the get command:
    hadoop fs -get /mydir1/LICENSE.txt /home/ubuntu
    
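If you are scripting the export, it helps to keep the subcommand and paths as separate arguments. The following is a minimal sketch; the helper name `build_hdfs_get_command` is illustrative, not part of Hadoop:

```python
import shlex

def build_hdfs_get_command(hdfs_path: str, local_path: str,
                           use_get: bool = True) -> list[str]:
    """Build the argument list for exporting a file from HDFS locally.

    The subcommand and the HDFS path must be separate arguments:
    "-get/mydir1/LICENSE.txt" (no space) is not a valid subcommand.
    """
    subcommand = "-get" if use_get else "-copyToLocal"
    return ["hadoop", "fs", subcommand, hdfs_path, local_path]

# On a machine with Hadoop installed, the list can be passed to subprocess.run():
# subprocess.run(build_hdfs_get_command("/mydir1/LICENSE.txt", "/home/ubuntu"), check=True)
print(shlex.join(build_hdfs_get_command("/mydir1/LICENSE.txt", "/home/ubuntu")))
# → hadoop fs -get /mydir1/LICENSE.txt /home/ubuntu
```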

How it works...

When you use the HDFS copyToLocal or get command, the following sequence of events occurs:

  1. First of all, the client contacts the NameNode, requesting a specific file from HDFS.
  2. The NameNode checks whether the file exists in its FSImage. If the file is not present, an error is returned to the client.
  3. If the file exists, the NameNode looks up the metadata for its blocks and the DataNodes holding the replicas.
  4. The NameNode then points the client directly to the DataNodes, which serve the blocks one by one. The data is copied straight from the DataNodes to the client machine; it never passes through the NameNode, which avoids a bottleneck.
  5. Thus, the file is exported to the local machine from HDFS.
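The steps above can be sketched as a toy simulation. The dictionaries below stand in for the NameNode's FSImage metadata and the DataNodes' block storage; names such as `fsimage`, `block_locations`, and `datanodes` are illustrative, not real Hadoop APIs:

```python
# Step 2: the NameNode's namespace maps each file to an ordered list of block IDs.
fsimage = {"/mydir1/LICENSE.txt": ["blk_1", "blk_2"]}

# Step 3: block -> replica placements (which DataNodes hold each block).
block_locations = {"blk_1": ["dn-1", "dn-3"], "blk_2": ["dn-2", "dn-3"]}

# The DataNodes' actual block contents (toy data for illustration).
datanodes = {
    "dn-1": {"blk_1": b"Apache License, "},
    "dn-2": {"blk_2": b"Version 2.0"},
    "dn-3": {"blk_1": b"Apache License, ", "blk_2": b"Version 2.0"},
}

def read_file(path: str) -> bytes:
    # Steps 1-2: ask the "NameNode" for the file's blocks; error if absent.
    if path not in fsimage:
        raise FileNotFoundError(path)
    data = b""
    # Step 4: fetch each block directly from the first available DataNode;
    # the NameNode never handles the block data itself.
    for block_id in fsimage[path]:
        replica = block_locations[block_id][0]
        data += datanodes[replica][block_id]
    return data

print(read_file("/mydir1/LICENSE.txt").decode())  # → Apache License, Version 2.0
```

The key design point this illustrates is the separation of metadata (NameNode) from data (DataNodes): the NameNode only answers the "where are the blocks?" question, so bulk transfers scale with the number of DataNodes rather than funneling through one machine.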