Working across nodes without passwords (keyless SSH)
When Apache Hadoop is set up across multiple nodes, administrators and developers frequently need to connect to different nodes to diagnose problems, run scripts, install software, and so on. These scripts are usually automated and run in bulk. Similarly, master nodes need to connect to the slaves over SSH to start or stop the Hadoop processes. To allow the system to connect to a Hadoop node without any password prompt, all of this SSH access must be keyless. This works in one direction: system A is given direct, keyless SSH access to system B. Because master nodes often also run DataNode or MapReduce processes, the same scripts may connect to the local machine over SSH as well. To achieve this, we first need to generate a key pair for the SSH client on system A, as follows:
hadoop@base0:/$ ssh-keygen -t rsa
Press Enter when prompted for the passphrase (you do not want a passphrase) and for the file location. This creates a key pair in the .ssh directory inside your home directory (such as /home/hadoop/.ssh): a private key (id_rsa) and a public key (id_rsa.pub). You may choose a different key type if you prefer. The next step is only necessary if you are working across two machines, for example, a master and a slave.
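If you want to avoid the prompts entirely, for example in an automated provisioning script, ssh-keygen can also be run non-interactively. This is a minimal sketch that assumes the default key location and an empty passphrase are acceptable:
hadoop@base0:/$ ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa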
Now, copy the id_rsa.pub file from system A to system B. You can use the scp command to do this, as follows:
hadoop@base0:/$ scp ~/.ssh/id_rsa.pub hadoop@base1:
The preceding command copies the public key to the target system (for example, base1) under the Hadoop user's home directory. You should now be able to log in to that system and check whether the file has been copied.
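To verify the copy without leaving system A, you can list the file remotely; at this stage SSH will still ask for the password, since the keyless setup is not complete yet (base1 is the example target used above):
hadoop@base0:/$ ssh hadoop@base1 ls -l id_rsa.pub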
SSH allows keyless entry only if the public key appears in the authorized_keys file in the .ssh folder of the target system. To set this up on the same machine, run the following command:
hadoop@base0:/$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
If the key was copied to a different machine, run the equivalent command on the target machine instead, where scp placed id_rsa.pub in the home directory:
hadoop@base1:/$ cat ~/id_rsa.pub >> ~/.ssh/authorized_keys
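As a side note, on most Linux distributions the scp-and-append sequence can be replaced with a single ssh-copy-id invocation, which appends the public key to authorized_keys on the target machine for you:
hadoop@base0:/$ ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop@base1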
That's it! Now it's time to test your keyless SSH entry by logging in to the target machine using SSH. If you face any issues, run the SSH daemon in debug mode to see the error messages. Problems are usually caused by file permissions, so make sure that authorized_keys and id_rsa.pub are readable but not writable by other users, and that the private key is set to permission 600 (owner read/write only).
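For reference, the following commands tighten the usual permission culprits and then test the connection with verbose client output; they assume the default ~/.ssh layout used throughout this section:
hadoop@base1:/$ chmod 700 ~/.ssh
hadoop@base1:/$ chmod 600 ~/.ssh/authorized_keys
hadoop@base0:/$ chmod 600 ~/.ssh/id_rsa
hadoop@base0:/$ ssh -v hadoop@base1 hostname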