Trusting other computers is a double-edged sword. Many sites that disallowed trusted hosts
fared well against the Internet worm compared to sites that did not. You need to specify in your
policy just what kind of access is allowed. At one extreme, everyone trusts everyone else; at the
other extreme, no one trusts anyone. A middle ground would be to say that the database server
trusts no one, although the database server is trusted by the others. That way, if one machine is
compromised, the database server is still safe.
You need to weigh convenience against security.
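As a concrete sketch of that middle ground (the hostnames ws1, ws2, ws3, and dbserv are hypothetical), each workstation's /etc/hosts.equiv could list the other workstations:

```
ws1
ws2
ws3
```

while dbserv has no /etc/hosts.equiv (and no .rhosts files) at all, so a login to the database server always requires a password even if a workstation is compromised.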
When I was able to crack the account of one system administrator, he already had an .rhosts
file that allowed access to his account on that machine from every other machine, by both his
own account and root. Therefore, once I had broken into one machine using his account, I could
break into all of them.
If you are setting up a system for the first time, you need to define your access policy before
you hook the machine up to the rest of the network. Once on a network where security can be
broken, the new system is no longer secure either.
If you are taking over a system, you need to check it to make sure that it adheres to both the
security policy and common sense. Check /etc/hosts.equiv to see who is given access, and check
every .rhosts file on the system. Make sure that they are what you want. Never allow wildcards
of any kind. Make sure that you specifically define who has access and from what machines.
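These checks can be sketched as a small shell function (the function name is my own; run it as root so it can read every home directory):

```shell
# audit_rhosts: print every .rhosts file under the given directory
# and flag wildcard entries. A '+' in either field of an .rhosts
# line trusts every host or every user -- never allow it.
audit_rhosts() {
    find "$1" -name .rhosts 2>/dev/null | while IFS= read -r f; do
        echo "== $f =="
        cat "$f"
        grep -q '[+]' "$f" && echo "WARNING: wildcard in $f"
    done
}
```

Run it as `audit_rhosts /home`, and grep /etc/hosts.equiv for `+` entries the same way.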
One common mistake is leaving the .rhosts file world-readable. No one should be able to
figure out what access another account grants. Just because someone knows which other machines
can reach this one does not mean that he can access that account; however, the more information
an intruder has, the more directed the attack and the greater the chance of success.
Fortunately, the remote-command/login functionality does not work on most newer Linux
distributions if the .rhosts file is readable by others.
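A quick way to find .rhosts files with loose permissions is to compare each file's mode against 0600 (a sketch; the function name is my own, and `stat -c` is the GNU coreutils form found on Linux):

```shell
# check_rhosts_perms: report .rhosts files under the given directory
# whose group/other permission bits are set (anything beyond 0600).
check_rhosts_perms() {
    find "$1" -name .rhosts 2>/dev/null | while IFS= read -r f; do
        mode=$(stat -c '%a' "$f") || continue
        case "$mode" in
            *00) ;;                        # only owner bits set
            *)   echo "$f (mode $mode)" ;;
        esac
    done
}
```

Any file it reports can be fixed with `chmod 600`, which also removes the trust information from prying eyes.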