#
# INSTALL Comments - SHADOW Version 1.6
#                    Last Changed 21 Sep 1999
#
Here are the rough steps for assembling a SHADOW system. These steps are
very broad and leave a lot of work as exercises for the reader. Obviously,
the person performing the installation must be intimately familiar with the
workings of his UNIX system and the process of installing "open source" 
software. The steps omitted can be show stoppers; for example, if you don't
have a C compiler on your system, you cannot install tcpdump from the source,
and you will have to find a way around that problem. Another example: the
SHADOW scripts are written in Perl. If your system doesn't natively have Perl,
you have another problem.

A SHADOW system consists of two or more parts: one or more sensors and an 
analyzer. 

For the sensor:

1.  Obtain a machine.
2.  Install the latest version of the OS (must be UNIX or LINUX) with the 
    latest security patches.
3.  Install libpcap and tcpdump from sources pointed to by links in the 
    accessories directory.
5.  Install ssh from sources pointed to by links in the 
    accessories directory.
5.  Install the SHADOW-1.6/sensor directory.
6.  Set the crontab, as root, to restart the tcpdump process each hour. Using 
    sensor_init.sh as an example, set up the system initialization process to 
    start the sensor_driver.pl script upon boot-up.
7.  Configure the std.ph file to match the locations where you have 
    installed the components.
8.  Strip the system of all internet services except ssh. This machine
    is vulnerable because it will reside outside your firewall. You do not
    want to make it easy to compromise, and you certainly don't want anything
    on this machine to give away secrets about the rest of your network.
9.  Move the sensor to your DMZ, i.e. outside your firewall where it can
    see every packet coming into your site. If you are on a switched network,
    the sensor will see only its own traffic unless you can configure a
    monitor (SPAN) port on the switch.
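
The crontab and boot-time setup in step 6 might look something like the
following sketch. The install path shown is an assumption; use whatever
directory you installed the SHADOW-1.6/sensor scripts into.

```shell
# Hypothetical root crontab entry -- restart the tcpdump capture at the
# top of every hour (the install path is an assumption):
0 * * * * /usr/local/SHADOW/sensor/sensor_driver.pl

# In your system initialization (modeled on the supplied sensor_init.sh),
# start the driver at boot, e.g. from /etc/rc.d/rc.local:
/usr/local/SHADOW/sensor/sensor_driver.pl &
```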

For the analyzer:

1.  Obtain a machine.
2.  Install the latest version of the OS (must be UNIX or LINUX) with the
    latest security patches.
3.  Install libpcap, tcpdump, tcpslice, and ssh from sources pointed to by 
    links in the accessories directory.
4.  Install Apache from the source pointed to in the accessories directory. 
    Decide where your web pages are going to reside and configure Apache 
    appropriately.
5.  Create a non-privileged user on the analyzer to run the analyzer scripts
    and own the web pages.
6.  Using SSH on the analyzer, set up a link to the sensor(s) such that the
    non-privileged user process on the analyzer can connect as root to the 
    sensor without a password. This can be accomplished by using "--with-rhosts"
    on the SSH configure command or by putting a blank passphrase on the 
    user's private SSH key. In addition, the public SSH key of the user on 
    the analysis station must be placed in the .ssh/authorized_keys file 
    for root on the sensor.
7.  Configure the analyzer by editing the site-specific variables in
    each of the .ph files contained in the SHADOW-1.6/sites directory. You must
    create one "SITE.ph" file for each sensor from which your analyzer will 
    obtain data. 
8.  Create any directories needed to hold the raw data files and web pages, 
    e.g. /LOG, /LOG/SITE, /home/httpd, /home/httpd/html, etc. 
    Since fetchem.pl runs as an unprivileged user, you may have to become
    root to create some of them. The "*.ph" files edited in step 7 are 
    where these directories are configured. The user created in step 5 
    must own them.
9.  Interactively test the fetchem.pl script with the "-debug" flag to see 
    if it connects to the sensor and downloads the raw tcpdump file. 
    Correct any ssh, directory permission, or other problems revealed by the 
    /tmp/fetchem.log file created by "-debug."
10. Using the supplied example tcpdump filters, define your site-specific
    filters by substituting your internal network addresses where necessary,
    and by deciding exactly what you want SHADOW to display as "suspicious"
    on your web pages.
11. As you get familiar with your "normal" network traffic, tweak your filters
    to display what you want to see. This may change with time as the bad
    guys learn new ways to attack your site.
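
The passwordless SSH trust described in step 6 can be sketched as below.
The hostname "sensor" and the use of an SSH1-style identity file are
assumptions for illustration; adapt them to your hosts and SSH version.

```shell
# On the analyzer, as the non-privileged user, generate a key with an
# empty passphrase:
ssh-keygen -N "" -f $HOME/.ssh/identity

# Copy the public half to the sensor and append it to root's
# authorized_keys file there (you will be prompted for root's
# password this one time):
scp $HOME/.ssh/identity.pub root@sensor:/tmp/analyzer.pub
ssh root@sensor 'cat /tmp/analyzer.pub >> /root/.ssh/authorized_keys'

# Verify -- this should now log in without prompting for a password:
ssh root@sensor 'echo ok'
```

An empty passphrase means anyone who compromises the analyzer account gets
root on the sensor, which is why step 5 uses a dedicated non-privileged user.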


Bill Ralph, wralph@nswc.navy.mil
21 September 1999
