Setting up a multi-tiered log infrastructure Part 2 -- System Overview
System Build Overview
The next step is to build the environment, starting with the Elasticsearch (ES) nodes and the log parser/search frontend, because they require certain components to be identical. The process assumes a minimal install of CentOS 7, but any major *NIX-based OS can be used (just remember that the commands may differ). Start by building three servers: two will be ES data nodes, and one will be the ES master node. The master node is also where Graylog and MongoDB are installed.
Node Details
Endpoints (Windows, Linux etc.)
Application Dependencies
- Determined by platform
Inputs
- Logs generated locally by system and applications
Processing
- Different tools can be used depending on the platform. Some processing can be handled on the endpoint before logs are shipped to the Central Log Repository (CLR)
Outputs
- Output logs to the log aggregator server
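On a Linux endpoint, the forwarding step can be as simple as a one-line rsyslog rule. A minimal sketch, assuming rsyslog is the shipper and `logagg01.example.com` is a placeholder for one of your aggregator hostnames:

```
# /etc/rsyslog.d/50-forward.conf -- hostname is a placeholder
# @@ = forward over TCP; a single @ would use UDP
*.*  @@logagg01.example.com:514
```

Windows endpoints would need a different shipper (e.g. an agent that speaks syslog or GELF), per the platform-dependent note above.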
Central Log Aggregators (Two Nodes)
Application Dependencies
- rsyslog
Inputs
- Incoming streams from endpoints
Processing
- Receive logs from remote sources and forward them to the CLR
- Write logs from local sources to a local file
Outputs
- Output logs in raw format to CLR
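The split between remote and local logs on an aggregator can be handled with a dedicated rsyslog ruleset. A sketch, assuming rsyslog's RainerScript syntax and a placeholder CLR hostname (`clr01.example.com`); locally generated messages never enter this ruleset and continue through the default local rules:

```
# /etc/rsyslog.d/60-aggregator.conf -- hostname is a placeholder
module(load="imtcp")                                 # accept TCP syslog from endpoints
input(type="imtcp" port="514" ruleset="fromRemote")  # bind remote traffic to its own ruleset

ruleset(name="fromRemote") {
    # relay everything received from endpoints on to the CLR in raw form
    action(type="omfwd" target="clr01.example.com" port="514" protocol="tcp")
}
```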
Central Log Repository (One Node)
Application Dependencies
- rsyslog
Inputs
- Incoming stream from Central Log Aggregators
- (Optional) Incoming alerts/logs from the log analysis server (LAS)
Processing
- Forward logs to the log parser server
- Write logs from remote sources to a local file
- (Optional) Accept logs from the LAS, but do not forward them back to the LAS (avoiding a loop)
- Write logs from local sources to a local file
Outputs
- Output logs to log parser server for indexing
- Output raw logs to local file (%HOSTNAME%-YYYY-MM-DD.log)
- (Optional) Output Logs in raw format to log analysis server
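The CLR's dual role (archive to disk, relay to the parser) maps onto two rsyslog actions in one ruleset. A sketch, assuming placeholder hostnames and that the Graylog syslog input on the parser node listens on TCP 1514; the dynamic-file template reproduces the `%HOSTNAME%-YYYY-MM-DD.log` naming above:

```
# /etc/rsyslog.d/60-clr.conf -- hostnames and paths are placeholders
module(load="imtcp")
input(type="imtcp" port="514" ruleset="fromAggregators")

# one raw file per sending host, named HOSTNAME-YYYY-MM-DD.log
template(name="PerHostFile" type="string"
         string="/var/log/remote/%HOSTNAME%-%$YEAR%-%$MONTH%-%$DAY%.log")

ruleset(name="fromAggregators") {
    action(type="omfile" dynaFile="PerHostFile")     # archive raw copy locally
    # relay a second copy to the parser node for indexing
    action(type="omfwd" target="parser01.example.com" port="1514" protocol="tcp")
}
```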
Log Parser and Search Frontend (One Node)
Application Dependencies
- Java – recommended openjdk v1.8 or later
- Elasticsearch – latest from ES repo v2.x
- Mongodb – latest from mongo repo v3.x
- Graylog-Server – latest v2.x from source
- (Optional) Apache – latest available in repo used for reverse proxy
Inputs
- Incoming streams from CLR
Processing
- Custom rules can be created to parse logs and create actions and alerts based on them
Outputs
- Output alerts to email server
- Output parsed messages to the Elasticsearch backend for indexing and storage
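Tying Graylog to the local MongoDB and the ES cluster happens in its server configuration. An abbreviated sketch of `/etc/graylog/server/server.conf`; the secrets are placeholders you must generate yourself, and depending on the 2.x minor release Elasticsearch is configured either via `elasticsearch_cluster_name` (node client, 2.2 and earlier) or `elasticsearch_hosts` (HTTP, 2.3 and later):

```
# /etc/graylog/server/server.conf -- abbreviated; secrets are placeholders
is_master = true
password_secret = <output of: pwgen -N 1 -s 96>
root_password_sha2 = <output of: echo -n yourpassword | sha256sum>
rest_listen_uri = http://127.0.0.1:9000/api/
web_listen_uri = http://127.0.0.1:9000/
mongodb_uri = mongodb://localhost/graylog
```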
Storage Cluster (Two Nodes)
Application Dependencies
- Java – recommended openjdk v1.8 or later
- Elasticsearch – latest from ES repo v2.x
Inputs
- Incoming streams from log parser server
Processing
- No processing; these nodes are used only for storage and shard replication
Outputs
- Standard endpoint logging to CLR
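On each data node, Elasticsearch needs to join the same cluster as the master node and be marked data-only. A sketch of `/etc/elasticsearch/elasticsearch.yml` for ES 2.x, with placeholder node names and addresses; the cluster name must match what Graylog expects (`graylog` by default):

```
# /etc/elasticsearch/elasticsearch.yml -- names and addresses are placeholders
cluster.name: graylog
node.name: esdata01
node.master: false          # data-only node; the parser node runs the master
node.data: true
network.host: 10.0.0.11
discovery.zen.ping.unicast.hosts: ["10.0.0.10", "10.0.0.11", "10.0.0.12"]
```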
Log Analysis Server (Optional)
Application Dependencies
- Apache – latest available in repo
- PHP – latest available in repo
- OSSEC – latest stable from source, v2.8.3 or later
Inputs
- Incoming streams from CLR via rsyslog
Processing
- Perform rule checks based off OSSEC rules
- Send alerts based on anomalies
Outputs
- Output alerts to email server
- Output logs to CLR for storage and routing to log parser server
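Accepting the CLR's rsyslog stream on the LAS is done by enabling OSSEC's remote syslog listener. A fragment of `/var/ossec/etc/ossec.conf`, with the CLR's address as a placeholder; only hosts listed in `allowed-ips` are accepted:

```
<!-- /var/ossec/etc/ossec.conf fragment; allowed-ips is a placeholder -->
<ossec_config>
  <remote>
    <connection>syslog</connection>
    <port>514</port>
    <allowed-ips>10.0.0.20</allowed-ips>
  </remote>
</ossec_config>
```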
E-Mail Server (Notifications)
Application Dependencies
- Email Server
Inputs
- Allow relay of messages from the log parser server
- Allow relay of messages from the LAS
Processing
- Check that the destination address is a local account
Outputs
- Email the recipients listed as contacts for the specific alert
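If the mail server is Postfix (one common choice; the document does not mandate a specific MTA), restricting relay to the parser and LAS nodes is a matter of listing their addresses in `mynetworks`. A sketch with placeholder addresses:

```
# /etc/postfix/main.cf fragment -- addresses are placeholders
# only localhost, the parser node, and the LAS may relay mail
mynetworks = 127.0.0.0/8, 10.0.0.10/32, 10.0.0.30/32
inet_interfaces = all
```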