Configuration of Logstash for analytics processing.
Provides a set of configuration files for:
- Logstash data collection and processing
- Filebeat template to forward your logs to Logstash
Best practices for real-world usage
Distribution by Virtual Machine
0/ Source
- Install: filebeat, syslog (UDP), JSON/TCP
- What: Dedicated VM where the data source lives / comes from
1/ Data collection
- Install: logstash
- What: Collect all sources, index them and push them to Elasticsearch
2/ Storage
- Install: elasticsearch
- What: Store and index all data
3/ Interface
- Install: kibana
- What: Provide web interface for analytics reports
Target paths assume a standard package deployment via a package management tool (yum, apt...). If your settings are not under '/etc/x', please adapt the paths to your environment.
cp filebeat/* /etc/filebeat/
cp elasticsearch/*yml /etc/elasticsearch/
\cp -fr logstash/* /etc/logstash/
These files are preconfigured to be used as-is. They will run without changes, but some configuration adaptation is required for production usage.
Global file to manage the Elasticsearch indexer; set it according to your Logstash and Filebeat environment.
Many filters are ready to use; you only need to adapt the inputs (connectors) to your systems.
- 001-syslog-input.conf --> UDP/5000
- 002-beats-input.conf --> TCP/5044
- 004-bot-input.conf --> TCP/5055
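As an illustration, the two standard inputs can look like the following minimal sketch (ports as listed above; the actual files in the repository may carry extra options):

```conf
# 001-syslog-input.conf -- raw syslog over UDP/5000
input {
  udp {
    port => 5000
    type => "syslog"
  }
}

# 002-beats-input.conf -- Filebeat/Beats over TCP/5044
input {
  beats {
    port => 5044
  }
}
```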
Most of them are based on the patterns included in the logstash-patterns-core distribution:
- 001-syslog-input.conf
- 002-beats-input.conf
- 100-syslog-filter.conf
- 101-apache-filter.conf
- 102-audit-filter.conf
- 103-login-filter.conf
- 104-bot-filter.conf
- 105-redis-filter.conf
- 106-aws-filter.conf
- 107-cisco-filter.conf
- 108-mongodb-filter.conf
- 109-nagios-filter.conf
- 110-java-filter.conf
- 111-haproxy-filter.conf
- 112-ruby-filter.conf
- 113-netscreen-filter.conf
- 114-sharewall-filter.conf
- 115-postgresql-filter.conf
- 116-rails-filter.conf
- 117-bro-filter.conf
- 118-cucmcdr-filter.conf
Elasticsearch only at this time... to be continued
- 300-elasticsearch-output.conf
- 399-debug-output.conf
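A minimal output of the 3xx kind could look like the sketch below (the host and index pattern are assumptions; adapt them to your cluster):

```conf
# 300-elasticsearch-output.conf -- push every event to the indexer
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}

# 399-debug-output.conf -- optional console dump for troubleshooting
output {
  stdout {
    codec => rubydebug
  }
}
```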
Provides log templates to quickly integrate on the hosts the data should come from.
- syslog
- audit
- redis
- apache
- apache-other-vhost
- apache-error
- dpkg
- cucm-cdr
- cucm-cmr
- bot: based on JSON and dynamic index
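For the bot case, the idea of "JSON and dynamic index" can be sketched like this (the field name 'botname' is hypothetical; the real 004-bot-input.conf may differ):

```conf
# 004-bot-input.conf -- JSON events over TCP/5055
input {
  tcp {
    port  => 5055
    codec => json
  }
}

# Output side: build the index name from a field of the event itself
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "bot-%{botname}-%{+YYYY.MM.dd}"
  }
}
```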
Thanks to Damienetwork and his website
I have not integrated the mechanism to fetch the CDR/CMR from the CUCM; it is up to you to add your own.
- Install logstash-filter-translate in Logstash
- Import the JSON into Elasticsearch and Kibana (available on Damien's site)
Thanks to manicas and their article
GeoIP is included in the apache and syslog configurations...
- To be continued in the other configurations
- Or it can be declared as a generic field under the 'clientip' name
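Declaring GeoIP on a generic 'clientip' field can be sketched as follows (a minimal example, assuming the GeoLiteCity database sits in /etc/logstash/ as per the installation steps):

```conf
filter {
  if [clientip] {
    geoip {
      source   => "clientip"
      database => "/etc/logstash/GeoLiteCity.dat"
    }
  }
}
```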
Principle:
- Unarchive the package
- Copy the content into your Logstash configuration folder
- Adapt the conf to your Elasticsearch server in the 3xx output files
- Download the GeoIP database
- Install the CUCM CDR/CMR features
git clone https://github.com/guillain/LogStash-conf
cp LogStash-conf/logstash/* /etc/logstash/conf.d/
cd /etc/logstash/
curl -O "http://geolite.maxmind.com/download/geoip/database/GeoLiteCity.dat.gz"
gunzip GeoLiteCity.dat.gz
/opt/logstash/bin/plugin install logstash-filter-translate
service logstash restart
yum install filebeat
git clone https://github.com/guillain/LogStash-conf
mv /etc/filebeat/filebeat.yml /etc/filebeat/filebeat.yml.orig
cp LogStash-conf/filebeat/filebeat.yml /etc/filebeat/filebeat.yml
service filebeat restart
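The provided filebeat.yml follows the usual Filebeat 1.x layout; a minimal equivalent looks like the sketch below (the log path and Logstash host are assumptions to replace with your own):

```yaml
filebeat:
  prospectors:
    - paths:
        - /var/log/messages
      document_type: syslog

output:
  logstash:
    hosts: ["logstash.example.com:5044"]
```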
- syslog
- filebeat
- syslog
- auditlog
- apache
- CUCM CDR/CMR
- logstash
- NodeJS bot
- Python bot
- syslog
- apache
- audit
- login
- bot
- logstash
- elasticsearch
- redis