Splunk Kafka Messaging Modular Input
In order to support very large modular input configuration XML strings, when users want to run a large number of inputs on a single instance of this app, we had to change the way that the child Java process is invoked. Previously the XML string was passed as a program argument, which could exceed the maximum argument size enforced by the Linux kernel. The logic now passes the XML string to the Java process via the STDIN pipe.
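The STDIN hand-off described above can be sketched as follows. This is a minimal illustration using Python's subprocess module, not the app's actual launch code; the command line you pass in stands in for the real java invocation.

```python
import subprocess

def launch_child(cmd, xml_config: str) -> str:
    """Spawn a child process and pass the configuration XML via the
    STDIN pipe instead of argv, avoiding the kernel's argument-size
    limit (ARG_MAX). Returns whatever the child writes to stdout."""
    proc = subprocess.Popen(cmd, stdin=subprocess.PIPE,
                            stdout=subprocess.PIPE)
    # communicate() writes the XML to the child's stdin, closes the
    # pipe so the child sees EOF, and collects its stdout.
    out, _ = proc.communicate(xml_config.encode("utf-8"))
    return out.decode("utf-8")
```

In the real app the Java child is long-lived and reads the XML once at startup; the sketch waits for the child to exit only so the round trip is observable.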
The app performs periodic socket pings to the splunkd management port to determine whether splunkd is still alive. If splunkd is not responding, usually because it has exited or is not network reachable, the app exits its running Java process. The default timeout is 300 seconds. You can change this timeout value in bin/kafka.py.
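The liveness check above amounts to attempting a TCP connection to the management port. A minimal sketch, assuming the default splunkd management port 8089; the function name and signature are illustrative, not the app's actual code:

```python
import socket

def splunkd_alive(host: str = "localhost", port: int = 8089,
                  timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to the splunkd management
    port succeeds within the timeout, False otherwise."""
    try:
        # create_connection performs the full TCP handshake, so a
        # dead or unreachable splunkd fails fast with an OSError.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

The app runs a check like this on a timer and tears down the child Java process when it fails, so orphaned JVMs are not left behind after splunkd exits.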
- upgraded internal logging libraries to Log4j2 v2.17.2
- fixed a bug in getting credentials from storage/passwords
- upgraded internal logging libraries to Log4j2 v2.17.0
- upgraded internal logging libraries to Log4j2 v2.16.0
- upgraded internal logging libraries to Log4j2 v2.15.0
- general updates to meet latest Cloud Vetting requirements
- moved kafka_password out of inputs.conf; browse to the
Setup Credentials menu tab and enter any Kafka usernames/passwords you require.
- activation key is now set up globally via a menu tab
- removed the HEC output option, default is now stdout
- upgraded core Kafka libraries. ZooKeeper is no longer required to establish connections. You can now connect directly to a Kafka bootstrap server.
- upgraded logging functionality
- added a setup page to encrypt any credentials you require in your configuration
- enforced Python 3 for execution of the modular input script. If you require Python 2.7, then download a prior version (such as 1.5).
- Python 2.7 and 3+ compatibility
- added JAXB dependencies for JRE 9+
- fixed Splunk 8 compatibility for manager.xml file
- added trial key functionality
- minor manager xml ui tweak for 7.1
- Added an activation key requirement; visit http://www.baboonbones.com/#activation to obtain a non-expiring key
- Docs updated
- Splunk 7.1 compatible
- Added a new custom handler: com.splunk.modinput.kafka.CSVWithHeaderDecoderHandler
This allows you to expand CSV files (with or without a header) into KV or JSON before indexing.
Configuration parameters can be passed to the custom message handler when you declare it.
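The transformation this handler performs can be sketched as follows. This is an illustrative Python sketch of the CSV-to-JSON logic, not the Java handler's actual implementation, and the function name is hypothetical:

```python
import csv
import io
import json

def csv_to_json_events(raw: str, has_header: bool = True) -> list:
    """Convert a CSV payload (with or without a header row) into a
    list of JSON strings, one per data row -- analogous to what
    CSVWithHeaderDecoderHandler produces before indexing."""
    rows = list(csv.reader(io.StringIO(raw)))
    if has_header:
        header, data = rows[0], rows[1:]
    else:
        # With no header, fall back to positional field names.
        header = ["field%d" % i for i in range(1, len(rows[0]) + 1)]
        data = rows
    return [json.dumps(dict(zip(header, row))) for row in data]
```

Each Kafka message containing CSV is expanded into per-row key/value events, so the fields are search-time accessible without additional props/transforms configuration.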
- Better JSON handling for HEC output (hat tip to Tivo)
- Better logging around HEC success/failure
- Can now add custom timestamp into HEC payload
- New custom handler (JSONBodyWithTimeExtraction) for extracting the timestamp from JSON messages from Kafka and adding it to the HEC payload
- Added optional output to Splunk via a HEC (HTTP Event Collector) endpoint
- Added support for a raw connection string format so that multiple ZooKeeper hosts
can be provided in a comma-delimited manner
- Enabled TLS1.2 support by default.
- Made the core Modular Input Framework compatible with latest Splunk Java SDK
- Please use a Java Runtime version 7+
- If you need to use SSLv3, you can turn this on in bin/kafka.py
- You can now pass a charset name to the DefaultHandler
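The custom-timestamp HEC feature noted above works by setting the standard HEC "time" metadata field (epoch seconds) alongside the event body. A minimal sketch of that payload shape; the function name is illustrative, and note that HEC output was removed in a later release in favour of stdout:

```python
import json

def build_hec_payload(event: dict, epoch_time=None) -> str:
    """Build an HTTP Event Collector payload, optionally overriding
    the event timestamp via HEC's "time" field (epoch seconds)."""
    payload = {"event": event}
    if epoch_time is not None:
        # Without "time", Splunk assigns the index-time timestamp.
        payload["time"] = epoch_time
    return json.dumps(payload)
```

A handler such as JSONBodyWithTimeExtraction would parse the timestamp out of the Kafka message body and pass it in as epoch_time, so events are indexed with their original event time rather than arrival time.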