Logstash 6.2.1 Released, an Open-Source Server-Side Data Processing Pipeline

Category: IT

Logstash 6.2.1 has been released. Logstash is an open-source server-side data processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to your favorite "stash". With more than 200 plugins available, it lets you centralize, transform, and stash your data.

As with previous point releases, dedicated release notes for Logstash 6.2.1 have not yet been published; stay tuned here for updates.

In the meantime, here are the highlights from the 6.2.0 release notes:

  • Added support to protect sensitive settings and configuration in a keystore.

  • Added the jdbc_static filter as a default plugin.

  • Set better defaults to allow for higher throughput under load. (#8707 and #8702)

  • Set the default configuration for RPM/DEB/Docker installations to use Multiple pipelines.

  • Added a default max size value (100MB) for log files.

  • Added compression when log files are rolled (for ZIP-based installs).

  • Added the ability to specify --pipeline.id from the command line. (#8868)

  • Implemented continued improvements to the next generation of execution. Give it a try with the command line switch --experimental-java-execution.
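Two of the items above, the secrets keystore and the --pipeline.id flag, are exercised from the command line. A minimal sketch, assuming a standard archive install (the key name ES_PWD and the config file name are illustrative):

```shell
# Create a keystore alongside logstash.yml, then add a secret named ES_PWD.
# The value is prompted for interactively and stored in the keystore
# rather than in plain-text configuration.
bin/logstash-keystore create
bin/logstash-keystore add ES_PWD

# The secret can then be referenced in logstash.yml or pipeline configs
# as ${ES_PWD} in place of a literal password.

# Run a pipeline under an explicit id (new in 6.2):
bin/logstash -f my-pipeline.conf --pipeline.id my-pipeline
```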


Jdbc_static Filter

  • Released the initial version of the jdbc_static filter, which enriches events with data pre-loaded from a remote database.
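The jdbc_static filter pre-loads reference data from a remote database into a local lookup table and enriches events against it, so lookups do not hit the remote database per event. A minimal configuration sketch, where the table, column names, driver, and connection details are all illustrative assumptions:

```
filter {
  jdbc_static {
    # Pre-load reference rows from the remote database at startup.
    loaders => [
      {
        id => "servers"
        query => "SELECT ip, descr FROM ref.local_ips"
        local_table => "servers"
      }
    ]
    # Define the local table the loader fills.
    local_db_objects => [
      {
        name => "servers"
        index_columns => ["ip"]
        columns => [
          ["ip", "varchar(15)"],
          ["descr", "varchar(255)"]
        ]
      }
    ]
    # Enrich each event by looking up its source IP locally.
    local_lookups => [
      {
        query => "SELECT descr FROM servers WHERE ip = :ip"
        parameters => { ip => "[from_ip]" }
        target => "server"
      }
    ]
    jdbc_driver_library => "/path/to/postgresql-jdbc.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://remotedb:5432/ref"
    jdbc_user => "logstash"
    jdbc_password => "${ES_PWD}"
  }
}
```

Matching events would gain a [server][descr] field taken from the pre-loaded table.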

Dissect Filter

  • Fixed multiple bugs. See the plugin release notes for 1.1.3.

Grok Filter

  • Fixed a thread leak that occurred when Logstash was reloaded.

Kafka Output

  • Improved error logging for when a producer cannot be created.


For the original Logstash 6.2.1 release announcement, see here.