Using Kafka as an alternative to Filebeat and Logstash
I'm new to the ELK stack, and I was wondering whether it is possible to ship our log files to Elasticsearch using Kafka, with the job of Logstash (parsing logs using filters such as grok) also done in Kafka. Is this possible? Essentially, I'm trying to replace the combination of Filebeat and Logstash with Kafka, and I want to know whether that can work.



Thank you :)



Note: What I am trying to do is both ship and parse the logs in Kafka. I know that shipping logs to Elasticsearch is possible using the Elasticsearch connector, but what I'm asking is whether parsing the data (Logstash's job) is also possible with Kafka.










  • You can check out this question: stackoverflow.com/questions/48561197/…. It seems it is possible using the Connect API and the Elasticsearch connector.
    – alpert
    Jul 18 at 13:08










  • Possible duplicate of How to connect Kafka with Elasticsearch?
    – dawsaw
    Jul 18 at 20:49










  • That particular question was about connecting Kafka with Elasticsearch after the logs are parsed by Logstash. What I am asking is whether it is possible to do Logstash's and Filebeat's jobs (shipping and parsing data) with Kafka.
    – Dasun Pubudumal
    Jul 19 at 0:01















elasticsearch logging apache-kafka logstash filebeat






edited Jul 19 at 0:25

























asked Jul 18 at 12:15









Dasun Pubudumal

1 Answer






I'll break your question down into two parts:



1. Can events streamed via Kafka be indexed in Elasticsearch?



Yes, if you consider Confluent's Kafka Connect part of Kafka. It's not Kafka itself that does the indexing, but a Kafka Connect sink connector configured to consume from your Kafka topics and index the events in Elasticsearch.



You can find more information here: https://docs.confluent.io/current/connect/kafka-connect-elasticsearch/index.html
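For illustration, a minimal sink-connector configuration might look like the following sketch (the connector name, the topic `app-logs`, and the connection URL are placeholder assumptions, not from the original post):

```json
{
  "name": "logs-elasticsearch-sink",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "tasks.max": "1",
    "topics": "app-logs",
    "connection.url": "http://localhost:9200",
    "type.name": "_doc",
    "key.ignore": "true",
    "schema.ignore": "true"
  }
}
```

Posting this JSON to the Connect REST API (`POST /connectors`) would start a task that continuously drains the topic into Elasticsearch, with no custom code involved.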



2. Can I achieve the same parsing, transformation, and flow-control features of Logstash directly in Kafka?



The only Kafka ecosystem features I'm aware of that can help with this are Kafka Streams (though you have to develop against the Kafka Streams API) and another piece of Confluent software called KSQL, which lets you run SQL stream processing on top of Kafka topics and is more oriented toward analytics (data filtering, transformations, aggregations, joins, windowing, and sessionization).



You can find more information on KStreams here: https://kafka.apache.org/documentation/streams/
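As a hypothetical sketch of the Kafka Streams approach (the class, the log format, and the field names below are illustrative, not from the original post), the grok-like work reduces to a per-record transform that you could apply in a topology with something like `stream.mapValues(LogParser::parse)`:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical grok-style parser for a simple access-log line such as:
//   127.0.0.1 GET /index.html 200
// In a Kafka Streams topology this would run per record, e.g.
//   stream.mapValues(LogParser::parse)
public class LogParser {

    // Named capture groups play the role of grok's field names.
    private static final Pattern LINE = Pattern.compile(
        "(?<ip>\\S+) (?<method>\\S+) (?<path>\\S+) (?<status>\\d{3})");

    public static Map<String, String> parse(String line) {
        Map<String, String> fields = new HashMap<>();
        Matcher m = LINE.matcher(line);
        if (m.matches()) {
            fields.put("ip", m.group("ip"));
            fields.put("method", m.group("method"));
            fields.put("path", m.group("path"));
            fields.put("status", m.group("status"));
        }
        // Non-matching lines yield an empty map instead of failing the stream.
        return fields;
    }

    public static void main(String[] args) {
        System.out.println(parse("127.0.0.1 GET /index.html 200"));
    }
}
```

The trade-off versus Logstash is clear here: you get full control and Kafka's scaling model, but every grok pattern becomes code you write and maintain yourself.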



And you can find more information on KSQL here: https://docs.confluent.io/current/ksql/docs/index.html
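As a hedged sketch of what KSQL-side parsing could look like (the stream names, the `app-logs` topic, and the assumption that each record's value is a JSON string are all illustrative), a built-in function such as `EXTRACTJSONFIELD` can pull fields out of a raw string column:

```sql
-- Register the raw log topic as a KSQL stream of unparsed strings
CREATE STREAM raw_logs (line VARCHAR)
  WITH (KAFKA_TOPIC = 'app-logs', VALUE_FORMAT = 'DELIMITED');

-- Continuously derive a parsed stream from it
CREATE STREAM parsed_logs AS
  SELECT EXTRACTJSONFIELD(line, '$.ip')     AS ip,
         EXTRACTJSONFIELD(line, '$.method') AS method,
         EXTRACTJSONFIELD(line, '$.status') AS status
  FROM raw_logs;
```

This covers structured (e.g. JSON) logs reasonably well; free-form text that would need grok patterns is where KSQL falls short of Logstash.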



Conclusion



In my opinion you won't be able to achieve all of the parsing and transformation capabilities of Logstash or NiFi without programming against the Kafka Streams API, but you can definitely use Kafka Connect to get data into Kafka, or out of Kafka, for a wide array of technologies, just as Logstash does.



A nice illustration of such a setup (taken from Confluent) would be: [image]






  • Wow! Thank you for the great explanation :) It is always wonderful to have these kinds of explanations in Stack Overflow posts.
    – Dasun Pubudumal
    Nov 21 at 3:27











answered Nov 15 at 17:56









Alexandre Juma
