apache kafka - Event sourcing in Flink


I have a Flink application implemented following the event-sourcing paradigm. Both events and commands are stored in several Kafka topics.

The application has two startup modes: recovery and production. First, recovery mode is used to rebuild the application state (a savepoint) from the events topics; in this mode, commands are not read at all. Once the event topics have been fully processed, a savepoint is triggered manually (from the command line) and the application is stopped. Then, a new YARN process is started in production mode. In this mode the application processes both events and commands.
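For context, this is roughly how the two topologies are wired up today; it is a simplified sketch, not my real code, and the topic names, group id, plain-string payloads and the `print()` placeholders are all made up:

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class EventSourcedJob {

    // Helper: a Kafka source for one topic (names are placeholders).
    private static DataStream<String> kafkaTopic(StreamExecutionEnvironment env, String topic) {
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("kafka:9092")
                .setTopics(topic)
                .setGroupId("event-sourced-app")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();
        return env.fromSource(source, WatermarkStrategy.noWatermarks(), topic);
    }

    // Recovery mode: rebuild the application state from the events topic only.
    public static void buildRecoveryTopology(StreamExecutionEnvironment env) {
        DataStream<String> events = kafkaTopic(env, "events");
        events.print();    // placeholder for the real state-building operators
    }

    // Production mode: the same state-building pipeline plus command handling.
    public static void buildProductionTopology(StreamExecutionEnvironment env) {
        DataStream<String> events = kafkaTopic(env, "events");
        DataStream<String> commands = kafkaTopic(env, "commands");
        events.print();    // placeholder for the real state-building operators
        commands.print();  // placeholder for the real command-handling operators
    }
}
```

Today I launch this twice (recovery topology first, then the production topology from the manually taken savepoint).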

I would prefer to drive this whole process programmatically; a rough sketch of what I have in mind follows the questions below. How can the application itself:

  1. detect that the Kafka sources have been fully read?
  2. trigger a savepoint programmatically?
  3. stop and start itself programmatically?
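
This is the kind of driver I imagine, reusing the `EventSourcedJob` sketch above. It is only a sketch of the idea, not tested code: it assumes a recent Flink version (where `executeAsync()` returns a `JobClient` and `execution.savepoint.path` is a config option), the `kafka-clients` `AdminClient`, checkpointing enabled so the Kafka source commits its offsets back to the consumer group, and placeholder topic, group and path names.

```java
import org.apache.flink.configuration.Configuration;
import org.apache.flink.core.execution.JobClient;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;

import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.stream.Collectors;

public class RecoveryDriver {

    public static void main(String[] args) throws Exception {
        Properties kafka = new Properties();
        kafka.setProperty("bootstrap.servers", "kafka:9092");
        kafka.setProperty("group.id", "driver-probe");
        kafka.setProperty("key.deserializer", ByteArrayDeserializer.class.getName());
        kafka.setProperty("value.deserializer", ByteArrayDeserializer.class.getName());

        // (1) Record the end offsets of the events topic *before* starting recovery,
        //     so "fully read" has a well-defined meaning.
        Map<TopicPartition, Long> target;
        try (KafkaConsumer<byte[], byte[]> probe = new KafkaConsumer<>(kafka)) {
            List<TopicPartition> partitions = probe.partitionsFor("events").stream()
                    .map(p -> new TopicPartition(p.topic(), p.partition()))
                    .collect(Collectors.toList());
            target = probe.endOffsets(partitions);
        }

        // Submit the recovery topology without blocking. In my real setup this
        // driver would have to target the YARN cluster; here I just use the
        // default environment for the sake of the sketch.
        StreamExecutionEnvironment recoveryEnv = StreamExecutionEnvironment.getExecutionEnvironment();
        recoveryEnv.enableCheckpointing(10_000);   // needed so offsets are committed back to Kafka
        EventSourcedJob.buildRecoveryTopology(recoveryEnv);
        JobClient recovery = recoveryEnv.executeAsync("recovery");

        // (1) Poll the committed offsets of the job's consumer group until every
        //     partition has reached the end offsets recorded above.
        try (AdminClient admin = AdminClient.create(kafka)) {
            while (true) {
                Map<TopicPartition, OffsetAndMetadata> committed = admin
                        .listConsumerGroupOffsets("event-sourced-app")
                        .partitionsToOffsetAndMetadata().get();
                boolean caughtUp = target.entrySet().stream().allMatch(e ->
                        committed.get(e.getKey()) != null
                                && committed.get(e.getKey()).offset() >= e.getValue());
                if (caughtUp) break;
                Thread.sleep(5_000);
            }
        }

        // (2) + (3) Take a savepoint and stop the recovery job in one call.
        String savepoint = recovery.stopWithSavepoint(false, "hdfs:///flink/savepoints").get();

        // (3) Start the production topology (events + commands) from that savepoint.
        Configuration conf = new Configuration();
        conf.setString("execution.savepoint.path", savepoint);
        StreamExecutionEnvironment prodEnv = StreamExecutionEnvironment.getExecutionEnvironment(conf);
        EventSourcedJob.buildProductionTopology(prodEnv);
        prodEnv.executeAsync("production");
    }
}
```

I am aware that savepoint triggering and stopping are also exposed through Flink's REST API, so a wrapper outside the JVM could drive the same sequence; I just don't know whether the approach above is the intended way to do it.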

Thank you!

