apache kafka - Event sourcing in Flink


I have a Flink application implemented following the event-sourcing paradigm. Both events and commands are stored in several Kafka topics.

The application has two startup modes: recovery and production. First, recovery mode is used to rebuild the application state (a savepoint) from the event topics. In this mode, commands are not read at all. Once the event topics have been fully processed, a savepoint is triggered manually (from the command line) and the application is stopped. Then, the YARN process is started again in production mode. In this mode the application processes both events and commands.
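One idea I have been looking at for detecting that the event topics have been fully read is to consume them as a bounded stream during recovery, so that the job at least knows when the existing events have been replayed. Below is a rough, untested sketch assuming a Flink version that ships the newer KafkaSource connector; the bootstrap address, topic name, group id and the print() sink are just placeholders for my real setup:

    import org.apache.flink.api.common.eventtime.WatermarkStrategy;
    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.connector.kafka.source.KafkaSource;
    import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class RecoveryJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // Recovery mode: replay the event topic as a bounded stream.
            // setBounded(latest) makes the source stop at the offsets that are
            // current when the job starts, so the job terminates by itself once
            // the existing events have been processed.
            KafkaSource<String> events = KafkaSource.<String>builder()
                    .setBootstrapServers("kafka:9092")                  // placeholder
                    .setTopics("events")                                // placeholder
                    .setGroupId("recovery")                             // placeholder
                    .setStartingOffsets(OffsetsInitializer.earliest())
                    .setBounded(OffsetsInitializer.latest())
                    .setValueOnlyDeserializer(new SimpleStringSchema())
                    .build();

            env.fromSource(events, WatermarkStrategy.noWatermarks(), "events")
               .print();                                                // stand-in for the real state-building logic

            env.execute("recovery");
        }
    }

What I am not sure about is whether this can be combined with the savepoint step, since a bounded job would simply finish on its own instead of waiting to be stopped.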

I would prefer to execute this whole process programmatically. To do that, several questions arise... how can the application itself:

  1. detect that the Kafka sources have been fully read?
  2. trigger a savepoint programmatically? (see the sketch after this list)
  3. stop and start itself programmatically?
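For points 2 and 3, what I have in mind is calling the JobManager's REST API from a small driver program, roughly like this (untested sketch; the JobManager address, savepoint directory and job id are placeholders, and the `/jobs/<jobId>/savepoints` endpoint with `cancel-job` assumes a reasonably recent Flink version):

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class SavepointAndStop {
        public static void main(String[] args) throws Exception {
            String jobManager = "http://jobmanager:8081";   // placeholder: JobManager REST address
            String jobId = args[0];                         // id of the running recovery job

            // Trigger a savepoint and cancel the job in one asynchronous call.
            String body = "{\"target-directory\": \"hdfs:///flink/savepoints\", \"cancel-job\": true}";
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(jobManager + "/jobs/" + jobId + "/savepoints"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());

            // The response contains a trigger id; polling /jobs/<jobId>/savepoints/<triggerId>
            // should return the savepoint path once the operation completes.
            System.out.println(response.body());
        }
    }

The new production-mode job could then be started from the same driver, e.g. by launching `flink run -s <savepointPath> ...` through a ProcessBuilder, but I do not know whether that is the recommended approach.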

Thank you!

