Storing data that changes over time: what approaches are available for subsequent statistical analysis?
There is a system for automating a certain business process. Its database (SQL Server) stores only the current state; changes to objects (or their parts) are not logged anywhere. Naturally, with this state of affairs it is impossible to answer questions like: "How many orders did we have in state N on date D?", "How long, on average, do orders spend in state S?", "Who violates the business process most often, when, and how frequently?"
Hence the conclusion: before starting on analytics, we need to set up a data storage scheme that lets us pick any point in time and get the state of the OLTP system as of that moment (ideally as a filter parameter in a query, or a stored procedure that builds the dataset for a report).
Can you please tell me how to implement this in practice? Can you share links to books or articles on the topic? Are there ready-made databases optimized for such tasks? Has anyone implemented this on SQL Server (the customer insists on it)?
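A common approach is a history (audit) table: every state change is recorded with a validity interval, so the system's state "as of" any moment can be reconstructed with a range query. The sketch below is a minimal illustration using Python and SQLite with hypothetical table and column names; on SQL Server the same pattern is available natively as system-versioned temporal tables (SQL Server 2016+, queried with `FOR SYSTEM_TIME AS OF`), or can be implemented with triggers on older versions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE order_history (
    order_id   INTEGER NOT NULL,
    state      TEXT    NOT NULL,
    valid_from TEXT    NOT NULL,   -- when this state became current
    valid_to   TEXT                -- NULL = still current
);
""")

def set_state(order_id, state, at):
    # Close the previous interval and open a new one
    # (on the OLTP side a trigger would normally do this).
    conn.execute("UPDATE order_history SET valid_to = ? "
                 "WHERE order_id = ? AND valid_to IS NULL", (at, order_id))
    conn.execute("INSERT INTO order_history VALUES (?, ?, ?, NULL)",
                 (order_id, state, at))

def states_as_of(moment):
    # Point-in-time query: which interval covered the given moment?
    return conn.execute(
        "SELECT order_id, state FROM order_history "
        "WHERE valid_from <= ? AND (valid_to IS NULL OR valid_to > ?) "
        "ORDER BY order_id", (moment, moment)).fetchall()

set_state(1, "new",     "2023-01-01")
set_state(1, "shipped", "2023-01-05")
set_state(2, "new",     "2023-01-03")

print(states_as_of("2023-01-04"))  # → [(1, 'new'), (2, 'new')]
```

With intervals stored this way, questions like "how many orders were in state N on date D" become a simple `COUNT` over the same range predicate.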
EVG, you are getting few answers because the task is stated too abstractly. How to store the data depends on the objects and on the purpose of storage. We know neither, so any advice would be pure guesswork.
Experienced developers and analysts settle such questions at the system design stage. Given the level of detail in the question, all I can advise is to use NoSQL, or SQL with NoSQL capabilities; for example, Postgres works very well with JSON.
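One way to apply that suggestion is an append-only event log where each change is stored as a JSON document plus a timestamp (in Postgres this would be a `jsonb` column). Metrics from the original question, such as average time in a state, can then be derived from the log alone. This is a hedged sketch with hypothetical field names, shown in plain Python:

```python
from datetime import datetime

# Hypothetical event log: each entry holds a timestamp and the changed
# object as a JSON-like document (a jsonb column in Postgres).
events = [
    {"ts": "2023-01-01T10:00", "doc": {"order": 1, "state": "new"}},
    {"ts": "2023-01-02T10:00", "doc": {"order": 1, "state": "processing"}},
    {"ts": "2023-01-05T10:00", "doc": {"order": 1, "state": "done"}},
    {"ts": "2023-01-03T10:00", "doc": {"order": 2, "state": "new"}},
    {"ts": "2023-01-03T22:00", "doc": {"order": 2, "state": "processing"}},
]

def avg_hours_in_state(state):
    # Average time orders spend in the given state, derived purely
    # from consecutive events in the log.
    per_order = {}
    for e in sorted(events, key=lambda e: e["ts"]):
        per_order.setdefault(e["doc"]["order"], []).append(e)
    durations = []
    for evs in per_order.values():
        for cur, nxt in zip(evs, evs[1:]):
            if cur["doc"]["state"] == state:
                t0 = datetime.fromisoformat(cur["ts"])
                t1 = datetime.fromisoformat(nxt["ts"])
                durations.append((t1 - t0).total_seconds() / 3600)
    return sum(durations) / len(durations) if durations else None

print(avg_hours_in_state("new"))  # order 1: 24 h, order 2: 12 h → 18.0
```

The trade-off versus interval tables is that each question requires replaying or aggregating the log, but nothing about the schema has to be decided up front.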