Is the procedural/imperative approach the only way to manage data? Isn’t the declarative approach we follow in everyday life better?

Every day, in countless activities, we interact with other people and with the world around us, communicating what we need and receiving what we ask for.

This is because we are not experts in everything, and we know that better results are obtained by delegating the work to those who have the right skills and have already done it many times before.

Who could board an airplane and execute the take-off sequence on their own?

Or who, at a restaurant, would hand the waiter the recipe the chef must follow to prepare a dish?

In real life, behavior follows a declarative paradigm: a result is commissioned; how to obtain it is not specified.

In the IT world, instead, and particularly in data management, the prevailing approach has so far been procedural: defining in detail how an application works, down to the most technical aspects.

  • In a data-intensive solution project, is it really necessary to describe the whole process, break it down into a series of steps and sub-steps, and specify for each of them the algorithms, the intermediate structures, and the aggregation and reporting operations?
  • Is it then essential to state precisely when these intermediate information structures must be created, populated and, once no longer needed, deleted?
  • And again, is it really necessary to indicate explicitly (perhaps even through a nice graphical editor) the sequence in which these elementary operations must be performed, foreseeing every possible execution path under different conditions?

Is it really the only possible option?

The answer to these questions is: no.

Even in data-intensive processes, the declarative paradigm can greatly simplify the life of the development team and produce results more in line with the client’s expectations.

Let’s start with how data processing is organized. A declarative model does not require the execution steps to be spelled out at the design stage: which functional components to create, and in what order to execute them, is determined by the system from the declared needs, precedences and implicit dependencies. This is something a properly designed system can do well, without forcing the designer to specify everything in advance.

This capability is available in the Irion EDM platform, where it is called Declarative ELT (DELT™).
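
To make the idea concrete, here is a minimal sketch of how such an engine can work. It is written in plain Python, the component names are invented, and it is not Irion’s DELT™ API: each functional component only declares what it consumes and what it produces, and the execution order is derived from those declarations instead of being scripted by the designer.

```python
# A generic illustration of declarative orchestration (not Irion's DELT API):
# components declare inputs and outputs; the engine derives the run order.
from graphlib import TopologicalSorter  # standard library, Python 3.9+

components = {
    # name: (inputs it needs, outputs it produces, elided processing logic)
    "load_orders":  (set(),                 {"orders"},          lambda d: {"orders": [...]}),
    "load_clients": (set(),                 {"clients"},         lambda d: {"clients": [...]}),
    "enrich":       ({"orders", "clients"}, {"orders_enriched"}, lambda d: {"orders_enriched": [...]}),
    "report":       ({"orders_enriched"},   {"report"},          lambda d: {"report": [...]}),
}

# A component depends on whichever component produces the data it declares as input.
producer = {out: name for name, (_, outs, _) in components.items() for out in outs}
graph = {name: {producer[i] for i in ins} for name, (ins, _, _) in components.items()}

data = {}
for name in TopologicalSorter(graph).static_order():  # order is derived, not scripted
    ins, outs, fn = components[name]
    data.update(fn({k: data[k] for k in ins}))
    print(f"executed {name} -> {sorted(outs)}")
```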

Let’s now move on to the management of the intermediate information structures used for data transformation and aggregation. In a declarative approach they are created and destroyed automatically, as and where needed. Intermediate and even complex results produced by the functional components (as well as the input data) are represented as virtual tables, available to other components and explorable through queries for testing purposes: Irion EDM’s Everything As a Table (EAsT™) takes care of this.
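
Purely as an illustration of the “everything as a table” idea (the table names and SQL below are invented, and this is not the EAsT™ interface), an in-memory SQL engine can mimic the behaviour: every intermediate result is published as a table or view that later components, or an ad-hoc test query, can read directly.

```python
import sqlite3

# In-memory workspace: every intermediate result is exposed as a (virtual) table,
# so later steps and test queries can read it without bespoke plumbing.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders(id INTEGER, client TEXT, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?, ?)",
               [(1, "ACME", 120.0), (2, "ACME", 80.0), (3, "Globex", 50.0)])

# An intermediate aggregation is just another table-like object: here, a view.
db.execute("""CREATE VIEW orders_by_client AS
              SELECT client, SUM(amount) AS total
              FROM orders GROUP BY client""")

# Any other component, or a manual test, can now simply query it.
for row in db.execute("SELECT * FROM orders_by_client ORDER BY client"):
    print(row)  # ('ACME', 200.0) then ('Globex', 50.0)
```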

The data required for each execution is automatically made available in an isolated, dedicated operating space: IsolData™. It is a virtually unlimited workspace, allocated and freed dynamically by the system, segregated in terms of access permissions and namespace, and either volatile or persistent. IsolData™ enables any number of parallel executions on the same data or on completely different data, and/or with different parameters, rules and logic, without any particular structure having to be predefined or managed; all the supporting infrastructure operations are coordinated by Irion EDM in a way that is optimized and transparent for the user.
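
The isolation principle can also be sketched in generic terms (again a hypothetical example, not the IsolData™ implementation): give every execution its own disposable workspace, created on demand and dropped when the run ends, so that parallel runs on the same logical data never interfere with each other.

```python
import os, sqlite3, tempfile
from concurrent.futures import ThreadPoolExecutor

def run_isolated(run_id, params):
    """Each execution gets a private, throw-away workspace (here a SQLite file
    in its own temporary directory), removed automatically when the run ends."""
    with tempfile.TemporaryDirectory(prefix=f"run_{run_id}_") as workspace:
        db = sqlite3.connect(os.path.join(workspace, "work.db"))
        db.execute("CREATE TABLE work(value REAL)")
        db.executemany("INSERT INTO work VALUES (?)",
                       [(v * params["factor"],) for v in (1, 2, 3)])
        (total,) = db.execute("SELECT SUM(value) FROM work").fetchone()
        db.close()
        return run_id, total  # the workspace is discarded after this point

# Several runs on the same logical data, with different parameters, in parallel.
with ThreadPoolExecutor() as pool:
    args = [(1, {"factor": 10}), (2, {"factor": 100})]
    for rid, total in pool.map(lambda a: run_isolated(*a), args):
        print(rid, total)  # 1 60.0 and 2 600.0
```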

“With a declarative approach, data quality can be guaranteed at a much lower cost than with a procedural approach.”

For example, in a declarative model all the technical controls on data can be generated automatically by the system as soon as the input is declared (flow, uniqueness, mandatory values, format, expected record count).

Irion EDM provides this as well, through the RTG Add Ons: all the “formal” controls are generated automatically from the expected characteristics of the data, declared through an intuitive interface; the execution of these checks is then scheduled automatically by the system according to the DELT™ logic.
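
To give a flavour of the mechanism (the declaration format below is invented for the example and is not the RTG Add Ons configuration), the designer only states the expected characteristics of an input, and the corresponding formal checks are derived and executed from that declaration rather than hand coded:

```python
import re

# Hypothetical declaration of what an input should look like; the checks
# below are generated from it, not written one by one.
declared_input = {
    "expected_rows": 3,
    "columns": {
        "id":    {"mandatory": True, "unique": True},
        "email": {"mandatory": True, "format": r"^[^@\s]+@[^@\s]+$"},
    },
}

def generated_checks(decl, rows):
    """Yield (check_name, passed) pairs derived from the declaration."""
    yield "expected_rows", len(rows) == decl["expected_rows"]
    for col, rules in decl["columns"].items():
        values = [r.get(col) for r in rows]
        if rules.get("mandatory"):
            yield f"{col}.mandatory", all(v not in (None, "") for v in values)
        if rules.get("unique"):
            yield f"{col}.unique", len(set(values)) == len(values)
        if "format" in rules:
            yield f"{col}.format", all(re.match(rules["format"], str(v)) for v in values if v)

rows = [{"id": 1, "email": "a@x.io"}, {"id": 2, "email": "b@x.io"}, {"id": 2, "email": "bad"}]
for name, ok in generated_checks(declared_input, rows):
    print(f"{name:>16}: {'OK' if ok else 'FAILED'}")
```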

Let’s not forget documentation management: when changes are requested while the solution is in operation, keeping the documentation aligned normally requires a significant effort. In a declarative development environment, however, the entire operation of the solution is driven by metadata; the Irion EDM platform exploits this and, from that same metadata, can automatically produce the complete documentation of the solution, always aligned with its actual operating status.
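
Since the whole solution is described by metadata, the documentation can in principle be derived from that same metadata. A toy sketch of the idea (hypothetical component descriptions, not Irion’s actual generator):

```python
# If the solution is fully described by metadata, documentation becomes just
# another output derived from it (a toy illustration, not Irion's generator).
components = {
    "load_orders": {"inputs": [], "outputs": ["orders"],
                    "description": "Load raw orders from the source system."},
    "enrich":      {"inputs": ["orders", "clients"], "outputs": ["orders_enriched"],
                    "description": "Join orders with client master data."},
}

def render_docs(meta):
    lines = ["Solution documentation (generated from metadata)", ""]
    for name, c in meta.items():
        lines += [f"* {name}: {c['description']}",
                  f"  reads: {', '.join(c['inputs']) or 'none'}",
                  f"  writes: {', '.join(c['outputs'])}", ""]
    return "\n".join(lines)

print(render_docs(components))
```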

And these are just some of the areas in which a solution can be improved by adopting the declarative paradigm.

The result is metadata-driven applications that are faster to build, of higher quality, better performing, well documented and therefore easier to maintain. For example, the development time of a data-intensive solution built with Irion EDM, exploiting its “declarative” features, can be reduced by up to 70% compared to traditional procedural development.

The management of an Enterprise Data Management project gains significant advantages from the application of the declarative paradigm. With reference to the questions asked above, we can outline the following:

  • The level of complexity and the implementation time are considerably reduced. It is no longer necessary to specify, before the implementation phase, a whole series of technical details, because a declarative development environment manages them autonomously. Furthermore, the technical data quality controls, whose implementation is usually onerous, are generated automatically, and the implementation and testing of each individual component can be carried out independently and in parallel by different teams. As already mentioned, this makes more frequent check-ins with the customer possible, reducing the overall time needed to deliver the final solution;
  • the implementation project and the operation of the solution are managed in a single integrated platform that includes all the necessary functionality; the declarative environment reduces, and in many cases removes, the need for specialized technical skills (DBA, web developer, UX expert), since a good part of the tasks these figures would normally handle is managed automatically;
  • requirement changes during the project, even significant ones, involve work only on the directly impacted functional components, while the others adapt to any changes in the intermediate information structures, thanks to the characteristics of the DELT™ and IsolData™ technologies; the overall impact of changes on project timelines is therefore lower than with a procedural approach;
  • automatic documentation generation facilitates discussion with non-business users and the verification of functional requirements, accelerates completion times and improves the maintainability of the solution when changes arise during construction.