Exact (2)
Implement the data transformation logic within the data pipeline.
In this paper, three different approaches were explored: 1. Implement the data transformation logic within the data pipeline.
Similar (58)
The third approach, like the second, carries out no transformation in the data pipeline; instead, the transformation logic is implemented in a common library that can be imported into any analytics job.
If the data pipeline were replaced, the transformation logic would need to be re-implemented.
As expected, the first approach was complex: all the transformation logic sat in the custom data pipeline, so it had to be re-implemented in Apache Flume.
The second approach has two benefits: the transformation logic is moved to a centralised location, and the untampered raw data are stored alongside the transformed data.
Invokers provide the invocation logic to binding protocols and implementation technologies, while interceptors are a special kind of invoker that provides additional functionality, such as data transformation, security, and transaction control.
Data ingestion with data transformation and without data transformation.
(a) Before data transformation and (b) after log data transformation.
The transformation logic can be implemented in a shared library, which can be imported into any analytics job.
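A minimal Python sketch of the shared-library idea described in the sentences above, assuming invented names (shared_transform, normalise_record) and invented record fields (uid, event, ts) purely for illustration; none of these come from the cited sources:

# shared_transform.py -- hypothetical shared library for the transformation logic.
# In practice this module would be packaged once and imported by both the ingestion
# pipeline and every analytics job, so the raw data can be stored untouched while
# the transformation logic lives in a single, centralised place.
from datetime import datetime, timezone
from typing import Any, Dict

def normalise_record(raw: Dict[str, Any]) -> Dict[str, Any]:
    """Turn one raw log record into an analytics-friendly shape."""
    return {
        "user_id": str(raw["uid"]).strip(),
        "event": str(raw.get("event", "unknown")).lower(),
        "timestamp": datetime.fromtimestamp(raw["ts"], tz=timezone.utc).isoformat(),
    }

# An analytics job would simply import normalise_record from this module instead of
# re-implementing the logic:
if __name__ == "__main__":
    raw_records = [{"uid": 42, "event": "LOGIN", "ts": 1700000000}]
    print([normalise_record(r) for r in raw_records])

Because the function is imported rather than copied, replacing the pipeline itself would not require re-implementing the transformation logic, which is the trade-off the surrounding sentences contrast.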