This question was originally posted on DCIM Support by Akihiro Yamagami on 2018-09-13
I could not find the documentation for setting up the environment to develop a custom transformation.
I took the following steps. Is this the appropriate procedure?
By the way, I'm developing a custom transformation, but I still cannot understand the purpose of the "Spoon Client Library".
I also found the DCO-related steps, such as "DCO Cage Input", in the "Design" view of Spoon, but I could not find information on how to use them.
Could you kindly explain the purpose of introducing "Spoon Client Library"?
Regards,
Akihiro Yamagami
(CID:134031147)
This answer was originally posted on DCIM Support by Benjamin Bjørn Larsen on 2018-09-13
Hi Akihiro,
I must admit, I am unsure if we have any official documentation on how to interact with these custom steps. However, I can give you a brief explanation.
In a few words:
The "Spoon Client Library" is a library, which exposes Pentaho Data Integration/Spoon/PDI transformation steps specific for Struxuware Data Center Operation (DCO), to the PDI client.
Some deeper insights:
The steps give you, the designer of ETL integrations, a means to interact with the DCO server directly, utilising some of the functionality and computations done internally in DCO.
By nature, the transformation steps are "black box" steps that don't give you any insight into their inner workings; each one connects to a counterpart on the DCO server. The data they expose can also be found directly in the output of the standard export job.
When I am designing an ETL export integration, I rarely use the steps myself (I can't think of when, if ever). Instead, whenever I need some of the data generated through these steps, I usually base the integration on the output from the standard export job. I've found that easier to work with in a general context than the usage-specific steps from the library. We do, however, provide the option to use the steps if you'd prefer.
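As a rough illustration of that approach, here is a minimal sketch of reading data that the standard export job has written into the export database, from outside PDI. The connection details, table name, and columns below are hypothetical placeholders (the real export schema depends on your DCO setup), and the sketch assumes the export database can be reached as a PostgreSQL database:

```python
# Minimal sketch: query data produced by the standard export job instead of
# using the DCO-specific Spoon steps. All names below (host, database, user,
# table, columns) are hypothetical placeholders -- check your own export
# database schema and connection details before use.
import psycopg2  # assumes the export database is PostgreSQL-compatible

conn = psycopg2.connect(
    host="dco-export-host",   # placeholder host
    dbname="export_db",       # placeholder export database name
    user="etl_reader",        # placeholder credentials
    password="secret",
)

with conn, conn.cursor() as cur:
    # Hypothetical query against a table written by the export job.
    cur.execute("SELECT * FROM assets LIMIT 10")
    for row in cur.fetchall():
        print(row)

conn.close()
```

Inside PDI itself, the equivalent would simply be a standard Table Input step pointed at the same export database.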
I hope this was the information you needed; otherwise I can try to find some official documentation (if we have any) for you.
Best regards,
Benjamin
(CID:134031629)
This comment was originally posted on DCIM Support by Akihiro Yamagami on 2018-09-18
Thank you for your detailed explanation. It's quite helpful for understanding ETL.
I understand that the steps of the "Spoon Client Library" are used internally for connecting to the DCO server, and users generally don't need to worry about them.
I can already get enough data in the export database using the standard export job, swoExportJob.
So, I will use this export job.
Best regards,
Akihiro Yamagami
(CID:134032985)