
Available scripts and objects in synchronization engine mapping scripts

 I can't find any documentation describing the scripts, objects, and functions that can be referenced from within a synchronization engine mapping script.  I would like to reference the custom schema type (filter), the mapping object name, or other properties of the mapping or workflow that should be available at run time.  The goal in this scenario is to connect to an LDAP system that doesn't natively expose groups.  I have created a custom filtered schema class and a mapping that references the new class, but the value of vrtStructuralObjectClass (in browse) shows the parent schema class instead of the one selected during mapping creation.  I need some way of determining which mapping is currently running so that I can use that information in a property mapping script (OU selection).

  • Hi Markus,

    I believe the OP's question was generic.

    You are correct to say I could solve this problem by modifying the template, but that's not really the point. There is no established or formally communicated best practice for resolving this sort of issue, and if I were a customer I would be entirely within my rights to insist on full and formal change control (with design documentation, impact assessment, regression testing and the rest) before modifying any production One Identity Manager script that affects multiple target systems.

    Which means there's at least one real-world scenario I can think of (because I've already been there!) where modifying a sync project might take a day to develop, unit test and deploy to production, but making a similar change to the template might take 3 months because of the perceived impact.

    Here's a real-world technical example. I have seen a UNS system in a dev lab which uses containers. Each row has a "Container ID" and a "Parent container ID" value. The system does actually have a field for the DN. But these DNs are not the same as those that would be calculated by Identity Manager.

    Should I use the sync project to pass the DN generated by the target system to One Identity Manager, and have the target system owner deal with any data quality issues this picks up on? Change Control risk: does not affect any other system.

    Or should I let the template calculate the DN from the CNs and end up owning any data quality issues? Change Control risk: affects every system with containers in the UNS Name Space, while making me responsible for fixing the data as well as regression-testing the change to the template against all other onboarded UNS systems.
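    To make the trade-off concrete, here is a rough sketch of what "calculating the DN from the CNs" involves: walking the "Parent container ID" links up to the root. It is written in Python purely for illustration (real One Identity Manager scripts would be VB.Net), and the field names and base suffix are hypothetical, not One Identity Manager API.

```python
# Hypothetical sketch: derive a container DN by walking "Parent container ID"
# links, roughly the way a template might calculate it from CNs.
# Field names ("cn", "parent_id") and the suffix are illustrative only.

def build_dn(container_id, containers, suffix="DC=example,DC=com"):
    """Walk parent links from the given container up to the root and
    assemble an LDAP-style DN from the CN at each level."""
    parts = []
    current = containers.get(container_id)
    while current is not None:
        parts.append(f"CN={current['cn']}")
        current = containers.get(current["parent_id"])
    return ",".join(parts + [suffix])

containers = {
    "1": {"cn": "Root", "parent_id": None},
    "2": {"cn": "Sales", "parent_id": "1"},
    "3": {"cn": "EMEA", "parent_id": "2"},
}

print(build_dn("3", containers))  # CN=EMEA,CN=Sales,CN=Root,DC=example,DC=com
```

    Note that if the calculated value disagrees with the DN field the target system already supplies, someone has to own that discrepancy, which is exactly the change-control question above.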

    My preference would be to do this in the sync project (if possible) because

    (a) Data quality issues in a custom target system are not my job to fix. I just need MY code to recognize a data quality issue and react appropriately.

    (b) A template change forces ALL UNSContainerB.DistinguishedName values for ALL container objects across all target systems to undergo the same sanity check. Really, one-off edge cases unique to one target system shouldn't even touch a "global" script.
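    For (a), the "recognize and react" logic can stay local to the sync project. A minimal sketch, again in Python for illustration only: a cheap structural check on the DN delivered by the target system, with the reaction (skip and report) kept in my own code. The regex and the handling are assumptions, not One Identity Manager API, and a DN containing escaped commas would need a proper parser rather than this naive split.

```python
import re

# Hypothetical sketch of the "recognize and react" approach from (a):
# validate the DN delivered by the target system instead of recalculating it.
# Naive check: does every comma-separated RDN look like type=value?
DN_COMPONENT = re.compile(r"^(CN|OU|DC)=[^,=]+$", re.IGNORECASE)

def dn_is_sane(dn):
    """Cheap structural check on an LDAP-style DN string."""
    if not dn:
        return False
    return all(DN_COMPONENT.match(part.strip()) for part in dn.split(","))

# React in the sync code rather than fixing the source data:
for dn in ["CN=EMEA,CN=Sales,DC=example,DC=com", "CN=Broken,,DC=example"]:
    if not dn_is_sane(dn):
        print(f"data quality issue, skipping: {dn!r}")
```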

    Here's another example of a "standard" VB function you can't use...

    So at the moment I (apparently) don't even have the option of writing my validation code in the sync project, because functions that work fine in the template do not work in a read script.

     

    HTH
