SQL Plug-in + DynamicTypes = Simple CMDB for vCAC - Part 2

Welcome back! This is the second article of a multi-part series that steps you through the process of mapping a SQL table into vRealize Orchestrator, building out a DynamicTypes plug-in inventory based on that table, then exposing it to vRealize Automation's Advanced Service Designer (ASD). In the first article, we got our database table mapped using the SQL Plug-in and generated some CRUD workflows.

Introduction

Let's build a simple Dynamic Types plug-in around our SQL Table that we created in our previous article.

NOTE: For an overview of the Dynamic Types plug-in, review this great article by Christophe!

This second article will cover the following topics:

  • Importing the DynamicTypes package and defining a plug-in Namespace
  • Defining a Type and reviewing the resulting inventory
  • Building "Create a Record" and "Delete a Record" workflows around the CRUD workflows from Part 1

Future articles will build upon the work performed here. They will cover:

  • How to add the simple CMDB to vRealize Automation's Advanced Service Designer
  • How to add/remove entries to the CMDB via vRealize Automation Infrastructure as a Service (IaaS) workflow

Import Package

Building out this solution by hand would take hours of testing and digging through code to figure things out. Instead of stepping you through that process, I have uploaded a package to the VMware Communities for use with this article. Download that package and import it to your vRO server now. https://communities.vmware.com/docs/DOC-28573

media_1417816006050.png

  1. Once you have imported the package, run the "1 - Define Plug-in Namespace and DB" workflow.
  2. Provide a Namespace for your plug-in - this will be reflected in your object types once they have been defined. I will use MyCMDB here.
  3. Specify the SQL Plug-in Database you are creating the plug-in for.
  4. Click Select to choose the database.
  5. Click Submit when done.

We need a root element in our Inventory to hold the various objects. This root element is our Namespace. The Namespace is also used in the construction of the Object Type name as follows:
DynamicTypes:MyCMDB.Asset

DynamicTypes = the plug-in that owns the object
MyCMDB.Asset = the Object Type (we will create that soon...)
To break this down further:
MyCMDB = Namespace
Asset = Type
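The fully qualified type string can be assembled mechanically from these pieces. A minimal sketch in plain JavaScript (buildTypeString is an illustrative helper of my own, not a plug-in API):

```javascript
// Build the fully qualified DynamicTypes type string from its parts.
// buildTypeString is an illustrative helper, not a plug-in API.
function buildTypeString(namespace, type) {
    return namespace + "." + type; // "MyCMDB" + "." + "Asset" -> "MyCMDB.Asset"
}

// The plug-in prefix goes in front for the full object type name:
var objectType = "DynamicTypes:" + buildTypeString("MyCMDB", "Asset");
// objectType is now "DynamicTypes:MyCMDB.Asset"
```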

The Database we specified during namespace creation was added as a Custom Property of the Namespace object so that it may be easily referenced in later workflows and actions.

Review vCO Inventory 1/2

review_vco_inventory_12.png

  • Click on the Inventory tab, then expand out Dynamic Types to see that you now have a Namespace listed: MyCMDB
  • The right pane gives details on the selected Namespace

Review vCO Inventory 2/2

review_vco_inventory_22.png

Click on the Custom Properties tab in the right pane.
Note that the "CMDB" SQL:Database object has been stored under the "sqlDb" key as a custom property.
This allows for easy retrieval in workflows using the following syntax:

// Get Namespace object by name:
var ns = DynamicTypesManager.getNamespace("MyCMDB");
var sqlDb = System.getModule("com.vmware.library.customProperties").getCustomProperty(ns,"sqlDb");
// at this point, sqlDb is now a SQL:Database object that we can use for further scripting, queries, etc...

Define Type(s)

define_type_s_.png

Now, we must define one or more Types. In my case, I want my type to be Asset so that's what I'll specify when I run the workflow...

Run the "2 - Define a Type and its parent folder" workflow

  • Select your Namespace that was created in the previous step (Once selected, the "Parent Folder name" input will populate a dropdown list with the names of all the tables in the associated DB)
  • Select a Parent folder name (Database table - this will be the name of the root inventory folder that holds the table records. Once a selection has been made here, the table fields will be loaded into dropdown lists for the ID and Name inputs)
  • Specify a Type Name (This will be the name of the object/table record entry you wish to be able to request.)
  • ID Field: Each Dynamic Type object requires an ID; choose one of the table fields to be used as the ID.
  • Name Field: Each Dynamic Type object requires a Name; choose one of the table fields to be used as the Name.
  • Choose an icon (Resource Element image file) to represent your Object Type in the inventory (this is the image used as the inventory icon; you may add more images by uploading them to vRO Resource Elements)
  • Click Submit

NOTE: The ID and Name fields that get mapped here MUST be mapped to table fields that will never be null.
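Since a null ID or Name will break the inventory, it can be worth guarding record data before it is written. A minimal sketch in plain JavaScript (validateRecordKeys and the sample field names are my own illustrations, not part of the package):

```javascript
// Reject records whose mapped ID or Name field is missing or null.
// validateRecordKeys, record, idField and nameField are illustrative names.
function validateRecordKeys(record, idField, nameField) {
    if (record[idField] == null || record[nameField] == null) {
        throw new Error("Fields '" + idField + "' and '" + nameField +
                        "' must not be null - they back the object ID and Name");
    }
    return true;
}

// Example: a record destined for the Assets table
var ok = validateRecordKeys({ ServerID: "42", HostName: "vco" }, "ServerID", "HostName");
```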

Check Inventory Custom Properties

check_inventory_custom_properties.png

The workflow that was run in the last step bound the selected Name and ID field names as custom properties to the Dynamic Type for the "Asset" I specified. These fields are from the DB table we chose - and we can also see that the table name was bound. In my case, this is "Assets". Binding these values as custom properties allows for easy retrieval by the findAll and findById actions that help make the DynamicTypes work!
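As an illustration of why these bindings matter, a findAll-style action can reconstruct its query purely from the stored property values. A hedged sketch in plain JavaScript (buildFindAllQuery and its parameter names are my own; the actual actions shipped in the package may differ):

```javascript
// Assemble the SELECT statement a findAll-style action might issue,
// using the table/ID/Name values stored as custom properties.
// buildFindAllQuery is a hypothetical helper, not the package's action.
function buildFindAllQuery(tableName, idField, nameField) {
    return "SELECT " + idField + ", " + nameField + " FROM " + tableName;
}

var query = buildFindAllQuery("Assets", "ServerID", "HostName");
// e.g. "SELECT ServerID, HostName FROM Assets"
```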

Check Inventory

check_inventory.png

Now, expand out your Namespace and observe that each record in your DB Table has an inventory object showing under the folder. As you can see in the screenshot above, my table currently has two entries: controlCenter and vco.

Create a Record

create_a_record.png

Remember those CRUD workflows we created in Part 1? It's time to use those now!

  1. Create a new Workflow named something like "Create a Record"
  2. Drag the "Create active record for 'Assets'" CRUD workflow onto the schema
  3. Drag the "getDbObject" and "getParentObject" actions from the "com.vmware.coe.dynamicTypes.sql" Action module onto the schema AFTER the Create workflow - be sure to place them as shown in the screenshot above
  4. Drag the "invalidateObject" action from the "com.vmware.coe.dynamicTypes" action module onto the schema after the other actions (NOTE: This was only necessary with earlier versions of DynamicTypes to ensure that vRO was made immediately aware of the updates. With 7.x this object actually generates an error and workflow failure so do not include this element on vRO 7.x)
  5. Add a Scriptable task to the end of the workflow and name it "Map Output"
  6. Click back on the "Create active record ..." workflow
  7. Now click the Setup button that appears in the top right of your schema

Create a Record - Promote Workflow Input/Output Parameters

create_a_record_-_promote_workflow_inputoutput_parameters.png

The exact steps you take here will be up to you... however, to keep things simple, I have set the "isUnique" Mapping Type to "Value" with a Value of "Yes" - this will create an attribute named isUnique with a value of Yes and bind it to the sub-workflow input for me.

The rest of the Mapping Types were left as Input because I will want these provided by the user/vRealize Automation in a future article.

For the Output Parameter, rename the variable to something other than "result" - I chose "newRecord" and set the Mapping Type as "Local Variable" (Attribute).

When done, click the Promote button.

Note: If you set any of the Mapping Types to "Skip" or "Value", you will need to provide the value yourself - either by editing the sub-workflow "IN" tab, or (if set to "Value") on the Create a Record workflow's General tab.

Create a Record - Presentation

create_a_record_-_presentation.png

Now that the inputs have been defined for this workflow, it is important to jump over to the Presentation tab and set your ID and Name fields as Mandatory inputs. Failure to include either of these in a record added to the database will result in a broken Inventory!

Once those are marked as Mandatory, return to the Schema tab.

Create a Record - Review sub-workflow Visual Binding

create_a_record_-_review_sub-workflow_visual_binding.png

At this point, all IN parameters for the sub workflow should be bound to either IN Parameters or In Attributes and the OUT Parameter should be bound to an Out Attribute as shown in the screenshot above.

Create a Record - getDbObject

create_a_record_-_getdbobject.png

This getDbObject action retrieves the DynamicTypes object based on the ID of the newly created database record. The object is needed for the additional actions that follow, and so that it can be mapped to an output and returned to the calling system/user.

  1. Click on the getDbObject action (Note: Don't worry about the type showing DynamicTypes in the screenshot, that will get fixed during workflow validation)
  2. Click on the Visual Binding tab
  3. Drag the "objectId" from getDbObject to the field you specified as being the record id field. In my case, this is the "ServerID" Input
  4. Drag the "type" from getDbObject to the "In Attributes" box in the lower left and release - a "Create parameter" window will pop up - Specify name as getDbObjectType with a value of MyCMDB.Asset (please see note below), click OK
  5. Drag the "actionResult" from getDbObject to the "Out Attributes" box in the lower right and release - a "Create parameter" window will pop up - Specify name as newAsset, click OK

Note:
The format of the "type" field for this action is: namespace.type
Based on the choices I made earlier in this article, this means I need to specify "MyCMDB.Asset" as the "type" to pass into this action. If you made different choices, be sure to specify based on YOUR Namespace and YOUR type.

Create a Record - getParentObject

create_a_record_-_getparentobject.png

Once we have the newly created object, we need to get the parent object (folder) so that the DynamicTypes plug-in can be notified that the contents of the folder have changed.

  1. Click on the getParentObject action
  2. Click on the Visual Binding tab
  3. Drag "object" from getParentObject to the "newAsset" In Attribute in the lower left corner. (The small triangle will turn green when your mouse is over newAsset to indicate it is ready for binding)
  4. Drag "actionResult" from getParentObject to the "Out Attributes" box in the lower right and release in an empty area so that the Create Parameter window pops up. Name the parameter "parentFolder"

Create a Record - invalidateObject

create_a_record_-_invalidateobject.png

The invalidateObject action takes a DynamicTypes object as an input and notifies the DynamicTypes plug-in that it is no longer valid and should be refreshed. For our purpose, this will result in the folder refresh discovering it has a new child object.

  1. Click on the invalidateObject action (Reminder: this object should not be in a vRO 7.x workflow)
  2. Click the Visual Binding tab
  3. Drag "object" from InvalidateObject to the "parentFolder" In Attribute in the lower left of the screen. Make sure to release on that attribute when the triangle is green to bind them as shown in the screenshot above

Create a Record - Map Output - IN

create_a_record_-_map_output_-_in.png

Now that we've created and retrieved our object and notified the plug-in that the parent folder needs to be refreshed, we can map our output.

  1. Click the Map Output Scriptable Task
  2. Click the In tab
  3. Click on the "Bind to workflow parameter/attribute" button to create a new input binding.
  4. Select the existing newAsset attribute that was created earlier
  5. Click Select

Create a Record - Map Output - OUT

create_a_record_-_map_output_-_out.png

  1. Now click on the OUT tab
  2. Click on the "Bind to workflow parameter/attribute" button to create a new output binding.
  3. The Output Parameter does not exist yet, so click the link to "Create parameter/attribute..."
  4. Specify name as assetOut
  5. Use the Filter box to search for the asset object type
  6. Select the DynamicTypes:MyCMDB.Asset Type
  7. Change the Create option to "Create workflow OUTPUT PARAMETER with the same name"
  8. Click OK
  9. Next, Click on the Scripting tab and enter the following code
var assetOut = newAsset;

Workflow Validation

Click the Save button in the bottom right of the client.
Validate the workflow now - validation resets bindings, eliminates warnings, and confirms that no important variable bindings were missed. When I validated, I received no errors or warnings.

Once your workflow has been validated, Increment your Version number on the General tab, give the workflow a meaningful description, then click Save and Close

Delete a Record

delete_a_record.png

Now that we have a workflow capable of creating new records, we should create one that can delete them. It is important that the input be of the correct object type.

  1. Create a new workflow named "Delete a Record"
  2. For the first element, add a Scriptable task and name it "Get ID"
  3. Click the IN tab for that Get ID Scriptable task
  4. Create a new parameter named "asset" of type "DynamicTypes:MyCMDB.Asset" and make sure it is set as an IN PARAMETER. Then click the OUT tab and create a new parameter named "assetId" of type string, keeping the default setting so the bound variable will be an Attribute.
  5. Once the parameters have been set up, click Select
  6. Add the getParentObject from com.vmware.coe.dynamicTypes.sql after the scriptable task
  7. Next, Add your "Delete active record ..." CRUD workflow from the first article
  8. Finally, add the invalidateObject action from the com.vmware.coe.dynamicTypes as the last element before the workflow End (NOTE: This was only necessary with earlier versions of DynamicTypes to ensure that vRO was made immediately aware of the updates. With 7.x this object actually generates an error and workflow failure so do not include this element on vRO 7.x)

Delete a Record - Get ID

delete_a_record_-_get_id.png

Click on the Scripting tab of the Get ID scriptable task and add the following code:
var assetId = asset.id;

Now compare your environment with the screenshot here.

Delete a Record - getParentObject

delete_a_record_-_getparentobject.png

  1. Select the getParentObject action, then go to the Visual Binding tab
  2. Drag "object" from getParentObject to the In Parameters and bind it to the "asset" input that is there
  3. Drag "actionResult" from getParentObject to the Out Attributes and release in the empty area - when prompted to create a new attribute, name it parentFolder

Delete a Record - Delete active record (sub-workflow)

delete_a_record_-_delete_active_record__sub-workflow_.png

The Delete active record CRUD workflow that was auto-generated will have a number of inputs. The most important input there is the one that you mapped as the "ID" when setting up the object type. In the case of this article, this is the "ServerID" input. Setup the inputs as shown in the screenshot above.

  • isUnique should be bound to an attribute named isUnique (boolean) with a value of "Yes"
  • ServerID should be bound to the attribute named assetId that we created when setting up the Get ID Scriptable task in this workflow
  • The remaining inputs may be bound to NULL (do this by clicking on Not Set, then double-clicking NULL; all inputs/outputs of workflow elements must be bound to either NULL or a valid value)

On the OUT tab, there is a single "result" output of type number. This identifies how many records were deleted. For simplicity in this article, I bind mine to NULL. However, as a best practice, I would actually recommend binding this to an attribute named something like recordsDeleted and then use a decision box to determine if recordsDeleted == 1. If so, continue on, if not perform some error handling since we expect a record to be deleted here.
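That recommendation can be sketched as follows (plain JavaScript; checkDeleted is a hypothetical helper standing in for the suggested decision element):

```javascript
// Guard for the Delete sub-workflow's numeric "result" output.
// checkDeleted is a hypothetical helper mirroring the suggested decision box.
function checkDeleted(recordsDeleted) {
    if (recordsDeleted !== 1) {
        // Error-handling branch: we expected exactly one record to be removed.
        throw new Error("Expected 1 record deleted, got " + recordsDeleted);
    }
    return true; // continue the happy path
}
```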

Delete a Record - invalidateObject

delete_a_record_-_invalidateobject.png

The invalidateObject action takes a DynamicTypes object as an input and notifies the DynamicTypes plug-in that it is no longer valid and should be refreshed. For our purpose, this will result in the folder refresh discovering that a child object has been removed.

  1. Click on the invalidateObject action (Reminder: this object should not be in a vRO 7.x workflow)
  2. Click the Visual Binding tab
  3. Drag "object" from InvalidateObject to the "parentFolder" In Attribute in the lower left of the screen. Make sure to release on that attribute when the triangle is green to bind them as shown in the screenshot above

Save, Validate, then Save and Close

Test The Workflows from vRO Client

Using the vRealize Orchestrator client, make sure that your Create a Record and Delete a Record workflows each produce the appropriate results: adding and removing records from your database table AND from your Inventory.

Wrapping up...

This concludes the second article in the series. Stay tuned for additional articles that will build upon the workflows we have created here in this article.
Previous Article of series: Part 1
Next Article of series: Part 3

Comments  

0 # burke 2015-01-15 20:20
Quoting Mike R:
First of all, thank you for the very thorough article.
...
I can see other dynamic types that I created before trying this method. Is there something I am missing?

Yeah, that's a little bit of a glitch with the version of the plug-in I think. Try this: in vCAC, if you have your vCO set as internal, set to external - even if you just point it to the same internal vCO server. Save, then set it back to what you originally had it set to. If external, then just reverse this process. I will include that in the next article I'm planning - just need time ;)
0 # Mike R 2015-01-15 20:14
First of all, thank you for the very thorough article.

I followed it pretty closely to setup a couple of database driven dynamic types that I want to use for data driven selectors in VCAC XaaS blueprints. The types are in VCO, populated and I am able to update the tables from vco with the inventory getting updated correctly.

The problem I have is that these Dynamic Types are not showing up in VCAC as selectable either as custom resources, or for binding to in the blueprint form components (trying to set my entity for a search component to this dynamic type).

I can see other dynamic types that I created before trying this method. Is there something I am missing?
0 # SeanKohler 2015-01-08 22:11
We have a Postgresql DB called dbservername_db ldb with a table inside it called vcac_vminfo. In that table are various columns representing CMDB information. We have a record_num column and a hostname column.
When I ran step one, it worked fine. Namespace was setup, and I validated everything was as per your screenshots.
Step 2, after selecting the namespace and then the DB table (for Parent folder name) NONE of the table columns showed up for selection for either the inventory ID or the inventory Name. = Dead end.
I have another MS SQL database in the inventory used for other data. Ran Step 1 to set up namespace. Ran step 2 and when the table (BusinessGroupInfo) is selected, the one available column (BusinessGroupName) is presented in both the inventory ID and the inventory Name list.
So, maybe our current CMDB doesn't work because it is Postgres. Or maybe it is because of the underscores in the DB or Table name. Can you test a Postgres table?
0 # burke 2014-12-20 02:06
NOTE: I was informed that when this article was initially published, I forgot to include the one line of code for the "Map Output" element of the "Create a Record" workflow. I have just added the missing line!

Additionally, my screenshots may show a DynamicType while yours may show "Any" as the object type. That should be fine - run the workflow and confirm you get an Output Object of the correct type: DynamicTypes:MyCMDB.Asset :)

Happy Holidays!
