
It’s ON with Turbonomic and vRO

There have been a lot of changes at VMTurbo, now Turbonomic.  I believe most in the industry are aware of the bold name change; the new name is more representative of what the product does, based on the economic model it is known for.  Beyond that, with the latest version Turbonomic also released vRealize Automation workflows that integrate with their product.  You have to be a member of the Green Circle, which is free, and you can download them here.  There are instructions on importing the workflows, setting up Operations Manager as a REST host, and so on.  I was excited to see this but, unfortunately, my environment only uses vRealize Orchestrator.
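The Green Circle instructions walk through registering the REST host via the plugin's configuration workflows, but it can also be scripted. A minimal sketch, assuming the vRO HTTP-REST plugin is installed; the URL and credentials below are placeholders for your own Operations Manager instance:

```javascript
// Minimal sketch: register Operations Manager as a REST host in vRO.
// URL and credentials are placeholders -- substitute your own.
var host = RESTHostManager.createHost("Turbonomic Operations Manager");
host.url = "https://opsmgr.example.local"; // placeholder address
host.connectionTimeout = 30;
host.operationTimeout = 60;

// Basic authentication with a shared session
host.authentication = RESTAuthenticationManager.createAuthentication(
    "Basic", ["Shared Session", "administrator", "password"]); // placeholder credentials

// Persist the host so other workflows can use it
var addedHost = RESTHostManager.addHost(host);
System.log("Registered REST host: " + addedHost.name);
```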
Below is the schema for the vRA workflow:

[Image: vra_vmt — vRA workflow schema]

First is a scriptable task that gathers inputs for vRA.  These inputs are all vRA specific, so I could remove them.  At the end, the workflow pushes properties back to vRA, so I also removed the “Override vRA Settings” element.

Inputs removed from the original VMTurbo Main workflow:

[Image: parameters — removed vRA-specific inputs]


My workflow ended up like this, with the vRA dependencies removed and two scriptable tasks at the end that convert the datastore and host strings to VC: objects.  These scripts will be covered in another post.

[Image: newworkflow — modified vRO workflow schema]

My inputs, moved from general attributes, end up as templateName, clusterName, and datacentreName.  In the future I will likely add a scriptable task at the beginning of the workflow to determine these, as they will come from inputs generated by my Windows or Linux Master Build workflow.

Inputs converted from attributes:

[Image: newinput — inputs converted from attributes]

I also now have outputs for the actual VC:Datastore and VC:HostSystem objects for your clone workflow in vRO.  These were created via the scriptable tasks, which take the strings returned from Turbonomic and look up the matching vCenter objects; a sketch of that lookup follows the screenshot below.

Outputs created:

[Image: outputs — VC:Datastore and VC:HostSystem outputs]
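Those lookup scripts will get a full write-up in another post, but the general pattern is simple. A minimal sketch, assuming the vCenter plugin is configured and the scriptable task receives the Turbonomic strings as inputs datastoreName and hostName (my names, not necessarily those in the downloadable workflows):

```javascript
// Minimal sketch: match Turbonomic's string results to vCenter objects.
// Inputs (string): datastoreName, hostName
// Outputs: datastore (VC:Datastore), host (VC:HostSystem)

var datastore = null;
for each (var ds in VcPlugin.getAllDatastores()) {
    if (ds.name == datastoreName) {
        datastore = ds;
        break;
    }
}
if (datastore == null) {
    throw "No datastore found matching '" + datastoreName + "'";
}

var host = null;
for each (var hs in VcPlugin.getAllHostSystems()) {
    if (hs.name == hostName) {
        host = hs;
        break;
    }
}
if (host == null) {
    throw "No host system found matching '" + hostName + "'";
}

System.log("Matched " + datastore.name + " and " + host.name + " for the clone workflow");
```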


What’s great about having this functionality from Turbonomic is that the best host and the best datastore are now selected based on analytics from Operations Manager.  I was originally picking my datastore based on the amount of free space, but now, using the REST API, I can have the least utilized host and datastore supplied to my clone workflow.
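The call itself is just a request against the REST host registered earlier. A minimal sketch of the pattern; restHost is assumed to be a RESTHost attribute bound to Operations Manager, and the endpoint path is a placeholder since the exact placement query comes from Turbonomic's API documentation:

```javascript
// Minimal sketch: query Operations Manager through the registered REST host.
// "restHost" is a RESTHost attribute; the endpoint is a placeholder --
// consult Turbonomic's API documentation for the actual placement query.
var endpoint = "/vmturbo/api/"; // placeholder path
var request = restHost.createRequest("GET", endpoint, null);
var response = request.execute();

if (response.statusCode != 200) {
    throw "Turbonomic request failed with status code " + response.statusCode;
}

// The response carries the recommended host and datastore names, which
// feed the lookup scriptable tasks shown above.
System.log("Turbonomic response: " + response.contentAsString);
```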

Download the modified workflows here.

I’ll be going over these workflows in the upcoming webinar “Overcoming Private Cloud Challenges in Healthcare IT”, September 29th at 2:00 PM EST.  Register here.