Firstly, let’s explain quickly what Viva Goals is.
Viva Goals is a tool that allows us to define and track the goals of the business. The goals, or objectives, are measured using KPIs to see if the objective has been met. Viva Goals allows objectives to be set at business, department, team, and individual levels. The idea is that an individual's objectives will help their team, department, and business meet theirs.
This video gives a good introduction.
Ever since Microsoft announced their acquisition of Ally.io, I have been excited to see how it would become part of the Microsoft Viva suite. Microsoft then announced Viva Goals and that it was going into private preview earlier in 2022, and although I tried to get onto the programme, it was not to be.
On the 1st of August 2022, Viva Goals was released to us all, and I was keen to start seeing how we could use it at iThink 365 to drive the business forward.
This blog is the first post in a series of posts where I plan to document our Viva Goals journey in the form of diary posts. These posts will capture the things that we have learned and tried with the aim of helping others and sharing the knowledge of implementing Viva Goals.
This section will contain the list of blog posts to provide an index for the series of posts.
The goal that I am looking to achieve is to align the iThink 365 team so that they are working towards the same goals as the business. Some examples of these goals are building an organisation that is continuously learning and experimenting, and building a successful and sustainable business.
The plan is to start with using Viva Goals for the business objectives that I am working on. This will allow me to understand how Viva Goals works and get my head around how it can be rolled out to the rest of the team.
The next step is then to take the team on the journey. To be successful, I believe they will need to understand the why, what, and how of Viva Goals.
Once the team understand the why, what, and how, we can then start implementing and assigning objectives which align them to the overall business strategy.
Anyway, that is it for this post, but I look forward to sharing our Viva Goals journey at iThink 365 in the next one.
If you have any questions, feedback or comments please leave them below.
If you have read this far, that is great, and thanks for taking time out of your busy schedule to do so.
As mentioned, there are two core pipelines: the build pipeline and the release pipeline. This blog post will delve into how they currently work. I fully expect that they will change as I get feedback from the community.
I really look forward to that as I am sure they can be better! I have some ideas on some tweaks that need to be made.
Anyway, let’s get started.
The Build Pipeline
So, the Build Pipeline has a parameter block and a set of variables.
The parameter is used to decide whether the build should create a managed or unmanaged solution.
The variables are used to identify the solution (PowerPlatformSolutionName) that needs to be exported and the name of the environment (PowerPlatformEnvironmentName) that the solution needs to be exported from.
The final variable provides the major, minor, and revision numbers as the version prefix for the solution version. The build number is handled by a counter managed by Azure DevOps.
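To give a feel for the shape of this, here is an illustrative sketch of the parameter and variable blocks. The parameter name, the solution name, and the environment URL are all placeholders, and your counter expression may well differ:

```yaml
# Sketch of the build pipeline's parameter and variable blocks (illustrative).
parameters:
  - name: ExportAsManaged            # hypothetical parameter name
    displayName: Export as a managed solution?
    type: boolean
    default: true

variables:
  PowerPlatformSolutionName: 'MySolution'                   # placeholder
  PowerPlatformEnvironmentName: 'https://myorg.crm.dynamics.com'  # placeholder
  VersionPrefix: '1.0.0'                                    # major.minor.revision
  # Azure DevOps counter: resets to 0 whenever VersionPrefix changes
  BuildNumber: $[counter(variables['VersionPrefix'], 0)]
```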
Next, we resolve where the PAC tools are installed on the build agent and set the version number.
We then publish the customisations in the environment. This makes sure that we are using the latest version and if a developer has forgotten to publish something it will get picked up.
Then we set the solution version based on the build version. I like this as you can now tie the build / release process to the solution deployed to Power Platform!
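As a rough sketch, these two steps could look something like the following. The exact `pac` CLI flags vary between versions, and the `VersionPrefix`/`BuildNumber` variables are illustrative names, so treat this as a guide rather than a copy-paste recipe:

```yaml
steps:
  # Publish all customisations so the export picks up the latest changes,
  # including anything a developer forgot to publish
  - script: pac solution publish
    displayName: Publish customisations

  # Stamp the solution with the build version so a deployed solution
  # can be traced back to the build that produced it
  - script: >
      pac solution online-version
      --solution-name $(PowerPlatformSolutionName)
      --solution-version $(VersionPrefix).$(BuildNumber)
    displayName: Set solution version
```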
Next, we export the solution. We do this twice! Once as an unmanaged solution and again as a managed solution. This means we have a copy of the solution held in source control in case something happens to the source Power Platform environment.
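The double export might be sketched like this (again, exact `pac` flags depend on the CLI version you have installed):

```yaml
steps:
  # Unmanaged copy: kept in source control as the 'source of truth'
  - script: >
      pac solution export
      --name $(PowerPlatformSolutionName)
      --path $(Build.ArtifactStagingDirectory)/$(PowerPlatformSolutionName).zip
      --managed false
    displayName: Export unmanaged solution

  # Managed copy: the artifact that will be deployed downstream
  - script: >
      pac solution export
      --name $(PowerPlatformSolutionName)
      --path $(Build.ArtifactStagingDirectory)/$(PowerPlatformSolutionName)_managed.zip
      --managed true
    displayName: Export managed solution
```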
The next step extracts the solution settings, environment variables and connection references.
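The `pac` CLI has a command for this; a sketch of the step, assuming the unmanaged export landed in the artifact staging directory, is:

```yaml
  # Generate a deployment settings file listing the environment variables
  # and connection references that the solution expects
  - script: >
      pac solution create-settings
      --solution-zip $(Build.ArtifactStagingDirectory)/$(PowerPlatformSolutionName).zip
      --settings-file $(Build.ArtifactStagingDirectory)/deploymentSettings.json
    displayName: Create deployment settings
```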
Next, we unpack the solution so that we can get the environment variables and blank them. This makes the solutions easier to port into the other environments. Also, if we want to apply settings as part of the release, we can do that!
Once we have extracted the environment variables and cleared them out, then we can pack the solution back up.
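In outline, the unpack/blank/pack sequence could look like this. The PowerShell that clears the values is a simplified illustration only; the exact file layout inside an unpacked solution depends on the solution packager version, so check yours before relying on it:

```yaml
  - script: >
      pac solution unpack
      --zipfile $(Build.ArtifactStagingDirectory)/$(PowerPlatformSolutionName).zip
      --folder $(Build.ArtifactStagingDirectory)/unpacked
    displayName: Unpack solution

  # Blank any captured environment variable values so the solution
  # ports cleanly between environments (illustrative approach)
  - pwsh: |
      Get-ChildItem "$(Build.ArtifactStagingDirectory)/unpacked" -Recurse -Filter 'environmentvariabledefinition.xml' |
        ForEach-Object {
          (Get-Content $_.FullName -Raw) -replace '<Value>[^<]*</Value>', '<Value></Value>' |
            Set-Content $_.FullName
        }
    displayName: Blank environment variable values

  - script: >
      pac solution pack
      --zipfile $(Build.ArtifactStagingDirectory)/$(PowerPlatformSolutionName).zip
      --folder $(Build.ArtifactStagingDirectory)/unpacked
    displayName: Pack solution
```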
Finally, we take a copy of that solution settings file and use the Azure DevOps Publish Pipeline Artifacts task. This associates all the files in the artifact staging directory with this build.
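The publish step itself is small; something along these lines:

```yaml
  - task: PublishPipelineArtifact@1
    inputs:
      targetPath: $(Build.ArtifactStagingDirectory)
      artifact: drop   # 'drop' is a conventional artifact name, not a requirement
```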
These can be seen here.
Phew, let’s talk about the release pipeline!
The release pipeline was covered a little in a previous post, but let's discuss how it works.
These release, or deployment, pipelines need to deploy something. That something is referred to by the resources tag. In our example, we are referring to the build pipeline artifacts.
When you run the release pipeline, you can see here that the files are downloaded by the Download Artifacts task.
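A sketch of how the resources block and the download step fit together follows; the pipeline alias and source name are hypothetical:

```yaml
resources:
  pipelines:
    - pipeline: buildPipeline          # local alias used by the download step
      source: 'PowerPlatform-Build'    # hypothetical name of the build pipeline
      trigger: true                    # run the release when a build completes

stages:
  - stage: Deploy
    jobs:
      - job: Deploy
        steps:
          # Pulls the published build artifacts into $(Pipeline.Workspace)
          - download: buildPipeline
```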
The release process will deploy to the various stages defined by this YAML script.
However, the main logic to deploy is held in another YAML file, as shown by the template parameter.
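The stage layout might look roughly like this, with each stage handing its environment-specific values to a shared template (the template file name, environment names, and service connection names are all placeholders):

```yaml
stages:
  - stage: Test
    jobs:
      - template: deploy-solution.yml    # hypothetical template file name
        parameters:
          environmentName: 'Test'
          serviceConnection: 'PowerPlatform-Test'

  - stage: Production
    dependsOn: Test
    jobs:
      - template: deploy-solution.yml
        parameters:
          environmentName: 'Production'
          serviceConnection: 'PowerPlatform-Production'
```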
Let’s look at that YAML file!
This starts with a set of parameters that are used to deploy the solution.
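For illustration, a minimal parameter block for such a template might be:

```yaml
parameters:
  - name: environmentName       # display name of the target environment
    type: string
  - name: serviceConnection     # service connection used to authenticate
    type: string
```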
The release is quite simple and to be honest there are some more things that we can do. For example, apply settings to the solution. However, I wanted to keep this part simple.
So, the initial steps show the files that are part of the release (for debugging purposes).
Then we have a task which could be used to replace tokens. This might be used for example to set values in the settings file. (More on that in another post).
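One common way to do this is the Replace Tokens task from the Azure DevOps Marketplace; a sketch, assuming that extension is installed and the settings file lives in a `drop` artifact, is:

```yaml
  - task: replacetokens@5
    inputs:
      targetFiles: '$(Pipeline.Workspace)/drop/deploymentSettings.json'
      tokenPattern: 'default'   # matches #{TokenName}# style tokens
```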
Next, we install the tooling required for the Power Platform deployment.
The final task performs the actual deployment of the Power Platform solution into the environment using the Power Platform import solution task. This also applies the settings via the DeploymentSettingsFile parameter.
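Putting those last two steps together, a sketch using the Power Platform Build Tools tasks might look like this (task inputs can differ between task versions, and the artifact and parameter names are illustrative):

```yaml
  # Install the Power Platform Build Tools on the agent
  - task: PowerPlatformToolInstaller@2

  # Import the managed solution and apply the deployment settings file
  - task: PowerPlatformImportSolution@2
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: ${{ parameters.serviceConnection }}
      SolutionInputFile: '$(Pipeline.Workspace)/drop/$(PowerPlatformSolutionName)_managed.zip'
      UseDeploymentSettingsFile: true
      DeploymentSettingsFile: '$(Pipeline.Workspace)/drop/deploymentSettings.json'
```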
And to be honest that is it!
So, in this post we have covered how the pipelines work. I hope that it has been useful. I am sure you will have some improvements to suggest. Also, I bet there are some other ideas for what could be done here. If you do have any, I would love to hear them!
In the next post I will discuss some common issues that we have seen when deploying the solutions with these processes.