Introduction
It's undeniable that AI is everywhere, and it feels like everybody is jumping on the AI train right now. Every day a new astonishing example of the capabilities of AI is demonstrated. How can I utilize that power myself? AI will change the way we work, so I'd better learn how to benefit from it.
To be honest, I really like what AI can do for me. It allows me to focus on functionality rather than digging into documentation to find out that one parameter or switch I'd need.
Let's get back to what brought me to this topic: generating production-ready PowerShell scripts.
PowerShell is very powerful, but I also have a love-hate relationship with it. I don't write scripts often enough to remember all the nifty details, which can make it a frustrating and time-consuming exercise when I want to write one. So I typically resort to ChatGPT to get started, though the results are often mixed. It's clear ChatGPT is not intelligent in the sense that it checks whether what it generated would actually work; it's a language model, so it generates the next likely piece of script. This often leads to a script I have to review just to make it work in the first place. Scripts created that way are far from production ready, where you not only need the script to work correctly but also want it to be documented, maintainable and easy to (re)use.
I must admit I tend to use short phrases as prompts, so on the other hand it might not be a surprise that ChatGPT assumes a lot when generating the output.
Recently I needed to create a script that outputs an overview of the performance and scaling settings of all Azure Container Apps in a resource group. Quite straightforward and not too complex. So I started with my typical short prompt to ChatGPT, and the resulting script didn't work: it was a mix of PowerShell and Azure CLI and didn't use any best practices at all.
That inspired me to find out whether a more extensive prompt would lead to something that works, is production ready and uses best practices. Obviously, when you tell AI in more detail what you'd like, it can generate a better response.
And that's where I think AI will be able to help me. I know what I want to achieve functionally, and instead of writing script (or code) I need to write in natural language what I need. In a way this is no different from the process that happens inside my head when I implement some piece of functionality in code: you have to think about the functional and non-functional requirements. How many of us have a rubber ducky of some kind to act as a sparring partner to order our thoughts? You can do the same thing with AI, and the good thing is it will build it for you as well. Or will it?
ChatGPT
For this test I'm using ChatGPT 4o with canvas, the model meant to support developers, though it's paid and still in beta. I tried to be way more comprehensive than I'd normally be, so I started with this prompt:
Take the role of PowerShell expert and apply all best practices available to create a production ready script.
I need a script which outputs the scaling details like cpu, memory, number of replicas and custom scaling rules for all Azure Container Apps in a resource group.
The caller of this script will provide the name of the resource group.
I want the caller to first log in to Azure to select the correct subscription.
The script needs to display an overview of the scaling details and output the information as an object.
This prompt generated an already very nicely structured PowerShell script, as can be seen in attempt1.ps1. There were a couple of problems though, one of them critical: it doesn't work. There is also no documentation at the top of the file, so users have to dive into the script to see how to use it and what to expect.
The reason the generated script doesn't work is that there is no template object at the root level; there is a properties object in between. So by replacing $app.template. with $app.properties.template., attempt2.ps1 worked!
Agreed, I did need to dive into the data myself to see what was wrong, but I didn't have to write a single line of script here.
Amazing!
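To illustrate the fix: this is a hypothetical reconstruction, assuming the generated script lists the apps via the Azure CLI and parses the JSON in PowerShell (which matches the PowerShell/CLI mix ChatGPT produced); variable names are mine, not from the generated script.

```powershell
# List all Container Apps in the resource group and parse the JSON output.
$apps = az containerapp list --resource-group $ResourceGroupName | ConvertFrom-Json

foreach ($app in $apps) {
    # $app.template is $null here: template is not a root-level object...
    # ...it sits one level down, below the properties object:
    $template = $app.properties.template

    Write-Output $template.scale.minReplicas
}
```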
If I committed this to GitHub and created a PR, I would definitely get remarks about this script from my fellow developers. It's not production ready, as documentation is missing.
To fix that I added this as instruction:
Can you make sure the PowerShell script has proper documentation?
This prompt not only adds a nice information header, but also adds additional comments in the script itself: attempt3.ps1.
Interestingly enough, the code was also refactored a bit, while I had only asked it to update the documentation.
The code also had the same issue as in the first attempt: the properties object layer is missing.
Maybe it was again due to my short prompt; I probably should have told it to leave the code unchanged. Or I could have been a bit more explicit in the first prompt already.
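For reference, an "information header" like the one ChatGPT added typically follows PowerShell's comment-based help convention, which Get-Help picks up automatically. A minimal sketch of what that looks like (the script name and parameter name here are assumptions, not taken from the generated script):

```powershell
<#
.SYNOPSIS
    Outputs the scaling details of all Azure Container Apps in a resource group.
.DESCRIPTION
    For every Container App in the given resource group, collects the CPU, memory,
    replica counts and custom scaling rules, displays an overview and returns the
    information as objects.
.PARAMETER ResourceGroupName
    The name of the resource group containing the Container Apps.
.EXAMPLE
    .\Get-ContainerAppScaling.ps1 -ResourceGroupName "my-rg"
#>
param(
    [Parameter(Mandatory = $true)]
    [string]$ResourceGroupName
)
```

With this header in place, Get-Help .\Get-ContainerAppScaling.ps1 shows the usage without anyone having to open the script.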
Take the role of PowerShell expert and apply all best practices available to create a production ready script. This includes documentation best practices.
I need a script which outputs the scaling details like cpu, memory, number of replicas and custom scaling rules for all Azure Container Apps in a resource group.
The caller of this script will provide the name of the resource group.
I want the caller to first log in to Azure to select the correct subscription.
The script needs to display an overview of the scaling details and output the information as an object.
Mistral AI
The daily news might give you the impression that all AI development happens in the USA, but there are interesting things going on in Europe as well. Recently I was pointed to Mistral, a French company that can compete with the OpenAIs of this world. I had never played with their tool, so I decided to give it a try.
I used the exact same prompt and Mistral generated the following: attempt1-mistral.ps1. I think the result is very impressive, especially because I don't have a paid subscription and it generated this PowerShell script right after I registered for an account. The code is nicely structured, has documentation included and looks production ready; in my view way better than ChatGPT's first attempt. On top of that, the generated script is pure PowerShell, whereas ChatGPT generated a mix of PowerShell and Azure CLI. The code isn't functionally correct, however, but that was also the case with ChatGPT.
Both have incorrect references to the fields for cpu, memory and scaling rules. The difference, however, is that ChatGPT outputs the scaling rules correctly, which is something Mistral doesn't do at all.
To fix the scaling rules missing in the output, I had to prompt again:
I want the scaling rules details in the response as well, can you adjust the script?
After that it adjusted the script to include the scaling rules, but that wasn't entirely correct either: basically the same issue, where fields from the PowerShell response object were not used correctly. After some manual changes this is the Mistral result: attempt2-mistral.ps1.
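For anyone making the same manual fixes: as far as I can tell from the JSON the Azure CLI returns for a Container App, the relevant fields live at the paths below. Note that cpu and memory are set per container, while the scaling settings sit at the template level; treat these paths as my reading of the response, not an official reference.

```powershell
# Resources are defined per container inside the template:
foreach ($container in $app.properties.template.containers) {
    $container.resources.cpu      # number of cores, e.g. 0.5
    $container.resources.memory   # e.g. "1Gi"
}

# Scaling settings sit at the template level:
$app.properties.template.scale.minReplicas
$app.properties.template.scale.maxReplicas
$app.properties.template.scale.rules   # custom scaling rules, may be empty
```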
Final thoughts
I don't think there is a clear winner in this tiny contest, although you might argue that a messy but functionally correct script is preferable to a nicely written script that doesn't fully work. After all, when we need to maintain the code at a later point in time, we have AI to explain it to us again. In that scenario I don't think we even touch the code anymore; we just instruct the AI to adjust the functionality for us. In my view the difference boils down to spending a bit more time on the right prompt.
I'm extremely excited to see what AI will bring us tomorrow. Having AI at your fingertips can be a real productivity boost, and I can really recommend getting your hands dirty to learn how you can benefit from it. It's good to be aware that ChatGPT and Mistral are not the only tools out there that can do this. There is also Claude Sonnet, which is said to be very powerful and better than ChatGPT for certain tasks. I haven't been able to try it yet, because the login page responds with “Error sending code” when I enter my phone number during registration. The error cannot be found in the support pages either, so my user experience there is not that good, to be honest.