Dear Microsoft, let’s talk about Copilot for Windows.

I see we have big plans for Copilot on Windows. That’s exciting news.

To start, what an awesome name we have in Copilot. The word is apt. It neatly captures everything we hope this technology could be for us. A resource that’s close at hand, so you don’t feel like you’re going out of your way to reach it. Capable, so you can trust its abilities. And clear about its role: you set the course, and it supports you. Rather like a copilot.

As we converge around this exciting technology that is AI, I don’t know that there could be a better-suited partner to lead the way. That wasn’t always the case. It’s no secret that Microsoft has had to evolve both its process and persona to become the organisation we see today. One that has found and reconnected with its soul. I’m inspired by this. I too am working on myself. From one work in progress to another: well done for going on that journey. Your pioneering spirit and service posture earn you my vote of confidence.

Copilot for Windows is probably the closest we’ve come to an AI companion that works alongside us. It’s remarkable how much our sense of what’s possible has evolved in the last 18 months. Something that felt like a utopian dream not so long ago now seems reasonable to expect. In a short time, we’ve gone from asking “How on earth?” to “Why not?”

This feels like a pivotal moment, as we embed AI deeply into Windows, the very platform many of us rely on to deliver our life’s work.

I’m hopeful and nervous at the same time. Hear me out.

Safety first

Let’s address the elephant in the room: safety.

I take comfort in knowing you’ll spare no effort to safeguard user data. To be sure, I think you’re far more capable in this regard than I am, or than the average user. However, despite our best efforts – yours and mine – bad outcomes are still possible, often presenting in ways we least expect. And it’s this residual risk that calls for special care. To make things more interesting, AI is an especially tricky technology to safeguard. Like the digital marketing sector, which has perfected the science of targeted ads, AI works best with full access to user data.

Alas, not everyone wants this level of exposure to AI, especially at this stage. Some users want to dip their toes in to make sure the temperature is right for them. Others are more confident, and happy to start with a deep dive. For good measure, some will gladly venture in and make a splash, but need the assurance that the waters are shallow, and if they have to, they can stand and walk away. You get the gist. Comfort levels differ. How we respond to these individual differences can make all the difference. Pun intended.

Here’s an idea that has resonated with me: letting users tell you their comfort levels. From the user’s perspective, this is a direct trade-off between their exposure to the AI and the functionality they get from it. Preferences would range from nothing at all to everything.

One user may limit Copilot to their local files, for example, with no app access. Realistically, the value-add in this context would be more like an AI-enhanced filename search. Adding file content access would allow the AI to surface shared themes across different files. One might ask a question like “What do my files say about ESG?” and discover multiple results related to this broad topic within their files. The benefits are outsized for the user who gives the AI full access.
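
To make this concrete, here’s a minimal sketch of how I imagine such comfort levels could live in a single user-controlled setting. The tier names and the CopilotSettings class below are my own invention, purely for illustration, and not an actual Copilot API.

```python
from dataclasses import dataclass
from enum import Enum, auto

class AccessTier(Enum):
    """Hypothetical comfort levels, from no exposure to full access."""
    NONE = auto()            # Copilot switched off entirely
    FILE_NAMES = auto()      # AI-enhanced search over filenames only
    FILE_CONTENTS = auto()   # cross-file questions, e.g. "What do my files say about ESG?"
    APPS_AND_FILES = auto()  # app access on top of file contents
    FULL = auto()            # everything: files, apps, settings, history

@dataclass
class CopilotSettings:
    tier: AccessTier = AccessTier.NONE  # the safest default, until the user opts in

    def allows(self, capability: AccessTier) -> bool:
        """A capability is available only if the chosen tier covers it."""
        return self.tier.value >= capability.value

# The trade-off in one place: more exposure buys more functionality.
settings = CopilotSettings(tier=AccessTier.FILE_CONTENTS)
print(settings.allows(AccessTier.FILE_NAMES))      # True  - filename search works
print(settings.allows(AccessTier.APPS_AND_FILES))  # False - no app access granted
```

The specific tiers matter far less than the principle: the exposure-for-functionality trade-off should live in one clear setting that the user owns.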

Building on the notion that users would want to choose how closely they relate to Copilot, here are some ideas for giving users the agency they need to manage their relationship with the AI. 

Copilot and user agency

Make it easy

A pet peeve of mine is when important settings are buried in layers of sub-menus. Finding them becomes an endurance test, and most users, myself included, are not ready to go above and beyond. The natural solution is to make it easy, and modern UI/UX design makes that achievable.

While there’s no precedent for an AI assistant on an operating system, I see natural parallels (and crossovers) between a web browser and Copilot for Windows. Both plug into the internet as part of their normal use. And in so doing, both create the risk of security breaches. As a result, both require robust security tools and controls to mitigate this risk. 

You’ve had some time to practise with security and privacy settings on the Edge browser. For the most part, I think you’ve nailed it on simplicity and usability. This fills me with hope. I trust you’ll bring the same UI/UX design philosophy into Copilot for Windows, enabling users to easily (and precisely) dial in their preferences. 

In simple terms, you want to make it easy for Copilot users to dip their toes, make a splash, or go all in. 

Default to safety

When users haven’t stated their preferences, you have to make assumptions on their behalf. Those assumptions will affect many people, so they should be informed by the right ideas.

It’s like planning a party for many people. Do you make it interesting and take liberties with the food menu? Or do you accommodate those who may have special dietary needs? This is not an enviable position to be in, I admit. It requires you to balance the risks to a few against the presumed wants of the many.

It might be possible to achieve both. If we put everything on the table and label all of it, we empower users to make informed choices that work for them. 

In this regard, my advice would be: don’t serve, educate. Bringing this idea to Copilot for Windows, default settings should prioritise safety and not make assumptions about which risks users are willing to take. Put the options in plain sight, and give users clear choices.

I realise this may seem counterintuitive, especially when there is a financial imperative in play. But ultimately, users will stick with the choices they make for themselves. Helping them make those choices is a win-win and builds trust.

Be transparent

One criticism of AI at the moment is that it’s not explainable. There are inputs and outputs, but the process in between does not follow simple logic. This makes it hard to know exactly what’s happening under the hood. 

As a user, I don’t need to know the details; I probably wouldn’t understand them anyway. I do believe, however, that it would be useful to know what data is feeding into the AI. This would enable me to gauge how extensive that data is, and how effectively Copilot is using it.

User data can be presented in several ways: cumulatively, on a query-by-query basis, periodically (e.g. per billing cycle), and so on. What’s important is that the presentation strikes a balance between simplicity and detail.

In the same vein, users should be able to delete their data permanently if they wish to. Clear out the cache, if you will.
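
As a sketch of what this could look like in practice, imagine a simple usage ledger: it records which data sources each query drew on, summarises them per query, per billing period or cumulatively, and can be wiped permanently on request. Every name below is hypothetical; it only illustrates the shape of the feature, not how Copilot actually works.

```python
from collections import Counter
from dataclasses import dataclass, field
from datetime import date

@dataclass
class QueryRecord:
    """One Copilot query and the data sources it touched (hypothetical)."""
    day: date
    prompt: str
    sources: list[str]  # e.g. ["local files", "browsing history", "calendar"]

@dataclass
class UsageLedger:
    records: list[QueryRecord] = field(default_factory=list)

    def log(self, record: QueryRecord) -> None:
        self.records.append(record)

    def per_query(self) -> list[tuple[str, list[str]]]:
        """Query-by-query view: what fed into each answer."""
        return [(r.prompt, r.sources) for r in self.records]

    def cumulative(self) -> Counter:
        """Cumulative view: how often each data source has been used."""
        return Counter(s for r in self.records for s in r.sources)

    def for_period(self, start: date, end: date) -> Counter:
        """Periodic view, e.g. one billing cycle."""
        return Counter(s for r in self.records if start <= r.day <= end for s in r.sources)

    def delete_all(self) -> None:
        """The right to be forgotten: clear out the cache, permanently."""
        self.records.clear()

# Example: one logged query, then a full wipe at the user's request.
ledger = UsageLedger()
ledger.log(QueryRecord(date(2024, 5, 1), "What do my files say about ESG?", ["local files"]))
print(ledger.cumulative())  # Counter({'local files': 1})
ledger.delete_all()         # nothing left to report
```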

These are not entirely new concepts. The right of access and the right to be forgotten are among several data privacy protections enshrined in the EU’s General Data Protection Regulation (GDPR) and similar laws. Data regulators will be pleased to note a healthy alignment with their values. That can only be good for business.
