Simply Speaking Microsoft Ignorance
Microsoft had a change management team that coordinated changes across many of their services. As late as 2020, they didn’t have a good handle on change management, and as a result they made changes that caused business harm. I didn’t have the same insight into their change practices in 2022, but I did see the results of the same types of behavior. Microsoft pushes changes into their services without understanding the implications. Once something breaks badly enough or causes visible damage, they undo it, apologize, and move on.
When I sat down with their change management leadership in 2018 to discuss technical changes they had made which impacted my company with clear financial implications, they were deeply sorry but told me, “We are the pilots, and you are the passengers.” How ironic that today they are releasing a product called “Copilot.”
In our case, they made an underlying change to a technology that allowed us to put data into a “hold” status. This impacted us both technologically and legally. It disrupted our patent process, held up work, and caused the business to miss important deadlines. The net result was a loss to my company and an apology from Microsoft. Microsoft kept apologizing for the same kinds of mistakes, over and over. At some point, they stopped apologizing and started telling us that we had agreed to their services, so we should accept this as the way business is going. If we don’t like it, we can always go somewhere else.
Where is somewhere else?
I ask you right now, where is somewhere else?
Microsoft is into everything. They are more powerful than any utility company and likely have access to more data than any other single entity. Consider for a moment that Google has access to data on the outside of most companies, while Microsoft has access to data that resides inside of companies; that is a major difference in risk.
I know, they don’t access this data. They tell us they don’t, so they don’t. Except, they do.
Why is this important to all of us?
Microsoft is like a child with access to a weapon that should be safely locked in a cabinet. Either they are simply ignorant of what kind of damage they will cause, or they know what will happen but value technological progress over that untold damage.
Why So Serious?
A few basic scenarios:
Years ago, Wendy and I were working with Exxon, and there was a post on their newly released Yammer instance. Yammer was supposed to be released only to a few people in IT but grew uncontrollably and went viral. The technology seemed safe enough and was being highlighted by many as game changing. It was game changing: people were sharing and discovering data that violated laws and created a host of issues. IT was trying to learn and discover the opportunities; Microsoft didn’t have good controls in place and didn’t understand the implications.
In 200x, the US government had a data breach by a bad actor. When the report came out as to what happened and what the impact would be, one of the comments by a leading security official was, “There is so much there in napkin notes that the xx people won’t be able to make heads or tails of it.” Well, that was then. With the newest technologies, if they still have that data, they can figure it out now. What data from those stores may be useful to them today? PII and a bunch of other information. They can use that data to correlate information from the internet. You might say this capability is already available to them without Microsoft. While I’d agree for the data of the past, I’d argue that today Microsoft has access to a ton more data that poses much more of a risk, and they don’t have their shit straight on how to keep it protected.
The risks are so very high because most organizations don’t know what they have in their own data stores. Today, someone could be storing a ton of illegal content, and most companies wouldn’t even know it’s there. If these services are enabled against archives, the risk is amplified by shifting social norms, differing values, and a lack of security controls. I remember a time when people would download pornography, pirated music, and software and store them on shared folders. In many cases, when shared folders were migrated, they weren’t audited. I’m pretty confident there is some bad stuff on some SharePoint Online site that will be unearthed with GPT-4. “Oh, pornography, how did that get there?”
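The kind of pre-migration audit those shared folders never got doesn’t have to be exotic. A minimal sketch, assuming only a locally mounted file share and an illustrative (not exhaustive) list of risky file extensions:

```python
# Minimal sketch of a pre-migration share audit: walk a folder tree and
# flag file types that commonly indicate unlicensed media or executables.
# The extension list is an illustrative assumption, not a complete policy.
import os

RISKY_EXTENSIONS = {".mp3", ".mp4", ".avi", ".exe", ".msi", ".torrent"}

def audit_share(root):
    """Return (path, extension) pairs worth a human review before migration."""
    flagged = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower()
            if ext in RISKY_EXTENSIONS:
                flagged.append((os.path.join(dirpath, name), ext))
    return flagged
```

Even a crude pass like this, run before content lands in SharePoint Online, would surface most of the obvious junk; the real gap was never the tooling, it was that nobody was asked to look.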
If only Microsoft were accountable for the services they currently force on people. This is important because whether or not you buy these services doesn’t matter; your data is living on a Microsoft product somewhere for sure. Maybe at some point Microsoft will sell a “Data Implication Service” with AI running risk scenarios over all of your data and sending your security team internal threat assessments on your internal knowledge stores. That doesn’t seem to be their plan today, and if they did offer you this service, it would cost you PLENTY. This is a whole new field for people!
KM experts used to talk about all the data, information, and knowledge being archived the same way old things are kept in grandma’s attic. It was a good way for us to keep cool things and forget about them. In some cases, we suffered loss through these practices. For example, NASA lost the recipe for the heat shield and had to essentially recover and rediscover it.
Wonder what we will find in the attic? Will it be good, or things we should have left in the box? In our age of social extremes, even if something is terribly damaging, we tend to share it.
On the nightly news you might hear a story that goes, “Today we discovered that if you push these three buttons on your phone, you’ll steal a million dollars from your neighbor.” Security experts say, “The exact code is 123 / ABC,” people are worried, and buttons are being pushed. In other words, very few people are concerned about the impact of sharing too much information. Even the nightly news folks feel compelled to share information they shouldn’t.
We live in an age where private conversations over texts are put on billboards. Microsoft Copilot technology is not only a threat to organizations but also a threat to everyone. People have personal Microsoft accounts, and as mentioned before, companies, organizations, and the government hold data about you, and information important to you, in Microsoft technologies.
Not Just MS
One could argue that AI is here and that the risk isn’t just with Microsoft. Sure, we could argue this, and it would be true. But I’ll ask us to consider which organization, if any, has access to more information and data globally than Microsoft.
This type of AI service is dangerous, risky, and reckless. Microsoft and other companies need to understand more before releasing this into their workloads.
While there isn’t much that can be done overall, at the very start of this, people who use Microsoft services should make sure these features are turned “off” until they can perform analysis, test, and reduce business risk. Based on history, Microsoft will just roll this out, and we will wake up one day and see it in Office. It will be really cool at first and scary once it starts digging deep into data stores. Hang on, it will be a fun ride. If I were a lawyer, I’d stand up a whole practice around this service.
What do you think? Disagree? Agree? Are you worried about it? I am.