AI inaccuracies and misinformation

So, Microsoft is, in effect, self-insuring its AI products. Maybe it's time to brush up on copyright law. Microsoft will defend AI users from copyright infringement lawsuits

Now this is a fascinating quote from counsel:

“We are charging our commercial customers for our Copilots, and if their use creates legal issues, we should make this our problem rather than our customers’ problem.”

If I’m a commercial customer of Microsoft (Office 365, OneDrive) and M$'s breach exposes my client data, it’s on them now?

I suspect Microsoft recognizes that it will ultimately be responsible anyway. This really is getting interesting.

Ed Bott, at ZDNet, has been writing lately about Microsoft's mid-to-long-term plans, including my fear that they have already put a stake in the ground on moving all software and services (including Windows) to the cloud. If they do that, and you are really only running off their backend services, then this acknowledgment as to AI does transfer to the whole liability issue for backend failures and breaches…

I just don’t want to be the first lawyer to have to tell the court - “But your honor, I use Windows 13, and it and all my files are in the cloud, and it was down at the deadline…”


If MS pushes this AI too hard and literally pushes everyone to the cloud, I will unequivocally switch to the Apple walled garden and not look back.


There’ll be AI inside Appletraz too.


Yeah but inside Apple I have advanced data protection—meaning I am protected from my data being siphoned.

Anyway, I am hoping that all this stuff is optional on the part of MS.

By the way, I typed a bad word thinking it would be made into the squares, but instead the post showed it and was hidden. :smirk: I edited so I think we are good.

So, quasi on topic to all this. We have discussed "computational photography" in multiple threads — both its obvious benefits and some recent abuses, such as the "fake moon" shots with the Samsung S23.

Along those lines, the thought has come up about reliance on digital photos as being truly representative of a scene/event, given the myriad ways that the AI portion manipulates the image.

I had lunch with our legal counsel, and he shared that another part of his firm is getting ready to challenge the admissibility of smartphone photos — in this instance in traffic court, of all things, in a case involving someone allegedly not observing a posted stop sign.

In this case the plaintiff is arguing with his own photos that the sign is not properly visible and, just as importantly, that the municipality's photo "evidence" of him running the stop sign should not be admissible due to AI-related manipulation.

The founding attorney has always been a big advocate of personal civil rights, has been concerned about the potential for malicious manipulation of digital images, and has thus been looking for an opportunity like this in court.

It will of course likely take some time to adjudicate, but regardless of outcome, the larger impacts could well be precedent-setting.


I have an issue that is similar but different in a case I am working on. It involves the synthesis of digital aerial images into data and back into an aerial “photograph.” We are challenging the admissibility of the “photograph” as such for much the same reasons.

The legal landscape in this area is nearly limitless.


I am highly confident that the vendors who provide traffic camera enforcement data (who are usually low bidders and make a percentage of the fine revenue as payment) haven't spent the time or money to harden their systems against tampering or manipulation. Likewise, the municipalities they serve are equally unprepared for a real chain-of-custody and digital-integrity fight.

I’ll be back with :popcorn: and :beers:


For those of us not of the American bent, are we talking about another cornucopia of US litigation?


Somebody had to sing it.

Every time Goldman Sachs has to go out of its way to tell us something is not a bubble, it's because it's a bubble. Tulips, anyone? AI is not a bubble


Elle and her performing partner are heroes in Roswell, NM (the spaceship crash site)

PS - yes, the ROSWELL SPACESHIP CRASH SITE. Do you really think the world's only atomic bomb air base in 1947 would mistake a weather balloon for a flying saucer? REALLY?


The lawsuits continue to proliferate. John Grisham, George R.R. Martin and other prominent authors sue OpenAI. Shame my daughter is not in law school yet. I read somewhere that Amazon is now rejecting self-published books written using ChatGPT.


Lots of lawyers are going to make lots of money.

But in terms of actually achieving anything… it's the law vs. the Internet. ML is just going to go on and on, regardless of the law now.

The cynic in me says that the lawyers are fine with this.

You CAN’T make this stuff up:

“Feedback loop of misinformation” sounds like a political phrase…


OK folks - I’m not a dumb lawyer, but I want someone to explain to me how this disclosure explanation from MS protects my proprietary client data while using Copilot?

@Bronsky, @Bishop, @Dellaster, @Desertlap - care to chime in? Still looks like your data has to be resident in Azure for the LLM "magic" to work, so I'm relying on M$ to not let it leak?

I didn't realize the ABA is on the hunt and trying to put the onus on developers (beware M$ - ha ha ha, Gates' father was a lawyer):

ABA Creates Artificial Intelligence Task Force | The Federalist Society.

Ironically, I was about to start a separate thread about this very issue with Copilot.

There are many industries and federal contractors (and others under traditional civil non-disclosure agreements) who, by law and/or contract, cannot allow their files to be "scanned" by third parties.

Many may already be in breach by allowing AI on their systems already.
