AI Facial Recognition Company & Privacy Law, Drone Interference, And DIY Silencers
Your face might already live in a searchable database—and BC’s courts just drew a sharp line around what companies can do with it. We break down a major ruling that upholds the privacy commissioner’s order against Clearview AI, unpack why “publicly available” doesn’t mean “free to scrape,” and explain how a province can regulate a US firm with no brick-and-mortar presence. This is a story about jurisdiction in the age of the internet, biometric data rights, and the limits of consent on social media platforms Canadians use every day.
From there, we pivot to a wildfire zone, where a tiny drone met a big legal problem. When a helicopter pilot fighting the Kelowna blaze was irritated and distracted by a nearby drone, the court found that distraction alone interfered with fire control under the Wildfire Act. We walk through the difference between strict and absolute liability, why due diligence matters, and how “no harm done” isn’t a shield when public safety is at stake.
We close with a sign of the times: 3D printed suppressors that triggered prohibited device charges. Beyond the plastic parts and lab delays, the headline is new criminal exposure for simply accessing or possessing digital files intended to produce firearms or key components. We talk through how Canadian firearms law treats suppressors, why courts imposed a conditional sentence rather than jail in this case, and what makers and hobbyists need to know before downloading a file that could cross a legal line.
If privacy, drones, or maker tech lives anywhere near your world, this episode offers clear, practical takeaways: don’t assume public equals fair use, steer drones far from emergency operations, and think twice before clicking on gun-printing files. Subscribe, share with a friend who needs a reality check on tech and law, and leave a review to tell us where you think the line should be drawn next.
Legally Speaking with Michael Mulligan is live on CFAX 1070 every Thursday at 12:30 p.m. It’s also available on Apple Podcasts or wherever you get your podcasts.
Legally Speaking Feb 19 2026
Adam Stirling [00:00:00] Time for our regular segment. Joined as always by Barrister and Solicitor with Mulligan Defence Lawyers, it’s Michael Mulligan with Legally Speaking. Good afternoon, Michael, how are you?
Michael T. Mulligan [00:00:09] Hey, good afternoon. I’m doing great. Always good to be here.
Adam Stirling [00:00:12] Interesting items on the agenda this week. I am reading Privacy Act order, it says, to stop scraping facial data from social media upheld on appeal. What was happening?
Michael T. Mulligan [00:00:22] Well, you may not have realized it, but you've probably been scraped. This arises out of an investigation by the Information and Privacy Commissioner in British Columbia, and in a couple of other provinces, and it has to do with a company called Clearview AI Inc., which is a US company. The business model of Clearview AI Inc. is to scrape information from websites that are accessible to its scraper engine, things like YouTube, Instagram, Facebook, and others, and to then use the images of people, along with metadata associated with the original pictures, to create a facial data database which it stores indefinitely. Back in 2017, when all of this started, they had some 3 billion individuals who had been identified using that system. Then by late 2023, they'd managed to scrape and examine some 30 billion images to enhance their database. And currently, if you look at what that company has on their website, I think they're up to something like 70 billion images. So what they do is scrape all of this information, try to connect it up with metadata to identify the people in the images they've captured, and then sell searches, for things like criminal investigations or U.S. national security. Interestingly, on the website they will also sell it to public defenders. I'm not sure why you'd have to be represented by a public defender, but that's at least how they're marketing it. And so that's how they make money. And so the legal issue was whether this activity, the scraping of this information, collecting it up and making this database to identify people, was permitted. The idea is that if you had some picture from a security camera, you could upload it, it would search against their database of scraped facial data, and it would identify the person.
And so the privacy commissioner, and the ones in the other provinces that were looking into this, were assessing whether this process contravened the protection of privacy legislation, which is provincial legislation in each of those provinces. And their conclusion was that this was not permitted. There were really a couple of issues that had to be sorted out there, which were then the subject of first a judicial review of their decision and now a just-released decision from the British Columbia Court of Appeal. The first issue that had to be sorted out was whether the province of British Columbia has constitutional authority to regulate the activity of Clearview AI Inc., and the question there is whether Clearview has a sufficient connection to British Columbia. The province of British Columbia wouldn't have authority to go off and order some company based in Louisiana to do something if it had nothing to do with British Columbia. And that has become, over the past few years, as the internet has become so pervasive, a more complicated issue than it once was. At some point, you would have been able to figure out whether a company had a substantial connection to British Columbia or some other province by asking, well, do they have an office here? Do they have any employees here? Do they sell stuff here? But it becomes much more ambiguous when you've got companies whose services are marketed online on a website. And that was actually litigated in B.C. in the context of Google a few years ago. Google's argument was, well, no, you can't regulate us, we're just a passive service that might be used in British Columbia, and we have no control over that. And that didn't work for Google.

In the Google case, the courts looked at things like the fact that Google sells advertising that could be targeted at people just in British Columbia, for example. Looking at how the service was actually used under the modern reality, they concluded that yes, there is a sufficient connection there. And in the case of Clearview, not only are they scraping data from people that live in British Columbia, but they were also marketing their service to Canadians, including people in British Columbia, at least when the investigation started. Now, perhaps detecting the way the wind was blowing, the company then indicated it was stopping its marketing of services to Canadians or people in British Columbia. But again, in the internet context, what does that really mean? Because of course, it took me all of about five seconds to type in the name of the company and come up with their website, where I could request a demo and so on. So no, they haven't mailed anything out to me, but clearly there it is. And so the Privacy Commissioner concluded, yes, there is authority to do that. And ultimately, that was upheld both on the initial judicial review and this most recent review by the Court of Appeal. The interesting legal thing to know about those kinds of judicial reviews on issues like jurisdiction is that the test for the court is whether the decision was correct. Because there's no special knowledge possessed by the privacy commissioner about the operation of law or constitutional authority and so on, the courts look at that afresh. And even on an appeal, it's looked at fresh by the Court of Appeal; they don't presume the original judge to have been correct, and there's no deference owed to them. But despite all of that, and despite the ambiguity caused by the fact that Clearview doesn't have a brick-and-mortar office in B.C.
And despite the fact that they said, no, we're going to stop marketing to people or entities in Canada, the courts concluded that the fact they're actively collecting and scraping this kind of data did provide a sufficient connection to allow constitutional authority for B.C. to regulate them. But then came the other arguments from the company, including the reasonableness of what was concluded, and whether there was an exception in that privacy legislation for getting information that's on publicly available websites. Now, the way that works is that the privacy legislation in British Columbia sets out when a company is permitted to collect and use personal information, and it provides a number of express circumstances where that can be done. That includes information that appears in, and then it lists a bunch of things, a printed or electronic publication that is available to the public, including, so it's not limited to, a magazine, book, or newspaper in printed or electronic form. And so the company's argument there was, well, look, this is an electronic publication, it appears to be publicly available, so we can just scrape up your information. You put it on Instagram, you opened up your settings, we can get it, we're taking it. That didn't work, either initially or on the two levels of appeal. The conclusion was that that kind of material, like on social networks, is not the same kind of publication that would occur in a newspaper or magazine, even if those things were put online, on the theory that much of that material is created by individuals rather than the purveyor of it. So that didn't fly. The other argument made by the company was that it was just impossible to comply with the order that was made, which was to use reasonable efforts to stop collecting information about people in British Columbia.
And part of the problem the company had there is that apparently they were involved in litigation in Illinois, and the company had agreed to take steps to stop acquiring facial data from Illinois residents and to block searches of facial data for people who resided in Illinois. So that was not a compelling piece of information when the company was saying, this is just impossible, we can't do it for people in B.C. The company also argued, well, it's just too vague to say you have to take reasonable steps, or reasonable efforts, to do this, and that got rejected as well. That's a term that's been used by courts repeatedly in making orders, and the idea is that you may not run afoul of it if you do your very best but something sneaks through. You know, somebody indicates they live in, I don't know, Saskatchewan when in fact they live in British Columbia, and you just made a mistake. And so the upshot of all that, which is analyzed on a reasonableness basis rather than correctness, is that both the original chambers judge and now the Court of Appeal have concluded that the commissioner was reasonable in concluding that the legislation did not permit the scraping of information off of social media sites to be used in this kind of a database. And so the order stands, so says the Court of Appeal, and I guess we'll have to wait and see what happens now with Clearview AI. But this is the modern reality of what is going to happen to your information: if some company is capable of collecting it up, it is happening. I mean, you'll see this even for your own use now. If you have an iPhone or something and you take a bunch of pictures and label who's in a picture.
It'll dutifully figure out all the other pictures of that person and organize them, and you can search by name and so on. So it's all happening for you, and not only is that technology happening for you, it's now happening with billions of pictures. In reality, 70 billion images they've now collected to do that. The population of the earth is 8.3 billion people, so that would suggest they've got many pictures of each person, or maybe there's a lot of useless cat videos or something out there, but nonetheless, it's a huge trove of information. And no doubt there are law enforcement benefits to that, or maybe even benefits for public defenders. But all of that has to be weighed up against the privacy considerations: do you really want your image in some database where people are able to identify and track you and so on? So at least the law in B.C. is that what Clearview is doing is not permitted. Hopefully they get on with doing something along the lines of what they're doing in Illinois and try to fish that data out of their system, and maybe we'll all have just a little bit more privacy.
Adam Stirling [00:10:58] Alright, Legally Speaking will continue right after this commercial break.
[00:11:02] COMMERCIAL.
Adam Stirling [00:11:02] Legally Speaking continues on CFAX 1070, joined as always by Michael Mulligan, Barrister and Solicitor with Mulligan Defence Lawyers. Michael, up next it says, a conviction for flying a drone interfering with fire control. What happened?
Michael T. Mulligan [00:11:16] So this is a prosecution under the Wildfire Act of British Columbia. It's a good act, and in particular, this case dealt with the Kelowna wildfire from back in August of 2023. The fellow who was charged here was in a private boat, operating what's described as a miniature drone, taking footage of the damage caused by the wildfire. And the challenge arose because there was a helicopter in the area which was tasked with picking up buckets of water, flying over, and dumping them on the fire. Now, it's not illegal to operate a drone generally, but the Wildfire Act makes it an offence, a provincial offence, for a person, without lawful excuse, to operate equipment, machinery, a vehicle or a vessel, or act in a manner, that interferes with fire control being carried out under the act. So the first thing the judge had to analyze at this trial is that there are different categories of offence in terms of regulatory or criminal offences. For criminal offences, ordinarily there's going to be a requirement to prove that a person willfully or knowingly committed the act. Criminal law is generally concerned with willful misconduct, not accidents or unintentional conduct. But this Wildfire Act is regulatory, it's not criminal, and you don't get a criminal record for it. And for those kinds of regulatory offences, there are two different categories: what are called strict liability offences, and absolute liability offences. I'll start with absolute liability, which is the most strict one, where the Crown only needs to prove that you did the thing. An example of that would be speeding. It's not a defence to speeding to say, I didn't realize I was speeding, or I misread the sign that said the speed limit was eighty when I was driving a hundred. That's not a defence.
If they prove you were speeding, well, that's pretty well that. Then there's this middle category of what's referred to as a strict liability offence, where the Crown just needs to prove that the act was committed, and then you can have a defence if you establish, or raise a reasonable doubt, that you took all reasonable steps to prevent it from occurring, or that you believed in a set of facts which, if true, would mean you weren't committing the offence. So there's some scope for a defence beyond simply proving that, yes, your car went over the speed limit. Now, in that regard, first of all, the judge had to focus on, well, what does this thing actually prohibit? And the judge didn't have any problem finding that the drone constituted equipment or machinery, so that wasn't really the issue. But one of the issues was, did this interfere with what was going on? In that regard, interestingly, the helicopter pilot testified at the trial. He had 36 years of pilot experience, and he testified about what he does: pick up buckets of water and dump them on the fire. And he said, well, it's not uncommon for people to have boats out there looking at what's going on. That doesn't cause him any problem at all; he just flies to a different part of the lake and scoops up water over there. But in this case, he indicated that he observed what he first thought was a bird, but then noticed it was a drone, and he was irritated by the drone. His evidence didn't seem to be that the drone was going to somehow knock his helicopter out of the sky, being miniature and all, but it irritated him. And so, as a result of his irritation, he picked up two smaller buckets of water, one after another, and decided to try to knock the drone out of the air himself, by flying over and dumping them, trying to hit the drone.
Adam Stirling [00:15:09] Oh No.
Michael T. Mulligan [00:15:09] And he missed, twice. And then he decided he'd go back to his business and flew away from the drone, and kept collecting his water and going back over and dumping it on the fire. So one of the issues there, I suppose, would be, does that really interfere with it? Is it the irritation of the helicopter pilot? And the judge accepted that the potential risk caused by the irritation, and by removing the focus of the helicopter pilot, was sufficient to make out that fire control was being interfered with, because of the irritation and distraction the drone was causing, even though, on the evidence, it didn't appear to be any kind of actual danger to anything that was going on. That was interesting, of course, given the pilot's evidence about boats being around and that he didn't care about those, he would just fly somewhere else and carry on. But the judge found that to be enough, and found that the fellow had not made out the defence that he had taken all reasonable steps, or that he didn't know the fire was going on or something. And so, as a result, even though it's strict liability, not absolute, and even though there was no evidence that the drone actually caused some danger to the helicopter, the fact that it was irritating and distracting was enough to found a conviction, and so the man was found guilty of the regulatory offence. I guess the takeaway there is to bear that in mind if you're flying a drone around a forest fire that's being fought. Even if your drone is tiny and isn't actually doing anything, you might wind up irritating the helicopter pilot, and you could find yourself running afoul of the Wildfire Act and, like this man, end up with a conviction and probably some kind of a fine. I'm not sure whether he's ever going to see the drone again, but that's the latest on the Wildfire Act and why you shouldn't fly drones around wildfires.
Adam Stirling [00:17:01] All right, just under four minutes left. We're out at 59. Another high-tech case: a sentencing for 3D printing of two gun silencers. What happened?
Michael T. Mulligan [00:17:11] Well, this is a sign of the times. You have this fellow in this case whom the police came to arrest for unrelated reasons. And while doing so, they noticed some 3D-printed things: a 3D-printed spring-loaded knife, a 3D-printed bullet, whatever that means, and some other items. Eventually they went and got a warrant, came back and conducted a search, and seized a whole bunch of stuff that this man had printed out on a 3D printer. They sent all of these things, including things that were shaped like guns but weren't, off to a lab to test what they all were, to determine whether any of them constituted prohibited weapons. Now, for reasons unknown, the laboratory took months and months to do that. The guy wound up waiting on bail for some 26 months, most of which was spent by the laboratory testing these items. And ultimately the laboratory came back and concluded that two of the items were prohibited: not any of the printed-off plastic guns, but two things which amounted to silencers. In Canada, prohibited devices include a device intended to muffle or stop the sound or report of a firearm, something I think you could screw on the end of a gun to make it quieter. Interestingly, other countries take a very different view of that. The UK, for example, encourages people to use silencers on guns, to be polite and not disturb people. But in Canada we've decided to make that prohibited, because we want the guns to be very loud.
Adam Stirling [00:18:51] Yes, very British, very polite.
Michael T. Mulligan [00:18:53] You know, so here they are prohibited, and these things, even though they're plastic printed objects, met that definition in the view of the expert. Now, another thing to mention here, and this is important: since this case, we actually have a new provision in the Criminal Code, section 102.1. It makes it a crime to possess or access computer data that could be used to do things like 3D print items like a gun. So now just going to a website and accessing that kind of information, even if you never print anything at all, if you have it, access it or distribute it, you've committed the crime. Be aware of that, because there are, again, other places like the United States where that's not a crime at all, and in fact it may be constitutionally permitted in some places for people to make their own gun. It's probably very state-by-state, very different regimes, but it's one big internet, just like in that Clearview case. So be careful, because you can very easily commit a crime by accessing those things, and it's taken seriously in Canada. This fellow pled guilty after being on bail for 26 months. Ordinarily, in cases involving prohibited firearms, jail sentences are imposed. But here the judge asked whether that was necessary: the guy's not any risk, he's been on bail for 26 months, hasn't done anything, and there's no reason to think he's ever going to do anything. He didn't print them for some illegal purpose; he was just interested in making them for his own amusement, I guess. He wasn't doing anything with them. And so the judge imposed a conditional sentence order, which is like house arrest, for a period of 120 days. So a fairly significant impact for the man. But that's the latest on 3D printing and how it can quickly turn into an offence if the item you print off happens to meet the Canadian definition of a prohibited device, and that includes a silencer.
So that’s the latest on 3D printed silencers and what not to do.
Adam Stirling [00:20:57] Michael Mulligan with Legally Speaking during the second half of our second hour every Thursday here on CFAX 1070. Thank you so much. Pleasure as always.
Michael T. Mulligan [00:21:04] Thanks so much, always great to be here.
Automatically Transcribed on February 24, 2026 – MULLIGAN DEFENCE LAWYERS