
CBSA to use facial recognition app for people facing deportation: documents

The mobile reporting app would use biometrics to confirm a person's identity
A Canada Border Services Agency (CBSA) sign is seen in Calgary, Alta., Thursday, Aug. 1, 2019. The Canadian Border Services Agency plans to implement an app that uses facial recognition technology to keep track of individuals who have been ordered to be deported from the country. THE CANADIAN PRESS/Jeff McIntosh

The Canada Border Services Agency plans to implement an app that uses facial recognition technology to keep track of people who have been ordered to be deported from the country.

The mobile reporting app would use biometrics to confirm a person's identity and record their location data when they use the app to check in. Documents obtained through an access-to-information request indicate the CBSA has proposed such an app as far back as 2021.

A spokesperson confirmed that an app called ReportIn will be launched this fall. The CBSA said in a follow-up comment the app could also be used for permanent residents and foreign nationals who are subject to inadmissibility hearings.

Experts are flagging numerous concerns, questioning the validity of user consent and potential secrecy around how the technology makes its decisions.

Each year, about 2,000 people who have been ordered to leave the country fail to show up, meaning the CBSA "must spend considerable resources investigating, locating and in some cases detaining these clients," says a 2021 document.

The agency pitched a smartphone app as an "ideal solution."

Getting regular updates through the app on a person's "residential address, employment, family status, among other things, will allow the CBSA to have relevant information that can be used to contact and monitor the client for any early indicators of non-compliance," it said.

"Additionally, given the automation, it is more likely that the client will feel engaged and will recognize the level of visibility the CBSA has on their case."

Plus, the document noted: "If a client fails to appear for removal, the information gathered through the app will provide good investigative leads for locating the client."

An algorithmic impact assessment for the project, not yet posted on the federal government's website, said biometric voice technology the CBSA tried using was being phased out due to "failing technology," and it developed the ReportIn app to replace it.

It said a person's "facial biometrics and location, provided by sensors and/or the GPS in the mobile device/smartphone" are recorded through the ReportIn app and then sent to the CBSA's back-end system.

Once people submit photos, a "facial comparison algorithm" will generate a similarity score to a reference photo.

If the system doesn't confirm a facial match, it triggers a process for officers to investigate the case.

"The individuals' location is also collected every time they report and if the individual fails to comply with their conditions," it said. The document noted individuals will not be "constantly tracked."

The app uses technology from Amazon Web Services. That's a choice that grabbed the attention of Brenda McPhail, the director of executive education in McMaster University's public policy in digital society program.

She said while many facial recognition companies submit their algorithms for testing to the U.S. National Institute of Standards and Technology, Amazon has never voluntarily done so.

An Amazon Web Services spokesperson said its Amazon Rekognition technology is "tested extensively, including by third parties like Credo AI, a company that specializes in Responsible AI, and iBeta Quality Assurance."

The spokesperson added that Amazon Rekognition is a "large-scale cloud-based system and therefore not downloadable as described in the NIST participation guidance."

"That is why our Rekognition Face Liveness was instead submitted for testing against industry standards to iBeta Lab," which is accredited by the institute as an independent test lab, the spokesperson said.

The CBSA document says the algorithm used will be a trade secret. In a situation that could have life-changing consequences, McPhail asked whether it's "appropriate to use a tool that is protected by trade secrets or proprietary secrets and that denies people the right to understand how decisions about them are truly being made."

Kristen Thomasen, an associate professor and chair in law, robotics and society at the University of Windsor, said the reference to trade secrets is a signal there could be legal impediments blocking information about the system.

There害羞草研究所檚 been concern for years about people who are subject to errors in systems being legally prohibited from getting more information because of intellectual property protections, she explained.

CBSA spokesperson Maria Ladouceur said the agency "developed this smartphone app to allow foreign nationals and permanent residents subject to immigration enforcement conditions to report without coming in-person to a CBSA office."

She said the agency "worked in close consultation" with the Office of the Privacy Commissioner on the app. "Enrolment in ReportIn will be voluntary, and users will need to consent to both using the app, and the use of their likeness to verify their identity."

Petra Molnar, the associate director of York University's refugee law lab, said there is a power imbalance between the agency implementing the app and the people on the receiving end.

"Can a person really, truly consent in this situation where there is a vast power differential?"

If an individual doesn't consent to participate, they can report in-person as an alternative, Ladouceur said.

Thomasen also cautioned there is a risk of errors with facial recognition technology, and that risk is higher for racialized individuals and people with darker skin.

Molnar said it's "very troubling that there is basically no discussion of … human rights impacts in the documents."

The CBSA spokesperson said Credo AI reviewed the software for bias against demographic groups, and found a 99.9 per cent facial match rate across six different demographic groups, adding the app "will be continuously tested after launch to assess accuracy and performance."

The final decision will be made by a human, with officers overseeing all submissions, but the experts noted humans tend to trust judgments made by technology.

Thomasen said there is a "fairly widely recognized … psychological tendency for people to defer to the expertise of the computer system," where computer systems are perceived to be less biased or more accurate.

Anja Karadeglija, The Canadian Press
