Algorithmic policing has been adopted in Canada, though at a slower pace than in the United States. Questions remain about the use of algorithms in screening, and discrimination cases and criminal charges that rest on algorithmic evidence are still contested. Human rights advocates have also come to the fore to argue that the usefulness of algorithms does not by itself justify their use, since that use can erode human rights. Nevertheless, algorithmic systems have proven effective at data collection: scattered data about an individual or event, held in different places, can be gathered instantly. This information can then be used for predictive purposes. The stated purpose of algorithmic policing is to improve the safety of Canadians.
Introduction
This reflection critically analyzes the implications of algorithmic systems for the Canadian policing landscape. Implementation is already underway, and artificial intelligence is applied in surveillance and predictive programs. The paper draws on journals and reports to provide a detailed analysis of the algorithmic system. The approach has had contested effects on Canadian society, as one author notes:
"with the rise of algorithmic governance…people experiencing homelessness can be policed and punished."[1]
Thesis statement: Algorithmic systems have become an important policy tool in the Canadian policing system; however, their use is difficult to reconcile with the Charter of Rights and Freedoms, as the rights to privacy, equality, and liberty are violated.
One Concept Learned from the Topic
Algorithmic policing in Canada can help monitor crime more closely by keeping a firm grip on offenders through the tracking of algorithmic data. Social media comments and posts, together with facial and personal tracking, may help reduce crime and improve city safety measures. Algorithmic policing is presented as a benefit for police departments; however, it poses threats to human rights, because individuals are no longer secure in their privacy, liberty, and equality.
Significance of the Topic
Algorithmic policing is revolutionizing the informational world, as people are classified at the regional, provincial, and federal levels. Information about people that was once scattered across different private and public sectors is now available in one place. This is, more or less, a surveillance technology, and it has by now been legalized through legislation. Algorithmic technologies are authorized to collect disparate information, as mentioned:
"provide an alternative understanding and implementation of citizenship, belongings, rights, ethics, morality, human agency, security and borders."[2]
This is not only about data collection; the data can also be used to identify trends and forecast the future. Police services have handled some severely harmful cases, and racially charged incidents have been addressed through surveillance technology. Because unfair treatment of individuals can follow from such predictions, professionals have raised concerns about whether surveillance technologies truly operate within the limits of the law.
Views from Other Scholars
Different implementations of algorithmic policing systems can be observed across Canada. Departments such as the Toronto Police Service and the Saskatoon Police Service collaborate with ministries to build predictive analytics labs in different regions, including Vancouver. The purpose is to ensure better safety measures by using algorithmic tools as an extension of the hub model for community safety; at-risk individuals can also be identified. The Calgary Police Service likewise uses such services, along with the Ontario Ministry of the Attorney General. Pretrial risk assessment tools are also applied, and the technology supports decision-making. The paper summarizes the linkage between university machine-learning projects and the development of risk assessment instruments to inform release decisions for young offenders. Most services use surveillance technologies such as licence plate readers, and the Calgary Police Service uses them for social media surveillance. Chat-room scraping and facial recognition technology are applied by almost all police services in Ontario[3].
In addition, algorithmic policing technology has been used in two main ways: making predictions or identifying trends, and increasing surveillance. Both carry human rights implications, and the methods can be location-focused or person-focused; the latter predict an individual's likelihood of engaging in illicit activity. Like the authors mentioned above, Kenyon's report also describes the types of surveillance technology in use, such as facial recognition and social media surveillance, among many others, which Canadian law enforcement agencies employ to make policing predictive. Named programs include the GeoDASH algorithmic policing system, while the Toronto Police Service uses Environics Analytics. Person-focused algorithmic policing is a newer technology discussed in detail in that report, and a police predictive analytics lab operates within the Saskatchewan police service. Social media content and telecommunications information are also used in person-focused algorithmic policing technologies. Though it is debated whether civil liberties are violated in this way, those liberties are in fact protected by international human rights law and the Canadian Charter of Rights and Freedoms. Algorithmic functions may endanger the rights to privacy and freedom of expression, along with the rights to liberty and equality; nevertheless, algorithmic technology continues to expand[4].
Analyzed from a human rights expert's point of view, it is fair to say that human rights violations cannot be justified simply by appeals to technological progress. Canadian police services have applied this technology, and A.I. is now used by 48 law enforcement agencies across Canada. Correctional systems also rely on it in bail and parole decisions. The case of Nijeer Parks is well known: the accused spent 11 days in jail before being released onto a pretrial monitoring program. Human rights remain at risk, however, owing to a lack of transparency. Peer-reviewed journals have raised serious questions about the inaccuracy of algorithmic systems and about the discrimination they can produce. Robertson notes that Ontario police arrested a person on the basis of a false facial recognition match, an act with racialized consequences. As one peer-reviewed journal asks:
"Can police officers serve as the harbingers of human rights in a world that desperately needs it?"[5]
Outstanding Questions
Several questions can be anticipated: whether law enforcement agencies are transparent in their use of algorithmic policing, and whether surveillance technologies operate within the jurisdiction of the law. It is also essential to consider how algorithmic policing bears on Charter rights and what justifications can be offered in such instances. Historic police datasets and current algorithmic policing technologies are under the control of the police; thus, who will monitor police departments, and how federal, regional, or provincial governments will limit the role of the police, are also serious matters for discussion.
Conclusion and Potential Areas for Improvement or Future Research
Little public information is shared about algorithmic matters. The policing technologies used by law enforcement agencies are known; however, other applications of this policy have not yet been explored. It could also be used in other areas of life, but those uses have not yet been considered. Moreover, statistics related to dark-net websites and location-based services have not been discussed. This information must be made public so that people may understand how algorithmic policing has transformed life; at present, no public data is available in this instance[6].
References
Arrigo, Bruce A., Brian G. Sellers, and Faith Butta. "Introduction: The Ultramodern Age of Criminology, Control Societies and 'Dividual' Justice Policy." The Pre-Crime Society, 2021, 1–14. https://doi.org/10.1332/policypress/9781529205251.003.0001.
CHRC. "Annual Report - Algorithms in Policing." 2021. https://2021.chrcreport.ca/algorithms-in-policing.html.
Humphry, Justine. "Policing Homelessness: Smart Cities and Algorithmic Governance." Homelessness and Mobile Communication, 2022, 151–81. https://doi.org/10.1007/978-981-19-3838-2_6.
Kenyon, Miles. "Algorithmic Policing in Canada Explained." The Citizen Lab, 2020.
Marina, Peter, and Pedro Marina. "Police, Power, Agency, and Human Rights." Human Rights Policing, 2022, 36–59. https://doi.org/10.4324/9781003220282-4.
Pekic, Alexander. "Toronto the Good? The Access T.O. Policy - Making Toronto a Sanctuary City," 2021. https://doi.org/10.32920/ryerson.14660754.v1.
Robertson, Kate, Cynthia Khoo, and Yolanda Song. "To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada [2020 C4EJ 67]." C4E Journal, October 10, 2020. https://c4ejournal.net/2020/10/06/kate-robertson-cynthia-khoo-yolanda-song-to-surveil-and-predict-a-human-rights-analysis-of-algorithmic-policing-in-canada-2020-c4ej-67/.
Singh, Shawn. "Algorithmic Policing Technologies in Canada." Manitoba Law Journal 44, no. 6 (2021).
[1] Humphry, Justine. "Policing Homelessness: Smart Cities and Algorithmic Governance." Homelessness and Mobile Communication, 2022, 151–81. https://doi.org/10.1007/978-981-19-3838-2_6. 1.
[2] Pekic, Alexander. "Toronto the Good? The Access T.O. Policy - Making Toronto a Sanctuary City," 2021. https://doi.org/10.32920/ryerson.14660754.v1. 1.
[3] Robertson, Kate, Cynthia Khoo, and Yolanda Song. "To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada [2020 C4EJ 67]." C4E Journal, October 10, 2020. https://c4ejournal.net/2020/10/06/kate-robertson-cynthia-khoo-yolanda-song-to-surveil-and-predict-a-human-rights-analysis-of-algorithmic-policing-in-canada-2020-c4ej-67/.
[4] Kenyon, Miles. "Algorithmic Policing in Canada Explained." The Citizen Lab, 2020. 1.
[5] Marina, Peter, and Pedro Marina. "Police, Power, Agency, and Human Rights." Human Rights Policing, 2022, 36–59. https://doi.org/10.4324/9781003220282-4. 47.
[6] Singh, Shawn. "Algorithmic Policing Technologies in Canada." Manitoba Law Journal 44, no. 6 (2021). 1.