Algorithmic Colonization of Africa, part 2
Some “Global North” things to avoid in Africa… and in the Global North too!
(this is the second and final part of my own summary of a paper about algorithmic colonization of Africa. Please read the first part to get the full picture!)
Exhibit Four: terrorism instead of terrorismS
During the CyFyAfrica 2019 conference, the Head of Mission, UN Security Council Counter-Terrorism Committee Executive Directorate addressed the counterterrorism work being developed globally to combat online terrorism.
Unfortunately, the Director focused explicitly on Islamic groups, portraying an unrealistic and harmful image of global online terrorism.
Contrary to such portrayal, more than 60 percent of U.S. mass shootings in 2019 were, for instance, carried out by white-nationalist extremists.
Exhibit Five: race-oriented surveillance
Vumacam, an AI-powered surveillance company, is fast expanding throughout South Africa, normalizing surveillance and erecting apartheid-era segregation and punishment under the guise of “neutral” technology and security.
Vumacam currently provides a privately owned video-management-as-a-service infrastructure, with a centralized repository of video data from CCTV.
During apartheid, passbooks served as a means to segregate the population, inflict mass violence, and incarcerate black communities. Similarly, “[s]mart surveillance solutions like Vumacam are explicitly built for profiling, and threaten to exacerbate these kinds of incidents.”
Although the company claims its technology is neutral and unbiased, what the Vumacam software flags as “unusual behaviour” tends to be dominated by the black demographic, and most commonly by those who do manual labour, such as construction workers.
Exhibit Six: biometric IDs
As Kenya embarks on the project of national biometric IDs for its citizens, it risks excluding racial, ethnic, and religious minorities that have historically been discriminated against. [If that system is implemented as proposed] these minority groups would be rendered stateless and would face challenges registering a business, getting a job, or travelling.
What is missing: attention to minorities, and awareness that “Data are People”
Given that the most vulnerable are disproportionately affected by technology, it is important that their voices be central in the design and implementation of any technology that is used on or around them.
However, contrary to this, many of the ethical principles applied to AI are firmly utilitarian; the underlying principle is the best outcome for the greatest number of people.
This, by definition, means that solutions that centre minorities are never sought. [Doing so would require much] time, money, effort, and genuine care for the welfare of the marginalized, which often goes against most corporates’ business models.
Consulting those who are potentially likely to be negatively impacted might (at least as far as the West’s Silicon Valley is concerned) also seem beneath the “all-knowing” engineers who seek to unilaterally provide a “technical fix” for any complex social problem.
[Instead] The African equivalent of Silicon Valley’s tech start-ups [are] headed by technologists and those in finance, from both within and outside of the continent, who seemingly want to “solve” society’s problems by using data and AI to provide quick “solutions”.
The reduction of complex social problems to a matter that can be “solved” by technology also treats people as passive objects for manipulation.
The discourse around “data mining”, “abundance of data”, and “data rich continent” shows the extent to which the individual behind each data point is disregarded.
This muting of the individual, a person with fears, emotions, dreams, and hopes, is symptomatic of how little attention is given to matters such as people’s well-being and consent, which should be the primary concerns if the goal indeed is to “help” those in need.
Furthermore, this discourse of “mining” people for data is reminiscent of the coloniser attitude that declares humans as raw material free for the taking.
What to learn from the Global North (even IN the Global North!)
Insisting on a single AI integration framework for ethical, social, and economic issues that arise in various contexts and cultures is not only unattainable but also imposes a one-size-fits-all, single worldview.
AI, like Big Data, is a buzzword that gets thrown around carelessly […] This makes it extremely difficult to challenge the deeply engrained attitude that “all Africa is lacking is data and AI”.
The sheer enthusiasm with which data and AI are subscribed to as gateways out of poverty or disease would make one think that any social, economic, educational, and cultural problems are immutable unless Africa imports state-of-the-art technology.
People create, control, and are responsible for any system. For the most part such people consist of a homogeneous group of predominantly white, middle-class males from the Global North [and] “algorithms are opinions embedded in code”.
This widespread misconception further prevents individuals from asking questions and demanding explanations.
[At the end of the day] The question of technologization and digitalisation of the continent is also a question of what kind of society we want to live in.
A way forward for Africans
The continent has plenty of techno-utopians but few who would stop and ask difficult, critical questions. African youth solving their own problems means (among other things!):
- deciding what we want to amplify and showing the rest of the world;
- shifting the tired portrayal of the continent (hunger and disease) by focusing attention on the positive, vibrant culture (such as philosophy, art, and music) that the continent has to offer;
- not importing the latest state-of-the-art machine learning systems or other AI tools without questioning their underlying purpose and contextual relevance, who benefits from them, and who might be disadvantaged by their application;
- creating programs and databases that serve various local communities, rather than blindly importing Western AI systems founded upon individualistic and capitalist drives.
Who writes this, why, and how to help
I am Marco Fioretti, tech writer and aspiring polymath doing human-digital research and popularization.
I do it because YOUR civil rights and the quality of YOUR life depend every year more on how software is used AROUND you.
To this end, I have already shared more than a million words on this blog, without any paywall or user tracking, and am sharing the next million through a newsletter, also without any paywall.
The more direct support I get, the more I can continue to inform, for free, parents, teachers, decision makers, and everybody else who should know more about stuff like this. You can support me with paid subscriptions to my newsletter, donations via PayPal (mfioretti@nexaima.net) or LiberaPay, or in any of the other ways listed here. THANKS for your support!