Like many critical concepts on which social relations depend, artificial intelligence (AI) has assumed a symbolic significance. That is, AI has become a term that represents more than itself. It has become the way in which people refer to a cluster of meanings, approaches, and premises about technology and its relationship to social relations. AI has become its own idea, as well as the vessel for a host of sometimes incompatible premises about humans, human society, technology, and, more broadly, consciousness, and about the principles and assumptions around which social collectives (and individuals) cede authority over decision making (big and little) to processes, algorithms, analytics, and computational thought processes built on inductive reasoning from unending streams of iterative behaviors. What started out as a desire to produce simulation--the simulacra of humanity deployed to solve problems--has become a linked set of processes and applications that can (in an ironically dialectical way) drive human behavior through the logic of its own programming (considered here).
The idea of AI blends together distinct streams of technological development (e.g., deep learning, natural language processing, computer vision, robotics, and problem-solving or reasoning (expert) systems). The first learns from data (pattern finding and projection); the second translates and applies language as a sort of applied semiotics; the third teaches machines to interpret visual data; the fourth embeds machines with the capacity to perform tasks automatically; and the last, which connects data learning to the solving of problems, comes closest to attempting to mimic human reasoning and decision making in the context of traditional human individual or institutional tasks (one way of looking at it here). In the aggregate, where various levels of technological development are inserted into an increasing number of human actions and interactions, one can arrive at a point where the individual and the collective are effectively managed or curated at the micro and macro level by and through clusters of tech-based programming. At its limit, it can invert the relationship between the humans for whose benefit it is adopted and the ecologies of programs and functions that produce the benefit.
Pix Credit: clip from the movie Brazil (here)
That suggests the question of the nature and points of insertion of the human in these ecologies of tech and their management. It is at this point that human societies have done what they always tend to do in contemporary society--invoke the language and sensibilities of collective authority. These tend to produce variations on three impulses. The first is to suppress the creation of forms of big data tech, or its use, in whole or in part. The second is to control the biases built into the data harvesting and analytics of big data tech so that they reflect a privileged bias. The third is to determine points of mandatory human intervention in the operation, creation, assessment, or quality control oversight of these technologies and their use. Nonetheless, these are complicated legalities, and ones even harder to reduce to humanly intelligible text. This is especially the case because while tech is IN THE FLOW OF time, its human regulatory mechanisms exist at ONE MOMENT IN TIME (e.g., here). It becomes likely that big data tech may become as necessary to its own regulation as to the regulation of its application. More importantly, its various forms will likely become as embedded in public compliance regimes and the exercise of state power as they have become in other human social relations (see, e.g., REMARKS HERE; ACCESS PPT HERE).
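The third impulse, the mandatory human intervention point, is perhaps the easiest of the three to make concrete. The sketch below is minimal and purely illustrative (in Python; the names, the thresholds, and the toy "model" are my own assumptions, not drawn from any existing system): an automated decision is allowed to stand on its own only when it is both confident and low stakes; otherwise it is routed to a human reviewer.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical illustration: an automated decision that must be routed to a
# human reviewer when the model's confidence is low or the stakes are high.

@dataclass
class Decision:
    outcome: str          # what the system proposes to do
    confidence: float     # the model's own estimate of reliability (0-1)
    high_stakes: bool     # flag set by policy, not by the model

def decide(case: dict, model: Callable[[dict], Decision],
           human_review: Callable[[dict, Decision], Decision],
           confidence_floor: float = 0.9) -> Decision:
    """Apply the model, but insert a mandatory human intervention point."""
    proposal = model(case)
    # The "point of mandatory human intervention": low-confidence or
    # high-stakes cases are never decided by the machine alone.
    if proposal.confidence < confidence_floor or proposal.high_stakes:
        return human_review(case, proposal)
    return proposal

# Toy usage: a trivial "model" and a reviewer who simply takes over the case.
if __name__ == "__main__":
    model = lambda case: Decision("approve", confidence=case.get("score", 0.5),
                                  high_stakes=case.get("amount", 0) > 10_000)
    reviewer = lambda case, p: Decision("escalated for human decision", 1.0, p.high_stakes)
    print(decide({"score": 0.95, "amount": 100}, model, reviewer).outcome)     # machine decides
    print(decide({"score": 0.95, "amount": 50_000}, model, reviewer).outcome)  # human decides
```

The code is trivial; the legalities live in where the threshold sits, who sets it, and who answers for the cases the machine is permitted to decide alone.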
The idea of AI also blends together a host of tasks to which it is directed. Many of them are a function of capacity--data-based problem solving. Their differentiation is a function of the complexity of the problem to be solved and the autonomy of decision making. The latter can be infinitely divided around sub-programming or aggregated to produce a "final" or "end product" decision or action. They are based on data, analyzed against premises, assumptions, and expectations at least initially programmed into the process by humans. The analytics can themselves become part of the problem-solving ecology, permitting the technology to modify its analytical functions on the basis of its encounters with data, within whatever ultimate action parameters (and caps) may be inserted. But large enough data streams can assume a life of their own--and, projected through time, can themselves serve as the basis for evolving analytics that then change the parameters of analysis. In one sense it is a very high-volume chartist approach, but with expanded capabilities and functions (e.g., predicting the cost of transport).
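A stripped-down sketch may make that last point, the evolving analytics, more tangible. The fragment below (Python; the per-kilometer and per-kilogram cost structure, the learning rate, and the invented shipment stream are all assumptions for illustration) treats transport-cost prediction as an online model: every new observation adjusts the parameters, so the analytics are continually remade by the data stream rather than fixed at the moment of programming.

```python
# Hypothetical sketch: an online (streaming) predictor of transport cost.
# The parameters are not fixed once by a programmer; every new observation
# nudges them, so the analytics themselves evolve with the data stream.

def online_cost_model(learning_rate: float = 0.0001):
    weights = [0.0, 0.0]   # [per-km rate, per-kg rate], learned, not programmed
    bias = 0.0

    def predict(distance_km: float, weight_kg: float) -> float:
        return weights[0] * distance_km + weights[1] * weight_kg + bias

    def update(distance_km: float, weight_kg: float, observed_cost: float) -> float:
        nonlocal bias
        error = predict(distance_km, weight_kg) - observed_cost
        # One stochastic-gradient step: the encounter with data changes the model.
        weights[0] -= learning_rate * error * distance_km
        weights[1] -= learning_rate * error * weight_kg
        bias -= learning_rate * error
        return error

    return predict, update

if __name__ == "__main__":
    predict, update = online_cost_model()
    # Invented stream of (distance_km, weight_kg, observed_cost) shipments.
    stream = [(10, 5, 22.0), (40, 2, 55.0), (25, 10, 48.0)] * 200
    for d, w, c in stream:
        update(d, w, c)
    print(round(predict(30, 6), 2))  # projection from the learned parameters
```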
None of this makes much sense without beginning to appreciate the extent to which dependency is already emerging within complex webs of differentiated big data tech. But even getting to the data needed to answer that query, in contemporary society, requires, or is made possible only by, invoking the technology itself. Thus, for example, a Google query on "AI applications" produces a suggestion that one use Google Cloud. Even the search itself is possible only through the application of big data tech. And even a cursory search suggests the current breadth of use.
Some examples: (1) Sam Daley describes "56 Artificial Intelligence Examples Shaking Up Business Across Industries"; (2) Avijeet Biswal describes "18 Cutting-Edge Artificial Intelligence Applications in 2024" across a variety of sectors; (3) Passionate also notes 18 sectors in which various forms of AI may be encountered (here); (4) Forbes describes "Applications of Artificial Intelligence Across Various Industries," suggesting that, at least for the moment, "AI usage is particularly prominent in finance, digital spaces (like social media, e-commerce, and e-marketing) and even healthcare"; and (5) Siddhesh Shinde, for Emeritus, explored the ecologies of big data tech in healthcare, e-commerce, robotics, education, finance, marketing, banking, social media, business, and sustainability (here).
These programs focus on the automation of processes and interactions, compliance, and all forms of data-based decision making. Robotics and chatbots can project these functions beyond a computer, that is, make them mobile and able to be projected from all sorts of other devices. One begins to see the ecologies, the networks, of automated action, decision making, and interaction that together already provide a substantial connection between big data tech and its human users. It is now far more likely that these programs and applications in the aggregate will shape the terrains and interactions between tech and humans (individuals and collectives) from the bottom than that the current crop of top-down efforts coming from traditional, hierarchically superior human institutions will do so. The flow of time makes less relevant any single instance of its memory recorded, for example, as law. It is therefore likely, as well, that legality must find a way of inserting itself back in time--to become automated in a sense.
That, in turn, may require a fundamental shift in our understanding of the role of law in the management of human social relations--and of the state. To those ends, the future comes not so much from the top as from the bottom--not from the conceptual universe of control but from the realities that shape human collectives within the interlinked platforms of functionally differentiated users and producers. AI moves from fetish to form, and the fundamental ordering principles of consciousness, control, subjectivity and its dialectics (iterative, inductive, deductive) become a function of the interplay between the ideal and the imagined ideal represented in the aggregate and moving collective manifestations of an infinite number of interactions that harden into principles that can be applied to (re)shape those behaviors. In the process the normative foundations--human rights, sustainability, constitutionalism, administration--may find themselves in their current forms on the wrong end of history. And the state becomes a meta-platform of automated decision making and of descriptive and predictive analytics in the service of norm maintenance, policy, and the mechanisms of interaction with a subject population.
Mimesis also suggests the morphing of the democratic principle from active consultation and elections to passive extraction of sentiment by and through analytics (machine-based and perhaps automated) of trends in aggregate behaviors, with the state as the coordinator and client of a large focus-group society in dialectical conversation with its objects. But it is far too early to tell how all of this will unfold, though not too early to begin to see it in broad and tentative outline.
Pix Credit: clip from the movie Brazil (here)
So here is a guess: Law and its legalities (however expressed) may have to shift from command to principle and objective; it may have to shift from control to oversight, and from prosecution to quality control and accountability. It may have to reconstruct itself from external enforcement to internally produced nudging with positive and negative consequences--substantially automated. That, at least, may work for the everyday world of routine behaviors: a great shift of authority from elected officials and a direct connection between the population and its government, to masses of techno-bureaucrats functionally differentiated to develop, construct, and apply the machinery of big data tech to the project of managing humans. And over it all, a new sort of law--constituting but not constitutional, normative but not statutory, providing the ideal against which the machinery of control may be deployed--and a new sort of bureaucracy. Law will become less precise in order to survive, and the locus of law process will coalesce around the management and control of the classes of administrators whose role will be to manage tech and to exercise discretion in iteratively evolving human activity. Abuse of discretion and good faith--already within the palette of legal doctrine--will likely emerge as the great centers of constitutional control. And "street level" law will become a manifestation of automated decision making. At its limit, even the techno-bureaucrats at the bottom will be superseded by tech. And the question will divide into three parts: (1) what one does to or for people "left behind"; (2) what one does to or for the redundant human; and (3) how one manages and controls the enormous industry of norm making and the care and maintenance of these networks of automated control.
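What automated, "street level" nudging might look like in the very small can at least be gestured at. The sketch below is hypothetical in every particular (Python; the graduated consequence schedule, the thresholds, and the audit log are invented for illustration): a norm is expressed as an automated schedule of consequences, every determination is logged so that accountability and abuse-of-discretion review remain possible, and only the residual, high-deviation cases are escalated to a human administrator.

```python
# Illustrative only: a norm expressed not as a command enforced after the fact,
# but as an automated schedule of nudges, with every determination logged so
# that a human administrator (and later review) can audit how it was applied.

import json
from datetime import datetime, timezone

CONSEQUENCES = [            # invented graduated schedule: (threshold, consequence)
    (0.0, "no action"),
    (0.2, "warning nudge"),
    (0.5, "loss of privilege"),
    (0.8, "escalate to human administrator"),
]

AUDIT_LOG = []              # the record against which discretion can be reviewed

def apply_norm(subject_id: str, deviation: float) -> str:
    """Map a measured deviation from the norm (0-1) to an automated consequence."""
    consequence = "no action"
    for threshold, outcome in CONSEQUENCES:
        if deviation >= threshold:
            consequence = outcome   # the highest threshold crossed governs
    AUDIT_LOG.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "subject": subject_id,
        "deviation": deviation,
        "consequence": consequence,
    })
    return consequence

if __name__ == "__main__":
    print(apply_norm("subject-001", 0.10))  # no action
    print(apply_norm("subject-002", 0.60))  # loss of privilege
    print(apply_norm("subject-003", 0.95))  # escalate to human administrator
    print(json.dumps(AUDIT_LOG, indent=2))  # the trail accountability depends on
```

The code is, again, the least of it; who writes the schedule, who audits the log, and who hears the escalations are the questions the paragraph above poses.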