Thursday, May 23, 2024

From the Valley of the Heart’s Delight to the Valley of the Heart’s Appropriation

Prior to the late 1950s and early 1960s, what we know as Silicon Valley was a vast swath of fruit orchards, mostly peach, plum, and apricot. Funding from the US Departments of Defense and Energy to Hewlett-Packard, Fairchild, and other start-ups transformed the luscious orchards into concrete, steel, and glass campuses with a category of yield light-years from the former flora. Nurtured by students and faculty from Stanford, Santa Clara University, San Jose State University, and UC Berkeley to the north, the region has become an international hothouse of digital technologies, with all their promise and profound challenges.

Since stories seed our imagination, I use the contrast between the Valley of the Heart’s Delight and the Valley of the Heart’s Appropriation to portray the profound differences between the two landscapes. Where the orchards evoked our human connections to the rhythm of the seasons and the senses of smell, taste, touch, and sight, the digital tech industry is prying us away from those rhythms, and thus from an integral part of our humanity.

This narrative is not an attack on metropolitan or suburban culture. It is an attempt to underscore the profound challenges that emerging applications of artificial “intelligence” pose to our experience of being connected to our biological environment and to our interpersonal relations. The quotation marks emphasize the misnomer that has taken up residence in our consciousness. Digital tools designed to mimic human brains are mechanisms for correlating millions of bits of information, labeled and organized by workers and embedded with the values and biases of those workers, their employers, and the contributors to the databases themselves. The tools are not complex, organic, nonlinear, unpredictable beings. And our brains are not machines, yet we are besieged with mechanical applications that promise to make us more “productive,” replace caregivers, keep people from feeling lonely, and tell us whether we are pretty, smart, or cool, while invading our personal privacy and driving us to anger, self-contempt, suicide, and political fragmentation. Among the many existential questions of this digital era are “Can you have an I-Thou relationship with AI?” and “Would you want one in the first place?”

Ironically, while the explosion of remote work has given workers more time with family and, presumably, their natural environments, it is at least some of these same workers who are being incentivized to create digital tools that do precisely the opposite: transforming interpersonal connections from immediate, physical contact into remote contact and mechanical impersonations, such as robots to minister to the frail and the isolated. Remote work is also enabling employees to provide support and care to children and frail parents and to have more autonomy over their work-life balance, yet at what cost to those to whom their wares are pushed? This paradox mirrors the practice of many Silicon Valley parents who do not allow their own children to use the very products they promote for other people’s children.

Through our personal relationships we all participate in interdependent networks. In this way we are in continuous learning and adaptive mode. A vital skill is the capacity to identify emerging patterns of behavior and to determine where we want to put more energy for the outcomes we deem positive. Our decisions are influenced by the economic system and by stories about who, how, when, where, and what is appropriate for generating revenue and financial “success.” The dysfunction of rapacious surveillance capitalism, a variant of “neoliberalism,” is becoming increasingly obvious: “Google helps us find information for free, … in return it seizes every opportunity to keep us from actually digesting that information, pushing us to click on the next personalized ad instead.” Why? Because “they earn money whenever customers click on their advertising” (The Shallows).


“Hyperobjects” (Otto Scharmer, The Consilience Project) are challenges that currently have no simple, clear solutions and over which experts disagree. The primal challenges AI poses to our experience of being human join the conundrums of climate change, mass migration, water and air pollution, nuclear war, hunger, vast disparities in wealth within and across nations, and demographic change in which lower birth rates threaten financial stability.

It is time to train our focus on designing systemic regulation of AI development by understanding the economic model driving the emerging technology. Let us use medical research as a potential model, while acknowledging its limitations. Prior to general use, drug therapies are subject to multiple research trials over time. Positive and negative side effects on specific populations are identified, and drugs are not introduced for general use until they are deemed “safe.” Indeed, there will need to be much conversation about the nature of such trials and about who will conduct and assess them. But those conversations need to begin now… not after we learn that the harms to our brain capacities and interpersonal relations are beyond repair. The latter outcome would allow the “tech titans” to keep selling augmentations for the very cognitive capacities their earlier devices depleted in the first place.

