
Aug. 8, 2023

In recent months in our increasingly computerized world, much has been written about the development of Artificial Intelligence: machine-driven direction of all kinds of activities associated with human life. It is a challenging issue, with many seeing it as a modern-day successor to the Golem of Rabbi Judah Loeb ben Bezalel of Prague. That fable became the source for many versions of autonomous creatures in literature, art, and even ballet. All were inventions, many inspired fear, and frequently they ended in frenzied self-destruction, often bringing the reader or observer a sense of relief.

The bits and bytes of contemporary digital technology have become integral to our daily activities. Even among my contemporaries, who are above the worldwide median age, digital devices are key to much of what we do and read, and to how we communicate. Some are discomforted by the flood of information that arrives daily, even hourly at times, like spring snowmelt.

Like that snowmelt, whose rush downstream collects desirable and undesirable materials alike, the overload in our inboxes includes useful, useless, and, sadly, destructive information. That can make it hard to judge the veracity of ‘information’ from unverified sources. For some, moreover, misattributions picked up and repeated, re-sent, forwarded, and incorporated into texts can become the cause of inadvertent plagiarism, a cardinal sin in the academic community.

Computers, I have learned, use ‘Boolean logic’ to assemble information: a system in which true and false are the only possible values for expressing things. In late elementary school, at about the time Sputnik entered space and NASA was born, we learned ‘new math’, studying the varied bases in use: base 10 (decimal, our default), base 12 (ancient, used for clocks and time-telling, for dozens, and for grosses in large orders), and the odd base 2: 0, 1, 10, 11, 100, 101…
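For readers curious to see those two ideas side by side, here is a minimal sketch in Python. It simply demonstrates Boolean logic (true/false as the only values) and the base-2 counting sequence quoted above; nothing in it comes from any particular computer's internals.

```python
# Boolean logic: every digital decision reduces to combinations
# of AND, OR, and NOT over the two values True and False.
a, b = True, False
print(a and b)   # False
print(a or b)    # True
print(not b)     # True

# Base 2: the same counting sequence listed above (0, 1, 10, 11, 100, 101).
for n in range(6):
    print(n, "in binary is", format(n, "b"))
```

Run it and the binary column reproduces exactly the odd-looking sequence from that ‘new math’ class.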

Over the decades since then, the older mechanical tools, the tubes, valves, and wiring, have been miniaturized even as their computational power and speed have grown. The Cold War, the Space Race, and a veritable revolution in information technology (IT, for the initiates) have ensued and continue. Now, it seems, another huge step is underway, in which the IT apparatus can more independently develop novel solutions to problems presented, invented, or, regrettably, manipulated.

There’s the rub: the morality and ethical standards that have been part of human activity are excluded by the preset algorithms that drive the choices a computer search makes. These preset conditions have been much discussed, but, simply put, they seem to me like shopping for a new item of clothing. We preset size, color, season-appropriate fabric, and perhaps allowable patterns. These filters are then matched against the available supply. Et voilà, we can purchase an item. What I omitted from this list is an array of categories like sustainability, or standards of manufacture such as working conditions, the age of the workers, or the fairness of their recompense.
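The shopping analogy can be made concrete with a few lines of Python. The catalog, field names, and items below are entirely invented for illustration; the point is only that the search can judge by the attributes its author chose to encode, and by nothing else.

```python
# A hypothetical clothing catalog. Note which fields exist -- and
# which (sustainability, working conditions, fair pay) do not,
# because no one wrote them into the data or the filter.
catalog = [
    {"item": "shirt A", "size": "M", "color": "blue", "fabric": "cotton"},
    {"item": "shirt B", "size": "M", "color": "red",  "fabric": "wool"},
    {"item": "shirt C", "size": "L", "color": "blue", "fabric": "cotton"},
]

def matches(entry, size, color, fabric):
    # The preset conditions: the only questions the search can ask.
    return (entry["size"] == size
            and entry["color"] == color
            and entry["fabric"] == fabric)

results = [e["item"] for e in catalog if matches(e, "M", "blue", "cotton")]
print(results)  # ['shirt A']
```

Whatever was never encoded simply cannot influence the answer, which is the essay's point about the code writer's choices.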

With these ideas, offered by no expert, since I am a digital immigrant without formal training, the weakness of AI begins to peek through. The limits of judgment by the dreaded machine are imposed by the limits of imagination and the scope of ideas that the code writer set into the algorithm. Behind the brilliance of computer-driven research stands the ethical standard applied by its composer.

There is a great advantage imaginable from removing the inconsistencies of human activity from many processes. There remains the limit set by the danger that the code setter—‘big brother watching you’—might deliberately or inadvertently impose. Whose ethics are correct? Whose ethical system will lead the society and the political culture? When these questions are adequately answered, I have no doubt Artificial Intelligence will change life. But will it change humanity?