A few years ago, I was visiting my Aunt Diana and Uncle Robert, who had a Google Home set up in their house. Every time my aunt would ask Google something, she'd say "thank you" afterwards. You have to know that she's one of the kindest and sweetest people you'll ever meet, so it wasn't out of character to see her doing this, but I still teased her a little bit when I heard her do it. Finally she said, in her ever-patient and compassionate way, "Oh, when the robots eventually take over, I'd like them to know that I'm one of the nice humans, at least." Solid plan, auntie, solid plan.
This morning, I answered a survey about the future of AI from the Imagining the Internet Center. I get to periodically do this kind of thing because I was once considered to have expertise in emerging internet technologies, and though I tried to explain to these nice people that I neither have that, nor do I want to anymore, they still send me questions, which amuses me greatly.
I don't yet fear the robots taking over, or the singularity, in my lifetime, but good lord, answering these questions showed me what I do fear: that capital- and profit-driven AI development will do more individual and societal harm than good. Just as, back in 2008, we were talking about homophily and xenophily on social media (hi Ethan!) and the need to actively pursue strategies against those "instincts," we're there already with AI. Who's writing the code, what bias they bring to the table, what materials the code is being fed to learn… none of it bodes well for equity.
Out of nowhere, though, I started thinking about a very early natural language processing program that was easily available to everyday people-- it was called ELIZA. I don't know where my family got a copy of it, but in the late 1980s, we had an MS-DOS-based version for our IBM clone (a Leading Edge Model D, for those keeping score at home). I remember when I first started messing with it, I did all the things a 12- or 13-year-old would do: I asked it if it liked farts, and tried to get it to say something dirty. Like typing 55378008 on a calculator, so that when you turned it upside down, it said BOOBLESS. That is 1970s & '80s comedy gold right there.
After getting past the giggles, at some point, I tried out actually using it as it was intended: I told it my problems. I don't remember which problems, but I remember very distinctly it responding with things like, "That sounds very difficult" and "What are your feelings now?" Boilerplate stuff, but there were times when those things hit me right in the chest. I felt seen, and even though intellectually I knew that this wasn't "real," the feeling was.
Later in my first year of college, I got introduced to the Internet via a text-only VAX/VMS setup at our university. One of the first things my friend Matt showed me was Usenet, a wide open system of discussion forums about every topic under the sun. You found your topic by typing in the command to list all the newsgroups on the server you were on, and reading that list until you found something interesting. At the top of the list, I saw one called "alt.angst," which I thought must be weird and hilarious, so I popped in. I found people talking openly and honestly about their very real mental illnesses, about depression and anxiety, about medications and therapy as normal and everyday parts of working through things, about thoughts of self-harm and large emotions in ways that weren't scary. It was my first exposure to people talking about these things in positive and nuanced ways. Usenet helped me start managing my own mental illness and set me on a long, ongoing path to getting real help.
I come back to these things when I think about the future of technology in general, because as scared as I am of humans making everything worse, I can't help but remember that there will always be specks of connectivity in our technologies that may help us feel a little bit more OK with existing on this weird, wet rock hurtling through space. I'm not naive enough (anymore) to think that they'll outweigh the harm, but I just can't forget the feeling of being seen and heard.