The Imbalance of Innovation

Passing thoughts on the lack of safe boundaries in a fast-paced innovative era
Published on 2024/04/30

There are a few things I have been thinking about for a while, and the exponential growth of AI has made them even more evident. Some of the most innovative technologies come at a price we are not equipped to manage (or don't have the budget to pay). Most of the time we don't even think about it until it gets out of hand. With LLMs, we decided that scraping the internet without any regard for intellectual property was fine, and only now have I noticed some folks thinking about building a web where users are either better equipped to defend themselves or have more control over their data.

Most of the time, once a resource is out on the public web (e.g. any media, like an image), you completely lose control over it. It doesn't belong to you anymore: it can be used for training, edited, and distributed without your consent. I like to dream of a world where digital ownership can be guaranteed. We jumped on the AI hype train so hard that we didn't stop to think about the consequences or how to mitigate them. While I've historically been a proponent of innovation (read a book like "The Master Switch" and you realize it is indeed unavoidable), I hold on to this utopian idea of regulated innovation (ugh, I don't like the word "regulated", but I can't think of anything better at the moment). I'd like to see innovation accompanied by work and research on how to prevent it from doing irreversible damage.

Think of social media: we unleashed it without thinking about moderation at scale. Think of information in the digital era: we allow anyone to write anything, backed by unverified sources. What I'm probably explaining poorly here is that I'd love to keep seeing opinions shared freely, but I'd also like to be able to call out an unverified source when it's used to reinforce a claim. You can link to a "source" in an article, but we never built the tooling for source verification at a point in time (e.g. an article that references a Wikipedia page that has been edited a hundred times a year later, or a link to a dictionary definition that, without clicking through, gives the reader the illusion of legitimacy).
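To make that idea slightly more concrete, here is a minimal sketch (Python, standard library only, against the public MediaWiki API) of what "citing a source at a point in time" could look like: resolve the article to its current revision, keep a permalink to that exact revision, and store a content hash so later edits are detectable. The function name and the choice of Wikipedia are just for illustration; this is one possible shape for such tooling, not an existing tool.

```python
# Sketch: "pin" a citation to a specific point in time by recording the exact
# Wikipedia revision it referenced, plus a hash of its content.
import hashlib
import json
import urllib.parse
import urllib.request


def pin_wikipedia_citation(title: str) -> dict:
    """Return a permalink to the current revision of a Wikipedia article,
    with a hash of its wikitext, so a reader can later verify whether the
    cited version still says what the author saw."""
    api = "https://en.wikipedia.org/w/api.php?" + urllib.parse.urlencode({
        "action": "query",
        "prop": "revisions",
        "rvprop": "ids|timestamp|content",
        "rvslots": "main",
        "titles": title,
        "format": "json",
        "formatversion": "2",
    })
    req = urllib.request.Request(api, headers={"User-Agent": "citation-pinning-sketch/0.1"})
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)

    rev = data["query"]["pages"][0]["revisions"][0]
    wikitext = rev["slots"]["main"]["content"]
    return {
        "title": title,
        "revision_id": rev["revid"],
        "cited_at": rev["timestamp"],
        # Permanent link to the exact version the author read.
        "permalink": (
            "https://en.wikipedia.org/w/index.php?"
            f"title={urllib.parse.quote(title)}&oldid={rev['revid']}"
        ),
        # Content hash, so any later edit to the cited text is detectable.
        "content_sha256": hashlib.sha256(wikitext.encode("utf-8")).hexdigest(),
    }


if __name__ == "__main__":
    print(json.dumps(pin_wikipedia_citation("Public domain"), indent=2))
```

A browser or publishing platform could store this record next to the link and flag when the live page has drifted from the pinned revision, which is roughly the kind of verification I wish existed by default.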

Thoughts

This is something that's been bothering me for a while and that I have a hard time putting into words. The rapid, widespread evolution of LLMs made it even worse, and reading "Trust Me, I'm Lying: Confessions of a Media Manipulator" just added salt to the wound. We keep introducing innovative tech without accompanying it with a well-thought-out proposal on how to make it safe; we just wait until the damage is done and then struggle to find a scalable solution. On the other hand, how can we allow innovation (which, as I mentioned earlier, is unavoidable) to happen safely without slowing it down too much? Is there a way to guarantee and promote innovative tech more safely?

I'll have to do a ton more research around this to clear my mind. I don't know if "regulation" is what I mean; maybe it's safe boundaries within which we can move while guaranteeing individuals' safety. How can we keep the freedom the Internet era brought us, but more safely?
