6/27/2023

Clipy paper things

In What’s the Future, Tim O’Reilly argues that our world is governed by automated systems that are out of our control. Alluding to The Terminator, he says we’re already in a “Skynet moment,” dominated by artificial intelligence that can no longer be governed by its “former masters.” The systems that control our lives optimize for the wrong things: they’re carefully tuned to maximize short-term economic gain rather than long-term prosperity. The “flash crash” of 2010 was an economic event created purely by the software that runs our financial systems going awry. However, the real danger of the Skynet moment isn’t what happens when the software fails, but when it is working properly: when it’s maximizing short-term shareholder value, without considering any other aspects of the world we live in. Even when our systems are working, they’re maximizing the wrong function.

Charlie Stross makes a similar point in “Dude you broke the future,” arguing that modern corporations are “paper clip maximizers.” He’s referring to Nick Bostrom’s thought experiment about what could go wrong with an artificial general intelligence (AGI): if told to maximize the process of making paper clips, it could decide that humans were inessential. It was told to make paper clips, lots of them, and nothing is going to stop it. Like O’Reilly, Stross says the process is already happening: we’re already living in a world of “paper clip maximizers.” Businesses maximize stock prices without regard for cost, whether that cost is human, environmental, or something else.

The paper clip maximizer is a provocative tool for thinking about the future of artificial intelligence and machine learning–though not for the reasons Bostrom thinks. What frustrates me about Bostrom’s paper clip maximizer is that focusing on problems we might face in some far-off future diverts attention from the problems we’re facing now. We don’t have–and may never have–an artificial general intelligence, or even a more limited artificial intelligence that will destroy the world by maximizing paper clips. As Andrew Ng has said, we’re being asked to worry about overpopulation on Mars. We have more immediate problems to solve.

What we do have are organizations that are already maximizing their own paper clips, and that aren’t intelligent by any standard. Business systems that optimize profit–well, they’re old-fashioned human wetware, collected in conference rooms and communicating via the ad-hoc neural network of economic exchange. Automated trading systems largely predate modern AI, though they have no doubt incorporated it. As O’Reilly and Stross point out, paper clip maximization is already happening in our economic systems, which have evolved a kind of connectivity that lets them work without oversight. It’s already happening in our corporations, where short-term profit creates a world that is worse for everyone. That process of optimization is out of control–and may well make our planet uninhabitable long before we know how to build a paper clip-optimizing AI.

Talking about future paper clips might be interesting or thrilling, but in reality, it’s a way of avoiding dealing with our present paper clips. That’s a concrete problem we need to deal with now.