Several very smart people are worried about artificial intelligences becoming smarter than people, taking over the world, and not necessarily having our best interests at heart. A specific concern is that some form of paperclip maximizer might optimize the world for something that is harmful to us if the optimization in question is taken too far.
I once tried to explain this to my grandma, who replied "if a computer is taking over the world, why can't you just unplug it?" A very sensible plan, if the computer in question isn't already running a lot of critical functions, without which people would die and society would devolve into chaos (and if it doesn't have batteries). Obviously we'd need to put a stop to something like that before it got started.
Except we can't - it's already too late. We already have a "paperclip maximizer" running the world. It's not optimizing for paperclips, though - it's optimizing for profit: GDP (on the level of nations), shareholder value (on the level of public companies) and personal income (on the level of individuals, especially those with bills to pay). No artificial intelligence is required - the "machine" uses the intelligence of the people that comprise it to do its thinking. But all of those people answer to someone else - employees answer to their managers, CEOs answer to their customers (usually) or investors, most investors are just trying to get the best returns so that they can retire as quickly and comfortably as possible, and consumers are just looking for the best performance/price ratios. Elected representatives answer to voters in their constituencies or to lobbyists for various special-interest groups, mostly industries. Voters all too often vote for whoever the media tells them to, or whoever seems most likely to ensure that they can find work or keep their jobs. There's nobody in charge of the whole thing, putting the brakes on to ensure that profits don't come before the freedom and welfare of individual humans or humanity as a whole. There's no "plug to be taken out" short of a massive revolution, which would also dismantle the systems that keep us fed, clothed, warm, clean, secure, healthy and entertained.
Taken to its logical conclusion, what does a world fully optimized for profit look like? Well, predictions become a whole lot more difficult once super-intelligent AIs are in the picture, so here's what I think it would look like if such things turned out not to be possible. As there are things that humans can do that machines can't, most human effort would go into profit-generating activities (in other words, we'd all be wage-slaves). If someone is rich enough not to have to work then they probably wouldn't, so the generated profits can't go to making a lot of people rich - this implies that almost all the generated wealth would be concentrated in the hands of a very small number of individuals. Most workers would probably carry quite a lot of debt, since unchecked accumulation of capital would eventually let them get rich enough not to have to work. Social safety-nets for any but those absolutely unable to do any kind of profit-generating work would be dismantled, since the more dire the consequences of not working, the more people will work. Education of children would focus on those skills needed in the workplace, and pure research would only be funded to the extent that it could eventually lead to more profit. Retirement would probably not be a thing (or if it were, you would not expect to have a lot of quality post-retirement time).
It's not all bad, though. Unemployment would be very low (since if someone could do useful work, there's no value in letting them go idle). Amongst the employed, extreme poverty would be nonexistent, as people who are starving and/or overly stressed about lack of money aren't able to work as effectively. The standard of healthcare would be good (since getting sick makes you unavailable to work and dying means money spent training your replacement) but might focus more on expensive and continual management of disease rather than cures (since this can be paid for by the worker as another incentive to work). War, crime, and political instability would be non-existent, as overall these things destroy wealth rather than creating it. As most of the work that can't be done by machines is intellectual in nature, workers would likely be well-educated, working conditions would likely be very good, and workers would get as much vacation and leisure time as needed to keep them from burning out and to maximize their overall productivity. Time spent commuting is wasted, so people would tend to live close to their workplaces. Violent revolution would be bad for business, so the general standard of living would be good enough that people would not expect to be able to improve it by revolting. Entrepreneurship would be encouraged (and funded by the wealthy) as long as it is profitable overall.
By the time we get to that point, the amount of wealth being generated will surely far exceed what is needed to meet the basic needs of workers and keep them productive. The excess goes to the super-wealthy, but what would they do with it? They would be the few who have the luxury of being motivated by concerns other than profit, so depending on their interests they might invest in the arts, space travel, curing diseases and other ills of the world that are left unaddressed by profitability concerns, or blue-skies research. Or perhaps they will just build gigantic monuments to their own egos that will be enjoyed by subsequent generations of tourists.
That actually sounds less awful than I thought it would be when I started writing this essay. It's still all rather unjust, though - ideally the excess profits would be distributed far more equitably so that more people have a say in what the priorities of the human race should be. I hope we can find a way to keep the good parts of this scenario while replacing the bad parts. In a future post I'll look more into how this might be achieved.
Well, it's settled then: We will evolve into the Ferengi. Time to start hoarding bars of gold-pressed latinum!
It's worse than that, I think - at least the Ferengi were able to keep much of the profit they made. I seem to recall that Quark was quite rich by the end of DS9.
[…] entities in the world today. But what are their missions (or what should they be)? What are they optimizing for? Democratic governments at least are supposed to represent the interests of the voting public, but […]