The Influence of Artificial Intelligence Assistants - Part 4


Si Gyeongmin

Sep 22, 2021

The Coase theorem, information economics, and behavioral economics have helped build more accurate models for designing legal systems around how markets actually operate. This article argues that the opposite assumption may make more sense in some situations: the market will remain imperfect in some respects (for example, in the number of competitors), but it can be assumed to move closer to a perfectly competitive model as switching costs approach zero. Once this possibility is recognized, and the reasons high transaction costs persist are understood, it becomes clear that laws facilitating the automation of consumer transactions could yield considerable social benefits with controllable losses.

As to whether the law should shape the development trajectory of artificial intelligence assistants, it already does. Some laws inadvertently hinder AI assistants, blocking both the theoretical market volatility and the more practical large-scale social welfare they could produce. At the same time, some laws designed to help consumers, such as those mandating a right to cancel transactions online, may inadvertently benefit AI assistants.

A long-term regulatory vision should begin with legal reforms that actively support artificial intelligence. If the commercial market comes to resemble the stock market, the costs and risks of large-scale monitoring will become critical. Regulatory tools may converge as well: hyper-switching, for instance, might be restrained by a Securities and Exchange Commission-style trading halt, a circuit-breaker mechanism. In any case, moving past the current mistaken habit of separating micro from macro issues, and finance from the real economy, will better position regulators to design comprehensive rules for automated markets.
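The circuit-breaker analogy above can be sketched in code. This is purely an illustrative assumption, not an existing regulatory specification: the class name, the 7% threshold, and the windowing logic are all hypothetical, chosen to mirror how stock-market trading halts damp sudden volatility.

```python
# Illustrative sketch (an assumption, not an actual rule): an SEC-style
# circuit breaker applied to automated consumer switching. If the share
# of customers that AI assistants move away from a provider within one
# monitoring window exceeds a threshold, further switches are paused.

class SwitchingCircuitBreaker:
    def __init__(self, threshold=0.07, customer_base=100_000):
        self.threshold = threshold        # hypothetical: 7% churn per window
        self.customer_base = customer_base
        self.switched = 0
        self.halted = False

    def request_switch(self, n=1):
        """Approve n automated switch requests, or refuse them if halted."""
        if self.halted:
            return False
        self.switched += n
        if self.switched / self.customer_base >= self.threshold:
            self.halted = True            # trip the breaker
        return True

    def reset_window(self):
        """Start a new monitoring window (e.g. the next trading day)."""
        self.switched = 0
        self.halted = False


breaker = SwitchingCircuitBreaker(threshold=0.07, customer_base=100_000)
print(breaker.request_switch(5_000))   # True: 5% churn, below threshold
print(breaker.request_switch(3_000))   # True, but 8% churn trips the breaker
print(breaker.request_switch(1))       # False: switching is paused
```

The design choice mirrors the market-wide halts the text alludes to: switching proceeds freely in normal conditions, and regulation intervenes only when aggregate churn becomes destabilizing.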
