In the past decade, data-driven technologies have transformed the world around us. We have seen what is possible when we gather data and train artificial intelligence to interpret it: computers that learn to translate languages, facial recognition systems that unlock our smartphones, algorithms that identify cancers in patients. The possibilities are endless.
But these new tools have also led to serious problems. What a machine learns depends on many things, including the data used to train it.
Data sets that fail to represent American society can result in virtual assistants that do not understand Southern accents; facial recognition technology that leads to wrongful, discriminatory arrests; and health care algorithms that discount the severity of kidney disease in African Americans, preventing people from receiving kidney transplants.
Training machines on examples from the past can embed old prejudices and enable present-day discrimination. Hiring tools that learn the features of a company's current employees can reject applicants who differ from the existing workforce even when they are well qualified, such as women who studied computer science. Mortgage-approval algorithms used to determine creditworthiness can pick up on factors correlated with race and poverty, extending decades of housing discrimination into the digital age. AI can recommend medical treatments for the groups that access health care most often, rather than for those who need it most. Training AI indiscriminately on internet conversations can produce "sentiment analysis" that treats the words "Black," "Jew," and "gay" as negative.
The technology also raises questions about privacy and transparency. When we ask our smart speaker to play a song, is it also recording what our children are saying? When a student takes an exam online, should a camera monitor and track their every movement? Do we have a right to know why we were denied a mortgage or screened out of a job interview?
In addition, there is the problem of AI being deliberately misused. Some autocracies use it as a tool of state-sponsored oppression, division, and discrimination.
In the United States, some of AI's failings may be unintentional, but they are serious, and they disproportionately affect people and communities that are already marginalized. These failures often arise when AI developers do not use appropriate data sets, do not audit their systems thoroughly, and do not have diverse perspectives around the table to anticipate and fix problems before products are deployed (or to kill products that cannot be fixed).
In a competitive marketplace, it may seem easier to cut corners. But it is unacceptable to build AI systems that will harm many people, just as it is unacceptable to make pharmaceuticals and other products, whether cars, children's toys, or medical devices, that could injure many people.
Americans have a right to expect better. Powerful technologies should be required to respect our democratic values and abide by the central tenet that everyone should be treated fairly. Codifying these expectations can help ensure that they are.
Soon after ratifying our Constitution, Americans adopted a Bill of Rights to guard against the powerful government we had just created, enumerating guarantees such as freedom of speech and assembly, the right to due process and a fair trial, and protection against unreasonable search and seizure. Throughout our history we have had to reinterpret, reaffirm, and periodically expand these rights. In the 21st century, we need a "bill of rights" to guard against the powerful technologies we have created.
Our country should clarify the rights and freedoms we expect data-driven technologies to respect. What exactly those rights are will require discussion, but here are some possibilities: your right to know when and how AI is influencing a decision that affects your civil rights and civil liberties; your right not to be subjected to AI that has not been carefully audited to ensure it is accurate and unbiased and has been trained on sufficiently representative data sets; your right to be free from pervasive or discriminatory surveillance in your home, community, and workplace; and your right to meaningful recourse if the use of an algorithm harms you.
Of course, enumerating these rights is only a first step. What might we do to protect them? Possibilities include the federal government refusing to buy software or technology products that fail to respect these rights, requiring federal contractors to use technologies that adhere to this "bill of rights," or adopting new laws and regulations to fill gaps. States might choose to adopt similar practices.