Few topics are discussed as controversially as artificial intelligence. As is often the case, new technology brings uncertainty.
The fear that computers could become smarter than we humans, take over our jobs and, in the worst case, push us into insignificance has reached a preliminary peak at least since the humanoid robot Sophia. Popular headlines such as the case of the "secret language" of Facebook bots do the rest to cast doubt on human control.
Even though it may seem like it, artificial intelligence is not a recent development. Researchers have been working for more than 50 years to find out how the human brain works and then to replicate those findings in machines.
The artificial reconstruction of human perception and action is a highly complex undertaking because it is based on the expertise of a wide range of disciplines and sciences.
But does this mean that computers will be smarter than we are and will eventually take over?
... but the teacher is a person.
As far as future developments are concerned, AI looks set to change many areas of life, at least from today's perspective. But not because robots are taking over the world.
Behind every artificial intelligence are people who can “create” it and switch it off again. A supposedly banal statement, but one that is particularly worth mentioning against the background of the headlines mentioned above.
Computer scientists, neurologists, cognitive and language scientists, and psychologists are just a few of the indispensable links in AI research.
Their most central topics include the question: When is a machine intelligent?
As early as the 1950s, British computer scientist Alan Turing tried to answer this with a possible definition. His considerations resulted in the so-called Turing Test. The procedure, still prominent and still controversial today, made the indistinguishability of human and machine the standard for artificial intelligence.
In the Turing Test, a questioner "blindly" conducts an interview via keyboard with two test subjects: a real person and a machine. If, at the end, the questioner is unable to clearly assign or differentiate between the two conversations, the machine is considered the human's equal.
Turing's test does not answer the question adequately, but it shows that AI research has been around much longer than is often assumed.
In machine learning, experience is converted into knowledge. This means actions are not simulated one to one; rather, patterns and regularities are derived from them. Using so-called algorithms, what the machine has learned is generalized and applied to new cases.
With every repetition (= experience), the machine becomes more intelligent.
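This learn-from-experience loop can be sketched in a few lines of code. The example below is a minimal illustration with invented toy data, not any real application: a simple perceptron sees labeled examples, adjusts its internal weights with each pass (each repetition of "experience"), and ends up with a general rule that also classifies points it has never seen.

```python
# Minimal sketch of "experience -> generalized rule": a perceptron
# learns to separate two classes of points from labeled examples.
# The data below is purely illustrative.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Each pass over the data (= repeated 'experience') nudges the
    weights, so the learned rule generalizes beyond the samples seen."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            error = y - pred  # 0 if correct, otherwise +1 or -1
            w = [wi + lr * error * xi for wi, xi in zip(w, x)]
            b += lr * error
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy "experience": points are class 1 if their coordinates sum to more than 1.
samples = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (0.8, 0.9), (0.3, 0.3), (0.7, 0.9)]
labels  = [0, 0, 1, 1, 0, 1]
w, b = train_perceptron(samples, labels)

# The learned rule is applied to a point never seen during training:
print(predict(w, b, (0.95, 0.85)))  # -> 1
```

The machine was never told the rule "coordinates summing to more than 1"; it derived a usable approximation of it purely from repeated exposure to examples.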
To see where artificial intelligence has long been part of our everyday lives, we don't need annual reports from Silicon Valley; a glance at our smartphone is enough.
Anyone who owns a newer model, for example, only needs to look at it: the face is recognized and access is granted. This is no magic trick, but the result of long research and advanced machine learning, because beyond initially measuring facial features and proportions, factors such as different lighting conditions must be compensated for in a functional application. And that has to be learned.
Of course, alongside the further development of existing AI applications such as chatbots, parking assistants, or self-driving cars, there are also far more wide-reaching efforts and visions.
AI can not only increase efficiency in the form of machine learning, but also make many processes more reliable. For example, robotics could advance medicine to such an extent that the risk of high-precision procedures or complicated operations is reduced to a minimum, or such operations become possible in the first place. And this without necessarily crowding out specialist personnel. Individual steps in production and manufacturing can also be carried out more predictably and almost flawlessly.
The challenge of new technologies is usually accompanied by uncertainties. Especially when it comes to collecting and using data, there is often talk of “monitoring” or...