How does artificial intelligence (AI) work, and why is AI revolutionary and world-changing now? What are the legal conditions for processing large data sets to train an electronic brain? And how about storing user inputs and outputting images and texts? Excerpt from one of my lectures.
Introduction
People who don't understand much about AI often use terms like ChatGPT as a placeholder to sound interesting. Many think that ChatGPT is a search engine. Spoiler: that's complete nonsense. By today's standards, ChatGPT has an ancient dataset. That's deliberate, because ChatGPT serves as a response machine, not as a tool for finding current knowledge.
Those who understood more about AI and kept an eye on the stock market bought Nvidia stock some time ago and have since been able to watch a gigantic price increase. Nvidia, after all, manufactures the graphics cards that are absolute champions when it comes to AI applications.
I predict the downfall of the stock market in its current form, because soon anyone will be able to predict stock prices with a probability of more than 50%.
That is my theory, along with the assumption that I will soon accomplish this myself.
What does that mean? Quite simple: a graphics card like the Nvidia GeForce RTX 3070 has 5888 cores in its GPU. The GPU is the processor of the graphics card. The CPU, the classic processor of a computer, stands in contrast to it: good current Intel processors have 10 or a few more cores.
An Intel core is, mathematically speaking, something like an Albert Einstein (who, as a physicist, could do maths very well). An Nvidia GPU core is a moderately gifted mathematician. AI algorithms happen to be based on calculation operations that graphics card processors (GPUs) can execute particularly well. While the Albert Einstein core performs a multiplication with ease and is bored half the time, the GPU mathematician is heavily loaded but finishes this trivial calculation almost as quickly.
Naturally, 5888 average mathematicians working in parallel take far less time for, say, 100,000 simple multiplications than 10 Einsteins working simultaneously. While the PC equipped with a graphics card has long since finished the AI computation, one might think the Intel-only PC has hung. With the graphics card, you can count on a performance increase by a factor of 50 or more. Here the graphics card is not used for displaying pictures, videos, or games, but only for calculating. You can also hear this in the card's loud fan, which outdoes any CPU fan.
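The factor-of-50 claim above can be checked with a back-of-envelope model. The core counts (5888 vs. 10) come from the text; the assumption that a single CPU core is ten times faster per multiplication than a single GPU core is my own illustrative guess, not a benchmark:

```python
# Back-of-envelope throughput model (illustrative numbers, not measurements).
# Assumption: one CPU core is 10x faster per multiplication than one GPU core,
# but the GPU has 5888 cores versus the CPU's 10.
MULTIPLICATIONS = 100_000

cpu_cores, gpu_cores = 10, 5888
cpu_speed, gpu_speed = 10.0, 1.0  # multiplications per time unit, per core

cpu_time = MULTIPLICATIONS / (cpu_cores * cpu_speed)  # 1000.0 time units
gpu_time = MULTIPLICATIONS / (gpu_cores * gpu_speed)  # about 17 time units

speedup = cpu_time / gpu_time
print(f"GPU is roughly {speedup:.0f}x faster on this workload")  # roughly 59x
```

Even with a generous per-core head start for the CPU, sheer parallelism wins, which is consistent with the "factor of 50 or more" stated above.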
While in Villarriba the CPU is still glowing and only 20 percent of the goods have been unloaded, everything is already sparkling in Villabajo.
Please excuse the silly comparison with these two fictional villages, which are likely known from advertising, and about which more is known than about current AI algorithms.
The graphics card makes a very significant difference for algorithms that, for more demanding tasks, are happily trained for 10 days straight, or that need 10 seconds on a GPU to generate an image but take 8 minutes on a (yawn) CPU. You've probably heard of DALL-E or Midjourney and know that you don't have to wait 10 minutes for an image.
Functionality of current AI
Artificial intelligence systems like ChatGPT are based on artificial neural networks. A neural network is also found in the human head, or rather the brain. It works roughly as follows:

The picture shows how people process information and how intelligence arises. From looking at it, we understand just as much about why something like intelligence exists at all: I claim we know nothing about it and are merely amazed that neurons and their connections are able to give rise to something like intelligence. Spoiler: it has nothing to do with God, as I will show shortly.
In the above image, on the left, we see a series of environmental influences, that is, signals. These can be sounds, tones, still images, moving images, smells, air movements, etc. Bats, for instance, are also very familiar with ultrasound. In the middle is our brain, which receives and processes all these signals. On the right we see the neural network, in which the signals are processed and stored.
A nerve cell is comparable to a simple processor core. There are connections between neurons, and there are many of them, very many: billions. Whether a neuron fires, that is, becomes active, is determined by the action potential that other connected neurons build up towards a target neuron.
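The firing behavior described above is exactly what an artificial neuron imitates: it sums its weighted inputs and "fires" when the combined signal is strong enough. A minimal sketch, with hand-picked weights chosen purely for illustration:

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: weighted sum of inputs plus a bias,
    passed through a sigmoid activation. An output near 1 means the
    neuron 'fires', loosely analogous to a biological action potential."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Strong combined input: the neuron fires (output near 1)
print(neuron([1.0, 1.0], [2.0, 2.0], -1.0))  # about 0.95
# Weak combined input: the neuron stays mostly quiet
print(neuron([0.1, 0.1], [2.0, 2.0], -1.0))  # about 0.35
```

A network like the ones behind ChatGPT is, in essence, billions of such units wired together, with the weights adjusted during training.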
Now we come to the technical realization of today's AI algorithms.

In the picture you see the same links as above with humans.
In the middle you see the electronic brain; above, it was the human one.
On the right side of the picture you see the neural network in digital form, which in humans exists biologically and thus, as it were, in analog form.
So far, so good. But it gets even better. Thanks to the Transformer approach (known since 2017), electronic brains transform all signals into number sequences, called vectors. The human brain does exactly the same thing, at least qualitatively. That there are fine differences in the concrete implementation between biology and electronics is nearly irrelevant and only gives biology a slight performance edge over electronics. You probably know Moore's Law: every 12 to 24 months, the computing power of a processor doubles, often with a simultaneous drop in price. So the performance winner is the machine, and that is the case now (around 2023).
My name is Klaus Meffert. I have a doctorate in computer science and have been working professionally and practically with information technology for over 30 years. I also work as an expert in IT & data protection. I achieve my results by looking at technology and law. This seems absolutely essential to me when it comes to digital data protection. My company, IT Logic GmbH, also offers consulting and development of optimized and secure AI solutions.




