What is Generative AI?
A type of AI that can create new content.
Models that create a wide range of outputs, such as text, images, music, video, or other media.
A subset of deep learning in which models are trained to generate output on their own.
Opens exciting possibilities for creative tasks, automation, and new ideas.
How does Generative AI really work?
Learns the underlying patterns in a given data set and uses that knowledge to create new data that shares those patterns.
For example, suppose the model has to recognize a cow — how does that work?
If you feed it data such as images of a cow and text describing how it looks, the model processes that data and finds certain patterns — the cow has a tail, four legs, a particular head shape, and other features. It then produces a list of percentages showing how closely the input matches each animal. That is how the model learns its patterns.
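The "list of percentages" idea can be sketched in a few lines of Python. This is a toy illustration, not a real model: the feature lists are made up, and the scoring simply counts matching features before a softmax turns the scores into percentages.

```python
import math

# Hypothetical feature checklists for a few animals (purely illustrative).
ANIMAL_FEATURES = {
    "cow":   {"tail", "legs", "hooves", "horns", "udder"},
    "horse": {"tail", "legs", "hooves", "mane"},
    "dog":   {"tail", "legs", "fur", "bark"},
}

def match_probabilities(observed: set[str]) -> dict[str, float]:
    """Score each animal by how many observed features it matches,
    then turn the scores into percentages with a softmax."""
    scores = {name: len(observed & feats) for name, feats in ANIMAL_FEATURES.items()}
    exp = {name: math.exp(s) for name, s in scores.items()}
    total = sum(exp.values())
    return {name: e / total for name, e in exp.items()}

probs = match_probabilities({"tail", "legs", "hooves", "udder"})
best = max(probs, key=probs.get)  # "cow" matches the most features
```

A real model learns its features from data instead of having them hand-written, but the output has the same shape: one probability per candidate, summing to 1.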
Machine Learning
Identifies patterns in pictures and uses those patterns to recognize and classify new examples, such as cats and dogs.
Training a ML model:
By providing images or data and labeling them, we feed that data into the model for training.
Inference:
This is the stage where the ML model has been trained and is ready to use: given new input data, it predicts the corresponding label.
This kind of machine learning is also called supervised machine learning.
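The training and inference stages can be sketched with a tiny hand-rolled classifier. Everything here is illustrative — the feature values (a made-up [weight_kg, ear_length_cm] pair) and the nearest-centroid method are just one simple way to show supervised learning, not how production models work.

```python
# Labeled training data: ([weight_kg, ear_length_cm], label). Values are made up.
training_data = [
    ([4.0, 7.0], "cat"),
    ([5.0, 6.5], "cat"),
    ([20.0, 10.0], "dog"),
    ([25.0, 12.0], "dog"),
]

def train(data):
    """Training: average the feature vectors for each label (its 'centroid')."""
    sums, counts = {}, {}
    for features, label in data:
        counts[label] = counts.get(label, 0) + 1
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
    return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

def predict(model, features):
    """Inference: return the label whose centroid is closest to the input."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(centroid, features))
    return min(model, key=lambda label: dist(model[label]))

model = train(training_data)       # training stage
print(predict(model, [4.5, 7.2]))  # inference stage → "cat"
```

The key point is the split: `train` sees labeled data once, and `predict` can then be called on new, unseen inputs.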
How is Generative AI different from other AI?
Traditional AI models first rely on labels and then try to predict according to that labeling.
To map it out: we start with a certain set of data. A traditional AI model first labels that data, the ML model learns the relationship between the labels and the data, and it gives output based on that labeling.
Generative AI works differently.
A generative model can take even unstructured data and generate new content as output — from text to images to videos — as the user specifies.
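The difference can be made concrete with a toy generative model: instead of predicting a label, it learns patterns from raw, unlabeled text and then produces new text that follows those patterns. This sketch uses a simple word-level Markov chain (the corpus is made up); real generative models are vastly larger, but the train-on-raw-data, then-sample loop is the same idea.

```python
import random
from collections import defaultdict

# Unlabeled training text (illustrative).
corpus = "the cow has a tail and the cow has legs and the cow has horns"

# "Training": record which word tends to follow each word.
transitions = defaultdict(list)
words = corpus.split()
for current, nxt in zip(words, words[1:]):
    transitions[current].append(nxt)

def generate(start: str, length: int, seed: int = 0) -> str:
    """Generation: walk the learned transitions, sampling a next word each step."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = transitions.get(out[-1])
        if not followers:
            break  # reached a word with no observed successor
        out.append(rng.choice(followers))
    return " ".join(out)

print(generate("the", 8))
```

Note there are no labels anywhere: the model's only job is to produce new sequences that share the statistics of its training data.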
Types of Generative AI Models
Models can generate text, code, poems, and more.
They learn from large collections of text data to capture patterns, language structures, and semantic relationships.
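One way these models capture semantic relationships is through co-occurrence statistics: words that appear in similar contexts end up with similar representations. The sketch below is a heavily simplified stand-in for that idea (the corpus, window size, and count-based vectors are all illustrative; real models learn dense embeddings instead).

```python
import math
from collections import Counter, defaultdict

# Tiny illustrative corpus, pre-tokenized.
corpus = ("the cat sat on the mat . the dog sat on the rug . "
          "the cat chased the dog . a dog chased a cat .").split()

WINDOW = 2  # how many neighbors on each side count as "context"
vectors = defaultdict(Counter)
for i, word in enumerate(corpus):
    for j in range(max(0, i - WINDOW), min(len(corpus), i + WINDOW + 1)):
        if i != j:
            vectors[word][corpus[j]] += 1  # count each context word

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

# "cat" and "dog" occur in similar contexts, so their vectors are similar.
print(cosine(vectors["cat"], vectors["dog"]))
```

This is the intuition behind the "semantic relationships" the section mentions: similarity falls out of shared patterns in the data rather than being programmed in.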