This Is Generative AI
Description
Book Introduction
By the author of "This Is Artificial Intelligence," selected as an "Excellent Science Book" by the Ministry of Science and ICT!
The third book in the AI field, a long-running bestseller chosen by readers for five years!

A hot-topic book in economics and management this year, chosen by companies, institutions, and schools!

Knowing this much about AI is enough!
The CEO of a leading AI company explains it in an easy-to-understand way using vivid examples.
Everything You Need to Know About the AI Ecosystem Transformed by Large-Scale Language Models (LLMs)!

Major trends in future science and technology are not determined from a technological perspective.
How the public understands technology, which technologies they are most excited about, and which technologies attract more market capital determine the direction in which major trends in science and technology will unfold.
Trends in science and technology are determined not by the researchers or engineers who deal with the technology itself, but by the public who strive to understand and utilize the technology.
In other words, technology is always just a means, and the direction in which technology itself will develop in the future is determined by the public.
Like computers and the internet, AI will be a useful tool in a variety of technological fields. The future development of AI technology and the future trends in various scientific fields utilizing AI technology are entirely up to the readers of this book.
Table of Contents
Recommendation · 4
Author Interview: 7 Questions and Answers on AI from an AI Company CEO · 8
Prologue · 22

PART 1 Generative AI

01 The Rise of ChatGPT and Generative AI · 31
02 What is Generative AI? · 33
03 The Essence of Generative AI: Large-Scale Language Models · 37

PART 2 AI Trends Between AlphaGo and ChatGPT

01 The Machine Learning Craze That Began with the Emergence of AlphaGo · 43
02 What is Machine Learning? · 45
03 Combining Cloud and AI · 48
04 What are the Limitations of Machine Learning? · 58
05 Super-Large AI Emerges as an Alternative to Machine Learning · 61
06 What is Super-Large-Scale AI? · 63
07 What are the Limitations of Super-Large AI? · 66
08 Large-Scale Language Models: A Crossroads Between Machine Learning and Super-Large AI · 68
09 Comparing Super-Large AI and Large-Scale Language Models · 73

PART 3 AI Semiconductors: Power Consumption Issues

01 The Relationship Between Super-Large-Scale AI, Large-Scale Language Models, and AI Semiconductors · 79
02 Power Consumption Issues of Super-Large-Scale AI and Large-Scale Language Models · 83
03 The Capitalism of Super-Large-Scale AI and Large-Scale Language Models · 87

PART 4 Getting Started with Large-Scale Language Models

01 Google Transformer Model, the Progenitor of Large-Scale Language Models · 101
02 The Transformer Model's Descendants: GPT and BERT · 106
03 What is a Large-Scale Language Model? · 108
04 The Misconception That Large-Scale Language Models Are Only Used for Language Problems · 112
05 Large-Scale Language Models Also Useful for Time Series Forecasting · 116

PART 5 Transfer Learning and RAG

01 New Employee Pretending to Know Everything · 121
02 Field Training, Transfer Learning · 124
03 Answering Only Within the Given Manual: RAG · 128
04 Comparing Machine Learning and Large-Scale Language Models · 131

PART 6 Implications of Using Large-Scale Language Models

01 The Utility of Using Large-Scale Language Models · 141
02 The Future of Junior Staff Who Rely on Large-Scale Language Models · 143
03 Galileo and Large-Scale Language Models · 147

PART 7 How to Use Large-Scale Language Models?

01 When Do We Need Large-Scale Language Models? · 155
02 How to Use Large-Scale Language Models? · 162
03 The Future of Large-Scale Language Models · 179

Epilogue: The Future Shaped by Generative AI—What We Need to Know · 186
References · 190
","
Detailed image
Detailed Image 1
","
Into the book
The world before and after the advent of computers, the Internet, smartphones, messengers like KakaoTalk, video platforms, social media, and OTT services like Netflix is not an entirely different place.
Some people don't use these technologies and services even now, and that doesn't mean many things are impossible for them.
However, those who are frequently exposed to these technologies and services, and who use them, naturally grow accustomed to them, and many things become more convenient.
Rather than thinking of generative AI as a grand and difficult technology that requires a long and arduous learning process, it is better to approach it as simply another of the technologies and services we have naturally come to accept.

--- p.34

The fundamental difference between ChatGPT and existing AI services is that ChatGPT utilizes AI technology called a large-scale language model to generate answers to questions received from users in the chat window.
The essence of AI technology or services, which we collectively call generative AI, is large-scale language models.

--- p.38

Another area of AI technology that competes with machine learning is expert systems.
Let me explain using the process of learning a language as an example.
An expert system is like enrolling in a language school that focuses on grammar and learning from a teacher who is proficient in that language.
Expert systems make AI smarter by instilling into the AI the knowledge system, mindset, and logic of experts with expertise in the relevant field.
Just as learning a language this way allows you to acquire a basic understanding and ability for that language very quickly, developing AI as an expert system has the advantage of being able to raise AI performance to a basic level from the beginning.
But there is a problem.
Just as we don't easily become proficient in English even after studying it for decades from childhood to adulthood, developing AI as an expert system is unlikely to yield high performance no matter how much time is invested.

--- p.46

Some people mistakenly believe that storing the big data needed for machine learning in the cloud makes it more vulnerable to security breaches than storing it on their own servers.
It can be emotionally uncomfortable to have big data about yourself stored in another company's cloud.
However, we must not overlook the fact that most security incidents arise from internal factors such as current employees, former employees, and employees of partner companies.
Additionally, storing big data required for machine learning on your own servers does not necessarily make security stronger.

--- p.52

In machine learning, we first identify who is trying to solve what problem, and then decide what the AI model should predict or recommend, and at what level of performance, in order to solve that problem.
Next, we use machine learning to improve the AI model's performance so that it can make predictions and recommendations at the target level.
If 10 companies each have 10 problems, a total of 10 × 10 = 100 AI models are needed.
When utilizing machine learning, a new AI model must be created each time, depending on the subject trying to solve the problem and the problem to be solved.
An alternative to overcome the inefficiency of this approach is super-large AI.

--- p.64

Even after securing data for AI training, you still have to go through the process of developing an AI model using machine learning.
Utilizing a large-scale language model means selecting and using a model that has already been developed, skipping everything from creating AI training data to developing the AI model. This naturally saves time and money compared to solving the problem with machine learning. If the problem to be solved is so specialized that it cannot be handled by a large-scale language model, machine learning should be used despite its relative inefficiency. In other cases, however, using large-scale language models is the current trend in AI development.

--- p.72

To periodically train and build large-scale language models, AI semiconductors used in AI model training are required. Consider the case of ChatGPT, an AI service equivalent to a super-large AI, created using a large-scale language model called GPT.
To create the ChatGPT AI service, a large-scale language model called GPT must be continuously trained to improve performance in line with the purpose of ChatGPT, and OpenAI used AI semiconductors in this process.

--- p.81

Training AI models requires collecting and processing massive amounts of data, which inevitably raises privacy concerns.
Furthermore, AI models trained through this process can reinforce existing human biases, potentially promoting or amplifying racism, discrimination against socially vulnerable groups, or hatred. There is already considerable concern and debate about the ethical issues that surround AI.
These issues should not be viewed solely as a matter of right or wrong, but rather as social costs arising from the process of developing and utilizing AI. Those who develop and utilize AI should be encouraged to shoulder these social costs and, ultimately, to work toward minimizing them.

--- p.96

The way large-scale language models produce results is similar to mixing together the content of numerous fairy tales to create a plausible fairy tale.
If a professional baseball broadcaster and commentator were to speak for over three hours, carefully considering every word, they would be exhausted and unable to continue the conversation until the end of the game.
The broadcaster and commentator must watch the game unfold in real time and naturally bring up related stories from past experience, so they can keep the conversation going without exhausting their stamina and concentration.

--- p.122

A large-scale language model is an AI model trained on a vast amount of data expressed in language, such as books, papers, and web pages on the Internet, so it is like meat that has already been grilled.
Because it has been pre-grilled, it is somewhat cooked and can be eaten right away.
However, everyone likes their meat cooked to a different degree, and everyone has their own optional ways of adding flavor, such as seasoning it or sprinkling on salt, to suit their own taste.
Taking meat that has been grilled like this and grilling it further to your taste, or adding seasoning, salt, or spices, is transfer learning (fine-tuning).
--- p.124
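The excerpt above explains fine-tuning only by analogy. Purely as a hypothetical illustration (not taken from the book), the sketch below shows the idea in PyTorch, assuming a recent torchvision is installed and using random arrays in place of real data: the model pretrained on ImageNet is the "pre-grilled meat," and training a small task-specific head on your own labeled data is the extra "seasoning."

```python
# Hypothetical sketch of transfer learning (fine-tuning); assumes PyTorch and a
# recent torchvision. The data here is random stand-in data, not a real dataset.
import torch
import torch.nn as nn
from torchvision import models

# The "pre-grilled meat": a network already trained on ImageNet.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False            # keep the pretrained knowledge frozen

# The "seasoning": a new head for our own two-class problem.
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-ins for one small batch of our own labeled images.
images = torch.randn(4, 3, 224, 224)
labels = torch.tensor([0, 1, 0, 1])

for _ in range(3):                     # a few fine-tuning steps
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```

Only the small new layer is trained here, which is one reason fine-tuning an already-trained model is far cheaper than training a model from scratch.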

Collect photos of well-ripened kimchi and photos of unripened kimchi.
The kimchi photos collected in this way are labeled (classified) as photos of ripe kimchi or photos of unripe kimchi, and then put into a form that can be used to train an AI model.
The AI model is trained repeatedly on the AI training data created in this way.
When a new kimchi photo is fed into the trained AI model, it outputs a result of either ripe kimchi or unripe kimchi.
With this result, we can create an AI service that tells us whether it is okay to put it in a kimchi refrigerator or a regular refrigerator and eat it, or whether it would be better to ripen it a little more at room temperature.

--- p.136
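The kimchi example above is the classic supervised machine-learning recipe: collect, label, train, then classify new inputs. As a minimal, hypothetical sketch (not from the book), the snippet below walks through the same steps with scikit-learn, using random arrays as stand-ins for the photos.

```python
# Hypothetical sketch of the kimchi-photo workflow; assumes scikit-learn and NumPy.
# Random arrays stand in for real photos (each "photo" is a flattened pixel vector).
import numpy as np
from sklearn.linear_model import LogisticRegression

ripe_photos = np.random.rand(50, 64 * 64)      # collected photos of ripe kimchi
unripe_photos = np.random.rand(50, 64 * 64)    # collected photos of unripe kimchi

X = np.vstack([ripe_photos, unripe_photos])    # the AI training data
y = np.array([1] * 50 + [0] * 50)              # the labeling step: 1 = ripe, 0 = unripe

model = LogisticRegression(max_iter=1000).fit(X, y)   # repeated training on labeled data

new_photo = np.random.rand(1, 64 * 64)         # a new kimchi photo
print("ripe" if model.predict(new_photo)[0] == 1 else "unripe")
```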

AI developers directly work with and create large-scale language models, and ordinary users use AI services built on top of large-scale language models.
Strictly speaking, when a user "uses a large-scale language model," this means the user is using an AI service built on a large-scale language model. From here on, however, we will treat using an AI service built on a large-scale language model as the same as using the large-scale language model itself.

--- p.143

In summary, from the perspective of an AI provider, a large-scale language model is needed when the AI model required for the AI service they want to build is not available among the paid AI models they could simply pay to use, when they lack the resources to build a super-large AI themselves, and when the problem to be solved is not so specialized that it requires creating an AI model through machine learning.

--- p.161

For those using AI services built on large-scale language models, it is important to learn and practice prompt engineering.
In prompt engineering, a prompt refers to a question or request that a user makes to an AI service built on a large-scale language model.
The goal of prompt engineering is to craft optimal prompts, taking into account factors such as the words, sentences, grammar, and context that make up the prompt, so that the large-scale language model can answer the question or request more effectively.
--- p.169
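Prompt engineering is easiest to see side by side. Purely as an illustration (not from the book), the sketch below assumes the openai Python client (v1+), an API key in the OPENAI_API_KEY environment variable, and an example model name; the point is that the second prompt adds a role, a format, and context, which is exactly what prompt engineering tunes.

```python
# Hypothetical sketch: comparing a plain prompt with an "engineered" one.
# Assumes the openai Python package (v1+), OPENAI_API_KEY set, and an example model name.
from openai import OpenAI

client = OpenAI()

plain_prompt = "Explain transfer learning."
engineered_prompt = (
    "You are explaining AI to a non-technical manager. "
    "In three short bullet points, explain what transfer learning is, "
    "when it is cheaper than training a model from scratch, "
    "and give one concrete business example."
)

for prompt in (plain_prompt, engineered_prompt):
    reply = client.chat.completions.create(
        model="gpt-4o-mini",                     # example model name
        messages=[{"role": "user", "content": prompt}],
    )
    print(reply.choices[0].message.content)
    print("---")
```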

In the future, more and more people will perceive themselves as raising their own large-scale language models, much like raising a pet.
While it is difficult to attach meaning or personality to an AI model built with machine learning and optimized to solve a specific problem, a large-scale language model can converse with humans and can change and develop through interaction with its user, which makes it much easier to attach meaning to it, much like a pet.
--- p.183
","
Publisher's Review
What is generative AI?
How can we best utilize this technology?
A book that tells the story naturally and plainly

An easy-to-understand guide to generative AI that even the general public can understand!


While the Internet has certainly played a role in improving human life and business, there are also lives and businesses in which the Internet cannot play a role.
There are areas where generative AI can play a role, and areas where it cannot.
Just as the ubiquity of internet search transformed our daily lives and the way we work, generative AI will bring about just as much, or perhaps even more, change.
Let's make generative AI a part of our daily lives by talking to ChatGPT or similar generative AI right now.
"]
GOODS SPECIFICS
- Date of issue: February 15, 2025
- Page count, weight, size: 192 pages | 386 g | 146 × 209 × 20 mm
- ISBN13: 9791167852410
- ISBN10: 1167852419
