I am the CEO of a one-person AI startup, DeepNetwork. Over the past six months, we have worked out in detail how LoRA (Low-Rank Adaptation) makes fine-tuning efficient: the pre-trained weight matrices stay frozen, and the weight update is learned as the product of two low-rank matrices. I would greatly appreciate your interest in our LoRA implementation expertise.
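
To make that mechanism concrete, here is a minimal sketch of a LoRA-style linear layer in PyTorch. This is a generic illustration rather than DeepNetwork's actual code; the rank, scaling, and initialization values are assumptions chosen for readability.

```python
# Minimal LoRA sketch: the frozen pre-trained weight W is left untouched,
# and the update is learned as the product of two low-rank matrices B and A.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, in_features, out_features, rank=8, alpha=16):
        super().__init__()
        # Frozen pre-trained projection W (no gradients flow into it)
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad = False
        # Trainable low-rank factors: A is (rank x in), B is (out x rank)
        self.lora_A = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, rank))  # starts at zero, so the update starts at zero
        self.scaling = alpha / rank

    def forward(self, x):
        # y = x W^T + scaling * x (B A)^T
        return self.base(x) + self.scaling * (x @ self.lora_A.T @ self.lora_B.T)
```

Because only `lora_A` and `lora_B` receive gradients, the number of trainable parameters per layer drops from d·k to r·(d+k), which is where the training-efficiency gain comes from.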


DeepNetwork CEO / Seokweon Jang / sayhi7@daum.net


Over the past one to two years, I have also spent a great deal of time analyzing the detailed principles behind designing a GPT-3 foundation model. A crucial component of securing that design expertise is implementing Korean embeddings, and I have spent several months understanding the principles behind it. I firmly believe that mastering the embedding implementation process, which lets an AI model handle the ten major world languages, is at the core of how generative AI models like ChatGPT function.
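
As a minimal sketch of the general idea behind token embeddings (not the GPT-3 or DeepNetwork implementation; the vocabulary size, model dimension, and token ids below are assumed values for illustration): each token id produced by the tokenizer indexes a row of a trainable embedding matrix, and that vector is what the transformer layers actually operate on.

```python
# Generic token-embedding sketch: token ids -> rows of a trainable matrix.
import torch
import torch.nn as nn

vocab_size, d_model = 32000, 768          # assumed sizes, for illustration only
embedding = nn.Embedding(vocab_size, d_model)

token_ids = torch.tensor([[101, 2057, 7592]])   # hypothetical token ids
vectors = embedding(token_ids)                  # shape: (1, 3, 768)
print(vectors.shape)
```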


Since I am Korean, I dedicated significant effort to understanding the know-how behind Korean embedding implementation. Additionally, to build features that analyze the contents of academic papers, I spent months studying how PDF documents are structured and how they should be parsed. Did all that time go only into worrying? Absolutely not. The depth of these efforts has produced tangible results and critical expertise, which is why I am writing this now.
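
As a rough illustration of that parsing step, the sketch below pulls page-level text out of a PDF with the pypdf library. It is a generic example rather than the actual pipeline: the file name is a placeholder, and a real paper-analysis system would also need to handle multi-column layouts, figures, tables, and reference sections.

```python
# Rough PDF text-extraction sketch (generic example; "paper.pdf" is a placeholder).
from pypdf import PdfReader

reader = PdfReader("paper.pdf")
for page_number, page in enumerate(reader.pages, start=1):
    text = page.extract_text() or ""   # some pages may have no extractable text
    print(f"--- page {page_number} ---")
    print(text[:500])                  # preview the first 500 characters
```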

I started studying LLMs in earnest back in 2020, when the GPT-3 model was first introduced. Since GPT-3 was developed by OpenAI, its primary supported language is English. For this reason, I focused on gaining expertise in designing Korean tokenization and embedding processes, and I mean it: I have truly mastered this area.

Korean tokenization was particularly challenging because each Hangul syllable is composed of an initial consonant, a medial vowel, and an optional final consonant, and tokenization must also work at the morpheme level to handle Korean effectively. Understanding this system was no easy feat; it took tremendous effort to grasp.
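
The sketch below shows the syllable structure just described, using Python's standard unicodedata module to decompose precomposed Hangul syllables into their jamo. It is a generic illustration only, not my tokenizer; a production Korean tokenizer would typically combine morpheme analysis with subword methods such as BPE or SentencePiece.

```python
# Decompose precomposed Hangul syllables (U+AC00..U+D7A3) into their jamo:
# initial consonant (choseong), medial vowel (jungseong), optional final (jongseong).
import unicodedata

def decompose_syllable(ch: str):
    """Split one Hangul syllable into its conjoining jamo via NFD normalization."""
    if not ('\uAC00' <= ch <= '\uD7A3'):
        return (ch,)  # not a Hangul syllable, leave untouched
    return tuple(unicodedata.normalize('NFD', ch))

# Example: '한국어' ("Korean language"), syllable by syllable.
for ch in '한국어':
    print(ch, '->', decompose_syllable(ch))
# 한 -> ('ᄒ', 'ᅡ', 'ᆫ')   initial + medial + final
# 국 -> ('ᄀ', 'ᅮ', 'ᆨ')
# 어 -> ('ᄋ', 'ᅥ')         no final consonant
```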
