From the course: Foundational Math for Generative AI: Understanding LLMs and Transformers through Practical Applications


Solution: Build a two-layer transformer encoder


(upbeat music) - [Instructor] Amazing. How was the challenge? In this challenge, you had to build a transformer yourself so that you become acquainted with the whole process, in case you ever need to modify a given LLM, add heads, or do anything beyond simple transfer learning. As before, we'll run the pip install, and let me start working through the notebook. First of all, we need to get the irony data: we load the training texts and the training labels, and we convert them to tensors. That's all good. Beautiful. Then, as before, and this is nothing new, we take all the training texts and train a gensim Word2Vec model to get our embeddings. In this case, our embeddings will be 300-dimensional, and we'll save the model in case you want it later. The important thing is that we take the vocabulary from it to construct the embedding matrix, the same way that we…
