Author: Johnny Lin
-
What are neural networks? (Part I)
As I was looking at job postings for machine learning engineer roles, one requirement that kept appearing was knowledge of deep learning and PyTorch. For the traditional data scientist, deep learning is typically not part of the repertoire of study; it is more often part of a computer science…
-
Google AI Links
Google recently held the Google Cloud Next ’24 conference, and here are some relevant YouTube links.
-
Some Python interview questions (Part IV)
I recently encountered another technical interview, this time in an online code-debugging format. The interviewer gave me a URL from SharePad.io and asked: “Do you see anything wrong with the code?” He gave me about 5 to 10 minutes while he watched. The process was intimidating, and there was…
-
Some Python interview questions (Part III)
In Part II we talked about decorators and how they can be used to modify the behavior of a function or class without repeating code. Today we will continue to the final interview question: the method iterrows() comes from the popular data science library pandas. The core data structure in pandas is a DataFrame, a two…
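As a quick illustration of the method the question is about, here is a minimal sketch of iterrows() on a toy DataFrame (the column names and values are made up for this example):

```python
import pandas as pd

# A small illustrative DataFrame
df = pd.DataFrame({"name": ["a", "b"], "score": [1, 2]})

# iterrows() yields (index, row) pairs, where each row is a pandas Series
for idx, row in df.iterrows():
    print(idx, row["name"], row["score"])
```

Note that iterrows() returns each row as a Series, which can be slow on large DataFrames; vectorized operations are usually preferred.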
-
Transformers, BERT, and GPT (Chapter 4)
I’ve recently been reading a book called Transformer, BERT, and GPT: Including ChatGPT and Prompt Engineering by Oswald Campesato (2024). The book is divided into 10 chapters. Here is a summary of the fourth chapter (Transformer Architecture in Greater Depth). Input tokens are converted into word embeddings (a map of words to real-valued vectors) and from…
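To make the embedding step concrete, here is a minimal sketch of mapping tokens to word embeddings as a row lookup in a matrix. The vocabulary, vector dimension, and random values are all illustrative; in a real model the embedding matrix is learned:

```python
import numpy as np

# Toy vocabulary and embedding matrix (random values; a real model learns these)
vocab = {"the": 0, "cat": 1, "sat": 2}
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 4))  # 4-dimensional vectors for illustration

# Converting input tokens to word embeddings is just a row lookup
tokens = ["the", "cat", "sat"]
vectors = embeddings[[vocab[t] for t in tokens]]
print(vectors.shape)  # one 4-dimensional vector per token
```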
-
Transformers, BERT, and GPT (Chapter 3)
I’ve recently been reading a book called Transformer, BERT, and GPT: Including ChatGPT and Prompt Engineering by Oswald Campesato (2024). The book is divided into 10 chapters. Here is a summary of the third chapter (Transformer Architecture). Sequence-to-sequence models and “encoder-decoder” are related. The Seq2Seq model consists of two multilayer LSTMs (long short-term memory networks): one maps…
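A minimal sketch of that encoder-decoder structure in PyTorch, assuming illustrative dimensions (the feature and hidden sizes here are made up, not from the book):

```python
import torch
import torch.nn as nn

# Two multilayer LSTMs: an encoder and a decoder (dimensions are illustrative)
enc = nn.LSTM(input_size=8, hidden_size=16, num_layers=2, batch_first=True)
dec = nn.LSTM(input_size=8, hidden_size=16, num_layers=2, batch_first=True)

src = torch.randn(1, 5, 8)   # source sequence: batch=1, length=5, features=8
tgt = torch.randn(1, 3, 8)   # target sequence inputs

_, state = enc(src)          # encoder compresses the source into (h, c)
out, _ = dec(tgt, state)     # decoder starts from the encoder's final state
print(out.shape)             # one hidden vector per target position
```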
-
Transformers, BERT, and GPT (Chapter 2)
I’ve recently been reading a book called Transformer, BERT, and GPT: Including ChatGPT and Prompt Engineering by Oswald Campesato (2024). The book is divided into 10 chapters. Here is a summary of the second chapter (Tokenization).
-
Transformers, BERT, and GPT (Chapter 1)
I’ve recently been reading a book called Transformer, BERT, and GPT: Including ChatGPT and Prompt Engineering by Oswald Campesato (2024). The book is divided into 10 chapters. Here is a summary of the first chapter (Introduction). Generative AI is a subset of artificial intelligence models designed to generate new data samples similar in nature to the…
-
Some Python interview questions (Part II)
In Part I we talked about three technical interview questions I encountered as a data scientist. We answered the first question; now let’s talk about what a decorator is (we can use function and class interchangeably here). According to ChatGPT: a decorator allows you to modify the behavior of a function or a class. It is…
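As a small sketch of that idea, here is a toy decorator that modifies a function's behavior without touching its body (the names `shout` and `greet` are invented for this example):

```python
import functools

# A decorator wraps a function to change its behavior without editing it
def shout(func):
    @functools.wraps(func)  # preserve the wrapped function's name and docstring
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper()
    return wrapper

@shout
def greet(name):
    return f"hello, {name}"

print(greet("ada"))  # HELLO, ADA
```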
-
Some Python interview questions (Part I)
Python is essential for data science, but much of the language is not required to be a successful data scientist. Unfortunately, most employers test you on Python at a fairly advanced level. I recommend reviewing an introductory Python course, with some additional review of how pandas integrates with the workflow.…