Dive into neural networks, the backbone of modern AI. Understand their mathematics, implement them from scratch, and explore their applications.
You know the drill. You get a dataset, load it into a Jupyter notebook, explore it, preprocess the data, fit a baseline model or two, and then train an initial candidate model, such as XGBoost. The first…
A large share of datasets contain categorical features. For example, out of 665 datasets on the UC Irvine Machine Learning Repository [1], 42 are fully categorical and 366 are reported as mixed…
Large language models (LLMs) have advanced rapidly in recent years. As new generations of models are developed, researchers and engineers need to stay informed on the latest progress. This article…
Where Does EU Horizon H2020 Funding Go? Combining exploratory data analytics, geospatial data, and network science in Python to survey 35k+ EU-funded projects.
Friends, I am so excited to share with you a truly remarkable invention of stochastic process mathematics: the signature of a time series! This is a highly complex topic with a lot of detail to cover, so this…
Good morning, AI enthusiasts! This week, I’m super excited to announce that we are finally releasing our book, ‘Building AI for Production: Enhancing LLM Abilities and Reliability with Fine-Tuning…
Explaining attention using a simple money borrowing scenario.
Recently, I have been researching large language models with the goal of using them to score samples. Initially, I attempted to have a model rate on a scale from 1 to 5, but it consistently gave…
In the context of Large Language Models (LLMs), RAG stands for Retrieval-Augmented Generation. RAG combines the power of retrieval systems with the generative capabilities of neural networks to…
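To make the retrieve-then-generate idea concrete, here is a minimal, purely illustrative sketch of the RAG pattern. The toy corpus, the word-overlap retriever, and the `generate` stub are all hypothetical stand-ins; a real system would use an embedding-based retriever and an actual LLM call.

```python
# Illustrative sketch of Retrieval-Augmented Generation (RAG).
# Everything here is a toy placeholder, not a production pipeline.

corpus = [
    "RAG retrieves documents relevant to a query.",
    "Neural networks generate text token by token.",
    "Retrieval grounds generation in external knowledge.",
]

def retrieve(query, documents, k=2):
    """Rank documents by naive word-overlap with the query (toy retriever)."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def generate(query, context):
    """Stand-in for an LLM call: prepend retrieved context to the prompt."""
    return f"Answer to '{query}' using context: {' '.join(context)}"

# Retrieve supporting documents, then condition generation on them.
docs = retrieve("How does retrieval help generation?", corpus)
print(generate("How does retrieval help generation?", docs))
```

The design point the sketch captures: generation is conditioned on retrieved text rather than on the model's parameters alone, which is what lets RAG systems cite external knowledge.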