7 Best Machine Learning Workflow and Pipeline Orchestration Tools

Building impactful machine learning projects relies on much more than selecting the best algorithm for the job. Data scientists and machine learning engineers need to collaborate to make sure that…

Prompt Engineering Best Practices: Chain of Thought Reasoning

Prompt engineering, a fundamental concept in AI development, involves crafting tailored instructions or queries to guide AI models in generating desired outputs effectively across diverse tasks and…
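
As a rough illustration of the idea (not code from the article), a chain-of-thought prompt simply asks the model to spell out intermediate steps before giving its answer. The task and wording below are illustrative assumptions:

```python
# Minimal sketch of chain-of-thought prompting (the task and wording are
# illustrative assumptions, not examples taken from the article).

direct_prompt = (
    "Q: A cafe sells 14 muffins per hour and is open 6 hours. "
    "How many muffins does it sell in a day?\nA:"
)

# The chain-of-thought variant demonstrates step-by-step reasoning before the
# final answer, which tends to elicit similar reasoning on unseen questions.
cot_prompt = (
    "Q: A cafe sells 14 muffins per hour and is open 6 hours. "
    "How many muffins does it sell in a day?\n"
    "A: Let's think step by step. The cafe sells 14 muffins each hour "
    "for 6 hours, so it sells 14 * 6 = 84 muffins. The answer is 84.\n"
    "Q: A train travels 80 km per hour for 3 hours. How far does it go?\nA:"
)

# Either string can be sent to any text-completion model.
print(cot_prompt)
```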

Critical Ideas Behind What Powers LLMs

In recent years, the ascent of Large Language Models (LLMs) such as GPT-4 and LLaMA has paved the way for the general public to utilize AI in their daily workflows. Those uses can range from programming and…


Time Series Simulations: Signature Transformation Method in Python. Part 2

Enriching the dataset with a large number of vectors distributed in the same way as the original ones, and calculating sample statistics.
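
As a rough sketch of that last step (not the article's signature-transformation code), the check is simply that the enriched sample reproduces the statistics of the original one. The multivariate-normal generator below stands in for whatever simulation method the article uses:

```python
# Minimal sketch: compare sample statistics of an original set of vectors with
# an "enriched" set assumed to follow the same distribution. The
# multivariate-normal draw is a placeholder, not the signature-based method.
import numpy as np

rng = np.random.default_rng(0)
original = rng.multivariate_normal(
    mean=[0.0, 1.0], cov=[[1.0, 0.3], [0.3, 2.0]], size=200
)

# Placeholder "enrichment": draw many more vectors from a distribution fitted
# to the original sample.
mean_hat = original.mean(axis=0)
cov_hat = np.cov(original, rowvar=False)
enriched = rng.multivariate_normal(mean=mean_hat, cov=cov_hat, size=10_000)

# If the enrichment preserves the distribution, the statistics should agree.
print("original mean:", mean_hat, "enriched mean:", enriched.mean(axis=0))
print("original cov:\n", cov_hat, "\nenriched cov:\n", np.cov(enriched, rowvar=False))
```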

The Microsoft Phi-3-Mini Is Mighty Impressive

The Phi-3-Mini language model was recently released by Microsoft AI. It falls into the category of small language models (SLMs), which offer many of the same capabilities as LLMs. The…
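
For readers who want to try it, a minimal sketch of loading the model with the Hugging Face transformers library follows; the `microsoft/Phi-3-mini-4k-instruct` checkpoint name and the prompt are assumptions, not details from the article:

```python
# Minimal sketch of running Phi-3-Mini locally via Hugging Face transformers.
# The checkpoint ID and prompt are illustrative assumptions; requirements such
# as trust_remote_code may vary with the transformers version installed.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="microsoft/Phi-3-mini-4k-instruct",
    trust_remote_code=True,  # may be unnecessary on newer transformers releases
)

prompt = "Explain in one sentence what a small language model is."
output = generator(prompt, max_new_tokens=64, do_sample=False)
print(output[0]["generated_text"])
```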

When Things Go Wrong, What and How to Communicate with Non-AI Leadership?

Let's call a spade a spade. You are interested in reading this article, which means you are one of the 90% of data scientists who have felt the need to interact with their boss or another leader to let them…

Some Technical Notes About Phi-3: Microsoft’s Marquee Small Language Model

The term small language model (SLM) has been gaining traction in the world of generative AI. It was originally coined by Microsoft after the company published a paper with the catchy title “Textbooks Are All…

Human and Artificial General Intelligence Arise from Next Token Prediction

What if human intelligence derives from successful next token prediction, and what if next token prediction is a sufficient objective function for the emergence of artificial general intelligence? This…
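
The objective in question is the standard autoregressive cross-entropy loss; stated compactly in conventional notation (not reproduced from the article):

```latex
% Standard next-token-prediction objective: minimize the negative log-likelihood
% of each token given all preceding tokens (conventional notation, assumed here).
\[
\mathcal{L}(\theta) \;=\; -\sum_{t=1}^{T} \log p_\theta\!\left(x_t \mid x_{<t}\right)
\]
```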

How to Boost the Performance of Python Using Caching Techniques

Python's caching decorators can significantly improve the performance of recursive functions, data science scripts, and applications; lru_cache can manage memory as well.
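
A minimal sketch of the technique using the standard-library functools.lru_cache decorator on a recursive function; the Fibonacci example is illustrative, not taken from the article:

```python
# Minimal sketch of Python's standard-library caching decorator applied to a
# recursive function (the Fibonacci example is illustrative).
from functools import lru_cache


@lru_cache(maxsize=128)  # maxsize bounds memory use; maxsize=None caches everything
def fib(n: int) -> int:
    """Return the n-th Fibonacci number, memoizing results of earlier calls."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)


print(fib(100))           # fast: each fib(k) is computed once, then reused
print(fib.cache_info())   # hits, misses, and current cache size
```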

Multimodal Citations with Google’s Vertex AI

One of the key challenges for organisations that want to use Generative AI is hallucination: the fact that Large Language Models (LLMs) sometimes make up content that isn’t true. This is where…