In this edition of our Haskell in Production series, we interview Fyodor Soikin from CollegeVine – an online platform that connects high school students with college admissions guidance and mentorship. | Continue reading
Word2vec, short for "word to vector," is a technique for representing words as numerical vectors that capture the semantic relationships between them. It is widely used in machine learning for embeddings and text analysis. | Continue reading
In this article of our Haskell in Production series, we interview José Pedro Magalhães from Standard Chartered – a multinational bank that has over 6 million lines of code written in their own dialect of Haskell. | Continue reading
In this blog post, we explore, describe, and dissect the second phase of the collaboration between Runtime Verification and Serokell on optimizing the K semantic framework. | Continue reading
In this article, we give a comprehensive overview of AI project ideas to work on, from beginner-friendly tasks to more advanced challenges. | Continue reading
In this article, we look at the technology behind GPT-3 and GPT-4 – transformers. We’ll talk about what transformers are, how they work, and why they are so important for technology and business. | Continue reading
In this edition of our Haskell in Production series, we interview Simon Marlow, who's currently an engineer on the Code Search and Indexing team at Meta. | Continue reading
In this interview, Jesse Johnson – a leading data science expert and the founder of the Merelogic consulting group – shares his thoughts about the challenges of AI analysis in biological research. | Continue reading
We love Haskell, but we also love learning new languages. In this article, we want to show how to use your Haskell knowledge to write Rust code. | Continue reading
Bias and variance in machine learning are closely related to underfitting and overfitting. Learn how to achieve optimal model performance by keeping the bias-variance tradeoff in mind. | Continue reading
What benefits can functional programming bring to consulting projects? How does Haskell help to deliver projects on time and on budget? Find out the answers in our interview with Rob Harrison from Flowmo.co. | Continue reading
The number of open-source ML libraries is constantly increasing, but which ones should you use in your project? In this blog post, we present fifteen ML libraries to pay attention to in 2023. | Continue reading
In this blog post, we explore, describe, and dissect the first phase of the collaboration between Runtime Verification and Serokell on optimizing the K semantic framework. | Continue reading
Even though Rust and Haskell are quite different languages, they are also surprisingly alike. If you know Rust, you have a head start with Haskell, and vice versa. | Continue reading
Multimodal learning is a machine learning technique that incorporates data from multiple modalities – ways of perceiving the world – to create models that have increased accuracy and better capabilities. | Continue reading
Like “smart money” – VC money invested by firms and angels with deep expert knowledge and connections – smart dev means taking into account the non-code value a consulting firm can bring. | Continue reading
ChatGPT, a chatbot capable of conducting conversations in a human-like manner, has made headlines both in specialized technology publications and mainstream news. Some experts predict that this marks the beginning of a new era in AI. | Continue reading
Generative AI has recently seen an incredible popularity surge. In this post, we take a closer look at what it is and how it works, as well as outline common use cases and perspectives for the future. | Continue reading
K-means is an algorithm that can separate unlabeled data into a predetermined number of clusters. In this blog post, we look at its underlying principles, use cases, as well as benefits and limitations. | Continue reading
With Rust generics, programmers can write general algorithms that work with arbitrary types, reducing code duplication and providing type safety. In this article, we show when and how to use them. | Continue reading
Feature engineering is the process of selecting and transforming raw data into features that improve the performance of predictive models. Read our step-by-step guide on how to introduce feature engineering into your workflow. | Continue reading
In this edition of our Haskell in Production series, we feature e-bot7 – a low-code conversational AI platform designed for customer service and support. Read the interview to learn where they use Haskell, why they decided to adopt it, and what their experience with it has been. | Continue reading
What are the trends in machine learning that will be relevant in 2023? Read our post to find out. | Continue reading
In this interview, we speak with Dr. Yuriy Gankin and Maxim Kazanskii from Quantori – an innovative IT company that works in AI for life science, biotech, and pharmaceutical companies, developing software solutions to accelerate the discovery and development of novel therapies. | Continue reading
In this edition of our Haskell in Production series, we feature FOSSA – a tool for open-source risk management. | Continue reading
Stable Diffusion is a free, open-source neural network for generating photorealistic and artistic images based on text-to-image and image-to-image diffusion models. Read our tips to start generating your own masterpieces in minutes. | Continue reading
The DataKinds language extension might not work the way you think it does. In this article, we look at how it's usually taught and show how the common intuition differs from reality. | Continue reading
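To give a flavor of what DataKinds actually does, here is a minimal, illustrative sketch (the `Doors` type and `Door` wrapper are invented for this example); it assumes GHC with the `DataKinds` extension enabled:

```haskell
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE KindSignatures #-}

-- DataKinds promotes this declaration: besides the type Doors with
-- value constructors Opened and Closed, we also get a kind Doors
-- with type-level constructors 'Opened and 'Closed.
data Doors = Opened | Closed

-- The phantom parameter 'state' is a type of kind Doors, so the
-- door's state is tracked at compile time.
data Door (state :: Doors) = Door

-- Only a door known to be opened at the type level can be closed.
close :: Door 'Opened -> Door 'Closed
close Door = Door
```

With this setup, `close (Door :: Door 'Opened)` typechecks, while applying `close` to a `Door 'Closed` is rejected at compile time.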
In this article, we cover the basics of traits in Rust: what they are, when they are useful, and how to use them. | Continue reading
Transfer learning leverages existing ML models to solve new problems. It is a helpful strategy when you have too little data to train a model from scratch. Read this post to find out more. | Continue reading
In Rust, enums are composite data types that can have multiple variants. In this article, we show you how to define, instantiate, and use them. We also cover pattern matching and two common enums for error handling: Option and Result. | Continue reading
Support vector machines build a hyperplane that partitions data into two categories. The SVM algorithm is widely used in research for investigating complex problems, either as a stand-alone method or in combination with neural networks. | Continue reading
In this edition of our Haskell in Production series, we interview Syed Jafri, a Senior Software Engineer at Caribou. Read the interview to learn how he successfully pitched and introduced Haskell in his company. | Continue reading
The most popular solutions for storing data today are data warehouses, data lakes, and data lakehouses. This post gives a detailed overview of these storage options and their pros and cons for specific purposes. | Continue reading
In this edition of our Haskell in Production series, we feature NoRedInk – an EdTech product that helps students become better writers through its online, adaptive writing curriculum. | Continue reading
The k-nearest neighbors (kNN) algorithm is a simple non-parametric supervised ML algorithm that can be used to solve classification and regression tasks. Learn how it works by reading this guide with a practical example of a k-nearest neighbors implementation. | Continue reading
In this month’s episode of Functional Futures, our guest is David Christiansen – the Executive Director of the Haskell Foundation and the co-author of The Little Typer, a book on dependent types. | Continue reading
Find out how one-shot learning works and explore practical applications of this novel paradigm in neural networks. | Continue reading
In this edition of our Haskell in Production series, we interview Max Tagher, the co-founder and CTO of Mercury. Read the article to learn where Mercury uses Haskell, why they chose it, and what they like about it. | Continue reading
In this month’s episode of Functional Futures, our guest is Edward Kmett – Head of Software Engineering at Groq and the author of many widely-used Haskell libraries. | Continue reading
What are the best strategies for training neural networks? How can you avoid overfitting? Which open-source datasets should you choose? Find the answers to these questions in our interview with Dr. Varun Ojha. | Continue reading
Deep learning has been a game changer in the field of computer vision. It’s widely used to teach computers to “see” and analyze the environment similarly to the way humans do. | Continue reading
What's the type of a type? Can a type abstract over polymorphic types? Find the answers to these questions in our article on kinds in Haskell. | Continue reading
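As a taste of the topic: the "type of a type" is its kind, and a type can indeed abstract over polymorphic type constructors. A minimal sketch (the `Wrap` newtype is invented here for illustration):

```haskell
{-# LANGUAGE KindSignatures #-}

-- Ordinary types such as Int have kind *, while type constructors
-- have arrow kinds: Maybe :: * -> *.
-- Wrap abstracts over any f of kind * -> *, i.e. over polymorphic
-- type constructors such as Maybe or [].
newtype Wrap (f :: * -> *) a = Wrap (f a)

wrappedMaybe :: Wrap Maybe Int
wrappedMaybe = Wrap (Just 42)

wrappedList :: Wrap [] Int
wrappedList = Wrap [1, 2, 3]
```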
What's the Functor typeclass, and how can it be used? Find all the information you need to get started with Functor in our beginner-friendly blog post. | Continue reading
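For a quick preview: Functor is the typeclass of structures you can map a function over with `fmap`. A minimal sketch, using a `Box` type invented for illustration:

```haskell
-- A Functor instance lets fmap apply a function to the contents
-- of a structure while leaving its shape intact.
data Box a = Box a deriving (Show, Eq)

instance Functor Box where
  fmap f (Box x) = Box (f x)

incremented :: Box Int
incremented = fmap (+ 1) (Box 41)  -- Box 42
```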
Did you know that both universal and existential quantification are possible in Haskell? In this article, we show you where these quantifications can be useful and how to use them. | Continue reading
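As a teaser, here are both flavors in one snippet (the names `applyBoth` and `Showable` are invented for illustration); it assumes GHC with the `RankNTypes` and `ExistentialQuantification` extensions:

```haskell
{-# LANGUAGE RankNTypes #-}
{-# LANGUAGE ExistentialQuantification #-}

-- Universal quantification: the *caller* must supply a function that
-- works for every type a, so inside we can use it at Int and String.
applyBoth :: (forall a. a -> a) -> (Int, String) -> (Int, String)
applyBoth f (n, s) = (f n, f s)

-- Existential quantification: the *value* hides its concrete type;
-- all a consumer knows is that it has a Show instance.
data Showable = forall a. Show a => MkShowable a

display :: Showable -> String
display (MkShowable x) = show x
```

Both `applyBoth id (1, "hi")` and `map display [MkShowable (3 :: Int), MkShowable True]` typecheck, even though the elements of the latter list hide different concrete types.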
In this month’s episode of Functional Futures, our guest is Erik Svedäng – a game designer and the creator of Carp, a statically-typed lisp for real-time applications. | Continue reading
Pattern recognition – finding hidden patterns in data – is one way to effectively solve problems and automate tasks across a variety of industries. This article covers what pattern recognition is, how it's used, and the real-world opportunities it opens up. | Continue reading
In the first part of our Parsing With Haskell series, we introduce you to Alex – a Haskell tool for generating lexers. | Continue reading
In the second part of our Parsing With Haskell series, we cover Happy – a Haskell parser generator. | Continue reading