PyCon Nigeria Annual Conference

Introduction to AI/Large Language Model app development using Python and OgbujiPT


Uche Ogbuji

[Uche is an experienced consultant, startup founder & CTO](https://ucheog.carrd.co/). A serial tech pioneer, he was a teenage member of the small computing scene in 1980s Nigeria. In the 1990s he became a very early adopter and community member of Python, contributing to the standard library and leading related open-source projects. He is currently working on cutting-edge AI development, having founded [Oori Data](https://oori.dev/) in Boulder, Colorado. Uche is also a poet, DJ, writer, speaker and mentor.

Description

Generative AI (GenAI) is revolutionizing tech, especially since the breakout success of ChatGPT. Large Language Model-based creative, business & chat products have become all the rage. OgbujiPT is an open-source project offering Python & command-line interfaces for writing your own LLM-powered apps. Learn how to get started in just a few lines of code, then take a deeper look at the world of language and multi-modal AI.
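
To give a sense of the "few lines of code" promise, here is a minimal sketch of a chat completion against a local, OpenAI-compatible endpoint, which is the general pattern OgbujiPT wraps; the base URL, API key and model name below are placeholders for whatever server and model you actually run.

```python
# Minimal chat completion against an OpenAI-compatible endpoint
# (the pattern OgbujiPT builds on). base_url, api_key and model
# are placeholders; point them at your own local or hosted server.
from openai import OpenAI

client = OpenAI(base_url='http://localhost:8000/v1', api_key='dummy-key')
resp = client.chat.completions.create(
    model='local-model',  # placeholder; use whichever model your server exposes
    messages=[{'role': 'user', 'content': 'Write a haiku about Lagos traffic.'}],
)
print(resp.choices[0].message.content)
```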

Abstract

Machine learning and deep neural networks were revolutionized by the Transformer architecture, and Large Language Models (LLMs) soon emerged, with ChatGPT exploding into the consumer marketplace. Increasingly, Generative AI features are essential to any modern product, even though the entire tech stack is still in its early days. In any emerging tech, simple, clean interfaces are essential, and a grounding in sound software engineering helps smooth out the rough patches. The OgbujiPT toolkit arose out of frustration with over-complicated and questionably engineered libraries for working with LLMs. It lets you get started quickly with language and multi-modal (language + visual) AI models.

OgbujiPT supports self-hosted, local models, including lightweight, mobile-friendly models such as Phi-2, as well as full-service offerings such as OpenAI's GPT models. It includes support for sophisticated AI/LLM techniques such as vector databases, Retrieval-Augmented Generation (RAG, often used for "chat your documents" tools) and integrating LLMs with live actions through real-time function calling. You can use it with back ends such as llama.cpp (custom HTTP API), llama-cpp-python (OpenAI-compatible HTTP API), actual ChatGPT and text-generation-webui (AKA Oobabooga or Ooba), as well as with Llama-class and similar models hosted in memory via Python libraries such as ctransformers.
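
As a rough illustration of the RAG technique mentioned above, the sketch below embeds a handful of documents, retrieves the one most similar to a question, and splices it into the prompt. It uses the sentence-transformers library and a brute-force in-memory search purely for illustration; the example documents are made up, and a real deployment would lean on one of the vector databases the abstract refers to.

```python
# Toy retrieval-augmented generation (RAG) sketch: embed a few documents,
# find the one most similar to the question, and prepend it to the prompt.
# In practice the embeddings would live in a vector database rather than
# a NumPy array.
import numpy as np
from sentence_transformers import SentenceTransformer

docs = [
    'PyCon Nigeria is an annual gathering of the Python community in Nigeria.',
    'OgbujiPT is an open-source toolkit for building LLM applications in Python.',
    'Phi-2 is a lightweight language model suitable for self-hosted, local use.',
]

embedder = SentenceTransformer('all-MiniLM-L6-v2')
doc_vecs = embedder.encode(docs, normalize_embeddings=True)

question = 'What is OgbujiPT?'
q_vec = embedder.encode([question], normalize_embeddings=True)[0]

# Cosine similarity reduces to a dot product on normalized vectors
best = docs[int(np.argmax(doc_vecs @ q_vec))]

augmented_prompt = (
    f'Answer the question using only this context:\n{best}\n\n'
    f'Question: {question}'
)
# augmented_prompt would then be sent to the LLM just as in the earlier
# chat-completion sketch.
print(augmented_prompt)
```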

Basic outline:

  • Brief overview of GenAI and LLMs
  • AI language generation in five lines of Python
  • Flexible usage modes for GenAI interfaces
  • Vector databases and retrieval-augmented generation
  • Agent capabilities in GenAI and function calling (see the sketch after this outline)
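
The function-calling item above can be sketched roughly as follows, using the OpenAI-style tools interface that several of the back ends listed in the abstract expose. The get_weather tool, endpoint and model name are invented for illustration, and tool-call support varies by back end.

```python
# Sketch of LLM function calling via the OpenAI-compatible tools API.
# The model is offered a tool schema; if it responds with a tool call,
# the app runs the real function (and could feed the result back to the
# model for a final answer). Endpoint and model name are placeholders.
import json
from openai import OpenAI

def get_weather(city: str) -> str:
    return f'Sunny and 31°C in {city}'  # stand-in for a real weather API

tools = [{
    'type': 'function',
    'function': {
        'name': 'get_weather',
        'description': 'Get the current weather for a city',
        'parameters': {
            'type': 'object',
            'properties': {'city': {'type': 'string'}},
            'required': ['city'],
        },
    },
}]

client = OpenAI(base_url='http://localhost:8000/v1', api_key='dummy-key')
resp = client.chat.completions.create(
    model='local-model',
    messages=[{'role': 'user', 'content': 'What is the weather in Abuja right now?'}],
    tools=tools,
)

msg = resp.choices[0].message
if msg.tool_calls:  # the model chose to call the tool
    call = msg.tool_calls[0]
    args = json.loads(call.function.arguments)
    print(get_weather(**args))  # the app, not the model, runs the function
else:
    print(msg.content)
```
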
Audience level: Intermediate or Advanced