Under the Hood

From Primates to Precision: How Blockchain Taught Me to Train AI

Critics and detractors mock me for my vanity projects, be it my self-published children’s gardening book in memory of my late mother, my personal cryptocurrency JKoin, or the latest, my NFT collection called Quantum Primates. Yes, my dirty little habits are expensive, but they keep me fired up and excited about life (sad, I know).

Here’s the thing: when I built Quantum Primates on the Polygon blockchain, I wasn’t just launching an NFT collection—I was encoding identity, emotion and neurodivergence into metadata. Each trait—Empathy Pulse, Hyperfocus Halo, Resilience Core—wasn’t random. It was structured, intentional and deeply symbolic. Please check them out at https://opensea.io/collection/quantum-primates-genesis-edition.
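To give a flavour of what “encoding traits into metadata” looks like in practice, here is a minimal sketch in the common ERC-721/OpenSea metadata shape. The trait names come from the collection; every value, the description and the image URI are invented purely for illustration.

```python
import json

def primate_metadata(token_id, empathy, hyperfocus, resilience):
    """Build ERC-721-style metadata for one hypothetical Quantum Primate.

    Only the trait names are from the real collection; the rest is
    illustrative.
    """
    return {
        "name": f"Quantum Primate #{token_id}",
        "description": "Identity, emotion and neurodivergence as traits.",
        "image": f"ipfs://<collection-cid>/{token_id}.png",  # placeholder URI
        "attributes": [
            {"trait_type": "Empathy Pulse", "value": empathy},
            {"trait_type": "Hyperfocus Halo", "value": hyperfocus},
            {"trait_type": "Resilience Core", "value": resilience},
        ],
    }

print(json.dumps(primate_metadata(1, "Radiant", "Gold", "Unbreakable"), indent=2))
```

Marketplaces like OpenSea read the `attributes` list to display each trait, which is how symbolic choices end up as structured, queryable data.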

To do that, I had to learn how to break human complexity down into digital language—and that process was my gateway into AI.

When I began training the menopause LLM, I realised it was the same dance.

Just like Quantum Primates, this wasn’t just about data—it was about meaning. I wasn’t just teaching a machine the facts about menopause. I was teaching it how to listen, how to respond with empathy and how to honour the invisible crown of women.

And just like minting NFTs on Polygon, training the model on an RTX 3090 GPU meant working with constraints. It took patience. Overnight runs. Iteration. But I knew how to do that—because I’d already built something soulful from code. You learn something from everything you do, so go ahead: don’t let ££££ or critics or detractors hold you back from doing you.

Quantum Primates taught me that structure can hold spirit. That tokens can carry truth. And that when tech meets storytelling, healing becomes scalable.

Training Our Menopause AI: A Journey Through Time, Chips, and Wisdom

How do we train the Fab4050 Large Language Model (LLM), or indeed any LLM? It’s a bit like teaching a very smart child everything there is to know about midlife health—except this child learns by reading millions of words and patterns.

We start by collecting trusted information from medical textbooks, clinical guidelines and reputable websites. Think of it as feeding the AI a carefully curated diet of expert knowledge—everything from hormone therapy to sleep struggles, cultural beliefs to ethical supplement use.
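The “carefully curated diet” amounts to an allowlist: only documents from vetted source types make it into the training corpus. This sketch shows the idea in miniature; the source labels and sample documents are made up for the example.

```python
# Source types we have explicitly vetted (illustrative labels).
TRUSTED_SOURCES = {"clinical_guideline", "medical_textbook", "peer_reviewed"}

def curate(documents):
    """Keep only documents whose source label is on the allowlist."""
    return [doc for doc in documents if doc["source"] in TRUSTED_SOURCES]

corpus = [
    {"source": "clinical_guideline", "text": "HRT dosing guidance..."},
    {"source": "random_forum", "text": "My cousin swears by..."},
    {"source": "medical_textbook", "text": "Vasomotor symptoms occur..."},
]

print(len(curate(corpus)))  # 2 of the 3 documents survive curation
```

Real pipelines add de-duplication, licensing checks and human review on top, but the gatekeeping principle is the same.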

But here’s where it gets interesting: before the AI can “understand” this information, we have to break it down into tiny pieces called tokens—like syllables or puzzle pieces. These tokens help the AI recognise patterns, meanings and relationships between ideas.
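Those “puzzle pieces” can be shown with a toy subword tokeniser: split each word greedily into the longest pieces found in a vocabulary, the way BPE-style tokenisers in real LLMs do. The vocabulary below is invented for the example; production tokenisers learn theirs from data.

```python
# Tiny, made-up subword vocabulary for illustration.
VOCAB = {"meno", "pause", "hormon", "e", "therapy", "sleep"}

def tokenize(word, vocab):
    """Greedily split one word into the longest vocabulary pieces."""
    tokens, i = [], 0
    while i < len(word):
        for j in range(len(word), i, -1):  # try the longest match first
            if word[i:j] in vocab:
                tokens.append(word[i:j])
                i = j
                break
        else:
            tokens.append(word[i])  # unknown character: fall back to itself
            i += 1
    return tokens

print(tokenize("menopause", VOCAB))  # ['meno', 'pause']
print(tokenize("hormone", VOCAB))    # ['hormon', 'e']
```

The model never sees whole sentences as we do—only sequences of these pieces, from which it learns which pieces tend to follow which.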

Now, imagine trying to teach all this to a computer using a graphics card designed for gaming—the NVIDIA RTX 3090. It’s powerful, yes, but not built for industrial-scale AI training. So instead of instant results, we let the model learn overnight. It’s like slow-roasting wisdom: the AI spends hours digesting the data, adjusting its internal “neurons” and refining how it answers questions.
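That overnight grind is, at heart, gradient descent: many passes over the data, each nudging the model’s weights a little. Real LLM training adjusts billions of weights on the GPU; this sketch shrinks it to a single weight learning y = 3x. The data, learning rate and epoch count are all invented for illustration.

```python
import random

random.seed(42)
data = [(x, 3.0 * x) for x in range(1, 11)]  # toy inputs and targets

w = 0.0      # the model's single "neuron" weight, starting ignorant
lr = 0.005   # learning rate: how big each nudge is

for epoch in range(200):           # each epoch is one pass over the data
    random.shuffle(data)
    for x, y in data:
        pred = w * x
        grad = 2 * (pred - y) * x  # gradient of squared error wrt w
        w -= lr * grad             # the small, patient adjustment

print(round(w, 3))  # settles close to 3.0 after many passes
```

Scale the same loop up by a factor of billions and spread it over many hours, and you have an overnight run on a single consumer GPU.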

Why overnight? Because we’re not using supercomputers or billion-pound cloud setups. We’re building this with care, ethics and grassroots tech—making sure every answer is grounded in truth and empathy.

The result? An AI that doesn’t just spit out facts, but understands the emotional and cultural layers of menopause. It’s still learning, but every night it gets wiser.

Curious to see what it’s learned so far? Ask it a question about menopause—and watch how science meets soul. Launching soon! Drop us an email so we can keep you posted: jkfab4050@gmail.com
