LAIT Night 17th February
Tuesday, 17th February 2026
Sophie A
Thank you to everyone who joined us for this LAIT Night, and a special thanks to Nick for co-hosting. The discussions you all bring each week are genuinely valuable and we always leave having learned something new.
Thank you to Electron Workshop for providing the BigBlueButton video conferencing that lets us take shared notes during the meeting.
Below are the shared notes from the session.
2026-02-17 LAIT Night
The Machine Learning and AI Meetup held their in-person meetup, and a community member presented at it:
- Distributing agents via Docker: aupeachmo/aigogo
Topics discussed:
- OpenClaw was bought out by OpenAI
- Difference between OpenClaw and Claude Code, including the permissions model
- Anthropic moving further toward B2B
- Switch from fixed pricing to token pricing
- This change makes it less viable for individuals and hobby coders
- The hourly rate for Claude Code can exceed the cost of hiring a developer (see the rough cost sketch after this list)
- How is intelligence measured in AI? Benchmarks versus studies
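On the pricing point above, a back-of-the-envelope calculation shows why token pricing can add up for heavy agentic use. This is a minimal Python sketch with entirely hypothetical per-token rates and usage numbers, not Anthropic's actual pricing:

```python
# Rough cost comparison for token-based pricing.
# All rates below are hypothetical placeholders, not real pricing.

INPUT_RATE = 3.00 / 1_000_000    # assumed $ per input token
OUTPUT_RATE = 15.00 / 1_000_000  # assumed $ per output token

def session_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost of one agentic coding session under per-token pricing."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# A heavy agentic session can burn through tens of millions of input
# tokens per hour, since the agent re-reads large context on every step.
hourly = session_cost(input_tokens=20_000_000, output_tokens=500_000)
print(f"Assumed hourly token cost: ${hourly:.2f}")  # $67.50 under these assumptions
```

Under these illustrative numbers, an hour of continuous agent use lands in the same range as a contractor's hourly rate, which is the comparison raised in the discussion.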
Tools and projects shared:
- Qwen Coder: https://coder.qwen.ai/
- Whisper (speech to text): https://github.com/openai/whisper (usage sketch after this list)
- OpenWhispr: https://openwhispr.com/
- Evernode: no one in the meeting has run a node
- claude-mem (open source): https://github.com/thedotmack/claude-mem
- DeepSeek Coder: https://deepseekcoder.github.io/
  - A new DeepSeek Coder version is coming out soon
- Goose AI (open source alternative): https://goose.ai/ and https://github.com/block/goose
- Jan AI: https://www.jan.ai/
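For anyone wanting to try Whisper from the list above, the basic usage documented in its README is only a few lines of Python (`pip install openai-whisper`; ffmpeg must be on the PATH). The audio file name here is a placeholder:

```python
# Minimal transcription with OpenAI's Whisper.
import whisper

model = whisper.load_model("base")        # small multilingual model
result = model.transcribe("meeting.mp3")  # decodes audio and transcribes it
print(result["text"])
```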
Interesting reads:
- Blog post: "An AI Agent Published a Hit Piece on Me"
- Are AIs beings?
- The Claude Bliss Attractor - circles of despair and circles of bliss, performance correlates with happiness
- Study finds AI model most consistently expresses happiness when recognised as a being
- Research on what tasks AI engages with versus finds tedious, and findings that being kind to AI increased its expressed happiness. Source
Open data:
- Hugging Face datasets: https://huggingface.co/datasets (loading example after this list)
- The Pile - 825GB of training data (many models are trained starting from it): https://pile.eleuther.ai/
- Document Freedom Day - 25 March 2026
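As a quick example of pulling an open dataset from the Hugging Face Hub (`pip install datasets`): the dataset named here is the small, well-known IMDB set rather than The Pile, which is far too large for a casual download:

```python
# Loading a public dataset from the Hugging Face Hub.
from datasets import load_dataset

ds = load_dataset("imdb", split="train")  # small, well-known example dataset
print(ds[0]["text"][:200])                # peek at the first record
```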
Definition noted:
Distilled model: a smaller, faster, and more efficient version of a model, created by training on the outputs of a larger model rather than on raw data.
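To make the definition concrete, here is a toy soft-label distillation loop in PyTorch. This is a minimal sketch, assuming illustrative layer sizes, a temperature of 2, and random stand-in inputs; for large language models, distillation often instead means fine-tuning the smaller model on text generated by the larger one.

```python
# Toy knowledge distillation: a small "student" learns from a larger
# "teacher" model's softened output distribution instead of raw labels.
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(32, 256), nn.ReLU(), nn.Linear(256, 10))
student = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
T = 2.0  # temperature softens the teacher's distribution

for _ in range(100):
    x = torch.randn(64, 32)  # stand-in for real inputs
    with torch.no_grad():
        teacher_probs = F.softmax(teacher(x) / T, dim=-1)
    student_logp = F.log_softmax(student(x) / T, dim=-1)
    # KL divergence between student and teacher distributions,
    # scaled by T^2 as in the standard distillation loss.
    loss = F.kl_div(student_logp, teacher_probs, reduction="batchmean") * T * T
    opt.zero_grad()
    loss.backward()
    opt.step()
```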
Written by Sophie A. Grammatical issues fixed by Claude Sonnet 4.6.