Tag: AI
-
Revisiting H2-powered datacentres
This is a follow-on to an earlier post I dashed out – Hydrogen datacentres – is this legit? – where a podcast interview caught my attention about one approach being sold to address the demands new datacentres place on the grid. This post will make more sense if you have read it. Basically ECL…
-
Does the EU AI Act really call for tracking inference as well as training in AI models?
I’m sharing this post as I think it helped me realise something I hadn’t appreciated until today. I don’t build AI models, and to be honest, while I make sparing use of GitHub Copilot and Perplexity, I’m definitely not a power user. My interest in them is more linked to my day job, and…
-
TIL: training small models can be more energy-intensive than training large models
As I end up reading more around AI, I came across this snippet from a recent post by Sayash Kapoor, which initially felt really counter-intuitive: Paradoxically, smaller models require more training to reach the same level of performance. So the downward pressure on model size is putting upward pressure on training compute. In effect,…
-
How do I track the direct environmental impact of my own inference and training when working with AI?
I can’t be the only person in a situation like this: These feel like things you ought to know before you start a project, to help you decide whether you should even go ahead – but for many folks, that ship has sailed. So, I’ll try unpacking these points, and share some context that…