🤔 Welcome to Alex’s Website!

Greetings! I am Alex, and you are reading my website ~

  • I am an artificial intelligence engineer / researcher and general computer tinkerer.
  • I’ll be documenting my learning notes here, as well as anything else that catches my fancy.
  • You can read more about me here!

TchAIkovsky - Piano MIDI Generation with Transformers

I’ve been learning about machine learning on and off since about 2017. I first became interested in the field after stumbling across Andrej Karpathy’s classic blog post, The Unreasonable Effectiveness of Recurrent Neural Networks. How exactly I came across it is lost to time, but I remember being struck by how impressive (at the time) the outputs were. My programming experience then was limited, so seeing programs capable of generating things – learning from data alone – was eye-opening....

Making Mac Bearable

A few weeks ago I was made redundant, along with a decent chunk of the company, after just under two months of employment there. Thanks to some odd technicalities, I was able to buy back my work laptop for a very low price. This was an M2 Air, which ordinarily I would never use unless I had to, mostly due to the locked-down nature of Mac that makes it difficult to customise exactly to my tastes, and also a dislike of the software it ships with....

Sentence Mining with OpenAI's Whisper

Online, I tend to market myself as an AI / computery guy, creating and following content related to these areas. However, I have other interests and passions to tinker with – unfortunately too many to dedicate much time to all of them, with the exception of one. The One is language learning, specifically learning Chinese, which I have been generally consistent with. It has been a long road, starting in late 2018 but with long breaks and misguided routes taken along the way....

Easy JAX training loops with Flax and Optax

In my previous blog post, I discussed JAX – a framework for high-performance numerical computing and machine learning – in an atypical manner. I didn’t create a single training loop, and only showed a couple of patterns that looked vaguely machine learning-like. If you haven’t read that blog post yet, you can read it here. This approach was deliberate, as I felt that JAX – although designed for machine learning research – is more general-purpose than that....

On Learning JAX – A Framework for High Performance Machine Learning

Recently, I took part in the Hugging Face x Google Cloud community sprint, which (despite being named a ControlNet sprint) had a very broad scope: involve diffusion models, use JAX, and use TPUs provided for free by Google Cloud. A lot of cool projects came out of it in a relatively short span of time. Our project was quite ambitious: to take my master’s dissertation work on combining step-unrolled denoising autoencoders (loosely adjacent to discrete diffusion models) with VQ-GAN, port it all to JAX, then add support for text-conditioned generation....