I’m a Software Engineer with a PhD in Computational Physics, passionate about building sustainable software that delivers business value, driven by the why rather than the what or the how. I’ve mostly worked at the intersection of advanced mathematics, ML/AI, and physical modelling, built on cloud and distributed computing.
These days, my passion is software and system architecture. AI, VR, AR, and IoT are technologies with truly transformative possibilities, and this is where I want to focus in the years ahead!
My areas of expertise include: numerical programming, parallel programming & distributed computing, HPC, backend & data engineering, DevOps, data analysis & visualisation, ML/AI, software architecture, and cloud architecture.
📧 Email: [email protected]
👨🏻‍💻 GitHub: github.com/chrisk314
🔗 LinkedIn: linkedin.com/in/chris-knight-3728a449
Brainpool AI / Tunedd AI, London, UK / remote – (Feb 2024 - present)
Currently I’m working at Brainpool AI building LLM-based apps. My main focus is a spinout venture called Tunedd: a technical due-diligence platform for VC investing, powered by LLMs and agents built around a conversational RAG core. I'm leading development of the platform: designing the cloud infrastructure in AWS and the DynamoDB database schemas; configuring automation with GitLab and Terraform; building APIs and microservices with FastAPI and Golang; and working with LLM frameworks (Haystack, LlamaIndex) to build out the core functionality.
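To give a flavour of the kind of conversational RAG core involved, here's a minimal sketch using Haystack 2.x with an in-memory BM25 retriever and an OpenAI generator. The documents, model, and query are illustrative placeholders, not the Tunedd implementation.

```python
from haystack import Document, Pipeline
from haystack.components.builders import PromptBuilder
from haystack.components.generators import OpenAIGenerator
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.document_stores.in_memory import InMemoryDocumentStore

# Index some placeholder due-diligence material into an in-memory store.
store = InMemoryDocumentStore()
store.write_documents([
    Document(content="Acme Ltd's platform is a multi-tenant SaaS running on AWS."),
    Document(content="Acme Ltd's core services are written in Go and Python."),
])

template = """Answer the question using only the context below.
Context:
{% for doc in documents %}- {{ doc.content }}
{% endfor %}
Question: {{ query }}
Answer:"""

pipe = Pipeline()
pipe.add_component("retriever", InMemoryBM25Retriever(document_store=store))
pipe.add_component("prompt", PromptBuilder(template=template))
pipe.add_component("llm", OpenAIGenerator(model="gpt-4o-mini"))  # needs OPENAI_API_KEY
pipe.connect("retriever.documents", "prompt.documents")
pipe.connect("prompt.prompt", "llm.prompt")

query = "What languages are Acme's core services written in?"
result = pipe.run({"retriever": {"query": query}, "prompt": {"query": query}})
print(result["llm"]["replies"][0])
```

A production version would swap the in-memory store for a persistent vector store and wrap the pipeline behind a FastAPI endpoint, but the retrieve-prompt-generate shape is the same.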
Anglo American, Bristol, UK / remote – (Aug 2023 - Feb 2024)
I led a small team within Data Analytics supporting digital twinning use-cases. I designed, and with my team implemented, Python-based software libraries and cloud infrastructure for modelling and optimising industrial processes.
The main tool provides a graph-theory-inspired API: process components, modelled as Python classes, form the nodes of a graph, and component I/O connections form the edges (see the sketch below). Models can be defined in YAML and executed as Argo Workflows in Kubernetes, with data exchange via a message broker. Another library facilitates large-scale optimisation of models with Spark in Kubernetes and PostgreSQL. I also built a REST API with a Python client, using Azure App Service and Azure Functions, for launching and monitoring long-running jobs.
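To illustrate the shape of that API — not the actual Anglo American code; all class and parameter names here are hypothetical — a minimal sketch of components as graph nodes, with I/O connections as edges, executed in dependency order:

```python
from __future__ import annotations

from graphlib import TopologicalSorter


class Component:
    """A process component: one node in the model graph (hypothetical API)."""

    def __init__(self, name: str):
        self.name = name
        self.inputs: dict[str, Component] = {}  # each entry is an edge from upstream

    def connect(self, input_name: str, upstream: Component) -> None:
        self.inputs[input_name] = upstream

    def run(self, **inputs: float) -> float:
        raise NotImplementedError


class Source(Component):
    """Feeds material into the process at a constant rate (e.g. t/h)."""

    def __init__(self, name: str, rate: float):
        super().__init__(name)
        self.rate = rate

    def run(self, **inputs: float) -> float:
        return self.rate


class Crusher(Component):
    """Passes through the sum of its inputs, scaled by an efficiency factor."""

    def __init__(self, name: str, efficiency: float):
        super().__init__(name)
        self.efficiency = efficiency

    def run(self, **inputs: float) -> float:
        return sum(inputs.values()) * self.efficiency


def execute(components: list[Component]) -> dict[str, float]:
    """Run every component in topological (dependency) order."""
    graph = {c.name: {u.name for u in c.inputs.values()} for c in components}
    by_name = {c.name: c for c in components}
    outputs: dict[str, float] = {}
    for name in TopologicalSorter(graph).static_order():
        comp = by_name[name]
        outputs[name] = comp.run(**{k: outputs[u.name] for k, u in comp.inputs.items()})
    return outputs


feed = Source("feed", rate=120.0)
crusher = Crusher("crusher", efficiency=0.95)
crusher.connect("ore_in", feed)
print(execute([feed, crusher]))  # {'feed': 120.0, 'crusher': 114.0}
```

In the real system the graph was loaded from YAML and each node ran as an Argo Workflow step exchanging data over a message broker, rather than in-process as above.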
OVO Energy, London, UK – (Apr 2021 - Mar 2023)
I developed Python scripts for ingesting and manipulating large volumes of org-wide BigQuery, Kafka, and SQL Server metadata into a data catalogue (DataHub); deployed and managed AWS infra with Terraform; worked extensively with GraphQL, forming complex queries and mutations; and automated scripts and processes with GitHub Actions, Docker tasks, and Slack apps.
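As an indicative sketch of that kind of ingestion — not OVO's actual pipeline; the GMS endpoint and GCP project are placeholders — here's BigQuery table metadata being pushed into DataHub with its Python REST emitter:

```python
from datahub.emitter.mce_builder import make_dataset_urn
from datahub.emitter.mcp import MetadataChangeProposalWrapper
from datahub.emitter.rest_emitter import DatahubRestEmitter
from datahub.metadata.schema_classes import DatasetPropertiesClass
from google.cloud import bigquery

# Placeholder DataHub GMS endpoint and GCP project.
emitter = DatahubRestEmitter(gms_server="http://datahub-gms.example.com:8080")
client = bigquery.Client(project="example-project")

# Walk every dataset and table in the project, emitting a properties
# aspect for each table into the catalogue.
for dataset in client.list_datasets():
    for table in client.list_tables(dataset.dataset_id):
        full_name = f"{table.project}.{table.dataset_id}.{table.table_id}"
        meta = client.get_table(full_name)  # fetch description, labels, etc.
        emitter.emit(
            MetadataChangeProposalWrapper(
                entityUrn=make_dataset_urn(
                    platform="bigquery", name=full_name, env="PROD"
                ),
                aspect=DatasetPropertiesClass(
                    description=meta.description or "",
                    customProperties={
                        k: str(v) for k, v in (meta.labels or {}).items()
                    },
                ),
            )
        )
```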