Over the weekend, I used Claude Code agent teams to rewrite a popular open-source scientific simulation package in optimized Python. The refactor produced roughly a million-fold reduction in wall-clock time and a 15-30% improvement in worst-case numerical accuracy. Collaborators are now scaling to multi-year global simulations that would previously have taken years on a high-performance computing cluster. Continue reading “How I Use Claude Code Agent Teams for Scientific Computing”
Author Archives: acwatt
Getting Claude Code Agent Teams Working on WSL
Claude Code’s Agent Teams feature lets you run multiple Claude instances simultaneously on a single task — one acting as a coordinator, others working in parallel on different pieces. For researchers this means things like running a literature-style review from multiple angles at once, or having separate agents tackle independent parts of an analysis. It’s… Continue reading “Getting Claude Code Agent Teams Working on WSL”
Structured LLM Prompting
TL;DR: Scroll down to the My Prompting Strategy section to copy-pasta my current workflow. Are you having a hard time getting what you expect out of an LLM? Are you giving it a bunch of files and it’s getting lost in the task? Are you giving it multiple tasks and it is only completing some… Continue reading “Structured LLM Prompting”
Post-transformer Architecture
New AI architectures are emerging that may become the next generation of foundation models. I’m going to keep a running list of the promising ones here. The Dragon Hatchling (BDH) https://www.rohan-paul.com/i/175334051/the-dragon-hatchling-the-missing-link-between-the-transformer-and-models-of-the-brain This architecture was published in September 2025 and claims to be a closer representation of the brain. One… Continue reading “Post-transformer Architecture”
Feature Engineering with Geospatial Data: Binned Statistics in GEE
In quantitative research, generating novel features from alternative datasets is often a primary source of identifying variation and predictive performance. Geospatial data, such as weather patterns and agricultural metrics, provides a rich source for these signals. A common feature engineering task is to aggregate one spatial dataset by the discrete bins of another—for example, calculating… Continue reading “Feature Engineering with Geospatial Data: Binned Statistics in GEE”
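Outside of GEE, the same binned-aggregation idea can be sketched in plain Python. This is a hypothetical, minimal example (the variable names and values are illustrative, not from the post's actual GEE workflow): compute the mean of one variable within discrete bins of another using `scipy.stats.binned_statistic`.

```python
# Hypothetical example of binned statistics: aggregate simulated
# "temperature" samples by 500 m bins of an "elevation" variable.
import numpy as np
from scipy.stats import binned_statistic

rng = np.random.default_rng(0)
elevation = rng.uniform(0, 3000, size=1000)                      # meters
temperature = 25 - 0.006 * elevation + rng.normal(0, 1, size=1000)  # °C

# Mean temperature within each 500 m elevation bin
means, bin_edges, _ = binned_statistic(
    elevation, temperature, statistic="mean",
    bins=np.arange(0, 3500, 500),
)
for lo, hi, m in zip(bin_edges[:-1], bin_edges[1:], means):
    print(f"{lo:4.0f}-{hi:4.0f} m: {m:5.2f} °C")
```

In GEE itself the analogous operation would use grouped reducers over image regions, but the logic — one dataset supplies the bins, the other supplies the values — is the same.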
Learning Resources
General Learning Resources · Statistical Learning and Computer Science · Economics Topics · Public Policy · Career Topics · Innovation
Retrieval Augmented Generation (RAG)
In short, RAG is a style of LLM usage where you give the LLM more information on top of your prompt. Between prompt engineering and RAG, you can dramatically increase the ability of the model to produce an accurate response. This can be in the form of internet searches the agent performs automatically (like Gemini… Continue reading “Retrieval Augmented Generation (RAG)”
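The core RAG loop — retrieve relevant documents, then prepend them to the prompt — can be sketched without any particular LLM library. Everything below is a toy illustration: the word-overlap retriever is a stand-in for a real embedding-based search, and the augmented prompt would be sent to whatever chat-model API you use.

```python
# Minimal RAG sketch (all names hypothetical): score documents by word
# overlap with the query, keep the top k, and build an augmented prompt.
def score(query: str, doc: str) -> float:
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / (len(q) or 1)

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Use the context below to answer.\nContext:\n{context}\nQuestion: {query}"

corpus = [
    "Tokenization splits raw text into model-readable units.",
    "RAG supplies retrieved documents alongside the user prompt.",
    "Gradient descent minimizes a loss function iteratively.",
]
prompt = build_prompt("How does RAG use retrieved documents?", corpus)
print(prompt)
```

A production retriever would use vector embeddings and a similarity index, but the prompt-assembly step is essentially this simple.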
Programming with LLMs (not just generating code with a chatbot)
This page is about using LLM APIs in a programming project. Not to be confused with generating and editing code using a chatbot. Under construction: this is currently a place for me to dump links and quick thoughts. It might turn into a real post one day. Resources Python to use LLM libraries Python is… Continue reading “Programming with LLMs (not just generating code with a chatbot)”
Transformer Architecture
This is a page to store resources and thoughts about transformer architecture and its applications. Components of the Transformer Architecture Tokenization The tokenization step takes the raw input and partitions it into tokens, bite-sized chunks of the data. Tokens are the unit of analysis of the transformer model, and the universe of all possible tokens… Continue reading “Transformer Architecture”
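The tokenization step can be illustrated with a toy example. This is not a real BPE tokenizer like those used by production models — just a hypothetical sketch that splits text into word and punctuation tokens and maps each one to an integer id, which is the form the model actually consumes.

```python
# Toy tokenizer sketch: split into word/punctuation tokens, then assign
# each distinct token an integer id from a vocabulary built on the fly.
import re

def tokenize(text: str) -> list[str]:
    return re.findall(r"\w+|[^\w\s]", text.lower())

def encode(tokens: list[str], vocab: dict[str, int]) -> list[int]:
    for t in tokens:
        vocab.setdefault(t, len(vocab))
    return [vocab[t] for t in tokens]

vocab: dict[str, int] = {}
tokens = tokenize("Tokens are the unit of analysis.")
ids = encode(tokens, vocab)
print(tokens)  # ['tokens', 'are', 'the', 'unit', 'of', 'analysis', '.']
print(ids)     # [0, 1, 2, 3, 4, 5, 6]
```

Real tokenizers (e.g. byte-pair encoding) split at the subword level rather than whole words, but the output — a sequence of integer ids — is the same kind of object.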
LLM Tips and Tools
Large language models (LLMs) use a neural-network architecture to learn the relationships between words and predict the next word in a sentence (multimodal models generalize beyond words, but words and sentences are a fine mental model to have). Below are some resources and tips I am collecting to better use… Continue reading “LLM Tips and Tools”
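The "predict the next word" step can be made concrete with a tiny sketch. The vocabulary and scores below are made up for illustration: a model emits one score (logit) per candidate word, softmax turns those scores into probabilities, and the highest-probability word is the prediction.

```python
# Hypothetical next-word prediction step: softmax over made-up logits
# for a tiny vocabulary, then pick the most probable word.
import math

vocab = ["mat", "dog", "moon"]
logits = [2.0, 0.5, -1.0]  # invented scores for "the cat sat on the ..."

exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]   # softmax: probabilities summing to 1

best = vocab[probs.index(max(probs))]
print(best)  # mat
```

A real model does this over a vocabulary of tens of thousands of tokens, with logits produced by the transformer stack rather than hard-coded.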