Hire Me

You should consider hiring me if:

  • You don’t know how to consistently and quickly improve your LLM products.
  • Your LLMs are too expensive or too slow.
  • You lack visibility into how good or reliable your LLMs are.
  • You are overwhelmed by the proliferation of tools and frameworks.

What I can do for you

I am skilled in the following areas:

  1. Building domain-specific evaluation systems
  2. Fine-tuning models
  3. Creating dev tools and infrastructure to help you iterate quickly on LLMs
  4. Product strategy, including hiring
  5. Content and writing


I am an independent consultant and am available for hire. You can reach me at hamel@parlance-labs.com. I do both short-term and long-term engagements on a retainer or hourly basis.

Current & Past Clients

As an independent consultant, I have had the privilege of working with some of the most innovative companies in the space. Before consulting independently, I worked in industry for over 20 years, including at GitHub, where I led research on LLMs that was a precursor to GitHub Copilot. Below is a sample of clients I’ve worked with recently. I am happy to provide references upon request.


  • Modular: Unified infrastructure (including the Mojo programming language) for AI workloads.
  • Weights & Biases: Experiment tracking for ML, currently helping them with their LLM initiatives.
  • Replicate: A leading platform for generative AI.
  • Posit: An ecosystem of data science / machine learning tools.
  • Outerbounds: A general purpose AI/ML platform.


  • Honeycomb: I am currently improving the natural language query assistant.
  • Rechat: I am currently working on Lucy, a conversational AI for real estate agents. I have given a detailed talk about this work here.
  • Answer.ai: I conduct research on new LLM applications and help with product strategy at this industrial AI research lab.

Open Source LLM Projects

  • Axolotl: I am currently a core contributor to axolotl, a library for efficient fine-tuning of LLMs. I have contributed to or led a wide array of other projects, which you can read about here.