Enterprise essentials for generative AI

Portability or ‘don’t marry your model’

Andy Oliver is right: “The latest GPT, Claude, Gemini, and o-series models have different strengths and weaknesses, so it pays to mix and match.” Not only that, but the models are in constant flux, as are their pricing and, very likely, your enterprise’s risk posture. As such, you don’t want to be hardwired to any particular model. If swapping a model means rewriting your app, you only built a demo, not a system. You also built a problem. Hence, successful deployments follow these principles:

  • Abstract behind an inference layer with consistent request/response schemas, including tool call formats and safety signals (see the sketch after this list).
  • Keep prompts and policies versioned outside code so you can A/B test and roll back without redeploying.
  • Dual-run during migrations: send the same request to both the old and new models and compare the outputs via an evaluation harness before cutting over (sketched below).
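
To make the first principle concrete, here is a minimal sketch of what an inference layer can look like. The names (InferenceRequest, OpenAIAdapter, get_provider) are illustrative assumptions, not any vendor’s SDK; the point is that application code only ever sees the neutral schema, and swapping vendors means writing one adapter, not rewriting the app.

```python
# A minimal sketch of a provider-agnostic inference layer. The class and
# adapter names are assumptions for illustration, not a specific vendor SDK.
from dataclasses import dataclass, field
from typing import Protocol


@dataclass
class ToolCall:
    name: str
    arguments: dict


@dataclass
class InferenceRequest:
    prompt_id: str       # versioned prompt, resolved outside the codebase
    prompt_version: str
    variables: dict
    tools: list[dict] = field(default_factory=list)


@dataclass
class InferenceResponse:
    text: str
    tool_calls: list[ToolCall] = field(default_factory=list)
    safety_flags: dict = field(default_factory=dict)  # normalized safety signals
    model: str = ""


class InferenceProvider(Protocol):
    def generate(self, request: InferenceRequest) -> InferenceResponse: ...


class OpenAIAdapter:
    """Translates the neutral schema to and from one vendor's API (stubbed)."""
    def generate(self, request: InferenceRequest) -> InferenceResponse:
        # Call the vendor SDK here and map its tool-call and moderation
        # fields into the neutral InferenceResponse shape.
        raise NotImplementedError


class AnthropicAdapter:
    """Same neutral contract, different vendor behind it (stubbed)."""
    def generate(self, request: InferenceRequest) -> InferenceResponse:
        raise NotImplementedError


def get_provider(name: str) -> InferenceProvider:
    # Routing by configuration, not code changes, is what keeps you portable.
    return {"openai": OpenAIAdapter, "anthropic": AnthropicAdapter}[name]()
```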

Portability isn’t just insurance; it’s how you negotiate better with vendors and adopt improvements without fear.
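
The dual-run principle can be as small as a wrapper that mirrors each request to the incumbent and candidate models and records a comparison. This sketch reuses the neutral types from the example above; `judge` is a placeholder for whatever evaluation harness you already trust (exact match, rubric scoring, and so on).

```python
# A sketch of the dual-run pattern: score the incumbent and candidate
# against the same harness, and only cut over once the candidate holds up.
from typing import Callable


def dual_run(
    request: "InferenceRequest",
    incumbent: "InferenceProvider",
    candidate: "InferenceProvider",
    judge: Callable[["InferenceResponse"], float],
) -> dict:
    old = incumbent.generate(request)
    new = candidate.generate(request)
    report = {
        "prompt": f"{request.prompt_id}@{request.prompt_version}",
        "incumbent_score": judge(old),
        "candidate_score": judge(new),
    }
    # Log the report to your eval store; keep serving the incumbent's answer
    # until the candidate consistently matches or beats it.
    return report
```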

Things that matter less than you think

I’ve been talking about how to ensure success, yet surely some (many!) people who have read up to this point are thinking, “Sure, but really it’s about prompt engineering.” Or a better model. Or whatever. These are AI traps. Don’t get carried away by:
