Casting An AI "Spellbook": The Powerful, Low-Cost, Continuous Improvement AI Strategy For Small, Fast-Moving Companies

Developing a proprietary large language model, as OpenAI (ChatGPT), Google, Meta and other tech giants have done, is a scale game where the stakes are counted in the billions. But that shouldn’t stop smaller, more entrepreneurial companies from deploying AI strategies now that can drive business results and grow enterprise value in the near term.

The good news is the smartest strategy might prove to be both the least capital-intensive and the most nimble. And the results can be protected against competitors by adapting time-tested intellectual property strategies to AI.

Short of shelling out a reported $700,000 per day for server farms and electricity, as OpenAI does for ChatGPT, there are three primary AI strategies that small and mid-size companies might consider pursuing. They can:

  • Develop a proprietary AI model of their own, trained on their extensive business data;
  • Work to create highly engineered, highly complex “expert-in-a-box” prompts that use an existing AI engine such as ChatGPT to perform complicated, extensive business tasks; or 
  • Consider the pursuit of what are being called “spellbooks” (think Harry Potter rather than spelling bee) that consist of smaller sets of simpler prompts that automate modest but important parts of a business process rather than attempting to knock the whole thing out in one swoop.

I’ll make the argument that given where the technology, the capital markets, and intellectual property law stand today, spellbooks are where all but the largest and most deep-pocketed companies should be spending most of their time and treasure.

Why? I’ll run down the power of spellbooks in a moment. First though, let’s look at the challenges of the first two options above.

A proprietary AI model is something that organizations of significant size possessing large, relevant data sets may consider. That data could be customer records, manufacturing data, routing data, service records, process maps – information available only to that company, which may contain patterns or linkages that yield important insights once an AI engine decodes them.

The good news is that if your company is the only one possessing that information, you get the AI engine working right, and it finds valuable linkages in the data, you’re golden. The bad news is you’re likely to be several million dollars in before you know what you’ve got. One AI strategist came up with a figure of nearly $5 million to build a “foundation model” in-house, and even if your company outsources the task or takes a more modest approach (such as fine-tuning an existing model), the outlays will still run in the millions. The returns: Uncertain.

If you’ve read that OpenAI’s recently released customizable “GPTs” provide the equivalent of a proprietary model at a fraction of the cost, think again. GPTs, while capable of being tailored to specific tasks and business data, won’t give you the degree of customization that a truly bespoke model will provide. You still get what you pay for. On top of that, by uploading volumes of business data into a GPT to teach it about your organization, you’re simultaneously handing that data to OpenAI to train its next-generation models – in what may be the biggest land grab of user data since the advent of “user-generated content” at the start of Web 2.0.

The second option, the highly engineered “uber prompt” that attempts to perform an entire business process in one pass, suggests that with enough process understanding, enough “prompt engineers,” enough time, and enough money, you might be able to construct a silver bullet-style prompt so comprehensive and so capable that a huge portion of your business value could ride upon it. 

That could work, and some companies have even sprung up that are, in the end, based on what amounts to a single, highly engineered prompt. But other experts have argued that prompt engineering is a fleeting phenomenon – that the growing capabilities of AI mean that problem formulation is more important than a highly designed AI prompt.

Be aware, though, that many still believe these engineered uber-prompts are highly valuable and the way of the future. That doesn’t jibe with what I’m seeing: too many parts of this equation are moving at too high a speed, and at too high a cost, for me to believe that perfect prompt engineering alone is enough. Too much is likely to be burned off in pursuit of a rapidly moving target.

That’s why I have come to believe that one of the best strategies for businesses examining their next steps in an AI-powered world might be to develop their own internal library of reusable prompts – spellbooks – that can perform important parts of the company’s business processes, without the need for massive upfront investment or the need to capture all of the complexities of an entire business process in a single prompt.

A company’s spellbook would ultimately consist of a large number of much less powerful prompts (or similar AI components, such as GPTs), each of which performs a modest task but does it well and in the unique style of the business. Spellbook creation could become a kind of AI throwback to the once-vaunted Japanese kaizen, with employees being encouraged to create prompts that improve the efficiency and effectiveness of their work, and in which the best prompts are identified and shared throughout the organization. 
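To make the idea concrete, here is a minimal sketch of what one corner of a spellbook might look like in code. It assumes the OpenAI Python SDK with an API key in the environment; the spell names, prompt templates, and model choice are purely illustrative, not a prescribed implementation.

```python
# Minimal, illustrative sketch of a "spellbook": a library of reusable prompts,
# each automating one modest, well-defined task in the company's own voice.
# Assumes the OpenAI Python SDK (v1+) with OPENAI_API_KEY set; the spell names,
# templates, and model are hypothetical examples.
from openai import OpenAI

client = OpenAI()

SPELLBOOK = {
    "summarize_service_ticket": (
        "You are a support analyst at our company. Summarize the ticket below in "
        "three bullet points, flag any safety issue, and suggest the next action.\n\n"
        "Ticket:\n{input}"
    ),
    "draft_renewal_email": (
        "Draft a renewal reminder in our house style (plain language, no jargon) "
        "for the customer record below.\n\nRecord:\n{input}"
    ),
}

def cast(spell_name: str, input_text: str) -> str:
    """Run one spell from the book against a piece of business data."""
    prompt = SPELLBOOK[spell_name].format(input=input_text)
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any capable chat model would do
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Example: cast("summarize_service_ticket", open("ticket_1042.txt").read())
```

Because the library lives inside the company and is never published, each spell can be refined over time, kaizen-style, by the very people who use it.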

Rather than mounting a multi-million-dollar, consultant- and vendor-laden effort to recreate ChatGPT or to re-engineer the business around a single, massive prompt, spellbooks are oriented toward picking off low-hanging fruit: labor-intensive, data-intensive, or otherwise troublesome tasks that could flow more smoothly, quickly or cheaply with AI-powered automation. The payoff grows as the best spells are propagated throughout the organization for use by everyone.

Critically, these spellbooks can be reused many times without revealing them to the outside world. When used in this way they become a trade secret – something that only your company knows and that gives you a competitive edge. 

And trade secrets can afford durable protection for enormously valuable assets. Examples include the Coca-Cola and KFC recipes, Google’s search algorithm, TikTok’s content-serving algorithm, and literally millions of less well-known cases.

You can retain and protect spellbooks as you would any other trade secret, with non-disclosure and employee invention agreements. That’s critical because it gives your company a potential path to keeping its AI-based capabilities out of the public domain and working for your business in a proprietary and value-building way. 

None of this should feel alien, either. Most companies’ internal processes and private data are protected (or should be!) as trade secrets. Many of those processes are stored in the form of written documentation (e.g., procedure manuals, checklists, document templates) that describe processes that people now conduct manually. Spellbooks provide a way to gradually transform that documentation from "dumb" or "dead" documentation into "live" or "self-executing" procedures that can be performed automatically by AI. 
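As a rough sketch of that transformation, the snippet below takes a hypothetical excerpt from a procedure manual (an invoice-review checklist) and feeds it to the model as standing instructions, so the written procedure effectively runs itself against each new document. The checklist, model, and SDK usage are assumptions for illustration, not a recommended design.

```python
# Hedged sketch: turning "dead" written documentation into a "live" procedure.
# The checklist is a hypothetical procedure-manual excerpt; the OpenAI SDK usage
# and model choice are illustrative assumptions.
from openai import OpenAI

client = OpenAI()

INVOICE_REVIEW_CHECKLIST = """\
1. Confirm the purchase-order number appears on the invoice.
2. Check that line-item totals match the quoted prices.
3. Flag any payment terms shorter than net 30.
4. Write a one-paragraph approval recommendation.
"""

def run_procedure(checklist: str, document_text: str) -> str:
    """Ask the model to walk a new document through the documented procedure."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Follow this internal procedure exactly, step by step:\n" + checklist},
            {"role": "user", "content": document_text},
        ],
    )
    return response.choices[0].message.content
```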

By tackling the automation challenge in small pieces, one spell at a time, the organization can get returns quickly with modest incremental investments, minimizing risk along the way. It can then keep implementing more spells over time, each delivering a concrete return on investment as the next best part of the process to automate is identified.

Spellbooks that assist with or even take over parts of your core business process are also a way to get real business traction with AI – to get in the game and explore current and possible future capabilities and uses. The people who run a process at your company are likely to understand the problem that process solves, and that understanding is the foundation for asking AI to perform the right tasks.

The methods for protecting trade secrets are tried and true and can be part of a solid intellectual property strategy. There are plenty of lawyers like me who understand what can and should be kept as a trade secret versus what should be patented, what documentation is needed in employee agreements and corporate records to wall those secrets off, and which elements of your business are commodities you don’t need to worry about.

In short, the best way to get in the AI game might be the simplest, least expensive, most participatory, and most grounded in reality of any of the strategies companies are examining today. Do your homework, tap into your frontline leaders and team members, get a ChatGPT subscription and get to work. And lay the intellectual and legal foundations to keep your AI spells doing their magic for your business for generations to come.
