Reasons not to use the ChatGPT / GPT-3 API in your products

In this article, I list reasons not to use the GPT-3 API in your products.

The age of commercial AI, available as a cloud API, is here. You can call an AI API like any other cloud service and build products around it.

One of the biggest players in this space is OpenAI. Founders are already building profitable companies around products enabled by OpenAI's APIs.

The possibilities for building new products or integrating OpenAI into existing products are limitless. And this goes beyond for-profit products.

GPT-3 is one of the best language-processing AI models ever created. It can be used for many different use cases.

However, there are certain use cases where GPT-3 is not the right AI to integrate. Here I list some of them.

The Opportunity for Application Developers and Startups

This has created a big opportunity by democratizing the ability to build AI-enabled end-user products. Until a couple of years ago, using ML/AI in your product meant training gigantic models with millions, and now billions, of parameters, which frankly is not possible for 99% of people and companies. You could build and train small models, but they were not of much use in an end-user product.

With GPT-3 available as an API for tasks such as text generation and classification, a single developer can build apps without the expertise or computing resources needed to train expensive AI models. This opens up AI-enabled app building to a huge range of creators.
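To give a sense of how little code this takes, here is a minimal sketch of assembling a GPT-3 completion request with the OpenAI Python library. The engine name, prompt, and parameter values are illustrative assumptions; engine names and defaults change over time, so check OpenAI's documentation for current ones.

```python
# Sketch of a GPT-3 completion request (parameter values are assumptions).
# A real call additionally needs: import openai; openai.api_key = "sk-..."

def build_completion_request(prompt, engine="text-davinci-003", max_tokens=64):
    """Assemble the keyword arguments for a Completion.create() call."""
    return {
        "engine": engine,
        "prompt": prompt,
        "max_tokens": max_tokens,     # upper bound on generated tokens
        "temperature": 0.7,           # lower = more deterministic output
    }

request = build_completion_request(
    "Classify the sentiment of this review: 'Great product, fast shipping!'"
)
# With an API key set, the actual call would be:
# response = openai.Completion.create(**request)
```

The point is that all the heavy lifting happens server-side; the developer only chooses an engine, writes a prompt, and tunes a handful of parameters.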

Reasons not to use the ChatGPT / GPT-3 API

No on-premise deployment

GPT-3 is a cloud-based model that can be used only via an API; there is no provision for on-premise deployment. This can be a show-stopper for enterprise customers or government agencies whose policies forbid sending data outside their own infrastructure, and it is one of the biggest reasons not to use GPT-3.

High usage costs

The more capable GPT-3 engines can be pretty expensive. If your use case consumes lots of tokens per request but cannot recoup the cost, then GPT-3 might not be the right choice.

Not 100% accurate results

GPT-3 is great, but it can sometimes produce results that are not 100% accurate. That can be problematic for use cases that demand factual output, or for high-stakes fields like legal work or healthcare.

Strict OpenAI Terms

Update: OpenAI has since relaxed these terms. Check out their latest terms of use.

OpenAI has strict terms of use that require the API to be used only for approved purposes. For example, using GPT-3 to generate long-form text in a public-facing application is not allowed, and there are several more requirements of this kind. So if your application needs a feature that goes against their terms for public-facing applications, GPT-3 may not be the right choice for you.

Conclusion

Before deciding to use GPT-3, consider the factors above and check whether any of them is a blocker for you. In the end, it depends on the use case and should be decided case by case.