Add GraphQL Edge Caching to OpenAI

Over the past year, breakthroughs in artificial intelligence (AI) have significantly enhanced the capabilities of businesses, transforming everything from note-taking apps to food ordering services and vacation planning platforms. AI integration is becoming an indispensable part of contemporary products and services.

Although OpenAI is at the forefront of AI innovation, it currently does not offer a GraphQL API. Fortunately, you can automatically generate one by leveraging OpenAI's OpenAPI specification. Additionally, you can optimize performance with GraphQL Edge Caching through Grafbase.

The benefits of GraphQL Edge Caching include:

  1. Speed — Get even faster responses and lessen server load with Grafbase Edge Caching.
  2. Flexibility — Combine data from multiple APIs effortlessly using Grafbase Edge Gateway.
  3. Savings — Stay within your API limits and save money.
  4. Insights — Monitor your data in real-time with Grafbase Analytics.

You can also query OpenAI using a custom GraphQL resolver.

Begin by executing the following command inside a new or existing project's directory:

```bash
npx grafbase init --template openapi-openai
```

This command will generate a new folder grafbase in the root of your project.

Next, open the file grafbase.config.ts and make any adjustments.

By default, Grafbase will:

  • Add OpenAI as a data source
  • Cache all queries for 60 seconds
  • Enable public access to the Grafbase Edge Gateway
  • Forward Authorization header to OpenAI
```typescript
import { config, connector, graph } from '@grafbase/sdk'

const g = graph.Standalone()

const openai = connector.OpenAPI('OpenAI', {
  schema: '',
  headers: headers => {
    headers.set('Authorization', { forward: 'Authorization' })
  },
  transforms: { queryNaming: 'OPERATION_ID' },
})

g.datasource(openai)

// Disabling the namespace may cause conflicts with other connectors
// g.datasource(openai, { namespace: false })

export default config({
  graph: g,
  cache: {
    rules: [
      {
        types: ['Query'],
        maxAge: 60,
      },
    ],
  },
  auth: {
    rules: rules => {
      rules.public()
    },
  },
})
```

If you'd prefer not to pass the Authorization header with requests from the client, you can instead set the Authorization header from an environment variable stored by Grafbase:

```typescript
const openai = connector.OpenAPI('OpenAI', {
  schema: '',
  headers: headers => {
    headers.set('Authorization', `Bearer ${g.env('OPENAI_API_KEY')}`)
  },
  transforms: { queryNaming: 'OPERATION_ID' },
})
```

If you don't use header forwarding, make sure to add your OPENAI_API_KEY to the file .env:

```bash
# Only if you set the Authorization header with a static value
# OPENAI_API_KEY=
```

Finally, run the Grafbase development server by using the command below:

```bash
npx grafbase dev
```

You now have a GraphQL API running locally that acts as a proxy to OpenAI! 🎉

You can execute any GraphQL query against the new local endpoint.
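For example, with the default namespace and OPERATION_ID query naming, a query against the generated schema might look like the sketch below. The openAI, listModels, and data field names are assumptions based on OpenAI's OpenAPI operationIds; verify the exact names in the schema Grafbase generates.

```graphql
# Illustrative query: with queryNaming set to OPERATION_ID, top-level fields
# map to operationIds from the OpenAI spec (e.g. listModels). Check the
# generated schema for exact names.
query {
  openAI {
    listModels {
      data {
        id
      }
    }
  }
}
```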

Grafbase Pathfinder is served by the local development server, where you can explore the Grafbase Edge Gateway API and schema.

💡 Make sure to commit the grafbase folder with the rest of your application.

You can and should use the Grafbase CLI to proxy OpenAI when building locally (or in a branch), but you will need to deploy to Grafbase to take advantage of GraphQL Edge Caching.

Follow these steps to deploy to production:

  • Sign up for a Grafbase account
  • Create a new project
  • Connect and deploy the application where the grafbase folder was added
  • Make sure to add your OPENAI_API_KEY when deploying, unless you made it optional
  • Update your host (Netlify, Vercel, Fly, etc.) with the new GraphQL API endpoint that Grafbase supplied for your new project.
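Once deployed, your application talks to the Grafbase endpoint instead of OpenAI directly. A minimal client sketch follows; the endpoint URL and the listModels field are assumptions, so substitute the endpoint Grafbase supplied for your project and the field names from your generated schema.

```typescript
// Hypothetical endpoint: replace with the URL Grafbase supplied for your project.
const GRAFBASE_API_URL = 'https://your-project.grafbase.app/graphql'

// Build a standard GraphQL-over-HTTP POST request payload.
function buildGraphQLRequest(query: string, variables: Record<string, unknown> = {}) {
  return {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify({ query, variables }),
  }
}

// Usage (requires a deployed project; listModels is an assumed field name):
// const res = await fetch(GRAFBASE_API_URL, buildGraphQLRequest('{ openAI { listModels { data { id } } } }'))
// const { data } = await res.json()
```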

That's it!

Grafbase automatically deploys a new gateway every time it detects a change to grafbase.config.ts, so you can adjust cache settings such as maxAge, staleWhileRevalidate, and mutationInvalidation at any time.
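As a sketch, a cache rule combining those parameters might look like this in grafbase.config.ts. The values here are illustrative, not recommendations; tune them for your workload and check the Grafbase cache documentation for the accepted mutationInvalidation options.

```typescript
import { config, connector, graph } from '@grafbase/sdk'

const g = graph.Standalone()

export default config({
  graph: g,
  cache: {
    rules: [
      {
        types: ['Query'],
        maxAge: 60,                     // serve cached responses for up to 60s
        staleWhileRevalidate: 60,       // then serve stale responses for 60s while refreshing
        mutationInvalidation: 'entity', // invalidate cached entries affected by mutations
      },
    ],
  },
})
```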

Grafbase will handle the rest of the process seamlessly. We'll explore extending the OpenAI API with custom fields in another post.

Get Started

Build your API of the future now.