
Get Started ― nlux And ChatGPT via Node.js

This getting started guide will help you to integrate the OpenAI ChatGPT model with the nlux library using Node.js.
We will perform the following steps:

  1. Create an Express.js server that connects to the OpenAI API (steps 1-3)
  2. Create an AI chat component using nlux and connect it to the Express.js server (steps 4-9)

Express.js is a back end web application framework for building RESTful APIs with Node.js.
If you are not familiar with Express.js, you can learn more about it in the Express.js documentation.

We will also use nlbridge to create the server endpoint that bridges the OpenAI API with the nlux library.
nlbridge is a middleware library created by the nlux team to simplify the integration of LLMs with web applications.


1. Get Your OpenAI API Key

Start by getting a new API key from OpenAI.

  1. If you don't have an account, go to the OpenAI signup page and create an account.
  2. Go to the API keys page.
  3. Click the Create new secret key button.
  4. Give your API key a name and click Create secret key.
  5. Copy the API key and save it in a safe place, for example in an environment variable as shown below. You will need it to configure the OpenAI nlux adapter.
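
If you intend to follow the Node.js steps below, a common way to keep the key out of your source code is to export it as an environment variable in the shell where you will run the server. The variable name OPENAI_API_KEY is only a convention used in this guide, not something required by OpenAI or nlbridge; a sketch of how the server can read it is shown at the end of step 3:

export OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>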

Back-end Node.js Steps


2. Create An Express.js Server

The following example was tested on Node.js v20.11.
We recommend using the latest LTS version of Node.js.
If you already have a Node.js + Express.js project set up, you can jump to the next section.

Set up a new Node.js and Typescript project

We will start by setting up a new Node.js and Typescript project, and install the dependencies.
Create a new directory for your project, navigate to it, and run the following commands:

npm init --yes
npm install --save-dev typescript ts-node @types/node @types/express @types/cors
npm install express cors
npx tsc --init
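
Optionally, as a convenience, you can register a start script so the server created in the next step can later be launched with npm start instead of calling ts-node directly. This assumes npm 7.24 or later, which ships with Node.js v20:

npm pkg set scripts.start="ts-node index.ts"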

Create a simple Express.js endpoint

Next, we will create a simple Express.js endpoint that returns a welcome message.
Create a file called index.ts and add the following code:

import express, { Express, Request, Response } from 'express';
import cors from 'cors';

const app: Express = express();
const port = 8080;

app.use(cors());
app.use(express.json());

app.get('/', (req: Request, res: Response) => {
    res.send('Welcome to nlux + Node.js demo server!');
});

app.listen(port, () => {
    console.log(`[server]: Server is running at http://localhost:${port}`);
});

Run the Express.js server

Run your Express.js application using the following command:

npx ts-node index.ts

This will run your development server on http://localhost:8080.
When you navigate to this URL in your browser, you should see the following:

[Screenshot: the welcome message served by the Express.js server at http://localhost:8080]
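
If you prefer the command line, you can also check the endpoint with curl (assuming curl is available on your system); it should print the same welcome message:

curl http://localhost:8080/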

Now that we have an Express.js server set up, let's add some LLM capabilities.


3. Setup nlbridge Express.js Middleware

Now we will add a new endpoint to our Express.js app that is powered by the @nlbridge/express library.

First, we start by adding the @nlbridge/express library to our project:

npm install @nlbridge/express

Then, modify the index.ts file to add a new endpoint:

import {defaultMiddleware} from '@nlbridge/express';

app.post('/chat-api',
    defaultMiddleware('openai', {
        apiKey: '<YOUR_OPENAI_API_KEY>',
        chatModel: 'gpt-3.5-turbo',
    }),
);

Make sure to replace <YOUR_OPENAI_API_KEY> with your actual OpenAI API key obtained in step 1.
Then restart your server and you will have a new endpoint at POST http://localhost:8080/chat-api that is powered by OpenAI's gpt-3.5-turbo model and ready for nlux integration.
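
Rather than hard-coding the key, you can read it from the OPENAI_API_KEY environment variable suggested in step 1. This is a minimal sketch, not part of the official nlbridge documentation; the only nlbridge API it relies on is the defaultMiddleware call already shown above:

import {defaultMiddleware} from '@nlbridge/express';

app.post('/chat-api',
    defaultMiddleware('openai', {
        // Read the key from the environment instead of committing it to source control.
        // OPENAI_API_KEY is the variable name suggested in step 1 of this guide.
        apiKey: process.env.OPENAI_API_KEY ?? '',
        chatModel: 'gpt-3.5-turbo',
    }),
);

Start the server with the variable set in your shell (for example via the export command from step 1) and the middleware will pick it up.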

It's important to note that the new endpoint is created with the POST method.
This is a requirement for the nlbridge integration.


Front-End Web App Steps

The following steps are specific to your front-end web application (either React JS or JavaScript).
This guide follows the React JS path.


4. Install nlux Packages

Now that we have a Node.js server running with the nlbridge middleware, we can create a chat component using nlux and connect it to the server.

If you don't have a React JS app set up yet, and you are looking for a quick way to get started, you can use Vite's react-ts template to quickly set up a React JS app.

Set up a React JS project with Vite

Use the following npm commands to set up a React JS app with Typescript using Vite's react-ts template:

npm create vite@latest my-ai-chat-app -- --template react-ts
cd my-ai-chat-app
npm install
npm run dev

The last command will start the development server (by default Vite serves the app at http://localhost:5173) and open the app in your default browser. Because the front end and the Express.js server run on different origins, the cors() middleware added in step 2 is what allows the browser to call the chat endpoint.

You can start by adding nlux and the nlbridge adapter to your React JS app using your favorite package manager. At the root of your project, run the following command:

npm install @nlux/react @nlux/nlbridge-react

This will install the @nlux/react and @nlux/nlbridge-react packages.


5. Import Component And Hook

Import the useChatAdapter hook and the AiChat component in your JSX file:

import {AiChat} from '@nlux/react';
import {useChatAdapter} from '@nlux/nlbridge-react';

The AiChat component is the main chat component that you will use to display the chat UI.
The useChatAdapter hook is used to create an adapter for the nlbridge API.


6. Create nlbridge Adapter

You can use the useChatAdapter hook to create an nlbridge adapter.
You can optionally import ChatAdapterOptions from @nlux/nlbridge-react to define the type of the options object.

import {useChatAdapter, ChatAdapterOptions} from '@nlux/nlbridge-react';

const adapterOptions: ChatAdapterOptions = {
    url: 'http://localhost:8080/chat-api',
};

export const App = () => {
    const nlbridgeAdapter = useChatAdapter(adapterOptions);
};

The ChatAdapterOptions interface has one required property: url. This is the URL of the nlbridge endpoint that the adapter should connect to.

In this example, we are connecting to the endpoint created in the previous section.
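
If your nlbridge endpoint does not run on http://localhost:8080 (for example in a deployed environment), only the url value needs to change. As a sketch, you could read it from a Vite environment variable; VITE_CHAT_API_URL is a hypothetical name chosen for this example, not something defined by nlux:

const adapterOptions: ChatAdapterOptions = {
    // Fall back to the local development server used throughout this guide.
    url: import.meta.env.VITE_CHAT_API_URL ?? 'http://localhost:8080/chat-api',
};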


7. Create Chat Component

Now that we have the nlbridge adapter, we will create the chat component and pass the adapter to it.

import {AiChat} from '@nlux/react';
import {useChatAdapter, ChatAdapterOptions} from '@nlux/nlbridge-react';

const adapterOptions: ChatAdapterOptions = {
    url: 'http://localhost:8080/chat-api',
};

export const App = () => {
    const nlbridgeAdapter = useChatAdapter(adapterOptions);

    return (
        <AiChat
            adapter={nlbridgeAdapter}
            promptBoxOptions={{
                placeholder: 'How can I help you today?'
            }}
        />
    );
};

The AiChat component can take several parameters:

  • The first parameter adapter is the only required parameter, and it is the adapter that we created earlier.
  • The second parameter that we provide here is an object that contains the prompt box options. In this case, we are passing a placeholder text to customize the prompt box.

For full documentation on how to customize the AiChat component, please refer to the AiChat documentation.


8. Add CSS Styles

nlux comes with a default CSS theme that you can use to style the chat UI. There are 2 ways to import the stylesheet, depending on your setup.

Using JSX Bundler

You can import it in your JSX component file by installing the @nlux/themes package:

npm install @nlux/themes

Then import the default theme nova.css in your React component:

import '@nlux/themes/nova.css';

This requires a CSS bundler, such as Vite or Webpack, that is configured to handle CSS imports for global styles. Most modern bundlers are configured to handle CSS imports.

Using CDN Link

Alternatively, you can include the CSS stylesheet in your HTML file.
We provide a CDN link that you can use to include the stylesheet in your HTML file:

<link rel="stylesheet" href="https://themes.nlux.ai/v1.0.0/nova.css" />

This CDN link is not meant for production use, and it is only provided for convenience. Make sure you replace it with the latest version of the stylesheet before deploying your app to production.


9. Run Your App

Your final code will look like this:

import {AiChat} from '@nlux/react';
import {useChatAdapter, ChatAdapterOptions} from '@nlux/nlbridge-react';
import '@nlux/themes/nova.css';

const adapterOptions: ChatAdapterOptions = {
    url: 'http://localhost:8080/chat-api',
};

export const App = () => {
    const nlbridgeAdapter = useChatAdapter(adapterOptions);

    return (
        <AiChat
            adapter={nlbridgeAdapter}
            promptBoxOptions={{
                placeholder: 'How can I help you today?'
            }}
        />
    );
};

You can now run your app and test the chatbot.
The result is a fully functional chatbot UI, with nlux handling all the UI interactions and the communication with the nlbridge server.
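
If you created the project with Vite's react-ts template, one way to wire things up is to put the App component above in src/App.tsx; the template's entry file src/main.tsx then mounts it roughly as follows (a sketch assuming React 18 and the default Vite template layout, so file names may differ in your project):

import React from 'react';
import {createRoot} from 'react-dom/client';
import {App} from './App';

// Mount the chat application on the root element provided by Vite's index.html.
createRoot(document.getElementById('root')!).render(
    <React.StrictMode>
        <App />
    </React.StrictMode>,
);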