Documentation

Quick Start

Welcome to the Smarter documentation! This page will give you an introduction to the 80% of Smarter concepts that you will use on a daily basis.

You will learn

  • How to install the Smarter command-line interface
  • What manifests are
  • How to create a chatbot
  • How to deploy a chatbot
  • How to train a chatbot
  • How to test a chatbot
  • How to measure the quality of your chatbot's responses

Step I of IV: Install & Configure the CLI

Learning objective: install the CLI, add your API key, and confirm that it works. The Smarter command-line interface is a terminal application that you install on your desktop or laptop computer.

Configure the CLI with your Smarter API key, and then use it to create and manage your cloud resources on the Smarter platform.

Step II of IV: Create a ChatBot Manifest

Learning objective: download an example Chatbot manifest, customize it to taste, and then save it as a text file on your computer.

You describe Smarter cloud resources like ChatBots and Plugins using a Smarter Manifest. Manifests are formatted text files that look like a form. Describe a resource with a Manifest, and Smarter will create it for you. Modify your Manifest and re-apply it, and Smarter will update your resource to match. It's a profoundly simple yet powerful concept that lets you achieve the same caliber of result as a Python-programming data scientist, without actually having to write any code.

Let's look at an example

All Manifests, regardless of the kind of Smarter resource they describe, have sections for 'metadata', 'spec' and 'status'. Manifests use YAML formatting, which has the distinct claim to fame of being both human and machine readable. YAML can be converted to/from JSON, a common text serialization format for working with cloud platforms like Smarter. Let's take a look at Smarter's example `Chatbot` manifest, available both from the CLI, `smarter manifest chatbot -o yaml`, and from the Smarter docs site, https://beta.platform.smarter.sh/docs/manifest/chatbot/. Both sources are identical. Using any plain text editor, copy-paste the manifest output into a new text file and save it to your local computer's hard drive as 'my-chatbot.yml'.
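The YAML-to-JSON equivalence mentioned above can be sketched in a few lines of Python. This is a toy illustration only: the manifest field values shown are hypothetical placeholders, not Smarter's actual schema, and the PyYAML library is assumed to be installed.

```python
import json

import yaml  # PyYAML; assumed to be installed (pip install pyyaml)

# A minimal, hypothetical manifest skeleton -- NOT Smarter's actual schema.
manifest_yaml = """\
apiVersion: example/v1
kind: Chatbot
metadata:
  name: my-chatbot
spec:
  config: {}
"""

manifest = yaml.safe_load(manifest_yaml)  # YAML text -> Python dict
as_json = json.dumps(manifest, indent=2)  # Python dict -> JSON text
round_trip = yaml.safe_load(as_json)      # JSON is itself valid YAML, so this parses too

assert round_trip == manifest  # nothing was lost in translation
```

Because JSON is a subset of YAML, the round trip is lossless for data like this, which is why the two formats are interchangeable when talking to cloud platforms.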

The easiest, best way to create a new Smarter resource, be it a Chatbot or any other kind of resource, is to obtain and then modify one of Smarter's example manifests. At a glance you can see that much of the Chatbot configuration dwells on its look and feel. You can enhance your bot's knowledge using any combination of functions and plugins, which we'll cover in the next tutorial. Let's quickly summarize the contents of a Chatbot manifest:

  • apiVersion: required; identifies the layout of the Manifest for the Smarter API backend.
  • kind: required; identifies what kind of resource this Manifest describes.
  • metadata: all of these fields are required, though none of them has any effect on the look & feel or the behavior of your chatbot.
  • spec: this section contains the entire "specification", which we'll cover item by item:
    • apiKey: you can optionally restrict access to your Chatbot by creating and attaching a Smarter API key. We'll ignore this for now, so our Chatbot will be publicly accessible.
    • field names that begin with 'app' describe the look & feel of your chatbot.
    • field names that begin with 'default' govern the macro settings of the LLM that backs your Chatbot. One of Smarter's most powerful features is the ability to attach any of a broad range of LLMs from reputable providers like OpenAI, Google, and HuggingFace.
    • subdomain: you can optionally attach a custom vanity domain name to your Chatbot. We'll skip this for now.
    • functions: large language models like ChatGPT can only respond to inquiries within the context of their training data, so no LLM can provide accurate real-time information for things like weather, sports scores, and securities prices. Functions are one of a couple of ways to enhance the knowledge of your Chatbot. The example Chatbot loads a special built-in "Get Current Weather" function that provides real-time 7-day weather forecasts for most locations around the world. Smarter provides function results to the LLM as an interim step, so that the LLM can incorporate the data into its final response to the end user. Note that functions are included in every prompt request, increasing the token count of each request and thus the overall cost of the LLM backing service for your Chatbot solution. The LLM, at its sole discretion, decides whether or not to actually use the functions that you include in requests; it likewise has complete discretion over whether to incorporate the function results into its final response to the end user. Lastly, an editorial comment: take note of the appExamplePrompts field, as this is how you convey your Chatbot's special knowledge or capabilities to the end user.
    • plugins: Plugins are a more advanced, more configurable variation of Smarter functions. You're really going to love these, but for the sake of brevity we'll defer them to a later tutorial. For now, just delete the entire section.
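Putting those sections together, a trimmed-down Chatbot manifest might look roughly like this. Treat it strictly as an orientation sketch: the apiVersion value, the nesting, and any field name not discussed above (such as defaultModel) are hypothetical placeholders, so always start from the real example produced by `smarter manifest chatbot -o yaml`.

```yaml
apiVersion: smarter.sh/v1      # hypothetical value; use whatever the example manifest shows
kind: Chatbot                  # required: what kind of resource this Manifest describes
metadata:
  name: my-chatbot             # required, but has no effect on look & feel or behavior
spec:
  apiKey: null                 # optional access restriction; omitted, so the bot is public
  subdomain: null              # optional vanity domain; we're skipping it for now
  appExamplePrompts:           # 'app*' fields: look & feel; these advertise the bot's skills
    - "What's the weather like this week?"
  defaultModel: gpt-4          # 'default*' fields: macro settings for the backing LLM (name hypothetical)
  functions:
    - "Get Current Weather"    # the built-in function loaded by the example Chatbot
  # plugins: section deleted for now, per the note above
```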

Step III of IV: Deploy your Chatbot

Learning objective: create and verify your new Chatbot.

We'll use the CLI 'apply' command to create our resource. 'Apply' compares the details in your manifest to the resources in your user account. If the resource does not yet exist, Smarter will create it. If, on the other hand, it does exist, Smarter will perform a detailed comparison of the fields in the 'spec', and if these do not exactly match, it will make whatever changes are necessary to bring the resource in line with your Manifest.
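The apply semantics just described can be sketched as a toy reconcile function. This is purely illustrative of the create-or-update decision, not Smarter's actual implementation, and the manifest dictionaries are made-up examples.

```python
from typing import Optional


def reconcile(desired: dict, current: Optional[dict]) -> str:
    """Toy sketch of 'apply' semantics: create if absent, update on spec drift."""
    if current is None:
        return "create"   # resource doesn't exist yet, so Smarter creates it
    if desired["spec"] != current["spec"]:
        return "update"   # spec drifted; bring the resource in line with the manifest
    return "no-op"        # manifest and resource already match exactly


# Hypothetical manifest dicts, just to exercise the three branches.
manifest = {"kind": "Chatbot", "spec": {"subdomain": None}}

print(reconcile(manifest, None))                             # -> create
print(reconcile(manifest, {"kind": "Chatbot", "spec": {}}))  # -> update
print(reconcile(manifest, manifest))                         # -> no-op
```

This declarative compare-then-reconcile pattern is the reason you can safely re-run apply: an unchanged manifest is a no-op rather than a duplicate resource.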

To deploy: `smarter apply -f my-chatbot.yml`

Your Chatbot is now fully deployed. It is available to the general public, and it will run at scale. All operations surrounding your Chatbot are journaled, and you can inspect these by ....

Technically speaking, we've actually combined a couple of steps: we created the Chatbot and set its state to 'deployed' in a single operation. To see a list of your chatbots, run `smarter get chatbots`. To view the details of your Chatbot, run `smarter describe chatbot EverlastingGobstopper`.

Step IV of IV: Try It Out in the Sandbox

Learning objective: get hands-on with the Smarter web console Sandbox.

Time to get our hands dirty. Let's send some prompts and see what happens. Navigate to the web console sandbox -- ???? --- and, as the prompt text perhaps still reads, "ask me anything". The key difference between the Smarter web console Sandbox and a more typical run-time web UX is that from inside the Sandbox you can inspect detailed log output for each of the interim steps of a prompt conversation. That is:

  • natural language pre-processing of the prompt message
  • smarter Plugin 'selector' analysis
  • full preparation of the prompt
  • interim steps (if any) whereby the LLM makes requests to smarter functions and Plugins
  • receipt of the raw LLM response
  • delivery of the final response to the end user

Complete visibility into these details is immensely helpful during development of your text prompting solution.