Integrating stuff with ChatGPT

Ways to connect people & tools to ChatGPT

1. Google Sheets

See some simple guides below on how to integrate GPT into Google Sheets.

Note that an OpenAI account (ChatGPT) and a Google account (Gmail) are required. See here, here and here on YouTube for guides.

2. Microsoft Teams

See this guide for integrating GPT with Microsoft Teams.

3. Databases via LangChain

LangChain is a framework for integrating Large Language Models with components such as chatbots, data stores and document containers. For data stores that sit outside the LLM, a useful guide on how to merge the two can be found here.

See qabot for an example of a data store integrated with an LLM. A useful guide on how to build this kind of integration can be found here.
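The guides above cover the full LangChain tooling; the core pattern they automate can be sketched in plain Python. This is a minimal, hypothetical sketch (with sqlite3 standing in for a real database, and the LLM call left out): introspect the schema, then assemble a prompt asking the model to write SQL.

```python
import sqlite3

# Sketch of the pattern LangChain's SQL chains automate: give the LLM
# the database schema as context and ask it to generate a query.
# The actual LLM call is omitted; only the prompt assembly is shown.

def describe_schema(conn: sqlite3.Connection) -> str:
    """Return the CREATE statements for every table in the database."""
    rows = conn.execute(
        "SELECT sql FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    return "\n".join(r[0] for r in rows)

def build_sql_prompt(schema: str, question: str) -> str:
    """Assemble the prompt an SQL chain would send to the model."""
    return (
        "Given the following database schema:\n"
        f"{schema}\n\n"
        f"Write a SQL query that answers: {question}"
    )

# Hypothetical table for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE coins (symbol TEXT, market TEXT, price REAL)")

prompt = build_sql_prompt(describe_schema(conn), "Which coins trade on Binance?")
print(prompt)
```

The model's reply (a SQL string) would then be executed against the database and its result fed back into a second prompt to produce a natural-language answer; that round trip is what the chain objects in LangChain wrap up.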

4. OpenAI APIs

See here for a description of the APIs offered by OpenAI for its range of products such as GPT, DALL·E and Whisper.

A quickstart guide can be viewed here, and a set of more detailed tutorials and samples is also available.
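As a taste of what the quickstart walks through, here is a hedged sketch of calling the Chat Completions endpoint using only the standard library. The endpoint URL and request shape follow the public API docs; the API key is a placeholder, and the network call is wrapped in a function rather than executed.

```python
import json
import urllib.request

API_URL = "https://api.openai.com/v1/chat/completions"

def build_request(question: str, model: str = "gpt-3.5-turbo") -> dict:
    """Build the JSON body the Chat Completions API expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": question}],
    }

def ask(question: str, api_key: str) -> str:
    """Send the request and return the first reply (needs a real key)."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_request(question)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Inspect the payload without making a network call.
payload = build_request("What symbol is linked to the Binance market?")
print(json.dumps(payload, indent=2))
```

In practice the official `openai` Python package wraps all of this; the raw request is shown only to make the shape of the API visible.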

5. MRKL (Modular Reasoning, Knowledge and Language)

MRKL is a technique for integrating external modules and data stores to work alongside Large Language Models like GPT. See this page from AI21 Labs for more details and examples of what can be done using their tools.
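The core MRKL idea is a router that sends each question to the right "expert" module (a calculator, a database, an API) instead of letting the language model guess at everything. Here is a toy, keyword-based sketch of that routing; real MRKL systems use the LLM itself to decide the route, and both expert modules below are invented for illustration.

```python
# Toy MRKL-style router: pick an expert module per question.

def calculator(question: str) -> str:
    # Hypothetical arithmetic expert: strip everything but the maths.
    expr = "".join(c for c in question if c in "0123456789+-*/. ")
    return str(eval(expr))  # fine for a toy; never eval untrusted input

def database_lookup(question: str) -> str:
    # Hypothetical expert backed by an external data store.
    prices = {"BTC": 27000, "ETH": 1800}
    for symbol, price in prices.items():
        if symbol in question:
            return f"{symbol} price: {price}"
    return "symbol not found"

def route(question: str) -> str:
    """Send the question to the module whose keywords match it."""
    if any(op in question for op in "+-*/"):
        return calculator(question)
    if "price" in question.lower():
        return database_lookup(question)
    return "fall back to the language model"

print(route("What is 2 + 3?"))          # handled by the calculator expert
print(route("What is the BTC price?"))  # handled by the database expert
```

The payoff is that questions a bare LLM answers unreliably (arithmetic, live data) get handled by modules that are actually good at them.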

6. Multiple Content stores (Asana, Elasticsearch, Notion, Google Drive, ...)

Llama.ai is a library for integrating multiple content stores and tools with LLMs such as GPT. An example of integrating a Postgres database can be found here.

[Update]

I built a simple POC using the llama.ai library, connecting to a Postgres database hosted on AWS. The database held market price and lending data for multiple crypto markets.

Some data on crypto coins was extracted using SQL and loaded into a custom index which was then shared with GPT via the llama.ai library.
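The extract-and-index step above can be sketched as follows, with sqlite3 standing in for the AWS-hosted Postgres database and the table contents invented for illustration: pull rows out with plain SQL, then flatten each row into a small text document of the kind an index library can ingest.

```python
import sqlite3

# Stand-in for the Postgres database described above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE coins (symbol TEXT, market TEXT, price REAL)")
conn.executemany(
    "INSERT INTO coins VALUES (?, ?, ?)",
    [("BNB", "Binance", 240.0), ("BTC", "Bitfinex", 27000.0)],
)

# Step 1: extract with plain SQL.
rows = conn.execute("SELECT symbol, market, price FROM coins").fetchall()

# Step 2: flatten each row into a plain-text document for the index;
# llama-style libraries build their index over lists of such documents.
documents = [
    f"Coin {symbol} trades on the {market} market at price {price}."
    for symbol, market, price in rows
]
for doc in documents:
    print(doc)
```

The actual index construction and the hand-off to GPT happen inside the library; the point of the sketch is that what the LLM ultimately sees is text derived from the SQL results, not the database itself.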

Once the index was created and shared with GPT, questions could be posed against the shared data, e.g. "what symbol is linked to the Binance market?" -> the answer returned was "BNN".

Observations:

  • Specific queries get specific answers. I asked about coins and markets stored in the shared database and got specific, correct answers. More general questions like "how many coins are traded on Bitfinex markets?" did not receive a high-quality or specific answer. My guess is that the llama.ai integration with the pre-trained GPT model is not yet advanced enough to merge the two data sources easily. For example, I could get a (qualified) answer of 300 coins drawn from GPT's 2021 training cutoff, which could not be replicated by queries run through llama.ai.

  • It's slow. For whatever reason (my computer, my network connection, the llama.ai code, the GPT model, OpenAI API performance, ...) queries took several minutes to run, which is far slower than issuing queries directly via ChatGPT.

  • It's vague, at times. An answer could come from the GPT language model OR from the supplied database contents OR both, but it is not always clear where an answer, or the lack of a clear one, is coming from. Some form of attribution would help here, as serious business decisions cannot be based on non-attributed sources.

  • It's fun. Testing these kinds of AI-model-to-system integrations and seeing what works (or doesn't) is genuinely enjoyable. As in the early days of building internet or mobile apps, there is a lot of discovery plus a few WOW moments to be had.
