
How to Speak Bot


BotSpeak

How to contribute to the BotSpeak project

Run some of the phrases we provide related to semantic linguistic concepts with your favorite image-generating large language model, and send the image results, along with your interpretation of how well the LLM interpreted the phrase, to ai at skunks.ai

The semantic linguistic concept phrase list is here

How to Speak Bot

Large language models (LLMs) represent a major advance in artificial intelligence (AI) and are the basis of powerful tools such as ChatGPT and DALL-E. A large language model is a deep learning algorithm that can recognize, summarize, translate, predict, and generate text and other content based on knowledge gained from massive datasets. Large language models are among the most successful applications of transformer models. It is sometimes said that deep learning models are "just statistics," and that any progress in AI is illusory with regard to Turing-test-like artificial general intelligence (AGI).

Here I take the view that LLMs are very powerful and useful tools, but that to use them effectively one needs to understand how they "think." Just as one often needs to learn another language to communicate effectively in another country, one needs to learn "botspeak" to use these tools effectively.

To explore botspeak, I will test several LLMs on several formal semantic linguistic concepts and visualize the output. These are concepts that very young children learn quite naturally. Specifically, the LLMs will be prompted with phrases related to the following semantic linguistic concepts: anaphora, ambiguity, binding, conditionals, definiteness, disjunction, evidentiality, focus, indexicality, lexical semantics, modality, negation, propositional attitudes, tense–aspect–mood, quantification, and vagueness.
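The testing procedure above can be sketched in Python. This is a minimal sketch, not the project's actual code: the phrase lists below are illustrative placeholders (only the anaphora phrase is drawn from this document), and `generate_image` is a hypothetical stand-in for the API of whichever image-generating model is being tested.

```python
# Sketch of the BotSpeak testing loop: map each semantic concept to test
# phrases, then send each phrase to an image-generating model.
# Phrases marked "placeholder" are illustrative, not the official list.
CONCEPT_PHRASES = {
    "anaphora": ["Cinderella dropped the plate. It shattered loudly."],
    "negation": ["There is no cat on the mat."],       # placeholder phrase
    "quantification": ["Every dog chased some cat."],  # placeholder phrase
}

def generate_image(prompt: str) -> str:
    """Hypothetical stand-in for a real image-generation API call
    (e.g. DALL-E); here it just returns a stub string."""
    return f"<image for: {prompt}>"

def run_tests(phrases: dict[str, list[str]]) -> dict[str, list[str]]:
    """Return one generated image (here, a stub string) per phrase."""
    return {concept: [generate_image(p) for p in ps]
            for concept, ps in phrases.items()}

results = run_tests(CONCEPT_PHRASES)
```

In practice, readers would substitute the phrases from the BotSpeak phrase list and replace `generate_image` with a call to their chosen model, then inspect each returned image against the intended reading of the phrase.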

I tested the phrases with two art bots, DALL-E and artspace.ai. Readers are encouraged to test LLMs themselves with phrases related to the semantic concepts. These phrases and our results can be found at the AI Skunkworks BotSpeak GitHub repository.

Anaphora

Anaphora is the use of an expression whose interpretation depends upon another expression in context (its antecedent or postcedent). In a narrower sense, anaphora is the use of an expression that depends specifically upon an antecedent expression and thus is contrasted with cataphora, which is the use of an expression that depends upon a postcedent expression. The anaphoric (referring) term is called an anaphor.

For example, in the sentence Cinderella arrived, but nobody saw her, the pronoun her is an anaphor, referring back to the antecedent Cinderella. In the sentence Before her arrival, nobody saw Cinderella, the pronoun her refers forward to the postcedent Cinderella, so her is now a cataphor (and an anaphor in the broader, but not the narrower, sense). Usually, an anaphoric expression is a pro-form or some other kind of deictic (contextually dependent) expression.[1] Both anaphora and cataphora are species of endophora, referring to something mentioned elsewhere in a dialog or text.

Anaphora (in the narrow sense, a species of endophora):

a. Cinderella dropped the plate. It shattered loudly. – The pronoun it is an anaphor; it points to the left toward its antecedent the plate.
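To illustrate what resolving such an anaphor involves, the toy heuristic below links each pronoun to the nearest noun phrase that precedes it in the text. This is a simplified sketch for exposition only (real anaphora resolution requires syntax, gender/number agreement, and discourse context), and the function name and candidate list are my own, not part of the BotSpeak project.

```python
import re

# A small set of pronouns that commonly act as anaphors.
PRONOUNS = {"it", "he", "she", "her", "him", "they", "them"}

def naive_antecedent(text: str, noun_phrases: list[str]) -> dict[str, str]:
    """Toy heuristic: link each pronoun to the closest noun phrase
    whose first occurrence precedes the pronoun in the text."""
    lower = text.lower()
    links = {}
    for match in re.finditer(r"\b\w+\b", lower):
        if match.group(0) in PRONOUNS:
            # Candidate antecedents: noun phrases occurring before the pronoun,
            # sorted by position so the last one is the most recent.
            preceding = sorted(
                (lower.find(np.lower()), np)
                for np in noun_phrases
                if 0 <= lower.find(np.lower()) < match.start()
            )
            if preceding:
                links[match.group(0)] = preceding[-1][1]
    return links

links = naive_antecedent(
    "Cinderella dropped the plate. It shattered loudly.",
    ["Cinderella", "the plate"],
)
print(links)  # "it" resolves to "the plate", the nearest preceding noun phrase
```

Note that this heuristic fails on exactly the cases that make anaphora interesting, such as the cataphoric "Before her arrival, nobody saw Cinderella," where the pronoun precedes its referent.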

The Anaphora linguistic concept phrase list that we currently use is: