The Knowledge Checker

A chat-based knowledge checker lets users test their understanding of a particular topic. The system can be adapted to different topics and domains, and it provides explanations and further information to help users improve their understanding.

Prompt

Check my knowledge on a specific topic. I really want to know whether I have understood every aspect of it. Provide me with open questions as well as multiple-choice questions on my chosen topic. I will then type my answer. Tell me when I am wrong, explain what I did not fully understand, and give the correct answer. Don't provide me with a long list of questions, though: one question at a time. After you explain whether and how I was wrong, provide the next question, and continue until I ask you to stop.
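The same question-answer-feedback loop can be scripted rather than run by hand in a chat window. Below is a minimal sketch of such a loop, assuming the `openai` Python client; the condensed system prompt and the model name are placeholders, not part of the original prompt.

```python
# Minimal sketch of a knowledge-checker chat loop, assuming the
# `openai` Python client; the model name below is a placeholder.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "Check my knowledge on a specific topic. Ask one question at a time, "
    "mixing open and multiple-choice questions. When I answer, tell me if "
    "I am wrong, explain the correct answer, then ask the next question."
)

messages = [{"role": "system", "content": SYSTEM_PROMPT}]

while True:
    user_input = input("You: ")
    if user_input.lower() in {"stop", "quit"}:
        break
    messages.append({"role": "user", "content": user_input})
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=messages,
    )
    answer = reply.choices[0].message.content
    # Keep the assistant turn in the history so follow-up questions
    # stay consistent with earlier feedback.
    messages.append({"role": "assistant", "content": answer})
    print("Checker:", answer)
```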

Welcome Message

Sure, I can help you with that. What is the specific topic you want to test your knowledge on?

Conversation

ASSISTANT: Sure, I can help you with that. What is the specific topic you want to test your knowledge on?

USER: Transformers

ASSISTANT: Great! Here's the first question:

  1. What is a Transformer in the context of machine learning?

A. A type of power converter that changes electrical energy from one circuit to another

B. A type of neural network architecture that uses attention mechanisms to process input sequences

C. An algorithm used for image recognition tasks

D. A tool for data visualization

USER: B

ASSISTANT: That's correct! A Transformer is a neural network architecture that uses attention mechanisms to process input sequences. Here's another question:

  2. What is the purpose of the "query", "key", and "value" vectors in the self-attention mechanism used by Transformers?

A. They represent different parts of the input sequence

B. They are used to weight the importance of different input elements

C. They are used to calculate the gradients during backpropagation

D. They are not used in the self-attention mechanism
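For readers who want to see the query/key/value mechanics concretely, here is a minimal NumPy sketch of scaled dot-product self-attention. It illustrates the standard formulation with toy random weights; it is not part of the original conversation.

```python
# Minimal NumPy sketch of scaled dot-product self-attention,
# illustrating the roles of the query, key, and value vectors.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    Q = X @ W_q          # queries: what each position is looking for
    K = X @ W_k          # keys: what each position offers for matching
    V = X @ W_v          # values: the content that gets mixed together
    d_k = K.shape[-1]
    # Attention weights: how strongly each position attends to the others.
    weights = softmax(Q @ K.T / np.sqrt(d_k))
    return weights @ V   # weighted combination of the value vectors

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
X = rng.normal(size=(seq_len, d_model))            # toy input sequence
W_q, W_k, W_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
print(self_attention(X, W_q, W_k, W_v).shape)      # (4, 8)
```

The softmax over query-key dot products is what turns the keys into per-position importance weights, which are then applied to the value vectors.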