feat: add chat components #4
Conversation
Force-pushed from 3be0b94 to 0d10cb2
// Export the reducer generated by createSlice.
export const {
  reducer,
} = learningAssistantSlice;
I'm imagining that this reducer could be added here: https://github.com/openedx/frontend-app-learning/blob/master/src/store.js. That way we don't need to have a separate store.
That seems like the right place.
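For illustration, here is a rough sketch of what that wiring could look like in frontend-app-learning's src/store.js. The import path for this library's reducer and the surrounding store shape are assumptions, not taken from either repo:

// Hypothetical sketch: registering this library's slice reducer in the
// frontend-app-learning root reducer, so no separate store is needed.
import { configureStore } from '@reduxjs/toolkit';

// Assumed import path for the reducer exported by this package.
import { reducer as learningAssistantReducer } from '@edx/frontend-lib-learning-assistant/data';

export default function initializeStore(preloadedState) {
  return configureStore({
    reducer: {
      // ...existing frontend-app-learning reducers go here...
      // The key must match what selectors and thunks use,
      // i.e. state.learningAssistant (see the thread below).
      learningAssistant: learningAssistantReducer,
    },
    preloadedState,
  });
}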
  `${getConfig().CHAT_RESPONSE_URL}/${courseId}`,
);

const { data } = await getAuthenticatedHttpClient().post(url.href, payload);
Should the payload (aka the message list) be in a JSON string?
I think because the message_list is a simple list whose objects' keys map to basic JavaScript primitives, it isn't required. But it would be more flexible for the future to send a JSON string because we could represent more complex objects in the POST data. It looks like our REST API should, by default, support JSON, according to our REST API conventions. Since it's not necessary, I think we could either make the improvement now or do it in v2.
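To make the option concrete, here is a hedged sketch of what explicit JSON encoding could look like. postMessageList is a hypothetical helper name, and the axios-based client already serializes plain objects to JSON by default, so this mostly buys explicitness and room for richer payloads later:

import { getConfig } from '@edx/frontend-platform';
import { getAuthenticatedHttpClient } from '@edx/frontend-platform/auth';

// Sketch: JSON-encode the message list explicitly instead of relying on
// the client's default object-to-JSON serialization.
async function postMessageList(courseId, messageList) {
  const url = new URL(`${getConfig().CHAT_RESPONSE_URL}/${courseId}`);
  const payload = JSON.stringify({ message_list: messageList });
  const { data } = await getAuthenticatedHttpClient().post(
    url.href,
    payload,
    { headers: { 'Content-Type': 'application/json' } },
  );
  return data;
}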
I left some comments and questions. I'd like to better understand this code before giving a thumbs-up. Thank you!
src/data/thunks.js
Outdated
export function getChatResponse(courseId) {
  return async (dispatch, getState) => {
    const { messageList } = getState().learningAssistantState;
I noticed that in some places we reference getState().learningAssistantState (as here), and in other places, like Xpert.jsx, we reference state.learningAssistant. Shouldn't these be the same?
And, to check my understanding, whatever we choose as the key - learningAssistant or learningAssistantState - that should be the key value for the slice reducer in the root reducer in frontend-app-learning, right?
Hmm, I took a look at the learning MFE, and it looks like I should be using state.learningAssistant 🤔 getState() is used, but only in tests. I'll update the component just to be consistent.
And yes, your understanding is correct!
So after taking a bit of a deeper look, it looks like getState is what can be used in thunks. Here is an example: https://github.com/openedx/frontend-app-learning/blob/6232b0cb98db4880dff07d25414413b9cb82afbb/src/courseware/data/thunks.js#L155. The key should map to the correct reducer, so I've updated all of them to use learningAssistant. The other way of accessing state came from looking at other frontend-app-learning components, like https://github.com/openedx/frontend-app-learning/blob/6232b0cb98db4880dff07d25414413b9cb82afbb/src/course-home/outline-tab/OutlineTab.jsx#L33-L36
Got it. Thanks for the examples! Yeah, the mismatch between learningAssistant and learningAssistantState is what caught my eye, so I'm glad they'll be consistent now.
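For reference, a small sketch of the two access patterns settled on above, both keyed on learningAssistant; MessageCount is a made-up component for illustration:

import React from 'react';
import { useSelector } from 'react-redux';

// In a thunk (src/data/thunks.js), state is read with getState(),
// since React hooks aren't available outside components.
export function getChatResponse(courseId) {
  return async (dispatch, getState) => {
    const { messageList } = getState().learningAssistant;
    // ...POST messageList for courseId, then dispatch the response...
  };
}

// In a component (e.g. Xpert.jsx), the same slice is read with useSelector.
export const MessageCount = () => {
  const { messageList } = useSelector(state => state.learningAssistant);
  return <span>{messageList.length}</span>;
};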
src/data/slice.js
Outdated
const initialMessage = {
  role: 'assistant',
  content: 'Hello! What can I help you to understand today?',
  timestamp: new Date(),
Do we have any guidance from UX on what the initial message should be? I assume we'll want this to be different (e.g. introduce itself as the Xpert tool, etc.) We can always change it later; just checking if we could add that in now.
Ah, I see now that we have a different message shown in the frontend than what's being sent to the backend.
Does it even make sense to have an initial assistant message then? Or can we just send over the initial user message as the first message? Or should we make the initialMessage here match what's being shown to the learner in the UI?
No current guidance, but I think it would be fairly easy to update once we get a review.
What do you mean by the message being shown in the frontend being different? The initial message should be part of the message list. Do you mean the prompt is written in a way that the LLM responds with a question first?
I was referring to the "Hi, I'm Xpert! Stuck on a concept? Need more clarification on a complicated topic? Let's chat!" message that's part of the Sidebar component. Originally, I thought we were styling that to look like the assistant's first message to the user, but it looks like that's not being displayed as an actual message from the LLM based on reviewing the hackathon demo - it's just at the top of the sidebar. I was confused why we had two "intro" messages, but I see now that they're used in two different ways. That answers my question. Sorry about that!
I am curious about why we need to send an assistant message at all. Why can't the learner supply the first message? I'd expect the order of roles in a list of messages to be system, user, assistant, user, assistant, and so on. I guess it's so that the message flow makes more sense to the user, but I wonder about the impact of actually supplying the assistant's first message to the LLM and how that would affect the conversation. It feels like it's system-prompty in a way, because it's not generated by the LLM and could affect the conversation, but we can always wait and see.
Got it, thank you for that clarification!
It looks like course explorer isn't sending an initial message after the initial user acknowledgement is shown, so I can remove this.
It may be worth A/B testing at some point to understand whether the message being sent has an impact.
If we remove the initialMessage, I think there will be no "intro" message from the assistant to the learner, because the message list is currently rendered based on the Redux state. I worry that would be a worse learner experience, actually. We could add in this message in the ChatBox component to ensure that a prompt is rendered to the user and that the message isn't sent to the LLM. Do you think that's too complicated, and we should just leave it as is?
I think it's fine to leave as is. We can use the user acknowledgement as an introduction to the tool (this is what course explorer is doing).
I don't understand. I thought course explorer wasn't sending an initial message from the assistant based on your earlier comment, but leaving this code as is would send an initial message from the assistant. Am I missing something?
😅 sorry for the confusion - I've removed the initial message locally, so that's what I meant by "leave as is". My conclusion is that we should remove the initial message, and use the user acknowledgement to explain the tool as is done with course explorer.
That makes sense! Thanks for clearing that up.
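For completeness, here is a sketch of the alternative floated above: an intro rendered by the ChatBox component that never enters the Redux message list, so it is displayed to the learner but never sent to the LLM. The Message stand-in and its props are assumptions:

import React from 'react';
import { useSelector } from 'react-redux';

// Minimal stand-in for the real message component.
const Message = ({ role, content }) => (
  <div className={`message ${role}`}>{content}</div>
);

// Hypothetical ChatBox: the static intro lives outside messageList,
// so it renders in the UI but is never POSTed to the backend.
const INTRO = {
  role: 'assistant',
  content: 'Hello! What can I help you to understand today?',
};

const ChatBox = () => {
  const { messageList } = useSelector(state => state.learningAssistant);
  return (
    <div>
      <Message role={INTRO.role} content={INTRO.content} />
      {messageList.map(({ role, content, timestamp }) => (
        <Message key={timestamp.toString()} role={role} content={content} />
      ))}
    </div>
  );
};

export default ChatBox;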
const { messageList, currentMessage } = useSelector(state => state.learningAssistant);
const chatboxContainerRef = useRef(null);

useEffect(() => {
What is this function doing? I think it's scrolling to the appropriate spot in the sidebar when the amount of the chat history visible is smaller than the entire chat history (i.e. there's a scrollbar), but the inclusion of the startTime, endTime, and duration in calculating where to scroll to is confusing to me. Could we leave a comment explaining it?
Yeah, I copied this over but can try to understand what it's doing and leave a comment!
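As a guess at what the copied effect is doing, here is a commented reconstruction of an animated scroll-to-bottom using startTime, endTime, and duration. It assumes the chatboxContainerRef and messageList from the snippet above and is not the actual implementation:

useEffect(() => {
  const container = chatboxContainerRef.current;
  // Only scroll when the chat history overflows the visible area,
  // i.e. when a scrollbar exists.
  if (!container || container.scrollHeight <= container.clientHeight) { return; }

  const start = container.scrollTop;
  const end = container.scrollHeight - container.clientHeight;
  const duration = 200; // animation length in ms
  const startTime = performance.now();
  const endTime = startTime + duration;

  // Step scrollTop toward the bottom on each animation frame until
  // endTime is reached.
  const step = (now) => {
    const progress = Math.min((now - startTime) / (endTime - startTime), 1);
    container.scrollTop = start + (end - start) * progress;
    if (progress < 1) { requestAnimationFrame(step); }
  };
  requestAnimationFrame(step);
}, [messageList]);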
Force-pushed from 0d10cb2 to 4980bda
@@ -1,5 +1,3 @@
import { messages as headerMessages } from '@edx/frontend-component-header';
Should we remove their SCSS file imports (and that of the example index.scss file) from index.scss too?
I removed the import from index.jsx. Is there another place where src/example/index.scss is being imported? The entirety of src/example/index.scss is removed too.
Sorry, I just got what you were talking about - let me remove those!
I left one comment about the index.scss file, but it looks good otherwise.
Force-pushed from 4980bda to 6ad9330
MST-2031
Add necessary components for the chat UI. Rearranged components so that the Redux pieces live in a separate file. Testing will be added as part of https://2u-internal.atlassian.net/browse/MST-2034.