fix: chat fails to load older messages when scrolling to the top #6861
base: develop
```diff
@@ -9,7 +9,6 @@ import updateMessages from './updateMessages';
 import { generateLoadMoreId } from './helpers/generateLoadMoreId';

 const COUNT = 50;
-const COUNT_LIMIT = COUNT * 10;

 async function load({ rid: roomId, latest, t }: { rid: string; latest?: Date; t: RoomTypes }): Promise<IMessage[]> {
 	const apiType = roomTypeToApiType(t);
```
```diff
@@ -21,7 +20,7 @@ async function load({ rid: roomId, latest, t }: { rid: string; latest?: Date; t: RoomTypes }): Promise<IMessage[]> {
 	let mainMessagesCount = 0;

 	async function fetchBatch(lastTs?: string): Promise<void> {
-		if (allMessages.length >= COUNT_LIMIT) {
+		if (allMessages.length >= COUNT) {
 			return;
 		}
```
|
Contributor

Critical: the batching logic is broken; this change prevents fetching multiple batches. Changing the check from `allMessages.length >= COUNT_LIMIT` to `allMessages.length >= COUNT` makes `fetchBatch` return as soon as the first batch of 50 messages has been fetched. This defeats the purpose of lines 51-59, which attempt to fetch additional batches when there are insufficient main messages (e.g., when many messages are threaded). The algorithm is designed to ensure at least `COUNT` main messages are accumulated.

🔎 Recommended fix

The original issue appears to be that the "load more" marker at line 79 was rarely created, because `data.length === COUNT` only matches when the result is exactly one full batch. Keep the accumulation limit and relax the marker condition instead:

```diff
+const COUNT_LIMIT = 500;
+
 async function fetchBatch(lastTs?: string): Promise<void> {
-  if (allMessages.length >= COUNT) {
+  if (allMessages.length >= COUNT_LIMIT) {
     return;
   }
```

And at line 79, use a more practical check:

```diff
-  if (!lastMessageRecord && data.length === COUNT) {
+  if (!lastMessageRecord && data.length >= COUNT) {
     const loadMoreMessage = {
```

This allows accumulating up to 500 messages across batches while adding the "load more" marker when we've fetched at least 50 messages (indicating more may be available).
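For reference, here is a minimal, self-contained sketch of the batching flow the reviewer describes. It is not the Rocket.Chat code: `Message`, `fetchPage`, `loadOlderMessages`, and the fake `history` are invented stand-ins; the point is only that the guard uses the hard cap (`COUNT_LIMIT`) while the loop keeps fetching until at least `COUNT` main (non-thread) messages have accumulated.

```typescript
// Hypothetical stand-ins for illustration; the real module's types and API differ.
interface Message {
	_id: string;
	ts: number; // timestamp; responses are newest first
	tmid?: string; // present on threaded replies
}

const COUNT = 50; // target number of main messages per load
const COUNT_LIMIT = COUNT * 10; // hard cap on total fetched messages (500)

// Fake paged history where 9 out of 10 messages are threaded replies.
const history: Message[] = Array.from({ length: 2000 }, (_, i) => ({
	_id: `msg-${i}`,
	ts: 2000 - i,
	tmid: i % 10 === 0 ? undefined : 'thread-root'
}));

// Fake API call: returns up to COUNT messages older than `before`.
async function fetchPage(before?: number): Promise<Message[]> {
	const older = before === undefined ? history : history.filter((m) => m.ts < before);
	return older.slice(0, COUNT);
}

async function loadOlderMessages(): Promise<Message[]> {
	const allMessages: Message[] = [];
	let mainMessagesCount = 0;

	async function fetchBatch(lastTs?: number): Promise<void> {
		// Guard on the hard cap. Guarding on COUNT instead (as this PR does)
		// stops after the first batch, so the thread-heavy case regresses.
		if (allMessages.length >= COUNT_LIMIT) {
			return;
		}
		const batch = await fetchPage(lastTs);
		if (batch.length === 0) {
			return; // no older history left
		}
		allMessages.push(...batch);
		mainMessagesCount += batch.filter((m) => !m.tmid).length;

		// Threaded replies do not count toward the page, so keep fetching
		// until at least COUNT main messages have accumulated.
		if (mainMessagesCount < COUNT) {
			await fetchBatch(batch[batch.length - 1].ts);
		}
	}

	await fetchBatch();
	return allMessages;
}

loadOlderMessages().then((msgs) => {
	const main = msgs.filter((m) => !m.tmid).length;
	console.log(`fetched ${msgs.length} messages, ${main} main`); // 500 fetched, 50 main
});
```

With this data the loop needs all ten batches to reach 50 main messages, which is exactly the multi-batch accumulation that a `>= COUNT` guard would cut off after the first request.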
```diff
@@ -77,7 +76,7 @@ export function loadMessagesForRoom(args: {
 	if (data?.length) {
 		const lastMessage = data[data.length - 1];
 		const lastMessageRecord = await getMessageById(lastMessage._id as string);
-		if (!lastMessageRecord && (data.length === COUNT || data.length >= COUNT_LIMIT)) {
+		if (!lastMessageRecord && data.length === COUNT) {
 			const loadMoreMessage = {
 				_id: generateLoadMoreId(lastMessage._id as string),
 				rid: lastMessage.rid,
```
Contributor

The condition change may fix the symptom, but it is coupled with the broken batching logic. The original check `data.length === COUNT || data.length >= COUNT_LIMIT` covered both the single-batch case and the capped multi-batch case. However, changing it to `data.length === COUNT` alone means no marker is added whenever the accumulated result is larger than one batch. With the batching restored, a `>=` comparison is the safer form:

```diff
-if (!lastMessageRecord && data.length === COUNT) {
+if (!lastMessageRecord && data.length >= COUNT) {
```
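A quick illustration of how the three candidate conditions behave for some plausible `data.length` values (the values are hypothetical, not taken from a real run):

```typescript
const COUNT = 50;
const COUNT_LIMIT = 500;

// 50 = one exact batch; 150 = three accumulated batches; 500 = cap reached.
for (const length of [50, 150, 500]) {
	const exactCheck = length === COUNT; // this PR's condition
	const originalCheck = length === COUNT || length >= COUNT_LIMIT; // pre-PR condition
	const proposedCheck = length >= COUNT; // reviewer's suggestion
	console.log({ length, exactCheck, originalCheck, proposedCheck });
}
// length 150: exactCheck=false, originalCheck=false, proposedCheck=true
```

This is consistent with the diagnosis above: once batches accumulate, `data.length` usually lands strictly between `COUNT` and `COUNT_LIMIT`, so both the old condition and this PR's condition skip the "load more" marker, while `>= COUNT` catches every case where more history is likely available.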
We've done this for a reason…
A few months ago, we had an issue with large threads being used as the last message.
If you test a thread with more than 100 replies as the last message, in develop and in this branch, you'll see that this change rolls back the fix.
We must find a smart way to make it work without overloading in long-thread cases. If you want, we can try to find a solution together on DM.
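One possible middle ground for the long-thread concern, sketched under assumptions: `MAX_EXTRA_BATCHES` and all names here are invented for illustration, not existing code. The idea is to bound the catch-up by request count rather than by a 500-message cap, so a thread with hundreds of replies costs at most a few extra requests instead of up to `COUNT_LIMIT` messages.

```typescript
// Hypothetical message shape, as in the earlier sketch.
interface Message {
	_id: string;
	ts: number;
	tmid?: string; // present on threaded replies
}

const COUNT = 50;
const MAX_EXTRA_BATCHES = 3; // assumed tuning knob: at most 4 requests total

async function loadOlderMessagesBounded(
	fetchPage: (before?: number) => Promise<Message[]>
): Promise<Message[]> {
	const allMessages: Message[] = [];
	let mainMessagesCount = 0;
	let extraBatches = 0;
	let lastTs: number | undefined;

	// Always fetch the first batch; fetch more only while main messages
	// are scarce and the request budget is not exhausted. A 100-reply
	// thread as the last message then costs a bounded number of requests.
	do {
		const batch = await fetchPage(lastTs);
		if (batch.length === 0) break; // no older history left
		allMessages.push(...batch);
		mainMessagesCount += batch.filter((m) => !m.tmid).length;
		lastTs = batch[batch.length - 1].ts;
	} while (mainMessagesCount < COUNT && extraBatches++ < MAX_EXTRA_BATCHES);

	return allMessages;
}
```

The trade-off is that a very thread-heavy stretch of history may yield fewer than `COUNT` main messages in one load, but the "load more" marker (with the `>= COUNT` condition discussed above) would still let the user pull the rest on the next scroll.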