app/lib/methods/loadMessagesForRoom.ts (5 changes: 2 additions & 3 deletions)
@@ -9,7 +9,6 @@ import updateMessages from './updateMessages';
import { generateLoadMoreId } from './helpers/generateLoadMoreId';

const COUNT = 50;
-const COUNT_LIMIT = COUNT * 10;
Contributor:
We've done this for a reason…
A few months ago, we had an issue with large threads being used as the last message.
If you test a thread with more than 100 replies as the last message in develop and in this branch, you'll see that this change rolls back the fix.

[Screenshots: develop vs. this branch (Simulator, iPhone 16)]

We need to find a smart way to make this work without overloading in long-thread cases; if you want, we can try to find a solution together over DM.


async function load({ rid: roomId, latest, t }: { rid: string; latest?: Date; t: RoomTypes }): Promise<IMessage[]> {
	const apiType = roomTypeToApiType(t);
@@ -21,7 +20,7 @@ async function load({ rid: roomId, latest, t }: { rid: string; latest?: Date; t:
	let mainMessagesCount = 0;

	async function fetchBatch(lastTs?: string): Promise<void> {
-		if (allMessages.length >= COUNT_LIMIT) {
+		if (allMessages.length >= COUNT) {
Contributor:
⚠️ Potential issue | 🔴 Critical

Critical: Batching logic is broken - prevents fetching multiple batches.

Changing the check from COUNT_LIMIT to COUNT prevents the function from ever fetching more than one batch. After the first batch adds ~50 messages to allMessages, any recursive call to fetchBatch at line 58 will immediately return because allMessages.length >= COUNT (line 23) will be true.

This defeats the purpose of lines 51-59, which attempt to fetch additional batches when there are insufficient main messages (e.g., when many messages are threaded). The algorithm is designed to ensure at least COUNT main messages are loaded, but this change caps the total at one batch.
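
To see the failure concretely, here is a minimal, runnable sketch of the batching recursion (simplified from the PR's fetchBatch; `fetchPage`, the `Msg` type, and the 45-of-50-threaded scenario are illustrative assumptions, not the real API):

```ts
// Simplified model of fetchBatch. `fetchPage` stands in for the history API
// and simulates a room where 45 of every 50 messages are thread replies.
const COUNT = 50;
const COUNT_LIMIT = COUNT * 10;

type Msg = { _id: string; tmid?: string }; // tmid set => thread reply

async function fetchPage(lastTs?: string): Promise<Msg[]> {
	return Array.from({ length: COUNT }, (_, i) => ({
		_id: `${lastTs ?? 'latest'}-${i}`,
		tmid: i % 10 === 0 ? undefined : 'big-thread' // only 5 main messages per page
	}));
}

async function load(guard: number): Promise<Msg[]> {
	const allMessages: Msg[] = [];
	let mainMessagesCount = 0;

	async function fetchBatch(lastTs?: string): Promise<void> {
		if (allMessages.length >= guard) {
			return; // with guard === COUNT this fires on the second call
		}
		const data = await fetchPage(lastTs);
		allMessages.push(...data);
		mainMessagesCount += data.filter(m => !m.tmid).length;
		if (mainMessagesCount < COUNT && data.length > 0) {
			// Not enough main messages yet: fetch the next batch.
			await fetchBatch(data[data.length - 1]._id);
		}
	}

	await fetchBatch();
	return allMessages;
}

load(COUNT).then(r => console.log('guard = COUNT:', r.length)); // 50, stuck at one batch
load(COUNT_LIMIT).then(r => console.log('guard = COUNT_LIMIT:', r.length)); // 500, keeps batching
```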

🔎 Recommended fix

The original issue appears to be that the "load more" marker at line 79 was rarely created because data.length === COUNT_LIMIT (500) was rarely true. A better fix that preserves batching:

+const COUNT_LIMIT = 500;
+
 async function fetchBatch(lastTs?: string): Promise<void> {
-  if (allMessages.length >= COUNT) {
+  if (allMessages.length >= COUNT_LIMIT) {
     return;
   }

And at line 79, use a more practical check:

-  if (!lastMessageRecord && data.length === COUNT) {
+  if (!lastMessageRecord && data.length >= COUNT) {
     const loadMoreMessage = {

This allows accumulating up to 500 messages across batches while adding the "load more" marker when we've fetched at least 50 messages (indicating more may be available).
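
Extracted as predicates, the two corrected checks would behave roughly like this (a sketch under the suggested values COUNT = 50 and COUNT_LIMIT = 500; the function names are illustrative, not from the PR):

```ts
const COUNT = 50;
const COUNT_LIMIT = 500;

// Guard for fetchBatch: keep accumulating until the hard cap, not the page size.
const shouldFetchMore = (accumulated: number): boolean => accumulated < COUNT_LIMIT;

// "Load more" marker: any full-or-larger batch hints that older history exists.
const shouldAddLoadMoreMarker = (lastMessageKnown: boolean, batchLength: number): boolean =>
	!lastMessageKnown && batchLength >= COUNT;

console.log(shouldFetchMore(50));   // true: one page in, keep batching
console.log(shouldFetchMore(500));  // false: hard cap reached
console.log(shouldAddLoadMoreMarker(false, 50));  // true: exactly one full page
console.log(shouldAddLoadMoreMarker(false, 237)); // true: merged batches, >= still matches
```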

Committable suggestion skipped: line range outside the PR's diff.

🤖 Prompt for AI Agents
In app/lib/methods/loadMessagesForRoom.ts around line 23, the batching guard
uses COUNT which stops further batches after the first ~50-message batch; change
the guard back to COUNT_LIMIT so fetchBatch can accumulate up to the larger
limit across multiple batches. Also update the "load more" marker logic around
line 79 to use a check on the fetched batch size (data.length === COUNT) so the
marker is added when a full batch was returned (indicating more messages may
exist), while overall accumulation continues until COUNT_LIMIT or no more data.

			return;
		}

@@ -77,7 +76,7 @@ export function loadMessagesForRoom(args: {
		if (data?.length) {
			const lastMessage = data[data.length - 1];
			const lastMessageRecord = await getMessageById(lastMessage._id as string);
-			if (!lastMessageRecord && (data.length === COUNT || data.length >= COUNT_LIMIT)) {
+			if (!lastMessageRecord && data.length === COUNT) {
Contributor:
⚠️ Potential issue | 🟠 Major

The condition change may fix the symptom but is coupled with the broken batching logic.

The original check data.length === COUNT_LIMIT (500) was indeed problematic - it was rarely true, which likely caused the bug where "load more" markers weren't added, preventing users from scrolling to load older messages.

However, changing it to data.length === COUNT (50) only "works" because line 23 now incorrectly limits the total messages to 50. Once line 23 is fixed to restore proper batching, this condition should use >= instead of === to handle cases where data.length might be slightly different from COUNT due to batching logic:

-if (!lastMessageRecord && data.length === COUNT) {
+if (!lastMessageRecord && data.length >= COUNT) {
🤖 Prompt for AI Agents
In app/lib/methods/loadMessagesForRoom.ts around line 79, the equality check
`data.length === COUNT` is fragile and tied to broken batching; change the
condition to `data.length >= COUNT` so "load more" markers are added when the
returned batch meets or exceeds the page size, and ensure this logic works with
the corrected batching on line 23 (restore proper batching limits there) so the
check doesn't rely on exact equality.

				const loadMoreMessage = {
					_id: generateLoadMoreId(lastMessage._id as string),
					rid: lastMessage.rid,