
Response time for NextJS version seems too high #1118

Open
smoya opened this issue Jun 13, 2024 · 32 comments · May be fixed by #1121 or #1122

@smoya
Member

smoya commented Jun 13, 2024

Description

I'm working with @helios2003 on #224; in particular, I'm their mentor through GSoC 2024.
As part of that project, we evaluated the possibility of running the AsyncAPI Parser-JS to parse AsyncAPI documents loaded via the base64 query param.
@helios2003 ran a test measuring response time with and without the parsing step (which would turn the page from static into SSR). Our surprise came when we realized the NextJS version hosted in #224 took ~4 seconds just to serve the HTML for the Studio page (without even loading any document on it!), just plain /.

Today I decided to run another test, and confirmed the findings. However, a weird caching mechanism is happening.

Let me share with you 3 consecutive requests I made and the results:

| Request # | Cache-Status | Cache-Status meaning | Seconds total |
|---|---|---|---|
| 1st | `[Next.js; hit, Netlify Edge; fwd=miss]` | Missed cache at the Netlify level, hit the NextJS cache | 4.072090 |
| 2nd | `[Netlify Edge; hit]` | Hit cache at the Netlify level, meaning the response is served directly from the Netlify edge | 0.123647 |
| 3rd | `[Next.js; hit, Netlify Edge; fwd=miss]` | Missed cache at the Netlify level, hit the NextJS cache | 0.572575 |
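For anyone wanting to reproduce these numbers, here is a minimal sketch of the kind of probe used above. The `-w` format variables and `%header{}` syntax are standard curl features (the latter needs curl >= 7.83); the script itself is illustrative, not the exact one we used:

```shell
#!/bin/sh
# Illustrative probe: 3 consecutive requests printing TTFB, total time,
# and the Cache-Status header for each.
URL="${1:-https://studio.asyncapi.com/}"
CURL_FORMAT='time_starttransfer: %{time_starttransfer}\ntime_total: %{time_total}\ncache-status: %header{cache-status}\n'

run_probe() {
  for i in 1 2 3; do
    echo "Request #$i"
    curl -s -o /dev/null -w "$CURL_FORMAT" "$URL"
  done
}

# Only hit the network when explicitly requested: REPRO=1 sh probe.sh
if [ "${REPRO:-0}" = "1" ]; then run_probe; fi
```

A large gap between `time_starttransfer` across consecutive runs, with identical `cache-status`, is exactly the symptom described in this issue.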

My assumption and understanding is that, for some reason:

  • During the first request, things are rendered dynamically at the server level (at runtime) instead of being served from something precompiled at build time.
  • The second request hits the Netlify Edge cache, which is why it takes an insignificant time to respond, since it never even reaches the Studio app.
  • The third request hits another Netlify edge server that has no cached content yet (normal); the request then reaches the NextJS app and, in this case, the content is finally served from the NextJS cache.

The point is that I don't expect the first call to take 4 seconds; it should behave like the third request, since a request to the root page should always give the same response (static). Besides that, I have no clue why the cache-status header in the 1st request says the content was served from the NextJS cache.

cc @Amzani @helios2003 @KhudaDad414

@aeworxet
Contributor

https://asyncapi-studio-studio-next.vercel.app is built with output: 'standalone'.

Please check if it has the same behavior.

This build includes f5676ec feat: add og tags to studio-next (#1106).

@helios2003
Contributor

| Iteration | Time taken (milliseconds) |
|---|---|
| 1st | 569.98 |
| 2nd | 49.12 |
| 3rd | 202.69 |
| 4th | 37.65 |
| 5th | 160.47 |

Yes, in your case @aeworxet the time taken is much lower, even for the first hit. The `output: 'standalone'` setting is what makes the difference, I guess. Also, the URL used is the same as the one mentioned here: #224 (comment)

@helios2003
Contributor

@aeworxet's instance is deployed on Vercel; when I try adding the same optimization he mentioned on Netlify, the response time drops significantly for me as well. Thanks @aeworxet

My instance can be found here: https://studio-helios2002.netlify.app

@smoya
Member Author

smoya commented Jun 14, 2024

@aeworxet's instance is deployed on Vercel; when I try adding the same optimization he mentioned on Netlify, the response time drops significantly for me as well. Thanks @aeworxet

That doesn't seem to happen. Those are the times when requesting your site:

---
time_lookup: 0.013921
time_connect: 0.040744
time_appconnect: 0.070180
time_pretransfer: 0.070328
time_redirect: 0.000000
time_starttransfer: 3.804182
———
time_total: 3.831893

The difference between your app's response and @aeworxet's is that the second is always cached: `"x-vercel-cache":["HIT"]`

I guess we should do some local testing first rather than relying on CDN.

@helios2003
Contributor

Are those the times to load the entire page? By less time I meant the time to fetch the initial HTML that is server-side rendered and contains the meta tags.

@helios2003
Contributor

Because these are the times I see on my side:

| Iteration | Time (milliseconds) |
|---|---|
| 1st | 867.38 |
| 2nd | 105.30 |
| 3rd | 84.85 |
| 4th | 84.92 |
| 5th | 113.89 |

@smoya
Member Author

smoya commented Jun 14, 2024

Are those the times to load the entire page? By less time I meant the time to fetch the initial HTML that is server-side rendered and contains the meta tags.

What I shared are the timings of a curl request made against your site. time_total is the duration of the whole curl execution; time_starttransfer is the time that passed until the server started serving content.

Because these are the times I see on my side:

Aren't you hitting cache?

@helios2003
Contributor

Because these are the times I see on my side:

| Iteration | Time (milliseconds) |
|---|---|
| 1st | 867.38 |
| 2nd | 105.30 |
| 3rd | 84.85 |
| 4th | 84.92 |
| 5th | 113.89 |

These are the times I got for a fresh document, so I don't think I am hitting the cache; also, the time for the first response is much higher than for the next 4.

@smoya
Member Author

smoya commented Jun 14, 2024

https://studio-helios2002.netlify.app

What do you mean by first response? The first response after a deployment?

@helios2003
Contributor

Nope, I mean that I send 5 requests to the site in a row, and the time it takes to obtain the meta tags each time is what I am calling a response.

@smoya
Member Author

smoya commented Jun 14, 2024

Ok, meta-tags. Interesting. I'm just doing a curl. Not sure what the client you use is doing. Anyway, the issue is still there.

For the record, sharing two responses and their headers. As you can see there is no difference, not even in cache headers.

The first request was made this early morning, the second right after the first:

1st

response_code: 200
headers: {"age":["35139"],
"cache-control":["public,max-age=0,must-revalidate"],
"cache-status":["\"Next.js\"; hit","\"Netlify Edge\"; fwd=miss"],
"content-type":["text/html; charset=utf-8"],
"date":["Fri, 14 Jun 2024 05:09:43 GMT"],
"etag":["\"3wn78und5ueoa\""],
"netlify-vary":["header=x-nextjs-data|x-next-debug-logging|RSC|Next-Router-State-Tree|Next-Router-Prefetch|Accept-Encoding,cookie=__prerender_bypass|__next_preview_data"],
"server":["Netlify"],
"strict-transport-security":["max-age=31536000; includeSubDomains; preload"],
"vary":["RSC,Next-Router-State-Tree,Next-Router-Prefetch,Accept-Encoding"],
"x-content-type-options":["nosniff"],
"x-nextjs-date":["Fri, 14 Jun 2024 05:09:43 GMT"],
"x-nf-request-id":["01J0AJDH1HTFWZK1YF8AAYZKSB"],
"x-powered-by":["Next.js"],
"content-length":["19062"]
}
---
time_lookup: 0.015968
time_connect: 0.045507
time_appconnect: 0.076425
time_pretransfer: 0.076563
time_redirect: 0.000000
time_starttransfer: 3.505065
———
time_total: 3.532842

2nd

response_code: 200
headers: {"age":["35548"],
"cache-control":["public,max-age=0,must-revalidate"],
"cache-status":["\"Next.js\"; hit","\"Netlify Edge\"; fwd=miss"],
"content-type":["text/html; charset=utf-8"],
"date":["Fri, 14 Jun 2024 05:16:34 GMT"],
"etag":["\"3wn78und5ueoa\""],
"netlify-vary":["header=x-nextjs-data|x-next-debug-logging|RSC|Next-Router-State-Tree|Next-Router-Prefetch|Accept-Encoding,cookie=__prerender_bypass|__next_preview_data"],
"server":["Netlify"],
"strict-transport-security":["max-age=31536000; includeSubDomains; preload"],
"vary":["RSC,Next-Router-State-Tree,Next-Router-Prefetch,Accept-Encoding"],
"x-content-type-options":["nosniff"],
"x-nextjs-date":["Fri, 14 Jun 2024 05:16:34 GMT"],
"x-nf-request-id":["01J0AJT4HJ7V04BBH4Z48QQP9J"],
"x-powered-by":["Next.js"],
"content-length":["19062"]
}
---
time_lookup: 0.014502
time_connect: 0.041648
time_appconnect: 0.071193
time_pretransfer: 0.071333
time_redirect: 0.000000
time_starttransfer: 0.530230
———
time_total: 0.558090

Here is the diff:

2c2
< headers: {"age":["35139"],
---
> headers: {"age":["35548"],
6c6
< "date":["Fri, 14 Jun 2024 05:09:43 GMT"],
---
> "date":["Fri, 14 Jun 2024 05:16:34 GMT"],
13,14c13,14
< "x-nextjs-date":["Fri, 14 Jun 2024 05:09:43 GMT"],
< "x-nf-request-id":["01J0AJDH1HTFWZK1YF8AAYZKSB"],
---
> "x-nextjs-date":["Fri, 14 Jun 2024 05:16:34 GMT"],
> "x-nf-request-id":["01J0AJT4HJ7V04BBH4Z48QQP9J"],
19,22c19,22
< time_lookup: 0.015968
< time_connect: 0.045507
< time_appconnect: 0.076425
< time_pretransfer: 0.076563
---
> time_lookup: 0.014502
> time_connect: 0.041648
> time_appconnect: 0.071193
> time_pretransfer: 0.071333
24c24
< time_starttransfer: 3.505065
---
> time_starttransfer: 0.530230
26c26
< time_total: 3.532842
---
> time_total: 0.558090

@helios2003
Contributor

Even though in my case the entire document also needs to be fetched, here is the script I used, for anyone wanting to try it out: https://gist.github.com/helios2003/2fdb65377a8b1580b91464cbc7a1d974

@smoya
Member Author

smoya commented Jun 14, 2024

The theory that is gradually becoming more solid in my head is that we always SSR. And that makes sense because, afaik, NextJS is set up to SSR by default (even for static pages) and then rely on cache.
Since SSR on Netlify happens in Netlify Functions (serverless), the first request requires a function cold start, which takes a long time.
The rest are either cached by the Edge, or the serverless function is already warmed up.

A post somewhat backing up my theory: https://answers.netlify.com/t/slow-initial-load-time-on-ssg-with-nextjs/46384/3

@smoya
Member Author

smoya commented Jun 17, 2024

The theory that is gradually becoming more solid in my head is that we always SSR. And that makes sense because, afaik, NextJS is set up to SSR by default (even for static pages) and then rely on cache. Since SSR on Netlify happens in Netlify Functions (serverless), the first request requires a function cold start, which takes a long time. The rest are either cached by the Edge, or the serverless function is already warmed up.

A post somewhat backing up my theory: https://answers.netlify.com/t/slow-initial-load-time-on-ssg-with-nextjs/46384/3

In order to validate this theory, I believe measuring the time from when the request hits the NextJS router until it serves the response should tell us the time spent processing the request. The rest would be the time spent spinning up the serverless function.
Note that my NextJS knowledge is close to zero and I'm just assuming the SSR happens after hitting the NextJS router. If that's not the case, we should test accordingly.
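One generic way to get that number, sketched under assumptions (the `Handler` shape below is illustrative, not Studio's actual code; the Server-Timing header itself is a web standard readable in browser devtools): wrap the request handler and report the elapsed time in a response header.

```typescript
// Illustrative sketch: wrap a handler to measure time spent inside the app.
// In a real NextJS app this logic would live in middleware or a route handler.
type Handler = (url: string) => Promise<{ body: string; headers: Map<string, string> }>;

function withServerTiming(handler: Handler): Handler {
  return async (url) => {
    const start = Date.now();
    const res = await handler(url);
    // Standard Server-Timing header; shows up in the browser's devtools.
    res.headers.set('Server-Timing', `app;dur=${Date.now() - start}`);
    return res;
  };
}
```

If the reported `dur` stays small while the first request is still slow, the missing seconds are spent before the app code runs at all, i.e. in the cold start.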

@smoya
Member Author

smoya commented Jun 17, 2024

Additionally, can we check whether we are using the NextJS Runtime on Netlify? I have no permission to see the build logs at https://app.netlify.com/sites/studio-next 🤷 .

Build logs should show something like the attached screenshot.

EDIT: Can you confirm, @helios2003, that https://studio-helios2002.netlify.app has the Netlify NextJS runtime enabled? So we can discard this as a possible solution.

@helios2003
Contributor

@smoya Yes, the NextJS runtime is enabled in https://studio-helios2002.netlify.app/.


@helios2003
Contributor

A difference I notice between Netlify's version and Vercel's version is the build cache. The attached image shows the build logs of the Vercel deployment.


I believe Studio maintainers can verify if this cache is being generated on the Netlify deployment. Because from what I observe, Netlify creates a NextJS cache but not a build cache.

@KhudaDad414
Member

KhudaDad414 commented Jun 20, 2024

@smoya I did some testing and I think your theory is right.
I don't think there are any static pages in NextJS now.
There is always a runtime involved, and by "static" they mean "we are going to cache this page for you, and anytime we receive a request we are going to serve it", not "we are going to generate a separate HTML page for it that is servable from a CDN".
Which means a runtime is ALWAYS THERE on the server to decide what to serve and what parts to cache.
We will encounter cold starts and some extra time for the backend to decide whether it should serve from cache or not.

BTW, we need this, right? We need some part of the page to be rendered on the server so we can add the OpenGraph metadata?

@helios2003
Contributor

@KhudaDad414, can you tell me why we aren't caching certain components at build time itself, following Static Site Generation (SSG)?
Also, in the previous comment, do you mean that on the first request the page is entirely rendered on the server?
Ref: https://nextjs.org/docs/pages/building-your-application/rendering/static-site-generation
Also, is a build cache being uploaded in the production deployment?

@jerensl

jerensl commented Jun 21, 2024

The theory that is gradually becoming more solid in my head is that we always SSR. And that makes sense because, afaik, NextJS is set up to SSR by default (even for static pages) and then rely on cache. Since SSR on Netlify happens in Netlify Functions (serverless), the first request requires a function cold start, which takes a long time. The rest are either cached by the Edge, or the serverless function is already warmed up.

A post somewhat backing up my theory: https://answers.netlify.com/t/slow-initial-load-time-on-ssg-with-nextjs/46384/3

Hi @smoya, I'm new to the AsyncAPI community, but I disagree with some of your points here. Netlify Edge uses Deno Deploy, which relies on V8 isolates, and V8 isolates are known for fast startup even on a cold start; it's different from what we see with virtual machines. My assumption is that this is probably related to some CSS package being downloaded during the initial startup. I will try to set up the bundle analyzer and see what happens there.

Reference:
https://news.ycombinator.com/item?id=31912582
https://www.netlify.com/blog/deep-dive-into-netlify-edge-functions/
https://deno.com/blog/anatomy-isolate-cloud

@KhudaDad414
Member

KhudaDad414 commented Jun 21, 2024

@helios2003

can you tell me why we aren't caching certain components at build time itself, following Static Site Generation (SSG)?

The whole page is currently statically generated and cached, not just some components. We would be able to make the page generation dynamic in the future, because we have to generate OpenGraph metadata at some point.

First scenario: no cache at the CDN level (Netlify Edge); the server had to cold start and then serve from the Next.js cache.
Screenshot 2024-06-21 at 15 13 46

Second scenario: cache hit at the CDN level (Netlify Edge).
Screenshot 2024-06-21 at 15 17 51

Third scenario: no cache at the CDN (Netlify Edge) level, but a cache hit at Next.js.
Screenshot 2024-06-21 at 15 34 31

@jerensl I don't think the problem is with downloading some CSS. As you can see in the examples above, the wait time increases in the Server Processing stage, which is before any download begins.

The problems

  1. When we miss the CDN (Netlify Edge) cache, the lambda function (or whatever Netlify uses) needs a cold start; that's why we are getting the ~4-second time on the first request. There is nothing we can do here as far as I know. 🤷

  2. The CDN (Netlify Edge) cache misses randomly: the default behaviour should work fine, but for some reason it doesn't.

Based on some tests that I have done on my fork, hosted here, we can resolve this issue by setting custom cache options in the response headers:

  # Netlify CDN should keep the cache for a long time (max-age=3640000 ≈ 42 days).
  CDN-Cache-Control: public, max-age=3640000, must-revalidate
  # Other layers (including the browser) shouldn't do any caching.
  Cache-Control: public, max-age=0, must-revalidate

After we add those headers, the cache-status will be either:
"Netlify Edge"; fwd=stale: caching is in progress on the current CDN node, so it has to hit Next.js.
"Netlify Edge"; hit: content is being served from the CDN.
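For reference, here is a hypothetical sketch of how those headers could be declared in `netlify.toml`, assuming Netlify's `[[headers]]` config syntax (values copied from above; this is illustrative, not the exact change tested on the fork):

```toml
# Illustrative netlify.toml fragment mirroring the headers above.
[[headers]]
  for = "/*"
  [headers.values]
    # Netlify CDN keeps the cache for a long time.
    CDN-Cache-Control = "public, max-age=3640000, must-revalidate"
    # Other layers (including the browser) shouldn't cache.
    Cache-Control = "public, max-age=0, must-revalidate"
```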

@jerensl

jerensl commented Jun 22, 2024

@KhudaDad414
@jerensl I don't think the problem is with downloading some CSS. As you can see in the examples above, the wait time increases in the Server Processing stage, which is before any download begins.

Yeah, you are right, there is something related to the Server Processing stage.

The problems

  1. When we miss the CDN (Netlify Edge) cache, the lambda function (or whatever Netlify uses) needs a cold start; that's why we are getting the ~4-second time on the first request. There is nothing we can do here as far as I know. 🤷

But I think we can do something about the ~4-second cold start. Let me explain: after checking with the bundle analyzer, I found that monaco-editor is bundled both on the client side and in Node.js (server side).
[screenshots: server bundle, client bundle]

I also ran another test on https://studio-helios2002.netlify.app/ and found long-running tasks on the main thread related to Monaco that are identical to the cold start in Node.js (see the red arrow).
[screenshots: network trace, main-thread tasks]

I then checked the code where Monaco is declared to use a web worker.
[screenshot: web worker code]

Why are there no web worker tasks running here?
[screenshot: web worker panel]

Conclusion

  1. Running the Monaco editor in a web worker is not working here; instead it runs on the main thread, which blocks it for around 4-6 seconds
  2. A client-side component in the App Router does not mean the code is rendered only on the client; it renders on the server and then again on the client with the help of hydration, meaning it runs twice
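For context, the pattern behind monaco's worker wiring is a per-language lookup: `MonacoEnvironment.getWorker` receives a language label and must return a dedicated worker so parsing stays off the main thread. A minimal sketch of that selection logic, with hypothetical paths (not Studio's actual bundler output):

```typescript
// Illustrative sketch of per-language worker selection, the logic behind
// MonacoEnvironment.getWorker; in the browser, each returned path would be
// passed to `new Worker(...)`. Paths are hypothetical.
function workerUrlFor(label: string): string {
  switch (label) {
    case 'yaml':
      return '/monaco/yaml.worker.js'; // e.g. provided by monaco-yaml
    case 'json':
      return '/monaco/json.worker.js';
    default:
      return '/monaco/editor.worker.js'; // monaco's base editor worker
  }
}
```

If worker creation fails, monaco is known to fall back to running its language services on the main thread, which would match the blocked main thread seen in the traces above.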

@jerensl

jerensl commented Jun 23, 2024

After making some contributions to Modelina, I realized they handled monaco-editor really well. Shout out to the maintainers there; they did a great job.
screenshot-1719119300991
Ideally, the Monaco editor should run in a worker and not block the main thread, as in the image above, so the user gets their content first and the render isn't blocked.

One difference I realized between Modelina and Studio is:

  1. Modelina uses the Pages Router (client side by default)
  2. Studio uses the App Router (server side by default)

Let's check Theo's video here; he explains very well why the App Router will mostly get a cold start, and he offers a solution to that problem: https://www.youtube.com/watch?v=zsa9Ey9INEg&t=643s

Solution:

  1. Migrate to the Pages Router. I think this is the most viable solution I can think of, because we want most of the Monaco editor code to run on the client side, and we can isolate it so it doesn't use global.window on the server
  2. Other solutions, as Theo mentioned in his video

@KhudaDad414
Member

KhudaDad414 commented Jun 24, 2024

@jerensl

Running the Monaco editor in a web worker is not working here; instead it runs on the main thread, which blocks it for around 4-6 seconds

The main thread of the client or the server? If you mean on the server, the page is static and is only built once (at build time). If you mean on the client, then why doesn't it always have that 4-second waiting time, and why is it on par with https://studio.asyncapi.com/, which is a normal CRA?

Ideally, the Monaco editor should run in a worker and not block the main thread, as in the image above, so the user gets their content first and the render isn't blocked.

It does (at least the two workers that are supposed to: the main worker and the yaml worker)
Screenshot 2024-06-24 at 17 20 53

Migrate to the Pages Router. I think this is the most viable solution I can think of, because we want most of the Monaco editor code to run on the client side, and we can isolate it so it doesn't use global.window on the server

Can you point out what feature we need from the pages directory that isn't accessible in the app directory? And how would we isolate components so they don't use global.window on the server, considering the page is static?

@jerensl

jerensl commented Jun 25, 2024

@KhudaDad414

The main thread of the client or the server? If you mean on the server, the page is static and is only built once (at build time). If you mean on the client, then why doesn't it always have that 4-second waiting time, and why is it on par with https://studio.asyncapi.com/, which is a normal CRA?

Can you point out what feature we need from the pages directory that isn't accessible in the app directory? And how would we isolate components so they don't use global.window on the server, considering the page is static?

I think the concept you mention is more related to the Pages Router, which is the old way of using NextJS without React Server Components (RSC). What is implemented here uses the App Router, which is built on top of React Server Components by default. I also see you are using a Client Component; I think there are misconceptions about what it is supposed to do. As far as I know, a Client Component renders on both the server and the client, and uses a technique called hydration to inject functionality on the client side. Also see here how Dan Abramov explains RSC in a simple way.

In my opinion, the NextJS App Router and the Pages Router are two different types of framework; in this discussion I think they shouldn't be treated as the same architecture. With React Server Components, the App Router is more similar to Remix than to Create React App.

I also could not find any recorded decision about why the idea of using React Server Components came up: https://github.com/asyncapi/studio/blob/master/doc/adr/0007-use-nextjs.md

Based on how React Server Components work, it's not surprising that we got a 4-second cold start, because component rendering happens on both the client and the server.

Consider how huge the changes required to use React Server Components are: they make us rethink how we are supposed to deal with the server and the client at the same time, and have also made some state managers rethink how they are supposed to deal with it:
pmndrs/zustand#2200

It does (at least the two workers that are supposed to: the main worker and the yaml worker). Screenshot 2024-06-24 at 17 20 53

If that got fixed, good then. BTW, I ran the test on the website mentioned above, which is https://studio-helios2002.netlify.app/

@KhudaDad414
Member

KhudaDad414 commented Jun 25, 2024

Thanks for the explanation @jerensl.

I think the concept you mention is more related to the Pages Router, which is the old way of using NextJS without React Server Components (RSC)

by static I meant that the / route is statically rendered and by extension Full Route Cached.

Based on how React Server Components work, it's not surprising that we got a 4-second cold start, because component rendering happens on both the client and the server.

This would be valid if we had a dynamically rendered page. Since the page is statically rendered and Full Route Cached, the server-side components aren't rendered per request; they are rendered at build time and served to the client as the React Server Component Payload.

Are you suggesting that a cold start invalidates cache and the page is rendered on the server again?

@jerensl

jerensl commented Jun 25, 2024

@KhudaDad414

by static I meant that the / route is statically rendered and by extension Full Route Cached.

The '/' route is basically a server component that renders statically by default, but it works very differently from a client component, which needs server rendering.

This would be valid if we had a dynamically rendered page. Since the page is statically rendered and Full Route Cached, the server-side components aren't rendered per request; they are rendered at build time and served to the client as the React Server Component Payload.

But we still have components/StudioWrapper.tsx, right? Because of that, the '/' route, which was statically rendered before, becomes dynamically rendered.

Are you suggesting that a cold start invalidates cache and the page is rendered on the server again?

No, but I'm suggesting experimenting with partial prerendering; keep in mind it is still an experimental feature. Basically, it serves the static render without waiting for the dynamic rendering.

Reference:
https://www.youtube.com/watch?v=MTcPrTIBkpA
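If someone wants to try that, here is a hypothetical `next.config.js` sketch enabling Partial Prerendering. The `experimental.ppr` flag exists in recent Next.js 14 canary releases; since the feature is experimental, the flag name and behavior may change between versions:

```javascript
// Illustrative next.config.js enabling Partial Prerendering (experimental).
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    // The static shell is served immediately; dynamic holes stream in after.
    ppr: true,
  },
};

module.exports = nextConfig;
```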

@KhudaDad414
Member

KhudaDad414 commented Jun 25, 2024

@jerensl

But we still have components/StudioWrapper.tsx, right? Because of that, the '/' route, which was statically rendered before, becomes dynamically rendered.

Yes, it does, but only at build time. No server-side code runs for statically generated routes, no matter whether there is a "use client" directive or not.
Tested here

Can you give an example where a route pre-rendered as static renders on the server (other than at build time, of course)?

the '/' route, which was statically rendered before, becomes dynamically rendered

Sorry, I don't understand: how does a route with static rendering "become" dynamic? Can you explain a bit more?

@jerensl

jerensl commented Jun 26, 2024

@KhudaDad414

Yes, it does, but only at build time. No server-side code runs for statically generated routes, no matter whether there is a "use client" directive or not. Tested here

I think I got it wrong here, but sure, statically generated routes run at build time to generate HTML, except for client components during the initial load without lazy-loading SSR

Can you give an example where a route pre-rendered as static renders on the server (other than at build time, of course)?
Sorry, I don't understand: how does a route with static rendering "become" dynamic? Can you explain a bit more?

It can happen under strict rules, but not in our case; for example, if we use a cookie or turn off caching on the fetch API.

screenshot-1719403248330
Do you know what I'm missing here? Does NextJS statically generate the JSX into HTML components? Why is the majority of the website still JSX, with only the navbar statically generated at build time? Does skipping SSR mean no statically generated HTML at build time either? Isn't the point of using NextJS that it turns things into HTML?

@KhudaDad414
Member

Do you know what I'm missing here?

Full route cache (Statically generated route, if we can call it that) will only take effect when you are not opting out of it.

Why is the majority of the website still JSX, with only the navbar statically generated at build time?

Well, we are using Monaco and it can't be rendered on the server; plus, the other two components (Navigation and Preview) depend on state and have to be generated on the client.

As for the other two (the Sidebar and the toolbar at the top), I am not sure if we can render them on the server. It may be possible.

Does skipping SSR mean no statically generated HTML at build time either?

It means: do not try to load this on the server side, since it relies on the window object and would fail.

Some questions that I don't know the answer to at all, and that we need to answer to decide how we are going to structure the application:

  • what can be rendered on the server and what should render on the client?
  • How are we going to manage the state, should we keep the current approach?

These are out of the scope of this issue and need to be discussed separately.

@jerensl

jerensl commented Jul 10, 2024

  • what can be rendered on the server and what should render on the client?

With RSC, a Client Component is already smart enough to separate what belongs to the client and what stays on the server via the hydration mechanism. Before RSC we needed to use useEffect as a side effect for detecting window/browser APIs, but with RSC we don't need that anymore

  • How are we going to manage the state, should we keep the current approach?

Just so you know, before server components existed, react-query had a solution for managing this complexity of state between server and client, which they describe as asynchronous state.

The implementation of RSC in react-query seems a bit complex and reminds me of why we moved away from Redux in the first place, and they are still figuring out how they will do it in the future: https://tanstack.com/query/latest/docs/framework/react/guides/advanced-ssr. They also wrote a blog post about the trade-offs around network waterfalls: https://tanstack.com/query/latest/docs/framework/react/guides/request-waterfalls. This network waterfall is also why Remix claims to be better than NextJS + RSC: https://remix.run/blog/react-server-components#obsessed-with-ux

Also, let's talk about the network waterfall; it seems to have been a hot topic between NextJS + RSC and Remix. NextJS handles it by rewriting the fetch standard on the server. This solution is supposed to fix deduplication of identical fetch requests, and introduces it as the default caching behavior in NextJS, as we see now. But other frameworks like Remix insist the web standard should not be rewritten and that developers should control their own caching behavior. This controversy also led the React team to remove fetch deduplication from RSC; let's see if NextJS will follow or not: https://www.youtube.com/watch?v=AKNH7mXciEM&t=920s.

It's supposed to be a good answer, but I don't know yet either, because we are in an awkward spot now as web developers.


github-actions bot commented Nov 8, 2024

This issue has been automatically marked as stale because it has not had recent activity 😴

It will be closed in 120 days if no further activity occurs. To unstale this issue, add a comment with a detailed explanation.

There can be many reasons why some specific issue has no activity. The most probable cause is lack of time, not lack of interest. AsyncAPI Initiative is a Linux Foundation project not owned by a single for-profit company. It is a community-driven initiative ruled under open governance model.

Let us figure out together how to push this issue forward. Connect with us through one of many communication channels we established here.

Thank you for your patience ❤️

@github-actions github-actions bot added the stale label Nov 8, 2024