Firefox 90 supports Fetch Metadata Request Headers

  • Very happy to see this landing in Firefox!

    For the people wondering what the motivation is, https://www.w3.org/TR/fetch-metadata/#intro has a good summary:

    > Interesting web applications generally end up with a large number of web-exposed endpoints that might reveal sensitive data about a user, or take action on a user’s behalf. Since users' browsers can be easily convinced to make requests to those endpoints, and to include the users' ambient credentials (cookies, privileged position on an intranet, etc), applications need to be very careful about the way those endpoints work in order to avoid abuse.

    > Being careful turns out to be hard in some cases ("simple" CSRF), and practically impossible in others (cross-site search, timing attacks, etc). The latter category includes timing attacks based on the server-side processing necessary to generate certain responses, and length measurements (both via web-facing timing attacks and passive network attackers).

    > It would be helpful if servers could make more intelligent decisions about whether or not to respond to a given request based on the way that it’s made in order to mitigate the latter category. For example, it seems pretty unlikely that a "Transfer all my money" endpoint on a bank’s server would expect to be referenced from an img tag, and likewise unlikely that evil.com is going to be making any legitimate requests whatsoever. Ideally, the server could reject these requests a priori rather than delivering them to the application backend.

    > Here, we describe a mechanism by which user agents can enable this kind of decision-making by adding additional context to outgoing requests. By delivering metadata to a server in a set of fetch metadata headers, we enable applications to quickly reject requests based on testing a set of preconditions. That work can even be lifted up above the application layer (to reverse proxies, CDNs, etc) if desired.
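    The "reject a priori" idea boils down to a small allow-list check on the Sec-Fetch-* headers. A minimal sketch of such a resource isolation policy; the `allow_request` function and its framework-neutral `method`/`headers` arguments are made up for illustration, only the header names and values come from the spec:

    ```python
    # Hedged sketch of a resource isolation policy built on the Fetch
    # Metadata headers. The function shape is an assumption for
    # illustration; header names/values are from the Fetch Metadata spec.

    def allow_request(method: str, headers: dict) -> bool:
        site = headers.get("Sec-Fetch-Site")

        # Browsers that don't send the header yet: fail open so they keep working.
        if site is None:
            return True

        # Requests from our own origin/site, or user-initiated ones
        # (typed URL, bookmark), are fine.
        if site in ("same-origin", "same-site", "none"):
            return True

        # Permit simple top-level cross-site navigations (following a link),
        # but not cross-site POSTs or subresource loads like <img>.
        if headers.get("Sec-Fetch-Mode") == "navigate" and method == "GET":
            return True

        # Everything else (e.g. a cross-site fetch() carrying ambient
        # credentials) gets rejected before it reaches the application.
        return False
    ```

    The nice part, as the quoted intro notes, is that a check this cheap can run in a reverse proxy or CDN rather than the application itself.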

  • Does that mean that the quest of finding a working direct link to the image/video will soon become impossible?

  • How is this different from the Origin header? Doesn't the Origin header tell the web server whether the request originated from the same website? Is the Origin header flawed in some way?

  • This is FUD:

    > Hence the banking server or generally web application servers will most likely simply execute any action received and allow the attack to launch.

    While these are useful headers, there are protections today via XSRF tokens to prevent these attacks that all major sites implement, so it isn’t likely your bank is vulnerable.
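    For context, the XSRF-token protection mentioned here is usually the synchronizer-token pattern. A minimal sketch, assuming a framework-provided session dict and submitted form dict (both stand-ins for illustration):

    ```python
    import hmac
    import secrets

    # Sketch of the synchronizer-token pattern: the server issues a random
    # token, stores it in the session, embeds it in the form, and checks
    # that the submitted copy matches on every state-changing request.

    def issue_token(session: dict) -> str:
        token = secrets.token_urlsafe(32)
        session["csrf_token"] = token
        return token  # embed this in a hidden <input> in the form

    def check_token(session: dict, form: dict) -> bool:
        expected = session.get("csrf_token")
        submitted = form.get("csrf_token", "")
        # Constant-time comparison avoids leaking the token via timing.
        return expected is not None and hmac.compare_digest(expected, submitted)
    ```

    An attacker's cross-site form can't read the victim's token, so the forged request fails the check even though the browser attached the cookies.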

  • The original CORS protection is enforced by the browser, not the server. That means that it is much harder for it to cause a privacy problem. Given that this only works if you are using a browser anyway (any other user agent can spoof all this) I don't see how there can be any security gain from the server doing the enforcement. Which leaves me wondering whether the increased flexibility is worth the potential privacy issue.

  • Pardon my ignorance. I thought the way to deal with CSRF was CSRF tokens. It seems like you would still have to ignore the headers and rely on the token in your logic whenever they disagreed. I'm not sure how to use these new headers.

  • I’m quite surprised that Sec-Fetch-Dest doesn’t have a “form” type for form submissions, and the spec makes almost no mention of forms. Does this spec finally allow a simple header check to squash CSRF form posts or not?
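    There is indeed no "form" value for Sec-Fetch-Dest (a navigational form submission arrives as Sec-Fetch-Dest: document), but a simple header check can still screen cross-site form posts using Sec-Fetch-Site alone. A hedged sketch; the function name and arguments are hypothetical:

    ```python
    # Hypothetical check for squashing cross-site form POSTs. The useful
    # signal is Sec-Fetch-Site, not Sec-Fetch-Dest, since form submissions
    # have no dedicated destination value.

    STATE_CHANGING = {"POST", "PUT", "PATCH", "DELETE"}

    def is_cross_site_write(method: str, headers: dict) -> bool:
        if method.upper() not in STATE_CHANGING:
            return False
        # A missing header means an older browser; treat it as same-site
        # here and fall back to CSRF tokens for those clients.
        return headers.get("Sec-Fetch-Site") == "cross-site"
    ```

    So the answer seems to be "mostly yes, for browsers that send the header" — which is why tokens are still needed as a fallback.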

  • For me, recent Firefox releases have been MISERABLE. I guess it could just be my computers, but the browser constantly locks up, performance is terrible, and there's no help on troubleshooting anywhere.

    I went with the long term support releases and have had a better experience. Course, still no sound lol but I use Chrome when I want sound. I still like Firefox, just can't use recent releases.

  • Some example code on how to use these headers to allow/reject requests:

    https://web.dev/fetch-metadata/#step-5:-reject-all-other-req...

  • Does this essentially solve XSRF? Would it no longer be necessary to use XSRF tokens?

  • In the example, couldn't the call from attacker.com to banking.com be thwarted by CORS headers defined by the server?

  • So basically CORS headers that work as expected. Excellent.

  • Have they fixed the 8-year-old confirmed bugs yet?

  • Any idea when they'll fix Firefox so you can make streaming calls without turning your MBP into a toaster? 40x the battery usage vs. Safari when on a streaming call. It's so painful. Heh, literally. The machine gets too hot to hold comfortably. It's still my primary browser, but optimization is needed. Sucks to have to change browsers just to make calls. We live on streaming calls now. This has been an issue since at least the start of Covid when I first really noticed it.

  • Since Encrypted SNI was disabled in Firefox 85, all hostnames are transferred in plaintext, even over HTTPS. It was also disabled in Firefox ESR 78 at some point around ESR 78.9.

    This not only makes DNS over HTTPS absolutely useless, it also hands browsing information out in duplicate: to the ISP, to intermediaries, and to the DNS providers.

    From the article, "If you aren’t a Firefox user yet, you can download the latest version here to start benefiting from all the ways that Firefox works to protect you when browsing the internet"

    I did not expect Firefox 90 to forget about this matter and talk about protection, when they got rid of Encrypted SNI in FF 85 without warning and without any other alternative actively working.

    We went from an incomplete solution (ESNI) to having nothing at all. Meanwhile, ECH (Encrypted Client Hello) keeps sounding like vaporware for the moment. Please...

  • Is there anything Mozilla/Firefox has done in the past 10 years that at least CAN BE ARGUED is for the improvement of the user's experience?

    I've been following their work pretty closely, but I'm at a loss trying to think of anything...