I feel like Mozilla is going to join the annals of history with the likes of Xerox in the category of "Companies that created the technology of the future and casually tossed it to the wayside for competitors to scoop up" with Rust and Servo.
It's mind-boggling that for a company so often seemingly playing catch-up with Google, Mozilla actually leapfrogged Google in the browser development space for a time, and then...decided it wasn't worth pursuing any further.
It's still baffling to me that Mozilla threw out Firefox's technical future
Very little about Mozilla makes sense --- until you follow the money.
While it's hard to know what will come of it, there is also https://ladybird.org/ to challenge the monopoly of Blink.
> The Dogemania test ran at a smooth 60 FPS on my M4 Pro MacBook Pro until reaching around 400 images
I ran Dogemania on Chrome up to 1,400 images at a steady 60 FPS, at which point I got bored and closed the tab.
Describing Servo as "new" is a stretch ;)
I thought it was more like: Rust was made for the web browser Servo.
> This is a danger to the open web in more ways than one. If there is only one functioning implementation of a standard, the implementation becomes the standard.
I still don't understand why this is a problem, as long as the engine implementing the spec is open source and governed by a committee formed by entities other than Google itself. The real problem, and the real waste of resources, is how we are behaving now.
The browser engine should become like the Linux kernel: one engine, different distros.
I'm hearing Servo got rebooted because Valve is giving money to Igalia to fund the project. Can anyone confirm this?
It's really sad to see what Mozilla turned into: from a competitive browser company to activism. No wonder its core product started to wane.
"Most sites have at least a few rendering bugs, and a few are completely broken. Google search results have many overlapping elements, and the MacRumors home page crashed after some scrolling. Sites like Wikipedia, CNN Lite, my personal site, and text-only NPR worked perfectly."
Like many HN readers, I have read countless accounts of web browsers and web browsing over the years.
Unfortunately, I cannot recall even one that took an account such as this one and concluded something like, "We need to modify the Google and MacRumors pages so they work with Servo." Unfortunately, the conclusion is usually something like, "We need to fix Servo so it works like Chromium/Chrome."
The reason I believe this is unfortunate is that (a) it ultimately places control in the hands of an ad services company and (b) it creates the wrong incentives for people who create web pages. Pages could be modified to conform to what Wikipedia, CNN Lite, the author's personal site, and text-only NPR have done. This is not difficult. In fact, it is easier than modifying Servo to do what Chromium/Chrome is doing.
IMO, the "standard" web browser should not be effectively defined by an ad services company (including its business partner, Mozilla), nor should the standard for a web page be defined by the "most popular" web browser. To me, "popular" and "standard" are not necessarily the same. Web _pages_ (cf. web _browsers_) should work with unpopular browsers and popular browsers alike. According to OP, Wikipedia, CNN Lite, the author's personal site, and text-only NPR may meet the standard.
In sum, fix web pages not web browsers.
As a hobbyist, I still compile and experiment with w3c's original libwww library and utilities. Below is a short script I use to compile static binaries. With a TLS forward proxy, these utilities can, with few if any modifications, still work very well for retrieving web pages on today's web. (I am only interested in learning www history and optimising text retrieval, not graphics.) This library is "ancient" on the www timescale and yet it still works 30 years later. That's useful for www users like me, but maybe not for the online ad services companies and sponsored web browsers optimised for data collection and surveillance. The internet is supposed to be a public resource, not a private one; i.e., the highest priority of www pages and www browsers should be to serve www users, not online ad service providers.
# previous: download and compile w3c-libwww-5.4.2
# sanity check: exit unless run from the libwww source root
pwd | grep "w3c-libwww-" || exec echo "wrong directory"
export x=$(pwd)
export examples=$x/Library/Examples
export linemode=$x/LineMode/src
export commandline=$x/ComLine/src
export robot=$x/Robot/src
y="
libwwwinit.a libwwwapp.a libwwwhtml.a
libwwwtelnet.a libwwwnews.a libwwwhttp.a
libwwwmime.a libwwwgopher.a libwwwftp.a
libwwwdir.a libwwwcache.a libwwwstream.a
libwwwfile.a libwwwmux.a libwwwtrans.a
libwwwcore.a libwwwutils.a
$x/modules/md5/.libs/libmd5.a -lm"
# build each example program as a static binary
cd $x/Library/src/.libs
for z in \
head libapp_1 libapp_2 libapp_3 libapp_4 init chunk \
chunkbody LoadToFile postform multichunk put post \
trace range tzcheck mget isredirected listen \
eventloop memput getheaders showlinks showtags \
showtext tiny upgrade cookie
do
gcc -s -static -O2 -Wall -o $examples/$z $examples/$z.o $y
done
# line-mode browser, robot, and command-line client
gcc -static -s -O2 -Wall -o $linemode/www \
$linemode/www-HTBrowse.o $linemode/www-GridText.o \
$linemode/www-ConView.o $linemode/www-GridStyle.o \
$linemode/www-DefaultStyles.o \
$x/PICS-client/src/.libs/libpics.a $y
gcc -static -s -O2 -Wall -o $robot/webbot \
$robot/webbot-HTRobot.o $robot/webbot-RobotMain.o \
$robot/webbot-RobotTxt.o $robot/webbot-HTQueue.o $y
gcc -static -s -O2 -Wall -o $commandline/w3c \
$commandline/w3c-HTLine.o $y
# next: symlink binaries to a folder in $PATH
# or export PATH=$PATH:$examples:$commandline:$robot:$linemode
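For anyone trying this, subsequent usage is roughly as follows. I am assuming here that libwww still honours the conventional http_proxy environment variable (see the HTProxy module if it does not), and the proxy address is a placeholder for your own setup.

# example: fetch a page via a local TLS forward proxy
# (assumes http_proxy is honoured; address is a placeholder)
export http_proxy=http://127.0.0.1:8080/
w3c http://example.com/ > page.html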
All-time opener
I feel GUIs have taken a wrong turn. We need a simple, composable language. I want entire theorems built from the smallest possible set of axioms.
HTML and CSS are not it. The sheer complexity is what causes all the BS.
Please bring this to iOS. WebKit is broken.
> the MacRumors home page crashed after some scrolling
I know there’s a ton going on, with GPUs and rendering and all kinds of things, but because Rust’s memory safety and “no null pointers!” are so constantly hyped (especially in conversations about Go), I’m always surprised when you fire up a Rust app, do something, and it crashes out…
[To be clear, I’m a big fan of modern sum types, and like to imagine an alternate reality where Go had them from the start…]
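To be fair, memory safety rules out undefined behavior, not aborts: perfectly safe Rust can still panic and take the process down. A minimal sketch (my own illustration, nothing to do with Servo's actual crash):

fn main() {
    let images: Vec<&str> = vec!["doge1.png", "doge2.png"];

    // Memory-safe, but panics at runtime: index out of bounds.
    // Rust guarantees a controlled abort instead of undefined
    // behavior, but the app still "crashes out".
    let missing = images[400];
    println!("{missing}");
}

The non-panicking alternative is to make the failure a value, e.g. images.get(400), which returns an Option that the caller has to handle.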
> The current roadmap lists Shadow DOM and CSS Grid as priorities
I've been working on the CSS Grid support. I'm about to land "named grid lines and areas" support, which should make a bunch more websites lay out correctly.
I'm biased because it's my project, but IMO the approach Servo is using for CSS Grid is pretty cool in that the actual implementation is in an external library (Taffy [0]) that can be used standalone and is widely used across the Rust UI ecosystem, including in the Blitz [1] web engine (which also uses Taffy for Flexbox and Block layout), the Zed [2] text editor, and the Bevy [3] game engine. A minimal standalone sketch follows the links below.
I'm hopeful that this approach of breaking down a web engine into independently usable modules with public APIs (which builds upon Servo's earlier work on modular libraries such as Stylo and html5ever) will make it easier for people to get involved in web engine development (as they can understand each piece in isolation), and make it easier for people to create new web engines in future (as they won't have to start completely from scratch).
[0]: https://github.com/DioxusLabs/taffy
[1]: https://github.com/DioxusLabs/blitz
[2]: https://zed.dev
[3]: https://bevy.org/
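Since "standalone" may sound abstract, here is a minimal sketch of driving Taffy directly, adapted from the kind of example its README shows. The exact names (TaffyTree, length, fr, Size::MAX_CONTENT) reflect recent Taffy versions and may drift.

use taffy::prelude::*;

fn main() -> Result<(), taffy::TaffyError> {
    let mut tree: TaffyTree<()> = TaffyTree::new();

    // Two items on a one-row grid: a fixed 250px sidebar and a
    // content column taking the remaining fraction (1fr).
    let sidebar = tree.new_leaf(Style::default())?;
    let content = tree.new_leaf(Style::default())?;

    let root = tree.new_with_children(
        Style {
            display: Display::Grid,
            grid_template_columns: vec![length(250.0), fr(1.0)],
            grid_template_rows: vec![fr(1.0)],
            size: Size { width: length(800.0), height: length(600.0) },
            ..Default::default()
        },
        &[sidebar, content],
    )?;

    // Resolve the layout, then read back each node's computed box.
    tree.compute_layout(root, Size::MAX_CONTENT)?;
    println!("sidebar: {:?}", tree.layout(sidebar)?);
    println!("content: {:?}", tree.layout(content)?);
    Ok(())
}

No HTML or CSS parsing is involved: a consumer hands Taffy a tree of styles and gets boxes back, which is what lets Blitz, Zed, and Bevy share the same layout code.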