I would echo the advice to use a test framework. An alternative would be a browser extension, used to query the element IDs while manually visiting each page.
The requirement that it not be tied to a Google account rules out what would be my preferred method: fetching these via the YouTube Data API.
I think there are some open-source projects that already do what you're asking (e.g. https://github.com/egbertbouman/youtube-comment-downloader), but I haven't personally tried any of them.
Use a browser test-automation tool like Playwright or Puppeteer and visit each page. On each page, wait for the comments to appear (they load dynamically), then walk the DOM to extract that content and transform it into any format of your choosing.
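A minimal sketch of that flow in Python with Playwright. Note the CSS selectors (`ytd-comment-thread-renderer`, `#content-text`) are assumptions about YouTube's current markup and will break whenever the page structure changes, so verify them in your browser's inspector first:

```python
import json

def comments_to_records(texts):
    """Pure helper: normalize scraped strings into JSON-ready records."""
    return [{"index": i, "text": t.strip()} for i, t in enumerate(texts) if t.strip()]

def scrape_comments(url, limit=20):
    # Lazy import so the helper above is usable without Playwright installed
    # (install with: pip install playwright && playwright install chromium).
    from playwright.sync_api import sync_playwright

    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        page.goto(url)
        # Comments load lazily; scroll down so YouTube starts fetching them.
        page.mouse.wheel(0, 5000)
        # Assumed selector for comment text nodes -- check against the live DOM.
        selector = "ytd-comment-thread-renderer #content-text"
        page.wait_for_selector(selector)
        texts = page.locator(selector).all_inner_texts()
        browser.close()
    return comments_to_records(texts[:limit])

# Example usage (requires a browser and network access):
# print(json.dumps(scrape_comments("https://www.youtube.com/watch?v=..."), indent=2))
```

From there you can dump the records with `json.dump`, write CSV, or whatever format you need.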
Use: `yt-dlp --write-comments --no-download --batch-file FILE`
- FILE is a text file listing YouTube video IDs/URLs, one per line
- https://superuser.com/a/1732443/4390
- https://github.com/yt-dlp/yt-dlp
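If you'd rather drive yt-dlp from a script, here is a sketch that builds the batch file, runs the command above, and reads the comments back out of the `.info.json` files yt-dlp produces. The `comments` field names used below (`author`, `text`) match what yt-dlp emits at the time of writing, but verify against your version:

```python
import json
import subprocess
from pathlib import Path

def write_batch_file(video_ids, path="videos.txt"):
    # yt-dlp's --batch-file accepts bare video IDs or full URLs, one per line.
    Path(path).write_text("\n".join(video_ids) + "\n")
    return path

def fetch_comments(batch_path):
    # --write-comments stores comments in each video's .info.json;
    # --no-download skips the media itself. Requires yt-dlp on PATH.
    subprocess.run(
        ["yt-dlp", "--write-comments", "--no-download", "--batch-file", batch_path],
        check=True,
    )

def extract_comments(info_json_path):
    # Pull (author, text) pairs out of one resulting .info.json file.
    info = json.loads(Path(info_json_path).read_text())
    return [(c.get("author"), c.get("text")) for c in info.get("comments", [])]
```

Usage would be `fetch_comments(write_batch_file([...]))`, then `extract_comments()` on each `*.info.json` in the working directory.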