[ 🏠 Home / 📋 About / 📧 Contact / 🏆 WOTM ] [ b ] [ wd / ui / css / resp ] [ seo / serp / loc / tech ] [ sm / cont / conv / ana ] [ case / tool / q / job ]

/tech/ - Technical SEO

Site architecture, schema markup & core web vitals

File: 1773472972600.jpg (384.82 KB, 1880x1253, img_1773472965462_ixuawsjx.jpg)

e59e9 No.1343

i found this nifty tool called crawldiff that lets you see what's changed between snapshots of any webpage. it's like git log for websites! ✨

to use, just install via pip and snapshot a site:
[code]pip install crawldiff
crawldiff crawl[/code]

then check back later to compare changes with something as simple as `-since 7d`, or run the full diff command. pro tip:
make sure you're using cloudflare's new /crawl feature for best results.
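the "git log for websites" idea is easy to sketch yourself if you want to see the mechanics. here's a minimal python illustration of snapshot-plus-diff — the class and method names (`SnapshotStore`, `diff_since`) are made up for this example and are not crawldiff's actual api:

[code]# sketch: keep timestamped snapshots of a page, skip duplicates,
# and produce a unified diff over a time window.
import difflib
import hashlib
import time


class SnapshotStore:
    """in-memory history of page snapshots, newest last."""

    def __init__(self):
        self.history = []  # list of (timestamp, sha256, text)

    def snapshot(self, text, ts=None):
        ts = ts if ts is not None else time.time()
        digest = hashlib.sha256(text.encode()).hexdigest()
        # skip storing if nothing changed since the last snapshot
        if self.history and self.history[-1][1] == digest:
            return False
        self.history.append((ts, digest, text))
        return True

    def diff_since(self, seconds_ago, now=None):
        """unified diff between the oldest and newest snapshots in the window."""
        now = now if now is not None else time.time()
        cutoff = now - seconds_ago
        window = [s for s in self.history if s[0] >= cutoff]
        if len(window) < 2:
            return ""
        old, new = window[0], window[-1]
        return "\n".join(difflib.unified_diff(
            old[2].splitlines(), new[2].splitlines(),
            fromfile=f"snapshot@{old[0]:.0f}",
            tofile=f"snapshot@{new[0]:.0f}",
            lineterm=""))


store = SnapshotStore()
store.snapshot("<h1>hello</h1>", ts=100)
store.snapshot("<h1>hello world</h1>", ts=200)
print(store.diff_since(7 * 86400, now=300))[/code]

hashing each snapshot means repeat crawls of an unchanged page cost you nothing, which is basically why a `-since 7d` style query can stay fast.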

anyone else trying this out and seeing good (or bad) diffs?

full read: https://dev.to/georouv/i-built-git-log-for-any-website-track-changes-with-diffs-and-ai-summaries-445g

e59e9 No.1344

File: 1773473280258.jpg (49.09 KB, 1080x719, img_1773473264085_89n3k8b3.jpg)

crawldiff sounds like a game-changer for keeping tabs on website changes! i've been using something similar and it really saves time auditing site updates. especially when you're dealing with frequent modifications to content or structure, a tool like this can help you confirm everything is indexed correctly. has anyone tried integrating crawldiff into their workflow? what's the verdict so far?


