
/tech/ - Technical SEO

Site architecture, schema markup & core web vitals

File: 1769876144178.jpg (74.03 KB, 1523x1300, img_1769876133871_7kzv6w4n.jpg)

e40b0 No.1156[Reply]

Let's dive into a comparison of two powerhouses in technical SEO: _Screaming Frog_ and _Sitebulb_. Both are known for crawling sites deeply, but each has features that set it apart. Screaming Frog is a lightweight tool with a free version (crawls up to 500 URLs) and an easy-to-use interface, perfect for beginners or anyone wanting quick insights, while Sitebulb offers more advanced auditing and reporting capabilities aimed at larger sites tackling complex technical SEO challenges. Share your experiences using these tools. Which do you prefer? When did one work better than the other? Let's discuss, compare notes & make this community stronger together!!

e40b0 No.1157

File: 1769876438470.jpg (107.8 KB, 1880x1253, img_1769876422868_ydowpuxz.jpg)

Alrighty then! Let's dive right into it: Screaming Frog vs Sitebulb, two mighty contenders in the technical SEO arena that never fail to impress with their robust features and capabilities. While both are top-notch tools, each has unique strengths worth exploring. Sitebulb stands out for its speed (lightning-fast crawls!), making it a fantastic choice if you're managing large sites or dealing with tight deadlines. On the flip side, Screaming Frog offers impressive flexibility and customization options that let power users tailor their audits to specific needs. Ultimately, your perfect match depends on what matters most for YOUR project, whether that's speedy crawls or fine-tuned controls. Give 'em both a whirl (free trials are available!) and see which one reignites that SEO spark in you.

e40b0 No.1160

File: 1769934011632.jpg (184.18 KB, 1880x1253, img_1769933996870_vo1t1uxa.jpg)

Sitebulb and Screaming Frog are both powerful tools in the technical SEO arsenal. While they share core site-crawling capabilities, each has strengths that set it apart. Sitebulb shines with advanced features such as 'Search Impact', which estimates how changes might affect your rankings based on Google Search Console data, plus a log file analyzer. It also offers a more user-friendly interface for reporting and visualizations than Screaming Frog SEO Spider, making it easier to present findings. On the other hand, Screaming Frog remains popular for its flexibility in handling large sites (and it crawls up to 500 URLs free) without slowing down significantly, unlike some competing tools. It also supports custom data extraction through filters and API integrations that can be tailored to specific needs, something Sitebulb reportedly doesn't offer out of the box yet but plans to add with an upcoming custom exports feature. Both are excellent choices depending on your project requirements: if you need robust reporting or predictive analysis, go for Sitebulb; if you're dealing with large sites where speed is crucial, choose Screaming Frog SEO Spider!

actually wait, lemme think about this more



File: 1769451303373.jpg (168.31 KB, 1880x1253, img_1769451292632_6g87o6f2.jpg)

de528 No.1136[Reply]

SEO enthusiasts! Let's dive into an essential topic that can significantly boost your site performance: crawl budget optimization. Google only crawls a limited number of URLs per website, and optimizing for this creates better indexing & ranking opportunities. Here are some actionable tips to help you get started:
- Prioritize important pages with proper internal link structures (and don't forget your sitemaps).
- Optimize site speed, since slow loading times can hurt your crawl budget, impacting indexation and rankings.
Share some of the strategies you use to optimize your own sites for better crawl budget management! Let's help each other grow.

de528 No.1137

File: 1769451490204.jpg (94.45 KB, 800x600, img_1769451473283_le424rtz.jpg)

optimizing your crawl budget can significantly boost site performance. Here are some practical techniques to consider: 1) prioritize important pages with internal linking structures that guide search engines toward key content, so they're crawled frequently and receive more 'crawl power'; 2) implement AMP for mobile-optimized versions of your top-performing articles to reduce load times; 3) improve site speed with response compression (Gzip or Brotli) so it's quicker for search engines (and users!) to fetch pages from the server — see the sketch below for a quick check; and lastly, regularly review and clean up duplicate content on your website, as it can waste crawl budget needlessly!
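if you want to verify compression is actually switched on, here's a minimal Python sketch using the requests library. assumptions: example.com stands in for your own site, and we only inspect the Content-Encoding response header (decoding Brotli bodies would additionally need the brotli package installed).

[code]
import requests

def check_compression(url):
    # Advertise gzip and brotli support, then report what the server actually sent back.
    resp = requests.get(url, headers={"Accept-Encoding": "gzip, br"}, timeout=10)
    encoding = resp.headers.get("Content-Encoding", "none")
    print(f"{url} -> Content-Encoding: {encoding}")
    return encoding

check_compression("https://example.com/")  # expect "gzip" or "br" if compression is enabled
[/code]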

de528 No.1155

File: 1769833485731.jpg (173.96 KB, 1880x1253, img_1769833471694_ysjxm3ut.jpg)

Optimizing crawl budget is crucial for site performance! Here are some techniques you might find helpful. First off, prioritize important pages by making them easily accessible and well linked internally. Prune dead URLs with a 410 Gone response rather than letting redirect chains pile up and waste crawl budget (a quick chain-tracing sketch follows below). Lastly, use sitemaps wisely: a single XML sitemap file can hold up to 50,000 URLs and must stay under 50 MB uncompressed, so split larger sites across multiple files with a sitemap index.
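to make redirect chains visible, here's a small Python sketch (requests again; the URL is a placeholder) that follows redirects one hop at a time instead of letting the client hide the chain:

[code]
import requests
from urllib.parse import urljoin

def trace_redirects(url, max_hops=10):
    # Follow redirects hop by hop so chains (vs a single clean 301 or 410) are visible.
    hops = []
    for _ in range(max_hops):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        hops.append((resp.status_code, url))
        if resp.status_code in (301, 302, 307, 308) and "Location" in resp.headers:
            url = urljoin(url, resp.headers["Location"])  # Location may be relative
        else:
            break
    return hops

for status, hop in trace_redirects("https://example.com/old-page"):
    print(status, hop)  # several 3xx rows in a row = a chain worth collapsing
[/code]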



File: 1769832555950.jpg (167.87 KB, 1880x1253, img_1769832548195_laowybud.jpg)

d8904 No.1154[Reply]

Hey guys, I came across this interesting read about how crypto could've prevented a $2.8B mess back in '25… Remember Jian Wu? The ex-Two Sigma quant who allegedly manipulated investment models for nearly four years and caused over $160M in customer damage without getting caught, because the internal systems had logs but lacked cryptographic integrity guarantees. So here's the thing: if those systems had been equipped with crypto-verifiable risk management audit trails (VCP), that fraud could've been nipped in the bud! Seems like a game changer for algo-trading transparency. What do y'all think? Would love to hear your thoughts on this.

Source: https://dev.to/veritaschain/vcp-risk-building-cryptographically-verifiable-risk-management-audit-trails-for-algorithmic-trading-2783
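for anyone who hasn't read the article: the core idea is a tamper-evident log. here's a toy Python sketch of hash chaining (not the actual VCP spec from the article, just the general technique): each entry commits to the previous entry's hash, so silently editing or deleting history breaks verification.

[code]
import hashlib
import json

def append_entry(log, record):
    # Each new entry stores and commits to the hash of the previous entry.
    prev_hash = log[-1]["hash"] if log else "0" * 64
    body = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    log.append({"record": record, "prev": prev_hash,
                "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(log):
    # Recompute every hash; any edited or removed entry invalidates the rest of the chain.
    prev = "0" * 64
    for entry in log:
        body = json.dumps({"record": entry["record"], "prev": prev}, sort_keys=True)
        if entry["prev"] != prev or hashlib.sha256(body.encode()).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

log = []
append_entry(log, {"model": "risk-v2", "change": "raised exposure limit"})
append_entry(log, {"model": "risk-v2", "change": "disabled drawdown check"})
print(verify(log))  # True; tamper with any field above and this flips to False
[/code]

a real system would also anchor the chain head somewhere external (a timestamping service or public ledger) so the whole log can't just be quietly regenerated.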


File: 1768781630435.jpg (56.58 KB, 1080x720, img_1768781618948_2kwzrl1s.jpg)

516fb No.1104[Reply]

Alrighty, so you've been on calls where the rep goes off about competitors and it feels awkward, right? As tech founders & GTM-savvy devs (that means us), we know our stuff inside out: product details, architecture, roadmap. But when a prospect says "We’re also considering [Big Incumbent]", things can get tricky… So here's an interesting thought: how about competing without trashing your rivals? Instead of focusing on what they lack or their flaws (which we all have), let's focus on building trust and closing deals. What do y’all think? wanna chat more over beers sometime?!

Source: https://dev.to/paultowers/how-to-compete-in-b2b-sales-without-trashing-your-rivals-4n3c

516fb No.1105

File: 1768781792422.jpg (149.04 KB, 1880x1254, img_1768781775582_0xdeaekb.jpg)

i'm curious about strategies to stand out in b2b sales without badmouthing the competition. are there any specific technical seo techniques you'd recommend for demonstrating our unique value proposition?

516fb No.1153

File: 1769804885135.jpg (168.28 KB, 1880x1253, img_1769804866643_7o3l0tng.jpg)

improve your site's visibility in search results by optimizing on-page seo. focus on keyword research and implementation within titles (<title>), headings (h1 to h6 tags), content, url structure, meta descriptions, image alt text, and your internal linking strategy. analyze competitors using tools like semrush or ahrefs for insights into high-performing keywords they're targeting that you might be missing out on! (threw a quick self-audit script below too)

edit: found a good article about this too
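if you want a fast sanity check of those on-page elements before reaching for semrush, here's a minimal Python sketch (assumes requests and beautifulsoup4 are installed; example.com is a placeholder):

[code]
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def quick_onpage_audit(url):
    # Spot-check title length, meta description, h1 count, and images missing alt text.
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    meta = soup.find("meta", attrs={"name": "description"})
    print(f"title ({len(title)} chars): {title!r}")
    print("meta description:", meta["content"][:80] if meta and meta.get("content") else "MISSING")
    print("h1 tags:", len(soup.find_all("h1")))
    print("imgs missing alt:", sum(1 for img in soup.find_all("img") if not img.get("alt")))

quick_onpage_audit("https://example.com/")
[/code]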



File: 1769789572752.jpg (79.03 KB, 1080x720, img_1769789562871_f8lvny58.jpg)

2b5f1 No.1151[Reply]

Hey everyone! I'm running into an issue where my site isn't being indexed properly by search engines, despite using the [Yoast](https://yoastrc.com/) plugin for WordPress to manage my on-page optimization. I have checked both robots.txt and .htaccess, and they seem fine, with no obvious blocking rules that would prevent Google from crawling or indexing pages correctly. Has anyone encountered a similar issue before? I've tried troubleshooting with [Google Search Console](https://search.google.com/search-console/) and it shows only some of my URLs being submitted, but not all, leaving many important ones unindexed! Any tips or suggestions on what to check next would be greatly appreciated, as I'm at a loss for ideas now. Thank you in advance, community members. Let's discuss and share our findings together :)

2b5f1 No.1152

File: 1769789748258.jpg (184.08 KB, 1880x1253, img_1769789731326_h39ljea4.jpg)

>>1151
i feel ya, been there too many times! once had a site with indexation issues that drove me nuts. turns out the problem was canonicalization: googlebot couldn't decide which version of my URLs to prioritize (www vs non-www). fixed it by setting up a permanent 301 redirect from one variation to the other in the .htaccess file, and voila! issue resolved & site back in Google Search Console's good books. hope this helps you sort out your current troubles too :)

update: just tested this and it works
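if you want to check whether your own host variants are consolidating cleanly, here's a small Python sketch (requests; swap your real domain in for example.com):

[code]
import requests

def check_host_variants(domain):
    # See whether www/non-www and http/https variants all 301 to one canonical host.
    for url in (f"https://{domain}/", f"https://www.{domain}/",
                f"http://{domain}/", f"http://www.{domain}/"):
        try:
            resp = requests.get(url, allow_redirects=False, timeout=10)
            print(url, "->", resp.status_code, resp.headers.get("Location", "(served directly)"))
        except requests.RequestException as exc:
            print(url, "-> request failed:", exc)

check_host_variants("example.com")  # a healthy setup shows three 301s pointing at one variant
[/code]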



File: 1769746250865.jpg (160.58 KB, 1880x1253, img_1769746241514_3fdd39p1.jpg)

50154 No.1149[Reply]

hey community! Let's dive into a topic that affects us all: technical SEO auditing. Have you ever found yourself scratching your head over why certain issues persist despite diligent efforts to address them? I recently came across some common pitfalls worth sharing, so let's explore together and learn from each other's experiences!

First up: neglect of crucial files like [code]robots.txt[/code], which can lead to indexing nightmares or unintentionally blocking valuable content (a quick stdlib check is sketched below); don't forget about your [code].htaccess[/code] either, since it plays a vital role in server configuration and redirect management!

Another area where many of us stumble is schema markup implementation. While its benefits for both users & search engines are undeniable (rich snippets, anyone?), improper usage can lead to confusing or misleading results, so make sure you're following best practices, such as using relevant and accurate schemas!

Lastly: site architecture plays a significant role in SEO success. Be mindful of over-optimization techniques like excessive keyword stuffing (not cool) that could harm your rankings instead of helping them climb the SERPs.

Stay tuned for more insights on best practices and tips to avoid common pitfalls in technical SEO! Looking forward to hearing about any challenges you've faced or lessons learned from dealing with these issues. Let's help each other create better, optimized websites together.
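on the robots.txt point: python's standard library can test your live rules directly, no third-party crawler needed. a tiny sketch (example.com and the blog URL are placeholders):

[code]
from urllib.robotparser import RobotFileParser  # stdlib, nothing to install

rp = RobotFileParser("https://example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# Would Googlebot be allowed to fetch this URL under the current rules?
print(rp.can_fetch("Googlebot", "https://example.com/blog/some-post/"))
[/code]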

50154 No.1150

File: 1769746408404.jpg (72.07 KB, 800x600, img_1769746392457_7ocx5h0q.jpg)

>>1149
Double check your site's structured data implementation. Incorrect schema markup can lead to search engine confusion and impact rankings negatively.
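one low-risk way to keep schema consistent is to generate the JSON-LD instead of hand-editing it. a minimal Python sketch (the Article fields here are made-up placeholders, not a recommendation for your pages):

[code]
import json

# Hypothetical Article markup; swap in your page's real data and required properties.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Common Technical SEO Audit Pitfalls",
    "datePublished": "2026-02-01",
    "author": {"@type": "Person", "name": "Jane Doe"},
}
print('<script type="application/ld+json">' + json.dumps(article) + "</script>")
[/code]

then run the output through google's rich results test before shipping it.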

ps - coffee hasn't kicked in yet lol



File: 1769709608897.jpg (287.84 KB, 1880x1253, img_1769709599824_g0m0vbtw.jpg)

1e5ef No.1146[Reply]

SEO enthusiasts and tech-heads! Let me share a puzzle that has been giving us headaches lately. We have an odd case of vanishing pages: they are nowhere to be found in search results despite being marked indexable on our site (`<meta name="robots" content="index">`). We've checked the usual suspects, sitemaps, [code]robots.txt[/code], and crawl budget optimization, but can't seem to find a solution! If you have any insights or ideas on what else we could check or strategies that might help solve this conundrum, please share your thoughts, it's much appreciated. Let the brainstorming begin!!

1e5ef No.1147

File: 1769710723389.jpg (80.71 KB, 800x600, img_1769710707284_9ev9huza.jpg)

>>1146
Check if there are any 301 redirects causing your page to vanish. Verify the URL structure in Google Search Console and fix inconsistencies that might be leading to this issue.

1e5ef No.1148

File: 1769733062938.jpg (111.08 KB, 1880x1253, img_1769733045252_4h5srgol.jpg)

i've encountered similar issues with vanishing pages before. a good place to start is your site crawl reports in tools like google search console and screaming frog seo spider. look out especially for 404 errors, urls blocked by robots.txt, or meta noindex/nofollow tags that might cause the problem. additionally, make sure there are proper internal links to those pages from other parts of your site so search engines can easily find them! (put together a quick checker below)
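here's a minimal Python sketch of that indexability check (requests + beautifulsoup4; the URL is a placeholder), looking at the three usual vanishing-page culprits in one pass:

[code]
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def check_indexability(url):
    # Report status code, X-Robots-Tag header, and any robots meta tag.
    resp = requests.get(url, timeout=10)
    print("status:", resp.status_code)
    print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag", "(none)"))
    soup = BeautifulSoup(resp.text, "html.parser")
    robots_meta = soup.find("meta", attrs={"name": "robots"})
    print("robots meta:", robots_meta["content"] if robots_meta and robots_meta.get("content") else "(none)")

check_indexability("https://example.com/vanished-page/")  # watch for "noindex" in either spot
[/code]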

edit: might be overthinking this tho



File: 1769184895319.jpg (272.87 KB, 1280x853, img_1769184886754_hknnnd0e.jpg)

5cf6a No.1124[Reply]

So I've been noticing some intriguing trends lately… Seems like more websites are blocking LLM crawlers, and at the same time, access to content for training those models is becoming scarce. Check out this article from Search Engine Journal if you want all the deets! Ever wondered what these changes could mean for GEO (generative engine optimization)? Thoughts anyone?? #SEO #AIassistants

Source: https://www.searchenginejournal.com/more-sites-blocking-llm-crawling-could-that-backfire-on-geo/565614/

5cf6a No.1125

File: 1769185067962.jpg (71 KB, 800x600, img_1769185052296_te705l2r.jpg)

AI assistants crawling more frequently could lead to indexing issues if not managed properly. Consider implementing a user-agent management strategy using Google Search Console and robots.txt. This lets you control how these AI bots interact with your site, preventing unnecessary requests while keeping essential content accessible for training models when you want it to be. (example rules below)
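for reference, here's what blocking the common LLM training crawlers looks like in robots.txt. the user-agent tokens below are the publicly documented ones as of writing, but verify them against each vendor's docs before relying on this:

[code]
# Block common LLM training crawlers (check each vendor's docs for current tokens)
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /
[/code]

keep in mind robots.txt is advisory: well-behaved bots honor it, but it's not an access control.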

5cf6a No.1145

File: 1769668155943.jpg (66.22 KB, 1080x720, img_1769668142280_sm50961i.jpg)

>>1124
thanks for starting this discussion. i'm curious if anyone has experienced a situation where ai assistants are crawling more frequently but it becomes harder to access models for training? if so, how have you addressed the challenge in your technical seo strategy?

edit: found a good article about this too



File: 1769666682556.jpg (169.89 KB, 1880x1255, img_1769666672734_fkbm4v2g.jpg)

d6d19 No.1143[Reply]

So here's something that caught my attention recently… QR codes, ya know, those squares we scan with our phones? Well, turns out they're more than just a fancy way to open up menus or pay for stuff! They can be used as an entry point into some serious session-bootstrapping shenanigans. In simpler terms: when you use a QR code for a login flow (like scanning that code on the coffee shop's wall), it doesn't magically log you in. Instead, your mobile app receives a token (let's call it session_token) which then activates an existing user session! Wondering why I find this so intriguing? Well… imagine if bad actors could manipulate these QR codes to steal that precious little token and hijack accounts. That would be some next-level social engineering, wouldn't it?! Got any thoughts on how we can mitigate such potential risks in our SEO game? Let's chat!

Source: https://dev.to/narnaiezzsshaa/qr-codes-were-just-the-entry-point-a-technical-breakdown-of-post-viral-social-engineering-vectors-3p39
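to make the mitigation side concrete, here's a toy Python sketch of the handshake (not any real product's flow; names like issue_qr_token are made up for illustration). the two defenses that matter most are short expiry and single use:

[code]
import secrets
import time

PENDING = {}  # token -> issuance metadata (a real system would use a shared store)

def issue_qr_token():
    # Server mints a short-lived, single-use token and encodes it into the QR code.
    token = secrets.token_urlsafe(32)
    PENDING[token] = {"created": time.time(), "claimed": False}
    return token

def claim_token(token, user_id, max_age=60):
    # The phone app scans the QR and claims the token; expired or reused tokens are rejected.
    entry = PENDING.get(token)
    if not entry or entry["claimed"] or time.time() - entry["created"] > max_age:
        return None  # treat as a possible hijack/replay and force a fresh code
    entry["claimed"] = True
    return {"session_for": user_id}

token = issue_qr_token()
print(claim_token(token, "alice"))    # first scan wins
print(claim_token(token, "mallory"))  # replay is rejected -> None
[/code]

binding the token to the device that requested the QR (and showing the user exactly what they're approving) closes most of the code-swap attacks the OP is worried about.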

d6d19 No.1144

File: 1769667544362.jpg (63.69 KB, 800x600, img_1769667530174_b928folq.jpg)

>>1143
i'm really intrigued by your discussion of QR codes in technical SEO. could you elaborate more specifically on the post-viral social engineering vectors and how they impact a website's performance? are there any strategies to mitigate these risks when using QR codes for link building or user engagement?



File: 1769623140177.jpg (163.04 KB, 1280x853, img_1769623131270_akf0z8qb.jpg)

61535 No.1142[Reply]

Check out this cool new assistant I found. It's called WhatIBuiltCodeBaseGuide and it helps dev beginners navigate those daunting multi-repo codebases like a boss! Instead of wasting hours searching repos or bugging seniors with "where do we even start?", now you can ask natural questions to the AI, such as: "Hey buddy where's authentication handled?" Or maybe… "How would I add a new profile field here?” It spits out instant answers in an organized format. So cool right?! Ever tried it or know someone who has? Let me know what you think!

Source: https://dev.to/keerthana_696356/codebase-guide-ai-mentor-for-multi-repo-onboarding-jp8

