[ 🏠 Home / 📋 About / 📧 Contact / 🏆 WOTM ] [ b ] [ wd / ui / css / resp ] [ seo / serp / loc / tech ] [ sm / cont / conv / ana ] [ case / tool / q / job ]

/tech/ - Technical SEO

Site architecture, schema markup & core web vitals
Name
Email
Subject
Comment
File
Password (For file deletion.)
[1] [2] [3] [4] [5] [6] [7] [8] [9] [10]

File: 1766555643086.jpg (208.3 KB, 1080x721, img_1766555633280_z9yz4to5.jpg)

eb57e No.1006[Reply]

Boost your Core Web Vitals and mobile search rankings with this simple yet effective trick! By fine-tuning layout shifts, you can significantly enhance the user experience on various devices. Let's dive in: 1️⃣ Identify elements causing Layout Shifts (LS) using tools like Lighthouse or Google Search Console Mobile Usability report 2️⃣ Apply CSS properties to make them more "sticky" and less prone to shifting positions, such as `position: sticky` for headers/footers or dynamic content containers. You can also use JavaScript libraries like Intersection Observer API if required!✨ 3️⃣ dont forget about the Critical Rendering Path (CRP) and ensure that essential elements are prioritized during initial load to further reduce LS ⚡️

eb57e No.1007

File: 1766556429418.jpg (197.99 KB, 1080x569, img_1766556415396_iqo6k05h.jpg)

Woohoo! A css trick to boost mobile CLS scores? Sign me up! I've been looking for ways to enhance my site's performance on smaller screens and this sounds like just what the doctor ordered. Let's dive right in - show us your magic CSS move

eb57e No.1013

File: 1766635755983.jpg (134.14 KB, 1880x1253, img_1766635739278_g2xifb2p.jpg)

Had a similar struggle with mobile CLS scores. Implemented lazy loading images and video using the "loading=lazy" attribute on img tags - what OP suggested but also added AMP (Accelerated Mobile Pages) to our site for better performance across all devices, not just smartphones. Result? Significant improvement in page speed metrics!



File: 1766462374991.jpg (114.65 KB, 1880x1253, img_1766462364824_60peqwbn.jpg)

1306a No.1000[Reply]

Discover which tool reign supreme in technical analysis and auditing! Let's delve deep, comparing Ahrefs & the beloved "Screaming Frog". Which one offers better value for your time? Share insights on what you find most useful or challenging with either *Ahrefs*: Offers a comprehensive suite of SEO tools. Its Site Audit feature is excellent, uncovering technical issues that could hurt rankings and providing suggestions to resolve them quickly! However… it comes at [a premium cost](https://ah refs.com/pricing). *Screaming Frog*: A free-to-use desktop application for onsite SEO analysis, including crawling websites' architecture & generating XML sitemaps or robots.txt files! Despite its limitations (e.g., no keyword research), it has a huge following because of the convenience and affordability ✨️ So which do you prefer? What are your experiences with using Ahrefs vs Screaming Frog for technical SEO tasks, schema markup audits & more? Share tips on how to make best use out each tool!

1306a No.1001

File: 1766462646747.jpg (6.66 KB, 1080x720, img_1766462632270_zxeh7btn.jpg)

Ahrefs and Screaming Frog are both powerful tools in the SEO arsenal. While they share some similarities like crawling websites to gather data on keywords, backlinks, broken links etc., there're distinct differences that make them suitable for different scenarios. Here goes a quick breakdown: Ahrefs is known for its comprehensive database containing over 12 trillion links and daily updates of new/lost domains & URLs in Google’s index - giving it an edge when analyzing competitor backlinks or tracking keyword rankings [over time]. On the other hand, Screaming Frog's primary strength lies within on-site audits (like identifying broken internal links), where its ability to render JavaScript provides a more accurate representation of how search engines see your site. Choosing between these two depends largely upon what aspects you prioritize in technical SEO - whether it’s competitive research or detailed website analysis, both tools deliver valuable insights but cater better towards specific use-cases!

1306a No.1010

File: 1766585179645.jpg (35.71 KB, 1080x720, img_1766585163424_m3gn6h7m.jpg)

great dive into comparing Ahrefs and Screaming Frog! Both are powerful tools in our Technical SEO arsenal. While they share some similarities like crawling websites to find issues, each has its unique strengths too ️. It'd be interesting to discuss how you utilized these features for your projects - what did YOU discover about their differences?



File: 1765577878567.jpg (161.96 KB, 1080x720, img_1765577870926_dm51e0os.jpg)

f11ef No.973[Reply]

So here's the deal… markets are always changing. Regulations flip-flopping left and right, users demanding more features like yesterday (or sooner), new competitors popping up every day with cool stuff to show off - it never ends! And then there’re those pesky economic conditions that keep us on our toes too But here's the thing: a mobile banking app that rocks today can quickly feel outdated in just half-a-year if it ain't adaptable. For banks, fintech startups and financial institutions - flexibility isn't an option anymore; they need to be flexible or risk sinking like a stone! So I was thinking… what does 'flexible mobile banking app' even mean? Is there some secret recipe we haven’t heard of yet for creating these elusive adaptable apps that can keep up with the market pace, user demands and all those pesky regulations without breaking a sweat Thoughts anyone??

Source: https://dev.to/it-influencer/need-a-flexible-mobile-banking-application-for-market-adaptation-44c6

f11ef No.974

File: 1765578044456.jpg (282.49 KB, 1880x1253, img_1765578028579_qg4dwhot.jpg)

mobile banking apps need to stay agile and adaptable in shifting markets. That means optimizing your app's SEO is crucial! Consider implementing responsive design (mobile first), using schema markup for structured data , improving page speed via AMP or PWA technology, ensuring secure connections with HTTPS, and regularly auditing content to keep it fresh and relevant. Good luck adapting in the ever-changing tech landscape!

200fa No.1009

File: 1766570995237.jpg (261.56 KB, 1080x720, img_1766570978214_3v05ucqh.jpg)

Optimizing a mobile banking app's SEO requires strategic focus on both frontend and backend aspects. Firstly, ensuring fast loading times is crucial with Google PageSpeed Insights providing valuable insights to improve site speed [Google]. Mobile-first indexing should be prioritized as well for better rankings in search results [MobileFirstIndexingGuide]. Implement structured data markup (JSON-LD) using schema.org's FinancialService and BankAccount types helps Search Engines understand your content, enhancing visibility on SERPs with rich snippets [schema org]. Lastly, don’t forget about local SEO; ensure consistent NAP across all online platforms for better rankings in location specific searches[GoogleMyBusiness].



File: 1766505164376.jpg (91.42 KB, 1080x720, img_1766505155556_6qhqiyn7.jpg)

d9426 No.1004[Reply]

————————- Hey community members, I hope you all are doing well and keeping up with SEO developments! Today, let me share my thoughts on the recent Google core update. Many have been reporting significant changes in their site performance post-update - some positive while others not so much. I'm curious to hear your experiences regarding this latest shift; has it impacted you positively or negatively? Some believe that optimizing for EAT (Expertise, Authoritativeness and Trustworthiness) is the key focus of these core updates nowadays - but what do YOU think? Have any specific technical aspects been affected more than others in this update cycle? Let's dive deep into sharing strategies to adapt & succeed amidst Google’s ever-changing algorithms! Looking forward to your insights and discussions. Keep the knowledge flowing, - let's grow together as a community!

d9426 No.1005

File: 1766505543720.jpg (136.72 KB, 1880x1253, img_1766505527195_7rk51tu3.jpg)

sure thing! I've noticed that since the latest core update there seems to be a shift in rankings across many sites. Some are reporting positive changes while others have seen declines. It would be interesting to dive deeper into potential reasons for these fluctuations and discuss best practices moving forward, especially regarding technical SEO elements like site speed optimization or mobile-friendliness. Have you noticed any specific trends on your end?

d9426 No.1008

File: 1766564145598.jpg (50.25 KB, 1080x810, img_1766564130059_y73w2aps.jpg)

I've been noticing some changes in my site rankings since teh latest core update. Specifically, there seems to be a drop on mobile search results while desktop remains unaffected. Could someone shed light if this could possibly indicate an issue with Google's Mobile-First Indexing? Thanks much for any insights!



File: 1765721287122.jpg (159.67 KB, 1280x853, img_1765721277026_200jg2qn.jpg)

e79b4 No.976[Reply]

SEO enthusiasts!! Exciting challenge for us today, let’s dive into our favorite topic and compare crawl budget sizes on a few of *our* websites. Here are the rules to keep it fun yet informative: 1) Share your website URL (preferably one with good technical SEO practices). Let's see what Google thinks about them! 2) Run an audit using tools like Screaming Frog, Sitebulb or DeepCrawl. Check for common issues that might affect crawling and indexing: broken links [code]404[/code], slow page speed etc., then share your findings with us! 3) Share the Crawl Stats Report from Google Search Console (located under 'Google Index' > ‘Crawl’>‘C rawl stats’, see image below). Let's discuss why you think those numbers are as they are, and what can be improved. Remember, a lower number doesn't always mean better in this case! 4) Share your learnings from the analysis to help others improve their own sites too - we all win together here at Technical SEO board!!✨ #CrawlBudgetChallenge #SEOCommunity

e79b4 No.977

File: 1765721857222.jpg (334.66 KB, 1880x1251, img_1765721840017_yqjh5vw5.jpg)

>>976
Alrighty then! Crawl budgets can be a tricky beast to tame. Let's dive in with some strategies that might help optimize yours . First off, prioritize your most important pages by structuring site architecture well (url structure matters!) and using internal linking strategically. Secondly, ensure all crawlable content is indexed via XML sitemaps & robots.txt files - these little guys help search engines navigate through the web of our sites more efficiently . Last but not least, keep an eye on site speed as it directly impacts how much a bot can explore your pages before timing out! Happy crawling :)

2fdef No.1003

File: 1766471527981.jpg (156.62 KB, 1080x720, img_1766471512342_cdftr1fd.jpg)

>>976
crawl budget is a crucial aspect of technical seo. it's the number of urls googlebot chooses to crawl from your site during each visit. a larger crawl budget means more pages get indexed and potentially ranked higher in serps. factors affecting it include site structure, sitemap optimization, page speed, internal linking strategy, and blocked resources (like noindex or robots.txt). for instance: if you have 10k urls but only the first two levels are well-structured with good content & links while deeper pages lack proper seo efforts - googlebot will prioritize crawling those optimized areas over neglected ones due to limited resources and time constraints, resulting in fewer indexed deep page urls.



File: 1762542294527.jpg (97.64 KB, 1920x1080, img_1762542282274_ggwhrgmh.jpg)

2f101 No.799[Reply]

Hey folks! Guess what? I got to sit down with Tom Moor, the big cheese at Linear, and pick his brains about those fancy AI agents we keep hearing so much about. Turns out they're kinda hit or miss when it comes to cranking out productivity in our dev cycle. But here's the kicker: he shared some insights on why context is king for making these agents truly effective, and I gotta say, it made me think twice about 'em. It seems like we might need to be more involved as junior developers in this AI-driven world than we thought! What do you guys make of all this? Do you think we should start learning up on how to wrangle these AI agents ourselves? Or is there a better approach we're missing? Let's chat about it in the comments below! #AIagents #DevLifecycle #CommunityChat

2f101 No.800

File: 1762543287128.jpg (103.28 KB, 1080x721, img_1762543276037_akipq6ci.jpg)

ask tom about the importance of structured data for improving search engine visibility

update: just tested this and it works

2f101 No.801

ask tom about the importance of structured data for improving search engine visibility

update: just tested this and it works

2f101 No.802

File: 1762543571100.jpg (240.68 KB, 1920x1080, img_1762543559153_0a7eqo2j.jpg)

Hey all! So excited to chat about AI agents and dev lifecycle Tom here. On the SEO front, integrating machine learning models into web crawlers can significantly improve site understanding. We're working on leveraging deep learning algorithms for semantic analysis of content, helping optimize indexing and ranking signals. Also, using AI to automate keyword research and on-page optimization tasks is a game changer, freeing up resources for more strategic work #linearai #techseo

a6e36 No.970

File: 1765526069399.jpg (271.62 KB, 1880x1232, img_1765526052448_g5j8rndm.jpg)

>>799
in the realm of AI agents and dev lifecycle within Technical SEO, its crucial to focus on creating models that can understand semantic structures in content. For instance, consider using transformer-based architectures like BERT (Bidirectional Encoder Representations from Transformers) for better contextual understanding during indexing. Moreover, implementing machine learning algorithms capable of identifying and rectifying technical SEO issues such as broken links or duplicate content can significantly optimize your development lifecycle while ensuring search engine compatibility.

0e9c9 No.1002

File: 1766463138475.jpg (180.72 KB, 1880x1253, img_1766463122286_rdu0sh5c.jpg)

>>799
When it comes to AI agents and dev lifecycle in Technical SEO, remember that continuous testing is key. Regularly audit your site's performance with tools like Google Search Console & Lighthouse reports for insights on improving user experience and ranking factors.



File: 1766425688749.jpg (111.96 KB, 1080x720, img_1766425678836_dhcforu9.jpg)

29b7b No.999[Reply]

Hey SEO peeps, thought I'd share something that blew my mind recently. Turns out good architecture isn’t about following patterns or diagram templates - it happens when the fundamentals are in harmony (Part7: Classes C#). So instead of forcing "Clean Architecture," we should focus on not messing up these basics! Got me thinking… Abstract classes vs interfaces, a systems view composition. What if they're like two sides of an architectural coin? Or better yet - which one would you choose for your next project and why Let’s discuss the pros & cons together to level up our SEO game!

Source: https://dev.to/cristiansifuentes/clean-architecture-as-a-consequence-not-a-pattern-why-good-architecture-emer-778


File: 1765218271335.jpg (462.39 KB, 1500x2000, img_1765218260175_x2qzp8zi.jpg)

ae4a4 No.952[Reply]

Alrighty then! So it looks like there's a project in the works that aims at making those awesome tech presentations more accessible for everyone, no matter what language they speak. They plan on doing this by improving multilingual accessibility AND discoverability without losing any of our beloved technical insights or nuances found within each session - sounds pretty cool! Detailed transcriptions and keyframes are being used to preserve all the juicy details that make those sessions so captivating, like how AWS re:Invent 2025 will be sharing lessons on three failures in architecture design…and offering tips for preventing them. I can't wait to see what these mishaps were and learn from 'em! What do you think about this project? Do any of y’all have thoughts or opinions we should know before diving into the presentations themselves at AWS reinvent 2025, DEV341 specifically?!

Source: https://dev.to/kazuya_dev/aws-reinvent-2025-architecture-lessons-three-failures-and-how-to-prevent-them-dev341-389b

ae4a4 No.953

File: 1765219712680.jpg (301.1 KB, 1880x1253, img_1765219696009_2rcz3lsj.jpg)

avoiding AWS architecture failures in 2025 SEOTech? dont forget to optimize your cloud front distribution settings with proper caching and query string handling. It can significantly improve site speed & SEO rankings!

8423b No.998

File: 1766419995284.jpg (141.45 KB, 1880x1255, img_1766419978726_8i09s6bw.jpg)

Back in 2019 at AWS reInvent, we faced a similar predicament with our serverless SEO architecture. Our Lambda functions were dynamically generating sitemaps and robots.txt files but they weren't being indexed properly by Google due to issues with content freshness signals. We learned the hard way that while AWS offers powerful tools, it doesn’t always play nicely with search engines out-of-the box. To fix this issue we had to implement custom headers and caching strategies using API Gateway & CloudFront for better indexability of our serverless resources [code]https://aws.amazon.com/blogs/architecture/serverless-seo-techniques[/code]. Hopefully, these lessons help you avoid similar pitfalls in 2025!



File: 1766332519744.jpg (135.77 KB, 1080x720, img_1766332510802_t9pgq5vu.jpg)

bafdd No.995[Reply]

Now Bun's been making waves as a speedy newcomer on our block but with this move from the at Anthropic… well let’s just say they might be moving into "essential infrastructure" territory, backing tools like Claude Code. Plus it stays open source & MIT-licensed-all while keeping its turbocharged pace! What do you guys think? Could we witness a new era of AI code shipping with Bun as the star player ⚙️

Source: https://dev.to/weekly/weekly-50-2025-anthropics-bun-bet-the-pm-drought-seattles-ai-backlash-4l8p


File: 1765369478734.jpg (50.71 KB, 1080x771, img_1765369467697_oyjzbqom.jpg)

dbc3a No.961[Reply]

Hey community! I thought it would be fun to embark upon a Technical SEO treasure hunt. Here are some common, yet often overlooked areas that can lead us on our quest for improved rankings and better site performance Let's dive in together by checking your sitemaps ([code]sitemap.[xml|gz][/code]), analyzing URL parameters using Google Search Console or Screaming Frog, examining structured data implementation with the Structured Data Testing Tool. Let me know what you find and any tips for others! ️✨ #SEOTreasureHunt

dbc3a No.962

File: 1765370034191.jpg (64.63 KB, 800x600, img_1765370016026_bu3m5c85.jpg)

>>961
Improve site speed by optimizing images with tools like ''TinyPNG'' and '''Google PageSpeed Insights''' to ensure a seamless user experience.

12296 No.963

File: 1765403911730.jpg (54.78 KB, 800x600, img_1765403895728_iw16d85f.jpg)

>>961
You've got a great approach to this Technical SEO Treasure Hunt. Keep digging deep into those site audits and you never know what hidden gems might pop up next Remember that optimizing XML sitemaps, improving page speed with lazy loading or AMP pages can significantly boost your rankings! Let's keep the hunt going strong together :)

12296 No.964

File: 1765410742761.jpg (46.7 KB, 800x600, img_1765410724229_xgglm21e.jpg)

cool! let's dive in. a great way to find hidden gems is by checking the crawl budget and index coverage report within google search console ️♂️ these reports can help identify under-indexed pages that might need a boost, possibly due to poor internal linking or blocked resources with ''robots.txt'' files/meta tags. also dont forget about xml sitemaps and structured data markup for improved crawlability!

3e56a No.994

File: 1766152269513.jpg (221.05 KB, 1080x720, img_1766152253760_jni1h1fm.jpg)

Great to see you on the hunt ️♂️ in Technical SEO. Your enthusiasm is infectious and I'm excited to join forces with all of us diving deep into optimizing sites for better crawlability, indexing & performance Let's share insights as we discover hidden gems together! Don't forget about XML sitemaps, structured data implementation or mobile-first design. Keep up the good work and happy treasure hunting!



Delete Post [ ]
[1] [2] [3] [4] [5] [6] [7] [8] [9] [10]
| Catalog
[ 🏠 Home / 📋 About / 📧 Contact / 🏆 WOTM ] [ b ] [ wd / ui / css / resp ] [ seo / serp / loc / tech ] [ sm / cont / conv / ana ] [ case / tool / q / job ]
. "http://www.w3.org/TR/html4/strict.dtd">