
I wrote this article in 2011. And the main point of the article was: "Your Holy SEO Mantra for Meta Descriptions for the next THREE YEARS must now be:
'The Entire Page Is The Meta Description'. GET THAT THROUGH YOUR THICK, CONSERVATIVE, GRAND-FATHER CLAUSED NOGGIN’ because I am *not* going to repeat myself again for a long, long time."

You're welcome.

#seo #searchengineoptimization #metadescriptions #google #bing #webmarketing #digitalmarketing

"These Aren't The Meta Descriptions You're Optimizing For"

seo-theory.com/these-arent-the

SEO Presentations Are Loaded With Empty Statistics

Empty statistics are white noise. They are data that is commonly cited, perhaps even to the extent that most of your intended audience knows the information before you begin your presentation. Empty statistics can also be irrelevant or meaningless within the context of your presentation or article. For some reason, every SEO presentation I've read through or sat through over the past couple of years is front-loaded with empty statistics.

seo-theory.com/seo-presentatio

People keep coming back to this article (originally written in 2015). I've revised and expanded it with a few points that I haven't shared publicly before. The suggestions for identifying low quality content and improving content quality are still relevant today.

#google #quality #search #algorithms #eeat #seo #searchengineoptimization #webmarketing #digitalmarketing #panda

"How The Panda Algorithm Might Evaluate Your Site"

seo-theory.com/google-panda-si

Web Marketers Must Stop Complaining About Google AI

My partner Randy Ray and I manage - for ourselves and clients - over 200 Websites. We see the changes in search referral trends in the analytics data. Websites that were once thriving in the search ecosystem are now receiving fractions of their previous traffic volumes. And, frankly, I'm not worried. The reactions I'm seeing among experienced SEOs to this latest change in the…

#ai #google #search #searchengineoptimization #seo #webmarketing #digitalmarketing

seo-theory.com/web-marketers-m

Continued thread

Google did not invent query fan-out. The technique is older than Google. But here is a Google patent from 2009 that describes a query fan-out method.

"Generating sibling query refinements"

"ABSTRACT: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for identifying query refinements from sibling queries. In one aspect, a method includes associating each of a plurality of parent queries with a respective group of one or more child queries for the parent query, identifying one or more candidate sibling queries for a particular child query, selecting one or more final sibling queries for the particular child query from the one or more candidate sibling queries, and associating the final sibling queries with the particular child query as query refinements."

patents.google.com/patent/US82
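
To make the patent's method concrete, here is a minimal Python sketch of selecting sibling queries as refinements. The toy data, the shared-parent scoring heuristic, and all the names are my own assumptions for illustration; they are not taken from the patent.

```python
from collections import defaultdict

# Toy data: parent queries mapped to groups of child queries (refinements).
# Everything here is invented for illustration.
parent_to_children = {
    "running shoes": ["trail running shoes", "running shoes for flat feet",
                      "cheap running shoes"],
    "trail shoes":   ["trail running shoes", "waterproof trail shoes"],
}

def select_sibling_refinements(child_query, max_refinements=3):
    """Find candidate siblings (other children sharing a parent with child_query),
    then pick final siblings by a trivial score: how many parents they share."""
    scores = defaultdict(int)
    for parent, children in parent_to_children.items():
        if child_query in children:
            for sibling in children:
                if sibling != child_query:
                    scores[sibling] += 1
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:max_refinements]

# "trail running shoes" appears under both parents, so its candidate siblings
# come from both groups.
print(select_sibling_refinements("trail running shoes"))
```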

#ai #aimode #google

I love Professor Quirrell. Yes, he’s evil and a secret host for the disembodied spirit of Tom Riddle. But Ian Hart made him an unforgettable meme. The scene with Quirrell running into the dining hall at Hogwarts yelling “Troll! In the dungeon!” just plays out perfectly. He symbolizes half the SEO world every time Google burps. People panic, start declaring that “SEO has changed!” and “Google […]

https://www.seo-theory.com/chaos-in-the-dungeons-chaos-in-the-dungeons/

#ai #aimode #google

"Search With Stateful Chat" patent (Cf. patents.google.com/patent/US20 ) - appears to describe the Gemini app for smartphones.

"Method for Text Ranking with Pairwise Ranking Prompting" (Cf. patents.google.com/patent/US20 ) - documents an experimental process described in this research paper titled "Large Language Models are Effective Text Rankers with Pairwise Ranking Prompting" (Cf. arxiv.org/pdf/2306.17563 ). There is no indication this was introduced into a live agentic system like Gemini.

"User Embedding Models for Personalization of Sequence Processing Models" (Cf. patents.google.com/patent/WO20 ) - documents an experimental process for improving recommender (sub-)systems (like movie searches) that incorporate large language models. The process is described in this research paper titled "User Embedding Model for Personalized Language Prompting" (Cf. arxiv.org/pdf/2401.04858 ).

"Systems and methods for prompt-based query generation for diverse retrieval" (Cf. patents.google.com/patent/WO20 ) - updates a 2022 patent for a process named PROMPTAGATOR that generates queries more efficiently based on a small number of examples, as described in this research paper titled "Promptagator - Few-shot Dense Retrieval from 8 Examples" (Cf. arxiv.org/pdf/2209.11755 ). This could be used to generate query fan-outs (but query fan-out has been used in multiple systems at least since the 1990s, so there are many implementations).

"Instruction Fine-Tuning Machine-Learned Models Using Intermediate Reasoning Steps" (Cf. patents.google.com/patent/US20 ) - documents an older method for fine-tuning instructions submitted to LLMs, as described in this 2022 research paper titled "Scaling Instruction-Finetuned Language Models" (Cf. jmlr.org/papers/volume25/23-08 ). The work has been superseded by this paper titled "Mixture-of-Experts Meets Instruction Tuning: A Winning Combination for Large Language Models" (Cf. arxiv.org/pdf/2305.14705 ).

This is the AI Overviews patent, titled "Generative summaries for search results" (Cf. patents.google.com/patent/US11 )

"Search with stateful chat" (US20240289407A1, Google Patents)

"ABSTRACT: Implementations are described herein for augmenting a traditional search session with stateful chat—via what will be referred to as a “generative companion”—to facilitate more interactive searching. In various implementations, a query may be received, e.g., from a client device operated by a user. Contextual information associated with the user or the client device may be retrieved. Generative model (GM) output may be generated based on processing, using a generative model, data indicative of the query and the contextual information. Synthetic queries may be generated using the GM output, and search result documents (SRDs) may be selected. State data indicative of: the query, contextual information, one or more of the synthetic queries, and the set of search result documents, may be processed to identify a classification of the query. Based on the classification downstream GM(s) may be selected and used to generate one or more additional GM outputs."
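
Reading that abstract as a pipeline, the flow might look roughly like this. Every function below is a stub I invented to mirror the steps in the abstract; none of it is Google's code.

```python
def generate(prompt):
    """Stub for a generative model (GM) call."""
    return f"[model output for: {prompt}]"

def derive_synthetic_queries(query):
    """Stub: fan the query out into related synthetic queries."""
    return [f"{query} overview", f"{query} examples"]

def retrieve_documents(queries):
    """Stub: pretend to select search result documents (SRDs) for each query."""
    return [f"doc about '{q}'" for q in queries]

def classify_query(state):
    """Stub: classify the query from the accumulated state data."""
    return "informational"

def generative_companion(query, user_context):
    gm_output = generate(f"{query} | context: {user_context}")  # GM over query + context
    synthetic_queries = derive_synthetic_queries(query)          # synthetic (fan-out) queries
    srds = retrieve_documents([query, *synthetic_queries])       # select SRDs
    state = {"query": query, "context": user_context,
             "synthetic_queries": synthetic_queries,
             "srds": srds, "gm_output": gm_output}
    classification = classify_query(state)                       # classify from state data
    # A downstream GM (selected by the classification) produces the final output.
    return generate(f"{classification} answer using {len(srds)} SRDs")

print(generative_companion("query fan-out", "mobile user, evening"))
```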
#google #aioverviews #aimode

Large Language Models do not crawl the Web but apparently the SEO community is now going to tell the world they do.

The credibility of this industry just takes a nosedive every time something new comes along - not because of our ignorance of what is new, but because of the rush to become "expert" in things about which we know virtually nothing.

You CANNOT REVERSE ENGINEER Google's processes from emails disclosed in court. You will learn bits and pieces. RESIST THE TEMPTATION to create your own picture with those bits and pieces. Take this email as an example. The highlighted section says: "Those signals will be very helpful for us to upweighting good, authoritative pages and downweighting the spammy, untrustworthy ones." But read the rest of the message.

It would be challenging to train a Large Language Model on quality scores. They're not words and phrases. They're numbers.

The quality signals could be used in the pretraining process to filter out documents they don't want to use for training, or to ensure documents they want to use ARE included. They could also be used to assign aggregated scores to documents that are chosen for training (maybe as weights that adjust the weighted averages used to compute the relationships between words and phrases across the body of training documents).

The second paragraph makes it clear they didn't want to directly integrate these signals into the training data.
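
As a toy illustration of the filtering and weighting idea (my own sketch, with made-up scores and thresholds; nothing here reflects Google's actual pipeline):

```python
import random

# Invented documents with numeric quality scores attached.
documents = [
    {"text": "well-researched reference article", "quality": 0.92},
    {"text": "thin affiliate page",               "quality": 0.18},
    {"text": "decent forum answer",               "quality": 0.55},
]

QUALITY_FLOOR = 0.30  # drop anything below this score from the training corpus

def sample_training_docs(docs, k=2):
    """Filter out low-quality documents, then sample the rest in proportion to
    their scores so higher-quality pages are seen more often during pretraining."""
    kept = [d for d in docs if d["quality"] >= QUALITY_FLOOR]
    weights = [d["quality"] for d in kept]
    return random.choices(kept, weights=weights, k=k)

for doc in sample_training_docs(documents):
    print(doc["quality"], doc["text"])
```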

#ai #google #gemini

SEARCH ENGINE OPTIMIZATION = Improving the relationship between websites and search engines (including site search).

ANSWER ENGINE OPTIMIZATION = Improving the relationship between brands and interactive AI applications.

Learn the difference. Accept the evolution in human-computer interaction. Stop arguing over acronyms in a vain attempt to preserve "SEO" as some meaningful parcel of the universe.

#webmarketing #branding #seo

How To Read Patents For SEO

Seems like everyone wants to be the first to read some fantastic patent and discover the secret to Google's algorithms. If only it worked that way. If only it were that easy. If only some people weren't consumed by hubris, announcing great discoveries every few months. Technology companies love their patents. Many of them award bonuses, grants, and special recognition to employees who contribute to patents.

#seo #searchengineoptimization #searchengines #patents #webmarketing #digitalmarketing #ai #machinelearning

seo-theory.com/how-to-read-pat
