An Image Content Analysis And A Geo-Semantic Index For Recommendations

This Patent Uses An Image Content Analysis With A Geo-Semantic Index To Build Search Recommendations. It’s interesting to see Google combining Google Maps with a better understanding of the semantic composition of areas within that map, analyzing images from sources such as Street View. This patent reminded me of another one recently granted to Google, which I … Read more

Rewritten Queries and User-Specific Knowledge Graphs

Rewritten Queries On Search Engines Using Mobile Devices. People write queries for search engines to find answers that fill their situational or informational needs. A recently granted patent from Google describes how a search engine might provide rewritten queries for people searching on handheld mobile devices such as mobile phones. Queries are rewritten using annotations … Read more

Semantic Relevance of Keywords

Prelude – What Is a Keyword? Usually, a term or a phrase is selected to be associated with a page so that the page may rank for that term or phrase in search results. That is known as selecting a keyword for a page. Domain Terms as Keywords. There are other times when you may … Read more

Locally Prominent Semantic Features

Finding Locally Prominent Semantic Features Using a Computer. This patent relates to determining locally prominent semantic features using a computer. Operations associated with the state of a geographic area can be implemented on a variety of computers. These include processing data associated with the geographic area for later access and use by a user or … Read more

BERT Question-Answering at Google

BERT Question-Answering. A Google patent from May 11, 2021, is about natural language processing (“NLP”) tasks such as question answering. It relies on a language model pre-trained using world knowledge. Advances in language-model pre-training have led to broader use of such models; this one uses Bidirectional Encoder Representations from Transformers (“BERT”). Google has worked on … Read more