The number of people accessing the Web on mobile phones and devices has been outpacing the number of desktop computer users.
Google has been trying to convince site owners to make their sites mobile-friendly, and there have been some patents and papers focusing on mobile devices accessing the Web.
You may see this history as your timeline, and there is a Google Help page to View or edit your timeline. This page starts out by telling us:

Your Location History helps you get better results and recommendations on Google products. For example, you can see recommendations based on places you’ve visited with signed-in devices or traffic predictions for your daily commute.
Apple has a new patent aimed at accelerating mobile Web pages. We’ve heard that from others elsewhere on the Web, and it’s beginning to look like a trend. Who wants faster web pages on their phones?
It has become increasingly obvious to people doing Search Engine Optimization that improving the quality of a website means making its pages faster and mobile-friendly, as more people have come to use phones and tablets as their primary connection to the Web.
Both Google and Yahoo helped site owners by releasing tools that could be used to check how fast their sites were. Google introduced PageSpeed Insights, an online tool that details steps a site owner can take to improve the speed of a site. Yahoo published a browser extension called YSlow that runs a page through a number of tests, or heuristics, which measure things that could be changed or improved to make the site faster.
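To give a feel for what "running a page through heuristics" can mean, here is a toy sketch in Python. It is not YSlow's actual rule set or PageSpeed's scoring; the three checks (external scripts, stylesheets, images without declared dimensions) are simply common examples of things speed tools flag.

```python
# A toy, YSlow-style page audit: illustrative heuristics only, not the
# actual rules used by YSlow or PageSpeed Insights.
from html.parser import HTMLParser


class SpeedHeuristics(HTMLParser):
    """Counts a few things that page-speed tools commonly flag."""

    def __init__(self):
        super().__init__()
        self.external_scripts = 0
        self.stylesheets = 0
        self.images_missing_dimensions = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and "src" in attrs:
            self.external_scripts += 1          # each script is a fetch
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.stylesheets += 1               # render-blocking CSS
        elif tag == "img" and ("width" not in attrs or "height" not in attrs):
            self.images_missing_dimensions += 1  # causes layout reflow


def audit(html: str) -> dict:
    checker = SpeedHeuristics()
    checker.feed(html)
    return {
        "external_scripts": checker.external_scripts,
        "stylesheets": checker.stylesheets,
        "images_missing_dimensions": checker.images_missing_dimensions,
    }
```

Real tools run dozens of such checks and also measure actual network timings; the point here is only that each heuristic is a cheap, mechanical test against the page.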
A recently granted Google patent explains how Google may find and recommend locations for people to take pictures at. It describes how it might use something like Google Now to recommend “photogenic locations to visit.”
The patent tells us:
The present disclosure relates generally to systems and methods for recommending photogenic locations to visit. More particularly, the present disclosure relates to prompting a mobile device user that a photogenic location is nearby based on clusters of photographs.
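As a rough illustration of what "clusters of photographs" might mean in practice, here is a toy Python sketch that buckets geotagged photos into grid cells and flags a cell with enough photos as a photogenic spot near the user. The grid-cell approach, cell size, and photo threshold are my own invented stand-ins, not anything the patent specifies.

```python
# A minimal sketch of prompting a user that a photogenic location is
# nearby, based on clusters of photos. Grid bucketing stands in for
# whatever clustering method Google actually uses; CELL and MIN_PHOTOS
# are invented values for illustration.
from collections import Counter

CELL = 0.01        # grid cell size in degrees (~1 km); an assumption
MIN_PHOTOS = 3     # photos needed before a cell counts as photogenic


def cell_of(lat, lon):
    """Snap a coordinate to its grid cell."""
    return (round(lat / CELL), round(lon / CELL))


def photogenic_cells(photo_coords):
    """Return grid cells holding at least MIN_PHOTOS geotagged photos."""
    counts = Counter(cell_of(lat, lon) for lat, lon in photo_coords)
    return {cell for cell, n in counts.items() if n >= MIN_PHOTOS}


def nearby_suggestion(user_lat, user_lon, photo_coords):
    """True if the user's current cell is a photo cluster worth flagging."""
    return cell_of(user_lat, user_lon) in photogenic_cells(photo_coords)
```

A production system would use real clustering over millions of photos and factor in time of day, photo quality, and the user's route, but the core idea is the same: many photos taken in one place is a signal that the place is worth photographing.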
The article seems more filled with questions than answers, such as where Google is getting the menu information, and even why they are publishing menu information. I suspect that a lot of restaurants will be begging Google for ways to submit their latest menus in the near future.
Knowing what the menu looks like at a restaurant might make the difference between whether you dine there or drive past. For example, if I didn’t know better from word of mouth, I wouldn’t begin to suspect that the Inn at Little Washington, in the middle of nowhere in rural Virginia, might be one of the best restaurants in the United States. Here’s part of their menu:
That phone in your pocket is filled with applications, with sensors to measure movement and the world around us, with communications tools that put us in touch with work, home, family, friends, service providers and strangers.
That phone in your pocket is poised to teach itself how to work better, based upon how you use it, which applications you run, and how you use it to communicate with others.
A patent granted to Google last week explores different ways that parts and pieces of your phone can communicate with each other to remember settings in different contexts, and to re-rank information based upon location, time, and place, as part of a mobile machine learning system.
Imagine, for instance, landing at San Francisco International Airport to visit your brother. As you step off the plane, your phone resets its location and displays time and weather information on its home page for San Francisco. You open your phone, and the number for your limo appears at the top, with your hotel next, and then your brother’s home number (it would show his work number if it were earlier in the day).
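One hedged sketch of that re-ranking idea in Python: score each contact by how often it was called in a similar context (here, just the hour of day), so the limo, hotel, or brother's number can float to the top at the right time. The call-log format, the two-hour window, and the scoring rule are illustrative assumptions on my part, not details from the patent.

```python
# A toy context-aware re-ranking of contacts: not the patent's method,
# just an illustration of scoring by similarity to the current context.
from collections import defaultdict


def rank_contacts(call_log, current_hour, window=2):
    """call_log: list of (contact, hour_of_call) pairs.

    Ranks contacts by how many of their past calls fall within
    `window` hours of the current time of day.
    """
    scores = defaultdict(int)
    for contact, hour in call_log:
        # circular distance on a 24-hour clock
        diff = min(abs(hour - current_hour), 24 - abs(hour - current_hour))
        if diff <= window:
            scores[contact] += 1
    return sorted(scores, key=scores.get, reverse=True)
```

A real system would blend many signals (location, day of week, which apps are open) rather than hour of day alone, but the mechanism is the same: past behavior in similar contexts predicts what you want now.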
Imagine recording your life, so that you can search through it, and play it back later. Things that you record through audio and video might be sent to your own personal search database, where pictures you take might be processed. Images of faces may go through facial recognition software, and landmarks and objects might be recognized as well. You might be able to write or speak queries like the following:
What was the playlist of songs at the party last night?
What were the paintings I saw when I was on vacation in Paris?
Who were the people at the business lunch this afternoon?
How many books did I read in May?
It’s possible that you might be able to collect information like this, and have it associated with both your user ID and a digital signature to keep it from others, unless you decided to join a group such as a family, or firefighters, or co-workers, to create a shared database for one or more events.
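To make that last idea concrete, here is a small Python sketch of tying a recorded event to a user ID and a signature, so that only holders of a key (one person, or a shared group key for a family or team) can verify entries. HMAC with a shared secret stands in for whatever signature scheme such a system would actually use; the record format is my own invention.

```python
# A sketch of signing personal records with a user ID and a key.
# HMAC-SHA256 is a stand-in for the real system's signature scheme.
import hashlib
import hmac
import json


def sign_record(user_id: str, event: dict, key: bytes) -> dict:
    """Bind an event to a user ID and sign it with the (shared) key."""
    payload = json.dumps({"user": user_id, "event": event}, sort_keys=True)
    sig = hmac.new(key, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": sig}


def verify_record(record: dict, key: bytes) -> bool:
    """Only holders of the key can confirm a record is genuine."""
    expected = hmac.new(key, record["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"])
```

Sharing the key with a family or a group of co-workers would let everyone in the group verify (and contribute to) the shared event database, while outsiders without the key could not.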