White Hat Search Engine Optimization (SEO): Structured Web Data for Libraries
“White hat” search engine optimization refers to the practice of publishing web pages that are useful to humans, while enabling search engines and web applications to better understand the structure and content of your website. This article teaches you to add structured data to your website so that search engines can more easily connect patrons to your library locations, hours, and contact information. A web page for a branch of the Greater Sudbury Public Library retrieved in January 2015 is used as the basis for examples that progressively enhance the page with structured data. Finally, some of the advantages structured data enables beyond search engine optimization are explored.
Keywords: search engine optimization; structured data; schema.org; RDFa
Introduction
Search engine optimization (SEO) has acquired an unsavoury reputation through its association with spammers who want to sell you snake oil that will help your website rise in the ranks of search engine relevancy. Web rings, spam comments on blogs, and other underhanded tricks have cast aspersions on the perfectly valid pursuit of making your content and services discoverable by the people who might benefit from them. The term hacker once simply described those who were interested in how technology works and how to make it work better, until it was spoiled by association with unethical actors who wanted to steal content or make lives miserable. The developer community has since reclaimed the term by distinguishing “white hat” hackers from “black hat” hackers. So, too, are web developers reclaiming the term SEO by distinguishing “white hat” SEO from “black hat” SEO.
In this article, I introduce you to some of the basic properties of a website that you need to look at from a white hat SEO perspective to make it easier for your prospective audience to connect to your library and its resources. This article uses examples drawn from a real library website. I encourage you to compare those examples with your own site as the article progresses.
Before addressing the addition of “structured data” to websites, there are a few prerequisites for improving website discoverability that you should first consider.
A “robot” is a computer program that crawls through websites to extract content and to process it for some purpose. Search engines like Google and Bing use robots to harvest content from sites so that they can serve up links to your pages, images, and library resources in search results; other robots might be interested only in product information so that they can tell you the cheapest place to buy something online. Even though they are computer programs, however, robots are easily confused.
For example, if a robot finds a search results page on your catalogue that lists thousands of results for the term “fish” based on a search result URL like http://catalogue.example.com/results?term=fish, it will dutifully crawl through all of the returned records. Each record may have a URL like http://catalogue.example.com/record/1?term=fish that maintains the original search term “fish”. If the robot subsequently finds a search results page for the term “fishing” based on the search result URL http://catalogue.example.com/results?term=fishing that, due to stemming, returns the same records as the term “fish”, it will dutifully crawl through all of the records again because the URL http://catalogue.example.com/record/1?term=fishing differs.
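One common remedy for this kind of duplication (a sketch offered here, not a technique prescribed by this article) is for each record details page to declare a canonical URL, so that robots can collapse the parameterized variants into a single page. Using the hypothetical catalogue URLs from the example above:

```html
<!-- In the <head> of both
     http://catalogue.example.com/record/1?term=fish and
     http://catalogue.example.com/record/1?term=fishing,
     declare the same parameter-free canonical URL: -->
<link rel="canonical" href="http://catalogue.example.com/record/1"/>
```

Search engines that honour rel="canonical" will then treat the two parameterized URLs as duplicates of the one canonical record page rather than as distinct documents to crawl and index separately.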
Humans can discern that the record details page for each result will display the same record, and that (if anything) the list of related search results might change on that page without affecting the substantive content of the page. Robots, in contrast, find it extremely difficult to predict what content these links will return without human intervention. Providing guidance for robots is one of the roles of a webmaster, and while most search engines offer administrators control through “webmaster tools” interfaces, it is cumbersome to repeat those steps for every search engine of interest.
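A site-wide alternative to per-engine webmaster tools is the robots.txt convention, which well-behaved robots consult before crawling. A minimal sketch for the hypothetical catalogue above, assuming its search results all live under a /results path, might look like:

```text
# Hypothetical robots.txt served at http://catalogue.example.com/robots.txt
# Keep robots out of the parameterized search-results listings entirely,
# while leaving the record details pages crawlable.
User-agent: *
Disallow: /results
```

Because robots.txt rules match URL path prefixes, this single rule covers /results?term=fish, /results?term=fishing, and every other search variant, without having to configure each search engine individually.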