In the early nineties, search engines offered submission forms that webmasters had to fill out to be included in the index. Most website owners would tag their websites and webpages with the required keyword information and then submit this to the search engines, after which a bot would crawl those sites so they could be indexed. However, this did not work out very well, since most submissions turned out to be spam. Around 2001, the submission form process was discarded, and earning links from other sites was given more prominence. If you go digging, you can still find samples of these ancient submission forms on some of the search engines.
A long time back, meta tags were deemed very important to the SEO process, and it was considered necessary to include your keywords in them so that when users typed in those keywords, your page would pop up in the results. However, this particular mechanism was spammed heavily and was eventually dropped by the major search engines. As a result, meta tags are no longer the sole focus of the crawler bots today.
Similarly, many people made the mistake of stuffing their pages with too many keywords, making the content appear as spam. Even though many tools use keywords as a measure of how relevant a website's content is, keyword stuffing is frowned upon. Natural links to your website signal the presence of quality content far better than a high keyword density does. So if you manage to earn a good editorial link from an authoritative site, you will be able to improve your SERP rankings in no time at all. Today's search algorithms are programmed to pick up on keyword stuffing quickly, so beware of bringing down your page ranking with a rookie mistake.