There are programs that read through the Internet, following links from one website to another. These programs go by several names: I like "spiders" because they crawl through the World Wide Web, but they are more commonly called robots or bots.
Some of these robots are operated by search engines, which crawl websites so that their pages can be indexed and found in search results. Others look for websites to exploit, automatically filling out forms to post spam or plant promotional links.
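At its core, a spider repeats one simple step: fetch a page, find the links in it, and follow them. The following is a minimal sketch in Python of the link-finding half of that loop, using only the standard library's `html.parser`; the `LinkExtractor` class name and the hardcoded page are illustrative, and a real spider would fetch the HTML over HTTP instead.

```python
from html.parser import HTMLParser

# A minimal link extractor: the heart of a spider is finding the
# href attributes of <a> tags in each page it has fetched.
class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record the destination of every anchor tag we encounter.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# In a real spider this HTML would come from an HTTP request;
# here a hardcoded page stands in for the fetched content.
page = ('<html><body>'
        '<a href="https://example.com/a">A</a>'
        '<a href="https://example.com/b">B</a>'
        '</body></html>')

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # the URLs the spider would visit next
```

A full crawler would put each extracted URL into a queue, fetch it in turn, and repeat, keeping a set of already-visited addresses so it does not loop forever.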