Screen scraping with Colly in Go
Programming Snapshot – Colly
The Colly scraper helps developers who work with the Go programming language to collect data off the web. Mike Schilli illustrates the capabilities of this powerful tool with a few practical examples.
As long as there are websites for the masses of browser users on the web to view, there will also be individuals on the consumer side who want the data in a different format and who write scraper scripts to extract it automatically to fit their needs.
Many sites do not like the idea of users scraping their data. Check the website's terms of service for more information, and be aware of the copyright laws for your jurisdiction. In general, as long as the scrapers do not republish or commercially exploit the data, or bombard the website too overtly with their requests, nobody is likely to get too upset about it.
Different languages offer different tools for this. Perl aficionados will probably appreciate the qualities of WWW::Mechanize as a scraping tool, while Python fans might prefer the selenium package [1]. In Go, there are several projects dedicated to scraping that attempt to woo developers.
One of the newer ones is Colly (possibly from "collect"). As usual in Go, it can be easily compiled and installed directly from its GitHub repository like this:
go get github.com/gocolly/colly
After installation, a program like Listing 1 [2], for example, can access the website of the CNN news channel, dig through the links hidden in its HTML, and display them for testing purposes.
Listing 1
linkfind.go
01 package main
02
03 import (
04   "fmt"
05   "github.com/gocolly/colly"
06 )
07
08 func main() {
09   c := colly.NewCollector()
10
11   c.OnHTML("a[href]",
12     func(e *colly.HTMLElement) {
13       fmt.Println(e.Attr("href"))
14     })
15
16   c.Visit("https://cnn.com")
17 }
Goodbye, Dependency Hell
As is customary in Go, go build linkfind.go creates a binary named linkfind, which weighs in at a hefty 14MB but already contains all the dependent libraries and runs standalone on machines of a similar architecture without further ado. What a concept! Go shifts today's typical "Dependency Hell" from run time to compile time, recruiting developers to do the heavy lifting instead of the end user, which is probably one of the greatest ideas of recent times.
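Incidentally, the same portability can be pushed one step further: the standard Go toolchain also cross-compiles the scraper for other platforms simply by setting two environment variables. A quick illustration (the target platform here is just an arbitrary example):

  GOOS=linux GOARCH=arm64 go build -o linkfind-arm64 linkfind.go

The resulting binary is again fully self-contained and can be copied to a matching machine as is.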
Listing 1 uses NewCollector() to initialize a new colly structure in line 9; its Visit() function later connects to the CNN website's URL and kicks off the OnHTML() callback as soon as the page's HTML has arrived. The "a[href]" selector passed as the first argument ensures that the subsequent func() code is only called for links in the format <A HREF=...>. Colly arranges for each call to receive a pointer to a structure of the colly.HTMLElement type, containing all relevant data of the matching HTML element, from which line 13 extracts the link URL as a string via e.Attr("href").
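As a hedged side note of my own (not part of Listing 1): the colly.HTMLElement passed to the callback also exposes the link's visible text in its Text field, and its Request field offers an AbsoluteURL() method that resolves relative paths such as /world against the page currently being visited. Replacing lines 11 to 14 of Listing 1 with the following variant prints both:

  c.OnHTML("a[href]",
    func(e *colly.HTMLElement) {
      // Print the link text next to the fully resolved target URL
      fmt.Printf("%s -> %s\n",
        e.Text, e.Request.AbsoluteURL(e.Attr("href")))
    })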
Moving on, how difficult would it be to determine which links branch to external websites and which reference the site internally? Listing 2 is based on the same basic structure, but it also defines a counter structure, Stats, and no longer hardwires the URL to be examined in the code, instead accepting it as a parameter on the command line.
Listing 2
linkstats.go
01 package main
02
03 import (
04   "fmt"
05   "github.com/gocolly/colly"
06   "net/url"
07   "os"
08 )
09
10 type Stats struct {
11   external int
12   internal int
13 }
14
15 func main() {
16   c := colly.NewCollector()
17   baseURL := os.Args[1]
18
19   stats := Stats{}
20
21   c.OnHTML("a[href]",
22     func(e *colly.HTMLElement) {
23       link := e.Attr("href")
24       if linkIsExternal(link, baseURL) {
25         stats.external++
26       } else {
27         stats.internal++
28       }
29     })
30
31   c.Visit(baseURL)
32
33   fmt.Printf("%s has %d internal "+
34     "and %d external links.\n", baseURL,
35     stats.internal, stats.external)
36 }
37
38 func linkIsExternal(link string,
39   base string) bool {
40   u, err := url.Parse(link)
41   if err != nil {
42     panic(err)
43   }
44   ubase, _ := url.Parse(base)
45
46   if u.Scheme == "" ||
47     ubase.Host == u.Host {
48     return false
49   }
50   return true
51 }
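One small caveat, added here as a suggestion rather than taken from the listing: line 17 reads os.Args[1] without checking whether a URL was actually supplied, so calling the binary without an argument makes the Go runtime panic with an index-out-of-range error. A minimal guard, inserted before line 17, avoids that:

  if len(os.Args) < 2 {
    fmt.Fprintln(os.Stderr, "usage: linkstats <url>")
    os.Exit(1)
  }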
To distinguish external links from internal ones, the linkIsExternal() function uses Parse() from the net/url package to split both the link URL and the base URL passed into the function into their component parts. It then inspects the parsed link's Scheme field: If the typical http(s):// protocol prefix is missing, or if the Host fields of both URLs are identical, the link points to the original site and is therefore internal.
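To see what url.Parse() actually returns in the two cases, the following standalone sketch (with made-up example links, not part of Listing 2) prints the relevant fields:

  package main

  import (
    "fmt"
    "net/url"
  )

  func main() {
    // A relative, site-internal link: Scheme and Host both stay empty
    u, _ := url.Parse("/politics/index.html")
    fmt.Printf("scheme=%q host=%q\n", u.Scheme, u.Host)

    // An absolute link: Scheme and Host are filled in, and the host
    // can be compared against the base URL's host
    u, _ = url.Parse("https://www.example.com/world")
    fmt.Printf("scheme=%q host=%q\n", u.Scheme, u.Host)
  }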
Structured by Default
Line 19 initializes an instance of the Stats structure defined previously in line 10, and – as usual in Go – all members are assigned default values; in the case of the two integers, each starts out at 0. This means that lines 25 and 27 only need to increment the respective counter by 1 for each link examined; at the end of the program, line 33 can then output the number of internal and external links. For the Linux Magazine home page, this results in:
$ ./linkstats https://www.linux-magazine.com
https://www.linux-magazine.com has 64 internal and 12 external links.
It's a pretty complex website! But collecting link stats does not exhaust Colly's usefulness. Colly's documentation [3] is still somewhat sparse, but the examples published there might give creative minds some ideas for future projects.
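Go's zero-value rule can be seen in isolation in the following small sketch (my own illustration, independent of the listings): a struct created without explicit field values is immediately usable, which is why Listing 2 never initializes its counters by hand.

  package main

  import "fmt"

  type Stats struct {
    external int
    internal int
  }

  func main() {
    stats := Stats{}                            // all fields start at their zero value
    fmt.Println(stats.internal, stats.external) // prints: 0 0
    stats.internal++                            // safe to increment right away
    fmt.Println(stats.internal)                 // prints: 1
  }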
Let's Go Surfing
For example, I frequently visit the website surfline.com, which shows the wave height at selected beaches around the world. Since I love surfing (at a casual level, mind you), I've always wanted a command-line tool that quickly checks the site for my local beach (Ocean Beach in San Francisco) and tells me whether there are any monster waves preventing me from paddling out, because – as a hobby surfer – anything more than eight feet has me shaking in my wetsuit. Figure 1 shows the relevant information as it's displayed on the web page; Figure 2 illustrates where the data is hidden in the page's HTML according to Chrome's DevTools. It is now the scraper's task to shimmy through the tags and tease out the numerical values indicating today's surf size.
Squinting at the HTML in Figure 2, the wave height is indicated in a span tag of the quiver-surf-height class. However, this tag occurs several times in the document, because the page also displays the conditions at other neighboring surf spots. The trick now is to find a path from the document root to the data that is unique and therefore only matches the main spot's data. As Figure 2 shows, this path winds through an element of the sl-spot-forecast-summary class.
Programming this is an easy job; you just pass the two class names, separated by a space, as the selector in the first argument of the OnHTML() call in line 12 of Listing 3. The query processor digs down along this path into the document to find just one, and thus the correct, wave height for the current choice of spot.
Listing 3
surfline.go
01 package main
02
03 import (
04   "fmt"
05   "github.com/gocolly/colly"
06   "github.com/PuerkitoBio/goquery"
07 )
08
09 func main() {
10   c := colly.NewCollector()
11
12   c.OnHTML(".sl-spot-forecast-summary "+
13     ".quiver-surf-height",
14     func(e *colly.HTMLElement) {
15       e.DOM.Contents().Slice(0, 1).Each(
16         func(_ int, s *goquery.Selection) {
17           fmt.Printf("%s\n", s.Text())
18         })
19     })
20
21   c.Visit("https://www.surfline.com/" +
22     "surf-report/ocean-beach-overview/" +
23     "5842041f4e65fad6a77087f8")
24 }
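The goquery chain in lines 15 to 18 deserves a word of explanation (my reading, not from Colly's documentation): e.DOM exposes the matched element as a goquery selection, Contents() returns its child nodes including bare text nodes, Slice(0, 1) keeps only the first of them (the text node carrying the height value), and Each() prints it. Assuming the wave height really is that first child node, lines 12 to 19 could be written slightly more compactly as shown below; note that dropping the goquery callback also means removing the now unused goquery import in line 06, because Go refuses to compile unused imports.

  c.OnHTML(".sl-spot-forecast-summary "+
    ".quiver-surf-height",
    func(e *colly.HTMLElement) {
      // First() narrows the child nodes down to the leading text
      // node, which holds the numerical surf height
      fmt.Println(e.DOM.Contents().First().Text())
    })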