Monitor Website Status with Twitter


After spending many hours working on your website, you want to ensure it stays up and running. There are many tools available to monitor the status of your website and infrastructure, such as system logs for identifying bugs and tracking how much traffic you receive. There are also third-party services like StatusCake and Uptime Robot that frequently ping your servers, perform HTTP requests, and compare the results against predefined rules. These services are great for monitoring uptime, but they can't identify non-fatal bugs on your website, such as users being unable to edit profiles or create new posts.

One way to catch these problems is to monitor Twitter and other social media sites. When your website isn't functioning properly, people will start tweeting about the problems they are having with it. This won't work well for sites with fewer than tens of thousands of views a day, though your mileage may vary depending on your user base. When monitoring social media, cast a wide net: don't limit yourself to tweets that mention your website's Twitter account. Monitor for terms such as "outage", "down", "unavailable", "error", "access", and more. I recommend searching for your website's name and a few others to see which terms people commonly use when describing website problems and outages.
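Because matching on such broad terms will be noisy, it helps to check locally that a tweet actually mentions your site together with an outage term before treating it as a signal. A minimal sketch of such a check; the site name and keyword list are placeholders for your own:

```go
package main

import (
	"fmt"
	"strings"
)

// shouldAlert reports whether a tweet's text mentions the site name
// together with one of the outage keywords. Matching is case-insensitive.
func shouldAlert(text, site string, keywords []string) bool {
	lower := strings.ToLower(text)
	if !strings.Contains(lower, strings.ToLower(site)) {
		return false
	}
	for _, kw := range keywords {
		if strings.Contains(lower, strings.ToLower(kw)) {
			return true
		}
	}
	return false
}

func main() {
	keywords := []string{"down", "outage", "unavailable", "error"}
	fmt.Println(shouldAlert("Is example.com down for anyone else?", "example.com", keywords)) // true
	fmt.Println(shouldAlert("Loving the new example.com redesign!", "example.com", keywords)) // false
}
```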

Below is a very basic Go application that uses the Twitter Streaming API to monitor new tweets matching the terms above in real time. The Streaming API is a nice interface that gives you real-time access to tweets without having to constantly poll for updates. To use the code below, you will need to register a Twitter application and jot down the consumer key, consumer secret, access token, and access secret.

The program below requires you to pass the four keys and secrets as command line arguments. You also pass the keywords you want to track with the "--track" parameter, which can be used multiple times. The code doesn't do much right now other than print out tweets matching the words you are tracking; you should modify it to send a request to an alert system you have in place (or just an email).

$ ./twitter-monitor \
--consumer-key=<value> \
--consumer-secret=<value> \
--access-token=<value> \
--access-secret=<value> \
--track="website down" \
--track="website unavailable"

The small script below relies on the go-twitter library, which makes it easy to use the Twitter API from Go. The function onTweet is called whenever a new tweet arrives from the Twitter API. You can read the documentation for the structure of the Tweet type on godoc.

package main

import (
	"flag"
	"log"
	"os"
	"os/signal"
	"strings"
	"syscall"

	"github.com/dghubble/go-twitter/twitter"
	"github.com/dghubble/oauth1"
)

type trackedStrings []string

func (t *trackedStrings) String() string {
	return strings.Join(*t, ", ")
}

func (t *trackedStrings) Set(value string) error {
	*t = append(*t, value)
	return nil
}

func onTweet(tweet *twitter.Tweet) {
	log.Println(tweet.Text)
}

func runStream(ck string, cs string, at string, as string, track []string) {
	config := oauth1.NewConfig(ck, cs)
	token := oauth1.NewToken(at, as)
	httpClient := config.Client(oauth1.NoContext, token)

	// Create Demux
	demux := twitter.NewSwitchDemux()
	demux.Tweet = onTweet

	// Create Twitter Client
	client := twitter.NewClient(httpClient)
	params := &twitter.StreamFilterParams{
		Track:         track,
		StallWarnings: twitter.Bool(true),
	}

	stream, err := client.Streams.Filter(params)
	if err != nil {
		log.Fatal(err)
	}

	go demux.HandleChan(stream.Messages)

	// Signal
	// signal.Notify requires a buffered channel so a signal isn't dropped.
	ch := make(chan os.Signal, 1)
	signal.Notify(ch, syscall.SIGINT, syscall.SIGTERM)
	log.Println(<-ch)

	stream.Stop()
}

func main() {
	var consumerKey string
	var consumerSecret string
	var accessToken string
	var accessSecret string
	var track trackedStrings

	flag.StringVar(&consumerKey, "consumer-key", "", "Consumer Key")
	flag.StringVar(&consumerSecret, "consumer-secret", "", "Consumer Secret")
	flag.StringVar(&accessToken, "access-token", "", "Twitter Access Token")
	flag.StringVar(&accessSecret, "access-secret", "", "Access Secret")
	flag.Var(&track, "track", "Strings to track")
	flag.Parse()

	if consumerKey == "" || consumerSecret == "" || accessToken == "" || accessSecret == "" {
		log.Fatal("All tokens are required.")
	}

	if len(track) == 0 {
		log.Fatal("At least one `track` keyword is required.")
	}

	runStream(consumerKey, consumerSecret, accessToken, accessSecret, track)
}

The code has a few problems and isn't production ready by any means. The first main problem is that if any of the access tokens are incorrect, no error is reported and the program hangs until it receives a SIGINT. The other problem is that there isn't any error handling or fallback in the event the Twitter streaming API disconnects. I mostly threw this code together to practice the Go language and to see whether monitoring Twitter would be beneficial.

While writing this post, a few websites conveniently happened to be having server problems. I ran the above code with keywords based on their names to see what people were saying on Twitter. The program worked in the sense that I could see people tweeting that they were having trouble with a particular website. I could see the code being useful as an additional monitoring service, but not as a standalone solution; your site has to be very popular to get any useful feedback through Twitter.
