A quick and dirty URL benchmarking script in Ruby
While trying to get to the bottom of a misbehaving server, we needed to compare the time taken to serve URLs on different domains. So I knocked together a quick script that requests each URL a number of times and reports the average. Posting it here, and also as a gist, in case it saves someone some time.
#!/usr/bin/env ruby
# Used to perform multiple requests of URLs for benchmarking
#
# @example
#   # Requests the url 3 times, outputting the time for each
#   ./benchmark_url 3 http://my.url.com
#   ruby benchmark_url 3 http://my.url.com
#
#   # Requests each url 5 times, outputting the time for each
#   ./benchmark_url 5 http://my.url.com http://my.url2.com
#   ruby benchmark_url 5 http://my.url.com http://my.url2.com
require 'benchmark'
require 'net/http'
require 'uri'

numtimes = ARGV.shift.to_i
urls     = ARGV

urls.each do |u|
  total = 0
  puts u
  puts '=' * u.length

  (1..numtimes).each do |n|
    time = Benchmark.realtime { Net::HTTP.get_response(URI.parse(u)) }
    puts "#{u} - #{n} : #{time}s"
    total += time
  end

  puts '=' * u.length
  puts "Average for #{u} : #{(total / numtimes).round(4)}s"
  puts "\n"
end
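If a single slow outlier is skewing the average, it can help to see the fastest and slowest request as well. Below is a minimal variation on the script above (a sketch, not the original; it assumes Ruby 2.4+ for Array#sum) that prints min, max and average per URL:

#!/usr/bin/env ruby
# Sketch: like benchmark_url, but also reports the fastest and slowest
# request per URL so a single outlier is easy to spot.
require 'benchmark'
require 'net/http'
require 'uri'

numtimes = ARGV.shift.to_i

ARGV.each do |u|
  # Time each request and collect the results
  times = (1..numtimes).map do |n|
    t = Benchmark.realtime { Net::HTTP.get_response(URI.parse(u)) }
    puts "#{u} - #{n} : #{t.round(4)}s"
    t
  end

  puts "#{u} : min #{times.min.round(4)}s / max #{times.max.round(4)}s / avg #{(times.sum / times.size).round(4)}s"
end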
Written by Adam Phillips