Ryan MacDonald from Liquid Web and Kevin Ohashi from Review Signal join Jonathan for a conversation looking at both sides of hosting benchmarks and testing.
- Kevin’s first benchmark and why benchmarks
- The paths leading to Review Signal
- Benchmarks appearing on the hosting radar
- The premise behind benchmarks
- Structuring the testing
- A host's reaction to the benchmarks
- The value of benchmarking internally for a host
- Giving feedback and having it taken seriously
- Ethics in hosting
- Guidance on choosing a host
Show Transcript
Jonathan: Welcome to another episode of Do the Woo. I’m your host Jonathan Wold. With me, I have two fantastic guests today. I have Ryan McDonald. Say hi, Ryan.
Ryan: Hey, folks.
Jonathan: And Kevin Ohashi. Welcome, Kevin.
Kevin: Hi. Thank you for having me, Jonathan.
Past days of CMS and starting in WordPress
Jonathan: Okay. We have a lot we can go over today, but we want to set up some background first. Kevin, would you mind just introducing yourself? Start with when WordPress first came onto your radar, and then tell us what you do.
Kevin: WordPress probably came on my radar sometime in the 2000s. I’m not exactly sure. I was in the building website world since about ’96. Is it bad to admit I think my first CMS definitely wasn’t WordPress?
Jonathan: No, of course, not. What was your first?
Kevin: That's a good question. I feel like there were a lot of open source scripts that you just downloaded and tried, and they probably gave you malware most of the time, but I do remember using Drupal quite a bit before WordPress. Also Joomla, and then a bunch of other lesser-known ones, but eventually I got into WordPress, and WordPress and Drupal were the main two.
Jonathan: So, you’ve been in this space since the early days of WordPress. You do a lot of different things, but in this particular context, could you explain where do you spend most of your time?
Kevin: That's a very good question. In this context, I spend most of my time working on Review Signal. Review Signal is web hosting reviews. There are two brands to it now. It started out as, the idea was, how do we create honest web hosting reviews? Because the review space is pretty bad in general, and the behavior that you would see is a lot of people would just ask around. Ask your friends, try and get real recommendations. And so, the idea of Review Signal was, let's try and listen to that organic way that people talk about other companies. So it monitored Twitter and looked at which hosting companies people are talking about, which ones they're complaining about, which ones they're recommending to their friends, and came up with an overall score of how well-liked they are.
Jonathan: We've learned a bit more since we've talked about where that started from. We're going to get to that a little bit later. We'll talk more about Review Signal and the context for it. Ryan, tell us, when did WordPress first come onto your radar?
Ryan: I think similar to Kevin, early 2000s. My first CMS, I think, was like PHP-Nuke.
Jonathan: Oh, yes. Yes.
Ryan: Then osCommerce, and I think everyone played with phpBB at some point. But yeah, it was that organic transition of just use what works. Late 2000s, WordPress started to come into my domain a little more, having worked with web hosts largely my whole career. By the early 2010-ish era, I had been working with some larger web hosts, and WordPress hosting was then taking off. That's where my path intersected with Kevin's over at ASmallOrange, and we started to do more WordPress-focused hosting. He came knocking on the door, both as a customer partner and with, "Hey, I want to do some performance benchmarks, how do we get going?" And then we started to click on that.
Kevin’s first benchmark and why benchmarks
Jonathan: Kevin, that first benchmark, how long ago was that, just roughly?
Kevin: I think it was 2013 is when I started working on it, and I have a published date of 2014 in March.
Jonathan: Okay. It’s been a long time. I’m curious, you met Ryan fairly early on right at the beginning of this process. What was your motivation? What was it that said, “Okay, I want to go do benchmarks?”
Kevin: It was a bet. Quite simply, Jeff King at GoDaddy, who I believe… I don't remember what his title was, but he was running the hosting group there at the time. He came to me and said, "Our WordPress hosting is as good as anybody there is, performance-wise." I just went, "Really? Are you willing to put, well, metaphorically, your money where your mouth is?" Because it was free. But he said, "Yeah, absolutely." I said, "Okay. Well, let me figure out how I would run a test like that." The benchmarks were born from this bet that GoDaddy was as good as anything out there. I rounded up eight companies, including them, that did this specialized WordPress hosting thing, ran some load tests, and published the results.
Jonathan: How did you pick the first eight companies?
Kevin: I think that early, there weren't that many that really focused on it, so it was basically the biggest players around, and there were just a few of them.
The paths leading to Review Signal
Jonathan: Now, Ryan, when Kevin reached out, I assume, was that the first time that you two had connected?
Ryan: I think in an official capacity of like, “Hey, let’s do something together,” yeah, for sure. Kevin had been with ASmallOrange, I think, for a little while then, doing the Twitter aggregation and helping to paint the picture of the differentiation between what we then called ASmallOrange being a boutique web host, where we were smaller but tried to focus on customer service and technology. Kevin really helped us paint that picture in terms of data on how we differentiated.
Jonathan: What I want to get into, Kevin, I’d like to touch on this differentiation he’s alluding to, but would you give us just a little bit of background? Before you get into this hosting space, what were you doing back in your university studies?
Kevin: I feel like being led on a little bit here.
Jonathan: Yeah, a little bit.
Kevin: Let's pretend that was the most organic transition. Before I started Review Signal, the actual technology that powers it was my master's thesis. I wrote a master's thesis about predicting box office movie sales using Twitter data. I collected probably 5 million tweets about an array of movies, then analyzed them and basically came up with a formula to predict how much money they would make based on what people were saying, when they were saying it, et cetera. That's what I wrote a thesis on. There's no money in predicting how much money someone else is going to make, at least there wasn't at that moment. Eventually I figured out that, having been in the developer and website hosting space forever, I could apply that same general idea to web hosting and reviews, because the context was similar, and I spent about two years building that.
Jonathan: How would you frame what you were trying to predict? In the case of hosting, you have the Twitter data, you have companies mentioned and whatever the sentiment is. It's sentiment analysis, basically, right?
Kevin: Yeah, it's sentiment analysis and, well, an immense amount of spam filtering, because Twitter's full of that. But the general idea is, I'm trying to see which companies people like and don't like. My hypothesis was, reviews are weird as a behavior. Most people don't sit down and write a review. Reviews are embedded in language. When we talk to each other, we mention brands, places, people, and in the way we talk about everybody, we are communicating reviews at a micro level at all times about everything we're talking about. So how can I listen to that? That was the key concept.
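As a rough illustration of the idea Kevin describes, here is a toy, lexicon-based sketch. The brand names, word lists, and sample tweets are all invented for the example; Review Signal's actual pipeline is far more sophisticated, especially around spam filtering.

```python
# Toy sentiment scoring over brand mentions, in the spirit of mining
# organic recommendations from tweets. Lexicon and messages are made up.
POSITIVE = {"love", "recommend", "great", "fast"}
NEGATIVE = {"hate", "down", "slow", "avoid"}

tweets = [
    "I love ExampleHost, great support",
    "ExampleHost is down again, avoid",
    "switched to OtherHost, so fast",
]

def score(text):
    # Net count of positive minus negative words in the message.
    words = set(text.lower().replace(",", "").split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

# Aggregate per brand: what share of mentions are positive?
brands = {"ExampleHost": [], "OtherHost": []}
for t in tweets:
    for brand, scores in brands.items():
        if brand.lower() in t.lower():
            scores.append(score(t))

ratings = {b: sum(1 for s in scores if s > 0) / len(scores)
           for b, scores in brands.items() if scores}
print(ratings)  # ExampleHost: 1 of 2 mentions positive; OtherHost: 1 of 1
```

The real problem, as Kevin notes, is the scale and the noise: the hard part is deciding which of 100,000 messages are genuine organic speech at all.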
Benchmarks appearing on the hosting radar
Jonathan: Ryan, how did you get first introduced to the concept of Kevin’s work? He was doing this, it sounds like, a bit before the benchmarks came around. When did it first come across your radar?
Ryan: Yeah, I think it came across my radar just organically. Back then, there weren't a lot of review sites. There was a cohort of review sites that were very clearly the paid reviews, and then there was a cohort of power-user-type reviews. Back then, as a small company, we all wore multiple hats across the organization. I had my own Google search filters to bring things up, and I think Kevin one day came up on my alerts along with a few others within the company, and it appeared Kevin was already a fairly happy customer. I don't know if you reached out to us or I reached out to you, but we crossed each other's paths, and it started to form a relationship there that led to benchmarks.
Kevin: ASmallOrange has a special place in the history of Review Signal. I don't know if you knew this, I cannot remember your CEO's name at that time off the top of my head right now, but I reached out to him before Review Signal actually went live and published. What happened was, after I'd been working on this for almost two years and looking at the data, I got to that moment where I was like, "Wow, this works, this data is great." Privately, I was testing it with people that I trust in the industry, saying, "Look at this data. Does this fit your gut, or is it wildly wrong?" You have to gut-check these things, because I was analyzing things on a scale nobody had done before and I couldn't manually review 100,000 different things. Well, not easily.
And so, one company was at the top and it was ASmallOrange. They were ahead of everybody. That was great, because here’s this small little boutique brand that has a great reputation, the customers are raving about it everywhere. Then my problem was they don’t have an affiliate program, so I’m going to send them all the customers, not make a dollar, and this business is never going to work.
I reached out to your CEO at the time and said, “Hey, this is what I’m doing, this is my data. You show up on the top. I want to promote you and make this work. But if you don’t have an affiliate program, I feel like my business is probably dead in the water from day one, because it’s never going to make any money if everyone goes with the top-rated company.” It probably took me a couple months to negotiate you guys creating an affiliate program, but he did it just for me after I convinced him and showed him what I was working on. If that had gone another way, I’m not sure we’d be here today. I would’ve probably just moved on to the next thing and been like, “Okay, there’s no money. Being honest doesn’t pay.”
Jonathan: Wow. Yeah, I remember, I was a customer of ASmallOrange back in the day and I don’t really remember much about the product experience. I just remember that positive association of both my own experience had been positive and that I was part of this collective of people who had positive association. So, it’s interesting. A lot of things fade. I’ve had lots of hosting over the years, and yeah, that holds a special spot. Ryan, anything to add from your perspective to that part of the story with Kevin?
Ryan: I mean, it makes me really proud to hear, I’ll be honest with that, ASmallOrange still holds a very special place for me as well. It was something I helped grow from being this little tiny incubator to a much larger brand that eventually got gobbled up. But no, Kevin remained the largest partner with ASmallOrange for a very long time and for a good reason.
Jonathan: Kevin, you’ve got this background, you find that it’s working, you’re able to check the Twitter data and these types of things are coming up and you build the relationship with ASmallOrange, which introduced the business viability for Review Signal to justify the time that you’ve invested into it.
Kevin: I’ll be honest, it was barely. I think I got $5 in a sale.
Jonathan: Barely justify the time. But at least it sets you on that course. Now, how long after did benchmarks come into the picture? So, you had the sentiment analysis first and then benchmarks.
Kevin: I think it was within about a year or two. Less than two years, I think the benchmarks came out. Yeah, because once Review Signal published, I started adding more companies to it. But then I think WordPress hosting started to evolve around then, too. And I got lucky.
The premise behind benchmarks
Jonathan: For those who don’t know or maybe have a fuzzy… Would you explain a benchmark to us? What’s the core premise behind it, and what are these in practice?
Kevin: Benchmarking is trying to figure out what a standard is, how we can expect something to behave when we do something to it, and then monitoring the difference when we try other things. In the context of web hosting, I'm monitoring two things. I want to see how consistent a company is and how well it can perform under stress. It's probably most known for the stress part, which is called load testing. That's when you send simulated users to a website to see, if my website gets on the front page of Reddit, is it going to just crash so nobody can visit it? Or if I want to do a big promotion on my Woo store and have a big sale, and 100 people all try and buy at the same time and it goes down and I miss everything, that's a problem.
The idea of these benchmarks is, let’s set up that store and then send 100 users to the checkout and see how fast does it respond? Does it stay consistent in its performance or start to degrade? When does it degrade? How much does it degrade? That’s the general idea of the benchmarks I do. I want to make sure they can handle load.
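A minimal sketch of that load-testing idea, under heavy assumptions: it stands up a throwaway local HTTP server as the "store" and hits it with 100 concurrent simulated users, then reports average and 95th-percentile response times. This is illustrative only, not the tooling Review Signal actually uses.

```python
import threading
import time
from http.server import ThreadingHTTPServer, BaseHTTPRequestHandler
from urllib.request import urlopen

# Stand-in for a real store page: sleeps briefly to mimic server-side work.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(0.01)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # silence per-request logging
        pass

class Server(ThreadingHTTPServer):
    request_queue_size = 128  # tolerate a burst of simultaneous connects

server = Server(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

# 100 simulated users hit the page at once; record each response time.
timings = []
lock = threading.Lock()

def visit():
    start = time.monotonic()
    urlopen(url).read()
    with lock:
        timings.append(time.monotonic() - start)

users = [threading.Thread(target=visit) for _ in range(100)]
for u in users:
    u.start()
for u in users:
    u.join()

timings.sort()
avg = sum(timings) / len(timings)
p95 = timings[int(0.95 * len(timings)) - 1]  # 95th-percentile latency
print(f"avg={avg * 1000:.1f}ms p95={p95 * 1000:.1f}ms")
server.shutdown()
```

Real benchmarks add ramp-up, mixed page types, cart and checkout flows, and geographically distributed load generators; this only shows the shape of the measurement, including the average-versus-tail-latency distinction that reveals when a host starts to degrade.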
Jonathan: One of the things that initially perplexed me and then eventually just began to annoy me is… Yeah, I think it's gotten a bit better, but a lot of hosts will go and say, "We are the fastest." I'm like, "Well, all 10 of you can't be the fastest. What is this actually? What are we talking about here?" Ryan, I'm curious for your perspective. You were at ASmallOrange at the time this concept of the benchmarks came around. Now, what role did you hold at ASmallOrange? When Kevin started coming to you to talk about benchmarks, what were you focused on at the time?
Ryan: Yeah. As the director of technology at ASmallOrange, I ran all our platform teams. We built it, ran our ops teams, our monitoring teams, much similar to the role I’m in today.
Jonathan: Had you already been thinking about the concept of benchmarks yourself? Or how did you relate when it was presented?
Ryan: Back then, performance was very different than it is today, where there was a lot of special sauce to it, so to speak. It wasn't well-defined how to do good WordPress hosting. There weren't really off-the-shelf solutions out there. There was no CloudFlare. CDNs were largely in their infancy. So, it really depended on what you did server-side, and it was a whole other discipline in and of itself to benchmark and measure what you create. It's one thing to be a platform developer and build something that customers could come to as part of a holistic product. It's another thing to be able to throw, back then what we would call the Slashdot experience, of just overwhelming that platform with very good simulated traffic. There have always been really good synthetic benchmarks, but not really good real-world benchmarks. That's what Kevin started to bring to the table very early on: really good real-world simulated benchmarks.
Jonathan: Kevin, you get together the first eight and you perform these benchmarks. Now, the stress testing seems fairly straightforward. I’m imagining the technology and approaches have changed over the years. How did you manage that? Yeah.
Kevin: Yeah. I'm not sure if I was the first person, but I seem to have been the first person in the space, at least, who got some attention for doing benchmarks. The hardest thing for me then, and still today, is methodology. Creating a good test is hard, and I see a lot of people doing what they call benchmarks, but as far as I can tell, almost nobody spends enough time sitting down and thinking through what makes a good test. Coming up with how we measure success versus failure, you have to do all this beforehand, you have to set it up. It's science. You're using a scientific approach to test and see how these companies perform.
And so, early on it was very difficult to find a good methodology, and it's changed over the years. That first year we had… Some of the tools are still the same: WebPageTest, the idea of let's test the performance of a host from all these different points around the world, uptime monitors. I used UptimeRobot and, I think, one of those scripts you download off of somewhere, run yourself on a server, and hope it works. I'm looking at the old post, and it says the script I used was called Uptime.
Thanks to our Pod Friends FooSales and Weglot
Structuring the testing
Jonathan: Kevin, are you ready for a terrible question?
Kevin: Always.
Jonathan: Okay. How do you feel about fairness, and why was it important to you to structure the test so that ASmallOrange would win?
Kevin: Did they win? I don’t even know how they did early on. I’d have to look.
Jonathan: Do you remember, Ryan? Before he answered that question, do you remember how ASmallOrange did?
Ryan: I don't think we did the best that first year. I feel like it was an organic progression of Kevin helping us get to the best and hold that spot for a few years running.
Jonathan: We'll talk about that, but yeah, Kevin, I want to hear your thoughts on fairness, that's the real question.
Kevin: I actually am reading the quick summary of how they did, and apparently, it didn't go well on their first normal VPS.
Jonathan: This is interesting, because you have the positive, you have the customer, and now the sentiment, where the customer's like, "This is fantastic." Then the benchmarks. Did you expect them to do well, going into this?
Kevin: I don’t think I had any expectations going into this. It was the wild west.
Jonathan: Yeah, this is the first time.
Kevin: One thing I like to stress about the benchmarks is benchmarks are testing one very specific thing, performance. As a whole, it doesn’t test customer support, it doesn’t test any other aspect, except what Ryan called the Slashdot effect. Can this service handle traffic? I also test, do they stay online?
Jonathan: This is what's interesting, because especially within the context of WooCommerce, this becomes more important than ever, right? Because "do they stay online" has very real-world ramifications. If your store shows up on TV or something happens, probably less TV these days, but whatever, and it gets a bunch of traffic, and it goes offline, there's a very real-world impact to that: you lose sales, right? Performance, even though it's just one thing, is a significant thing for many folks when it comes to web hosting, especially within the WooCommerce realm.
Now, what I’m interested in unpacking a bit here, Kevin, is the tension, because here you identify ASmallOrange as the sentiment winner, you put all the work into the affiliate relationship for that whopping five bucks a sale, but then it doesn’t show up well on the benchmarks, at least initially. How do you reconcile that in your head? You’re trying to build a business here and you have, presumably, competing pieces of information for a topic that’s quite relevant, which is performance. How did you reconcile that?
Kevin: I mean, personally it doesn't bother me. I say, let the chips fall where they may. My attitude towards it has always been, let's be transparent about everything. Even if you don't do… Well, it started with a curiosity, and it's become a very big deal for probably a very small group of people that really care about performance these days. But when it started, I thought it was just going to be a one-off blog post that faded into obscurity after hopefully getting a whole bunch of traffic, like a Slashdot effect, and not crashing my server.
Jonathan: Who were you hosted with at the time?
Kevin: Who was I hosted with? I think I was running my own on DigitalOcean. I mean, Review Signal didn’t run WordPress originally. It was just all custom-built, so I was hosting myself.
A host's reaction to the benchmarks
Jonathan: Ryan, I'm curious. You get the sentiment analysis piece, the benchmarks are done, and ASmallOrange did however it did. How did you react to that? Because as I'm hearing it, that began what became a long-term relationship. How did you react to the benchmarks?
Ryan: Yeah. I think from a high-level standpoint, something to be mindful of is that Kevin's not just doing benchmarks. He's going through your whole product onboarding experience. During that, he is very obviously technical in his approach to how he uses the product, but it also lends itself to calling out pain points that are missed by everyone that's in the weeds day-to-day within the brand. Very early on, I remember that first test, it was like, "Hey, this email, wrong links, wrong product. It wasn't obvious for X, Y, Z reasons." I think what very clearly came to the surface early on was that external, unbiased eyes on our product were invaluable. Then everything from there organically led into the benchmarks, of him poking support and asking very pointed questions and not necessarily getting the best responses. Even though at ASO we loved our support and were immensely proud of it, Kevin found ways to break that mold that just broke support and broke the product and broke the UX and broke the platform.
By the time he had actually gotten to running the benchmarks, I think we were like, "Kevin, please just keep going. Find the breakage. Find the things that we're not seeing." It was just this really awesome experience that Kevin has kept delivering all these years. He has this ability to not just break things down scientifically for the sake of the benchmarks or the sentiment analysis, but to do the end-to-end product review. We didn't take any of it personally. It was not a knock to our ego or anything internally at ASmallOrange. It was, "Wow, this is valuable feedback." We just took it, internalized it, and it was like, "Great, how do we get to run these tests again?"
Jonathan: Kevin, why did you decide to do it again?
Kevin: Well, fortunately, I did get that Slashdot effect from it being published. It was, I think, like 10,000 people on the first day who all read this, and that was, I think, more than when Review Signal launched and got featured on TechCrunch. Immediately, it was like, "Wow, there's this demand." As soon as it went viral like that, hosting companies I had never heard of were coming out of the weeds like, "Oh, we want to be in that. Can you include us? How do we do this?" Almost right away, I had to run another benchmark. I think it was six months later the next one was published. It was literally just immediate turnaround of, "We need this again." Yeah. That one published in March, and I published another one in November 2014.
Jonathan: At this point, do you recall how many were included in that one? So, you had eight in the first.
Kevin: In that one I had 16 and two companies dropped out.
Jonathan: Without disclosing, we don’t need to know who they were, but what was that dropout experience?
Kevin: It's all published. Literally, I'm just looking at the old blog post there. There's no secret. It was DigitalOcean and Pressable. DigitalOcean never opted in. Since they were so popular and I was running on them and they had a WordPress install, I just spun it up, but it didn't really fit; it wasn't managed WordPress. And so, it didn't do well, but I'm not here to make someone look bad that's not even competing in the space, so I just dropped them. I honestly can't remember why Pressable dropped out.
Jonathan: Was it reciprocated at the time?
Kevin: It may have been in that transition period. My memory isn't that good. That could be why.
The value of benchmarking internally for a host
Jonathan: Ryan, you're at ASmallOrange at the time, and so benchmark one comes around. I assume that you guys participated in the second one. As you alluded to earlier in the conversation, at some point it sounds like you guys were eventually performing at the top. What more do you think is worth highlighting in terms of the internal value of this objective outside perspective?
Ryan: Yeah. We broke it into two categories. There was the, "Hey, this is really good feedback from a product and go-to-market standpoint," and, "here are all the technical ways Kevin broke our platform." When we separated those, we sent the team off, and they worked on refining really simple things: welcome emails, points of clarity, calls to action on the website and in our documentation, just things to make the general onboarding experience better. Then on the technical side, we just looked at the results, which Kevin publishes in very, very detailed form, and we broke it down from there of, "Hey, where in our platform do we need to focus and invest? Is it hardware? Is it technology, configuration?" And we just started refining it.
Giving feedback and having it taken seriously
Jonathan: Kevin, on your side of things, you're having this experience with Ryan where… I'm imagining, at least from my perspective, it's helpful to give feedback and know that people are taking it seriously. Was this the experience that you were having across the board? Were other hosts responding to the same degree? What was that experience like for you?
Kevin: It varied across the board quite a lot. Some companies, I think, just participated in it: "Oh, we want to be listed. Here's an account, just do your thing. We just want the attention." Other companies, I think, were a little bit more serious and invested in it. Ryan, actually, got a special call-out in the article itself in the second round because they did better. I think we must have talked quite a bit about what went wrong the first time, and I think they went back to the drawing board, fixed a lot of those holes that showed up in the first benchmark, and improved quite a lot. They got an award on that second one. I'm not sure if I did awards the first year. It was a "here are the results, these are interesting." But the second time, I actually came up with an award system.
Jonathan: It's interesting because, based on what you're saying, Ryan, there's a lot of value for the hosts themselves in this objective third party coming in. And as you've alluded to, Kevin, this is not trivial work: designing a test that is fair, that's thorough, that's repeatable. There's a lot to it, I think. Kevin, what do you personally find motivating about all this? You can tell me otherwise, but I'm guessing Review Signal's not a cash cow.
Kevin: I wish. It is not.
Jonathan: It’s not today. You’ve been doing this for a long time. What do you personally find motivating about this work?
Kevin: I mean, it started with frustration. I thought about web hosting because I've been in that world, or connected to that world, almost since the very beginning. If you were going to create websites, you had to be connected somehow to web hosting. I was even a moderator at Web Hosting Talk, which was the largest forum; I was mainly focused on the domain name space. That was one of the original areas I was in. But I realized 10, 15 years later, when somebody asked me, "Oh, where should I host something?" there was nowhere for me to send them to get good data. To me, that was just a problem. How can there be no good information about something so critical to modern e-commerce and the web in general? It's crazy that it relies entirely on these backdoor word-of-mouth recommendations and that's it.
And so, it was frustration that brought it about. And keeping it going, that goes up and down. I think the benchmarks have taken on a life of their own, separate from Review Signal. To respect that, last year I spun them out to their own site, WPHostingBenchmarks.com, because they were just a blog post before, and that really felt limiting in terms of the way I could display and present data, and to acknowledge the fact that they're their own thing at this point, really. It's under the Review Signal brand of honest web hosting reviews data, but they're their own thing, and they're bigger than the rest of the site already.
Ethics in hosting
Jonathan: Kevin, true or false? The hosting industry is highly ethical.
Kevin: I can't imagine a scenario where anyone would accept that premise as true.
Jonathan: You and I met, Kevin, when I was a co-organizer of an event called Host Camp a couple of years back, and I got introduced to you by one of our co-organizers. I didn't know about your work. I think I'd seen Review Signal around, but I didn't know all that much about it. Your opening talk, the thing that we worked on together, was on ethics in hosting. Is it fair to say it's something you care about?
Kevin: I mean, absolutely. Part of that frustration, creating Review Signal and running it for the last, is it more than 10 years now? It's a scary thought. It comes from this desire for something better. I do web hosting reviews, but I also use Review Signal as a platform to push what I think is right, and people may or may not disagree with me, but I certainly have a lens that I'm viewing everything through, and I try and push for what I think is better for consumers, fairer, and trying to protect them. I even get sidetracked. I think I spent probably half a year, not too long ago, digging into some of the domain stuff with .org, where a private equity company tried to acquire the .org registry and essentially give themselves license to tax nonprofits. That really isn't directly connected to hosting, anyway, but I'm using Review Signal as the platform: I want to try and fight for a better, fairer, more ethical internet, and I'm not afraid to co-mingle these things.
Jonathan: Yeah. Ryan, you were at ASmallOrange at the time, you've been in the hosting industry for a long time now, and you've seen a lot of things. What's your own perspective, having been on the other side at a hosting company, on this question of ethics, or the value in having… Is there a need for accountability? How do you feel about this topic, broadly?
Ryan: As a web host, obviously, we invest substantially in our support departments and our customer service experience. What Kevin was putting forward was helping us put outside-in data to that. Back then, NPS was around, but it wasn't widely adopted in the hosting industry. We were just going off, "Hey, we're being successful, we're getting good word of mouth, our cancellation feedback looks okay. What other metrics could we use?" There were a lot of paid reviews out there at the time, be it influencers on YouTube or power affiliates doing their own pseudo-reviews. What Kevin really helped us do was validate, "Hey, our investment in support is paying off," and gave us fuel to keep that investment going into support and customer experience.
Jonathan: I'm personally quite interested in how this evolves. I'd love to see more people participating, and it'll be interesting to see. There's a broad category of work here. There are the benchmarks, and, I imagine, Kevin, from our work with Host Camp together, what you began working on: "Okay, what would it look like to have ethical standards?" People can do whatever they want, but for end users, I think there's a lot of value in saying, "Okay, these are standards, and hosts can choose or not to adhere to those standards, and as a community, we can hold them accountable for it."
Guidance on choosing a host
There's a lot there. As we're wrapping up though, I'd like to bring this around to WooCommerce for a moment, because as we alluded to earlier, with e-commerce, and with WooCommerce specifically, of all that you could do on WordPress, WooCommerce asks a lot more of WordPress, and thus of the hosting. For folks who are trying to make good decisions about where to put their own stores, or where to guide their own customers and clients to make better choices about hosting, Kevin, do you have any guidance, broad or specific, on how to make better choices about who to work with?
Kevin: Sure. My general guidance is… Because people reach out and ask, “What host should I use?” I like to think through it as a process. First, you need to figure out which hosts meet your specifications. Does my website need to be in a certain geography for reasons like my customer is there or legal reasons, like I can’t have data hosted in this country? So, you filter on the functional stuff first. Then you have to filter on… Budget tends to be the second thing. I have X dollars per month that I can spend on this, so if a host doesn’t meet that, they’re already disqualified, and that’s the starting point.
Then you actually get into the research of it: let me look at the reviews, look at the performance. But the test I like most, because people have a lot of questions, is test the support. Reach out, ask some questions. All the questions you asked me, you should be asking each of these hosts and seeing how they respond and engage with you. The general advice I have is, if you're not happy with the way the sales support has treated you, you're probably not going to be any happier when you're a paying customer and get the same level or worse. If it doesn't really impress you from the start, I would be pretty hesitant to build a relationship off of that.
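Kevin's selection process is mechanical enough to sketch in code. The hosts, prices, and ratings below are entirely hypothetical; the point is the ordering of the filters, with functional requirements first, budget second, and research last.

```python
# Hypothetical candidate list; names, prices, regions, ratings invented.
hosts = [
    {"name": "HostA", "regions": {"us", "eu"}, "price": 30, "rating": 4.2},
    {"name": "HostB", "regions": {"us"},       "price": 80, "rating": 4.8},
    {"name": "HostC", "regions": {"eu"},       "price": 25, "rating": 3.1},
]

required_region = "eu"  # e.g. data must stay in the EU for legal reasons
budget = 50             # dollars per month

# Step 1: functional filter (geography/legal). Step 2: budget filter.
candidates = [h for h in hosts
              if required_region in h["regions"] and h["price"] <= budget]

# Step 3: only now rank the survivors by research (reviews, benchmarks).
candidates.sort(key=lambda h: h["rating"], reverse=True)
print([h["name"] for h in candidates])
```

Support quality, Kevin's favorite test, is the one step that doesn't reduce to a filter; that part stays manual, asking each surviving host your real questions and judging the responses.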
Jonathan: Yeah. Awesome, thank you. Ryan, Nexcess is a pod friend to do the Woo. You guys have been doing cool stuff in the space for a bit. Even with your Nexcess hat aside, you’ve been in the space for a while, what guidance would you have to offer? Anything to add to what Kevin said? Any different ways that you think about it?
Ryan: No. I mean, Kevin's pretty spot-on. How you filter is going to be a fairly business-focused or individualistic exercise, and all roads lead to support, so test that support, because it's really what's going to make or break your experience. Every web host is going to have downtime, every web host is going to have issues, and support, and how they walk you through those pains, is what keeps you as a customer for the long term.
Jonathan: Excellent. Ryan, if people are interested in connecting with you, where can they find you on the web?
Ryan: They could find me @rfxn on Twitter, or check us out on the Nexcess.net blog.
Jonathan: Kevin, how about you? If someone wants to follow your work and connect with you, what’s the best place for them to do that?
Kevin: You can reach me at reviewsignal.com, or on Twitter under @ReviewSignal or @kevinohashi, or just Google my name and I'll probably pop up in 50 different places. I'm the only person at Review Signal, so if you reach out on any platform, it's me reading it.
Jonathan: Excellent. Only person today, right Kevin?
Kevin: I don’t think that’s going to change anytime soon, if I’m being honest.
Jonathan: Well, gentlemen, thank you both for your time and for the dedication and effort that you’ve put into this space for all this time. We appreciate it. I hope to see Review Signal continue to grow and folks like yourself, Ryan, contributing and putting what you’re getting into practice. Thank you both. That wraps up this episode of Do the Woo. See you guys next time.
Kevin: Thank you.
Ryan: Thanks for having us, Jonathan.