Technology


Devon McDonald of OpenView Partners recently wrote a blog post on Scrum Agile Marketing in which she discussed Minimum Viable Marketing. It got me thinking about my clients’ prospects and customers, and it led me to the following theory, which I am now testing using a more agile approach:

  1. Prospects don’t want to be sold or marketed to, but most want to be educated.
  2. They’re not looking to get a degree. They just want an answer to a question.
  3. They don’t want to plow through long documents, so the answer has to be easily found.
  4. If they have another question, they want that answered, too.
  5. They are constantly dealing with others’ objections, so new-found knowledge has to be easily shared.

That led me to create a series of one-minute videos, each designed to answer one question or cover one topic very succinctly. I’ve chosen information technology in remote and branch offices as my topic. The series is called Branch Office Tech Tips. I’ve posted eight so far, but expect new content frequently, and don’t be surprised if some content is replaced. This is Agile.

As part of this experiment, I’ve hosted the videos on Wistia, because Wistia gives great insight into how many people watch, for how long, and when they stop watching. But seriously, if I can’t keep someone engaged for 60 seconds, then I need to go back to the content for a do-over.

I’ve added another page on this blog, specifically dedicated to Branch Office Tech Tips (BOTT). I’m also going to start making better use of IFTTT. Blog updates will automatically be posted to LinkedIn, Twitter, and other platforms. At least that’s my plan. We’ll see what works.

I do remember enough of my college statistics courses to know that using Twitter to evaluate trends and attitudes introduces significant sampling and reporting bias.   Still, ignoring the obvious data quality issues, I do enjoy reading through my TwitterFall feed every week. Because of my interest in application availability requirements, which drives the business of StorMagic, on whose board I serve as an independent director, I typically set the search terms to “computers” and “down,” but when “She shuts it down like computers” is trending, I’ll switch to searching on “computers” and “free.”

This is just a sampling of the “computers” and “free” tweets I saw this week. Apparently, for the patrons of some quick-serve restaurants, when computers go down, #lifeisgood.

 

AbbeyVoss just got a free coffee and biscuit at Caribou cause their computers are down #lifeisgood

lilseannn Public Service Announcement : Taco Bells computers are down; free TB!

r0danthony Computers shut down in the cafeteria. Your boy got a free meal. Lololol.

 

I’ve noticed over the course of the past two years that Starbucks typically offers up free food and coffee when the computers go down, as captured in this tweet from last week.

@NDarnell96 Getting free Starbucks cause their computers froze

 

My guess is that Starbucks gives away coffee when the computers go down because it’s cheap marketing, and they assume the cost of delivering high-availability applications is too high. Coffee brewers basically turn water to gold, anyway. And if you’ve got as many gold buyers as Starbucks, what’s wrong with an occasional free giveaway when customers are willing to provide free advertising? I’ve never been able to verify this, so if anyone can validate the assumption, please let me know. If that’s the approach at Starbucks, I get it. But it appears someone wasn’t on the program last week, as I also saw this tweet:

@Spencer_Westley @Starbucks’ computers are down and WHAT IS LIFE?

 

This “give it away when the computers are down” approach works fine at quick-serve restaurants, where you only need computers to take payments. It gets much more challenging when you need the computers to get orders from the front counter to the kitchen, apply loyalty credits, recall frequently ordered items in the automated order entry system, actually make the food, know when to plate it, or take orders over the web and fulfill them in the restaurant. These days, companies drive efficiency from automated operations and new revenue from processes that depend on computers. They are also driving the perception of customer intimacy by knowing more about their customers’ likes, dislikes, and preferences. Computers matter not when you’re selling any cup of coffee to the next person in line, but when you are selling this particular, customized cup of coffee to that loyal customer.

A meeting this week with Amy O’Connor, Senior Director of Analytics at Nokia and author of the Im AmyO blog, has led me down an interesting path at the end of the year. Normally, I might spend the last day of the year in self-reflection: Am I happy with how I spent the past year? Do I feel good about the results? What will I resolve to do differently next year? This year, however, instead of self-reflection, I’ve decided to end the year with a little self-analysis. What’s the difference between reflection and analysis? Data.

To help me with that, I’m re-reading “Competing on Analytics: The New Science of Winning,” written by Thomas Davenport and Jeanne Harris and published by Harvard Business School Press back in 2007. The first thing that became abundantly clear was that I didn’t have enough data on myself, my activities, and the results of those activities. So, I decided to collect some. As a starting point, I decided to analyze my publishing activity on Wikibon.

I posted my first article, “StorMagic Announces SvSAN and Offers Free Download,” on Wikibon on February 19, 2009. It’s the only article I published that year, and it was an experiment. It was also, admittedly, a little self-serving, since I’m a non-investor director on the board of StorMagic. Upon analysis, the results of the posting were pretty good. It’s been viewed over 3,000 times and received a community rating of 4 out of a possible 5. Perhaps it was rated a little lower because the article was a little self-serving, though defensibly 100% accurate. Given the results, you’d think I might have published more, but I didn’t.

In 2010, I posted 18 articles on Wikibon, and I posted another 6 in 2011, despite an amazing number of disruptions, which I won’t go into here. So over almost three years, I’ve posted a total of 25 documents. Total views across all of my documents are almost 48,000, the average number of views is a respectable 1,900+, and the average community rating is 4.8+, despite my lower starting point. I guess I’ve improved with age.

The documents were all relatively short (at an average of 525 words, a very quick read) and designed to be actionable. Personally, I think articles are best when they spark a dialogue or provoke a comment, and I’m sorry to say that the average number of comments per post was just over 0.7, and more than half had no comments. That’s something worth figuring out how to improve.

Reporting on minimums, maximums, totals, and central tendency is an interesting first step. But it is just that: reporting. The key now is to get to the next level and evaluate the impact of article length, keywords, topics, and themes on views. If anyone can suggest an open-source text-analytics tool, I would be very grateful.
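To make the idea concrete, here is a rough sketch of the kind of next-level analysis I have in mind, using Python with pandas and scikit-learn (both open source). The CSV file, its column names, and the notion of exporting my Wikibon posts to a spreadsheet are all hypothetical, and with only 25 articles a regression like this will badly overfit, so treat it as an illustration of the approach rather than the analysis itself.

# A minimal sketch, assuming a hand-built export of my posts to
# wikibon_articles.csv with columns: title, text, views (all hypothetical).
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LinearRegression

articles = pd.read_csv("wikibon_articles.csv")
articles["length"] = articles["text"].str.split().str.len()   # word count per article

# First step: reporting on minimums, maximums, totals, and central tendency.
print(articles["views"].agg(["min", "max", "sum", "mean"]))

# Next level: which keywords, along with article length, move views?
tfidf = TfidfVectorizer(max_features=50, stop_words="english")
keywords = tfidf.fit_transform(articles["text"]).toarray()

features = pd.DataFrame(keywords, columns=tfidf.get_feature_names_out())
features["length"] = articles["length"].values

# With only ~25 posts this will overfit; it is illustrative, not conclusive.
model = LinearRegression().fit(features, articles["views"])
weights = pd.Series(model.coef_, index=features.columns).sort_values()
print(weights.tail(10))   # terms (and length) most associated with higher views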

Over the next year, I have resolved to write more, measure more, and analyze more. Expect to see more articles published by me on Wikibon, because I like the team and it’s an easy platform to use. I enjoy the exposure to end-users that Wikibon affords me, and I like the fact that when I publish content there, I know how much it’s being read. I also enjoy the opportunity to have an occasional conversation with an IT industry executive with whom I have no current business relationship. I’m a curious and rather social guy, so it doesn’t all have to be about my business and potential business opportunities.

I also plan to learn more about the rapidly developing field of data analytics, currently promoted under the term “Big Data,” which is either a subset or a superset of analytics, depending on your point of view. I’ve always enjoyed mathematics and analysis, but back when I was a math major (along with majors in Physics, Education, and Psychology), about the only opportunity for a B.S. graduate in Mathematics, outside of academia or education, was to become an actuary at an insurance company. Frankly, I didn’t want to spend my life figuring out morbidity rates. But the life of a data scientist, especially when that skill can be applied to making better products and creating more jobs, is significantly more interesting.

Finally, Amy O’Connor tells me that Nokia plans to re-invigorate a local Big Data user group that has been meeting at the Microsoft offices in Waltham. So you can expect to find me there. I’ll post details as soon as I get them. I hope to see you there.

Best wishes for a happy and healthy new year.

I recently attended the TEDxBoston conference. If you’ve never been, I encourage you to go.  This year’s conference was overflowing with both people and ideas.  For me, it’s a vacation from the day-to-day, an opportunity to find new inspiration, and a place to cross-pollinate ideas.  I never go to find more business, but rather to get better at what I do.

One presentation that I found particularly valuable was by Michelle Borkin, who explained her interdisciplinary approach to data visualization.  She brought together professionals from astronomy, who were working on how to get better 3-dimensional  pictures of objects in outer space, with radiologists, who were trying to get better 3-dimensional pictures of organs in the human body.  The results were both beautiful and amazing.

It would be easy to say “Outer space is very different from the human body, so it has no relevance to what I’m doing,” but she took the opposite approach and asked, “What are they doing that is similar to what I’m trying to do, and what can I apply from what they have already learned?” With all of the discussion around data generated on the internet, I kept wondering, during her presentation, what data visualization techniques can be taken from astronomy and radiology and applied to understanding consumers and influence.

After we got past our initial reactions to the “Man up and buy them” comment from George Crump at theBDevent last week, we got down to some practical discussions about how to make OEM agreements work for both parties. One of the big items that comes up in negotiations is the Source Code Escrow Agreement. The party that wants to make use of some element of software, the OEM-In, wants to ensure access to the source code from the supplier, the OEM-Out, under a variety of scenarios, including defaults by the OEM-Out.

Of course, since the source code represents the crown jewels of the OEM-Out, they will typically do whatever they can to limit the conditions under which the OEM-In gets access to the source code. The OEM-In, on the other hand, wants the broadest possible range of triggers for access to the source code. (more…)

One of the sessions I attended at the New England Area VMware User Group meeting in Newport, Rhode Island last week included a discussion on how to take the internal storage of a VMware ESX host and turn it into a virtualized iSCSI storage appliance. I happen to believe that the approach has great merit for many smaller IT shops and for remote office environments. The internal storage of an ESX server, if totally usable and accessible to the ESX host and other ESX servers on the network, is probably the cheapest storage you will ever buy. What I found particularly interesting about this session, however, was the fact that the presenter downplayed the approach as good enough to experiment with the storage virtualization software, but not good enough to run production applications. In order to encourage companies to try the software, the developer offers a free 30-day trial, the expiration of which then renders the server unusable unless you purchase a permanent license. While I believe the company has good software, I don’t understand the approach to the market. (more…)

I attended the New England Area VMware User Group meeting in Newport, Rhode Island last week.  It was a great opportunity to see what challenges IT managers are facing, what solutions they are adopting, and what problems remain to be solved.  It was also a good opportunity for me to revisit what I learned many years ago in studying the research of  Clayton Christensen and his concept of Disruptive Innovation.  Two of my clients have what I consider disruptive technologies.  I’ll write about Tek-Tools in this post, and then cover  StorMagic in a subsequent post. 

Tek-Tools offers the Profiler Suite of monitoring, reporting, and forecasting tools for servers, storage, applications, files, and, yes, VMware.  Why is it disruptive? Tek-Tools’ Profiler is easy to install, easy to afford, and easy to use, and it’s “good enough” for the bulk of today’s customers.  It does not overshoot current market requirements.  It gives quick answers to important questions like: How much storage do I have installed? How fast is it growing?  How much is allocated? How much is used? When will I need more storage? Where is my performance bottleneck? How old is my data? Who is violating data retention policies? Which virtual machines are using which storage? Which virtual machines are no longer in use? Which physical machines could I consolidate onto a  VMware ESX host, without encountering performance issues? Where is my orphaned storage? (That’s a technical term that means I deleted the virtual machine, but forgot to return the allocated storage to the storage pool.)  

(more…)

One of the things we used to discuss, when I was running the storage research practice at IDC, was “When will a market disappear and just become a feature of some larger market?”  Examples are numerous.  Remember when there was a market for browser software? And, while NetApp is going strong, both Microsoft and Sun Microsystems are trying to make NAS a feature of the operating system.

One of the reasons I joined the board of StorMagic was that I saw the potential for the company to be a market disruptor.  Today, StorMagic announced SvSAN software, which, when installed on a VMware ESX server, converts the internal storage of the ESX server into an iSCSI SAN.  VMware leverages the fact that most single applications don’t need all the computing power of today’s servers.  SvSAN leverages that same fact to provide the storage management function within the ESX server, and also takes advantage of the fact that the internal storage capacity of an ESX server, perhaps the least expensive storage you will ever purchase, is more than enough capacity for a large number of VMware ESX server-hosted applications.  (more…)

I was describing to my rather precocious, thirteen-year-old son the problem that companies have of getting the word out. As part of “Career Week” at his school (five different jobs for five days at the end of the school year), my son decided he would make a stop-motion Lego video for Tek-Tools, one of my clients, to promote the company. I told him that, if it was good enough, I would show it to the CEO, and maybe he would use it. Little did I know that my son was going to, upon completion, post the video on YouTube. But he did. Without permission. And my wife asked me, once again, “Why don’t we have more controls on his computer?”

Ken Barth, the CEO of Tek-Tools,  was our first client at Walden Technology Partners.  A lot of people in the computer storage industry know him, and beyond the fact that he has been successful in everything that he has done, everyone who meets him says the same thing: “He’s a great guy.”  Ken’s company provides a superb solution for reporting, monitoring, forecasting, and profiling IT infrastructure.  It’s easy to install, easy to use, and provides immediate value.  What could be better?   (more…)

I recently had the pleasure of reading a draft of Dave Hitz’s new book (title intentionally withheld, so as not to play the spoiler). Dave is one of the co-founders of NetApp (née Network Appliance), and he wrote the book, at least in part, to give current NetApp employees a view into the early days of the company. At recent growth rates, I suspect that substantially more than half of the employees have been with the company fewer than five years and missed not only the startup days, but also the turnaround days, post-2001. (more…)
